When doing a literature survey, I sometimes compile a term list and a list of references.
For the term list, I use an awk script.
I also convert the PDFs to text files with tools such as pdftotext.
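As a concrete illustration, here is a minimal sketch of that kind of pipeline. The file names and the frequency-count heuristic are assumptions made for this example, not the actual script I use: pdftotext produces a plain-text dump, and a short awk script counts word frequencies as a rough first cut at a term list.

```sh
# Minimal sketch (hypothetical file names): convert a PDF to text, then
# count word frequencies with awk as a starting point for a term list.
pdftotext -layout paper.pdf paper.txt

awk '
{
    # Split each line into alphabetic tokens, lowercased.
    n = split(tolower($0), w, /[^[:alpha:]]+/)
    for (i = 1; i <= n; i++)
        if (length(w[i]) >= 4)      # ignore very short words
            count[w[i]]++
}
END {
    for (t in count)
        printf "%8d  %s\n", count[t], t
}' paper.txt | sort -rn > terms.txt
```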
The article by @kazuo_reve,
「自動車の故障診断に関連するプログラマーになりたての方が参照するとよさそうな情報」
https://qiita.com/kazuo_reve/items/f773b320dcbf2ab316da
is the source article of my post 「@kazuo_reveさんの「自動車の故障診断に関連するプログラマーになりたての方が参照するとよさそうな情報」の読み方」. Among the documents it references, the following two are extremely rich in content:
クルマの自己診断機能「OBD2」の用途は"診断"だけじゃない
https://monoist.atmarkit.co.jp/mn/articles/1511/11/news004.html
AUTOSARを使いこなす
https://monoist.atmarkit.co.jp/mn/series/9343/
One installment of that series,
AUTOSARを使いこなす(16) 自動車の“安全”を考える、ISO 26262の先にある「SaFAD」にどう対応すべきか
https://monoist.atmarkit.co.jp/mn/articles/2008/27/news008.html
has a problem: SaFAD is not something that lies "beyond" ISO 26262. It is a more fundamental, root-level matter.
The article above references the following document:
SAFETY FIRST FOR AUTOMATED DRIVING
https://www.daimler.com/documents/innovation/other/safety-first-for-automated-driving.pdf
Related post: 「"SAFETY FIRST FOR AUTOMATED DRIVING" に追加するとよいかもしれないこと」
https://qiita.com/kaizen_nagoya/items/0bab4a2c184c8fbfb0ef
I am now compiling a list of the references of these references. One of the references of this document is
[22] HINTON, J., & SEJNOWSKI, T. (1999). Unsupervised Learning: Foundations of Neural Computation. Cambridge, MA: MIT Press.
While trying to work through this book, I have started compiling yet another list of references, as sketched below.
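As with the term list, I start from a pdftotext dump. The snippet below is a minimal sketch under stated assumptions (the file names and the heading pattern are hypothetical, not taken from any of the PDFs above): it prints the lines that follow a References heading so they can be cleaned up by hand.

```sh
# Minimal sketch: pull the reference section out of a pdftotext dump.
# "book.pdf"/"book.txt" and the heading pattern are hypothetical; real PDFs
# often need per-document tweaks (multi-column layout, running headers, etc.).
pdftotext -layout book.pdf book.txt

awk '
/^[[:space:]]*(References?|Bibliography)[[:space:]]*$/ { in_refs = 1; next }
in_refs { print }   # everything after the heading, to be cleaned up manually
' book.txt > refs.txt
```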
<This section is a work in progress. It will be updated as I go.>
Table of Contents
https://papers.cnl.salk.edu/PDFs/Unsupervised%20Learning_%20Foundations%20of%20Neural%20Computation%201999-3636.pdf
Introduction
DOI: https://doi.org/10.7551/mitpress/7011.003.0001
https://papers.cnl.salk.edu/PDFs/Unsupervised%20Learning_%20Foundations%20of%20Neural%20Computation%201999-3514.pdf
1: Unsupervised Learning
DOI: https://doi.org/10.7551/mitpress/7011.003.0002
2: Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network
DOI: https://doi.org/10.7551/mitpress/7011.003.0003
3: Convergent Algorithm for Sensory Receptive Field Development
DOI: https://doi.org/10.7551/mitpress/7011.003.0004
4: Emergence of Position-Independent Detectors of Sense of Rotation and Dilation with Hebbian Learning: An Analysis
DOI: https://doi.org/10.7551/mitpress/7011.003.0005
5: Learning Invariance from Transformation Sequences
DOI: https://doi.org/10.7551/mitpress/7011.003.0006
6: Learning Perceptually Salient Visual Parameters Using Spatiotemporal Smoothness Constraints
DOI: https://doi.org/10.7551/mitpress/7011.003.0007
7: What is the Goal of Sensory Coding?
DOI: https://doi.org/10.7551/mitpress/7011.003.0008
8: An Information-Maximization Approach to Blind Separation and Blind Deconvolution
DOI: https://doi.org/10.7551/mitpress/7011.003.0009
9: Natural Gradient Works Efficiently in Learning
DOI: https://doi.org/10.7551/mitpress/7011.003.0010
10: A Fast Fixed-Point Algorithm for Independent Component Analysis
DOI: https://doi.org/10.7551/mitpress/7011.003.0011
11: Feature Extraction Using an Unsupervised Neural Network
DOI: https://doi.org/10.7551/mitpress/7011.003.0012
12: Learning Mixture Models of Spatial Coherence
DOI: https://doi.org/10.7551/mitpress/7011.003.0013
13: Bayesian Self-Organization Driven by Prior Probability Distributions
DOI: https://doi.org/10.7551/mitpress/7011.003.0014
14: Finding Minimum Entropy Codes
DOI: https://doi.org/10.7551/mitpress/7011.003.0015
15: Learning Population Codes by Minimizing Description Length
DOI: https://doi.org/10.7551/mitpress/7011.003.0016
16: Helmholtz Machine
DOI: https://doi.org/10.7551/mitpress/7011.003.0017
17: Factor Analysis Using Delta-Rule Wake-Sleep Learning
DOI: https://doi.org/10.7551/mitpress/7011.003.0018
18: Dimension Reduction by Local Principal Component Analysis
DOI: https://doi.org/10.7551/mitpress/7011.003.0019
19: A Resource-Allocating Network for Function Interpolation
DOI: https://doi.org/10.7551/mitpress/7011.003.0020
20: Learning with Preknowledge: Clustering with Point and Graph Matching Distance Measures
DOI: https://doi.org/10.7551/mitpress/7011.003.0021
21: Learning to Generalize from Single Examples in the Dynamic Link Architecture
DOI: https://doi.org/10.7551/mitpress/7011.003.0022
Index
DOI: https://doi.org/10.7551/mitpress/7011.003.0023
References cited in the Introduction
Amari, S.-I. (1998) Natural gradient works efficiently in learning. Neural Computation 10(2):252-276. Reprinted in this volume.
Atick, J. J. and Redlich, A. N. (1993) Convergent algorithm for sensory receptive field development. Neural Computation 5(1):45-60. Reprinted in this volume.
Attneave, F. (1954) Informational aspects of visual perception. Psychological Review 61:183-193.
Barlow, H. B. (1959) Sensory mechanisms, the reduction of redundancy, and intelligence. In National Physical Laboratory Symposium No. 10, The Mechanisation of Thought Processes, pp. 535-559. London: Her Majesty's Stationery Office.
Barlow, H. B. (1989) Unsupervised learning. Neural Computation 1(3):295-311. Reprinted in this volume.
Becker, S. and Hinton, G. E. (1993) Learning mixture models of spatial coherence. Neural Computation 5(2):267-277. Reprinted in this volume.
Bell, A. J. and Sejnowski, T. J. (1995) An information-maximization approach to blind separation and blind deconvolution. Neural Computation 7(6):1129-1159. Reprinted in this volume.
Bell, A. J. and Sejnowski, T. J. (1997) The "independent components" of natural scenes are edge filters. Vision Research 37:3327-3338.
Bienenstock, E. L., Cooper, L. N., and Munro, P. W. (1982) Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience 2:32-48.
Brown, T., Kairiss, E., and Keenan, C. (1990) Hebbian synapses: biophysical mechanisms and algorithms. Ann. Rev. Neurosci. 13:475-511.
Comon, P. (1994) Independent component analysis: a new concept? Signal Processing 36:287-314.
Crick, F. and Mitchison, G. (1983) The function of dream sleep. Nature 304:111-114.
Dayan, P., Hinton, G. E., Neal, R. M., and Zemel, R. S. (1995) The Helmholtz machine. Neural Computation 7(5):889-904. Reprinted in this volume.
Dempster, A. P., Laird, N. M., and Rubin, D. B. (1977) Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society 39:1-38.
Fahlman, S. and Lebiere, C. (1990) The cascade-correlation learning architecture. In Touretzky, D. (ed.), Advances in Neural Information Processing Systems 2, pp. 524-532. San Mateo, CA: Morgan Kaufmann.
Field, D. J. (1994) What is the goal of sensory coding? Neural Computation 6(4):559-601. Reprinted in this volume.
Foldiak, P. (1991) Learning invariance from transformation sequences. Neural Computation 3(2):194-200. Reprinted in this volume.
Gold, S., Rangarajan, A., and Mjolsness, E. (1996) Learning with preknowledge: clustering with point and graph matching distance measures. Neural Computation 8(4):787-804. Reprinted in this volume.
Hinton, G. and Sejnowski, T. (1986) Learning and relearning in Boltzmann machines. In Rumelhart, D. and McClelland, J. (eds.), Parallel Distributed Processing, volume 1, chapter 7, pp. 282-317. Cambridge, MA: MIT Press.
Hopfield, J. J., Feinstein, D. I., and Palmer, R. G. (1983) "Unlearning" has a stabilizing effect in collective memories. Nature 304(5922):158-159.
Hubel, D. H. and Wiesel, T. N. (1968) Receptive fields and functional architecture of monkey striate cortex. J. Physiol. 195:215-244.
Hyvarinen, A. and Oja, E. (1997) A fast fixed-point algorithm for independent component analysis. Neural Computation 9(7):1483-1492. Reprinted in this volume.
Intrator, N. (1992) Feature extraction using an unsupervised neural network. Neural Computation 4(1):98-107. Reprinted in this volume.
Kambhatla, N. and Leen, T. K. (1997) Dimension reduction by local principal component analysis. Neural Computation 9(7):1493-1516. Reprinted in this volume.
Lee, T.-W., Girolami, M., and Sejnowski, T. J. (in press) Independent component analysis using an extended infomax algorithm for mixed sub-Gaussian and super-Gaussian sources. Neural Computation 11(2).
Linsker, R. (1986) From basic network principles to neural architecture: emergence of spatial-opponent cells. Proceedings of the National Academy of Sciences of the United States of America 83:7508-7512.
Linsker, R. (1992) Local synaptic learning rules suffice to maximize mutual information in a linear network. Neural Computation 4(5):691-702. Reprinted in this volume.
Markram, H., Lubke, J., Frotscher, M., and Sakmann, B. (1997) Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs. Science 275:213-215.
Miller, K. D., Keller, J. B., and Stryker, M. P. (1989) Ocular dominance column development: analysis and simulation. Science 245:605-615.
Neal, R. M. and Dayan, P. (1997) Factor analysis using delta-rule wake-sleep learning. Neural Computation 9(8):1781-1803. Reprinted in this volume.
Neal, R. and Hinton, G. E. (1998) A new view of the EM algorithm that justifies incremental and other variants. In M. I. Jordan (ed.), Learning in Graphical Models. Dordrecht: Kluwer Academic Press.
Obermayer, K. and Sejnowski, T. J. (1998) Self-organizing Map Formation: Foundations of Neural Computation. Cambridge, MA: MIT Press.
Olshausen, B. A. and Field, D. J. (1996) Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381:607-609.
Platt, J. (1991) A resource-allocating network for function interpolation. Neural Computation 3(2):213-225. Reprinted in this volume.
Stone, J. V. (1996) Learning perceptually salient visual parameters using spatiotemporal smoothness constraints. Neural Computation 8(7):1463-1492. Reprinted in this volume.
Zemel, R. S. and Hinton, G. E. (1995) Learning population codes by minimizing description length. Neural Computation 7(3):549-564. Reprinted in this volume.
Zhang, K., Sereno, M. I., and Sereno, M. E. (1993) Emergence of position-independent detectors of sense of rotation and dilation with Hebbian learning: an analysis. Neural Computation 5(4):597-612. Reprinted in this volume.
Review
DeLiang Wang (2001) Unsupervised Learning: Foundations of Neural Computation. A Review. AI Magazine 22(2). (© AAAI)
https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.656.4261&rep=rep1&type=pdf
<This article is a personal opinion based on my own past experience. It has nothing to do with the organization I currently belong to or my work there.>
Thank you very much for reading to the last sentence.
Please press the like icon 💚 and follow me.