R3 (References on References on References) on "W.a.t.m.i. (What are the most important) statistical ideas of the past 50 years?" by Andrew Gelman and Aki Vehtari (17)
R3 (References on References on References) on "W.a.t.m.i. (What are the most important) statistical ideas of the past 50 years?" by Andrew Gelman and Aki Vehtari (0)
https://qiita.com/kaizen_nagoya/items/a8eac9afbf16d2188901
What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174
References
17
Breiman, L. (2001). Statistical modeling: The two cultures. Statistical Science 16, 199–231.
References on 17
17.1
Amit, Y. and Geman, D. (1997). Shape quantization and recognition with randomized trees. Neural Computation 9, 1545–1588.
References on 17.1
17.1.1
Baum, E. B., & Haussler, D. (1989). What size net gives valid generalization? Neural Comp., 1, 151–160.
17.1.2
Binford, T. O., & Levitt, T. S. (1993). Quasi-invariants: Theory and exploitation. In Proceedings of the Image Understanding Workshop (pp. 819–828). Washington D.C.
17.1.3
Bottou, L., Cortes, C., Denker, J. S., Drucker, H., Guyon, I., Jackel, L. D., LeCun, Y., Muller, U. A., Sackinger, E., Simard, P., & Vapnik, V. (1994). Comparison of classifier methods: A case study in handwritten digit recognition. Proc. 12th Inter. Conf. on Pattern Recognition (Vol. 2, pp. 77–82). Los Alamitos, CA: IEEE Computer Society Press.
17.1.4
Breiman, L. (1994). Bagging predictors (Tech. Rep. No. 451). Berkeley: Department of Statistics, University of California, Berkeley.
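The bagging idea in 17.1.4 is simple enough to illustrate. The sketch below is not Breiman's code: the data, the `train_stump` helper, and all parameter choices are invented for this example. Decision stumps are fit to bootstrap resamples of a noisy one-dimensional dataset and combined by majority vote.

```python
import random

def train_stump(sample):
    """Pick the threshold on x that minimizes training error for a
    1D binary classifier of the form 'predict 1 when x >= threshold'."""
    best_t, best_err = 0.0, float("inf")
    for t, _ in sample:  # candidate thresholds: the sample points themselves
        err = sum((1 if x >= t else 0) != y for x, y in sample)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(stumps, x):
    """Majority vote over the ensemble of bootstrap-trained stumps."""
    votes = sum(1 if x >= t else 0 for t in stumps)
    return 1 if 2 * votes >= len(stumps) else 0

random.seed(0)
xs = [random.random() for _ in range(50)]
# Toy data: the label is 1 when x >= 0.5, with roughly 10% label noise.
data = [(x, (1 if x >= 0.5 else 0) ^ (random.random() < 0.1)) for x in xs]

# Bagging: train each stump on a bootstrap resample of the training set.
stumps = [train_stump([random.choice(data) for _ in data]) for _ in range(25)]
print(bagged_predict(stumps, 0.9), bagged_predict(stumps, 0.1))
```

Each individual stump can land on a poor threshold when its resample is noisy; averaging the 25 votes is what stabilizes the prediction, which is the point of the paper.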
17.1.5
Breiman, L., Friedman, J., Olshen, R., & Stone, C. (1984). Classification and regression trees. Belmont, CA: Wadsworth.
Brown, D., Corruble, V., & Pittard, C. L. (1993). A comparison of decision tree classifiers with backpropagation neural networks for multimodal classification problems. Pattern Recognition, 26, 953–961.
17.1.6
Burns, J. B., Weiss, R. S., & Riseman, E. M. (1993). View variation of point set and line segment features. IEEE Trans. PAMI, 15, 51–68.
Casey, R. G., & Jih, C. R. (1983). A processor-based OCR system. IBM Journal of Research and Development, 27, 386–399.
17.1.7
Casey, R. G., & Nagy, G. (1984). Decision tree design using a probabilistic model. IEEE Trans. Information Theory, 30, 93–99.
Cover, T. M., & Thomas, J. A. (1991). Elements of information theory. New York: Wiley.
17.1.8
Dietterich, T. G., & Bakiri, G. (1995). Solving multiclass learning problems via error-correcting output codes. J. Artificial Intell. Res., 2, 263–286.
17.1.9
Forsyth, D., Mundy, J. L., Zisserman, A., Coelho, C., Heller, A., & Rothwell, C. (1991). Invariant descriptors for 3-D object recognition and pose. IEEE Trans. PAMI, 13, 971–991.
17.1.10
Friedman, J. H. (1973). A recursive partitioning decision rule for nonparametric classification. IEEE Trans. Comput., 26, 404–408.
17.1.11
Fukushima, K., & Miyake, S. (1982). Neocognitron: A new algorithm for pattern recognition tolerant of deformations and shifts in position. Pattern Recognition, 15, 455–469.
Fukushima, K., & Wake, N. (1991). Handwritten alphanumeric character recognition by the neocognitron. IEEE Trans. Neural Networks, 2, 355–365.
17.1.12
Gelfand, S. B., & Delp, E. J. (1991). On tree structured classifiers. In I. K. Sethi & A. K. Jain (Eds.), Artificial neural networks and statistical pattern recognition (pp. 51–70). Amsterdam: North-Holland.
17.1.13
Geman, D., Amit, Y., & Wilder, K. (1996). Joint induction of shape features and tree classifiers (Tech. Rep.). Amherst: University of Massachusetts.
17.1.14
Geman, S., Bienenstock, E., & Doursat, R. (1992). Neural networks and the bias/variance dilemma. Neural Computation, 4, 1–58.
17.1.15
Gilbert, C. D., Das, A., Ito, M., Kapadia, M., & Westheimer, G. (1996). Spatial integration and cortical dynamics. Proc. Natl. Acad. Sci., 93, 615–622.
17.1.16
Gochin, P. M. (1994). Properties of simulated neurons from a model of primate inferior temporal cortex. Cerebral Cortex, 5, 532–543.
17.1.17
Guo, H., & Gelfand, S. B. (1992). Classification trees with neural network feature extraction. IEEE Trans. Neural Networks, 3, 923–933.
17.1.18
Hastie, T., Buja, A., & Tibshirani, R. (1995). Penalized discriminant analysis. Annals of Statistics, 23, 73–103.
17.1.19
Hussain, B., & Kabuka, M. R. (1994). A novel feature recognition neural network and its application to character recognition. IEEE Trans. PAMI, 16, 99–106.
17.1.20
Ito, M., Fujita, I., Tamura, H., & Tanaka, K. (1994). Processing of contrast polarity of visual images of inferotemporal cortex of Macaque monkey. Cerebral Cortex, 5, 499–508.
17.1.21
Ito, M., Tamura, H., Fujita, I., & Tanaka, K. (1995). Size and position invariance of neuronal response in monkey inferotemporal cortex. J. Neuroscience, 73(1), 218–226.
17.1.22
Jedynak, B., & Fleuret, F. (1996). Reconnaissance d'objets 3D à l'aide d'arbres de classification. In Proc. Image Com 96. Bordeaux, France.
17.1.23
Jung, D.-M., & Nagy, G. (1995). Joint feature and classifier design for OCR. In Proc. 3rd Inter. Conf. Document Analysis and Processing (Vol. 2, pp. 1115–1118). Montreal. Los Alamitos, CA: IEEE Computer Society Press.
17.1.24
Khotanzad, A., & Lu, J.-H. (1991). Shape and texture recognition by a neural net- work. In I. K. Sethi & A. K. Jain (Eds.), Artificial Neural Networks and Statistical Pattern Recognition. Amsterdam: North-Holland.
17.1.25
Knerr, S., Personnaz, L., & Dreyfus, G. (1992). Handwritten digit recognition by neural networks with single-layer training. IEEE Trans. Neural Networks, 3, 962–968.
17.1.26
Kong, E. B., & Dietterich, T. G. (1995). Error-correcting output coding corrects bias and variance. In Proc. of the 12th International Conference on Machine Learning (pp. 313–321). San Mateo, CA: Morgan Kaufmann.
17.1.27
Lamdan, Y., Schwartz, J. T., & Wolfson, H. J. (1988). Object recognition by affine invariant matching. In IEEE Int. Conf. Computer Vision and Pattern Rec. (pp. 335–344). Los Alamitos, CA: IEEE Computer Society Press.
17.1.28
LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., & Jackel, L. D. (1990). Handwritten digit recognition with a back-propagation network. In D. S. Touretzky (Ed.), Advances in Neural Information Processing Systems (Vol. 2). San Mateo, CA: Morgan Kaufmann.
17.1.29
Lee, D.-S., Srihari, S. N., & Gaborski, R. (1991). Bayesian and neural network pattern recognition: A theoretical connection and empirical results with handwritten characters. In I. K. Sethi & A. K. Jain (Eds.), Artificial Neural Networks and Statistical Pattern Recognition. Amsterdam: North-Holland.
17.1.30
Martin, G. L., & Pitman, J. A. (1991). Recognizing hand-printed letters and digits using backpropagation learning. Neural Computation, 3, 258–267.
17.1.31
Mori, S., Suen, C. Y., & Yamamoto, K. (1992). Historical review of OCR research and development. In Proc. IEEE (Vol. 80, pp. 1029–1057). New York: IEEE.
17.1.32
Mundy, J. L., & Zisserman, A. (1992). Geometric invariance in computer vision. Cambridge, MA: MIT Press.
17.1.33
Neuenschwander, S., & Singer, W. (1996). Long-range synchronization of oscillatory light responses in cat retina and lateral geniculate nucleus. Nature, 379(22), 728–733.
17.1.34
Niyogi, P., & Girosi, F. (1996). On the relationship between generalization error, hypothesis complexity, and sample complexity for radial basis functions. Neural Comp., 8, 819–842.
17.1.35
Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1, 81–106.
17.1.36
Raudys, S., & Jain, A. K. (1991). Small sample size problems in designing artificial neural networks. In I. K. Sethi & A. K. Jain (Eds.), Artificial Neural Networks and Statistical Pattern Recognition. Amsterdam: North-Holland.
17.1.37
Reiss, T. H. (1993). Recognizing planar objects using invariant image features. Lecture Notes in Computer Science no. 676. Berlin: Springer-Verlag.
17.1.38
Sabourin, M., & Mitiche, A. (1992). Optical character recognition by a neural network. Neural Networks, 5, 843–852.
17.1.39
Sethi, I. K. (1991). Decision tree performance enhancement using an artificial neural network implementation. In I. K. Sethi & A. K. Jain (Eds.), Artificial Neural Networks and Statistical Pattern Recognition. Amsterdam: North-Holland.
17.1.40
Shlien, S. (1990). Multiple binary decision tree classifiers. Pattern Recognition, 23, 757–763.
17.1.41
Simard, P. Y., LeCun, Y. L., & Denker, J. S. (1994). Memory-based character recognition using a transformation invariant metric. In Proc. 12th Inter. Conf. on Pattern Recognition (Vol. 2, pp. 262–267). Los Alamitos, CA: IEEE Computer Society Press.
17.1.42
Werbos, P. (1991). Links between artificial neural networks (ANN) and statistical pattern recognition. In I. K. Sethi & A. K. Jain (Eds.), Artificial Neural Networks and Statistical Pattern Recognition. Amsterdam: North-Holland.
17.1.43
Wilkinson, R. A., Geist, J., Janet, S., Grother, P., Gurges, C., Creecy, R., Hammond, B., Hull, J., Larsen, N., Vogl, T., & Wilson, C. (1992). The first census optical character recognition system conference (Tech. Rep. No. NISTIR 4912). Gaithersburg, MD: National Institute of Standards and Technology.
17.2
Arena, C., Sussman, N., Chiang, K., Mazumdar, S., Macina, O. and Li, W. (2000). Bagging Structure-Activity Relationships: A simulation study for assessing misclassification rates. Presented at the Second Indo-U.S. Workshop on Mathematical Chemistry, Duluth, MI. (Available at NSussman@server.ceoh.pitt.edu).
17.3
Bickel, P., Ritov, Y. and Stoker, T. (2001). Tailor-made tests for goodness of fit for semiparametric hypotheses. Unpublished manuscript.
17.4
Breiman, L. (1998). Arcing classifiers. Discussion paper, Ann. Statist. 26 801-824.
Digital Object Identifier: 10.1214/aos/1024691079
Google Scholar: Lookup Link
Euclid: euclid.aos/1024691079
zbMATH: 0934.62064
MathSciNet: MR99g:62083
17.5
Breiman, L. (2000). Some infinity theory for tree ensembles. (Available at www.stat.berkeley.edu/technical reports).
URI: http://www.stat.berkeley.edu/technical
17.6
Breiman, L. (2001). Random forests. Machine Learning J. 45 5-32.
zbMATH: 01687841
17.7
Breiman, L. and Friedman, J. (1985). Estimating optimal transformations in multiple regression and correlation. J. Amer. Statist. Assoc. 80 580-619.
Digital Object Identifier: 10.2307/2288473
Google Scholar: Lookup Link
zbMATH: 0594.62044
MathSciNet: MR803258
17.8
Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984). Classification and Regression Trees. Wadsworth, Belmont, CA.
zbMATH: 0541.62042
MathSciNet: MR86b:62101
17.9
Cristianini, N. and Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines. Cambridge Univ. Press.
Daniel, C. and Wood, F. (1971). Fitting equations to data. Wiley, New York.
17.10
Dempster, A. (1998). Logicist statistics I. Models and modeling. Statist. Sci. 13(3) 248-276.
Digital Object Identifier: 10.1214/ss/1028905887
Google Scholar: Lookup Link
Euclid: euclid.ss/1028905887
MathSciNet: MR1665717
17.11
Diaconis, P. and Efron, B. (1983). Computer intensive methods in statistics. Scientific American 248 116-131.
zbMATH: 0555.62037
MathSciNet: MR773679
17.12
Domingos, P. (1998). Occam's two razors: the sharp and the blunt. In Proceedings of the Fourth International Conference on Knowledge Discovery and Data Mining (R. Agrawal and P. Stolorz, eds.) 37-43. AAAI Press, Menlo Park, CA.
17.13
Domingos, P. (1999). The role of Occam's razor in knowledge discovery. Data Mining and Knowledge Discovery 3 409-425.
17.14
Dudoit, S., Fridlyand, J. and Speed, T. (2000). Comparison of discrimination methods for the classification of tumors. (Available at www.stat.berkeley.edu/technical reports).
URI: http://www.stat.berkeley.edu/technical
17.15
Freedman, D. (1987). As others see us: a case study in path analysis (with discussion). J. Ed. Statist. 12 101-223.
17.16
Freedman, D. (1991). Statistical models and shoe leather. Sociological Methodology 1991 (with discussion) 291-358.
17.17
Freedman, D. (1991). Some issues in the foundations of statistics. Foundations of Science 1 19-83.
MathSciNet: MR1798108
17.18
Freedman, D. (1994). From association to causation via regression. Adv. in Appl. Math. 18 59-110.
Digital Object Identifier: 10.1006/aama.1996.0501
Google Scholar: Lookup Link
zbMATH: 0873.90019
MathSciNet: MR1425950
17.19
Freund, Y. and Schapire, R. (1996). Experiments with a new boosting algorithm. In Machine Learning: Proceedings of the Thirteenth International Conference 148-156. Morgan Kaufmann, San Francisco.
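The boosting algorithm of 17.19 (AdaBoost) can be sketched compactly. The following is an illustrative toy implementation, not the authors' code; the one-dimensional data, the decision-stump weak learner, and the `rounds` parameter are all invented for the example. Labels are in {-1, +1}.

```python
import math

def stump_error(data, w, t):
    # Weighted error of the stump "predict +1 when x >= t, else -1".
    return sum(wi for (x, y), wi in zip(data, w)
               if (1 if x >= t else -1) != y)

def adaboost(data, rounds=10):
    """Return a list of (threshold, alpha) weak learners."""
    w = [1.0 / len(data)] * len(data)
    ensemble = []
    for _ in range(rounds):
        # Weak learner: the candidate threshold with smallest weighted error.
        t = min((x for x, _ in data), key=lambda c: stump_error(data, w, c))
        err = max(stump_error(data, w, t), 1e-12)
        if err >= 0.5:  # weak learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((t, alpha))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * y * (1 if x >= t else -1))
             for (x, y), wi in zip(data, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(alpha * (1 if x >= t else -1) for t, alpha in ensemble)
    return 1 if score >= 0 else -1

# Toy 1D data, separable at x = 0.5.
data = [(x / 10, 1 if x >= 5 else -1) for x in range(10)]
model = adaboost(data)
print([predict(model, x) for x, _ in data])
```

The reweighting step is what distinguishes boosting from bagging: later weak learners are forced to concentrate on the points the earlier ones got wrong.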
17.20
Friedman, J. (1999). Greedy predictive approximation: a gradient boosting machine. Technical report, Dept. Statistics Stanford Univ.
17.21
Friedman, J., Hastie, T. and Tibshirani, R. (2000). Additive logistic regression: a statistical view of boosting. Ann. Statist. 28 337-407.
Digital Object Identifier: 10.1214/aos/1016218223
Google Scholar: Lookup Link
Euclid: euclid.aos/1016218223
zbMATH: 1106.62323
MathSciNet: MR2002c:62050
17.22
Gifi, A. (1990). Nonlinear Multivariate Analysis. Wiley, New York.
zbMATH: 0697.62048
MathSciNet: MR1076188
17.23
Ho, T. K. (1998). The random subspace method for constructing decision forests. IEEE Trans. Pattern Analysis and Machine Intelligence 20 832-844.
17.24
Landwehr, J., Pregibon, D. and Shoemaker, A. (1984). Graphical methods for assessing logistic regression models (with discussion). J. Amer. Statist. Assoc. 79 61-83.
17.25
McCullagh, P. and Nelder, J. (1989). Generalized Linear Models. Chapman and Hall, London.
MathSciNet: MR727836
17.26
Meisel, W. (1972). Computer-Oriented Approaches to Pattern Recognition. Academic Press, New York.
17.27
Michie, D., Spiegelhalter, D. and Taylor, C. (1994). Machine Learning, Neural and Statistical Classification. Ellis Horwood, New York.
17.28
Mosteller, F. and Tukey, J. (1977). Data Analysis and Regression. Addison-Wesley, Reading, MA.
17.29
Mountain, D. and Hsiao, C. (1989). A combined structural and flexible functional approach for modeling energy substitution. J. Amer. Statist. Assoc. 84 76-87.
17.30
Stone, M. (1974). Cross-validatory choice and assessment of statistical predictions. J. Roy. Statist. Soc. B 36 111-147.
JSTOR: jstor.org
zbMATH: 0308.62063
MathSciNet: MR50:8847
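Stone's cross-validatory assessment (17.30) reduces to a short recipe: hold each fold out once, fit on the remaining folds, and average the held-out loss. A minimal sketch, assuming a generic `fit`/`predict` interface and a toy mean-predictor model (all names here are hypothetical, not from the paper):

```python
def k_fold_cv(data, k, fit, predict):
    """Average held-out squared error over k folds: each fold is held
    out once and the model is fit on the remaining k-1 folds."""
    folds = [data[i::k] for i in range(k)]  # simple interleaved split
    total, n = 0.0, 0
    for i in range(k):
        train = [p for j, fold in enumerate(folds) if j != i for p in fold]
        model = fit(train)
        for x, y in folds[i]:
            total += (predict(model, x) - y) ** 2
            n += 1
    return total / n

# Toy model: always predict the training-set mean of y, ignoring x.
fit_mean = lambda train: sum(y for _, y in train) / len(train)
predict_mean = lambda model, x: model

data = [(x, 2.0 * x) for x in range(10)]
cv_err = k_fold_cv(data, 5, fit_mean, predict_mean)
print(cv_err)  # held-out error of the mean predictor
```

Because every point is scored by a model that never saw it, the average is an honest estimate of out-of-sample error, which is why cross-validation became the default tool for model selection.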
17.31
Vapnik, V. (1995). The Nature of Statistical Learning Theory. Springer, New York.
MathSciNet: MR98a:68159
17.32
Vapnik, V. (1998). Statistical Learning Theory. Wiley, New York.
MathSciNet: MR99h:62052
17.33
Wahba, G. (1990). Spline Models for Observational Data. SIAM, Philadelphia.
zbMATH: 0813.62001
MathSciNet: MR91g:62028
Reference materials (参考資料)
Data Scientist Fundamentals (2)
https://qiita.com/kaizen_nagoya/items/8b2f27353a9980bf445c
Iwanami Dictionary of Mathematics (岩波数学辞典): a bargain with two editions on one CD
https://qiita.com/kaizen_nagoya/items/1210940fe2121423d777
Iwanami Dictionary of Mathematics (岩波数学辞典)
https://qiita.com/kaizen_nagoya/items/b37bfd303658cb5ee11e
Ann's Room (Mathematics Learned through Names of People: Iwanami Dictionary of Mathematics), English (24)
https://qiita.com/kaizen_nagoya/items/e02cbe23b96d5fb96aa1
<This article is a personal impression based on my past experience. It is unrelated to the organization I currently belong to or to my work there.>
Thank you very much for reading to the last sentence.
Please press the like icon 💚 and follow me for your happy life.