
R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(28)

Posted at 2021-10-15

R3 (References on References on References) on "What are the most important statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari (28)

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(0)
https://qiita.com/kaizen_nagoya/items/a8eac9afbf16d2188901

What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174

References

28

Cortes, C., and Vapnik, V. (1995). Support-vector networks. Machine Learning 20, 273–297.

Reference on 28

28.1

Aizerman, M., Braverman, E., & Rozonoer, L. (1964). Theoretical foundations of the potential function method in pattern recognition learning. Automation and Remote Control, 25:821–837.

28.2

Anderson, T.W., & Bahadur, R.R. (1966). Classification into two multivariate normal distributions with different covariance matrices. Ann. Math. Stat., 33:420–431.

Reference on 28.2

28.2.1

Cox, D.R. (1972). Regression models and life-tables. Journal of the Royal Statistical Society, Series B (Methodological), 34(2):187–220.

REFERENCES

[1] ANDERSON, T. W. (1958). An Introduction to Multivariate Statistical Analysis. Wiley, New York.
[2] CAVALLI, L. L. (1945). Alcuni problemi della analisi biometrica di popolazioni naturali. Mem. Ist. Ital. Idrobiol. 2 301-323.
[3] CLUNIES-ROSS, C. W. and RIFFENBURGH, R. H. (1960). Geometry and linear discrimination. Biometrika 47 185-189.
[4] KULLBACK, SOLOMON (1959). Information Theory and Statistics. Wiley, New York.
[5] PENROSE, L. S. (1947). Some notes on discrimination. Ann. Eugenics 13 228-237.
[6] SMITH, C. A. B. (1947). Some examples of discrimination. Ann. Eugenics 13 272-282.
[7] WELCH, PETER and WIMPRESS, RICHARD S. (1961). Two multivariate statistical computer programs and their application to the vowel recognition problem. J. Acoustical Soc. of America 33 426-434.

28.3

Boser, B.E., Guyon, I., & Vapnik, V.N. (1992). A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop of Computational Learning Theory, 5, 144–152. Pittsburgh, ACM.

28.4

Bottou, L., Cortes, C., Denker, J.S., Drucker, H., Guyon, I., Jackel, L.D., LeCun, Y., Sackinger, E., Simard, P., Vapnik, V., & Miller, U.A. (1994). Comparison of classifier methods: A case study in handwritten digit recognition. Proceedings of 12th International Conference on Pattern Recognition and Neural Network.
Bromley, J., & Sackinger, E. (1991). Neural-network and k-nearest-neighbor classifiers. Technical Report 11359-910819-16TM, AT&T.
Courant, R., & Hilbert, D. (1953). Methods of Mathematical Physics, Interscience, New York.

28.5

Fisher, R.A. (1936). The use of multiple measurements in taxonomic problems. Ann. Eugenics, 7:111–132.

28.6

LeCun, Y. (1985). Une procedure d'apprentissage pour reseau a seuil assymetrique. Cognitiva 85: A la Frontiere de l'Intelligence Artificielle des Sciences de la Connaissance des Neurosciences, 599–604, Paris.
LeCun, Y., Boser, B., Denker, J.S., Henderson, D., Howard, R.E., Hubbard, W., & Jackel, L.D. (1990). Handwritten digit recognition with a back-propagation network. Advances in Neural Information Processing Systems, 2, 396–404, Morgan Kaufman.

28.7

Parker, D.B. (1985). Learning logic. Technical Report TR-47, Center for Computational Research in Economics and Management Science, Massachusetts Institute of Technology, Cambridge, MA.
Rosenblatt, F. (1962). Principles of Neurodynamics, Spartan Books, New York.

28.8

Rumelhart, D.E., Hinton, G.E., & Williams, R.J. (1986). Learning internal representations by backpropagating errors. Nature, 323:533–536.

28.9

Rumelhart, D.E., Hinton, G.E., & Williams, R.J. (1987). Learning internal representations by error propagation. In James L. McClelland & David E. Rumelhart (Eds.), Parallel Distributed Processing, 1, 318–362, MIT Press.

28.10

Vapnik, V.N. (1982). Estimation of Dependences Based on Empirical Data, Addendum 1, New York: Springer-Verlag.
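
Reference 28 (Cortes & Vapnik 1995) introduces the soft-margin support-vector network. Purely as an illustrative aside, not something taken from the references above, a minimal sketch of the idea using scikit-learn's SVC might look like the following; the toy dataset, the RBF kernel, and the value of C are assumptions chosen only for demonstration.

```python
# Minimal sketch of a soft-margin support-vector classifier in the spirit of
# Cortes & Vapnik (1995). The data, kernel, and C value are illustrative
# assumptions, not taken from the paper or from this article.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two-class toy data standing in for a pattern-recognition task.
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C controls the soft-margin trade-off between margin width and training errors.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

print("support vectors per class:", clf.n_support_)
print("test accuracy:", clf.score(X_test, y_test))
```

Increasing C penalizes margin violations more heavily, at the risk of overfitting; decreasing it widens the margin and usually increases the number of support vectors.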

Reference materials

Fundamentals for Data Scientists (2)
https://qiita.com/kaizen_nagoya/items/8b2f27353a9980bf445c

Iwanami Dictionary of Mathematics: two editions on one CD, a bargain
https://qiita.com/kaizen_nagoya/items/1210940fe2121423d777

Iwanami Dictionary of Mathematics
https://qiita.com/kaizen_nagoya/items/b37bfd303658cb5ee11e

Ann's Room (mathematics learned from people's names: Iwanami Dictionary of Mathematics), English (24)
https://qiita.com/kaizen_nagoya/items/e02cbe23b96d5fb96aa1

<This article is a personal opinion based on my own past experience. It is unrelated to the organization I currently belong to and to my work there.>

Thank you very much for reading to the end.

Please press the like button 💚 and follow me.
