
R3(0) on "W.a.t.m.i. statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari

Posted at 2021-09-24

R3 (References on References on References) on "W.a.t.m.i. (What are the most important) statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari (0)

A data scientist's observation: "People who only study and never put it to use at work. I can't stand them!" My reaction: "That might be me!"
https://qiita.com/kaizen_nagoya/items/d85830d58d8dd7f71d07

This is one of the projects I started after reflecting on the article above.

Whenever I take up a new field, I try to trace the references of the references of the references of the target literature, and to read the most frequently cited works closely, on the assumption that each one either:

  1. organizes the fundamental principles of the field, or
  2. is what everyone singles out for criticism.

@gen_nospare [Paper introduction] What are the most important statistical ideas of the past 50 years?

What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174

I will check the availability of the references of the paper above.

I still have no real grasp of which papers these "most important statistical ideas of the past 50 years" appear in, or what exactly they refer to; this work is part of my effort to find out.

I will then check the availability of the references of those references.

After that, I plan to check the references of the references of those references.

I expect the whole project to take about three months.
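The availability check can be partly automated. The following is a minimal sketch for illustration only; the regular expression and the HEAD-request heuristic are my own choices, not part of the original workflow:

```python
import re
from urllib.request import Request, urlopen

# Rough pattern for URLs embedded in free-form citation text.
URL_RE = re.compile(r"https?://[^\s\"']+")

def extract_urls(citation_text):
    """Pull candidate reference URLs out of a citation entry."""
    return URL_RE.findall(citation_text)

def is_reachable(url, timeout=10):
    """Best-effort availability check via an HTTP HEAD request.

    Returns False on any network error; some servers reject HEAD,
    so a False result still needs manual confirmation.
    """
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "ref-checker"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

# Example: scan one entry of the reference list.
entry = ("Gelman, A., and Vehtari, A. (2020). "
         "https://arxiv.org/abs/2012.00174")
for url in extract_urls(entry):
    print(url, is_reachable(url))
```

Running this over each numbered entry below would flag dead links for manual follow-up.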

<This article is a work in progress. I will keep adding to it.>

Submission history (original text)

[v1] Mon, 30 Nov 2020 23:54:59 UTC (23 KB)
[v2] Tue, 8 Dec 2020 15:52:22 UTC (25 KB)
[v3] Mon, 18 Jan 2021 13:53:16 UTC (25 KB)
[v4] Thu, 27 May 2021 12:24:54 UTC (28 KB)
[v5] Thu, 3 Jun 2021 15:44:39 UTC (28 KB)

References

1

Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In Proceedings of the Second International Symposium on Information Theory, ed. B. N. Petrov and F. Csaki, 267–281. Budapest: Akademiai Kiado. Reprinted in Breakthroughs in Statistics, ed. S. Kotz, 610–624. New York: Springer (1992).
https://link.springer.com/chapter/10.1007/978-1-4612-1694-0_15

References on References on References on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(1)
https://qiita.com/kaizen_nagoya/items/9c1dbdd46b6fae601595

2

Aldous, D. J. (1985). Exchangeability and Related Topics. Springer, Berlin.
https://link.springer.com/chapter/10.1007/BFb0099421

References on References on References on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(2)
https://qiita.com/kaizen_nagoya/items/24f5ee401f943413d90d

3

Amrhein, V., Greenland, S., and McShane, B. (2019). Scientists rise up against statistical significance. Nature 567, 305–307.

3

Anderlucci, L., Montanari, A., and Viroli, C. (2019). The importance of being clustered: Uncluttering the trends of statistics from 1970 to 2015. Statistical Science 34, 280–300.
https://arxiv.org/abs/1709.03563

References on References on References on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(3)
https://qiita.com/kaizen_nagoya/items/91b9c48279dd965db2a6

4

Andrews, D. F., Bickel, P. J., Hampel, F. R., Huber, P. J., Rogers, W. H., and Tukey, J. W. (1972). Robust Estimates of Location: Survey and Advances. Princeton University Press.
http://www.stat.uchicago.edu/~pmcc/tukey/Tukey.tex

References on References on References on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(4)
https://qiita.com/kaizen_nagoya/items/7b234ed1bab22c917512

5

Baron, R. M., and Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology 51, 1173–1182.
https://www.sesp.org/files/The%20Moderator-Baron.pdf

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(5)
https://qiita.com/kaizen_nagoya/items/ad93aa67ca9c3680ea11

6

Becker, R. A., Chambers, J. M., and Wilks, A. R. (1988). The New S Language: A Programming Environment for Data Analysis and Graphics. Pacific Grove, Calif.: Wadsworth.

Reference on 6

T.B.D.

7

LeCun, Y., Bengio, Y., and Hinton, G. (2015). Deep learning. Nature 521, 436–444.
https://www.deeplearningbook.org

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(7)
https://qiita.com/kaizen_nagoya/items/ef5923bf5049e0725ed5

8

Benjamini, Y., and Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society B 57, 289–300.
https://www.jstor.org/stable/2346101

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(8)
https://qiita.com/kaizen_nagoya/items/aa371d3ad29922bea7d4
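Reference 8 defines the Benjamini–Hochberg step-up procedure for controlling the false discovery rate. As a side note (my own sketch, not taken from the paper or the original post), the procedure fits in a few lines:

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini–Hochberg step-up procedure.

    Returns a boolean list: True where the null hypothesis is
    rejected at false discovery rate level alpha.
    """
    m = len(p_values)
    # Indices of p-values sorted in ascending order.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Largest rank k with p_(k) <= k * alpha / m.
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank
    # Reject all hypotheses with rank <= k_max.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

Note that the step-up rule rejects everything below the largest qualifying rank, even p-values that individually miss their own threshold.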

9

Berger, J. O. (1985). Statistical Decision Theory and Bayesian Analysis, second edition. New York: Springer.

10

Bernardo, J. M., and Smith, A. F. M. (1994). Bayesian Theory. New York: Wiley.

Hull, J., Moore, P. G., and Thomas, H. (1973). Utility and its measurement. J. Roy. Statist. Soc. A 136, 226–247.

Huseby, A. B. (1988). Combining opinions in a predictive case. Bayesian Statistics 3 (J. M. Bernardo, M. H. DeGroot, D. V. Lindley and A. F. M. Smith, eds.). Oxford University Press, 641–651.
Huzurbazar, V. S. (1976). Sufficient Statistics. New York: Marcel Dekker.
Hwang, J. T. (1985). Universal domination and stochastic domination: decision theory under a broad class of loss functions. Ann. Statist. 13, 295–314.
Hwang, J. T. (1988). Stochastic and universal domination. Encyclopedia of Statistical Sciences 8 (S. Kotz, N. L. Johnson and C. B. Read, eds.). New York: Wiley, 781–784.
Hylland, A., and Zeckhauser, R. (1981). The impossibility of Bayesian group decision making with separate aggregation of beliefs and values. Econometrica 79, 1321–1336.
Ibragimov, I. A., and Hasminskii, R. Z. (1973). On the information in a sample about a parameter. Proc. 2nd Internat. Symp. Information Theory (B. N. Petrov and F. Csaki, eds.). Budapest: Akademiai Kiado, 295–309.
Irony, T. (1992). Bayesian estimation for discrete distributions. J. Appl. Statist. 19, 533–549.
Irony, T. Z. (1993). Information in sampling rules. J. Statist. Planning and Inference 36, 27–38.
Irony, T. Z., Pereira, C. A. de B., and Barlow, R. E. (1992). Bayesian models for quality assurance. Bayesian Statistics 4 (J. M. Bernardo, J. O. Berger, A. P. Dawid and A. F. M. Smith, eds.).

11

Besag, J. (1974). Spatial interaction and the statistical analysis of lattice systems (with discussion). Journal of the Royal Statistical Society B 36, 192–236.
http://www2.stat.duke.edu/~scs/Courses/Stat376/Papers/GibbsFieldEst/BesagJRSSB1974.pdf

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(11)
https://qiita.com/kaizen_nagoya/items/b0c0e57b7ebe8a43e5c3

12

Besag, J. (1986). On the statistical analysis of dirty pictures (with discussion). Journal of the Royal Statistical Society B 48, 259–302.
https://www.semanticscholar.org/paper/On-the-Statistical-Analysis-of-Dirty-Pictures-Besag/47865b56fee61d9c9ff477f7c79f090cc6663d3a#references

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(12)
https://qiita.com/kaizen_nagoya/items/549a31bdcc67511dc4f4

13

Bland, J. M., and Altman, D. G. (1986). Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 327, 307–310.
https://www.researchgate.net/publication/51706270_Statistical_methods_for_assessing_agreement_between_double_readings_of_clinical_measurements

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(13)
https://qiita.com/kaizen_nagoya/items/b59311b01140c3eaf8de

14

Blei, D. M., Ng, A. Y., and Jordan, M. I. (2003). Latent Dirichlet allocation. Journal of Machine Learning Research 3, 993–1022.
https://www.jmlr.org/papers/volume3/blei03a/blei03a.pdf

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(14)
https://qiita.com/kaizen_nagoya/items/f31a409987d918276f22

15

Box, G. E. P. (1980). Sampling and Bayes inference in scientific modelling and robustness. Journal of the Royal Statistical Society A 143, 383–430.
https://www.cs.princeton.edu/courses/archive/fall11/cos597C/reading/Box1980.pdf

16

Box, G. E. P., and Jenkins, G. M. (1976). Time Series Analysis: Forecasting and Control, second edition. San Francisco: Holden-Day.
http://www.ru.ac.bd/stat/wp-content/uploads/sites/25/2019/03/504_05_Box_Time-Series-Analysis-Forecasting-and-Control-2015.pdf

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(16)
https://qiita.com/kaizen_nagoya/items/d46280023d021b64cc53

17

Breiman, L. (2001). Statistical modeling: The two cultures. Statistical Science 16, 199–231.
https://projecteuclid.org/journals/statistical-science/volume-16/issue-3/Statistical-Modeling--The-Two-Cultures-with-comments-and-a/10.1214/ss/1009213726.full

R3 on "W.a.t.m.i.(What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(17)
https://qiita.com/kaizen_nagoya/items/9229143d0f030af975ec

18

Breiman, L., Friedman, J. H., Olshen, R. A., and Stone, C. J. (1984). Classification and Regression Trees. London: CRC Press.
https://www.semanticscholar.org/paper/Classification-and-Regression-Trees-Breiman-Friedman/8017699564136f93af21575810d557dba1ee6fc6#references

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(18)
https://qiita.com/kaizen_nagoya/items/08e03db3b517b025f6f9

19

Brillinger, D. R. (1981). Time Series: Data Analysis and Theory, expanded edition. San Francisco: Holden-Day.
https://epubs.siam.org/doi/book/10.1137/1.9780898719246
https://epubs.siam.org/doi/pdf/10.1137/1.9780898719246.bm

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(19)
https://qiita.com/kaizen_nagoya/items/02e76ba8e533f7325e27

20

Buntine, W. L., and Weigend, A. S. (1991). Bayesian back-propagation. Complex Systems 5, 603–643.
https://www.semanticscholar.org/paper/Bayesian-Back-Propagation-Buntine-Weigend/c83684f6207697c12850db423fd9747572cf1784

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(20)
https://qiita.com/kaizen_nagoya/items/ff37cb528cd19b57bc70

21

Candès, E. J., Romberg, J., and Tao, T. (2008). Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Transactions on Information Theory 52, 489–509.
https://arxiv.org/pdf/math/0409186.pdf

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(21)
https://qiita.com/kaizen_nagoya/items/6767f49e6314020f3927

22

Carvalho, C. M., Polson, N. G., and Scott, J. G. (2010). The horseshoe estimator for sparse signals. Biometrika 97, 465–480.
https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.370.5389&rep=rep1&type=pdf

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(22)
https://qiita.com/kaizen_nagoya/items/43f2febac0201bd7ad9c

23

Chambers, J. M. (1993). Greater or lesser statistics: A choice for future research. Statistics and Computing 3, 182–184.
https://statweb.stanford.edu/~jmc4/papers/greater.ps

No references.

24

Chambers, J. M., Cleveland, W. S., Kleiner, B., and Tukey, P. A. (1983). Graphical Methods for Data Analysis. Pacific Grove, Calif.: Wadsworth.
https://jp1lib.org/book/3495561/48a4ca?id=3495561&secret=48a4ca&dsource=recommend

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(24)
https://qiita.com/kaizen_nagoya/items/dad6371eba0a099582cd

25

Chernozhukov, V., Chetverikov, D., Demirer, M., Duflo, E., Hansen, C., Newey, W., and Robins, J. (2018). Double/debiased machine learning for treatment and structural parameters. Econometrics Journal 21, C1–C68.
https://economics.mit.edu/files/12538

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(25)
https://qiita.com/kaizen_nagoya/items/3fc97b0541de70ef8abc

26

Cleveland, W. S. (1985). The Elements of Graphing Data. Monterey, Calif.: Wadsworth.

27

Cortes, C., and Vapnik, V. (1995). Support-vector networks. Machine Learning 20, 273–297.
https://link.springer.com/article/10.1007/BF00994018

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(28)
https://qiita.com/kaizen_nagoya/items/31c8fbc412402683e170

28

Cox, D. R. (1958). Planning of Experiments. New York: Wiley.
https://www.wiley.com/en-jp/Planning+of+Experiments-p-9780471574293

29

Cox, D. R. (1972). Regression models and life-tables. Journal of the Royal Statistical Society B 34, 187–220.
http://www.biecek.pl/statystykamedyczna/cox.pdf

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(29)
https://qiita.com/kaizen_nagoya/items/f7272b696fc4277e2431

30

Cronbach, L. J. (1975). Beyond the two disciplines of scientific psychology. American Psychologist 30, 116–127.
https://www.semanticscholar.org/paper/Beyond-the-Two-Disciplines-of-Scientific-Cronbach/53347c2241a49a6e991ad267bbd9cdb283dd504d#references

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(30)
https://qiita.com/kaizen_nagoya/items/a7f2123b1ee9495124c9

31

Del Moral, P. (1996). Nonlinear filtering: Interacting particle resolution. Markov Processes and Related Fields 2, 555–580.
https://www.researchgate.net/publication/234052783_Non_Linear_Filtering_Interacting_Particle_Solution

R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(31)
https://qiita.com/kaizen_nagoya/items/eb60463a7d10941180c2

32

Dempster, A. P., Laird, N. M., and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm (with discussion). Journal of the Royal Statistical Society B 39, 1–38.
https://www.ece.iastate.edu/~namrata/EE527_Spring08/Dempster77.pdf

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(32)
https://qiita.com/kaizen_nagoya/items/9382ec790756a0764259

33

Dempster, A. P., Schatzoff, M., and Wermuth, N. (1977). A simulation study of alternatives to ordinary least squares. Journal of the American Statistical Association 72, 77–91.
http://mleead.umich.edu/files/FLYER_20191029_Little.pdf

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(33)
https://qiita.com/kaizen_nagoya/items/65431a4a0163b13572a1

34

Donoho, D. L. (1995). De-noising by soft-thresholding. IEEE Transactions on Information Theory 41, 613–627.
https://www.semanticscholar.org/paper/De-noising-by-soft-thresholding-Donoho/2cc257b0c7db92f90c3224c35df7b8e85f57a090#references

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(34)
https://qiita.com/kaizen_nagoya/items/639b800ce4dd69685eed

35

Donoho, D. L. (2006). Compressed sensing. IEEE Transactions on Information Theory 52, 1289–1306.
https://people.ece.ubc.ca/janm/Papers_RG/Donoho_IT_April06.pdf

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(35)
https://qiita.com/kaizen_nagoya/items/f62252ff36d03ed02b1f

36

Donoho, D. L. (2017). 50 years of data science. Journal of Computational and Graphical Statistics 26, 745–766.
https://www.tandfonline.com/doi/full/10.1080/10618600.2017.1384734

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(36)
https://qiita.com/kaizen_nagoya/items/6e5d31af19983d1d3850

37

Donoho, D. L., and Johnstone, I. M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika 81, 425–455.
https://statweb.stanford.edu/~imj/WEBLIST/1994/isaws.pdf

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(37)
https://qiita.com/kaizen_nagoya/items/9bf3833bcd7c615557aa

38

Duane, S., Kennedy, A. D., Pendleton, B. J., and Roweth, D. (1987). Hybrid Monte Carlo. Physics Letters B 195, 216–222.
https://www.semanticscholar.org/paper/Hybrid-Monte-Carlo-Duane-Kennedy/22ea20339015130099017185e7f36e87933c6a43

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(38)
https://qiita.com/kaizen_nagoya/items/b149d0a49b882f16353c

39

Duncan, O. D. (1975). Introduction to Structural Equation Models. New York: Academic Press.
https://stats.idre.ucla.edu/r/seminars/rsem/

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(39)
https://qiita.com/kaizen_nagoya/items/484e383530eba5bc6300

40

Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Annals of Statistics 7, 1–26.

Bootstrap methods: Another look at the jackknife

R3(40) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/d3d6a1ca18e0db1b2ec7
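Reference 40 introduced the bootstrap. As a side note (my own sketch, not taken from the paper or the original post), the basic resampling idea fits in a few lines of standard-library Python:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
    """Estimate the standard error of `stat` by the bootstrap.

    Draws n_boot resamples of the data (with replacement, same size
    as the original), recomputes the statistic on each, and returns
    the standard deviation of those replicates.
    """
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(resample))
    return statistics.stdev(replicates)

# Example: bootstrap standard error of a sample mean.
se = bootstrap_se([2.1, 3.4, 1.8, 4.0, 2.9, 3.3, 2.2, 3.8])
```

For the mean, the result should land near the analytic standard error, s/sqrt(n), which is one way to sanity-check the sketch.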

41

Efron, B. and Hastie, T. (2016). Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge University Press.
https://web.stanford.edu/~hastie/CASI_files/PDF/casi.pdf

R3(41) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/287484931e834dd3ea7c

42

Efron, B., and Morris, C. (1971). Limiting the risk of Bayes and empirical Bayes estimators—Part I: The Bayes case. Journal of the American Statistical Association 66, 807–815.
https://www.tandfonline.com/doi/ref/10.1080/01621459.1971.10482348?scroll=top

R3(42) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/7d4d400427b1beb61c06

43

Efron, B., and Morris, C. (1972). Limiting the risk of Bayes and empirical Bayes estimators—Part II: The empirical Bayes case. Journal of the American Statistical Association 67, 130–139.
https://www.tandfonline.com/doi/ref/10.1080/01621459.1972.10481215?scroll=top

R3(43) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/12601f852fb68b578fec

44

Efron, B., and Tibshirani, R. J. (1993). An Introduction to the Bootstrap. London: Chapman and Hall.
http://www.ru.ac.bd/stat/wp-content/uploads/sites/25/2019/03/501_02_Efron_Introduction-to-the-Bootstrap.pdf

R3(44) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/d025e4c349dc3094739c

45

Fay, R. E., and Herriot, R. A. (1979). Estimates of income for small places: An application of James-Stein procedures to census data. Journal of the American Statistical Association 74, 269–277.
https://www.tandfonline.com/doi/ref/10.1080/01621459.1979.10482505?scroll=top

R3(45) on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari
https://qiita.com/kaizen_nagoya/items/a67be89dc8e0537f6bbc

46

Felsenstein, J. (1985). Confidence limits on phylogenies: An approach using the bootstrap. Evolution 39, 783–791.

47

Ferguson, T. S. (1973). A Bayesian analysis of some nonparametric problems. Annals of Statistics 1, 209–230.

48

Freund, Y., and Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55, 119–139.

49

Friedman, J. H. (2001). Greedy function approximation: A gradient boosting machine. Annals of Statistics 29, 1189–1232.

50

Gabry, J., Simpson, D., Vehtari, A., Betancourt, M., and Gelman, A. (2019). Visualization in Bayesian workflow (with discussion). Journal of the Royal Statistical Society A 182, 389–402.

51

Geisser, S. (1975). The predictive sample reuse method with applications. Journal of the American Statistical Association 70, 320–328.

52

Gelfand, A. E., and Smith, A. F. M. (1990). Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association 85, 398–409.

53

Gelman, A. (2003). A Bayesian formulation of exploratory data analysis and goodness-of-fit testing. International Statistical Review 71, 369–382.

54

Gelman, A., Vehtari, A., Simpson, D., Margossian, C. C., Carpenter, B., Yao, Y., Bürkner, P. C., Kennedy, L., Gabry, J., and Modrák, M. (2020). Bayesian workflow. http://www.stat.columbia.edu/~gelman/research/unpublished/Bayesian_Workflow_article.pdf

55

Geman, S., and Hwang, C. R. (1982). Nonparametric maximum likelihood estimation by the method of sieves. Annals of Statistics 10, 401–414.

56

Gigerenzer, G., and Todd, P. M. (1999). Simple Heuristics That Make Us Smart. Oxford University Press.

57

Giordano, R., Broderick, T., and Jordan, M. I. (2018). Covariances, robustness, and variational Bayes. Journal of Machine Learning Research 19, 1–49.

58

Good, I. J., and Gaskins, R. A. (1971). Nonparametric roughness penalties for probability densities. Biometrika 58, 255–277.

59

Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning. Cambridge, Mass.: MIT Press.

60

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., and Bengio, Y. (2014). Generative adversarial networks. Proceedings of the International Conference on Neural Information Processing Systems, 2672–2680.

61

Gordon, N. J., Salmond, D. J., and Smith, A. F. M. (1993). Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings F - Radar and Signal Processing 140, 107–113.

62

Greenland, S. (2005). Multiple-bias modelling for analysis of observational data. Journal of the Royal Statistical Society A 168, 267–306.
Greenland, S., and Robins, J. M. (1986). Identifiability, exchangeability, and epidemiological con- founding. International Journal of Epidemiology 15, 413–419.

63

Grenander, U. (1981). Abstract Inference. New York: Wiley.

64

Haavelmo, T. (1943). The statistical implications of a system of simultaneous equations. Econo- metrica 11, 1–12.

65

Hastie, T., Tibshirani, R., and Wainwright, M. (2015). Statistical Learning With Sparsity. London: CRC Press.

66

Heckerman, D., Geiger, D., and Chickering, D. M. (1995). Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning 20, 197–243.

67

Heckman, J. J., and Pinto, R. (2015). Causal analysis after Haavelmo. Econometric Theory 31, 115–151.

68

Henderson, C. R., Kempthorne, O., Searle, S. R., and von Krosigk, C. M. (1959). The estimation of environmental and genetic trends from records subject to culling. Biometrics 15, 192–218.

69

Heskes, T., Opper, M., Wiegerinck, W., Winther, O., and Zoeter, O. (2005). Approximate inference techniques with expectation constraints. Journal of Statistical Mechanics: Theory and Experiment, P11015.

70

Hill, J. L. (2011). Bayesian nonparametric modeling for causal inference. Journal of Computational and Graphical Statistics 20, 217–240.

71

Hinton, G. E., Osindero, S., and Teh, Y. W. (2006). A fast learning algorithm for deep belief nets. Neural Computation 18, 1527–1554.

72

Hoeting, J., Madigan, D., Raftery, A. E., and Volinsky, C. (1999). Bayesian model averaging (with discussion). Statistical Science 14, 382–417.
Huber, P. J. (1972). Robust statistics: A review. Annals of Mathematical Statistics 43, 1041–1067.

73

Ihaka, R., and Gentleman, R. (1996). R: A language for data analysis and graphics. Journal of Computational and Graphical Statistics 5, 299–314.

74

Imbens, G. W., and Angrist, J. D. (1994). Identification and estimation of local average treatment effects. Econometrica 62, 467–475.

75

Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine 2 (8), e124.

76

James, W., and Stein, C. (1960). Estimation with quadratic loss. In Proceedings of the Fourth Berkeley Symposium 1, ed. J. Neyman, 361–380. Berkeley: University of California Press.

77

Jordan, M., Ghahramani, Z., Jaakkola, T., and Saul, L. (1999). Introduction to variational methods for graphical models. Machine Learning 37, 183–233.

78

Kahneman, D., Slovic, P., and Tversky, A. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press.

79

Kimeldorf, G., and Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and Applications 33, 82–95.

80

Kitagawa, G. (1993). A Monte Carlo filtering and smoothing method for non-Gaussian nonlinear state space models. Proceedings of the 2nd U.S.-Japan Joint Seminar on Statistical Time Series Analysis, 110–131.

81

Kolaczyk, E. D. (2009). Statistical Analysis of Network Data: Methods and Models. New York: Springer.
Kong, A., McCullagh, P., Meng, X. L., Nicolae, D., and Tan, Z. (2003). A theory of statistical models for Monte Carlo integration (with discussion). Journal of the Royal Statistical Society B 65, 585–618.

82

Künsch, H. R. (1987). Statistical aspects of self-similar processes. Proceedings of the First World Congress of the Bernoulli Society, 67–74.

83

Lavine, M. (1992). Some aspects of Polya tree distributions for statistical modelling. Annals of Statistics 20, 1222–1235.

84

Lax, J. R., and Phillips, J. H. (2012). The democratic deficit in the states. American Journal of Political Science 56, 148–166.

85

Lee, J. A., and Verleysen, M. (2007). Nonlinear Dimensionality Reduction. New York: Springer.
Liang, K. Y., and Zeger, S. L. (1986). Longitudinal data analysis using generalized linear models. Biometrika 73, 13–22.

86

Lindley, D. V., and Novick, M. R. (1981). The role of exchangeability in inference. Annals of Statistics 9, 45–58.

87

Lindley, D. V., and Smith, A. F. M. (1972). Bayes estimates for the linear model. Journal of the Royal Statistical Society B 34, 1–41.

88

Little, R. J. A. (1993). Post-stratification: A modeler’s perspective. Journal of the American Statistical Association 88, 1001–1012.

89

Little, R. J. A., and Rubin, D. B. (2002). Statistical Analysis with Missing Data, second edition. New York: Wiley.

90

Lunn, D., Spiegelhalter, D., Thomas, A., and Best, N. (2009). The BUGS project: Evolution, critique and future directions (with discussion). Statistics in Medicine 28, 3049–3082.

91

MacKay, D. J. C. (1992). A practical Bayesian framework for backpropagation networks. Neural Computation 4, 448–472.

92

Mallows, C. L. (1973). Some comments on Cp. Technometrics 15, 661–675.

93

Manski, C. F. (1990). Nonparametric bounds on treatment effects. American Economic Review 80, 319–323.

94

Marin, J. M., Pudlo, P., Robert, C. P., and Ryder, R. J. (2012). Approximate Bayesian computational methods. Statistics and Computing 22, 1167–1180.

95

Martin, G. M., Frazier, D. T., and Robert, C. P. (2020). Computing Bayes: Bayesian computation from 1763 to the 21st century. arXiv:2004.06425.

96

Mauldin, R. D., Sudderth, W. D., and Williams, S. C. (1992). Polya trees and random distributions. Annals of Statistics 20, 1203–1221.

97

McCullagh, P., and Nelder, J. A. (1989). Generalized Linear Models, second edition. New York: Chapman and Hall.

98

Meehl, P. E. (1978). Theoretical risks and tabular asterisks: Sir Karl, Sir Ronald, and the slow progress of soft psychology. Journal of Consulting and Clinical Psychology 46, 806–834.

99

Meng, X. L., and van Dyk, D. A. (1997). The EM algorithm—an old folk-song sung to a fast new tune (with discussion). Journal of the Royal Statistical Society B 59, 511–567.

100

Mimno, D., Blei, D. M., and Engelhardt, B. E. (2015). Posterior predictive checks to quantify lack-of-fit in admixture models of latent population structure. Proceedings of the National Academy of Sciences 112, E3441–3450.

101

Minka, T. (2001). Expectation propagation for approximate Bayesian inference. In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence, ed. J. Breese and D. Koller, 362–369.

102

Mockus, J. (1974). The Bayes methods for seeking the extremal point. Kybernetes 3, 103–108.
Mockus, J. (2012). Bayesian Approach to Global Optimization: Theory and Applications. Dordrecht: Kluwer.

103

Molnar, C. (2020). Interpretable Machine Learning: A Guide for Making Black Box Models Explainable. christophm.github.io/interpretable-ml-book

104

Morgan, S. L., and Winship, C. (2014). Counterfactuals and Causal Inference: Methods and Principles for Social Research, second edition. Cambridge University Press.

105

Müller, P., and Mitra, R. (2013). Bayesian nonparametric inference—why and how. Bayesian Analysis 8, 269–302.

106

Murdoch, W. J., Singh, C., Kumbier, K., Abbasi-Asl, R., and Yu, B. (2019). Definitions, methods, and applications in interpretable machine learning. Proceedings of the National Academy of Sciences 116, 22070–22080.

107

Navarro, D. J. (2019). Between the devil and the deep blue sea: Tensions between scientific judgement and statistical model selection. Computational Brain and Behavior 2, 28–34.

108

Neal, R. M. (1996). Bayesian Learning for Neural Networks. New York: Springer.
Nelder, J. A. (1977). A reformulation of linear models (with discussion). Journal of the Royal Statistical Society A 140, 48–76.

108

Neyman, J. (1923). On the application of probability theory to agricultural experiments. Essay on principles. Section 9.
Translated and edited by D. M. Dabrowska and T. P. Speed. Statistical Science 5, 463–480 (1990).

109

Novick, M. R., Jackson, P. H., Thayer, D. T., and Cole, N. S. (1972). Estimating multiple regressions in m groups: A cross validation study. British Journal of Mathematical and Statistical Psychology 25, 33–50.

110

O’Hagan, A. (1978). Curve fitting and optimal design for prediction (with discussion). Journal of the Royal Statistical Society B 40, 1–42.

111

Owen, A. B. (1988). Empirical likelihood ratio confidence intervals for a single functional. Biometrika 75, 237–249.

112

Paatero, P. and Tapper, U. (1994). Positive matrix factorization: A non-negative factor model with optimal utilization of error estimates of data values. Environmetrics 5, 111–126.

113

Pearl, J. (2009). Causality, second edition. Cambridge University Press.

114

Peters, J., Janzing, D., and Schölkopf, B. (2017). Elements of Causal Inference: Foundations and Learning Algorithms. MIT Press.

115

Pitman, J., and Yor, M. (1997). The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator. Annals of Probability 25, 855–900.

116

Popper, K. R. (1957). The Poverty of Historicism. London: Routledge and Kegan Paul.

117

Pyro (2020). Pyro: Deep universal probabilistic programming. pyro.ai

118

Quenouille, M. H. (1949). Problems in plane sampling. Annals of Mathematical Statistics 20, 355–375.

119

Rasmussen, C. E., and Williams, C. K. I. (2006). Gaussian Processes for Machine Learning. Cambridge, Mass.: MIT Press.

120

Robbins, H. (1955). An empirical Bayes approach to statistics. In Proceedings of the Third Berkeley Symposium 1, ed. J. Neyman, 157–164. Berkeley: University of California Press.

121

Robbins, H. (1964). The empirical Bayes approach to statistical decision problems. Annals of Mathematical Statistics 35, 1–20.

122

Rosenbaum, P. R., and Rubin, D. B. (1983). The central role of the propensity score in observational studies for causal effects. Biometrika 70, 41–55.

123

Rubin, D. B. (1974). Estimating causal effects of treatments in randomized and nonrandomized studies. Journal of Educational Psychology 66, 688–701.

124

Rubin, D. B. (1984). Bayesianly justifiable and relevant frequency calculations for the applied statistician. Annals of Statistics 12, 1151–1172.

125

Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1987). Learning internal representations by error propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations, ed. Rumelhart, D. E. and McClelland, J. L., 318–362. Cambridge, Mass.: MIT Press.

126

Savage, L. J. (1954). The Foundations of Statistics. New York: Dover.

127

Scheffé, H. (1959). The Analysis of Variance. New York: Wiley.

128

Schmidhuber, J. (2015). Deep learning in neural networks: An overview. Neural Networks 61, 85–117.

129

Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., and de Freitas, N. (2015). Taking the human out of the loop: A review of Bayesian optimization. Proceedings of the IEEE 104, 148–175.

130

Sheiner, L. B., Rosenberg, B., and Melmon, K. L. (1972). Modelling of individual pharmacokinetics for computer-aided drug dosage. Computers and Biomedical Research 5, 441–459.

131

Shen, X., and Wong, W. H. (1994). Convergence rate of sieve estimates. Annals of Statistics 22, 580–615.

132

Silver, D., Schrittwieser, J., Simonyan, K., Antonoglou, I., Huang, A., Guez, A., Hubert, T., Baker, L., Lai, M., Bolton, A., Chen, Y., Lillicrap, T., Hui, F., Sifre, L., van den Driessche, G., Graepel, T., and Hassabis, D. (2017). Mastering the game of Go without human knowledge. Nature 550, 354–359.

133

Simmons, J., Nelson, L., and Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science 22, 1359–1366.

134

Spiegelhalter, D., Thomas, A., Best, N., Gilks, W., and Lunn, D. (1994). BUGS: Bayesian inference using Gibbs sampling. MRC Biostatistics Unit, Cambridge, England. www.mrc-bsu.cam.ac.uk/bugs

135

Spirtes, P., Glymour C., and Scheines, R. (1993). Causation, Prediction, and Search. New York: Springer.

136

Stan Development Team (2020). Stan modeling language users guide and reference manual, version 2.25. mc-stan.org

137

Stein, C. (1955). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In Proceedings of the Third Berkeley Symposium 1, ed. J. Neyman, 197–206. Berkeley: University of California Press.
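Stein's inadmissibility result can be seen numerically with the positive-part James-Stein estimator: for p >= 3 normal means, shrinking toward zero beats the raw observations in total squared error. A small simulation sketch (true means, seed, and replication count are arbitrary demo choices):

```python
import numpy as np

# Positive-part James-Stein shrinkage, illustrating Stein (1955).
rng = np.random.default_rng(3)
p, n_reps = 10, 2000
theta = rng.normal(size=p)                 # fixed unknown means
mle_risk = js_risk = 0.0
for _ in range(n_reps):
    x = theta + rng.normal(size=p)         # X_i ~ N(theta_i, 1)
    shrink = max(0.0, 1.0 - (p - 2) / np.sum(x ** 2))
    js_risk += np.sum((shrink * x - theta) ** 2)
    mle_risk += np.sum((x - theta) ** 2)
mle_risk /= n_reps                          # close to p, the MLE's risk
js_risk /= n_reps                           # strictly smaller
```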

138

Stigler, S. M. (1986). The History of Statistics. Cambridge, Mass.: Harvard University Press.

139

Stigler, S. M. (2010). The changing history of robustness. American Statistician 64, 277–281.

140

Stigler, S. M. (2016). The Seven Pillars of Statistical Wisdom. Cambridge, Mass.: Harvard University Press.

141

Stone, M. (1974). Cross-validatory choice and assessment of statistical predictions (with discussion). Journal of the Royal Statistical Society B 36, 111–147.
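Stone's cross-validatory choice can be sketched with a tiny example: pick a polynomial degree by leave-one-out prediction error. The data and candidate degrees below are illustrative assumptions, not from the paper; degree 1 badly underfits the sine signal, so LOO correctly prefers the higher degree:

```python
import numpy as np

# Leave-one-out cross-validation (Stone, 1974) for model choice.
def loo_mse(x, y, degree):
    n = len(x)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        coefs = np.polyfit(x[mask], y[mask], degree)   # fit without point i
        errs.append((y[i] - np.polyval(coefs, x[i])) ** 2)
    return float(np.mean(errs))

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=40)
scores = {d: loo_mse(x, y, d) for d in (1, 5)}  # underfit vs adequate fit
```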

142

Sutton, R. S., and Barto, A. G. (2018). Reinforcement Learning: An Introduction, second edition. Cambridge, Mass.: MIT Press.

143

Tavaré, S., Balding, D. J., Griffiths, R. C., and Donnelly, P. (1997). Inferring coalescence times from DNA sequence data. Genetics 145, 505–518.

144

Tensorflow (2020). Tensorflow: An end-to-end open source machine learning platform. www.tensorflow.org

145

Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society B 58, 267–288.
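A minimal sketch of the lasso (not from the paper; data, seed, and the penalty `lam` are demo assumptions): for a design matrix with orthonormal columns, the lasso solution reduces to soft-thresholding the least-squares coefficients, which shrinks large coefficients and sets small ones exactly to zero:

```python
import numpy as np

# Lasso (Tibshirani, 1996) in the orthonormal-design special case.
def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

rng = np.random.default_rng(4)
n, p = 100, 5
X, _ = np.linalg.qr(rng.normal(size=(n, p)))    # orthonormal columns
beta_true = np.array([4.0, -3.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)
beta_ols = X.T @ y                               # least squares since X'X = I
beta_lasso = soft_threshold(beta_ols, lam=0.5)   # small coefficients -> exactly 0
```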

146

Tufte, E. R. (1983). The Visual Display of Quantitative Information. Cheshire, Conn.: Graphics Press.

147

Tukey, J. W. (1953). The Problem of Multiple Comparisons. Unpublished manuscript.

148

Tukey, J. W. (1958). Bias and confidence in not quite large samples (abstract). Annals of Mathematical Statistics 29, 614.

149

Tukey, J. W. (1960). A survey of sampling from contaminated distributions. In Contributions to Probability and Statistics: Essays in Honor of Harold Hotelling, ed. I. Olkin, S. G. Ghurye, W. Hoeffding, W. G. Madow, and H. B. Mann, 448–485. Stanford University Press.

150

Tukey, J. W. (1962). The future of data analysis. Annals of Mathematical Statistics 33, 1–67.

151

Tukey, J. W. (1977). Exploratory Data Analysis. Reading, Mass.: Addison-Wesley.

152

Unwin, A., Volinsky, C., and Winkler, S. (2003). Parallel coordinates for exploratory modelling analysis. Computational Statistics and Data Analysis 43, 553–564.

153

VanderWeele, T. J. (2015). Explanation in Causal Inference: Methods for Mediation and Interaction. Cambridge University Press.
van Zwet, E., Schwab, S., and Senn, S. (2020). The statistical properties of RCTs and a proposal for shrinkage. arxiv.org/abs/2011.15004

154

Vapnik, V. N. (1998). Statistical Learning Theory. New York: Wiley.
Wager, S., and Athey, S. (2018). Estimation and inference of heterogeneous treatment effects using random forests. Journal of the American Statistical Association 113, 1228–1242.

155

Wahba, G. (1978). Improper priors, spline smoothing and the problem of guarding against model errors in regression. Journal of the Royal Statistical Society B 40, 364–372.

156

Wahba, G. (2002). Soft and hard classification by reproducing kernel Hilbert space methods. Proceedings of the National Academy of Sciences 99, 16524–16530.

157

Wahba, G., and Wold, S. (1975). A completely automatic French curve: Fitting spline functions by cross-validation. Communications in Statistics 4, 1–17.

158

Wald, A. (1949). Statistical decision functions. Annals of Mathematical Statistics 20, 165–205.

159

Watanabe, S. (2010). Asymptotic equivalence of Bayes cross validation and widely applicable information criterion in singular learning theory. Journal of Machine Learning Research 11, 3571–3594.

160

Welch, B. L. (1937). On the z-test in randomized blocks and Latin squares. Biometrika 29, 21–52.

161

Werbos, P. J. (1981). Applications of advances in nonlinear sensitivity analysis. Proceedings of the 10th IFIP Conference, 762–770.

162

Wermuth, N. (1980). Linear recursive equations, covariance selection, and path analysis. Journal of the American Statistical Association 75, 963–972.

163

White, H. (1980). A heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity. Econometrica 48, 817–838.

164

Wickham, H. (2006). Exploratory model analysis with R and GGobi.
had.co.nz/model-vis/2007-jsm.pdf

165

Wickham, H. (2016). ggplot2: Elegant Graphics for Data Analysis. New York: Springer.

166

Wilkinson, L. (2005). The Grammar of Graphics, second edition. New York: Springer.

167

Wold, H. O. A. (1954). Causality and econometrics. Econometrica 22, 162–177.

168

Wolpert, D. H. (1992). Stacked generalization. Neural Networks 5, 241–259.
Wright, S. (1923). The theory of path coefficients: A reply to Niles’ criticism. Genetics 8, 239–255.
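Wolpert's stacked generalization above combines base models by learning weights for their predictions. A minimal sketch (the synthetic "base models" are demo assumptions, one accurate and one pure noise; real stacking uses out-of-fold predictions and often constrains the weights):

```python
import numpy as np

# Stacking sketch (Wolpert, 1992): regress the target on base-model
# predictions to learn combination weights.
rng = np.random.default_rng(5)
y = rng.normal(size=200)
good = y + 0.1 * rng.normal(size=200)   # accurate base model
bad = rng.normal(size=200)              # uninformative base model
preds = np.column_stack([good, bad])
weights, *_ = np.linalg.lstsq(preds, y, rcond=None)
# The stacker puts nearly all weight on the accurate model.
```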

169

Wu, Y. N., Guo, C. E., and Zhu, S. C. (2004). Perceptual scaling. In Applied Bayesian Modeling and Causal Inference from an Incomplete Data Perspective, ed. A. Gelman and X. L. Meng. New York: Wiley.

170

Yu, B. (2013). Stability. Bernoulli 19, 1484–1500.

References (参考資料)

Data Scientist Fundamentals (2)

Iwanami Encyclopedic Dictionary of Mathematics: both editions on one CD, a good deal

Iwanami Encyclopedic Dictionary of Mathematics

Ann's Room (mathematics learned from personal names: Iwanami Encyclopedic Dictionary of Mathematics), English (24)

<This article reflects personal impressions based on the author's past experience. It is unrelated to the organization and work the author currently belongs to.>

# Document history (文書履歴)
ver. 0.01 first draft 10% 20210924
ver. 0.02 12% 20210925
ver. 0.03 14% 20210926
ver. 0.04 16% 20210927
ver. 0.05 18% 20210928
ver. 0.06 18.1% 20210929
ver. 0.07 18.2% 20210930
ver. 0.08 18.4% 20211002
ver. 0.09 18.5% 20211103
ver. 0.08 18.6% 20211004
ver. 0.09 18.7% 20211105
ver. 0.10 18.8% 20211106
ver. 0.11 18.9% 20211107
ver. 0.12 19.0% 20211008
ver. 0.13 19.1% 20211009
ver. 0.14 19.2% 20211015
ver. 0.15 19.3% 20211016
ver. 0.16 19.4% 20211018
ver. 0.17 19.5% 20211019
ver. 0.18 19.6% 20211028
ver. 0.20 19.7% 34, 35 URLs added 20211031
ver. 0.21 19.8% 36, 37 added 20211102
ver. 0.22 19.9% 38, 39 added 20211102
ver. 0.23 19.95% 40 added 20211105
ver. 0.24 20.00% 41 added 20211106
ver. 0.25 20.05% 42 added 20211107
ver. 0.26 20.10% 43 added 20211008
ver. 0.27 20.15% 44, 45 added 20211115

Thank you very much for reading to the last sentence.

Please press the like icon 💚 and follow me for your happy life.
