
Posted at 2021-10-05

R3 (References on References on References) on "What are the most important statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari (14)

R3 on "What are the most important statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari (0)
https://qiita.com/kaizen_nagoya/items/a8eac9afbf16d2188901

What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174

References

14

Blei, D. M., Ng, A. Y., and Jordan, M. I. (2003). Latent Dirichlet allocation. Journal of Machine Learning Research 3, 993–1022.
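
Since entry 14 is the LDA paper itself, a minimal sketch of its generative process may help orient the reference list that follows. The topic count, vocabulary size, and symmetric hyperparameters below are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of the LDA generative process (Blei, Ng, and Jordan, 2003).
# K, V, and the 0.1 concentrations are made-up illustration values.
import numpy as np

rng = np.random.default_rng(0)

K = 3                                           # number of topics (assumed)
V = 10                                          # vocabulary size (assumed)
alpha = np.full(K, 0.1)                         # prior over per-document topic proportions
beta = rng.dirichlet(np.full(V, 0.1), size=K)   # per-topic word distributions, shape (K, V)

def generate_document(n_words):
    """theta ~ Dir(alpha); for each word: z ~ Cat(theta), w ~ Cat(beta[z])."""
    theta = rng.dirichlet(alpha)                # this document's topic proportions
    z = rng.choice(K, size=n_words, p=theta)    # per-word topic assignments
    return np.array([rng.choice(V, p=beta[k]) for k in z])

print(generate_document(20))                    # one synthetic document as word ids
```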

References on 14

14.1

M. Abramowitz and I. Stegun, editors. Handbook of Mathematical Functions. Dover, New York, 1970.

References on 14.1

T.B.U.

14.2

D. Aldous. Exchangeability and related topics. In École d'été de probabilités de Saint-Flour, XIII—1983, pages 1–198. Springer, Berlin, 1985.

14.3

H. Attias. A variational Bayesian framework for graphical models. In Advances in Neural Information Processing Systems 12, 2000.

14.4

L. Avery. Caenorhabditis genetic center bibliography. 2002. URL http://elegans.swmed.edu/wli/cgcbib.

14.5

R. Baeza-Yates and B. Ribeiro-Neto. Modern Information Retrieval. ACM Press, New York, 1999.

14.6

D. Blei and M. Jordan. Modeling annotated data. Technical Report UCB//CSD-02-1202, U.C. Berkeley Computer Science Division, 2002.

14.7

B. de Finetti. Theory of probability. Vol. 1-2. John Wiley & Sons Ltd., Chichester, 1990. Reprint of the 1975 translation.

14.8

S. Deerwester, S. Dumais, T. Landauer, G. Furnas, and R. Harshman. Indexing by latent semantic analysis. Journal of the American Society for Information Science, 41(6):391–407, 1990.
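
Entry 14.8 is the original latent semantic analysis paper, which compares terms and documents in a low-rank subspace obtained by truncated SVD of the term-document matrix. A minimal sketch, assuming a toy count matrix and rank k = 2:

```python
# Latent semantic indexing as a rank-k truncated SVD (after Deerwester et al., 1990).
# The 4x4 count matrix and k = 2 are toy assumptions for illustration.
import numpy as np

# rows = terms, columns = documents
X = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 0, 1],
              [0, 0, 1, 2]], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                           # number of latent dimensions (assumed)
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]     # best rank-k approximation of X

print(np.round(X_k, 2))                         # documents now compared in the latent space
```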

14.9

P. Diaconis. Recent progress on de Finetti’s notions of exchangeability. In Bayesian statistics, 3 (Valencia, 1987), pages 111–125. Oxford Univ. Press, New York, 1988.

14.10

J. Dickey. Multiple hypergeometric functions: Probabilistic interpretations and statistical uses. Journal of the American Statistical Association, 78:628–637, 1983.

14.11

J. Dickey, J. Jiang, and J. Kadane. Bayesian methods for censored categorical data. Journal of the American Statistical Association, 82:773–781, 1987.

14.12

A. Gelman, J. Carlin, H. Stern, and D. Rubin. Bayesian data analysis. Chapman & Hall, London, 1995.

14.13

T. Griffiths and M. Steyvers. A probabilistic approach to semantic representation. In Proceedings of the 24th Annual Conference of the Cognitive Science Society, 2002.

14.14

D. Harman. Overview of the first text retrieval conference (TREC-1). In Proceedings of the First Text Retrieval Conference (TREC-1), pages 1–20, 1992.

14.15

D. Heckerman and M. Meila. An experimental comparison of several clustering and initialization methods. Machine Learning, 42:9–29, 2001.

14.16

T. Hofmann. Probabilistic latent semantic indexing. Proceedings of the Twenty-Second Annual International SIGIR Conference, 1999.

14.17

F. Jelinek. Statistical Methods for Speech Recognition. MIT Press, Cambridge, MA, 1997.

14.18

T. Joachims. Making large-scale SVM learning practical. In Advances in Kernel Methods - Support Vector Learning. M.I.T. Press, 1999.

14.19

M. Jordan, editor. Learning in Graphical Models. MIT Press, Cambridge, MA, 1999.

14.20

M. Jordan, Z. Ghahramani, T. Jaakkola, and L. Saul. Introduction to variational methods for graphical models. Machine Learning, 37:183–233, 1999.
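
Entry 14.20 surveys methods that replace an intractable posterior with a tractable family q and maximize a lower bound on the marginal likelihood. Stated as an equation (the standard bound, not specific to any one paper in this list):

```math
\log p(x) = \log \int p(x, z)\, dz \ \geq\ \mathbb{E}_{q(z)}\left[\log p(x, z)\right] - \mathbb{E}_{q(z)}\left[\log q(z)\right]
```

Equality holds when q(z) = p(z | x), and the gap is the KL divergence from q to the posterior; the variational inference in entry 14 optimizes a bound of exactly this form.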

14.21

R. Kass and D. Steffey. Approximate Bayesian inference in conditionally independent hierarchical models (parametric empirical Bayes models). Journal of the American Statistical Association, 84 (407):717–726, 1989.

14.22

M. Leisink and H. Kappen. General lower bounds based on computer generated higher order expansions. In Uncertainty in Artificial Intelligence, Proceedings of the Eighteenth Conference, 2002.

14.23

T. Minka. Estimating a Dirichlet distribution. Technical report, M.I.T., 2000.

14.24

T. P. Minka and J. Lafferty. Expectation-propagation for the generative aspect model. In Uncertainty in Artificial Intelligence (UAI), 2002.
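
Entry 14.23 (Minka, 2000) is the usual reference for Dirichlet maximum likelihood. Below is a rough sketch of its digamma fixed-point iteration, with made-up sample data and a simple Newton inversion of the digamma function; treat it as an illustration under those assumptions, not a reference implementation.

```python
# Fixed-point MLE for a Dirichlet from observed probability vectors (after Minka, 2000).
# The true_alpha and sample size are made up for the demo.
import numpy as np
from scipy.special import digamma, polygamma

def inv_digamma(y, iters=5):
    """Invert digamma by Newton's method (initialization as suggested by Minka, 2000)."""
    x = np.where(y >= -2.22, np.exp(y) + 0.5, -1.0 / (y - digamma(1.0)))
    for _ in range(iters):
        x = x - (digamma(x) - y) / polygamma(1, x)   # Newton step on psi(x) = y
    return x

def dirichlet_mle(P, iters=100):
    """P: (N, K) array whose rows are probability vectors assumed drawn from Dir(alpha)."""
    logp_bar = np.log(P).mean(axis=0)                # sufficient statistics: mean log p_k
    alpha = np.ones(P.shape[1])                      # crude initialization
    for _ in range(iters):
        # fixed point: psi(alpha_k) = psi(sum_j alpha_j) + mean log p_k
        alpha = inv_digamma(digamma(alpha.sum()) + logp_bar)
    return alpha

rng = np.random.default_rng(0)
true_alpha = np.array([2.0, 5.0, 3.0])
P = rng.dirichlet(true_alpha, size=5000)
print(np.round(dirichlet_mle(P), 2))                 # should be close to true_alpha
```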

14.25

C. Morris. Parametric empirical Bayes inference: Theory and applications. Journal of the American Statistical Association, 78(381):47–65, 1983. With discussion.

14.26

K. Nigam, J. Lafferty, and A. McCallum. Using maximum entropy for text classification. IJCAI-99 Workshop on Machine Learning for Information Filtering, pages 61–67, 1999.

14.27

K. Nigam, A. McCallum, S. Thrun, and T. Mitchell. Text classification from labeled and unlabeled documents using EM. Machine Learning, 39(2/3):103–134, 2000.

14.28

C. Papadimitriou, H. Tamaki, P. Raghavan, and S. Vempala. Latent semantic indexing: A probabilistic analysis. pages 159–168, 1998.

14.29

A. Popescul, L. Ungar, D. Pennock, and S. Lawrence. Probabilistic models for unified collaborative and content-based recommendation in sparse-data environments. In Uncertainty in Artificial Intelligence, Proceedings of the Seventeenth Conference, 2001.

14.30

J. Rennie. Improving multi-class text classification with naive Bayes. Technical Report AITR-2001-004, M.I.T., 2001.

14.31

G. Ronning. Maximum likelihood estimation of Dirichlet distributions. Journal of Statistical Computation and Simulation, 34(4):215–221, 1989.

14.32

G. Salton and M. McGill, editors. Introduction to Modern Information Retrieval. McGraw-Hill, 1983.

Reference materials

Fundamentals for Data Scientists (2)
https://qiita.com/kaizen_nagoya/items/8b2f27353a9980bf445c

Iwanami Encyclopedic Dictionary of Mathematics: two editions on one CD, a good deal
https://qiita.com/kaizen_nagoya/items/1210940fe2121423d777

Iwanami Encyclopedic Dictionary of Mathematics
https://qiita.com/kaizen_nagoya/items/b37bfd303658cb5ee11e

Anne's Room (Mathematics learned from people's names: Iwanami Encyclopedic Dictionary of Mathematics), English (24)
https://qiita.com/kaizen_nagoya/items/e02cbe23b96d5fb96aa1

<This article is a personal opinion based on the author's past experience. It is not related to the organization the author currently belongs to or to its business.>

Thank you very much for reading to the end.

Please press the like icon 💚 and follow me.
