
Posted at 2021-11-02

R3 (References on References on References) on "What are the most important statistical ideas of the past 50 years?" Andrew Gelman, Aki Vehtari (37)

R3 on "What are the most important statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(0)
https://qiita.com/kaizen_nagoya/items/a8eac9afbf16d2188901

What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174

References

37

Donoho, D. L, and Johnstone, I. M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika 81, 425–455.
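The technique at the heart of this reference is shrinking noisy empirical wavelet coefficients toward zero. As a rough illustration of the idea (not a reproduction of the paper's exact procedure), here is a minimal sketch of soft thresholding with the universal threshold σ√(2 log n), assuming the PyWavelets package (`pywt`) is available and the noise level σ is known.

```python
import numpy as np
import pywt  # PyWavelets; assumed to be installed

def soft_threshold(x, lam):
    """Soft thresholding: sign(x) * max(|x| - lam, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def wavelet_shrinkage(y, sigma, wavelet="db4", level=4):
    """Sketch of wavelet shrinkage: transform, shrink the detail
    coefficients at the universal threshold sigma * sqrt(2 log n),
    and invert the transform."""
    n = len(y)
    lam = sigma * np.sqrt(2.0 * np.log(n))
    coeffs = pywt.wavedec(y, wavelet, level=level)
    # keep the coarse approximation, soft-threshold the detail levels
    shrunk = [coeffs[0]] + [soft_threshold(c, lam) for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)[:n]

# toy example: a noisy step function
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
truth = np.where(t < 0.5, 0.0, 1.0)
noisy = truth + 0.1 * rng.standard_normal(t.size)
estimate = wavelet_shrinkage(noisy, sigma=0.1)
```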

References on 37

37.1

[1] BICKEL, P. J. (1983). Minimax estimation of a normal mean subject to doing well at a point. In Recent Advances in Statistics (M. H. Rizvi, J. S. Rustagi, and D. Siegmund, eds.), Academic Press, New York, 511–528.

References on 37.1

37.1.1

Limiting the Risk of Bayes and Empirical Bayes Estimators—Part I: The Bayes Case
B. Efron, C. Morris
Mathematics
1971
Abstract: The first part of this article considers the Bayesian problem of estimating the mean, θ, of a normal distribution when the mean itself has a normal prior. The usual Bayes estimator for this…
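For orientation, the "usual Bayes estimator" the abstract refers to is the posterior mean in the conjugate normal-normal model; the standard textbook form (not a formula quoted from the paper) is:

```math
x \mid \theta \sim N(\theta, \sigma^2), \quad \theta \sim N(\mu, \tau^2)
\;\Rightarrow\;
\hat{\theta}_{\mathrm{Bayes}} = E[\theta \mid x]
= \frac{\tau^2}{\sigma^2 + \tau^2}\, x + \frac{\sigma^2}{\sigma^2 + \tau^2}\, \mu .
```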

37.1.2

The use of Previous Experience in Reaching Statistical Decisions
J. L. Hodges, E. Lehmann
Mathematics
1952
Instead of minimizing the maximum risk, it is proposed to restrict attention to decision procedures whose maximum risk does not exceed the minimax risk by more than a given amount. Subject to this…
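The restriction described in the abstract can be written down in symbols; as a paraphrase (not a quotation from the paper), with R(θ, δ) the risk of a decision procedure δ, the restricted class is

```math
\mathcal{D}_{\varepsilon} = \Bigl\{ \delta \;:\; \sup_{\theta} R(\theta, \delta) \;\le\; \inf_{\delta'} \sup_{\theta} R(\theta, \delta') + \varepsilon \Bigr\},
```

and, roughly, the "previous experience" of the title then enters through a secondary criterion (such as a Bayes risk under a chosen prior) used to pick a procedure within this class.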

37.1.3

Robust Estimation of a Location Parameter
P. J. Huber
Mathematics
1964
This paper contains a new approach toward a theory of robust estimation; it treats in detail the asymptotic theory of estimating a location parameter for contaminated normal distributions, and…
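As a concrete illustration of the M-estimation idea that grew out of this line of work, here is a minimal sketch of a Huber-type location estimate computed by iteratively reweighted means; the tuning constant k = 1.345 and the MAD scale estimate are conventional modern choices, not values taken from the paper.

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means.
    Points farther than k * scale from the current estimate are
    down-weighted instead of entering at full weight."""
    x = np.asarray(x, dtype=float)
    scale = 1.4826 * np.median(np.abs(x - np.median(x)))  # MAD-based scale
    mu = np.median(x)
    for _ in range(max_iter):
        r = np.abs(x - mu) / scale
        w = np.minimum(1.0, k / np.maximum(r, 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            return mu_new
        mu = mu_new
    return mu

# contaminated-normal toy data: mostly N(0, 1) plus a few gross outliers
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])
print(huber_location(data))  # close to 0, unlike the raw sample mean
```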

37.1.4

On robust estimation of the location parameter
F. R. Forst
Computer Science

37.1.5

A Natural Identity for Exponential Families with Applications in Multiparameter Estimation
H. Hudson
Mathematics
1978

37.1.6

FISHER INFORMATION AND THE PITMAN ESTIMATOR OF A LOCATION PARAMETER
S. Port, C. J. Stone
Mathematics
1974

37.1.7

" Nonoptimality of Preliminary Test Estimators for the Mean of a Multivariate Normal Distribution
" Ann . Math . Statist .
1972

37.1.8

Non-Optimality of Preliminary-Test Estimators for the Mean of a Multivariate Normal Distribution
S. Sclove, C. Morris, R. Radhakrishnan
Mathematics
1972

37.1.9

Admissible Estimators, Recurrent Diffusions, and Insoluble Boundary Value Problems
L. Brown
Mathematics
1971

37.1.10

Generalized Bayes Solutions in Estimation Problems
J. Sacks
Mathematics
1963

37.2

[2] BREIMAN, L., FRIEDMAN, J.H., OLSHEN, R.A., & STONE, C.J. (1983). CART: Classification and Regression Trees. Wadsworth: Belmont, CA.

37.3

[3] BROCKMANN, M., GASSER, T., & HERRMANN, E. (1992). Locally Adaptive Bandwidth Choice for Kernel Regression Estimators. To appear, J. Amer. Statist. Assoc.

37.4

[4] BROWN, L.D. & LOW, M.G. (1993). Superefficiency and lack of adaptability in non-parametric functional estimation. To appear, Annals of Statistics.

37.5

[5] COHEN, A., DAUBECHIES, I., JAWERTH, B. & VIAL, P. (1993). Multiresolution analysis, wavelets, and fast algorithms on an interval. Comptes Rendus Acad. Sci. Paris (A), 316, 417–421.

37.6

[6] CHUI, C.K. (1992). An Introduction to Wavelets. Academic Press, Boston, MA.

37.7

[7] DAUBECHIES, I. (1988). Orthonormal bases of compactly supported wavelets. Communications in Pure and Applied Mathematics, 41, Nov. 1988, pp. 909–996.

37.8

[8] DAUBECHIES, I. (1992). Ten Lectures on Wavelets. SIAM: Philadelphia.

37.9

[9] DAUBECHIES, I. (1993). Orthonormal Bases of Compactly Supported Wavelets II: Variations on a Theme. SIAM J. Math. Anal., 24, 499–519.

37.10

[10] EFROIMOVICH, S. YU. & PINSKER, M.S. (1984). A learning algorithm for non-parametric filtering. Automat. i Telemeh., 11, 58–65 (in Russian).
37.11

[11] FRAZIER, M., JAWERTH, B., & WEISS, G. (1991). Littlewood-Paley Theory and the study of function spaces. NSF-CBMS Regional Conf. Ser. in Mathematics, 79. American Math. Soc.: Providence, RI.

37.12

[12] FRIEDMAN, J.H. & SILVERMAN, B.W. (1989). Flexible Parsimonious Smoothing and Additive Modeling (with discussion). Technometrics, 31, 3–39.

37.13

[13] FRIEDMAN, J.H. (1991). Multivariate Adaptive Regression Splines (with discussion). Annals of Statistics, 19, 1–67.

37.14

[14] GEORGE, E. I. & FOSTER, D. P. (1990). The risk inflation of variable selection in regression. Technical Report, University of Chicago.

37.15

[15] LEPSKII, O.V. (1990). On one problem of adaptive estimation in white Gaussian noise. Teor. Veroyatnost. i Primenen., 35, 459–470 (in Russian); Theory of Probability and Appl., 35, 454–466 (in English).

37.16

[16] MALGOUYRES, G. (1991). Ondelettes sur l'Intervalle: algorithmes rapides. Prépublications Mathématiques, Orsay.

37.17

[17] MEYER, Y. (1990). Ondelettes et Opérateurs: I. Ondelettes. Hermann et Cie, Paris.

37.18

[18] MEYER, Yves (1991). Ondelettes sur l'intervalle. Revista Matemática Ibero-Americana, 7 (2), 115–133.

37.19

[19] MILLER, A.J. (1984). Selection of subsets of regression variables (with discussion). J. R. Statist. Soc. A., 147, 389–425.

37.20

[20] MILLER, A.J. (1990). Subset Selection in Regression. Chapman and Hall: London, New York.

37.21

[21] MÜLLER, Hans-Georg & STADTMÜLLER, Ulrich (1987). Variable bandwidth kernel estimators of regression curves. Ann. Statist., 15 (1), 182–201.

37.22

[22] TERRELL, G.R. & SCOTT, D.W. (1992). Variable kernel density estimation. Annals of Statistics, 20, 1236–1265.

Reference materials (References)

Fundamentals for Data Scientists (2)
https://qiita.com/kaizen_nagoya/items/8b2f27353a9980bf445c

Iwanami Sugaku Jiten (Encyclopedic Dictionary of Mathematics): two editions on one CD, a good value
https://qiita.com/kaizen_nagoya/items/1210940fe2121423d777

Iwanami Sugaku Jiten (Encyclopedic Dictionary of Mathematics)
https://qiita.com/kaizen_nagoya/items/b37bfd303658cb5ee11e

Anne's Room (mathematics learned through the people behind the names: Iwanami Sugaku Jiten), English (24)
https://qiita.com/kaizen_nagoya/items/e02cbe23b96d5fb96aa1

<This article is a personal reflection based on my own past experience. It has no relation to the organization I currently belong to or to my work there.>

Thank you very much for reading to the end.

If you found this useful, please press the like icon 💚 and follow me.
