
R3 on "W.a.t.m.i. statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(22)

Last updated at Posted at 2021-10-09

R3(References on References on References) on "W.a.t.m.i. (What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(22)

R3(References on References on References) on "W.a.t.m.i. (What are the most important) statistical ideas of the past 50 years? " Andrew Gelman, Aki Vehtari(0)
https://qiita.com/kaizen_nagoya/items/a8eac9afbf16d2188901

What are the most important statistical ideas of the past 50 years?
Andrew Gelman, Aki Vehtari
https://arxiv.org/abs/2012.00174

Reference

22

Carvalho, C. M., Polson, N. G., and Scott, J. G. (2010). The horseshoe estimator for sparse signals. Biometrika 97, 465–480.
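As a quick orientation (my own sketch, not text from the paper): the horseshoe prior of Carvalho, Polson and Scott places a half-Cauchy local scale on each coefficient and a global scale $\tau$, giving heavy tails for signals and aggressive shrinkage of noise. A standard way to write the hierarchy is

```math
\beta_i \mid \lambda_i, \tau \sim \mathcal{N}(0, \lambda_i^2 \tau^2), \qquad
\lambda_i \sim \mathrm{C}^{+}(0, 1), \qquad
\tau \sim \mathrm{C}^{+}(0, 1).
```

With $\tau = \sigma = 1$, the implied shrinkage weight $\kappa_i = 1/(1+\lambda_i^2)$ follows a $\mathrm{Beta}(1/2, 1/2)$ distribution, whose horseshoe-shaped density gives the estimator its name.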

References on 22

22.1

J. Angers and J. Berger. Robust hierarchical Bayes estimation of exchangeable means. The Canadian Journal of Statistics, 19(1):39–56, 1991.

References on 22.1

22.1.1

Limiting the Risk of Bayes and Empirical Bayes Estimators—Part I: The Bayes Case
B. Efron, C. Morris
Mathematics
1971
Abstract The first part of this article considers the Bayesian problem of estimating the mean, θ, of a normal distribution when the mean itself has a normal prior. The usual Bayes estimator for this…
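For context (a standard formula, not quoted from the article): in the normal–normal setting considered here, with $x \sim \mathcal{N}(\theta, \sigma^2)$ and prior $\theta \sim \mathcal{N}(\mu, \tau^2)$, the Bayes estimator is the posterior mean, a linear shrinkage of $x$ toward $\mu$:

```math
E[\theta \mid x] = (1 - B)\,x + B\,\mu, \qquad B = \frac{\sigma^2}{\sigma^2 + \tau^2}.
```

The title refers to limiting the frequentist risk of such an estimator when the prior may be misspecified.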

22.1.2

A Robust Generalized Bayes Estimator and Confidence Region for a Multivariate Normal Mean
J. Berger
Mathematics
1980
It is observed that in selecting an alternative to the usual maximum likelihood estimator, δ⁰, of a multivariate normal mean, it is important to take into account prior information. Prior information…

22.1.3

On Truncation of Shrinkage Estimators in Simultaneous Estimation of Normal Means
D. Dey, J. Berger
Mathematics
1983
Abstract In estimating a multivariate normal mean θ = (θ1, …, θ k ) t under sum of squares error loss, it is well known that Stein estimators improve upon the usual estimator (in terms of expected…
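As a reference point (standard notation, mine): for $x \sim \mathcal{N}_p(\theta, \sigma^2 I)$ with $p \ge 3$, the James–Stein estimator shrinks the usual estimator toward the origin,

```math
\hat{\theta}_{\mathrm{JS}}(x) = \left(1 - \frac{(p-2)\,\sigma^2}{\lVert x \rVert^2}\right) x,
```

and the familiar positive-part modification, which truncates the shrinkage factor at zero, is the simplest illustration of why truncating a shrinkage estimator can help.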

22.1.4

Robust empirical bayes analyses of event rates
D. Gaver, I. O'Muircheartaigh
Mathematics
1987
A collection of I similar items generates point event histories; for example, machines experience failures or operators make mistakes. Suppose the intervals between events are modeled as iid…

22.1.5

On Outlier Rejection Phenomena in Bayes Inference
A. O'Hagan
Mathematics
1979
SUMMARY Inference is considered for a location parameter given a random sample. Outliers are not explicitly modelled, but rejection of extreme observations occurs naturally in any Bayesian analysis…

22.1.6

A bayesian approach to some outlier problems.
G. Box, G. C. Tiao
Mathematics, Medicine
Biometrika
1968
The problem of outlying observations is considered from a Bayesian viewpoint for the linear model, assuming that a good observation is normally distributed about its mean with variance σ² and a bad one is normal with the same mean but a larger variance.
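Written out (generic notation, not the paper's): the model is a two-component normal mixture in which an observation is "good" with probability $1-\varepsilon$ and "bad", with the same mean but inflated variance, with probability $\varepsilon$:

```math
y_i \sim (1-\varepsilon)\,\mathcal{N}(\mu, \sigma^2) + \varepsilon\,\mathcal{N}(\mu, k^2\sigma^2), \qquad k > 1.
```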

22.1.7

Parametric Empirical Bayes Inference: Theory and Applications
C. Morris
Mathematics
1983
Abstract This article reviews the state of multiparameter shrinkage estimators with emphasis on the empirical Bayes viewpoint, particularly in the case of parametric prior distributions. Some…

22.1.8

Development of robust Bayes estimators for a multivariate normal mean
J. Angers
Mathematics
1987

22.1.9

Estimation of the Mean of a Multivariate Normal Distribution
C. Stein
Mathematics
1981

22.1.10

Posterior expectations for large observations
A. Dawid
Mathematics
1973

22.1.11

Bayes Estimates for the Linear Model
D. Lindley, A. Smith
Mathematics
1972

22.1.12

Proper Bayes Minimax Estimators of the Multivariate Normal Mean
W. Strawderman
Mathematics
1971

22.1.13

A Bayesian approach to the importance of assumptions applied to the comparison of variances
G. Box, G. C. Tiao
Mathematics
1964

22.2

J. O. Berger. A robust generalized Bayes estimator and confidence region for a multivariate normal mean. The Annals of Statistics, 8(4):716–761, 1980.

22.3

J. O. Berger and M. Delampady. Testing precise hypotheses. Statistical Science, 2(3):317–52, 1987.

22.4

D. Berry. Multiple comparisons, multiple tests, and data dredging: A Bayesian perspective. In J. Bernardo, M. DeGroot, D. Lindley, and A. Smith, editors, Bayesian Statistics 3, pages 79–94. Oxford University Press, 1988.

22.5

M. Bogdan, A. Chakrabarti, and J. K. Ghosh. Optimal rules for multiple testing and sparse multiple regression. Technical report, Purdue University, 2008a.

22.6

M. Bogdan, J. K. Ghosh, and S. T. Tokdar. A comparison of the Benjamini-Hochberg procedure with some Bayesian rules for multiple testing. In Beyond Parametrics in Interdisciplinary Research: Festschrift in Honor of Professor Pranab K. Sen, volume 1, pages 211–30. Institute of Mathematical Statistics, 2008b.

22.7

L. Brown. Admissible estimators, recurrent diffusions and insoluble boundary problems. The Annals of Mathematical Statistics, 42:855–903, 1971.

22.8

J. Copas. Regression, prediction and shrinkage. Journal of the Royal Statistical Society, Series B, 45(3):311–54, 1983.

22.9

D. Denison and E. George. Bayesian prediction using adaptive ridge estimators. Technical report, Imperial College, London, 2000.

22.10

B. Efron. Microarrays, empirical Bayes and the two-groups model (with discussion). Statistical Science, 23(1):1–22, 2008.

22.11

B. Efron and C. Morris. Limiting the risk of Bayes and empirical Bayes—part i: the Bayes case. Journal of the American Statistical Association, 66:807–815, 1971.

22.12

T. Fan and J. O. Berger. Behaviour of the posterior distribution and inferences for a normal mean with t prior distributions. Stat. Decisions, 10:99–120, 1992.

22.13

W. J. Fu. Penalized regressions: The bridge versus the lasso. Journal of Computational and Graphical Statistics, 7(3):397–416, 1998.

22.14

A. Gelman. Prior distributions for variance parameters in hierarchical models. Bayesian Anal., 1(3):515–33, 2006.

22.15

E. I. George and D. P. Foster. Calibration and empirical Bayes variable selection. Biometrika, 87(4):731–747, 2000.

22.16

J. Griffin and P. Brown. Alternative prior distributions for variable selection with very many more variables than observations. Technical report, University of Warwick, 2005.

22.17

C. M. Hans. Bayesian lasso regression. Technical report, Ohio State University, 2008.

22.18

W. James and C. Stein. Estimation with quadratic loss. In Proc. Fourth Berkeley Symp. Math. Statist. Prob., volume 1, pages 361–79, 1961.

22.19

I. M. Johnstone and B. W. Silverman. Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences. The Annals of Statistics, 32(4):1594–1649, 2004.

22.20

T. Mitchell and J. Beauchamp. Bayesian variable selection in linear regression (with discussion). Journal of the American Statistical Association, 83:1023–36, 1988.

22.21

T. Park and G. Casella. The Bayesian lasso. Journal of the American Statistical Association, 103(482):681–6, 2008.

22.22

J. Rissanen. A universal prior for integers and estimation by minimum description length. The Annals of Statistics, 11(2), 1983.

22.23

J. G. Scott and J. O. Berger. An exploration of aspects of Bayesian multiple testing. Journal of Statistical Planning and Inference, 136(7):2144–2162, 2006.

22.24

J. G. Scott and J. O. Berger. Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem. Discussion Paper 2008-10, Duke University Department of Statistical Science, 2008.

22.25

C. Stein. Estimation of the mean of a multivariate normal distribution. The Annals of Statistics, 9:1135–51, 1981.

22.26

W. Strawderman. Proper Bayes minimax estimators of the multivariate normal mean. The Annals of Mathematical Statistics, 42:385–8, 1971.

22.27

G. C. Tiao and W. Tan. Bayesian analysis of random-effect models in the analysis of variance. I. Posterior distribution of variance components. Biometrika, 51:37–53, 1965.

22.28

R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1):267–88, 1996.

22.29

V. Uthoff. The most powerful scale and location invariant test of the normal versus the double exponential. The Annals of Statistics, 1(1):170–4, 1973.

22.30

M. West. On scale mixtures of normal distributions. Biometrika, 74(3):646–8, 1987.
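
Several of the later entries (Tibshirani 1996; Park and Casella 2008; Hans 2008) revolve around the lasso and its Bayesian interpretation. The $\ell_1$-penalized criterion

```math
\hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1
```

is equivalent to MAP estimation under independent Laplace (double-exponential) priors on the coefficients, which is the starting point of the Bayesian lasso and a natural comparison point for the horseshoe.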

References (参考資料)

Fundamentals for Data Scientists (2)
https://qiita.com/kaizen_nagoya/items/8b2f27353a9980bf445c

Iwanami Sūgaku Jiten (Iwanami Dictionary of Mathematics): two editions on one CD, a good deal
https://qiita.com/kaizen_nagoya/items/1210940fe2121423d777

Iwanami Sūgaku Jiten (Iwanami Dictionary of Mathematics)
https://qiita.com/kaizen_nagoya/items/b37bfd303658cb5ee11e

Ann's Room (Mathematics learned from people's names: Iwanami Sūgaku Jiten), English (24)
https://qiita.com/kaizen_nagoya/items/e02cbe23b96d5fb96aa1

<This article is a personal reflection based on my own past experience. It has no relation to the organization I currently belong to or to my work there.>

Thank you very much for reading to the last sentence.

Please press the like icon 💚 and follow me for your happy life.
