<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">NEJSDS</journal-id>
<journal-title-group><journal-title>The New England Journal of Statistics in Data Science</journal-title></journal-title-group>
<issn pub-type="ppub">2693-7166</issn><issn-l>2693-7166</issn-l>
<publisher>
<publisher-name>New England Statistical Society</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">NEJSDS12EDI</article-id>
<article-id pub-id-type="doi">10.51387/23-NEJSDS12EDI</article-id>
<article-categories>
<subj-group subj-group-type="heading"><subject>Editorial</subject></subj-group>
</article-categories>
<title-group>
<article-title>Editorial. Modern Bayesian Methods with Applications in Data Science</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Dey</surname><given-names>Dipak K.</given-names></name><email xlink:href="mailto:dipak.dey@uconn.edu">dipak.dey@uconn.edu</email><xref ref-type="aff" rid="j_nejsds12edi_aff_001"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Chen</surname><given-names>Ming-Hui</given-names></name><email xlink:href="mailto:ming-hui.chen@uconn.edu">ming-hui.chen@uconn.edu</email><xref ref-type="aff" rid="j_nejsds12edi_aff_002"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Xie</surname><given-names>Min-ge</given-names></name><email xlink:href="mailto:mxie@stat.rutgers.edu">mxie@stat.rutgers.edu</email><xref ref-type="aff" rid="j_nejsds12edi_aff_003"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Wang</surname><given-names>HaiYing</given-names></name><email xlink:href="mailto:haiying.wang@uconn.edu">haiying.wang@uconn.edu</email><xref ref-type="aff" rid="j_nejsds12edi_aff_004"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Wu</surname><given-names>Jing</given-names></name><email xlink:href="mailto:jing_wu@uri.edu">jing_wu@uri.edu</email><xref ref-type="aff" rid="j_nejsds12edi_aff_005"/>
</contrib>
<aff id="j_nejsds12edi_aff_001">Department of Statistics, <institution>University of Connecticut</institution>, <country>USA</country>. E-mail address: <email xlink:href="mailto:dipak.dey@uconn.edu">dipak.dey@uconn.edu</email></aff>
<aff id="j_nejsds12edi_aff_002">Department of Statistics, <institution>University of Connecticut</institution>, <country>USA</country>. E-mail address: <email xlink:href="mailto:ming-hui.chen@uconn.edu">ming-hui.chen@uconn.edu</email></aff>
<aff id="j_nejsds12edi_aff_003">Department of Statistics, <institution>Rutgers University</institution>, <country>USA</country>. E-mail address: <email xlink:href="mailto:mxie@stat.rutgers.edu">mxie@stat.rutgers.edu</email></aff>
<aff id="j_nejsds12edi_aff_004">Department of Statistics, <institution>University of Connecticut</institution>, <country>USA</country>. E-mail address: <email xlink:href="mailto:haiying.wang@uconn.edu">haiying.wang@uconn.edu</email></aff>
<aff id="j_nejsds12edi_aff_005">Department of Computer Science and Statistics, <institution>University of Rhode Island</institution>, <country>USA</country>. E-mail address: <email xlink:href="mailto:jing_wu@uri.edu">jing_wu@uri.edu</email></aff>
</contrib-group>
<pub-date pub-type="ppub"><year>2023</year></pub-date><pub-date pub-type="epub"><day>15</day><month>9</month><year>2023</year></pub-date><volume>1</volume><issue>2</issue><fpage>123</fpage><lpage>125</lpage>
<permissions><copyright-statement>© 2023 New England Statistical Society</copyright-statement><copyright-year>2023</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>Open access article under the <ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">CC BY</ext-link> license.</license-p></license></permissions>
</article-meta>
</front>
<body>
<sec id="j_nejsds12edi_s_001">
<label>1</label>
<title>Background of the special issue</title>
<p>This special issue, “Modern Bayesian Methods with Applications in Data Science”, originated from the EAC-ISBA 2021 conference, whose theme was the celebration of Dr. James O. Berger’s 70th birthday.</p>
<p>This issue brings together a collection of thought-provoking discussions and innovative methodologies that shed light on the interplay between frequentist and Bayesian approaches in statistics.</p>
</sec>
<sec id="j_nejsds12edi_s_002">
<label>2</label>
<title>The discussions</title>
<p>James O. Berger’s paper [<xref ref-type="bibr" rid="j_nejsds12edi_ref_001">1</xref>] sets the stage by providing a comprehensive examination of different types of frequentism and their compatibility with Bayesian reasoning. Through practical examples, he elucidates the strengths and limitations of various frequentist perspectives, offering valuable insights to researchers and practitioners alike.</p>
<p>Van der Vaart [<xref ref-type="bibr" rid="j_nejsds12edi_ref_014">14</xref>] provides personal reflections on Berger’s classification of frequentists into different types. It offers an appraisal of Type I and Type II frequentism that resonates with the author’s pragmatism. The discussion highlights the natural acceptance of empirical frequentism across statistical frameworks and the compatibility of procedural frequentism with Bayesian reasoning, grounded in the notions of consistency and compatibility.</p>
<p>Pericchi [<xref ref-type="bibr" rid="j_nejsds12edi_ref_007">7</xref>] emphasizes the importance of distinguishing between different types of frequentism and highlights the scientifically compelling “empirical frequentist” approach. The discussion delves into the convergence between the frequentist and Bayesian schools, particularly through the lens of objective Bayesian reasoning.</p>
<p>In Rousseau [<xref ref-type="bibr" rid="j_nejsds12edi_ref_010">10</xref>], the attention is drawn to Berger’s review of error reporting from a frequentist perspective and its connection to Bayesian thinking. The discussion raises intriguing questions about the justification of Neyman-Pearson procedures from an empirical frequentist viewpoint, stimulating thought-provoking discussions around relevant measures of uncertainty and reported errors.</p>
<p>The rejoinder [<xref ref-type="bibr" rid="j_nejsds12edi_ref_002">2</xref>] provides valuable insights from prominent researchers who respond to specific comments and observations. Their contributions further enrich the discussions on empirical error, precision in defining frequentism, and empirical frequentist targets in multiple testing scenarios.</p>
</sec>
<sec id="j_nejsds12edi_s_003">
<label>3</label>
<title>Research articles</title>
<p>In addition to the discussion paper, this issue also features a diverse range of research articles that explore Bayesian and frequentist inferences in various statistical domains.</p>
<p>Porwal and Raftery [<xref ref-type="bibr" rid="j_nejsds12edi_ref_008">8</xref>] examine Bayesian model averaging (BMA), which accounts for model uncertainty in statistical inference tasks. The authors compare eight model space priors and three adaptive parameter priors in BMA for variable selection in linear regression models. They assess the performance of these prior specifications on a range of statistical tasks, including parameter estimation, interval estimation, inference, and point and interval prediction, through extensive simulation studies based on 14 real datasets. They find that beta-binomial model space priors, specified in terms of the prior probability of model size, performed best on average across tasks and datasets.</p>
<p>Vimalajeewa et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_015">15</xref>] propose a method for wavelet denoising of signals contaminated with Gaussian noise using level-dependent shrinkage rules. The method is particularly useful when the signal-to-noise ratio is low. In simulations, the proposed method outperforms several standard wavelet shrinkage methods.</p>
<p>Gu et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_004">4</xref>] focus on scalable marginalization of latent variables in modeling correlated data. The authors introduce innovative approaches, such as Gaussian processes and sparse representation, to overcome the computational complexity associated with large data sets. These techniques have wide-ranging applications in various domains, including molecular dynamics simulation, cellular migration, and agent-based models.</p>
<p>Halder et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_005">5</xref>] discuss double generalized linear models, which allow the mean and dispersion to vary across observations and apply to many commonly used distributions. Model specification becomes challenging, however, with many covariates and dependent data. To address these challenges, the authors propose a hierarchical model with a spatial random effect, specified through a Gaussian process, and perform Bayesian variable selection with a continuous spike-and-slab prior on the fixed effects. They showcase the accuracy of their framework through synthetic experiments and apply it to analyze automobile insurance premiums in Connecticut.</p>
<p>Maity and Basu [<xref ref-type="bibr" rid="j_nejsds12edi_ref_006">6</xref>] also focus on variable selection, an important topic in data analytics applications. The authors propose a Bayesian approach that selects the model with the highest posterior probability, using simulated annealing to perform this optimization over the model space, and show its feasibility in high-dimensional problems through various simulation studies. They provide theoretical justifications and discuss applications to high-dimensional datasets. The proposed method is implemented in the R package <monospace>sahpm</monospace>, available on CRAN.</p>
<p>Shen et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_011">11</xref>] compare tail probability estimates between the Bayesian and frequentist methods. The authors investigate why the Bayesian estimator of a tail probability is consistently higher than the frequentist estimator and establish sufficient conditions for this phenomenon, using both Jensen’s inequality and Taylor series approximations. These analyses point to the convexity of the distribution function. The authors cite the example of rainfall in Venezuela that caused over 30,000 deaths, an event not captured by simple frequentist extreme value techniques but predicted by Bayesian inference that accounts for parameter uncertainty and uses the full available data.</p>
<p>Prothero et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_009">9</xref>] discuss the under-examined aspect of centering in data analysis, specifically in functional data analysis (FDA). The authors suggest that centering along a dimension other than the default can identify a useful mode of variation not previously explored in FDA. They propose a unified framework and new terminology for centering operations, as well as a series of diagnostics for determining the best choice of centering for a given dataset. The authors clearly demonstrate the intuition behind and consequences of each centering choice through informative graphics and explore the application of their diagnostics in several FDA settings. The article also addresses ambiguities in matrix orientation and nomenclature.</p>
<p>Shen et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_012">12</xref>] consider the envelope model, a dimension reduction method for multivariate linear regression that has gained attention for its modeling flexibility and improved estimation and prediction efficiency. The authors incorporate the partial response envelope model and the simultaneous envelope model into a Bayesian framework and propose a novel Bayesian simultaneous partial envelope model that addresses some of the limitations of both approaches. The method has the flexibility of incorporating prior information and enables coherent quantification of all modeling uncertainty through the posterior distribution of model parameters. A block Metropolis-within-Gibbs algorithm for Markov chain Monte Carlo sampling from the posterior is developed.</p>
<p>Thornton et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_013">13</xref>] study approximate confidence distribution computing (ACDC), a likelihood-free inference method within a frequentist framework that provides frequentist validation for computational inference in problems with unknown or intractable likelihoods. The main theoretical contribution of this work is the identification of a matching condition necessary for frequentist validity of inference from this method, connecting the Bayesian and frequentist inferential paradigms. The authors present a data-driven approach that applies ACDC in both Bayesian and frequentist contexts, using a data-dependent proposal function that is general and adaptable to many settings. The ACDC development does not require a sufficient statistic, sidestepping a constraint on valid inference in Approximate Bayesian Computation (ABC) methods. The paper also includes numerical studies that empirically validate the theoretical results and identifies instances where ACDC outperforms ABC methods.</p>
<p>Dey et al. [<xref ref-type="bibr" rid="j_nejsds12edi_ref_003">3</xref>] introduce the use of graphical Gaussian processes for modeling multivariate spatial data, an area that has seen significant growth in spatial data science. While much of the literature has focused on a single or a few spatially dependent outcomes, recent attention has turned to modeling and inference for a large number of outcomes. The article focuses on scalable graphical models that exploit conditional independence among a large number of spatial processes to enable fully model-based Bayesian analysis.</p>
</sec>
<sec id="j_nejsds12edi_s_004">
<label>4</label>
<title>Remark</title>
<p>We hope that this special issue will inspire researchers to further explore the fascinating bridges between frequentism and Bayesianism, and stimulate further development of novel methodologies to advance statistics and data science.</p>
</sec>
</body>
<back>
<ack id="j_nejsds12edi_ack_001">
<title>Acknowledgement</title>
<p>We extend our gratitude to all the authors, reviewers, and contributors who have made this issue possible. Their dedication and expertise have ensured the quality and relevance of the papers presented.</p></ack>
<ref-list id="j_nejsds12edi_reflist_001">
<title>References</title>
<ref id="j_nejsds12edi_ref_001">
<label>[1]</label><mixed-citation publication-type="other"> <string-name><surname>Berger</surname>, <given-names>J.</given-names></string-name> (2022). Four Types of Frequentism and Their Interplay with Bayesianism. <italic>The New England Journal of Statistics in Data Science</italic> 1–12. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/22-NEJSDS4" xlink:type="simple">https://doi.org/10.51387/22-NEJSDS4</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_002">
<label>[2]</label><mixed-citation publication-type="other"> <string-name><surname>Berger</surname>, <given-names>J.</given-names></string-name> (2022). Rejoinder of “Four Types of Frequentism and Their Interplay with Bayesianism”. <italic>The New England Journal of Statistics in Data Science</italic> 1–2. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/22-NEJSDS4REJ" xlink:type="simple">https://doi.org/10.51387/22-NEJSDS4REJ</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_003">
<label>[3]</label><mixed-citation publication-type="other"> <string-name><surname>Dey</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Datta</surname>, <given-names>A.</given-names></string-name> and <string-name><surname>Banerjee</surname>, <given-names>S.</given-names></string-name> (2023). Modeling Multivariate Spatial Dependencies Using Graphical Models. <italic>The New England Journal of Statistics in Data Science</italic> 1–13. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS47" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS47</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_004">
<label>[4]</label><mixed-citation publication-type="other"> <string-name><surname>Gu</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Liu</surname>, <given-names>X.</given-names></string-name>, <string-name><surname>Fang</surname>, <given-names>X.</given-names></string-name> and <string-name><surname>Tang</surname>, <given-names>S.</given-names></string-name> (2022). Scalable Marginalization of Correlated Latent Variables with Applications to Learning Particle Interaction Kernels. <italic>The New England Journal of Statistics in Data Science</italic> 1–15. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/22-NEJSDS13" xlink:type="simple">https://doi.org/10.51387/22-NEJSDS13</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_005">
<label>[5]</label><mixed-citation publication-type="other"> <string-name><surname>Halder</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Mohammed</surname>, <given-names>S.</given-names></string-name> and <string-name><surname>Dey</surname>, <given-names>D. K.</given-names></string-name> (2023). Bayesian Variable Selection in Double Generalized Linear Tweedie Spatial Process Models. <italic>The New England Journal of Statistics in Data Science</italic> 1–13. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS37" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS37</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_006">
<label>[6]</label><mixed-citation publication-type="other"> <string-name><surname>Maity</surname>, <given-names>A. K.</given-names></string-name> and <string-name><surname>Basu</surname>, <given-names>S.</given-names></string-name> (2023). Highest Posterior Model Computation and Variable Selection via Simulated Annealing. <italic>The New England Journal of Statistics in Data Science</italic> 1–8. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS40" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS40</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_007">
<label>[7]</label><mixed-citation publication-type="other"> <string-name><surname>Pericchi</surname>, <given-names>L.</given-names></string-name> (2023). Invited Discussion of J.O. Berger: Four Types of Frequentism and Their Interplay with Bayesianism. <italic>The New England Journal of Statistics in Data Science</italic> 1–3. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS4B" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS4B</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_008">
<label>[8]</label><mixed-citation publication-type="other"> <string-name><surname>Porwal</surname>, <given-names>A.</given-names></string-name> and <string-name><surname>Raftery</surname>, <given-names>A. E.</given-names></string-name> (2022). Effect of Model Space Priors on Statistical Inference with Model Uncertainty. <italic>The New England Journal of Statistics in Data Science</italic> 1–10. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/22-NEJSDS14" xlink:type="simple">https://doi.org/10.51387/22-NEJSDS14</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_009">
<label>[9]</label><mixed-citation publication-type="other"> <string-name><surname>Prothero</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Hannig</surname>, <given-names>J.</given-names></string-name> and <string-name><surname>Marron</surname>, <given-names>J. S.</given-names></string-name> (2023). New Perspectives on Centering. <italic>The New England Journal of Statistics in Data Science</italic> 1–21. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS31" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS31</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_010">
<label>[10]</label><mixed-citation publication-type="other"> <string-name><surname>Rousseau</surname>, <given-names>J.</given-names></string-name> (2023). Discussion of: Four Types of Frequentism and Their Interplay with Bayesianism, by J. Berger. <italic>The New England Journal of Statistics in Data Science</italic> 1–2. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS4C" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS4C</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_011">
<label>[11]</label><mixed-citation publication-type="other"> <string-name><surname>Shen</surname>, <given-names>N.</given-names></string-name>, <string-name><surname>González-Arévalo</surname>, <given-names>B.</given-names></string-name> and <string-name><surname>Pericchi</surname>, <given-names>L. R.</given-names></string-name> (2023). Comparison Between Bayesian and Frequentist Tail Probability Estimates. <italic>The New England Journal of Statistics in Data Science</italic> 1–8. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS39" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS39</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_012">
<label>[12]</label><mixed-citation publication-type="other"> <string-name><surname>Shen</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Park</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Chakraborty</surname>, <given-names>S.</given-names></string-name> and <string-name><surname>Zhang</surname>, <given-names>C.</given-names></string-name> (2023). Bayesian Simultaneous Partial Envelope Model with Application to an Imaging Genetics Analysis. <italic>The New England Journal of Statistics in Data Science</italic> 1–33. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS23" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS23</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_013">
<label>[13]</label><mixed-citation publication-type="other"> <string-name><surname>Thornton</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Li</surname>, <given-names>W.</given-names></string-name> and <string-name><surname>Xie</surname>, <given-names>M.</given-names></string-name> (2023). Approximate Confidence Distribution Computing. <italic>The New England Journal of Statistics in Data Science</italic> 1–13. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS38" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS38</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_014">
<label>[14]</label><mixed-citation publication-type="other"> <string-name><surname>van der Vaart</surname>, <given-names>A.</given-names></string-name> (2022). Frequentism. <italic>The New England Journal of Statistics in Data Science</italic> 1–4. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/22-NEJSDS4A" xlink:type="simple">https://doi.org/10.51387/22-NEJSDS4A</ext-link>.</mixed-citation>
</ref>
<ref id="j_nejsds12edi_ref_015">
<label>[15]</label><mixed-citation publication-type="other"> <string-name><surname>Vimalajeewa</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>DasGupta</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Ruggeri</surname>, <given-names>F.</given-names></string-name> and <string-name><surname>Vidakovic</surname>, <given-names>B.</given-names></string-name> (2023). Gamma-Minimax Wavelet Shrinkage for Signals with Low SNR. <italic>The New England Journal of Statistics in Data Science</italic> 1–13. <ext-link ext-link-type="doi" xlink:href="https://doi.org/10.51387/23-NEJSDS43" xlink:type="simple">https://doi.org/10.51387/23-NEJSDS43</ext-link>.</mixed-citation>
</ref>
</ref-list>
</back>
</article>
