<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Nested_sampling_algorithm</id>
	<title>Nested sampling algorithm - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Nested_sampling_algorithm"/>
	<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Nested_sampling_algorithm&amp;action=history"/>
	<updated>2026-04-21T17:23:02Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.1</generator>
	<entry>
		<id>http://debianws.lexgopc.com/wiki143/index.php?title=Nested_sampling_algorithm&amp;diff=3801355&amp;oldid=prev</id>
		<title>imported&gt;Michael Hardy at 17:11, 14 June 2025</title>
		<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Nested_sampling_algorithm&amp;diff=3801355&amp;oldid=prev"/>
		<updated>2025-06-14T17:11:18Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{Bayesian statistics}}&lt;br /&gt;
The &amp;#039;&amp;#039;&amp;#039;nested sampling algorithm&amp;#039;&amp;#039;&amp;#039; is a [[computation]]al approach to the [[Bayesian statistics]] problems of comparing models and generating samples from posterior distributions. It was developed in 2004 by [[physicist]] John Skilling.&amp;lt;ref name=&amp;quot;skilling1&amp;quot;&amp;gt;{{cite journal |last= Skilling |first= John |title= Nested Sampling |journal= AIP Conference Proceedings |pages= 395–405 |year= 2004 |volume= 735 |doi= 10.1063/1.1835238|bibcode= 2004AIPC..735..395S }}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[Bayes&amp;#039; theorem]] can be applied to a pair of competing models &amp;lt;math&amp;gt;M_1&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;M_2&amp;lt;/math&amp;gt; for data &amp;lt;math&amp;gt;D&amp;lt;/math&amp;gt;, one of which may be true (though which one is unknown) but which cannot both be true simultaneously. The posterior probability for &amp;lt;math&amp;gt;M_1&amp;lt;/math&amp;gt; may be calculated as:&lt;br /&gt;
&lt;br /&gt;
: &amp;lt;math&amp;gt;&lt;br /&gt;
\begin{align}&lt;br /&gt;
 P(M_1\mid D) &amp;amp; = \frac{P(D\mid M_1) P(M_1)}{P(D)} \\&lt;br /&gt;
  &amp;amp; = \frac{P(D\mid M_1) P(M_1)}{P(D\mid M_1) P(M_1) + P(D\mid M_2) P(M_2)}  \\&lt;br /&gt;
  &amp;amp; = \frac{1}{1 + \frac{P(D\mid M_2)}{P(D\mid M_1)} \frac{P(M_2)}{P(M_1)} }&lt;br /&gt;
\end{align}&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The prior probabilities &amp;lt;math&amp;gt;P(M_1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;P(M_2)&amp;lt;/math&amp;gt; are already known, as they are chosen by the researcher ahead of time. However, the remaining [[Bayes factor]] &amp;lt;math&amp;gt;P(D\mid M_2)/P(D\mid M_1)&amp;lt;/math&amp;gt; is not so easy to evaluate, since in general it requires marginalizing nuisance parameters. Generally, &amp;lt;math&amp;gt;M_1&amp;lt;/math&amp;gt; has a set of parameters that can be grouped together and called &amp;lt;math&amp;gt;\theta&amp;lt;/math&amp;gt;, and &amp;lt;math&amp;gt;M_2&amp;lt;/math&amp;gt; has its own vector of parameters, possibly of different dimensionality but still termed &amp;lt;math&amp;gt;\theta&amp;lt;/math&amp;gt;. The marginalization for &amp;lt;math&amp;gt;M_1&amp;lt;/math&amp;gt; is&lt;br /&gt;
&lt;br /&gt;
: &amp;lt;math&amp;gt;P(D\mid M_1) = \int d \theta \, P(D\mid \theta,M_1) P(\theta\mid M_1)&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and likewise for &amp;lt;math&amp;gt;M_2&amp;lt;/math&amp;gt;. This integral is often analytically intractable, and in these cases it is necessary to employ a numerical algorithm to find an approximation. The nested sampling algorithm was developed by John Skilling specifically to approximate these marginalization integrals, and it has the added benefit of generating samples from the posterior distribution &amp;lt;math&amp;gt;P(\theta\mid D,M_1)&amp;lt;/math&amp;gt;.&amp;lt;ref name = &amp;quot;skilling2&amp;quot;&amp;gt;{{cite journal |last= Skilling |first= John |title= Nested Sampling for General Bayesian Computation |journal= Bayesian Analysis |volume= 1 |issue= 4 |pages= 833–860 |year= 2006 |doi= 10.1214/06-BA127|doi-access= free }}&amp;lt;/ref&amp;gt; It is an alternative to methods from the Bayesian literature&amp;lt;ref name=&amp;quot;chen&amp;quot;&amp;gt;{{cite book |author= Chen, Ming-Hui, Shao, Qi-Man, and Ibrahim, Joseph George |title= Monte Carlo methods in Bayesian computation |publisher= Springer |year= 2000 |isbn= 978-0-387-98935-8 |url= https://books.google.com/books?id=R3GeFfshc7wC}}&amp;lt;/ref&amp;gt;  such as bridge sampling and defensive importance sampling.&lt;br /&gt;
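To make the marginalization concrete, the evidence of a toy one-dimensional model can be computed by direct quadrature before any sampling is needed. The setup below (a single datum, a uniform prior on [-5, 5], and a unit-variance Gaussian likelihood) is a hypothetical illustration, not an example from the source:&lt;br /&gt;

```python
import math

# Toy one-dimensional model (hypothetical): a single datum d = 1.2,
# parameter theta with a uniform prior on [-5, 5] and a unit-variance
# Gaussian likelihood.
def likelihood(theta, d=1.2):
    return math.exp(-0.5 * (d - theta) ** 2) / math.sqrt(2.0 * math.pi)

def prior_density(theta, lo=-5.0, hi=5.0):
    return 1.0 / (hi - lo) if lo <= theta <= hi else 0.0

def evidence_quadrature(n=10000, lo=-5.0, hi=5.0):
    # Midpoint rule for Z = P(D|M) = integral of L(theta) pi(theta) dtheta
    h = (hi - lo) / n
    mids = (lo + (k + 0.5) * h for k in range(n))
    return h * sum(likelihood(t) * prior_density(t) for t in mids)

Z = evidence_quadrature()
```

Here the Gaussian likelihood is almost entirely contained in the prior support, so the evidence is close to the prior density 1/10; in higher dimensions such grid quadrature becomes infeasible, which is precisely why nested sampling is useful.&lt;br /&gt;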
&lt;br /&gt;
Here is a simple version of the nested sampling algorithm, followed by a description of how it computes the marginal probability density &amp;lt;math&amp;gt;Z=P(D\mid M)&amp;lt;/math&amp;gt; where &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; is &amp;lt;math&amp;gt;M_1&amp;lt;/math&amp;gt; or &amp;lt;math&amp;gt;M_2&amp;lt;/math&amp;gt;:&lt;br /&gt;
&lt;br /&gt;
 Start with &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; points &amp;lt;math&amp;gt;\theta_1,\ldots,\theta_N&amp;lt;/math&amp;gt; sampled from prior.&lt;br /&gt;
 &amp;#039;&amp;#039;&amp;#039;for&amp;#039;&amp;#039;&amp;#039; &amp;lt;math&amp;gt;i=1&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;j&amp;lt;/math&amp;gt; &amp;#039;&amp;#039;&amp;#039;do&amp;#039;&amp;#039;&amp;#039;        % The number of iterations j is chosen by guesswork.&lt;br /&gt;
     &amp;lt;math&amp;gt;L_i := \min(&amp;lt;/math&amp;gt;current likelihood values of the points&amp;lt;math&amp;gt;)&amp;lt;/math&amp;gt;;&lt;br /&gt;
     &amp;lt;math&amp;gt;X_i := \exp(-i/N);&amp;lt;/math&amp;gt;&lt;br /&gt;
     &amp;lt;math&amp;gt;w_i := X_{i-1} - X_i&amp;lt;/math&amp;gt;&lt;br /&gt;
     &amp;lt;math&amp;gt;Z := Z + L_i\cdot w_i;&amp;lt;/math&amp;gt;&lt;br /&gt;
     Save the point with least likelihood as a sample point with weight &amp;lt;math&amp;gt;w_i&amp;lt;/math&amp;gt;.&lt;br /&gt;
     Update the point with least likelihood with some [[Markov chain Monte Carlo]] steps according to the prior, accepting only steps that&lt;br /&gt;
     keep the likelihood above &amp;lt;math&amp;gt;L_i&amp;lt;/math&amp;gt;.&lt;br /&gt;
 &amp;#039;&amp;#039;&amp;#039;end&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
 &amp;#039;&amp;#039;&amp;#039;return&amp;#039;&amp;#039;&amp;#039; &amp;lt;math&amp;gt;Z&amp;lt;/math&amp;gt;;&lt;br /&gt;
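The pseudocode above can be sketched as a short runnable program. The model (uniform prior on [-5, 5], unit-variance Gaussian likelihood for a single datum) and all names are assumptions for illustration; the sketch uses the debiased shrinkage factor (1 - 1/N) rather than exp(-i/N), and a simple random-walk step for the constrained replacement move:&lt;br /&gt;

```python
import math
import random

random.seed(0)

# Minimal nested-sampling sketch for a hypothetical toy model:
# uniform prior on [-5, 5], unit-variance Gaussian likelihood, d = 1.2.
def log_likelihood(theta, d=1.2):
    return -0.5 * (d - theta) ** 2 - 0.5 * math.log(2.0 * math.pi)

def nested_sampling(n_live=100, n_iter=1000, lo=-5.0, hi=5.0):
    live = [random.uniform(lo, hi) for _ in range(n_live)]
    logl = [log_likelihood(t) for t in live]
    z, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda k: logl[k])
        l_min = logl[worst]
        x_i = (1.0 - 1.0 / n_live) ** i   # debiased shrinkage (1 - 1/N)
        z += math.exp(l_min) * (x_prev - x_i)
        x_prev = x_i
        # Replace the worst point: random-walk steps uniform w.r.t. the
        # prior, accepting only moves that keep the likelihood above L_i.
        theta = live[random.randrange(n_live)]
        for _ in range(20):
            prop = theta + random.gauss(0.0, 0.5)
            if lo <= prop <= hi and log_likelihood(prop) > l_min:
                theta = prop
        live[worst], logl[worst] = theta, log_likelihood(theta)
    # Remaining prior mass, attributed to the surviving live points
    z += x_prev * sum(math.exp(l) for l in logl) / n_live
    return z

Z = nested_sampling()
```

On this toy problem the true evidence is approximately 0.1 (the prior density 1/10 times an almost fully captured unit Gaussian), and the sketch typically lands within roughly ten per cent of that value.&lt;br /&gt;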
&lt;br /&gt;
At each iteration, &amp;lt;math&amp;gt;X_i&amp;lt;/math&amp;gt; is an estimate of the amount of prior mass covered by the hypervolume in parameter space of all points with likelihood greater than &amp;lt;math&amp;gt;L_i&amp;lt;/math&amp;gt;. The weight factor &amp;lt;math&amp;gt;w_i&amp;lt;/math&amp;gt; is an estimate of the amount of prior mass that lies between two nested hypersurfaces &amp;lt;math&amp;gt;\{ \theta \mid P(D\mid\theta,M) = P(D\mid\theta_{i-1},M) \}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\{ \theta \mid P(D\mid\theta,M) = P(D\mid\theta_i,M) \}&amp;lt;/math&amp;gt;. The update step &amp;lt;math&amp;gt;Z := Z+L_i w_i&amp;lt;/math&amp;gt; computes the sum over &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;L_i w_i&amp;lt;/math&amp;gt; to numerically approximate the integral&lt;br /&gt;
&lt;br /&gt;
: &amp;lt;math&amp;gt;&lt;br /&gt;
 \begin{align}&lt;br /&gt;
  P(D\mid M) &amp;amp;= \int P(D\mid \theta,M) P(\theta\mid M) \,d \theta \\&lt;br /&gt;
         &amp;amp;= \int P(D\mid \theta,M) \,dP(\theta\mid M)&lt;br /&gt;
 \end{align}&lt;br /&gt;
 &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the limit &amp;lt;math&amp;gt;j \to \infty&amp;lt;/math&amp;gt;, this estimator has a positive bias of order &amp;lt;math&amp;gt; 1 / N&amp;lt;/math&amp;gt;,&amp;lt;ref&amp;gt;{{cite journal |last= Walter |first= Clement| title=Point-process based Monte Carlo estimation| journal=Statistics and Computing|pages=219–236 |year=2017| volume=27|doi=10.1007/s11222-015-9617-y |arxiv=1412.6368|s2cid= 14639080}}&amp;lt;/ref&amp;gt; which can be removed by using &amp;lt;math&amp;gt;(1 - 1/N)&amp;lt;/math&amp;gt; instead of &amp;lt;math&amp;gt;\exp (-1/N)&amp;lt;/math&amp;gt; as the shrinkage factor in the above algorithm.&lt;br /&gt;
&lt;br /&gt;
The idea is to subdivide the range of &amp;lt;math&amp;gt;f(\theta) = P(D\mid\theta,M)&amp;lt;/math&amp;gt; and estimate, for each interval &amp;lt;math&amp;gt;[f(\theta_{i-1}), f(\theta_i)]&amp;lt;/math&amp;gt;, how likely it is a priori that a randomly chosen &amp;lt;math&amp;gt;\theta&amp;lt;/math&amp;gt; would map to this interval. This can be thought of as a Bayesian&amp;#039;s way to numerically implement [[Lebesgue integration]].&amp;lt;ref name=&amp;quot;Jasa&amp;quot;&amp;gt;{{cite journal |last1= Jasa|first1= Tomislav |last2= Xiang |first2= Ning|title= Nested sampling applied in Bayesian room-acoustics decay analysis |journal= Journal of the Acoustical Society of America |pages= 3251–3262 |year= 2012 |volume= 132 |issue= 5 |doi=10.1121/1.4754550|pmid= 23145609 |bibcode= 2012ASAJ..132.3251J|s2cid= 20876510 }}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Choice of MCMC algorithm ==&lt;br /&gt;
The original procedure outlined by Skilling (given above in pseudocode) does not specify which Markov chain Monte Carlo algorithm should be used to choose new points with better likelihood.&lt;br /&gt;
&lt;br /&gt;
Skilling&amp;#039;s own code examples (such as one in Sivia and Skilling (2006),&amp;lt;ref&amp;gt;{{cite book | last=Sivia | first=Devinderjit | last2=Skilling | first2=John | title=Data Analysis: A Bayesian Tutorial | publisher=Oxford University Press, USA | publication-place=Oxford | date=June 2006 | isbn=978-0-19-856832-2 | page=}}&amp;lt;/ref&amp;gt; [https://www.inference.org.uk/bayesys/sivia/lighthouse.c available on Skilling&amp;#039;s website]) choose a random existing point and propose a nearby point at a random distance from it; if the likelihood is better, the point is accepted, otherwise it is rejected and the process is repeated. Mukherjee et al. (2006)&amp;lt;ref name=&amp;quot;mukherjee&amp;quot;&amp;gt;{{cite journal |last1= Mukherjee |first1= P. |last2= Parkinson |first2= D. |last3= Liddle |first3= A.R. |title= A Nested Sampling Algorithm for Cosmological Model Selection |journal= Astrophysical Journal |volume= 638 |issue= 2 |pages= 51–54 |year= 2006 |bibcode= 2006ApJ...638L..51M |doi= 10.1086/501068|arxiv= astro-ph/0508461|s2cid= 6208051 }}&amp;lt;/ref&amp;gt; found higher acceptance rates by selecting points randomly within an ellipsoid drawn around the existing points; this idea was refined into the MultiNest algorithm,&amp;lt;ref name=&amp;quot;multinest&amp;quot;&amp;gt;{{cite journal |last1= Feroz |first1= F. |last2= Hobson |first2= M.P. | last3=Bridges|first3=M.|title= MULTINEST: an efficient and robust Bayesian inference tool for cosmology and particle physics |journal= MNRAS |volume= 398 |issue= 4 |year= 2008 |url= https://arxiv.org/abs/0809.3437 |doi= 10.1111/j.1365-2966.2009.14548.x|arxiv= 0809.3437 }}&amp;lt;/ref&amp;gt; which handles multimodal posteriors better by grouping points into likelihood contours and drawing an ellipsoid for each contour.&lt;br /&gt;
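The ellipsoidal idea can be illustrated with a deliberately simplified sketch: bound the current live points with an enlarged axis-aligned ellipse and draw new candidates uniformly inside it. This is only a cartoon of the approach (MultiNest fits full covariance ellipsoids and handles multiple clusters); all helper names here are hypothetical:&lt;br /&gt;

```python
import random

random.seed(1)

# Cartoon of ellipsoidal sampling (hypothetical helper names): bound the
# live points with an enlarged axis-aligned ellipse, then draw uniformly
# inside it.
def bounding_ellipse(points, enlarge=1.5):
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    ax = enlarge * max(abs(p[0] - cx) for p in points)
    ay = enlarge * max(abs(p[1] - cy) for p in points)
    return cx, cy, ax, ay

def sample_in_ellipse(cx, cy, ax, ay):
    # Uniform draw in the unit disc (by rejection), stretched onto the
    # ellipse axes; a linear map preserves uniformity.
    while True:
        u = random.uniform(-1.0, 1.0)
        v = random.uniform(-1.0, 1.0)
        if u * u + v * v <= 1.0:
            return cx + ax * u, cy + ay * v

live = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(50)]
cx, cy, ax, ay = bounding_ellipse(live)
draws = [sample_in_ellipse(cx, cy, ax, ay) for _ in range(1000)]
all_inside = all(((x - cx) / ax) ** 2 + ((y - cy) / ay) ** 2 <= 1.0 + 1e-9
                 for x, y in draws)
```

Draws from the ellipse that fail the current likelihood constraint would simply be rejected and redrawn; the enlargement factor hedges against the ellipse missing part of the true iso-likelihood contour.&lt;br /&gt;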
&lt;br /&gt;
==Implementations==&lt;br /&gt;
Example implementations demonstrating the nested sampling algorithm are publicly available for download, written in several [[programming language]]s.&lt;br /&gt;
* Simple examples in [[C (programming language)|C]], [[R (programming language)|R]], or [[Python (programming language)|Python]] are on [http://www.inference.phy.cam.ac.uk/bayesys/ John Skilling&amp;#039;s website].&lt;br /&gt;
* A [[Haskell (programming language)|Haskell]] port of the above simple codes is [http://hackage.haskell.org/package/NestedSampling on Hackage].&lt;br /&gt;
* An example in [[R (programming language)|R]] originally designed for [[Curve fitting|fitting]] [[Spectrum|spectra]] is described on [http://www.mrao.cam.ac.uk/~bn204/galevol/speca/rnested.html Bojan Nikolic&amp;#039;s website] and is [https://github.com/bnikolic/ available on GitHub].&lt;br /&gt;
* A NestedSampler is part of the [[Python (programming language)|Python]] toolbox BayesicFitting&amp;lt;ref name=&amp;quot;kester&amp;quot;&amp;gt;{{cite journal |last1= Kester |first1= D. |last2= Mueller |first2= M. |title= BayesicFitting, a PYTHON toolbox for Bayesian fitting and evidence calculation.: Including a Nested Sampling implementation. |journal= Astronomy and Computing |volume= 37 |pages= 100503 |year= 2021 | doi= 10.1016/j.ascom.2021.100503|doi-access= free |arxiv= 2109.11976 |bibcode= 2021A&amp;amp;C....3700503K }}&amp;lt;/ref&amp;gt; for generic model fitting and evidence calculation. It is [https://github.com/dokester/BayesicFitting available on GitHub].&lt;br /&gt;
* An implementation in [[C++]], named DIAMONDS, is [https://github.com/JorisDeRidder/ on GitHub].&lt;br /&gt;
* A highly modular parallel [[Python (programming language)|Python]] example for uses in [[statistical physics]] and [[condensed matter physics]] is [https://github.com/js850/nested_sampling on GitHub].&lt;br /&gt;
* pymatnest is a package designed for exploring the [[energy landscape]] of different materials, calculating thermodynamic variables at arbitrary temperatures, and locating [[phase transitions]]; it is [https://github.com/libAtoms/pymatnest on GitHub].&lt;br /&gt;
* The MultiNest software package is capable of performing nested sampling on multi-modal posterior distributions.&amp;lt;ref name=&amp;quot;multinest&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;feroz&amp;quot;&amp;gt;{{cite journal |last1= Feroz |first1= F. |last2= Hobson |first2= M.P. |title= Multimodal nested sampling: an efficient and robust alternative to Markov Chain Monte Carlo methods for astronomical data analyses |journal= MNRAS |volume= 384 |issue= 2 |pages= 449–463 |year= 2008 |url= http://adsabs.harvard.edu/cgi-bin/bib_query?arXiv:0704.3704 |doi= 10.1111/j.1365-2966.2007.12353.x |doi-access= free |bibcode=2008MNRAS.384..449F |arxiv= 0704.3704|s2cid= 14226032 }}&amp;lt;/ref&amp;gt; It has interfaces for C++, [[Fortran]] and Python inputs, and is [https://github.com/farhanferoz/MultiNest available on GitHub].&lt;br /&gt;
* PolyChord is another nested sampling software package [https://github.com/PolyChord/PolyChordLite available on GitHub]. PolyChord&amp;#039;s computational efficiency scales better with an increase in the number of parameters than MultiNest, meaning PolyChord can be more efficient for high dimensional problems.&amp;lt;ref&amp;gt;{{cite journal |last1=Handley |first1=Will |first2=Mike |last2=Hobson |first3=Anthony |last3=Lasenby |title=polychord: next-generation nested sampling |journal=Monthly Notices of the Royal Astronomical Society |date=2015 |volume=453 |issue=4 |pages=4384–4398 |doi=10.1093/mnras/stv1911 |doi-access=free |bibcode=2015MNRAS.453.4384H |arxiv=1506.00171 |s2cid=118882763 }}&amp;lt;/ref&amp;gt; It has interfaces to likelihood functions written in Python, Fortran, C, or C++.&lt;br /&gt;
* NestedSamplers.jl, a [[Julia (programming language)|Julia]] package for implementing single- and multi-ellipsoidal nested sampling algorithms is [https://github.com/TuringLang/NestedSamplers.jl on GitHub].&lt;br /&gt;
* [https://www.cse-lab.ethz.ch/korali/ Korali] is a high-performance framework for uncertainty quantification, optimization, and deep reinforcement learning, which also implements nested sampling.&lt;br /&gt;
&lt;br /&gt;
==Applications==&lt;br /&gt;
Since nested sampling was proposed in 2004, it has been used in many aspects of the field of [[astronomy]]. One paper suggested using nested sampling for [[cosmology|cosmological]] [[model selection]] and object detection, as it &amp;quot;uniquely combines accuracy, general applicability and computational feasibility.&amp;quot;&amp;lt;ref name=&amp;quot;mukherjee&amp;quot; /&amp;gt; A refinement of the algorithm to handle multimodal posteriors has been suggested as a means to detect astronomical objects in extant datasets.&amp;lt;ref name=&amp;quot;feroz&amp;quot;/&amp;gt; Other applications of nested sampling are in the field of [[finite element updating]] where the algorithm is used to choose an optimal [[finite element]] model, and this was applied to [[structural dynamics]].&amp;lt;ref&amp;gt;{{cite journal |last1= Mthembu |first1= L. |last2= Marwala |first2= T. |last3= Friswell |first3= M.I. |last4= Adhikari |first4= S. |title= Model selection in finite element model updating using the Bayesian evidence statistic |journal= Mechanical Systems and Signal Processing |volume= 25 |issue= 7 |pages= 2399–2412 |year= 2011 |doi=10.1016/j.ymssp.2011.04.001|bibcode= 2011MSSP...25.2399M }}&amp;lt;/ref&amp;gt; This sampling method has also been used in the field of materials modeling. It can be used to learn the [[Partition function (statistical mechanics)|partition function]] from [[statistical mechanics]] and derive [[thermodynamics|thermodynamic]] properties.&amp;lt;ref name=&amp;quot;partay&amp;quot;&amp;gt;{{cite journal|last1=Partay|first1=Livia B.|year=2010|title=Efficient Sampling of Atomic Configurational Spaces|journal=The Journal of Physical Chemistry B|volume=114|issue=32|pages=10502–10512|doi=10.1021/jp1012973|pmid=20701382|arxiv=0906.3544|s2cid=16834142}}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Dynamic nested sampling==&lt;br /&gt;
&lt;br /&gt;
Dynamic nested sampling is a generalisation of the nested sampling algorithm in which the number of samples taken in different regions of the parameter space is dynamically adjusted to maximise calculation accuracy.&amp;lt;ref&amp;gt;{{cite journal |last1=Higson |first1=Edward |last2=Handley |first2=Will |last3=Hobson |first3=Michael |last4=Lasenby |first4=Anthony |title=Dynamic nested sampling: an improved algorithm for parameter estimation and evidence calculation |journal=Statistics and Computing |date=2019 |volume=29 |issue=5 |pages=891–913 |doi=10.1007/s11222-018-9844-0 |bibcode=2019S&amp;amp;C....29..891H |arxiv=1704.03459 |s2cid=53514669 }}&amp;lt;/ref&amp;gt; This can lead to large improvements in accuracy and computational efficiency when compared to the original nested sampling algorithm, in which the allocation of samples cannot be changed and often many samples are taken in regions which have little effect on calculation accuracy.&lt;br /&gt;
&lt;br /&gt;
Publicly available dynamic nested sampling software packages include:&lt;br /&gt;
* {{Proper name|dynesty}} – a Python implementation of dynamic nested sampling which can be [https://github.com/joshspeagle/dynesty downloaded from GitHub].&amp;lt;ref&amp;gt;{{cite journal |last=Speagle |first=Joshua |title=dynesty: A Dynamic Nested Sampling Package for Estimating Bayesian Posteriors and Evidences |journal=Monthly Notices of the Royal Astronomical Society |year=2020 |volume=493 |issue=3 |pages=3132–3158 |doi=10.1093/mnras/staa278 |doi-access=free |arxiv=1904.02180|s2cid=102354337 }}&amp;lt;/ref&amp;gt;&lt;br /&gt;
* dyPolyChord: a software package which can be used with Python, C++ and Fortran likelihood and prior distributions.&amp;lt;ref&amp;gt;{{cite journal |last1=Higson |first1=Edward |title=dyPolyChord: dynamic nested sampling with PolyChord |journal=Journal of Open Source Software |date=2018 |volume=3 |issue=29 |page=965 |doi=10.21105/joss.00965 |doi-access=free }}&amp;lt;/ref&amp;gt; dyPolyChord is [https://github.com/ejhigson/dyPolyChord available on GitHub].&lt;br /&gt;
&lt;br /&gt;
Dynamic nested sampling has been applied to a variety of scientific problems, including analysis of gravitational waves,&amp;lt;ref&amp;gt;{{cite journal |last1=Ashton |first1=Gregory |title=Bilby: A User-friendly Bayesian Inference Library for Gravitational-wave Astronomy |journal=The Astrophysical Journal Supplement Series |date=2019 |volume=241 |issue=2 |page=13 |doi=10.3847/1538-4365/ab06fc |display-authors=etal|bibcode=2019ApJS..241...27A |arxiv=1811.02042 |s2cid=118677076 |doi-access=free }}&amp;lt;/ref&amp;gt; mapping distances in space&amp;lt;ref&amp;gt;{{cite journal |last1=Zucker |first1=Catherine |title=Mapping Distances across the Perseus Molecular Cloud Using {CO} Observations, Stellar Photometry, and Gaia {DR}2 Parallax Measurements |journal=The Astrophysical Journal |date=2018 |volume=869 |issue=1 |page=83 |doi=10.3847/1538-4357/aae97c |display-authors=etal|arxiv=1803.08931 |s2cid=119446622 |doi-access=free }}&amp;lt;/ref&amp;gt; and exoplanet detection.&amp;lt;ref&amp;gt;{{cite journal |last1=Günther |first1=Maximilian |title=A super-Earth and two sub-Neptunes transiting the nearby and quiet M dwarf TOI-270 |journal=Nature Astronomy |date=2019 |volume=3 |issue=12 |pages=1099–1108 |doi=10.1038/s41550-019-0845-5 |display-authors=etal|bibcode=2019NatAs...3.1099G |arxiv=1903.06107 |s2cid=119286334 }}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==See also==&lt;br /&gt;
*[[Bayesian model comparison]]&lt;br /&gt;
*[[List of algorithms]]&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
{{Reflist}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Bayesian statistics]]&lt;br /&gt;
[[Category:Model selection]]&lt;br /&gt;
[[Category:Randomized algorithms]]&lt;/div&gt;</summary>
		<author><name>imported&gt;Michael Hardy</name></author>
	</entry>
</feed>