Markov blanket
[Figure: In a Bayesian network, the Markov boundary of node A includes its parents, children and the other parents of all of its children.]
In statistics and machine learning, a Markov blanket of a random variable is a set of variables that renders the variable conditionally independent of all other variables in the system. This concept is central in probabilistic graphical models and feature selection. If a Markov blanket is minimal, meaning that no variable can be removed from it without losing this conditional independence, it is called a Markov boundary. Identifying a Markov blanket or boundary allows for efficient inference and helps isolate the variables relevant for prediction or causal reasoning. The terms Markov blanket and Markov boundary were coined by Judea Pearl in his 1988 book Probabilistic Reasoning in Intelligent Systems.[1] A Markov blanket may be derived from the structure of a probabilistic graphical model such as a Bayesian network or Markov random field.
Markov blanket
A Markov blanket of a random variable <math>Y</math> in a random variable set <math>\mathcal{S}=\{X_1,\ldots,X_n\}</math> is any subset <math>\mathcal{S}_1</math> of <math>\mathcal{S}</math>, conditioned on which the other variables are independent of <math>Y</math>:

<math display="block">Y \perp\!\!\!\perp \mathcal{S} \smallsetminus \mathcal{S}_1 \mid \mathcal{S}_1</math>

This means that <math>\mathcal{S}_1</math> contains at least all the information one needs to infer <math>Y</math>; the variables in <math>\mathcal{S} \smallsetminus \mathcal{S}_1</math> are redundant.

In general, a given Markov blanket is not unique. Any set in <math>\mathcal{S}</math> that contains a Markov blanket is also a Markov blanket itself. Specifically, <math>\mathcal{S}</math> is a Markov blanket of <math>Y</math> in <math>\mathcal{S}</math>.
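The defining conditional independence can be checked directly on a small joint distribution. The following is a minimal numerical sketch (the chain structure and the probability values are assumptions made for illustration): for three binary variables forming the chain <math>X_1 \to X_2 \to Y</math>, the subset <math>\{X_2\}</math> is a Markov blanket of <math>Y</math> within <math>\{X_1, X_2\}</math>, because <math>P(Y \mid X_1, X_2)</math> does not depend on <math>X_1</math>.

```python
# Sketch (assumed example, not from the article): verify numerically that
# {X2} is a Markov blanket of Y within S = {X1, X2} for the chain
# X1 -> X2 -> Y, i.e. that P(Y | X1, X2) = P(Y | X2) for every value of X1.
import itertools
import numpy as np

p_x1 = np.array([0.6, 0.4])                 # P(X1)
p_x2_given_x1 = np.array([[0.7, 0.3],       # P(X2 | X1=0)
                          [0.2, 0.8]])      # P(X2 | X1=1)
p_y_given_x2 = np.array([[0.9, 0.1],        # P(Y | X2=0)
                         [0.4, 0.6]])       # P(Y | X2=1)

# Joint distribution P(X1, X2, Y) implied by the chain factorization.
joint = np.zeros((2, 2, 2))
for x1, x2, y in itertools.product(range(2), repeat=3):
    joint[x1, x2, y] = p_x1[x1] * p_x2_given_x1[x1, x2] * p_y_given_x2[x2, y]

# P(Y = 1 | X1, X2) depends only on X2, so conditioning on {X2} shields Y
# from X1.
for x1, x2 in itertools.product(range(2), repeat=2):
    cond = joint[x1, x2] / joint[x1, x2].sum()
    print(f"X1={x1}, X2={x2}: P(Y=1 | X1, X2) = {cond[1]:.3f}")
```

Running the sketch prints 0.100 whenever X2 = 0 and 0.600 whenever X2 = 1, regardless of X1, which is exactly the conditional independence in the definition.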
Example
In a Bayesian network, the Markov blanket of a node consists of its parents, its children, and its children's other parents (i.e., co-parents). Knowing the values of these nodes makes the target node conditionally independent of the rest of the network. In a Markov random field, the Markov blanket of a node is simply its immediate neighbors.
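As a sketch of how this reads off the graph structure, the following hypothetical Python function (the name markov_blanket and the parent-dictionary representation are assumptions for illustration, not a standard API) collects the parents, children and co-parents of a node in a Bayesian network:

```python
# Hypothetical sketch: Markov blanket of a node in a Bayesian network given
# as a mapping {node: set of its parents}.
def markov_blanket(parents, node):
    children = {v for v, ps in parents.items() if node in ps}
    coparents = {p for c in children for p in parents[c]}
    return (parents[node] | children | coparents) - {node}

# Example network: A -> C <- B, C -> D.
net = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}
print(markov_blanket(net, "C"))  # parents A and B, plus child D
print(markov_blanket(net, "A"))  # child C, plus C's other parent B
```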
Markov condition
The concept of a Markov blanket is rooted in the Markov condition, which states that in a Bayesian network each variable is conditionally independent of its non-descendants given its parents.[1] This condition guarantees the existence of a separating set, a Markov blanket, that shields a variable from the rest of the network.
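Written out, with <math>\operatorname{Pa}(X)</math> denoting the parents of <math>X</math> and <math>\operatorname{NonDesc}(X)</math> its non-descendants (notation chosen here for illustration), the local Markov condition reads:

<math display="block">X \perp\!\!\!\perp \operatorname{NonDesc}(X) \smallsetminus \operatorname{Pa}(X) \mid \operatorname{Pa}(X)</math>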
Markov boundary
A Markov boundary of <math>Y</math> in <math>\mathcal{S}</math> is a subset <math>\mathcal{S}_2</math> of <math>\mathcal{S}</math> such that <math>\mathcal{S}_2</math> itself is a Markov blanket of <math>Y</math>, but any proper subset of <math>\mathcal{S}_2</math> is not a Markov blanket of <math>Y</math>. In other words, a Markov boundary is a minimal Markov blanket.
The Markov boundary of a node <math>A</math> in a Bayesian network is the set of nodes composed of <math>A</math>'s parents, <math>A</math>'s children, and <math>A</math>'s children's other parents. In a Markov random field, the Markov boundary for a node is the set of its neighboring nodes. In a dependency network, the Markov boundary for a node is the set of its parents.
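In practice, a Markov boundary is often obtained by shrinking a known Markov blanket. The following is a rough Python sketch under assumed conditions: it presumes access to a conditional-independence oracle, here a hypothetical callable is_independent, and it yields a minimal blanket only when the distribution is well behaved (for example, faithful to the graph).

```python
# Hypothetical sketch: backward elimination from a Markov blanket toward a
# Markov boundary.  `is_independent(Y, X, given)` is an assumed
# conditional-independence oracle (e.g. a statistical test).
def markov_boundary(Y, blanket, is_independent):
    boundary = set(blanket)
    for X in list(blanket):
        rest = boundary - {X}
        # Drop X if Y is still shielded from it by the remaining variables.
        if is_independent(Y, X, given=rest):
            boundary = rest
    return boundary
```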
Uniqueness of Markov boundary
The Markov boundary always exists, and under some mild conditions it is unique. However, in most practical and theoretical scenarios multiple Markov boundaries may provide alternative solutions.[2] When there are multiple Markov boundaries, quantities measuring causal effect could fail.[3]
See also
- Andrey Markov
- Free energy minimisation
- Moral graph
- Separation of concerns
- Causality
- Causal inference