Heaviside step function

The '''Heaviside step function''', or the '''unit step function''', usually denoted by {{mvar|H}} or {{mvar|θ}} (but sometimes {{mvar|u}}, {{math|'''1'''}} or {{math|𝟙}}), is a [[step function]] named after [[Oliver Heaviside]], the value of which is zero for negative arguments and one for positive arguments. Different conventions concerning the value {{math|''H''(0)}} are in use. It is an example of the general class of step functions, all of which can be represented as linear combinations of translations of this one.

The function was originally developed in [[operational calculus]] for the solution of [[differential equation]]s, where it represents a signal that switches on at a specified time and stays switched on indefinitely. Heaviside developed the operational calculus as a tool in the analysis of telegraphic communications and represented the function as {{math|'''1'''}}.
==Formulation==
Taking the convention that {{math|''H''(0) {{=}} 1}}, the Heaviside function may be defined as:
* A [[piecewise function]]: <math display="block">H(x) := \begin{cases} 1, & x \geq 0 \\ 0, & x < 0 \end{cases}</math>
* Using the [[Iverson bracket]] notation: <math display="block">H(x) := [x \geq 0]</math>
* An [[indicator function]]: <math display="block">H(x) := \mathbf{1}_{x \geq 0} = \mathbf{1}_{\mathbb{R}_+}(x)</math>
For the alternative convention that {{math|''H''(0) {{=}} {{sfrac|1|2}}}}, it may be expressed as:
* A [[piecewise function]]: <math display="block">H(x) := \begin{cases} 1, & x > 0 \\ \frac{1}{2}, & x = 0 \\ 0, & x < 0 \end{cases}</math>
* A [[linear transformation]] of the [[sign function]]: <math display="block">H(x) := \frac{1}{2} \left(\operatorname{sgn} x + 1\right)</math>
* The [[arithmetic mean]] of two [[Iverson bracket]]s: <math display="block">H(x) := \frac{[x \geq 0] + [x > 0]}{2}</math>
* A [[one-sided limit]] of the [[atan2|two-argument arctangent]]: <math display="block">H(x) := \lim_{\epsilon \to 0^{+}} \frac{\operatorname{atan2}(\epsilon, -x)}{\pi}</math>
* A [[hyperfunction]]: <math display="block">H(x) := \left(1 - \frac{1}{2\pi i}\log z,\ -\frac{1}{2\pi i}\log z\right)</math> or equivalently <math display="block">H(x) := \left(-\frac{\log(-z)}{2\pi i},\ -\frac{\log(-z)}{2\pi i}\right),</math> where {{math|log ''z''}} is the [[Complex logarithm#Principal value|principal value of the complex logarithm]] of {{mvar|z}}.
Other definitions which are undefined at {{math|''H''(0)}} include:
* A [[piecewise function]]: <math display="block">H(x) := \begin{cases} 1, & x > 0 \\ 0, & x < 0 \end{cases}</math>
* The derivative of the [[ramp function]]: <math display="block">H(x) := \frac{d}{dx} \max \{ x, 0 \} \quad \text{for } x \ne 0</math>
* An expression in terms of the [[absolute value]] function: <math display="block">H(x) = \frac{x + |x|}{2x}</math>
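The conventions above differ only in the value assigned at the origin. As an illustrative sketch (not drawn from the article's sources; the function name <code>heaviside</code> and the keyword <code>h0</code> are chosen here only for clarity), the step function with a selectable value of {{math|''H''(0)}} can be written in Python and checked against the sign-function identity and against NumPy's built-in <code>numpy.heaviside</code>:
<syntaxhighlight lang="python">
import numpy as np

def heaviside(x, h0=0.5):
    """Heaviside step function with a selectable value h0 at x == 0.

    h0 = 1 gives the right-continuous convention, h0 = 0 the
    left-continuous one, and h0 = 0.5 the half-maximum convention.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, 1.0, np.where(x < 0, 0.0, h0))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

# Half-maximum convention agrees with (1 + sgn x) / 2 at every point.
assert np.allclose(heaviside(x, h0=0.5), 0.5 * (1 + np.sign(x)))

# Right-continuous convention agrees with NumPy's built-in step function
# when the same value at zero is supplied.
assert np.allclose(heaviside(x, h0=1.0), np.heaviside(x, 1.0))
</syntaxhighlight>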
==Relationship with Dirac delta==
The [[Dirac delta function]] is the [[weak derivative]] of the Heaviside function:
<math display="block">\delta(x) = \frac{d}{dx} H(x).</math>
Hence the Heaviside function can be considered to be the [[integral]] of the Dirac delta function. This is sometimes written as
<math display="block">H(x) := \int_{-\infty}^x \delta(s)\,ds,</math>
although this expansion may not hold (or even make sense) for {{math|''x'' {{=}} 0}}, depending on which formalism one uses to give meaning to integrals involving {{mvar|δ}}. In this context, the Heaviside function is the [[cumulative distribution function]] of a [[random variable]] which is [[almost surely]] 0 (see [[Constant random variable]]).
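As a numerical illustration (an informal sketch, not part of the article's sources), one can replace the delta function by a narrow normalized Gaussian and observe that its running integral approaches the step function as the width shrinks:
<syntaxhighlight lang="python">
import numpy as np

x = np.linspace(-5.0, 5.0, 20001)
dx = x[1] - x[0]

for eps in (0.5, 0.1, 0.02):
    # Gaussian approximation to the Dirac delta with width eps.
    delta_eps = np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))
    # Running integral from the left edge of the grid (standing in for -infinity).
    H_eps = np.cumsum(delta_eps) * dx
    # Total mass is close to 1, and the running integral is already close
    # to 1 a short distance to the right of the origin.
    print(eps, round(H_eps[-1], 4), round(H_eps[x > 3 * eps].min(), 4))
</syntaxhighlight>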
== Analytic approximations ==
Approximations to the Heaviside step function are of use in [[biochemistry]] and [[neuroscience]], where [[logistic function|logistic]] approximations of step functions (such as the [[Hill equation (biochemistry)|Hill]] and the [[Michaelis–Menten kinetics|Michaelis–Menten equations]]) may be used to approximate binary cellular switches in response to chemical signals.

For a [[Smooth function|smooth]] approximation to the step function, one can use the [[logistic function]]
<math display="block">H(x) \approx \tfrac{1}{2} + \tfrac{1}{2}\tanh kx = \frac{1}{1+e^{-2kx}},</math>
where a larger {{mvar|k}} corresponds to a sharper transition at {{math|''x'' {{=}} 0}}. If we take {{math|''H''(0) {{=}} {{sfrac|1|2}}}}, equality holds in the limit:
<math display="block">H(x) = \lim_{k \to \infty} \tfrac{1}{2}(1+\tanh kx) = \lim_{k \to \infty} \frac{1}{1+e^{-2kx}}.</math>
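The effect of the sharpness parameter can be seen directly in a short numerical sketch (illustrative only; the sample points and the values of {{mvar|k}} are arbitrary choices):
<syntaxhighlight lang="python">
import numpy as np

x = np.array([-1.0, -0.1, -0.01, 0.0, 0.01, 0.1, 1.0])

for k in (1, 10, 100, 1000):
    # Logistic approximation: 1/2 + 1/2 tanh(k x) equals 1 / (1 + exp(-2 k x)).
    approx = 0.5 + 0.5 * np.tanh(k * x)
    print(k, np.round(approx, 3))
# As k grows, the values move toward 0 for x < 0 and toward 1 for x > 0,
# while the value at x = 0 stays exactly 1/2.
</syntaxhighlight>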
[[File:Step function approximation.png|alt=A set of functions that successively approach the step function|thumb|500x500px|<math>\tfrac{1}{2} + \tfrac{1}{2} \tanh(kx) = \frac{1}{1+e^{-2kx}}</math><br>approaches the step function as {{math|''k'' → ∞}}.|none]]
There are [[Sigmoid function#Examples|many other smooth, analytic approximations]] to the step function.<ref>{{MathWorld | urlname=HeavisideStepFunction | title=Heaviside Step Function}}</ref> Among the possibilities are:
<math display="block">\begin{align}
H(x) &= \lim_{k \to \infty} \left(\tfrac{1}{2} + \tfrac{1}{\pi}\arctan kx\right)\\
H(x) &= \lim_{k \to \infty}\left(\tfrac{1}{2} + \tfrac12\operatorname{erf} kx\right)
\end{align}</math>
These limits hold [[pointwise]] and in the sense of [[distribution (mathematics)|distributions]]. In general, however, [[pointwise convergence]] need not imply distributional convergence, nor does distributional convergence imply pointwise convergence. (However, if all members of a pointwise convergent sequence of functions are uniformly bounded by some "nice" function, then [[Lebesgue dominated convergence theorem|convergence holds in the sense of distributions too]].)

In general, any [[cumulative distribution function]] of a [[continuous distribution|continuous]] [[probability distribution]] that is peaked around zero and has a parameter that controls for [[variance]] can serve as an approximation, in the limit as the variance approaches zero. For example, all three of the above approximations are [[cumulative distribution function|cumulative distribution functions]] of common probability distributions: the [[logistic distribution|logistic]], [[Cauchy distribution|Cauchy]] and [[normal distribution|normal]] distributions, respectively.
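The identification with cumulative distribution functions can be checked numerically (an illustrative sketch; the value of {{mvar|k}}, the grid, and the particular SciPy calls are one possible way to perform the check):
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats
from scipy.special import erf

k = 7.0                                   # arbitrary sharpness parameter
x = np.linspace(-3.0, 3.0, 101)

logistic_approx = 0.5 + 0.5 * np.tanh(k * x)
arctan_approx = 0.5 + np.arctan(k * x) / np.pi
erf_approx = 0.5 + 0.5 * erf(k * x)

# Each approximation is the CDF of a zero-centred distribution whose
# scale shrinks as k grows (the scale choices follow from the formulas above).
assert np.allclose(logistic_approx, stats.logistic.cdf(x, scale=1 / (2 * k)))
assert np.allclose(arctan_approx, stats.cauchy.cdf(x, scale=1 / k))
assert np.allclose(erf_approx, stats.norm.cdf(x, scale=1 / (k * np.sqrt(2))))
</syntaxhighlight>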
== Non-analytic approximations ==
Approximations to the Heaviside step function can also be built from a [[Non-analytic_smooth_function#Smooth_transition_functions|smooth transition function]] <math>f</math> with a parameter <math>m \geq 1</math>, which approaches the step function as <math>m \to \infty</math>:
<math display="block">f(x) = \begin{cases}
\frac{1}{2}\left(1+\tanh\left(m\frac{2x}{1-x^2}\right)\right), & |x| < 1 \\
1, & x \geq 1 \\
0, & x \leq -1
\end{cases}</math>
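A direct implementation of this transition function (an illustrative sketch; the function name <code>smooth_step</code> and the sample points are chosen here only for clarity) shows the behaviour for increasing {{mvar|m}}:
<syntaxhighlight lang="python">
import numpy as np

def smooth_step(x, m):
    """Smooth (non-analytic) transition function approximating H(x)."""
    x = np.asarray(x, dtype=float)
    out = np.where(x >= 1.0, 1.0, 0.0)    # the two outer cases
    inside = np.abs(x) < 1.0
    xi = x[inside]
    out[inside] = 0.5 * (1.0 + np.tanh(m * 2.0 * xi / (1.0 - xi**2)))
    return out

x = np.array([-2.0, -0.5, -0.1, 0.0, 0.1, 0.5, 2.0])
for m in (1, 5, 25):
    print(m, np.round(smooth_step(x, m), 4))
# For |x| >= 1 the value is exactly 0 or 1; inside (-1, 1) the transition
# sharpens around x = 0 as m increases.
</syntaxhighlight>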
==Integral representations==
Often an [[integration (mathematics)|integral]] representation of the Heaviside step function is useful:
<math display="block">\begin{align}
H(x) &= \lim_{\varepsilon \to 0^+} -\frac{1}{2\pi i}\int_{-\infty}^\infty \frac{1}{\tau+i\varepsilon} e^{-i x \tau}\, d\tau \\
&= \lim_{\varepsilon \to 0^+} \frac{1}{2\pi i}\int_{-\infty}^\infty \frac{1}{\tau-i\varepsilon} e^{i x \tau}\, d\tau,
\end{align}</math>
where the second representation is easy to deduce from the first, given that the step function is real and thus is its own [[complex conjugate]].
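To see why the second representation produces the step function, one can evaluate the integral with the [[residue theorem]] (a standard contour argument, sketched here for illustration). The integrand has a single pole at {{math|''τ'' {{=}} ''iε''}}, in the upper half-plane. For {{math|''x'' > 0}} the factor <math>e^{ix\tau}</math> decays in the upper half-plane, so closing the contour there picks up the pole:
<math display="block">\frac{1}{2\pi i}\int_{-\infty}^\infty \frac{e^{i x \tau}}{\tau-i\varepsilon}\, d\tau = \frac{1}{2\pi i}\cdot 2\pi i\, e^{-\varepsilon x} = e^{-\varepsilon x} \longrightarrow 1 \quad \text{as } \varepsilon \to 0^+, \qquad x > 0,</math>
while for {{math|''x'' < 0}} the contour must be closed in the lower half-plane, which contains no pole, so the integral vanishes.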
== Zero argument ==
Since {{mvar|H}} is usually used in [[Integral|integration]], and the value of a function at a single point does not affect its integral, it rarely matters what particular value of {{math|''H''(0)}} is chosen. Indeed when {{mvar|H}} is considered as a [[distribution (mathematics)|distribution]] or an element of {{math|''L''{{isup|∞}}}} (see [[Lp space|{{math|''L{{isup|p}}''}} space]]) it does not even make sense to talk of a value at zero, since such objects are only defined [[almost everywhere]]. If an analytic approximation is used (as in the [[#Analytic approximations|examples above]]), then whatever happens to be the relevant limit at zero is often taken.

There are several reasons for choosing a particular value.
* {{math|''H''(0) {{=}} {{sfrac|1|2}}}} is often used since the [[graph of a function|graph]] then has [[rotational symmetry]]; put another way, {{math|''H'' − {{sfrac|1|2}}}} is then an [[odd function]]. In this case the following relation with the [[sign function]] holds for all {{mvar|x}}: <math display="block">H(x) = \tfrac12(1 + \sgn x),</math> and consequently {{math|''H''(''x'') + ''H''(−''x'') {{=}} 1}} for all {{mvar|x}}.
* {{math|''H''(0) {{=}} 1}} is used when {{mvar|H}} needs to be [[right-continuous]]. For instance [[cumulative distribution function]]s are usually taken to be right continuous, as are functions integrated against in [[Lebesgue–Stieltjes integration]]. In this case {{mvar|H}} is the [[indicator function]] of a [[closed set|closed]] semi-infinite interval: <math display="block"> H(x) = \mathbf{1}_{[0,\infty)}(x).</math> The corresponding probability distribution is the [[degenerate distribution]].
* {{math|''H''(0) {{=}} 0}} is used when {{mvar|H}} needs to be [[left-continuous]]. In this case {{mvar|H}} is an indicator function of an [[open set|open]] semi-infinite interval: <math display="block"> H(x) = \mathbf{1}_{(0,\infty)}(x).</math>
* In functional-analysis contexts from [[optimization]] and [[game theory]], it is often useful to define the Heaviside function as a [[Multivalued function|set-valued function]] to preserve the continuity of the limiting functions and ensure the existence of certain solutions. In these cases, the Heaviside function returns a whole interval of possible solutions, {{math|''H''(0) {{=}} [0, 1]}}.
==Discrete form==
An alternative form of the unit step, defined instead as a function <math>H : \mathbb{Z} \rarr \mathbb{R}</math> (that is, taking in a discrete variable {{mvar|n}}), is
<math display="block">H[n]=\begin{cases} 0, & n < 0, \\ 1, & n \ge 0, \end{cases}</math>
or, using the half-maximum convention:<ref>{{cite book |last=Bracewell |first=Ronald Newbold |date=2000 |title=The Fourier transform and its applications |language=en |location=New York |publisher=McGraw-Hill |isbn=0-07-303938-1 |page=61 |edition=3rd}}</ref>
<math display="block">H[n]=\begin{cases} 0, & n < 0, \\ \tfrac12, & n = 0,\\ 1, & n > 0, \end{cases}</math>
where {{mvar|n}} is an [[integer]]. If {{mvar|n}} is an integer, then {{math|''n'' < 0}} must imply that {{math|''n'' ≤ −1}}, while {{math|''n'' > 0}} must imply that the function attains unity at {{math|1=''n'' = 1}}. Therefore the "step function" exhibits ramp-like behavior over the interval {{closed-closed|−1, 1}}, and cannot authentically be a step function, using the half-maximum convention.

Unlike the continuous case, the definition of {{math|''H''[0]}} is significant.

The discrete-time unit impulse is the first difference of the discrete-time step:
<math display="block"> \delta[n] = H[n] - H[n-1].</math>
This function is the cumulative summation of the [[Kronecker delta]]:
<math display="block"> H[n] = \sum_{k=-\infty}^{n} \delta[k], </math>
where <math display="inline"> \delta[k] = \delta_{k,0} </math> is the [[degenerate distribution|discrete unit impulse function]].
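The discrete identities can be checked directly on a finite range of indices (an illustrative sketch; the index range is an arbitrary choice):
<syntaxhighlight lang="python">
import numpy as np

n = np.arange(-5, 6)                      # indices -5, ..., 5

H = (n >= 0).astype(int)                  # discrete unit step, with H[0] = 1
delta = (n == 0).astype(int)              # Kronecker delta, 1 only at n = 0

# First difference of the step reproduces the unit impulse
# (np.diff with prepend=0 supplies H[n-1] = 0 to the left of the range).
assert np.array_equal(np.diff(H, prepend=0), delta)

# Cumulative summation of the impulse reproduces the step.
assert np.array_equal(np.cumsum(delta), H)
</syntaxhighlight>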
== Antiderivative and derivative ==
The [[ramp function]] is an [[antiderivative]] of the Heaviside step function:
<math display="block">\int_{-\infty}^{x} H(\xi)\,d\xi = x H(x) = \max\{0,x\} \,.</math>
The [[distributional derivative]] of the Heaviside step function is the [[Dirac delta function]]:
<math display="block"> \frac{d H(x)}{dx} = \delta(x) \,.</math>
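The antiderivative relation can be checked numerically by accumulating the step function on a grid (an illustrative sketch; the grid is an arbitrary choice, and the value chosen for {{math|''H''(0)}} has no effect on the result):
<syntaxhighlight lang="python">
import numpy as np

x = np.linspace(-4.0, 4.0, 8001)
dx = x[1] - x[0]

H = np.heaviside(x, 0.5)

# Running integral of H from the left edge of the grid approximates
# the ramp function max(0, x).
ramp_numeric = np.cumsum(H) * dx
ramp_exact = np.maximum(0.0, x)

print(np.max(np.abs(ramp_numeric - ramp_exact)))   # small, on the order of dx
</syntaxhighlight>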
== Fourier transform ==
The [[Fourier transform]] of the Heaviside step function is a distribution. Using one choice of constants for the definition of the Fourier transform we have
<math display="block">\hat{H}(s) = \lim_{N\to\infty}\int^N_{-N} e^{-2\pi i x s} H(x)\,dx = \frac{1}{2} \left( \delta(s) - \frac{i}{\pi} \operatorname{p.v.}\frac{1}{s} \right).</math>
Here {{math|p.v.{{sfrac|1|''s''}}}} is the [[distribution (mathematics)|distribution]] that takes a test function {{mvar|φ}} to the [[Cauchy principal value]] of <math>\textstyle\int_{-\infty}^\infty \frac{\varphi(s)}{s} \, ds</math>. The limit appearing in the integral is also taken in the sense of (tempered) distributions.
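This formula can be deduced (a brief sketch, for illustration) by writing {{math|''H'' {{=}} {{sfrac|1|2}}(1 + sgn)}} and using the transforms of the constant function and of the sign function under the same convention, {{math|1 ↦ ''δ''(''s'')}} and <math display="inline">\operatorname{sgn} x \mapsto -\frac{i}{\pi}\operatorname{p.v.}\frac{1}{s}</math>:
<math display="block">\hat{H}(s) = \tfrac{1}{2}\,\widehat{1}(s) + \tfrac{1}{2}\,\widehat{\operatorname{sgn}}(s) = \frac{1}{2}\left(\delta(s) - \frac{i}{\pi}\operatorname{p.v.}\frac{1}{s}\right).</math>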
== Unilateral Laplace transform ==
The [[Laplace transform]] of the Heaviside step function is a [[meromorphic function]]. Using the unilateral Laplace transform we have
<math display="block">\begin{align}
\hat{H}(s) &= \lim_{N\to\infty}\int^N_{0} e^{-sx} H(x)\,dx\\
&= \lim_{N\to\infty}\int^N_{0} e^{-sx} \,dx\\
&= \frac{1}{s}. \end{align}</math>
When the [[Laplace transform#Bilateral Laplace transform|bilateral transform]] is used, the integral can be split in two parts and the result will be the same.
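The computation can be reproduced symbolically (an illustrative sketch using SymPy; the assumption that {{mvar|s}} is positive stands in for the usual condition {{math|Re ''s'' > 0}}):
<syntaxhighlight lang="python">
import sympy as sp

x, s = sp.symbols('x s', positive=True)

# Direct evaluation of the defining integral: since H(x) = 1 on [0, oo),
# the unilateral transform reduces to the integral of exp(-s*x).
direct = sp.integrate(sp.exp(-s * x), (x, 0, sp.oo))
assert sp.simplify(direct - 1 / s) == 0

# SymPy's built-in transform of its Heaviside function gives the same result.
builtin = sp.laplace_transform(sp.Heaviside(x), x, s, noconds=True)
assert sp.simplify(builtin - 1 / s) == 0
</syntaxhighlight>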
==See also==
* [[Gamma function]]
* [[Dirac delta function]]
* [[Indicator function]]
* [[Iverson bracket]]
* [[Laplace transform]]
* [[Laplacian of the indicator]]
* [[List of mathematical functions]]
* [[Macaulay brackets]]
* [[Negative number]]
* [[Rectangular function]]
* [[Sign function]]
* [[Sine integral]]
* [[Step response]]
==References==
{{reflist}}
==External links==
* [[Digital Library of Mathematical Functions]], NIST.