<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Hartley_function</id>
	<title>Hartley function - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Hartley_function"/>
	<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Hartley_function&amp;action=history"/>
	<updated>2026-05-09T20:45:00Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.1</generator>
	<entry>
		<id>http://debianws.lexgopc.com/wiki143/index.php?title=Hartley_function&amp;diff=2410160&amp;oldid=prev</id>
		<title>imported&gt;Jacobolus: base of a logarithm is a better target than base (exponentiation)</title>
		<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Hartley_function&amp;diff=2410160&amp;oldid=prev"/>
		<updated>2025-01-28T07:06:22Z</updated>

		<summary type="html">&lt;p&gt;&lt;a href=&quot;/wiki143/index.php?title=Base_of_a_logarithm&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;Base of a logarithm (page does not exist)&quot;&gt;base of a logarithm&lt;/a&gt; is a better target than &lt;a href=&quot;/wiki143/index.php?title=Base_(exponentiation)&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;Base (exponentiation) (page does not exist)&quot;&gt;base (exponentiation)&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;The &amp;#039;&amp;#039;&amp;#039;Hartley function&amp;#039;&amp;#039;&amp;#039; is a measure of [[uncertainty]], introduced by [[Ralph Hartley]] in 1928. If a sample is picked uniformly at random from a finite set &amp;#039;&amp;#039;A&amp;#039;&amp;#039;, the information revealed once the outcome is known is given by the Hartley function&lt;br /&gt;
: &amp;lt;math&amp;gt; H_0(A) := \log_b \vert A \vert ,&amp;lt;/math&amp;gt;&lt;br /&gt;
where {{abs|&amp;#039;&amp;#039;A&amp;#039;&amp;#039;}} denotes the [[cardinality]] of &amp;#039;&amp;#039;A&amp;#039;&amp;#039;.&lt;br /&gt;
&lt;br /&gt;
If the [[base of a logarithm|base of the logarithm]] is 2, then the unit of uncertainty is the [[Shannon (unit)|shannon]] (more commonly known as the [[bit]]). If the logarithm is the [[natural logarithm]], then the unit is the [[Nat (unit)|nat]]. Hartley used a [[base-ten logarithm]], and with this base the unit of information is called the [[Hartley (unit)|hartley]] (also known as the [[Hartley (unit)|ban]] or [[Hartley (unit)|dit]]) in his honor. The Hartley function is also known as the Hartley entropy or max-entropy.&lt;br /&gt;
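As a quick numeric illustration (a minimal sketch added here, not part of the original article; the function name hartley is an arbitrary choice), the Hartley function of a set with 10 elements can be evaluated in each of the three units using only the Python standard library:&lt;br /&gt;

```python
import math

def hartley(n, log=math.log2):
    """Hartley function H0 of a set with n elements; default unit is shannons."""
    return log(n)

n = 10  # e.g. a uniformly random decimal digit
print(hartley(n))             # shannons (bits): log2(10)
print(hartley(n, math.log))   # nats: ln(10)
print(hartley(n, math.log10)) # hartleys: log10(10) = 1.0
```

Passing a different logarithm simply selects the unit of information.&lt;br /&gt;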
&lt;br /&gt;
== Hartley function, Shannon entropy, and Rényi entropy ==&lt;br /&gt;
The Hartley function coincides with the [[Shannon entropy]] (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is the [[Rényi entropy]] of order zero, since&lt;br /&gt;
:&amp;lt;math&amp;gt;H_0(X) = \frac 1 {1-0} \log \sum_{i=1}^{|\mathcal{X}|} p_i^0 = \log |\mathcal{X}|.&amp;lt;/math&amp;gt;&lt;br /&gt;
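The order-zero special case above can be spot-checked numerically (a small sketch, assuming a uniform distribution on 8 outcomes):&lt;br /&gt;

```python
import math

p = [1 / 8] * 8  # uniform distribution on a set X with 8 elements

# Rényi entropy of order 0: (1/(1-0)) * log2( sum_i p_i**0 )
renyi_0 = (1 / (1 - 0)) * math.log2(sum(pi**0 for pi in p))

print(renyi_0, math.log2(len(p)))  # both equal 3.0 shannons
```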
&lt;br /&gt;
But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi, the Hartley function can be defined without introducing any notions of probability (see &amp;#039;&amp;#039;Uncertainty and information&amp;#039;&amp;#039; by George J. Klir, p.&amp;amp;nbsp;423).&lt;br /&gt;
&lt;br /&gt;
== Characterization of the Hartley function ==&lt;br /&gt;
The Hartley function depends only on the number of elements in a set, and hence can be viewed as a function on the natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies&lt;br /&gt;
&lt;br /&gt;
# &amp;lt;math&amp;gt;H(mn) = H(m)+H(n)&amp;lt;/math&amp;gt; (additivity)&lt;br /&gt;
# &amp;lt;math&amp;gt;H(m) \leq H(m+1)&amp;lt;/math&amp;gt; (monotonicity)&lt;br /&gt;
# &amp;lt;math&amp;gt;H(2)=1&amp;lt;/math&amp;gt; (normalization)&lt;br /&gt;
&lt;br /&gt;
Condition 1 says that the uncertainty of the Cartesian product of two finite sets &amp;#039;&amp;#039;A&amp;#039;&amp;#039; and &amp;#039;&amp;#039;B&amp;#039;&amp;#039; is the sum of the uncertainties of &amp;#039;&amp;#039;A&amp;#039;&amp;#039; and &amp;#039;&amp;#039;B&amp;#039;&amp;#039;. Condition 2 says that a larger set has at least as much uncertainty.&lt;br /&gt;
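The three conditions can be spot-checked for the base-2 logarithm (a sketch only; a numeric check is of course not a proof of uniqueness):&lt;br /&gt;

```python
import math

H = math.log2  # candidate function on the natural numbers

# 1. Additivity: H(mn) = H(m) + H(n)
assert all(math.isclose(H(m * n), H(m) + H(n))
           for m in range(1, 50) for n in range(1, 50))

# 2. Monotonicity: H(m + 1) is at least H(m)
assert all(H(m + 1) >= H(m) for m in range(1, 1000))

# 3. Normalization: H(2) = 1
assert H(2) == 1.0

print("log2 satisfies additivity, monotonicity, and normalization")
```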
&lt;br /&gt;
== Derivation of the Hartley function ==&lt;br /&gt;
We want to show that the Hartley function, log&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;(&amp;#039;&amp;#039;n&amp;#039;&amp;#039;), is the only function mapping natural numbers to real numbers that satisfies&lt;br /&gt;
# &amp;lt;math&amp;gt;H(mn) = H(m)+H(n)\,&amp;lt;/math&amp;gt; (additivity)&lt;br /&gt;
# &amp;lt;math&amp;gt;H(m) \leq H(m+1)\,&amp;lt;/math&amp;gt; (monotonicity)&lt;br /&gt;
# &amp;lt;math&amp;gt;H(2)=1\,&amp;lt;/math&amp;gt; (normalization)&lt;br /&gt;
&lt;br /&gt;
Let &amp;#039;&amp;#039;f&amp;#039;&amp;#039; be a function on the positive integers that satisfies the above three properties. From the additivity property, it follows by induction that for any positive integers &amp;#039;&amp;#039;n&amp;#039;&amp;#039; and &amp;#039;&amp;#039;k&amp;#039;&amp;#039;,&lt;br /&gt;
: &amp;lt;math&amp;gt;f(n^k) = kf(n).\,&amp;lt;/math&amp;gt;&lt;br /&gt;
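The induction behind this identity is short; in outline (added here for completeness):&lt;br /&gt;

```latex
% Base case k = 1: f(n^1) = f(n).
% Induction step: assume f(n^k) = k f(n); additivity with m = n^k gives
f(n^{k+1}) = f(n^k \cdot n) = f(n^k) + f(n) = k f(n) + f(n) = (k+1) f(n).
```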
&lt;br /&gt;
Let &amp;#039;&amp;#039;a&amp;#039;&amp;#039;, &amp;#039;&amp;#039;b&amp;#039;&amp;#039;, and &amp;#039;&amp;#039;t&amp;#039;&amp;#039; be positive integers, with &amp;#039;&amp;#039;a&amp;#039;&amp;#039; and &amp;#039;&amp;#039;b&amp;#039;&amp;#039; at least 2. There is a unique integer &amp;#039;&amp;#039;s&amp;#039;&amp;#039; determined by&lt;br /&gt;
: &amp;lt;math&amp;gt;a^s \leq b^t &amp;lt; a^{s+1}. \qquad(1)&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Therefore,&lt;br /&gt;
: &amp;lt;math&amp;gt;s \log_2 a\leq t \log_2 b \leq (s+1) \log_2 a \, &amp;lt;/math&amp;gt;&lt;br /&gt;
and&lt;br /&gt;
: &amp;lt;math&amp;gt;\frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} \leq \frac{s+1}{t}.&amp;lt;/math&amp;gt;&lt;br /&gt;
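For concreteness, the integer s of inequality (1) and the resulting two-sided bound can be computed directly (an illustrative sketch; the values a = 3, b = 5, t = 7 are arbitrary):&lt;br /&gt;

```python
import math

a, b, t = 3, 5, 7  # arbitrary example values, with a and b both at least 2

# The unique integer s from inequality (1): a**s is at most b**t,
# and b**t lies strictly below a**(s+1).
s = math.floor(t * math.log(b, a))
assert b**t >= a**s and a**(s + 1) > b**t
print(s, a**s, b**t, a**(s + 1))

# The resulting two-sided bound on log2(b)/log2(a):
ratio = math.log2(b) / math.log2(a)
assert ratio >= s / t and (s + 1) / t >= ratio
```

Taking t larger tightens the bracket [s/t, (s+1)/t] around the ratio, which is the point of the argument that follows.&lt;br /&gt;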
&lt;br /&gt;
On the other hand, by monotonicity,&lt;br /&gt;
: &amp;lt;math&amp;gt;f(a^s) \leq f(b^t) \leq f(a^{s+1}). \, &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Applying the identity &amp;lt;math&amp;gt;f(n^k) = kf(n)&amp;lt;/math&amp;gt; to each term, one gets&lt;br /&gt;
: &amp;lt;math&amp;gt;s f(a) \leq t f(b) \leq (s+1) f(a),\,&amp;lt;/math&amp;gt;&lt;br /&gt;
and&lt;br /&gt;
: &amp;lt;math&amp;gt;\frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s+1}{t}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Hence,&lt;br /&gt;
: &amp;lt;math&amp;gt;\left\vert \frac{f(b)}{f(a)} - \frac{\log_2(b)}{\log_2(a)} \right\vert \leq \frac{1}{t}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Since &amp;#039;&amp;#039;t&amp;#039;&amp;#039; can be taken arbitrarily large while &amp;#039;&amp;#039;a&amp;#039;&amp;#039; and &amp;#039;&amp;#039;b&amp;#039;&amp;#039; remain fixed, the difference on the left-hand side of the above inequality must be zero,&lt;br /&gt;
: &amp;lt;math&amp;gt;\frac{f(b)}{f(a)} = \frac{\log_2(b)}{\log_2(a)}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
So,&lt;br /&gt;
: &amp;lt;math&amp;gt;f(a) = \mu \log_2(a)\,&amp;lt;/math&amp;gt;&lt;br /&gt;
for some constant &amp;#039;&amp;#039;μ&amp;#039;&amp;#039; independent of &amp;#039;&amp;#039;a&amp;#039;&amp;#039;; by the normalization property &amp;#039;&amp;#039;f&amp;#039;&amp;#039;(2) = 1, this constant must be equal to 1.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
* [[Rényi entropy]]&lt;br /&gt;
* [[Min-entropy]]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
{{reflist}}&lt;br /&gt;
* {{PlanetMath attribution|id=6070|title=Hartley function}}&lt;br /&gt;
* {{PlanetMath attribution|id=6082|title=Derivation of Hartley function}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Information theory]]&lt;/div&gt;</summary>
		<author><name>imported&gt;Jacobolus</name></author>
	</entry>
</feed>