<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Active-set_method</id>
	<title>Active-set method - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Active-set_method"/>
	<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Active-set_method&amp;action=history"/>
	<updated>2026-05-08T20:31:49Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.1</generator>
	<entry>
		<id>http://debianws.lexgopc.com/wiki143/index.php?title=Active-set_method&amp;diff=6177508&amp;oldid=prev</id>
		<title>imported&gt;Lorax: Adding short description: &quot;Mathematical optimization algorithm&quot;</title>
		<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Active-set_method&amp;diff=6177508&amp;oldid=prev"/>
		<updated>2025-12-15T02:38:47Z</updated>

		<summary type="html">&lt;p&gt;Adding &lt;a href=&quot;https://en.wikipedia.org/wiki/Short_description&quot; class=&quot;extiw&quot; title=&quot;wikipedia:Short description&quot;&gt;short description&lt;/a&gt;: &amp;quot;Mathematical optimization algorithm&amp;quot;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Previous revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 02:38, 15 December 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;{{Short description|Mathematical optimization algorithm}}&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;{{redirect|Active set|the band|The Active Set}}&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;{{redirect|Active set|the band|The Active Set}}&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>imported&gt;Lorax</name></author>
	</entry>
	<entry>
		<id>http://debianws.lexgopc.com/wiki143/index.php?title=Active-set_method&amp;diff=1173635&amp;oldid=prev</id>
		<title>128.84.126.71: Typo &quot;intital&quot; fixed to &quot;initial&quot;.</title>
		<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Active-set_method&amp;diff=1173635&amp;oldid=prev"/>
		<updated>2025-05-07T21:11:50Z</updated>

		<summary type="html">&lt;p&gt;Typo &amp;quot;intital&amp;quot; fixed to &amp;quot;initial&amp;quot;.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{redirect|Active set|the band|The Active Set}}&lt;br /&gt;
&lt;br /&gt;
In mathematical [[Optimization (mathematics)|optimization]], the &amp;#039;&amp;#039;&amp;#039;active-set method&amp;#039;&amp;#039;&amp;#039; is an algorithm used to identify the active [[Constraint (mathematics)|constraints]] in a set of [[Inequality (mathematics)|inequality]] constraints. The active constraints are then expressed as equality constraints, thereby transforming an inequality-constrained problem into a simpler equality-constrained subproblem.&lt;br /&gt;
&lt;br /&gt;
An optimization problem is defined using an objective function to minimize or maximize, and a set of constraints&lt;br /&gt;
&lt;br /&gt;
: &amp;lt;math&amp;gt;g_1(x) \ge 0, \dots, g_k(x) \ge 0&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
that define the [[feasible region]], that is, the set of all candidate points &amp;#039;&amp;#039;x&amp;#039;&amp;#039; among which the optimal solution is sought. Given a point &amp;lt;math&amp;gt;x_0&amp;lt;/math&amp;gt; in the feasible region, a constraint &lt;br /&gt;
&lt;br /&gt;
: &amp;lt;math&amp;gt;g_i(x) \ge 0&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
is called &amp;#039;&amp;#039;&amp;#039;active&amp;#039;&amp;#039;&amp;#039; at &amp;lt;math&amp;gt;x_0&amp;lt;/math&amp;gt; if &amp;lt;math&amp;gt;g_i(x_0) = 0&amp;lt;/math&amp;gt;, and &amp;#039;&amp;#039;&amp;#039;inactive&amp;#039;&amp;#039;&amp;#039; at &amp;lt;math&amp;gt;x_0&amp;lt;/math&amp;gt; if &amp;lt;math&amp;gt;g_i(x_0) &amp;gt; 0.&amp;lt;/math&amp;gt; Equality constraints are always active. The &amp;#039;&amp;#039;&amp;#039;active set&amp;#039;&amp;#039;&amp;#039; at &amp;lt;math&amp;gt;x_0&amp;lt;/math&amp;gt; is made up of those constraints &amp;lt;math&amp;gt;g_i&amp;lt;/math&amp;gt; that are active at the current point {{harv|Nocedal|Wright|2006|p=308}}.&lt;br /&gt;
&lt;br /&gt;
The active set is particularly important in optimization theory, as it determines which constraints will influence the final result of optimization. For example, in solving the [[linear programming]] problem, the active set gives the [[hyperplane]]s that intersect at the solution point. In [[quadratic programming]], as the solution is not necessarily on one of the edges of the bounding polygon, an estimate of the active set gives a subset of inequalities to watch while searching for the solution, which reduces the complexity of the search.&lt;br /&gt;
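The definitions above can be checked numerically. The following is a minimal sketch; the constraint functions and the tolerance are illustrative choices, not taken from the article:

```python
# Identify which inequality constraints g_i(x) >= 0 are active at a point.
# The constraints and tolerance below are illustrative examples.

def active_set(constraints, x, tol=1e-9):
    """Return the indices i for which g_i(x) = 0 (within tol)."""
    return [i for i, g in enumerate(constraints) if tol >= abs(g(x))]

# Feasible region: x >= 0 and 1 - x >= 0 (the unit interval).
g = [lambda x: x, lambda x: 1.0 - x]

print(active_set(g, 0.0))  # x = 0: the first constraint is active
print(active_set(g, 0.5))  # interior point: no constraint is active
print(active_set(g, 1.0))  # x = 1: the second constraint is active
```

At the two endpoints exactly one bound holds with equality, so the active set is a singleton; at the interior point both constraints are strictly positive and the active set is empty.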
&lt;br /&gt;
==Active-set methods==&lt;br /&gt;
In general an active-set algorithm has the following structure:&lt;br /&gt;
&lt;br /&gt;
: Find a feasible starting point&lt;br /&gt;
: &amp;#039;&amp;#039;&amp;#039;repeat until&amp;#039;&amp;#039;&amp;#039; &amp;quot;optimal enough&amp;quot;&lt;br /&gt;
:: &amp;#039;&amp;#039;solve&amp;#039;&amp;#039; the equality problem defined by the active set (approximately)&lt;br /&gt;
:: &amp;#039;&amp;#039;compute&amp;#039;&amp;#039; the [[Lagrange multipliers]] of the active set&lt;br /&gt;
:: &amp;#039;&amp;#039;remove&amp;#039;&amp;#039; a subset of the constraints with negative Lagrange multipliers&lt;br /&gt;
:: &amp;#039;&amp;#039;search&amp;#039;&amp;#039; for infeasible constraints among the inactive constraints and add them to the problem&lt;br /&gt;
: &amp;#039;&amp;#039;&amp;#039;end repeat&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The motivation for this is that, near the optimum, usually only a small number of all constraints are binding, while the solve step usually takes superlinear time in the number of constraints. Thus, repeatedly solving a sequence of equality-constrained problems, dropping those constraints that are not violated but stand in the way of improvement (negative Lagrange multipliers) and adding those constraints that the current solution violates, can converge to the true solution. The optimum of the previous problem can often provide an initial guess in case the equality-constrained problem solver needs an initial value.&lt;br /&gt;
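As an illustration, the loop above can be sketched for a very simple box-constrained quadratic program, minimizing 0.5 * ||x - c||^2 subject to x_i &gt;= 0. The problem, the variable names, and the iteration limit are illustrative choices, not taken from the article:

```python
# A minimal primal active-set sketch for the illustrative QP
#   minimize 0.5 * ||x - c||^2   subject to   x_i >= 0.
# The working set W holds the indices whose bound x_i = 0 is treated
# as an equality constraint, following the loop structure above.

def active_set_qp(c, max_iter=100):
    n = len(c)
    W = set(range(n))                  # feasible start: all bounds active
    for _ in range(max_iter):
        # Solve the equality problem defined by the working set:
        # the free coordinates minimize the objective at x_i = c_i.
        x = [0.0 if i in W else c[i] for i in range(n)]
        # Lagrange multiplier of the bound x_i >= 0 at x_i = 0 is -c_i here.
        lam = {i: -c[i] for i in W}
        # Constraints with negative multipliers block improvement: remove them.
        neg = [i for i in W if 0.0 > lam[i]]
        # Inactive constraints violated by the current point: add them.
        viol = [i for i in range(n) if i not in W and 0.0 > x[i]]
        if not neg and not viol:
            return x                   # KKT conditions hold: optimal
        W.difference_update(neg)
        W.update(viol)
    raise RuntimeError("no convergence within max_iter")

print(active_set_qp([3.0, -1.0, 0.5]))
```

For this separable problem the optimum is simply max(c_i, 0) in each coordinate, which the loop recovers after freeing the coordinates whose bounds carry negative multipliers.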
&lt;br /&gt;
Methods that can be described as &amp;#039;&amp;#039;&amp;#039;active-set methods&amp;#039;&amp;#039;&amp;#039; include:&amp;lt;ref&amp;gt;{{harvnb|Nocedal|Wright|2006|pp=467–480}}&amp;lt;/ref&amp;gt;&lt;br /&gt;
* [[Successive linear programming]] (SLP) &amp;lt;!-- acc. to: Leyffer... - alt: acc. to &amp;quot;MPS glossary&amp;quot;, http://glossary.computing.society.informs.org/ver2/mpgwiki/index.php/Main_Page: Successive approximation --&amp;gt;&lt;br /&gt;
* [[Sequential quadratic programming]] (SQP) &amp;lt;!-- acc. to: Leyffer... - alt: acc. to &amp;quot;MPS glossary&amp;quot;, http://glossary.computing.society.informs.org/ver2/mpgwiki/index.php/Main_Page: Successive approximation  --&amp;gt;&lt;br /&gt;
* [[Sequential linear-quadratic programming]] (SLQP) &amp;lt;!-- acc. to: Leyffer... - alt: acc. to &amp;quot;MPS glossary&amp;quot;, http://glossary.computing.society.informs.org/ver2/mpgwiki/index.php/Main_Page: Successive approximation  --&amp;gt;&lt;br /&gt;
* [[Frank–Wolfe algorithm|Reduced gradient method]] (RG) &amp;lt;!-- acc. to: MPS glossary, http://glossary.computing.society.informs.org/ver2/mpgwiki/index.php/Main_Page - alt: acc. to &amp;quot;Optimization - Theory and Practice&amp;quot; (Forst, Hoffmann): Projection method --&amp;gt;&lt;br /&gt;
* [[Generalized Reduced Gradient|Generalized reduced gradient method]] (GRG) &amp;lt;!-- acc. to: MPS glossary, http://glossary.computing.society.informs.org/ver2/mpgwiki/index.php/Main_Page - alt: acc. to &amp;quot;Optimization - Theory and Practice&amp;quot; (Forst, Hoffmann): Projection method --&amp;gt;&lt;br /&gt;
&amp;lt;!-- ? Wilson&amp;#039;s Lagrange-newton method --&amp;gt;&lt;br /&gt;
&amp;lt;!-- ? Method of feasible directions (MFD) --&amp;gt;&lt;br /&gt;
&amp;lt;!-- ? Gradient projection method -  alt: acc. to &amp;quot;Optimization - Theory and Practice&amp;quot; (Forst, Hoffmann): Projection method --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Performance ==&lt;br /&gt;
Consider the problem of linearly constrained convex quadratic programming. Under reasonable assumptions (the problem is feasible, the system of constraints is regular at every point, and the quadratic objective is strongly convex), the active-set method terminates after finitely many steps and yields a global solution to the problem. Theoretically, the active-set method may perform a number of iterations exponential in the number of constraints &amp;#039;&amp;#039;m&amp;#039;&amp;#039;, like the [[simplex method]]. However, its practical behaviour is typically much better.&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;{{Cite web |last=Nemirovsky and Ben-Tal |date=2023 |title=Optimization III: Convex Optimization |url=http://www2.isye.gatech.edu/~nemirovs/OPTIIILN2023Spring.pdf}}&amp;lt;/ref&amp;gt;{{Rp|location=Sec.9.1}}&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
{{Reflist}}&lt;br /&gt;
&lt;br /&gt;
==Bibliography==&lt;br /&gt;
* {{cite book |last=Murty |first=K. G. |title=Linear complementarity, linear and nonlinear programming |series=Sigma Series in Applied Mathematics |volume=3 |publisher=Heldermann Verlag |location=Berlin |year=1988 |pages=xlviii+629 pp |isbn=3-88538-403-5 |url=http://ioe.engin.umich.edu/people/fac/books/murty/linear_complementarity_webbook/ |mr=949214 |access-date=2010-04-03 |archive-url=https://web.archive.org/web/20100401043940/http://ioe.engin.umich.edu/people/fac/books/murty/linear_complementarity_webbook/ |archive-date=2010-04-01 |url-status=dead }} &lt;br /&gt;
* {{Cite book | last1=Nocedal | first1=Jorge | last2=Wright | first2=Stephen J. | title=Numerical Optimization | publisher=[[Springer-Verlag]] | location=Berlin, New York | edition=2nd | isbn=978-0-387-30303-1 | year=2006 }}&lt;br /&gt;
&lt;br /&gt;
[[Category:Optimization algorithms and methods]]&lt;/div&gt;</summary>
		<author><name>128.84.126.71</name></author>
	</entry>
</feed>