In [[numerical analysis]], the ITP method, short for ''Interpolate, Truncate and Project'', is the first [[Root-finding algorithms|root-finding algorithm]] that achieves the [[superlinear convergence]] of the [[secant method]]<ref>{{Cite journal|last=Argyros|first=I. K.|last2=Hernández-Verón|first2=M. A.|last3=Rubio|first3=M. J.|date=2019|title=On the Convergence of Secant-Like Methods|url=http://springer.nl.go.kr/chapter/10.1007/978-3-030-15242-0_5|journal=Current Trends in Mathematical Analysis and Its Interdisciplinary Applications|language=en|pages=141–183|doi=10.1007/978-3-030-15242-0_5}}</ref> while retaining the optimal<ref>{{Cite journal|last=Sikorski|first=K.|date=1982-02-01|title=Bisection is optimal|url=https://doi.org/10.1007/BF01459080|journal=Numerische Mathematik|language=en|volume=40|issue=1|pages=111–117|doi=10.1007/BF01459080|issn=0945-3245}}</ref> [[Best, worst and average case|worst-case]] performance of the [[bisection method]].<ref name=":0">{{Cite journal|last=Oliveira|first=I. F. D.|last2=Takahashi|first2=R. H. C.|date=2020-12-06|title=An Enhancement of the Bisection Method Average Performance Preserving Minmax Optimality|url=https://doi.org/10.1145/3423597|journal=ACM Transactions on Mathematical Software|volume=47|issue=1|pages=5:1–5:24|doi=10.1145/3423597|issn=0098-3500}}</ref> It is also the first method with guaranteed average performance strictly better than that of the bisection method under any continuous distribution.<ref name=":0" /> In practice, it performs better than traditional interpolation-based and hybrid strategies (Brent's method, Ridders' method, the Illinois algorithm), since it not only converges super-linearly over well-behaved functions but also guarantees fast performance under ill-behaved functions, where interpolation fails.<ref name=":0" />


The ITP method follows the same structure as standard bracketing strategies, which keep track of upper and lower bounds for the location of the root; but it also keeps track of the region within which worst-case performance remains upper-bounded. As a bracketing strategy, in each iteration it queries the value of the function at one point and discards the part of the interval between two points at which the function values share the same sign. The queried point is calculated in three steps: it interpolates, finding the [[regula falsi]] estimate; it then perturbs/truncates this estimate (similar to {{section link|Regula falsi|Improvements in ''regula falsi''}}); and finally it projects the perturbed estimate onto an interval in the neighbourhood of the bisection midpoint. The method depends on three hyper-parameters <math>\kappa_1\in (0,\infty)</math>, <math>\kappa_2 \in \left[1,1+\phi\right)</math> and <math>n_0\in[0,\infty)</math>, where <math>\phi</math> is the [[golden ratio]] <math>\tfrac{1}{2}(1+\sqrt{5})</math>; the first two control the size of the truncation, and the third is a slack variable that controls the size of the interval for the projection step.
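The three steps can be sketched as follows (a minimal illustrative implementation in Python; the particular truncation size <math>\delta=\kappa_1(b-a)^{\kappa_2}</math>, projection radius and default parameter values used here are one concrete choice consistent with the description above, not a prescribed reference implementation):

<syntaxhighlight lang="python">
import math

def itp_root(f, a, b, eps, kappa1=0.1, kappa2=2.0, n0=1):
    """Illustrative interpolate-truncate-project iteration (sketch).

    Assumes f is continuous on [a, b] with f(a)*f(b) <= 0 and eps > 0.
    """
    ya, yb = f(a), f(b)
    if ya == 0:
        return a
    if yb == 0:
        return b
    if ya * yb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    # Work with an increasing bracket: flip the sign of f if needed so that ya < 0 < yb.
    flip = 1.0 if ya < 0 else -1.0
    ya, yb = flip * ya, flip * yb

    n_half = math.ceil(math.log2((b - a) / (2 * eps)))  # worst case of bisection
    n_max = n_half + n0                                  # n0 extra iterations of slack
    j = 0
    while b - a > 2 * eps:
        x_half = (a + b) / 2
        # Interpolate: regula falsi estimate.
        x_f = (yb * a - ya * b) / (yb - ya)
        sigma = 1.0 if x_half >= x_f else -1.0
        # Truncate: perturb the estimate towards the midpoint by at most delta.
        delta = kappa1 * (b - a) ** kappa2
        x_t = x_f + sigma * delta if delta <= abs(x_half - x_f) else x_half
        # Project: keep the query within radius r of the midpoint, so that at most
        # n_max iterations are ever used.
        r = eps * 2 ** (n_max - j) - (b - a) / 2
        x_itp = x_t if abs(x_t - x_half) <= r else x_half - sigma * r
        # Query the function and discard the sub-interval without a sign change.
        y_itp = flip * f(x_itp)
        if y_itp > 0:
            b, yb = x_itp, y_itp
        elif y_itp < 0:
            a, ya = x_itp, y_itp
        else:
            return x_itp
        j += 1
    return (a + b) / 2
</syntaxhighlight>

For example, <code>itp_root(lambda x: x**3 - x - 2, 1, 2, 1e-10)</code> brackets the root of <math>x^3-x-2</math> to within the requested tolerance. The projection radius shrinks geometrically with the iteration count, which is what enforces the bisection-like worst-case bound while still allowing the (truncated) interpolation estimate to be used whenever it falls close enough to the midpoint.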


=== Root finding problem ===
Given a continuous function <math>f</math> defined from <math>[a,b]</math> to <math>\mathbb{R}</math> such that <math>f(a)f(b)\leq 0</math>, where at the cost of one query one can access the value of <math>f(x)</math> at any given <math>x</math>, and given a pre-specified target precision <math>\epsilon>0</math>, a root-finding algorithm is designed to solve the following problem with as few queries as possible:<blockquote>

'''Problem Definition:''' Find <math>\hat{x}</math> such that <math>|\hat{x}-x^*|\leq \epsilon</math>, where <math>x^*</math> satisfies <math>f(x^*) = 0</math>.</blockquote>
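Under this query model, the [[bisection method]] serves as the natural baseline: it halves the bracket once per query and therefore guarantees <math>|\hat{x}-x^*|\leq \epsilon</math> after at most <math>n_{1/2} = \lceil \log_2\tfrac{b-a}{2\epsilon} \rceil</math> queries. A minimal sketch of this baseline in Python (the function name and the query counter are illustrative only) is:

<syntaxhighlight lang="python">
def bisection_root(f, a, b, eps):
    """Baseline bracketing method: halves the interval once per query,
    so at most ceil(log2((b - a) / (2 * eps))) queries are needed."""
    ya = f(a)
    queries = 0
    while b - a > 2 * eps:
        m = (a + b) / 2
        ym = f(m)                  # one query per iteration
        queries += 1
        if ya * ym <= 0:           # sign change in [a, m]: discard (m, b]
            b = m
        else:                      # otherwise the root lies in [m, b]
            a, ya = m, ym
    return (a + b) / 2, queries    # final midpoint is within eps of a root
</syntaxhighlight>

The ITP method keeps exactly this worst-case guarantee (up to the slack <math>n_0</math>) while replacing the midpoint query with the truncated and projected interpolation point described above.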

This problem is very common in [[numerical analysis]], [[computer science]] and [[engineering]], and root-finding algorithms are the standard approach to solving it. Often, the root-finding procedure is called as a subroutine by more complex parent algorithms within a larger context, and for this reason solving root problems efficiently is of great importance, since an inefficient approach might come at a high computational cost when the larger context is taken into account. The ITP method addresses this by simultaneously exploiting the guarantees of interpolation as well as the minmax optimal guarantees of the bisection method.


== The method ==




== References ==
{{Reflist}}