Fixed Points and Stability
Undergraduate
Definition
A fixed point of a dynamical system is a state that the dynamics leave unchanged: once the system reaches it, it stays there for all time. Fixed points can be stable (attractors that pull in nearby trajectories), unstable (repelling nearby trajectories), or saddle points (attracting in some directions and repelling in others, which requires a system of dimension two or more).
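As a quick illustration (the map x_{n+1} = cos(x_n) is our own choice, not from this page), iterating a map with a stable fixed point drives the state toward that point, after which the state stops changing. A minimal Python sketch:

```python
import math

# Iterate the map x_{n+1} = cos(x_n); its fixed point x* = cos(x*)
# (roughly 0.739085) is stable, so iterates converge to it.
x = 2.0  # arbitrary starting state
for _ in range(50):
    x = math.cos(x)

print(f"iterate after 50 steps: {x:.6f}")   # ~0.739085
print(f"f(x) - x = {math.cos(x) - x:.2e}")  # ~0: the state no longer changes
```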
Formulas
f(x^*) = x^* (discrete map x_{n+1} = f(x_n)); f(x^*) = 0 (continuous flow dx/dt = f(x))
Fixed point condition
|f'(x^*)| < 1 : stable, |f'(x^*)| > 1 : unstable (|f'(x^*)| = 1 : inconclusive)
Stability in discrete systems
f'(x^*) < 0 : stable, f'(x^*) > 0 : unstable
Stability in continuous systems
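The discrete criterion translates directly into a numerical check. Below is a minimal sketch (the function name classify_fixed_point and the finite-difference step h are our own choices) that verifies the fixed-point condition and classifies stability from an estimate of f'(x^*):

```python
def classify_fixed_point(f, x_star, h=1e-6):
    """Classify a fixed point of the discrete map x_{n+1} = f(x_n)."""
    # Verify the fixed-point condition f(x*) = x* before classifying.
    if abs(f(x_star) - x_star) > 1e-6:
        raise ValueError("x_star is not a fixed point of f")
    # Central finite-difference estimate of f'(x*).
    slope = (f(x_star + h) - f(x_star - h)) / (2 * h)
    if abs(slope) < 1:
        return "stable"
    if abs(slope) > 1:
        return "unstable"
    return "inconclusive (|f'(x*)| = 1)"

# Example: f(x) = x/2 fixes x* = 0 with f'(0) = 1/2, so it is stable.
print(classify_fixed_point(lambda x: x / 2, 0.0))  # -> stable
```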
Examples
Example 1
Analyze the fixed points and stability of f(x) = x² - x - 1.
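A worked sketch, assuming f is iterated as the discrete map x_{n+1} = f(x_n) (the page does not say which interpretation is intended): solving f(x) = x gives x² - 2x - 1 = 0, so x^* = 1 ± √2, and f'(x) = 2x - 1 decides stability.

```python
import math

f = lambda x: x**2 - x - 1  # the map from Example 1
df = lambda x: 2 * x - 1    # its derivative, f'(x) = 2x - 1

# Fixed points solve f(x) = x, i.e. x^2 - 2x - 1 = 0, so x* = 1 ± sqrt(2).
for x_star in (1 + math.sqrt(2), 1 - math.sqrt(2)):
    slope = df(x_star)
    verdict = "stable" if abs(slope) < 1 else "unstable"
    print(f"x* = {x_star:+.4f}: f'(x*) = {slope:+.4f} -> {verdict}")

# Both fixed points are unstable: |1 + 2*sqrt(2)| ~ 3.83 and
# |1 - 2*sqrt(2)| ~ 1.83 both exceed 1, so nearby iterates move away.
```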
Applications
Control Theory: system stabilization
Ecology: equilibrium populations
Economics: market equilibrium
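To make the ecology entry concrete (the logistic model and the parameter values below are illustrative assumptions, not from this page), the continuous criterion f'(x^*) < 0 classifies the equilibria of logistic growth dN/dt = rN(1 - N/K): extinction N^* = 0 is unstable and the carrying capacity N^* = K is stable.

```python
r, K = 0.5, 100.0  # assumed growth rate and carrying capacity

f = lambda N: r * N * (1 - N / K)   # logistic growth dN/dt = f(N)
df = lambda N: r * (1 - 2 * N / K)  # f'(N) decides continuous stability

# Fixed points solve f(N*) = 0: extinction N* = 0 and carrying capacity N* = K.
for N_star in (0.0, K):
    assert abs(f(N_star)) < 1e-9    # confirm it is an equilibrium
    slope = df(N_star)
    verdict = "stable" if slope < 0 else "unstable"
    print(f"N* = {N_star:6.1f}: f'(N*) = {slope:+.2f} -> {verdict}")
# N* = 0 gives f' = +r (unstable); N* = K gives f' = -r (stable).
```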