Simple polynomial map exhibiting chaotic behavior
The logistic map is a polynomial mapping (equivalently, recurrence relation) of degree 2, often referred to as an archetypal example of how complex, chaotic behaviour can arise from very simple nonlinear dynamical equations. The map, initially utilized by Edward Lorenz in the 1960s to showcase irregular solutions (e.g., Eq. 3 of [1]), was popularized in a 1976 paper by the biologist Robert May,[2] in part as a discrete-time demographic model analogous to the logistic equation written down by Pierre François Verhulst.[3] Mathematically, the logistic map is written
x_{n+1} = r x_n (1 − x_n)    (1)
where xn is a number between zero and one, which represents the ratio of existing population to the maximum possible population. This nonlinear difference equation is intended to capture two effects: reproduction, where the population increases at a rate proportional to the current population when the population size is small, and starvation (density-dependent mortality), where the growth rate decreases at a rate proportional to the value obtained by taking the theoretical "carrying capacity" of the environment less the current population.
The usual values of interest for the parameter r are those in the interval [0, 4], so that xn remains bounded on [0, 1]. The r = 4 case of the logistic map is a nonlinear transformation of both the bit-shift map and the μ = 2 case of the tent map. If r > 4, this leads to negative population sizes. (This problem does not appear in the older Ricker model, which also exhibits chaotic dynamics.) One can also consider values of r in the interval [−2, 0], so that xn remains bounded on [−0.5, 1.5].[4]
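For concreteness, the recurrence can be iterated directly. The following is a minimal illustrative sketch (not part of the original article); the parameter values and initial state are arbitrary choices.

def logistic_iterates(r, x0, n):
    # return the first n iterates of x_{k+1} = r * x_k * (1 - x_k)
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print(logistic_iterates(2.5, 0.2, 10))   # settles towards (r - 1)/r = 0.6
print(logistic_iterates(3.9, 0.2, 10))   # irregular, chaotic-looking values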
The image below shows the amplitude and frequency content of some logistic map iterates for parameter values ranging from 2 to 4.
By varying the parameter r, the following behavior is observed. For r in [0, 1], the population eventually dies out, independent of the initial condition. For r in (1, 3], the population quickly approaches the value (r − 1)/r (for r between 2 and 3 it oscillates around that value for some time before settling down). For r between 3 and 1 + √6 ≈ 3.44949, from almost all initial conditions the population approaches permanent oscillations between two values. As r increases further, oscillations among 4, then 8, 16, 32, … values appear, with the parameter intervals between successive period doublings shrinking rapidly. At r ≈ 3.56995 this period-doubling cascade accumulates and chaotic behaviour sets in; for most larger values of r the behaviour is chaotic, apart from isolated "islands of stability" such as the stable period-3 cycle that appears near r ≈ 3.82843.
For any value of r there is at most one stable cycle. If a stable cycle exists, it is globally stable, attracting almost all points.[12]: 13 Some values of r with a stable cycle of some period have infinitely many unstable cycles of various periods.
The bifurcation diagram at right summarizes this. The horizontal axis shows the possible values of the parameter r while the vertical axis shows the set of values of x visited asymptotically from almost all initial conditions by the iterates of the logistic equation with that r value.
The bifurcation diagram is self-similar: if we zoom in on the above-mentioned value r ≈ 3.82843 and focus on one arm of the three, the situation nearby looks like a shrunk and slightly distorted version of the whole diagram. The same is true for all other non-chaotic points. This is an example of the deep and ubiquitous connection between chaos and fractals.
We can also consider negative values of r in [−2, 0]: as r decreases from 0 toward −2, the map shows a similar progression from a stable fixed point through period doublings to chaotic behaviour.
The relative simplicity of the logistic map makes it a widely used point of entry into a consideration of the concept of chaos. A rough description of chaos is that chaotic systems exhibit a great sensitivity to initial conditions—a property of the logistic map for most values of r between about 3.57 and 4 (as noted above).[2] A common source of such sensitivity to initial conditions is that the map represents a repeated folding and stretching of the space on which it is defined. In the case of the logistic map, the quadratic difference equation describing it may be thought of as a stretching-and-folding operation on the interval (0,1).[13]
The following figure illustrates the stretching and folding over a sequence of iterates of the map. Figure (a), left, shows a two-dimensional Poincaré plot of the logistic map's state space for r = 4, and clearly shows the quadratic curve of the difference equation (1). However, we can embed the same sequence in a three-dimensional state space, in order to investigate the deeper structure of the map. Figure (b), right, demonstrates this, showing how initially nearby points begin to diverge, particularly in those regions of xt corresponding to the steeper sections of the plot.
This stretching-and-folding does not just produce a gradual divergence of the sequences of iterates, but an exponential divergence (see Lyapunov exponents), evidenced also by the complexity and unpredictability of the chaotic logistic map. In fact, exponential divergence of sequences of iterates explains the connection between chaos and unpredictability: a small error in the supposed initial state of the system will tend to correspond to a large error later in its evolution. Hence, predictions about future states become progressively (indeed, exponentially) worse when there are even very small errors in our knowledge of the initial state. This quality of unpredictability and apparent randomness led the logistic map equation to be used as a pseudo-random number generator in early computers.[13]
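To make the historical remark concrete, the sketch below (illustrative only, not from the article, and certainly not a cryptographically sound generator) extracts one pseudo-random bit per iterate of the r = 4 map by thresholding the state at 1/2; the seed and threshold choices are arbitrary.

def logistic_bits(seed, n, r=4.0):
    # crude pseudo-random bit stream: iterate x -> r*x*(1-x)
    # and emit 1 whenever the state exceeds 1/2
    x = seed
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

print(logistic_bits(0.123456789, 32))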
At r = 2, the graph of the map intersects the diagonal y = x precisely at its maximum point x = 1/2, where the derivative vanishes, so convergence to the equilibrium point is quadratic: the error is roughly squared at each step, i.e. of order ε^{2^n} after n steps. Consequently, the equilibrium point is called "superstable". Its Lyapunov exponent is −∞. A similar argument shows that there is a superstable value within each interval where the dynamical system has a stable cycle. This can be seen in the Lyapunov exponent plot as sharp dips.[14]
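As an illustration (my sketch, not the article's), the Lyapunov exponent λ(r) = lim (1/n) Σ ln|f′(x_i)|, with f′(x) = r(1 − 2x), can be estimated numerically; the deep dips at superstable parameters show up as very negative values. The transient length and sample count below are arbitrary.

import numpy as np

def lyapunov(r, x0=0.3, transient=1000, samples=10000):
    # average of log|f'(x)| along an orbit of the logistic map
    x = x0
    for _ in range(transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(samples):
        x = r * x * (1 - x)
        total += np.log(abs(r * (1 - 2 * x)) + 1e-300)  # guard against log(0)
    return total / samples

for r in (2.0, 3.2, 3.5, 3.56995, 3.8, 4.0):
    print(r, lyapunov(r))   # positive values indicate chaos; r = 4 gives about ln 2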
Since the map is confined to an interval on the real number line, its dimension is less than or equal to unity. Numerical estimates yield a correlation dimension of 0.500±0.005 (Grassberger, 1983), a Hausdorff dimension of about 0.538 (Grassberger 1981), and an information dimension of approximately 0.5170976 (Grassberger 1983) for r ≈ 3.5699456 (onset of chaos). Note: It can be shown that the correlation dimension is certainly between 0.4926 and 0.5024.
It is often possible, however, to make precise and accurate statements about the likelihood of a future state in a chaotic system. If a (possibly chaotic) dynamical system has an attractor, then there exists a probability measure that gives the long-run proportion of time spent by the system in the various regions of the attractor. In the case of the logistic map with parameter r = 4 and an initial state in (0,1), the attractor is also the interval (0,1) and the probability measure corresponds to the beta distribution with parameters a = 0.5 and b = 0.5. Specifically,[15] the invariant measure is

ρ(x) = 1 / (π √(x(1 − x))).
Unpredictability is not randomness, but in some circumstances looks very much like it. Hence, and fortunately, even if we know very little about the initial state of the logistic map (or some other chaotic system), we can still say something about the distribution of states arbitrarily far into the future, and use this knowledge to inform decisions based on the state of the system.
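The sketch below (mine, not the article's) compares the empirical distribution of iterates at r = 4 with the beta(1/2, 1/2) density 1/(π√(x(1 − x))) quoted above; the sample size, bin count, and initial state are arbitrary choices.

import numpy as np
import matplotlib.pyplot as plt

r, n = 4.0, 200000
x = np.empty(n)
x[0] = 0.3141                        # arbitrary initial state in (0, 1)
for i in range(n - 1):
    x[i + 1] = r * x[i] * (1 - x[i])

grid = np.linspace(0.001, 0.999, 500)
density = 1.0 / (np.pi * np.sqrt(grid * (1 - grid)))   # beta(1/2, 1/2) density

plt.hist(x, bins=100, density=True, alpha=0.5, label="iterates of the map")
plt.plot(grid, density, "r", label="1 / (pi * sqrt(x(1 - x)))")
plt.xlabel("x")
plt.ylabel("density")
plt.legend()
plt.show()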
The bifurcation diagram for the logistic map can be visualized with the following Python code:
import numpy as np
import matplotlib.pyplot as plt

interval = (2.8, 4)  # start, end
accuracy = 0.0001
reps = 600  # number of repetitions
numtoplot = 200
lims = np.zeros(reps)

fig, biax = plt.subplots()
fig.set_size_inches(16, 9)

lims[0] = np.random.rand()
for r in np.arange(interval[0], interval[1], accuracy):
    for i in range(reps - 1):
        lims[i + 1] = r * lims[i] * (1 - lims[i])

    biax.plot([r] * numtoplot, lims[reps - numtoplot :], "b.", markersize=0.02)

biax.set(xlabel="r", ylabel="x", title="logistic map")
plt.show()
Although exact solutions to the recurrence relation are only available in a small number of cases, a closed-form upper bound on the logistic map is known when 0 ≤ r ≤ 1.[16] There are two aspects of the behavior of the logistic map that should be captured by an upper bound in this regime: the asymptotic geometric decay with constant r, and the fast initial decay when x0 is close to 1, driven by the (1 − xn) term in the recurrence relation. The following bound captures both of these effects:
The special case of r = 4 can in fact be solved exactly, as can the case with r = 2;[17] however, the general case can only be predicted statistically.[18] The solution when r = 4 is[17][19]

x_n = sin^2(2^n θ π),

where the initial condition parameter θ is given by

θ = (1/π) sin^{−1}(√x_0).

For rational θ, after a finite number of iterations x_n maps into a periodic sequence. But almost all θ are irrational, and, for irrational θ, x_n never repeats itself – it is non-periodic. This solution equation clearly demonstrates the two key features of chaos – stretching and folding: the factor 2^n shows the exponential growth of stretching, which results in sensitive dependence on initial conditions, while the squared sine function keeps x_n folded within the range [0, 1].
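As a quick numerical check (an illustration, not part of the article), the closed form can be compared against direct iteration; the growing discrepancy at larger n comes from the 2^n factor amplifying floating-point error, itself a manifestation of the sensitivity described above.

import numpy as np

x0 = 0.2
theta = np.arcsin(np.sqrt(x0)) / np.pi

x = x0
for n in range(1, 11):
    x = 4 * x * (1 - x)                            # direct iteration
    closed = np.sin(2 ** n * np.pi * theta) ** 2   # closed-form solution
    print(n, x, closed, abs(x - closed))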
For r = 4 an equivalent solution in terms of complex numbers instead of trigonometric functions is[17]

x_n = 1/2 − (α^{2^n} + α^{−2^n}) / 4,

where α is either of the complex numbers

α = (1 − 2x_0) ± 2i √(x_0(1 − x_0))
with modulus equal to 1. Just as the squared sine function in the trigonometric solution leads to neither shrinkage nor expansion of the set of points visited, in the latter solution this effect is accomplished by the unit modulus of α.
By contrast, the solution when r = 2 is[17]

x_n = 1/2 − (1/2)(1 − 2x_0)^{2^n}

for x_0 ∈ [0,1). Since (1 − 2x_0) ∈ (−1,1) for any value of x_0 other than the unstable fixed point 0, the term (1 − 2x_0)^{2^n} goes to 0 as n goes to infinity, so x_n goes to the stable fixed point 1/2.
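Similarly (an illustrative check, not from the article), the r = 2 closed form can be compared against direct iteration and shows the very rapid convergence to 1/2.

x0 = 0.1
x = x0
for n in range(1, 8):
    x = 2 * x * (1 - x)                              # direct iteration
    closed = 0.5 - 0.5 * (1 - 2 * x0) ** (2 ** n)    # closed-form solution
    print(n, x, closed)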
For the r = 4 case, from almost all initial conditions the iterate sequence is chaotic. Nevertheless, there exist an infinite number of initial conditions that lead to cycles, and indeed there exist cycles of length k for all integers k > 0. We can exploit the relationship of the logistic map to the dyadic transformation (also known as the bit-shift map) to find cycles of any length. If x follows the logistic map x_{n+1} = 4x_n(1 − x_n) and y follows the dyadic transformation

y_{n+1} = 2y_n mod 1 (that is, 2y_n for 0 ≤ y_n < 1/2 and 2y_n − 1 for 1/2 ≤ y_n < 1),

then the two are related by a homeomorphism

x_n = sin^2(2π y_n).
The reason that the dyadic transformation is also called the bit-shift map is that when y is written in binary notation, the map moves the binary point one place to the right (and if the bit to the left of the binary point has become a "1", this "1" is changed to a "0"). A cycle of length 3, for example, occurs if an iterate has a 3-bit repeating sequence in its binary expansion (which is not also a one-bit repeating sequence): 001, 010, 100, 110, 101, or 011. The iterate 001001001... maps into 010010010..., which maps into 100100100..., which in turn maps into the original 001001001...; so this is a 3-cycle of the bit shift map. And the other three binary-expansion repeating sequences give the 3-cycle 110110110... → 101101101... → 011011011... → 110110110.... Either of these 3-cycles can be converted to fraction form: for example, the first-given 3-cycle can be written as 1/7 → 2/7 → 4/7 → 1/7. Using the above translation from the bit-shift map to the logistic map gives the corresponding logistic cycle 0.611260467... → 0.950484434... → 0.188255099... → 0.611260467.... We could similarly translate the other bit-shift 3-cycle into its corresponding logistic cycle. Likewise, cycles of any length k can be found in the bit-shift map and then translated into the corresponding logistic cycles.
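The translation described above can be checked numerically. This is a sketch of mine, not the article's code: starting from the bit-shift 3-cycle 1/7 → 2/7 → 4/7 and applying x = sin^2(2πy) reproduces the logistic 3-cycle quoted in the text.

import math

def dyadic(y):
    # bit-shift (dyadic) map: double y and drop the integer part
    return (2 * y) % 1

y = 1 / 7
for _ in range(4):
    x = math.sin(2 * math.pi * y) ** 2   # translation to the r = 4 logistic map
    print(y, x)                          # 0.61126..., 0.95048..., 0.18825..., then repeats
    y = dyadic(y)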
However, since almost all numbers in [0,1) are irrational, almost all initial conditions of the bit-shift map lead to the non-periodicity of chaos. This is one way to see that the logistic r = 4 map is chaotic for almost all initial conditions.
The number of cycles of (minimal) length k = 1, 2, 3,… for the logistic map with r = 4 (tent map with μ = 2) is a known integer sequence (sequence A001037 in the OEIS): 2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161.... This tells us that the logistic map with r = 4 has 2 fixed points, 1 cycle of length 2, 2 cycles of length 3 and so on. This sequence takes a particularly simple form for prime k: 2 ⋅ (2^{k − 1} − 1)/k. For example: 2 ⋅ (2^{13 − 1} − 1)/13 = 630 is the number of cycles of length 13. Since this case of the logistic map is chaotic for almost all initial conditions, all of these finite-length cycles are unstable.
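For prime k the formula can be checked directly against the quoted sequence (a trivial verification snippet, not part of the article):

for k in (2, 3, 5, 7, 11, 13):
    print(k, 2 * (2 ** (k - 1) - 1) // k)   # 1, 2, 6, 18, 186, 630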
In the logistic map, we have a function f_r(x) = r x(1 − x), and we want to study what happens when we iterate the map many times. The map might fall into a fixed point, a fixed cycle, or chaos. When the map falls into a stable fixed cycle of length k, we would find that the graph of the k-fold iterate f_r^k and the diagonal y = x intersect at the points of the cycle, and that the slope of f_r^k is bounded in (−1, 1) at those intersections.
For example, for r between 1 and 3, the graph of f_r crosses the diagonal at the fixed point (r − 1)/r with slope 2 − r, which is bounded in (−1, 1), indicating a stable single fixed point.
As r increases beyond 3, this slope passes through −1 and the intersection point splits into two, which is a period doubling. For example, for r between 3 and 1 + √6, the graph of f_r^2 crosses the diagonal at three nearby points, with the middle one unstable and the two others stable.
As r approaches 1 + √6 ≈ 3.44949, another period-doubling occurs in the same way. The period-doublings occur more and more frequently, until at r ≈ 3.56995 the period doublings become infinite and the map becomes chaotic. This is the period-doubling route to chaos.
Looking at the graphs, one can notice that at the point of chaos r ≈ 3.56995 the curve of the iterated map looks like a fractal. Furthermore, as we repeat the period-doublings, the graphs seem to resemble each other, except that they are shrunken towards the middle, and rotated by 180 degrees.
This suggests to us a scaling limit: if we repeatedly double the function and then rescale it by a certain constant α, then in the limit we would end up with a function g that satisfies the fixed-point equation g(x) = −α g(g(x/α)). This is a Feigenbaum function, which appears in most period-doubling routes to chaos (thus it is an instance of universality). Further, as the period-doubling intervals become shorter and shorter, the ratio between two successive period-doubling intervals converges to a limit, the first Feigenbaum constant δ = 4.669201….
The constant α can be found numerically by trying many possible values: for the wrong values the rescaled maps do not converge to a limit, but when α = 2.502907… they converge. This is the second Feigenbaum constant.
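As a numerical illustration (my sketch, not from the article; the step sizes and tolerances are arbitrary), δ can be estimated from the superstable parameter values R_k at which the critical point x = 1/2 lies on a cycle of length 2^k: the ratios (R_k − R_{k−1})/(R_{k+1} − R_k) approach δ ≈ 4.669.

def g(r, k):
    # distance of the critical point x = 1/2 from itself after 2**k iterations
    x = 0.5
    for _ in range(2 ** k):
        x = r * x * (1 - x)
    return x - 0.5

# superstable parameters R_k: the critical point lies on a 2**k-cycle
R = [2.0]                        # R_0 = 2 (superstable fixed point)
step = 1e-5
for k in range(1, 7):
    lo = R[-1] + step            # start just above the previous superstable value
    glo = g(lo, k)
    hi = lo + step
    ghi = g(hi, k)
    while glo * ghi > 0:         # scan upward until the sign changes
        lo, glo = hi, ghi
        hi = lo + step
        ghi = g(hi, k)
    for _ in range(60):          # refine the root by bisection
        mid = 0.5 * (lo + hi)
        gmid = g(mid, k)
        if glo * gmid <= 0:
            hi = mid
        else:
            lo, glo = mid, gmid
    R.append(0.5 * (lo + hi))

# ratios of successive parameter gaps approach delta ~ 4.669
for k in range(1, len(R) - 1):
    print(k, (R[k] - R[k - 1]) / (R[k + 1] - R[k]))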
In the chaotic regime, the limit of the iterates of the map becomes chaotic dark bands interspersed with non-chaotic bright bands.
When r approaches the accumulation point of the period-doubling cascade inside the period-3 window (which opens at r = 1 + √8 ≈ 3.8284), we have another period-doubling approach to chaos, but this time with periods 3, 6, 12, ... This again has the same Feigenbaum constants δ and α, and the limit of the iterated map is again the same Feigenbaum function. This is an example of universality.
We can also consider a period-tripling route to chaos by picking a sequence of parameter values such that the n-th value is the lowest value in the period-3^n window of the bifurcation diagram. This sequence converges to its own accumulation point and has a different pair of Feigenbaum constants.[20] As another example, period-4-pling has a pair of Feigenbaum constants distinct from that of period-doubling, even though period-4-pling is reached by two period-doublings: defining the n-th parameter value as the lowest value in the period-4^n window of the bifurcation diagram, the resulting sequence again converges to a limit with its own, different pair of Feigenbaum constants.
In general, each period-multiplying route to chaos has its own pair of Feigenbaum constants; in fact, there is typically more than one such pair. For example, for period-7-pling, there are at least 9 different pairs of Feigenbaum constants.[20]
Generally, the two constants in each such pair satisfy an approximate relation, and the relation becomes exact as both constants increase to infinity.
Universality of one-dimensional maps with parabolic maxima and Feigenbaum constants δ = 4.669201…, α = 2.502907… is well visible in, for example, a map proposed as a toy model for discrete laser dynamics.[21][22] A gradual increase of the bifurcation parameter over its interval changes the dynamics from regular to chaotic,[23] with qualitatively the same bifurcation diagram as that of the logistic map.
The Feigenbaum constants can be estimated by a renormalization argument (Section 10.7 of [14]).
By universality, we can use another family of functions that also undergoes repeated period-doubling on its route to chaos, and even though it is not exactly the logistic map, it would still yield the same Feigenbaum constants.
Define a one-parameter family of maps with a single quadratic maximum; such a family has an equilibrium point and, as its parameter increases, undergoes a sequence of period-doubling bifurcations, so by universality it yields the same constants as the logistic map.
After the first period-doubling bifurcation the stable fixed point is replaced by a stable period-2 orbit, which can be solved for explicitly. At some larger parameter value the period-2 orbit itself undergoes a period-doubling bifurcation, yielding a stable period-4 orbit. To find out what the new orbit looks like, we "zoom in" around one point of the period-2 orbit using an affine change of variables; routine algebra then shows that the twice-iterated map, in the rescaled coordinates, has approximately the same form as the original family, but with a renormalized parameter.
By self-similarity, the same relation connects each bifurcation to the next. Iterating the renormalization relation gives an estimate of the accumulation point of the bifurcations and of the ratio of successive bifurcation intervals, i.e. of δ, while the rescaling factor of the affine transform gives an estimate of α.
The resulting estimates of δ and α are within about 10% of the true values.
The logistic map exhibits numerous characteristics of both periodic and chaotic solutions, whereas the logistic ordinary differential equation (ODE) exhibits only regular solutions, commonly referred to as the S-shaped sigmoid function. The logistic map can be seen as the discrete counterpart of the logistic ODE, and the relationship between the two has been extensively discussed in the literature.[24]
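One standard way to make this connection concrete (a sketch under the usual Euler-discretization viewpoint; the precise treatment in the cited literature may differ) is to note that an explicit Euler step of size h for dx/dt = x(1 − x) gives x_{n+1} = x_n + h x_n(1 − x_n), which is conjugate to the logistic map with r = 1 + h. Small steps reproduce the smooth sigmoid approach to the carrying capacity, while large steps produce oscillations and, eventually, chaos.

def euler_logistic(x0, h, steps):
    # explicit Euler discretization of the logistic ODE dx/dt = x*(1 - x)
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + h * xs[-1] * (1 - xs[-1]))
    return [round(v, 4) for v in xs]

print(euler_logistic(0.01, 0.1, 100)[::10])  # small step: smooth sigmoid-like growth toward 1
print(euler_logistic(0.01, 2.5, 20))         # large step (conjugate to r = 3.5): sustained oscillations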
In a toy model for discrete laser dynamics: x → Gx(1 − tanh(x)), where x stands for the electric field amplitude and G[25] is the laser gain as bifurcation parameter.