# Stabilization

**Memory stabilization** (abbreviated *SInc* for *stability increase*) is the increase in memory stability that results from the retrieval of a memory (e.g. in review). **Stabilization** may also result from memory optimization in sleep.

The higher the stability, the smaller the stability increase at review (see: Stabilization decay). As the name implies, the memory tends to *stabilize* in long-term storage.

In addition, the increase in memory stability at retrieval depends on retrievability and memory complexity. The lower the retrievability, the higher the average increase in stability. The expected increase in stability, described by the *consolidation curve* and probabilistically modulated by the prospect of a memory lapse, peaks at different levels of the forgetting index (usually at retrievability in the range of 30-80%).
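This trade-off can be sketched numerically. In the sketch below, the exponential form of the stabilization curve and all constants (the maximum stabilization, the gain, and the residual gain after a lapse) are illustrative assumptions, not SuperMemo's fitted values; the point is only that weighting the gain by the prospect of a lapse makes the expected increase peak at intermediate retrievability:

```python
import math

SINC_MAX = 26.0    # assumed maximum stabilization as R approaches 0
GAIN = 2.96        # assumed spacing-effect gain constant
LAPSE_GAIN = 0.1   # assumed residual gain after a memory lapse

def stabilization(r):
    """Stabilization after a successful review at retrievability r."""
    return SINC_MAX * math.exp(-GAIN * r)

def expected_stabilization(r):
    """Consolidation: gain weighted by the probability of recall (r)
    against the prospect of a lapse (1 - r)."""
    return r * stabilization(r) + (1 - r) * LAPSE_GAIN

# The expectation peaks at intermediate retrievability, not at the extremes
peak_r = max((i / 100 for i in range(1, 100)), key=expected_stabilization)
```

With these assumed constants the peak falls near R=34%, inside the 30-80% range mentioned above; different constants move the peak but preserve its interior location.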

Complexity is a confounding component in memory research. For atomic memories, stability can be modelled (e.g. as in Algorithm SM-17). Stability gradually loses its meaning with the increase in memory complexity. For example, if we consider a memory of a book, its stability increase cannot be measured because a book cannot be recalled verbatim. The memory imprint of a book is a complex constellation of memories whose stabilities may span ranges from short-term to *permastore*.

In Algorithm SM-17, the matrix of stability increase (*SInc[]*) can be considered an extension of the matrix of O-Factors (from older SuperMemos) into the dimension of retrievability. That extension was essential for making the algorithm universal for all imaginable repetition schedules (delays in review correspond to lower retrievability).
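A structural sketch of that extension, with hypothetical quantile counts (none of these shapes, names, or values come from SuperMemo's actual implementation): the old O-Factor matrix is two-dimensional, while *SInc[]* adds a retrievability axis, so a review at any delay maps to an entry via its retrievability quantile.

```python
# Assumed quantile counts, purely for illustration
N_DIFF, N_STAB, N_RETR = 10, 20, 10

# Older SuperMemos: OF[difficulty][repetition category]
of_matrix = [[1.5] * N_STAB for _ in range(N_DIFF)]

# Algorithm SM-17: SInc[difficulty][stability][retrievability] --
# the extra axis covers reviews at any delay
sinc_matrix = [[[1.5] * N_RETR for _ in range(N_STAB)]
               for _ in range(N_DIFF)]

def sinc_lookup(d_q: int, s_q: int, r_q: int) -> float:
    """Look up stabilization for a review at the given quantiles."""
    return sinc_matrix[d_q][s_q][r_q]
```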

See also:

- Two component model of long-term memory
- Search for a universal memory formula
- Stabilization curve
- Stabilization decay
- Structural and molecular mechanisms of the spacing effect
- SuperMemo Algorithm: 30-year-long labor
- Algorithm SM-17
- Post-lapse stability

This glossary entry is used to explain SuperMemo, a pioneer of spaced repetition software since 1987

Figure: Stability increase function as computed by Algorithm SM-17. The function takes three arguments: (1) stability at review expressed in days (on the left), (2) retrievability at review (on the right), and (3) memory complexity expressed as item difficulty (slider labelled *Diff*, currently set at 0.8). In the picture, stability increase peaks at around 15 (vertical axis). For some levels of stability and retrievability, the function drops below 1.0, indicating a decrease in stability (e.g. due to premature review evoking the spacing effect in massed presentation). 61,768 repetitions of relatively difficult items were needed to produce this graph (at Diff=0.8). The longest intervals reach around 14 years (stability quantile 5172)

Figure: Approximation of the stabilization function in SuperMemo. Data points, green and red circles, correspond with memory stabilization (SInc) for all values of stability and retrievability in a selected difficulty quantile (Diff=0.5). The blue-and-red surface represents the best fit found by SuperMemo using a formula derived from the stabilization curve, stabilization decay, and parametric changes to those functions with item difficulty. 54,449 repetition cases have been used to generate the graph. Green outlier points on the right represent "pollution" from new items whose stability has not yet been accurately estimated

Figure: Expected stabilization as approximated by Algorithm SM-17. The function takes three arguments: (1) stability at review expressed in days (on the left), (2) retrievability at review (on the right), and (3) memory complexity expressed as item difficulty (slider labelled *Diff*, currently set on easy items at 0.1). In the picture, consolidation (expected stability increase) peaks at above 5 (vertical axis). Unlike actual data, the approximation is set to never drop below 1.0. This means that the approximating function will never result in a drop in stability at review. 10,130 repetitions of easy items were used to produce this graph (Diff=0.1). One of the data points shows a set of repetitions with intervals reaching 14 years (stability quantile 5172)

Figure: Stabilization curve computed with SuperMemo. The horizontal line shows time expressed as memory retrievability. The vertical line shows stabilization, i.e. the increase in the durability of memory expressed as an increase in memory stability. Blue circles show how much stability increases with review that takes place at a given level of retrievability. The size of the blue circles depends on the number of data points collected. The graph has been produced using 31,721 repetitions in SuperMemo in data quantiles of difficulty=0.53 and stability=26 [days]. Stabilization increases from 1.36 at retrievability=100% (SIncMin=1.36) to 26.31 at R=0% (SIncMax=26.31). Optimum review at R=90% in SuperMemo produces stabilization of 1.86 (Stab90 is an equivalent of O-Factor in older SuperMemos). The Gain constant that expresses the spacing effect equals 2.96, i.e. relatively high as befits a low stability quantile. Stabilization in this dataset can be computed accurately with the formula (yellow line): 26*e^{-2.96*R}, which produces a tiny deviation of 0.5069. The consolidation curve is shown in purple, and indicates that the forgetting index is a plausible learning criterion for the presented dataset. For R approaching 100%, the actual stabilization for the presented graph is 0.879 as obtained with 2074 measurements. This is not clearly visible in the picture, but can be investigated with stabilization matrix exports in SuperMemo. This means that the stabilization curve formula may be inaccurate at the extremes of retrievability approaching 100% and 0%. This can be explained by the fact that no memory is perfectly retrievable or verifiably obliterated (see: We never forget)
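The exponential fit quoted in the caption can be checked directly. Treating 26 and 2.96 as the fitted constants, a minimal sketch reproduces the caption's endpoint values:

```python
import math

def sinc(r, sinc_max=26.0, gain=2.96):
    """Stabilization curve fit from the caption: SInc = SIncMax * e^(-Gain*R)."""
    return sinc_max * math.exp(-gain * r)

round(sinc(1.0), 2)  # ~1.35, close to the measured SIncMin = 1.36
round(sinc(0.9), 2)  # ~1.81, close to Stab90 = 1.86
```

The small residual differences at the endpoints are consistent with the caption's note that the formula grows inaccurate at the extremes of retrievability.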

Figure: Stabilization decay is the drop in memory stabilization with the increase in memory stability. The picture taken from SuperMemo shows the impact of stability on stabilization for items of difficulty estimated at 0.37. 25,686 repetition cases have been used to plot the graph. The decay in stabilization is -0.529. The maximum possible increase in stability (SIncMax) is 3.102 (at Stability=1). The blue dots show data points (S:SInc), where the bigger the circle, the bigger the sample of repetitions. The yellow line is a power fit described by the formula: SInc=(3.102-1)*S^{-0.529}+1. This fit provides a deviation of 0.4235
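The power-law fit from the caption is easy to reproduce with its reported constants (SIncMax=3.102 at S=1, decay exponent 0.529); stabilization shrinks toward 1.0 as stability grows:

```python
def sinc_decay(s, sinc_max=3.102, decay=0.529):
    """Stabilization decay fit from the caption:
    SInc = (SIncMax - 1) * S^(-decay) + 1, with S in days."""
    return (sinc_max - 1) * s ** (-decay) + 1

sinc_decay(1)    # 3.102 -- maximum stabilization at Stability = 1
sinc_decay(365)  # approaches 1.0 for high-stability memories
```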

Figure: Uncertain course of the stabilization of complex memories. The picture shows a hypothetical course of stabilization, forgetting, generalization, and interference on the example of a single dendritic input pattern of a single concept cell. The neuron, dendrites and dendritic filopodia are shown in orange. The picture does not show the conversion of filopodia into dendritic spines whose morphology changes over time with stabilization. The squares represent synapses involved in the recognition of the input pattern. Each square shows the status of the synapse in terms of the two component model of long-term memory. The intensity of red represents retrievability. The size of the blue area represents stability. After memorizing a complex memory pattern, the concept cell is able to recognize the pattern upon receiving a summation of signals from the red squares representing a new memory of high retrievability and very low stability. Each time the cell is re-activated, active inputs will undergo stabilization, which is represented by the increase in the blue area in the input square. Each time a signal does not arrive at an input while the concept cell is active, its stability will drop (generalization). Each time a source axon is active and the target neuron fails to fire, the stability will drop as well (competitive interference). Due to the uneven input of signal patterns to the concept cell, some synapses will be stabilized, while others will be lost. Forgetting occurs when a synapse loses its stability and its retrievability and when the relevant dendritic spine is retracted. Generalization occurs when the same concept cell can be re-activated using a smaller but more stable input pattern. Retroactive interference occurs when a new input pattern contributes to forgetting some of the redundant inputs necessary for the recognition of the old input pattern. 
Stabilization of the old patterns results in the reduced mobility of filopodia, which prevents the takeover of a concept by new patterns (proactive interference). At the very end of the process, a stable and well-generalized input pattern is necessary and sufficient to activate the concept cell. The same cell can respond to different patterns as long as they are consistently stabilized. In spaced repetition, a poor choice of knowledge representation will lead to poor reproducibility of the activation pattern, unequal stabilization of synapses, and forgetting. Forgetting of an item will occur when the input pattern is unable to activate sufficiently many synapses and thus unable to reactivate the concept cell. At repetition, depending on the context and the train of thought, an item may be retrieved or forgotten. The outcome of the repetition is uncertain