Stability
The concept of stability, useful in the analysis of many kinds of equilibria, can also be applied to Nash equilibria.
A Nash equilibrium for a mixed-strategy game is stable if a small change (specifically, an infinitesimal change) in probabilities for one player leads to a situation where two conditions hold:
- the player who did not change has no better strategy in the new circumstance;
- the player who did change is now playing a strictly worse strategy.
If both conditions are met, then a player who makes the small change in their mixed strategy will return immediately to the Nash equilibrium, and the equilibrium is said to be stable. If condition one does not hold, then the equilibrium is unstable. If only condition one holds, then there are likely to be an infinite number of optimal strategies for the player who changed.
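These two conditions can be probed numerically. The following is a minimal illustrative sketch, not part of the original text: the function name `is_stable`, the perturbation scheme, and the numerical tolerances are all assumptions. It perturbs one player's mixed strategy by a small amount and tests whether the opponent still has no better reply (condition one) and whether the deviating player is now strictly worse off (condition two).

```python
import numpy as np

def is_stable(A, B, x, y, eps=1e-3, trials=200, tol=1e-9):
    """Heuristically probe the two stability conditions at a mixed equilibrium (x, y).

    A, B : payoff matrices for the row and column player.
    x, y : equilibrium mixed strategies (probability vectors).
    For many small random perturbations of x, check that
      1. y is still a best response to the perturbed strategy, and
      2. the perturbed strategy earns strictly less against y than x does.
    (The symmetric check, perturbing y instead, is analogous.)
    """
    rng = np.random.default_rng(0)
    eq_payoff = x @ A @ y                        # row player's payoff at the equilibrium
    for _ in range(trials):
        d = rng.normal(size=x.size)
        d -= d.mean()                            # keep the probabilities summing to 1
        xp = x + eps * d / np.linalg.norm(d)     # small change in the row player's mix
        if (xp < 0).any():                       # discard perturbations leaving the simplex
            continue
        col_payoffs = xp @ B                     # column player's payoff for each pure reply
        # Condition 1: the unchanged player has no better strategy than y.
        if col_payoffs.max() > col_payoffs @ y + tol:
            return False
        # Condition 2: the player who changed is now strictly worse off.
        if xp @ A @ y >= eq_payoff - tol:
            return False
    return True
```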
John Nash showed that the latter situation could not arise in a range of well-defined games.
In the "driving game" example above there are both stable and unstable equilibria. The equilibria involving mixed-strategies with 100% probabilities are stable. If either player changes his probabilities slightly, they will be both at a disadvantage, and his opponent will have no reason to change his strategy in turn. The (50%,50%) equilibrium is unstable. If either player changes his probabilities, then the other player immediately has a better strategy at either (0%, 100%) or (100%, 0%).