To successfully do anything at all, you must have some idea of how cause and effect works in the world around you. I’m getting very worried by a pattern I’ve noticed, where large organisations are actively disconnecting themselves from reality because they cannot honestly achieve their goals. This cannot work, but it’s worse than that: the now-blind organisations cannot notice that it isn’t working, and risk becoming stuck in their delusions.
This phenomenon feels like a special case of Goodhart’s Law (“when a measure becomes a target, it ceases to be a good measure”) crossed with the fallacy of Affirming the Consequent (“All men are mortal. Socrates is mortal. Therefore, Socrates is a man.”). The pattern, which I’m calling enforcing the consequent, looks like this: we want A; we know that doing A produces the visible signal B; we can’t, or won’t, actually do A; so we boost B directly and point to it as evidence that A is happening.
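To make the mechanics concrete, here is a deliberately toy sketch in Python. Everything in it (the names, the 25%-a-quarter boost, the disabled check) is made up for illustration, not drawn from any real organisation: the real outcome A barely moves, the proxy metric B gets pushed up directly, and the feedback that would compare the two has been switched off.

    import random

    random.seed(0)

    # Toy model, purely illustrative: "A" is the outcome the organisation
    # actually wants; "B" is the proxy metric that doing A would normally produce.
    real_outcome = 10.0      # A: hard to move
    reported_metric = 10.0   # B: easy to boost directly

    FEEDBACK_ENABLED = False  # the crucial step: the honesty check is switched off

    for quarter in range(1, 9):
        # Real work is hard; A drifts a little but goes nowhere.
        real_outcome += random.uniform(-0.5, 0.5)

        # Enforcing the consequent: boost the proxy directly every quarter.
        reported_metric *= 1.25

        # The feedback mechanism that would flag the divergence never fires.
        if FEEDBACK_ENABLED and reported_metric > real_outcome * 1.5:
            print(f"Q{quarter}: B has detached from A; stop boosting B")
            break

        print(f"Q{quarter}: reported B = {reported_metric:6.1f}, real A = {real_outcome:5.1f}")

After eight quarters the reported number has grown roughly sixfold while the real one hasn’t moved, and nothing left inside the loop can tell the difference; that blindness is the point.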
Boosting B so aggressively often means turning off feedback mechanisms which would signal that more B is a bad idea. I’m seeing this happen in really large organisations and systems, and that scares me: big systems can make big mistakes. Examples after the jump.