Incorrect Model Generalization
It’s surprising to me how many people with an engineer’s mind think that the world is “pre-determined”. They have this silly idea that one could “simulate” the world from a given point onward and figure out what will happen in the future on a macro scale. Figure out things like “if I give Sally a phone call today, this will lead to an unlikely future where a plane crashes off the coast of San Francisco in 2047”. In case the idea of being able to simulate reality as a whole, on a meaningfully large scale, with high accuracy doesn’t sound silly to you, I lay out the reasoning better here.
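If you want a quick intuition for the scale problem (the linked write-up makes the argument properly), here’s a toy sketch of my own, using the standard logistic map rather than anything from that write-up: two runs of a fully deterministic, one-line system, started one part in a billion apart, become unrelated within a few dozen steps. Reality has unimaginably more state and far worse measurement error.

```python
# Toy illustration only: the logistic map x_{n+1} = r * x_n * (1 - x_n)
# is deterministic, yet a billionth of an error in the starting point
# makes long-horizon prediction useless.

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.500000000)   # "the world as it is"
b = logistic_trajectory(0.500000001)   # the same world, measured 1e-9 off

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}   |diff| = {abs(a[n] - b[n]):.6f}")
```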
Engineers don’t really get this idea of the world being predetermined from some understanding of physics or technology or whatever. After all, it’s been around since ancient times, when people understood far less about the world than we do now.
It’s more that people who are used to solving small, contained problems will start applying the same models/concepts/heuristics (e.g. causality, almost-absolute determinism, linearity of space and time, a single root cause) to systems that are neither small nor very well self-contained, maybe even to systems that can’t be viewed as “problems” at all.
Of course, this isn’t limited to engineers; a lot of people try to apply a lot of silly models to a lot of things.
Two interesting stereotypes I can create here are:
People have a model that works really well in a subset of scenarios, but they are unaware of the boundaries of that subset.
People have a model that doesn’t work, but external factors make it seem like it does; when those external factors no longer apply, the model breaks.
The first case is that of the engineering-minded person not getting that some systems are so complex as to be unreachable by dynamic conceptual reasoning. Or that of the soldier trying to integrate into the business world and finding that discipline and camaraderie count for much less, and can often be seen as negatives. Or that of the child from an abusive household (or the boomer from a communist country) not being able to integrate into a society built on trust and positive-sum games.
The second case is that of the pretty social butterfly who thinks they “understand people really well” when in fact they are just really nice to have around. Or that of the programmer thinking that it’s “really easy” to get a six-figure job if you just spend a few months “learning to code”, ignoring the prerequisite “be genetically lucky enough to land two standard deviations to the right of the general intelligence bell curve”. Or that of the rich and privileged kid thinking that life boils down to spiritual attainment and “dissolving the illusion of want”, without ever having had to deal with lack of money or systemic adversity.