The Default Settings of the Modern Mind: We're Lagging Centuries Behind
- Rafael von Hertzen
I’ve long suspected that while advances in science and thought can happen at a breakneck pace, our collective worldview cannot update at a similar pace. New discoveries can undo our understanding of reality in an instant, but it can take centuries for those ideas to truly reshape how regular people see the world.
In that sense, I think our worldview is still living in the 1800s. We’ve updated our technologies, but not our core beliefs. Our science may be modern, but our assumptions are still Victorian.
So, let’s start exploring this idea by attempting to map out the default worldview of the average modern mind: the unspoken assumptions that most of us, consciously or not, take for granted.
Reality
The world is made of matter. It is mechanical, measurable, and governed by cause and effect.
Everything that happens has a material explanation.
Science is the ultimate arbiter of truth. If you can't measure it, it's not real.
Human Nature
People are born blank. They are shaped mostly by environment, education, and social context.
Human beings are rational and generally act in their self-interest.
Morality
Ethics are human constructions, not of divine origin.
Equality is good. Hierarchy is bad.
Freedom is the highest value, especially freedom to self-express.
A good life is one of self-fulfillment: to be authentic, happy, and self-actualized.
Religion is outdated; purpose is something you create, not something you are given.
For most of us alive today, this simply feels like the way things are. But these ideas aren’t self-evident truths.
So where did they come from?
Why do we believe all this?
Should we still continue to have these beliefs?
Let’s break it down.
Reality
Where These Beliefs Came From
The modern concept of reality as mechanical and measurable emerged during the Scientific Revolution (17th–18th century), and solidified through the Enlightenment and Industrial Age.
The key architects were:
René Descartes (1600s) — Split reality into mind and matter. Matter was “stuff”: predictable, measurable, soulless. Mind was the private domain of thought. This Cartesian dualism separated consciousness from the physical world, and set up the idea that nature was a machine.
Isaac Newton (1687) — Built the mathematical model of that machine. In Newton’s universe, every action has a precise cause and effect, every future can be predicted if we know the initial conditions. This birthed mechanistic determinism, the notion that the universe runs like clockwork.
Francis Bacon (1620) — Championed empirical science: the idea that knowledge should be grounded in observation and experiment. This gave us the scientific method, extraordinarily powerful, but it also reinforced the belief that only what can be measured is real.
New Ideas We Haven't Psychologically Digested
Our advances in physics, from Einstein’s relativity to quantum mechanics, have repeatedly proven that the world does not behave the way our dominant worldview assumes. And yet, outside of a few intellectual circles, our collective consciousness has barely caught up.
Einstein showed us that space and time are not fixed, but rather dynamic, intertwined variables that stretch and bend depending on gravity and motion. There is no single, universal frame of reference, only perspectives. Each observer inhabits their own patch of reality, different from everyone else’s.
Quantum mechanics went further still. It revealed that matter itself is not solid or fixed at all. What we call “particles” are better understood as waves of probability, potential states that only “materialize” into a definite reality once they are observed.
In 2022, we progressed even further. The Nobel Prize in Physics went to Alain Aspect, John Clauser, and Anton Zeilinger for confirming what decades of theory had suggested: reality cannot be both local and real.
If reality isn't local, it means measurements on particles can be correlated instantly across vast distances, faster than light should allow any signal to travel. This suggests that space as we understand it is more like a surface layer, with a deeper, invisible network connecting everything underneath.
If reality isn't "real" in the technical sense, it means matter has no definite properties until it is observed. This is similar to a modern video game that only renders the part of the world you’re looking at, in order to save processing power.
Whichever way you interpret it, the result is the same: the old picture of an objective, clockwork universe is gone. Observation affects the outcome; cause and effect are not strictly linear or independent of the observer. The universe is not a predetermined machine; it is relational, probabilistic, and participatory.
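To make the local-realism result concrete, here is a minimal sketch of the Bell/CHSH test behind the 2022 Nobel experiments, using the standard textbook measurement angles (an illustrative choice, not a reconstruction of any specific experiment). Any local-realist theory must satisfy |S| ≤ 2; quantum mechanics predicts 2√2 ≈ 2.83.

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-singlet pair
    # measured along angles a and b: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH measurement angles (in radians).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# The CHSH combination of the four correlations.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the local-realist bound of 2
```

The point of the experiments honored in 2022 was that measured correlations match this quantum prediction, so no theory that is both local and "realist" can describe what was observed.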
Human Nature
Where These Beliefs Came From
The modern view of human nature, as rational, self-interested, and shaped primarily by environment, emerged between the Enlightenment and early Industrial Age.
The key architects were:
John Locke (1689) — Introduced the idea of the tabula rasa, the “blank slate.” Humans, he argued, are born without innate ideas; everything we become is written by experience and education. This gave rise to modern liberalism, and later, social engineering and behaviorism.
Adam Smith (1776) — In The Wealth of Nations, he described human beings as rational agents pursuing self-interest. Though Smith himself believed in moral sympathy, his ideas were distilled into homo economicus, the self-maximizing individual, a cornerstone of modern economics.
Charles Darwin (1859) — Evolution by natural selection reframed human behavior as a struggle for survival and reproduction. Over time, this was simplified into the idea that competition and self-interest drive all progress — biologically, economically, socially.
New Ideas We Haven't Psychologically Digested
If our view of reality has lagged behind physics, then our view of ourselves has lagged just as far behind psychology, biology, and neuroscience. We still see humans as miniature Newtonian machines: rational, self-contained, and programmable by their surroundings. In truth, the science of the last century has revealed something far more dynamic.
Behavioral genetics and evolutionary psychology have shown beyond reasonable doubt that the “blank slate” view of human nature is false. Genes influence nearly every aspect of who we are: temperament, intelligence, risk-taking, empathy, even how we respond to stress. Human populations carry genetic diversity shaped by millennia of adaptation to different environments, and those variations can affect physiology and behavior. Likewise, sex differences in hormones, brain wiring, and life-history strategies create distinct statistical differences between men and women that our collective understanding likes to pretend don't exist.
None of this implies fixed destinies or moral hierarchies. It simply reminds us that human nature is biological, not infinitely malleable. Environment and culture matter enormously, but they interact with an inherited template rather than writing on an empty page.
Similarly, neuroscience and behavioral economics have dismantled the myth of humans as rational actors. Pioneers like Daniel Kahneman and Amos Tversky revealed that emotion and intuition drive most of our decisions; reason arrives only afterward. We aren't calculators, but rather storytelling machines.
Morality
Where These Beliefs Came From
The modern view of morality (that ethics are human constructions, that equality and freedom are the highest goods, and that the purpose of life is self-fulfillment) was, once again, born during the Enlightenment and matured through the social revolutions of the 18th and 19th centuries.
The key architects were:
Immanuel Kant (1785) — Placed morality within human reason. Each person, as a rational agent, could determine right and wrong through universal logic, independent of divine command. This made ethics subjective yet universal, a matter of conscience, not scripture.
Karl Marx (1848) — Brought moral language into the material world. He reframed morality as justice between classes, equality as the supreme virtue, hierarchy as exploitation. His moral framework of “oppressor vs. oppressed” still underpins our cultural ethics.
John Stuart Mill (1861) — Advanced utilitarianism: the idea that morality’s purpose is to maximize happiness and minimize suffering. The good life became a matter of pleasure, empathy, and self-determination.
Friedrich Nietzsche (1882) — Declared the death of God and with it, the collapse of objective morality. Humans must create their own values.
New Ideas We Haven't Psychologically Digested
Over the past century, discoveries across biology, neuroscience, and cultural evolution have quietly undermined the human-centered moral framework of the Enlightenment. We’ve begun to realize that morality, while technically invented, is also emergent.
Through theories of cultural evolution and memetics, thinkers such as Richard Dawkins, Joseph Henrich, and others have shown that moral systems evolve through natural selection, just like organisms do. Cultures with “good” moral codes, meaning codes that strengthen cooperation, reproduction, and resilience, tend to outcompete those with “bad” ones.
In this view, morality is not something individuals simply decide through logic or preference. It’s something reality itself selects for over time. Evolution becomes the invisible judge of ethics, rewarding some moral systems with continuity and condemning others to extinction.
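This selection logic can be pictured as toy replicator dynamics. The sketch below is not a model from the cultural-evolution literature, and the numbers are invented for illustration: a moral code assumed to boost cooperation gives its carriers a small growth advantage, and compounding does the rest.

```python
# Toy replicator dynamics: two groups differ only in their "moral code".
# Code A's norms are assumed (for illustration) to strengthen cooperation,
# raising its group's growth by 3% per generation; code B's do not.
growth_rate = {"A": 1.03, "B": 1.00}
population = {"A": 100.0, "B": 100.0}

for generation in range(200):
    for code in population:
        population[code] *= growth_rate[code]

share_a = population["A"] / sum(population.values())
print(f"Share of code A after 200 generations: {share_a:.3f}")
```

Even this tiny, made-up advantage drives code A to over 99% of the combined population within 200 generations, which is the sense in which "evolution judges" moral systems: not by argument, but by differential survival and growth.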
You could say, then, that the morals favored by evolution are the truly “good” ones, not because they feel good, or seem fair, but because they work. They align with the laws of life. Peoples who abandon them eventually go extinct, while those who uphold them remain.
By that logic, the failure of systems like the Soviet Union wasn’t just political; it was evolutionary. If equality as a supreme moral value led to collapse, perhaps it was not the highest good after all. Similarly, utilitarianism, the idea that the highest moral aim is to maximize pleasure and minimize suffering, is challenged by this understanding. Biology shows us that comfort often weakens a species, while hardship refines it.
In a sense, ethics of divine origin do still exist — even without a God. The morals that endure through time are, in every practical sense, the ones creation itself has chosen.
Conclusion
I’ve only scratched the surface here, but I hope the core idea is clear: the worldview shared by most people today rests on foundations laid centuries ago. Whether or not you agree with the newer perspectives explored here is beside the point. The key takeaway is that scientific and philosophical revolutions take far longer to penetrate collective consciousness than to appear in academic journals.
The ideas that quietly shape our default assumptions about reality, morality, and human nature were largely born in the 1600s, 1700s, and 1800s: the eras of Descartes, Newton, Darwin, and Marx. The insights of the 1900s and beyond (relativity, quantum theory, evolutionary psychology, cultural evolution) have yet to truly reshape how ordinary people see the world.
There seems to be a lag, a kind of cultural inertia, between discovery and belief: a delay between what we know and what we feel to be true. I can’t help but wonder if there’s a standard rhythm to this process. Perhaps the length of two average lifetimes, roughly 150 years?
If so, it would mean that we are only now beginning to process the science of the 1900s. The question that remains is: what will the 21st-century mind look like, once it finally catches up?