Spiritual Science

Towards the Unity of Knowledge, the Destruction of Idols, the Reclamation of Western Thought, and the Attainment of Utopia
Physics / Neuroscience / Information Theory / Evolutionary Biology / Developmental Psychology / Self Psychology / Narrative Psychology / Positive Psychology / Linguistics / Media Theory / Expertise Studies / Wisdom Studies / Philosophy / History / Anthropology / Education Studies / Religious Studies / Kabbalah


The utopian society has been a consistent theme in Western thought since Plato’s Republic, yet after millennia of technological and social advancement, the Western project is stumbling. In particular, North American society is gripped by widespread misery, and is in the depths of chaos due to technological innovations and bitter disputes over core Western values.

While some proximal causes for these crises have been identified, the true problem lies with Western metaphysics – our shared beliefs about what is true about ourselves and the universe. Only through a careful review of the causes of our predicament, our accumulated knowledge, and the relationships between discoveries in diverse fields will we be able to develop a full understanding of what is, what ought to be, and what ought to be done to achieve the utopia we have dreamed about for so long.

Note: This work will constitute a substantial response to and extension of Jordan B. Peterson's "Maps of Meaning", as well as Edward O. Wilson's "Consilience", Ayn Rand's "Romantic Manifesto", and many other key works.

Table of Contents


  • Parable of the Madman: Revisited
א - The Destruction of Human Nobility
ב - Rebuilding the Person
  • Ch.7 - First Principles & Boundary Conditions (draft link)
  • Ch.8 - Time & Motion (draft link)
  • Ch.9 - An Ever-Finer Quality (draft link)
  • Ch.10 - Maps of Meaning (draft link)
  • Ch.11 - More Precious Than Rubies (draft link)
  • Ch.12 - The Mechanisms of Lifespan Development (draft link)
  • Ch.13 - Trauma, Stagnation, Possession (draft link, paper)
  • Ch.14 - Eigenvectors of Individual Psychology (draft link)
  • Ch.15 - Psychotechnologies & Mystic Rhythms (WIP)
  • Ch.16 - Mind, Self, Soul, Integrity
  • Ch.17 - Towards Maslow's Fourth Force
ג - Rebuilding the World
Building on the previous section, this will expand the domain of concern to groups, organizations, and societies, grounding their behavior within the laws of physics, establishing common attractor states such as idolatry, and exploring the optimization of social systems.
  • Ch.18 - We Bring Forth a World
  • Ch.19 - Distributed Ledgers & Collective Computing
  • Ch.20 - The Thermodynamics of Good & Evil (see this video for a preview)
  • Ch.21 - Daemonium ex Machina (short preview)
  • Ch.22 - Raise the Child, Raise the Nation
  • Ch.23 - Balancing Me & We
  • Ch.24 - Moving Beyond Politics
  • Ch.25 - The Transvaluation of Values
  • Ch.26 - Into the Mystic
ד - The Pillars of Creation
The scientific worldview, built on pillars of evolution and early-universe physics, is significantly shakier than advertised. Moreover, the Egyptian dynastic chronology is grievously mistaken and is hiding evidence for the Biblical Exodus as-written. The case for and implications of this will be explored here.
  • Ch.27 - Fear & Faith (see this video)
  • Ch.28 - Behold, a Nation That Rises Like a Lion
  • Ch.29 - Sinai & The Mesorah (this video is excellent)
  • Ch.30 - Disputing the Dynasties
  • Ch.31 - Evidence for the Exodus (see this summary)
  • Ch.32 - Magen Avraham
  • Ch.33 - Creation Revealed
  • Ch.34 - The Grand Consilience (see this or this)
  • Ch.35 - Seven Simple Commandments
  • Ch.36 - An Unexpected Utopia
ה - Homo Divinus
The domain of aesthetics, long considered to be subjective, will be grounded in physics, biology, and Abraham Maslow's investigations into high-performing human beings he called "peakers". In light of the Biblical claims previously made, the idea of ultimate beauty will be explored, and an aesthetic called "Judeo-Romanticism" shall be outlined.
  • Ch.37 - The Essence of Creativity
  • Ch.38 - The Essence of Romanticism
  • Ch.39 - Towards a Human Aesthetic (this might be fun)
  • Ch.40 - On the Shoulders of Giants
  • Ch.41 - Repentance & Forgiveness
  • Ch.42 - Joining the Dance

Parable of the Madman: Revisited

Have you not heard of that prophet who lit a lantern in the bright morning hours, ran to the market-place, and cried incessantly: “I have found God! I have found God!”

As many of those who did not believe in God were standing together there, he excited considerable laughter. “Have you not heard that God is dead?” said one. “Have you not heard that science explains the universe?” said another. “Are you an anti-vaxxer? A faith healer? A creationist?” Thus they shouted and laughed. The prophet sprang into their midst and pierced them with his glances.

“God is dead?!” he cried. “Hear, O heavens, and give ear, O earth: God lives! But how does God live? How did your best efforts not wipe God away? Have you not conquered the earth? Have you not reached the stars? Have you not discovered the building blocks of life? You abandoned God and bowed down to colourful rags, to men, and to intricate theories, crying out “Save me! You are my god!”. You held festivals of atonement in Flanders and Dresden, sacred games in Auschwitz and Nanking. Is this how you, self-proclaimed murderers of all murderers, console yourselves after your supposed deicide? Do you not see that you are now perpetually falling? Backward, sideward, forward, in all directions?

Is there any up or down left for you? Male or female? Good or evil? Objective reality? Do you not hear anything yet of the noise of the seekers who have remembered God? Do you not smell the flowers that grow each spring? God lives. God will remain alive. That which was the holiest and mightiest of all that the world has yet possessed remains available to us. Remember the former things, those of long ago; for I have found God, and there is no other like Him. There has never been a greater discovery; and whosoever shall be born after us - for the sake of this discovery he shall be part of a higher history than all history hitherto."

Here the prophet fell silent and again regarded his listeners; and they too were silent and stared at him in astonishment. At last he threw his lantern to the ground, and it broke and went out. "I have come too early," he said then; "my time has not come yet. The tremendous realization is still on its way, still travelling - it has not yet reached the hearts of men. Lightning and thunder require time, the light of the stars requires time, perspectives require time even after information is acquired, before they can be understood and appreciated. This insight is still more distant from them than the distant stars - and yet it has been around them since the beginning."

It has been further related that on that same day the prophet entered places of worship and there sang joyous psalms. Led out and quietened, he is said to have retorted each time: "what are these buildings now if they are not the Kingdom of Heaven?"


Chapter 1 Draft - "Rumors of Glory"

The concept of a “utopia”, an ideal society where hardships and injustices are minimized, has been a central theme in Western thought for well over two thousand years. Indeed, ever since Plato’s cast of inquisitive characters first explored the nature of justice and just societies in Republic, a great deal of Western intellectual effort has been devoted to envisioning and building better systems, societies, and even cities.

While many of these efforts have been theoretical or philosophical exercises, such as Republic, Sir Thomas More’s Utopia, or Aldous Huxley’s Island, other utopian projects have been grounded in real-world concerns and are responsible, at least in part, for some of the most momentous turns in world history. The Bolshevik Revolution, for example, ousted the ruling class of tsarist Russia and implemented radical new ideas about economics and governance, a series of decisions which backfired magnificently, killed millions of people, hobbled Russia for decades, and had worldwide ripple effects throughout the twentieth and twenty-first centuries. Over one hundred years before the Bolsheviks took control of Russia, the American Revolution was also catalyzed by grievances regarding economics and governance, overthrowing British colonial rule and establishing an independent nation in North America. Much like the Bolsheviks, the philosopher-leaders of the United States of America had radical and original ideas about how to run a country, which they outlined in a collection of founding documents such as the Declaration of Independence.

Aside from being one of the most significant documents in the history of Western governance, the Declaration of Independence is unusual in that it makes three key metaphysical claims that form the foundation and justification for the entire American project. The “self-evident” and “unalienable” rights to life, liberty, and the pursuit of happiness, claimed as facts of the universe, are in fact the guiding principles for everything that America has ever accomplished, from the Civil Rights Movement to Hollywood and Motown.

The history books and statistical records clearly demonstrate that the American attempt at utopia has been decidedly more successful than the Soviet efforts. Indeed, despite its many faults, in just a few centuries the controversial “land of the free” has become a world leader in culture, science, technology, politics, and philosophy, as well as the world’s leading military superpower and a primary food exporter. Immigrants from around the world flock to its borders, some even making dangerous and illegal crossings over the Rio Grande for a chance at a better life.

On the other hand, the Soviet economy was an overmanaged disaster with thousands of government offices vainly attempting to control all aspects of production and distribution, leaving many Soviet citizens without the ability to buy a car, a television set, or even many food items considered staples in the West. Furthermore, whereas the waves of illegal immigrants crossing the American border have triggered heated discussions about a border wall to keep people out, the infamous Berlin Wall implemented by the Soviets was not constructed for defensive purposes, but to keep East Germans from fleeing communist rule. Dissenters and dangerous thinkers who could not escape to the West were imprisoned in gulags, often left to freeze to death while performing hard labor in the wastelands of Siberia.

Boulevard of Broken Dreams

As chronicled by political scientist and anthropologist James C. Scott in Seeing Like a State, the timeline of human history contains many attempts at utopia which ended in economic disaster, mass starvation, or even genocide. From Le Corbusier’s questionable attempt at an “efficient” architectural commune in the French countryside, to the violently mechanistic attempt at agricultural utopia in Pol Pot’s Cambodia, the grand dreams of individuals and nations alike have consistently ended in shambles. In the case of the Soviet Union, this can be seen both metaphorically and physically in the remnants of the Berlin Wall, torn down by East Germans in 1989 following decades of silent discontent and now treated like ancient ruins.

Yet, most unfortunately, although the main branch of Western civilization has outlasted and outperformed every instantiation of communism that has been attempted, most notably its Soviet arch-rival, the West’s attempt at utopia also seems to be faltering in recent years – and inexplicably so.

For example, the number of Americans dying from opioid overdoses each year is now roughly equivalent to the number of American soldiers who died in the entire Vietnam War, and assisted suicide has recently become one of the leading causes of death in Canada. The suicide rate of American youth between the ages of ten and twenty-four has nearly doubled over the last decade, and fourteen percent of youth report having experienced serious psychological distress within the past month. Eighty percent of employees report being disengaged at work, and the top regrets of dying Westerners center largely on themes of working too hard, sacrificing one’s individuality, and neglecting important relationships.

Beyond the psychological misery experienced by large swathes of North American society, the average person faces significant economic challenges. Home ownership is but a distant dream for many – for the first time since the Great Depression, over half of young adults are estimated to live with their parents, and the majority of those who do live on their own find their living situation hard to afford. Moreover, the savings of the average household are exceedingly slim, with fifty-six percent of families reporting that they would be unable to find one thousand dollars to cover an emergency.

In addition to the economic woes facing the West, which were exacerbated by the lockdowns and supply chain disruptions of the COVID-19 pandemic, the political atmosphere in North America is strained nearly to the point of breaking. As will be discussed in a later chapter, the political left’s adoption of sophisticated neo-Marxist ideas has led it to conclude that many of the values held sacrosanct in the modern Western tradition, such as free speech and equality of opportunity, are in fact dangerous – and that the people who hold those values are fascists, or at least oppressors. The arguments these activists wield are extremely sophisticated, developed over decades by intelligent cynics, and have caused many to doubt Western civilization’s claims to justice and rationality, either in part or in whole. The divide has become so deep, and the rhetoric so violent, that both the United States of America and Canada now qualify for inclusion on genocide watchlists.

Visions of Destruction

Most distressingly, the West seems to have lost its capacity even to engage with the kinds of bold ideas contained in utopian visions, leaving it mired in short-term thinking and bitter disputes over policy issues. For example, in the late 1940s, an American schoolteacher named John Reber drafted an ambitious plan to build dams in the San Francisco Bay Area, supporting new land development, the creation of freshwater lakes, and better transit routes. Despite the vast changes this would have made to the geography of San Francisco, Reber’s plan was taken so seriously that the government built a scale model of the Bay to prototype his suggestions. This level of consideration and engagement would be nearly unthinkable today, especially for a proposal whose progenitor had no credentials or formal education in the field.

Observing this loss of capacity for engagement with visionary ideas, one of the most successful investors and thinkers in Silicon Valley, Peter Thiel, has diagnosed contemporary Western society with a case of “indefinite optimism” – a vague belief in a better future without any specific plan to get there. In a business context, this manifests as a hollow commitment to “shareholder value”, a myopic concern with share prices or revenue disconnected entirely from matters of innovation, market leadership, or company mission. An example of this is the Walt Disney World resort in Florida, which was originally intended to be a twenty-thousand-person Experimental Prototype Community Of Tomorrow (EPCOT), complete with underground tunnels for cars, a circular “garden city” design, and a wide range of urban planning innovations. Unfortunately, following Walt Disney’s unexpected death in 1966, the company abandoned his bold vision for the Florida project and instead built another theme park, a much safer option that delivered predictable “shareholder value”.

Even science fiction, a Western literary genre known for depicting expansive and fantastical visions of futuristic societies, has lost the optimistic and imaginative themes that were a mainstay in the nineteenth and early twentieth centuries. Whereas readers of decades past might have entertained themselves with H.G. Wells’ The Shape of Things to Come, Alexander Bogdanov’s Red Star, or Charlotte P. Gilman’s Herland, today’s audiences are treated to dystopian and apocalyptic themes through written works like The Hunger Games and The Handmaid’s Tale, as well as screen productions like The Matrix, Altered Carbon, Avengers: Endgame, and even the charming Wall-E. Unfortunately, the West has been consuming flickering images of its own demise for decades, poisoning its own well for entertainment.

Identifying Root Causes

The simultaneous collapse of North America’s mental health, economic prospects, political discourse, appetite for innovation, and even sense of optimism has raised serious concerns among experts, many of whom now refer to the situation as a crisis. Prominent voices in this chorus include psychologist John Vervaeke, who speaks of a meaning crisis, as well as philosopher Terry Patten, who calls this predicament a meta-crisis. Both theorists are gravely worried about the sudden loss of meaning and continuity experienced by Western individuals in the post-internet era, which cannot be reduced to any singular cause but has had catastrophic impacts on individual and societal well-being.

While Vervaeke and Patten have been developing more comprehensive theories to explain these societal phenomena, other scholars have been identifying specific factors that are contributing to the Western decline. Among them is Jean Twenge, whose research into the relationship between teenagers and digital technology has demonstrated clear links between social media use and poor mental health in youth, especially in young girls. The effects of digital technology on adults are also fairly well-known, with researchers demonstrating relationships between depression, loneliness, and screen time. Moreover, social media allows for the use of hurtful or inflammatory language at a distance, facilitating an overall level of hostility that would be infeasible in face-to-face discourse, as well as an “always on” culture where anyone can be reached at any time, for any reason, by anyone.

As previously mentioned, destructive neo-Marxist ideas developed by figures like the Frankfurt School, Michel Foucault, and Kimberlé Crenshaw, among others, are another contributor to the West’s crisis, as these ideas were allowed to run rampant throughout the humanities and social sciences in decades past, effectively training the next generation of teachers, journalists, lawyers, businesspeople, and policymakers on ideas rooted in Marxist resentment and group-based grievance politics. This so-called “long march through the institutions” began well before the turn of the millennium, and the fruits of the neo-Marxists’ efforts can now be seen on many street corners and crosswalks in the form of a rainbow flag, on posters featuring clenched-fist iconography, and in the unscientific and harmful curriculum materials drafted by state experts for young children.

Still other researchers have identified modern childrearing practices as dysfunctional, counterproductive, and even harmful. The phenomenon of childhood sheltering, especially in combination with the rise of social media, seems to have spawned a generation of young adults who struggle with autonomy, lack the kinds of life experience typically gained through risky teenage behavior, and grapple with unprecedented levels of anxiety and depression despite living in relatively peaceful cities characterized by material abundance.

The developmental issues that have contributed to the current North American crisis can also be attributed to the failure of the general population to adapt to the increasing complexity of the modern world. Harvard professor Robert Kegan’s work on cognitive complexity, as well as the surrounding literature on the topic, reveals that an individual’s ability to overcome complex challenges like parenting, being a manager, leading change within an organization, or grappling with disagreements is related to their overall level of cognitive sophistication.

The Inconvenient Truth

Although social media, Marxist philosophy, misguided parenting, and developmental challenges are surely contributing factors to the “meta-crisis” currently facing the West, the reality is that the blame for this crisis can be laid at the proverbial feet of Western thought itself. Indeed, a comprehensive review of Western thought, and the history of that thought, reveals a mind-bending level of fraudulence, corruption, conjecture, and ignorance that has left the modern-day North American deeply confused about fundamental matters of human nature, human history, and the physical world. This confusion has cascaded into philosophies, religions, scientific pronouncements, textbooks, government policies, and even artistic expressions that are hostile to human flourishing and an affront to human dignity.

Indeed, without an accurate understanding of human nature or the world, any system of government is doomed to collapse under the weight of its own ignorance – this is what happened to the USSR, as frustrated Soviet citizens tore down the Berlin Wall while state bureaucrats attempted to conjure goods for the planned economy. Unfortunately for the West, although the unalienable rights to life, liberty, and the pursuit of happiness are decidedly more realistic than the Marxist-Leninist view of human living, Western thought also contains many misconceptions, delusions, unfounded theories, and outright lies that, as can be plainly seen in the current series of catastrophes, are threatening the integrity of the entire utopian endeavor.

Thus, to meaningfully address the multifaceted crisis in Western society, it becomes necessary to critically review the entirety of the Western intellectual project, the metaphysical assumptions it has made, and whether those assumptions are reflective of reality. In the process, it will be necessary to refer to scholarship in many disciplines, including thermodynamics, neuroscience, evolutionary biology, media ecology, expertise studies, and religious studies, eventually facilitating the reconstruction of a harmonious whole out of an array of diverse-yet-related parts. The result of this tremendous endeavor will be a more robust, more expansive, richer, and deeper intellectual tradition that can form the basis for a truly utopian society.

The Unification of All Knowledge

Although this kind of radical juxtaposition of subjects might seem unorthodox in Western thought, the synthesis of the West’s vast stores of accumulated knowledge is, in fact, a long-awaited development and a necessity for continued advancement. Following the fragmentation of Western scholarly effort into highly focused disciplines over the last several centuries, and the many discoveries that have been made since, many luminaries have called for researchers to balance specialization with reconciliation – the future, it is said, lies in finding relationships and agreements between different fields of inquiry.

Among these voices are Cardinal John Henry Newman, the founder of University College Dublin, who emphasized the importance of cultivating learned minds that could systematize and reconcile new information with what is already known. They also include feminist scholar Camille Paglia, highly critical of the exclusion of biology from the field of women’s studies, and lawyer Brian Muraresku, whose investigation of psychedelic use in the ancient Near East involved a blend of archaeology, history, and chemistry that has since become known as archaeochemistry. Over the past several decades, other chimeric fields have emerged that blur disciplinary lines in much the same way as Muraresku’s efforts – among them biophysics, popularized by Erwin Schrödinger, and biologist Edward O. Wilson’s controversial sociobiology.

As it happens, Edward O. Wilson is one of the strongest voices for interdisciplinary harmony in recent years and enjoys credit for the reintroduction of consilience into the modern academic lexicon. This word, with etymological connotations of parts “jumping together” or unifying, was coined by British polymath William Whewell in the nineteenth century to refer to circumstances where multiple independent lines of inquiry all converge on the same answer. The concept was revisited to great acclaim by Wilson in his 1998 work Consilience: The Unity of Knowledge, which introduced many readers to the value and potential of interdisciplinary inquiry for the first time.

Although it is not a word or concept most people are familiar with, the truth is that much of the physical sciences operates, at least implicitly, on consilience. For example, the observations that Charles Darwin made which gave rise to evolutionary theory are completely compatible with Gregor Mendel’s work on gene transmission, which itself is compatible with what is known about chemistry and basic biological processes, which in turn is compatible with the laws of physics. Yet, while the sciences all enjoy a great deal of mutual compatibility, Wilson observes in his work that the relationship between the sciences and humanities is almost nonexistent. He further observes that many of the world’s most pressing problems, such as ecological preservation, cloning, or artificial intelligence, all require deep collaboration between the arts and sciences – and therefore the unification of all human knowledge.

While such a lofty goal may seem like something to be accomplished by luminaries and geniuses in the far-flung future, in reality it is proving to be a very necessary project given the fumbling, stumbling, and crumbling that has become endemic to modern Western civilization. Thankfully, while a comprehensive revision of the West’s accumulated knowledge reveals that some major corrections will have to be made, it also demonstrates that the information, tools, technologies, processes, and wisdom required to create a sustainable utopia have already been discovered.


Chapter 2 Draft - "Original Sin"

Although the roots of Western civilization stretch back to Athens and Jerusalem, our modern society sits on the branch that begins with the Council of Nicaea, a convention of early Christian leaders held to decide, among other things, on the divine nature of Jesus and the date of Easter. This was one of the key events that set into motion Christian Rome and, eventually, the Holy Roman Empire – a Christendom that would dominate European culture and intellectual activity until the Renaissance over one thousand years later.

Because of its position as the seat of European religious belief, Christianity’s influence on the West has been profound, from inspiring artistic works like Dante Alighieri’s Inferno and Michelangelo’s Sistine Chapel ceiling, to spurring the construction of architectural marvels like the Basílica de la Sagrada Família, and, perhaps most importantly, giving the Western world its metaphysical and ethical foundations.

Indeed, the importance of the Christian worldview to the West cannot be overstated. Aside from cultural influences, Christian ethics can be found littered throughout Western jurisprudence, including laws named after the Parable of the Good Samaritan, the criminalization of homosexuality, and implicit value judgements surrounding crime, guilt, and punishment. Even developments within the Church, such as the Protestant Reformation, spawned contributions like legal positivism – the view of legal systems as a means rather than an end.

Unfortunately for the West, and as many Christians are now discovering thanks to the internet, the entire Christian religion is a fraudulent and incoherent mess that has bamboozled and terrified generations of well-meaning people. Its rationale for the existence of evil and suffering is surprisingly weak, leaving the territory open for capture by neo-Marxist ideologies, and its conceptualization of human nature is deeply flawed and anti-human. Furthermore, it can easily be demonstrated that the doctrines carefully crafted by the early Church fathers savaged, butchered, and corrupted Jewish scriptures, injecting Greek philosophy into Jewish thought in a vain attempt to justify a personality cult.


One of the most famous verses in the entire Bible, Christian or Jewish, is the sixteenth verse of the third chapter of the Gospel of John, which states that God loved the world so much that he gave his only begotten son, so that those who believe in that son can have everlasting life. Popularized in American culture several decades ago by the unorthodox ministry of Rollen Stewart, a sports fan who held up a sign simply reading “John 3:16” at televised events, the verse elegantly sums up the most salient aspects of Christian faith and doctrine.

Ultimately, Christians believe in the concept of the Jewish Messiah, a man who will bring about an era of enlightenment and world peace. Specifically, they believe Jesus of Nazareth to be that Messiah – who, as stated in John 3:16, is the literal “son of God”. To the Christian, the way to Heaven is to believe in the efficacy of the blood sacrifice allegedly performed by Jesus of Nazareth, whom the Gospels depict as dying by crucifixion for the atonement of humanity’s sins. Not believing in this sacrifice’s efficacy, in the Christian worldview, means one will certainly burn forever in the underworld.

Although a full refutation of Christianity can and does take many volumes, there are a number of major issues with the aforementioned doctrines. First and most importantly, Christians have incriminated themselves by claiming the Tanakh, the Jewish Bible, as part of their own Bible. By affirming it as Divine Scripture, they have left themselves defenseless against the many contradictions to Christian doctrine found in the so-called “Old Testament”. For example, John 3:16 specifically claims that Jesus of Nazareth is God’s “only begotten son”, which conflicts with Exodus 4:22 –

“This is what the LORD says: Israel is my firstborn son”.

It also conflicts with Psalms 2:7, which refers to the Jewish people –

“The LORD has said to Me: ‘You are My Son; today I have begotten You.’”

Moving through the rest of John 3:16, there is the issue of God allegedly allowing this “only” son, Jesus of Nazareth, to become a human sacrifice to atone for the otherwise-unforgivable sins of humanity. Even a cursory reading of the Jewish scriptures will reveal that this is an abomination within the Jewish faith, expressed quite succinctly in Ezekiel 18:20 –

The soul that sins, it shall die; a son shall not bear the iniquity of the father, and a father shall not bear the iniquity of the son; the righteousness of the righteous shall be upon himself, and the wickedness of the wicked shall be upon himself.

As can be clearly seen, any religion that claims to be based on Jewish scriptures cannot also claim to be based on an act of human sacrifice, as Jewish principles hold that every human being is responsible for their own sins and cannot intercede on behalf of another. Indeed, this fundamental aspect of the Christian faith, of Jesus “dying for your sins”, simply does not hold water given the religion’s self-stated roots in Jewish doctrine and prophecy. This philosophy of personal responsibility is reiterated several times throughout the Jewish Bible, including Deuteronomy 24:16 and Proverbs 28:10, and even animal sacrifices are seen as only conditionally acceptable in the Jewish faith, as stated in Isaiah 1:11, Jeremiah 7:21-23, and Hosea 6:6.


Another aspect of Christian doctrine that is deeply offensive to Judaism, aside from the Christian claim that the God of Israel accepts human sacrifice for purposes of atonement, is the Trinity – the Christian belief that the Biblical Creator is one being with three distinct personas. On its face, this is absurd from a Jewish perspective, as the central prayer of Judaism, and the first words that religious Jews learn as children, is the Hebrew declaration translated as “The LORD [is] our God; the LORD is One”.

However, as Christianity continued to spiral into a personality cult and began to elevate its alleged Messiah to divine status, the early leaders of the Church were forced to spend approximately two centuries working out how to make sense of Jesus’ alleged divinity given God’s unity in Jewish scriptures. Their subdivision of God’s oneness into a trinitarian structure was their best philosophical justification for Jesus’ divinity, successfully causing Christians to look past the fact that God told the Jews that He is definitely not a human, nor does He have physical form, as can be found in Numbers 23:19, Deuteronomy 4:12, and Hosea 11:9.

Thus, the entire idea that Jesus is somehow divine, or even part of a Trinity that places him on equal footing with the God of Israel, is outrageously blasphemous within Judaism and is a further indication that Christianity is not a legitimate religion, but rather a splinter cult that has appropriated Jewish texts.


Astoundingly, the much-loved Jesus of Nazareth would not have even qualified as a Jewish prophet, much less the Messiah or the son of God. As found in Deuteronomy 13:1-6, not only are Jews expected to follow the exact stipulations of Torah, from major observances like the Sabbath to small matters like hand-washing – both of which Jesus is depicted as ignoring – but anyone who encourages others to break Torah law is guilty of grave religious crimes. Christians will often cite the miracles depicted in the Gospels as proof of Jesus’ divinity and spiritual authority, but Jewish scriptures stipulate that even miracle-workers and diviners who encourage the transgression of Torah law remain guilty –

Everything I command you that you shall be careful to do it. You shall neither add to it, nor subtract from it. If there will arise among you a prophet, or a dreamer of a dream, and he gives you a sign or a wonder, and the sign or the wonder of which he spoke to you happens, [and he] says, "Let us go after other gods which you have not known, and let us worship them," you shall not heed the words of that prophet, or that dreamer of a dream; for the Lord, your God, is testing you, to know whether you really love the Lord, your God, with all your heart and with all your soul.

You shall follow the Lord, your God, fear Him, keep His commandments, heed His voice, worship Him, and cleave to Him. And that prophet, or that dreamer of a dream shall be put to death; because he spoke falsehood about the Lord, your God Who brought you out of the land of Egypt, and Who redeemed you from the house of bondage, to lead you astray from the way in which the Lord, your God, commanded you to go; so shall you clear away the evil from your midst.

Furthermore, what these verses specifically refer to is the very stringent prohibition on worshipping anything aside from the God of Israel, which Christians attempt to circumvent by saying that Jesus really is God. However, when confronted with parts of the Gospels where Jesus is depicted as praying to “God the Father”, such as the scenes in Gethsemane, it becomes clear that Jesus is a separate entity of some kind toward whom Christians are directing worship. This, within the framework of Judaism, is one of the worst sins a human being can commit – especially given that Jesus routinely broke Torah, disrespected the Sages and their teachings, and encouraged others to follow along.


Beyond the glaring inconsistencies between Christian doctrines and the “Old Testament” that Christians claim their religion is derived from, there are several errors, misquotes, and even outright falsifications made by the authors of the New Testament that seem to have been deployed to fool ancient peoples into accepting Jesus as their savior. Indeed, the very first chapter of the Gospels includes such an error, with Matthew 1:22-23 explicitly citing Jewish prophecies in support of the claim that Jesus’ virgin birth was foretold in Isaiah.

Of course, this is not the case – a proper reading of Isaiah 7 in its entirety reveals not only that this prophecy is contemporaneous with Isaiah, even naming King Ahaz, but that the original verse was corrupted through mistranslation. Indeed, about two centuries before Jesus’ lifetime, when Jewish scriptures were translated into Greek in the non-canonical Septuagint, the Hebrew word for “young woman” in Isaiah 7:14, “almah”, became the Greek “parthenos”, which means virgin. This mistranslation cascaded into the Book of Matthew, whose unnamed author was working from the Greek translation – not the original text. Thus, the prophecy of Jesus’ alleged virgin birth simply does not exist.

Similar issues can be found in the writings of Paul the Apostle, whose letters to the early churches constitute much of the New Testament, the beginnings of official Christian doctrine, and the religious groundwork for ideas like “original sin” and kosher human sacrifice. Among other falsehoods, Paul erroneously claims in Romans 9:25 that the kind of relationship the Jewish scriptures depict God as having with Israel has since been extended to all nations following Jesus’ alleged sacrifice. Paul justifies this theological position by making selective references to verses in the Book of Hosea and other prophetic writings, ignoring the reconciliatory verses and presenting only selections excoriating the Jewish people to give the impression that their covenant had ended.

Such underhanded practices have, unfortunately, become commonplace in Christian thought, as even laypeople will unthinkingly and unwittingly quote pieces of Jewish scriptures that have nothing to do with Christian doctrine. One such example is Isaiah 49:14 –

But Zion said, “The Lord has forsaken me, the Lord has forgotten me.”

Christian readings of this verse would seem to imply that God has turned His back on the Jewish people due to their transgressions and sins, thus necessitating a “new covenant” made through Jesus’ death on the cross. However, to take this verse in isolation from its surrounding context is an act of violence against the Tanakh, as can be clearly seen –

But Zion said, “The Lord has forsaken me, the Lord has forgotten me.”

Can a mother forget the baby at her breast and have no compassion on the child she has borne? Though she may forget, I will not forget you! See, I have engraved you on the palms of my hands; your walls are ever before me.

With Christian religious claims sufficiently debunked, it is now possible to look past the cross to survey the spiritual wreckage this false religion has left in its wake. One of the worst offenders in this regard is the idea of “original sin”, which characterizes every human being as irredeemably sinful, wicked, and selfish – hardly the foundation for a positive self-concept. Indeed, Christians are convinced that Adam and Eve, the first humans, transgressed a commandment that brought evil, suffering, and death to Earth. According to the Church, every human since then has been born into a fallen state and requires redemption through acceptance of Jesus’ alleged atonement sacrifice and devotion to him – not God.

Moving beyond cultural relativism to issues of truth and fiction, what has been taught to generations of Christian children – by their own parents, and their parents before them – is a poison pill that sabotages every possible notion of human nobility. By Christian logic, a newborn baby, face purple before its first breath, is a sinful, wretched creature that can only be accepted by its Creator by way of human sacrifice.

Everyone knows, on some level, that this makes no sense.

Without the doctrine of original sin, however, Christianity would be forced to retreat to the Biblical positions found in verses like Ezekiel 18:20, where everyone is responsible for their own transgressions and has the potential to become an ish or ishah tzaddik – a righteous man or woman. There would be no need for a fervent “belief” in the saving power of Jesus’ alleged sacrifice, no need for Jesus to be a divine, sinless, superhuman figure, and, most importantly, no need to donate to the Church so it can continue Jesus’ ministry of lies – only a total refocusing from people’s sinfulness and brokenness to their strengths and potential for contribution.


Beyond the anti-human ideas at the core of the Christian faith, people who adopt Christianity as a belief system are often left flummoxed by its paradoxes or dismayed at the tragedies of life. Indeed, it is estimated that at least one in five Christians is questioning their faith at any given time, a number which includes many clergy members – indicating the existence of genuine questions that lack viable canonical answers.

There are many things that one might hesitate to ask their local clergy member, especially if their family goes to the same church and there is a possibility word might get around about their doubts. For example, how could a perfectly good and loving God allow people to die by the millions in the Holocaust, Holodomor, and other catastrophic events? Why do babies get leukemia? Why did my son die in a car crash?

Although some answers exist within Christian thought, most notably Henri Nouwen’s moving Adam: God’s Beloved, the truth is that Christians must ultimately blame the existence of evil on Adam and Eve’s transgression with the forbidden fruit, absolving their Creator of any direct responsibility for the evil that exists in the world. Yet this still raises the question of why evil is allowed to happen in the first place – a dead end for all but the most inventive Christians.

The issue of evil, and other philosophical quandaries like it, leave the West’s foundations on surprisingly sandy ground. This helps explain, in part, why Christian organizations continually fail to live up to their stated principles, from the slaughterous Crusades and witch hunts to blood libels against Jews, pedophile rings within the Catholic Church, and other deep compromises of moral integrity. Scholar Shakka Ahmose, speaking within an Afrocentric context, has labelled the psychological condition experienced by many Christians “Bible psychosis”, particularly for Africans whose relationship to indigenous African spirituality was severed by Christian missionaries. Indeed, an unwavering belief in one’s own fallen nature, combined with groundless metaphysics and a fragile soteriology, has proven to be fertile ground for cultural and moral catastrophe.

As will be discussed throughout this work, the true nature of human beings, characterized by a striving towards nobility, is demonstrably different from what is espoused by the Church. Furthermore, the growing scientific consensus around certain principles in neuroscience and psychology aligns quite nicely with the metaphysical and moral principles of the world religion that people might least suspect – and it is not Christianity.


It cannot be overstated that the entire Christian effort has been, for the most part, an intentional and cunning series of obfuscations, misrepresentations, lies, and threats of eternal damnation that have left much of the Western world scared for their souls. As discussed, a review of the history of Christianity reveals that early Church leaders intentionally pulled Jewish scriptures out of context and warped them to suit their needs. Even more brazenly, the three sections of the Jewish Bible were rearranged to better suit the Christian salvation narrative, in addition to many Jewish prophecies being falsely attributed to Jesus. The net result is a veritable wall of “proof” that the average layperson has almost no hope of deciphering, much less refuting.

As for the clergy members responsible for propagating these falsehoods, many are willing conspirators, at least to some degree. If asked the right questions, they will admit that something is a “mystery”, or that they have heard the issue before but have no answer. Many ministers in Canada and the Netherlands will privately admit to being agnostic, despite leading worship services and exhorting their congregants to “believe”.

The fact that this sophisticated and compelling personality cult has so grievously misled people for almost two millennia is one of the worst catastrophes in the history of the human species, although the collapse of Christian metaphysics in the wake of Darwin’s discoveries proved to be quite the calamity of its own. Before examining the “death of God” and the birth of psychology, however, we must review the impacts of colonialism and industrialization on Western thought, the former of which was facilitated by the Catholic Church’s so-called Doctrine of Discovery.


Chapter 3 Draft - "The Map is Not The Territory"

For about one thousand years after the core aspects of Christian doctrine were settled at Nicaea, they enjoyed a unique position as one of the primary cross-cultural influences on European artistic, legal, and philosophical efforts. However, following the schism between the Eastern and Western branches of the Church, and especially after the efforts of early Renaissance thinkers cast the credibility of the Roman Catholic Church into serious question, the political, social, and intellectual atmosphere of Europe changed quite dramatically.

Although the causes and consequences of the Renaissance are many, one of the overarching trends of this period was the consolidation of power away from the Church and feudal lords into the hands of monarchies, which later became modern nation-states. Additionally, the discovery of North America and trade routes to the Orient triggered incredible competition between these European powers for territory, wealth, military supremacy, control of the high seas, and other resource-intensive endeavors.

Whereas the ruling classes of the Middle Ages seemed content with levying taxes on farmers, conscripting young men for the occasional crusade, and building a castle or cathedral every so often, the grand visions of post-Renaissance leaders began to require unprecedented control over not only the natural environment, but human behaviour as well. This led towards systems and approaches that prioritized standardization, predictability, and compliance, a set of values that cascaded into post-industrial life in the West. While the fruits of this historical trajectory are many, the truth is that the kinds of assumptions and simplifications made in the name of bureaucratic efficiency have very real costs, many of which are only becoming apparent in this generation.


A striking historical example of the hidden costs of “progress” involves the beginnings of scientific forestry, a now-venerable discipline with rather humble beginnings. While the discipline is deeply connected to ecological and sustainable practices today, the primary lens through which post-Renaissance bureaucrats viewed forests was an economic one. Indeed, references to trees and forests in encyclopedic literature focused heavily on uses for various trees and the utilité publique of forest-related products, not on biological or ecological attributes we associate with such concepts today.

This obsession was so myopic and single-minded that early surveyors and forest managers were concerned solely with the volume of wood available within a given plot of land – the only figure relevant to the state officials responsible for managing natural resources. Armed with an unsophisticated knowledge of forest ecology, a science which had yet to be developed, forest managers would clear-cut entire areas and replace them with orderly rows of Norway spruce or Scotch pine to ensure predictable supplies of high-quality timber. Such efforts, of course, ended in disaster.

What early forest managers and state bureaucrats failed to consider was that a forest is an extraordinarily complex ecosystem which does not necessarily optimize for high timber yields. Aside from large trees, there are saplings and underbrush, decaying matter, extensive mycelium networks, and a host of forest creatures that all contribute to the health of the forest – without which no large growth would be possible. By clearing everything but the pine trees away, early forest managers created an ecosystem that quickly depleted the soil’s nutrients and sabotaged the entire effort within a couple of tree generations. This reality check, of course, led to more sophisticated methods, a genuine curiosity about the dynamics of forest ecosystems, and, paradoxically for the bureaucrats, higher and more predictable timber yields.

The brand of short-sightedness exemplified by early attempts at forestry is typical of “modern” organizations, particularly nation-states and corporations. By focusing on a single variable, or a small set of variables, the bureaucrat or manager neglects the complex interdependencies that exist between the stakeholders in their plan – often to the eventual detriment of that plan. Thus, a factory manager fixated on production output might not realize that breaks would improve productivity, or state officials obsessed with maintaining certain kinds of economic growth might legislate their citizens into poverty. These kinds of thin simplifications, as anthropologist James C. Scott called them, tend to have catastrophic implications for both the environment and the people tasked with warping reality around unrealistic instructions from managers.

Indeed, as European nation-states became more bureaucratic and expansive, centralized decision-makers began relating to reality through reports, numbers, and figures. Not only did this insulate them from the limitations and hidden costs of their plans, but it also facilitated the reduction of human beings, already considered sinful and broken in mainstream European (and Russian) thought, to mere statistics and means to an end.


Although Friedrich Nietzsche first announced God’s death in 1882, the reality is that Europe’s perceived locus of control shifted from the Heavens to the Earth following the Renaissance, at least in practice. Armed with increasingly ambitious visions about the future of their nations, monarchs and state officials began plundering natural resources, issuing sweeping decrees for their populaces to follow, and bringing together once-disparate groups of people under a unified vision. Underlying all this activity was a belief – a faith – in their own ability to enact these changes sustainably and successfully, as well as a certain level of narcissistic obsession that prioritized the ideal future over the real.

An early signal of this hunger for a better world – perhaps at any cost – comes from Sir Thomas More’s Utopia, the work that introduced the term to the English language and a depiction of a rather authoritarian society that purports to engender a higher quality of life. Other manifestations of this new trajectory became evident in the elevation of certain European dialects to the status of official languages, something that had not even been attempted until the Grand Siècle of France and the Risorgimento of Italy. These kinds of foundational initiatives, which contributed to the erasure of many diverse local dialects, later gave way to mandatory state education, city planning, social improvement campaigns, and even entire planned economies in the case of the Soviet Union.

The mentality that drives these efforts, and others like them, is called high modernism and is defined by James C. Scott as a belief in the perfectibility of nature and mankind through the application of scientific and technological methods. Although its premises are rarely questioned or verbalized within Western circles, high modernism constitutes a key pillar of modern Western thought. Indeed, the West’s past sense of definite optimism, referenced by contemporary thinkers like Peter Thiel, is implicitly founded on the high modernist belief that progress is possible and that more science, technology, oversight, and focused effort will yield that progress.

Despite the many achievements made by industrialized societies, however, the dilemmas faced by the contemporary West, which largely concern human psychological well-being and the environmental sustainability of the modern Western lifestyle, indicate that this so-called “progress” may have natural limitations that are dangerous to overstep.


Several centuries after the rebirth of scientific thought in Europe, investigations into the natural world made by scientists and researchers began to yield incredible developments such as the steam engine, first invented in 1712, and the power loom, brought to market in 1786. Within decades of their release, these new machines revolutionized entire industries, particularly textiles manufacturing, and shifted the means of production from individual craftworkers to centralized factories.

After the Industrial Revolution, as it came to be known, Western nations experienced dramatic changes in both physical and social organization which facilitated even greater control over many aspects of human life. The urban population exploded throughout the nineteenth century, with the peoples of North America and Europe trading their traditional agrarian lives for crowded, disease-ridden, and crime-plagued cities. Having so many people in one area necessitated the development of quasi-military branches of the government – police forces – who were generally responsible for enforcing a dizzying array of laws, bylaws, ordinances, zoning requirements, and other regulations while stemming the tide of violent crime.

With the plow traded for the machine, workers who would have enjoyed a laborious yet autonomous day on their family farm suddenly found themselves subjected to rigorous supervision from managers, punch clocks, and factory owners intent on getting every penny of value out of each resource. This thin simplification, which reduces everything that happens in and because of a business down to a single dollar value, has made – and continues to make – most modern working conditions miserable to the point of being psychologically unbearable, a fact that can be seen in empirical measurements of workplace satisfaction as well as popular works that condemn corporate life such as Dilbert, The Buried Life, and Fight Club.

Indeed, the Western world has developed a dim awareness of industrialization’s unintended impacts, manifestations of which are scattered throughout contemporary discourse and often disguised as artistic exploration. Fatalist comedies like Dr. Strangelove, activists like Vandana Shiva, grassroots documentaries such as Crude and Zeitgeist, as well as influential works like Aldous Huxley’s Island and James E. Lovelock’s The Revenge of Gaia offer Western minds a glimpse of the realities and possibilities that lie beyond the newspaper and the screen. All too often, however, the combined weight of social programming received in school, overwhelming influences of the mass media, and overall cultural momentum prove to be too powerful for even the most poignant and compelling messages.


While the nineteenth century saw the centralization of goods production into factories and other manufacturing centers, the twentieth century witnessed a consolidation of information processing and distribution into the mass media. This was a significant change from pre-industrial agrarian settings, far removed from newspaper distribution channels, where folklore, or the conglomeration of stories, customs, beliefs, and teachings unique to villages and regions, was the dominant vessel for cultural information.

With most of the Western population located in cities by the end of the nineteenth century, the constant flow of information facilitated by mass media such as newspapers seems to have overtaken or disrupted folkloric chains of transmission, further bringing entire populations into one shared and centralized reality that has persisted and evolved to this day.

The invention of the radio, which afforded the creation of real-time audio broadcasts, allowed for an even greater sense of group unity within cities and countries. By allowing a single speaker to address entire nations at once, this medium arguably facilitated Adolf Hitler’s rise to power in 1930s Germany – a testament to the power of the spoken word amplified to unimaginable proportions. The radio was later joined by the television, which offered a similar cadence of daily shows, news broadcasts, and other messaging that was carefully curated and highly centralized due to the prohibitive costs of television production.

Thus, the daily rhythm of news and information, fed to people at breakfast tables, in living rooms, and during highway commutes, gently permeated the modern psyche with things to think about, important topics as decided by producers and editors, and other “mainstream” information. Entertainment behemoths with their roots in the beginnings of communications technology, such as Disney, provided entertainment for the newly-formed masses – and propaganda during wartime. The end result was, and is, a news environment biased towards novelty and scandal, an anemic entertainment environment motivated by economic concerns and trendy moral lectures rather than artistic excellence, and the centralization of narrative-building abilities into the hands of well-funded players with vested interests in maintaining the status quo.

In many respects, this environment can be considered a kind of consensus reality, maintained by the implicit assumption that if something were “real” or “newsworthy”, then it would be covered in the mainstream. In reality, what has occurred is a series of thin simplifications on a societal scale, privileging some narratives while discrediting others.


Most unfortunately, the needs of the colonial-industrial system extend beyond raw materials such as timber, ores, and money. Indeed, people are needed to work in the mines, grow the crops, oversee the machines, fire guns at the nation’s enemies, and manage the bureaucratic tasks necessary for nation-building. This means that from the perspective of the state and the corporation, the disastrous thin simplification is extended to people, who are seen as mere resources to be developed, used, and eventually discarded.

The first person to realize these implications of the high modernist worldview was Frederick the Great of Prussia, who instituted the West’s first mandatory state education system in 1763. Although some critics of the educational system attribute Frederick’s innovation to the Industrial Revolution, this early endeavour predates the popularization of the steam engine by several decades and was primarily driven by the growing complexity of the Prussian nation-state’s needs. The advantages of mandatory education quickly became clear, and such systems were instantiated in the United States and other Western nations in the decades following the Prussian debut.

Much like how the centralization of “news” and other information in the mass media facilitates a kind of collective consciousness or consensus reality, delivering a standardized education to children, with the curriculum set by the state, quickly became a subtle and vital component of the high modernist project. Although every generation has its innovators who urge systems to refocus on the learner’s needs, the growing need for literate and numerate workers, bureaucrats, and soldiers has always taken precedence over the unique needs and abilities of each child.

Most unfortunately, and in true high modernist fashion, the future economic utility of the child is often the primary consideration of education-related decisions, with performance measured through standardized testing – a method of assessment that is generally hated and mistrusted not only by students, but by educators themselves.

If these systems were working even remotely as intended, their continued existence would be defensible. However, an examination of the learning outcomes achieved by Western education systems, and particularly North American ones, reveals persistent and catastrophic underperformance. For example, it is estimated that half of Americans cannot say with confidence whether the Earth orbits the Sun or vice versa. Even in universities, studies suggest that most graduates could not be considered skilled – or even proficient – with language or numbers. Indeed, it would seem that even the fundamental skills educators hope children will retain for the working world, and therefore the effort spent developing those skills, are largely wasted.

Even worse, the much-lauded capacity for critical thinking, long considered a product of the Renaissance and the crown jewel of Western education systems, is rather scarce. Studies conducted over the past several decades have indicated that there were no appreciable gains made in critical thinking between the first and final years of a university education, and that half of students make no progress in critical thinking ability in the first two years of their degree. Distressingly, many educators struggle to even define what critical thinking is, let alone teach or assess it, suggesting that Western education has been veering dangerously off-course without the people in charge realizing it.

Finally, researchers studying human lifespan development have identified a key capacity called self-authorship that drives not only academic and workplace success, but also parenting ability, the capacity to handle ambiguity and complexity, and a host of other important life skills. Not only have they found that Western education systems generally fail to produce this quality in their graduates, but also that simple training and development interventions can do so reliably. This suggests that state-employed educators are not only failing to deliver on the fundamentals, but have also proven themselves unable to capitalize on powerful curricular content that is well within reach.

These issues are not new or unknown within the education sector – indeed, there have been dire warnings from twentieth-century luminaries like Hannah Arendt, well-known innovators proposing radically different philosophies and systems, and even pioneers like Maria Montessori and Laszlo Polgar demonstrating incredible levels of success with unorthodox methods. However, the status quo has prevailed for over one hundred years, with each generation of students receiving a more confused, shallow, and ineffective education from an increasingly poorly prepared cohort of teachers – somewhat like a society-wide game of telephone.

Aside from their need to develop a pipeline of useful citizens through standardized education projects, bureaucrats and state officials have found a range of other ways to ensure a steady supply of labour for their visions. Most of these, perniciously, are centered around the nebulous issue of “child welfare”, often as defined by the state, usually influenced by the implicit view of human beings as capital. Indeed, the early decades of the twentieth century saw the rise of the helping class, a group of teachers, doctors, child welfare workers, public health officials, and other professional busybodies whose explicit purpose was to disrupt the folklore-driven practices of the populace and instantiate “safer” practices based on “official advice”. Unsurprisingly, it was around this time that traditional female birth workers and health workers such as doulas and midwives were displaced in favor of a more patriarchal and scientific system, backed in full by Western nation-states.

The valiant struggle for the “welfare of the child” also led to some of the greatest ethical catastrophes of the Western world, perhaps most notably the Canadian residential school system – a project that took indigenous children away from their parents to be educated in Western boarding schools overseen by Christian faith leaders.

Although there were ulterior motives such as assimilation, the stated goal of the system was to provide indigenous children with the skills they would need to be successful in the colonized Canada they now found themselves in. The implicit assumption, of course, was that being raised in traditional indigenous systems would lead to a substandard quality of life – a notion that has since been debunked by indigenous thinkers, and one that was in any case largely self-fulfilling, given that the same system had forcibly colonized large swathes of the land that would have enabled more traditional lifestyles.


As can be seen, the general trend in the Western world has been towards conformity ever since the great unification efforts of European nation-states. The natural world, human cultural diversity, traditional craftworks, and even the dissemination of information have all been subjected to a series of thin simplifications motivated generally by economic growth, with Western populations too distracted by the stressors and glittering prizes of modern living to notice the swindle.

Many generations into the project, most people in North America have come to love their system, one way or another. The institution that poses the greatest threat to family integrity – the school system – is seen as a crowning achievement in both Canadian and American societies. The French government under Macron has even gone so far as to try to ban homeschooling. In some respects, the slaves have come to love their servitude, and even go so far as to demand that everyone share their virtual reality.

Indeed, over the past several decades, particularly since the assassination of John F. Kennedy in 1963, thinkers operating outside of the Western mainstream have developed many metaphors, words, and phrases to describe the kind of entrancement experienced by the average Westerner. From pejoratives like sheeple or non-player characters (NPCs) to more poetic descriptions of dream-barriers that stifle all sound or being in the Matrix, those who are aware of a reality beyond history textbooks, newspaper columns, and news broadcasts seem to have come to the overwhelming consensus that people ensconced within the Western paradigm are sleeping, dreaming, or even in a semi-psychotic state as a result of their so-called societal programming.

Perhaps the most accurate term to describe the psychological and social outcomes of these thin simplifications is colonization, a word traditionally used by indigenous scholars to describe the process of foreign powers establishing control over distant territories of land, usually for the purposes of economic exploitation. However, this word can also be used to describe the psychological mechanisms of subjugation and control applied to the people living on that land in order to encourage them to reject traditional ways of life and support high modernist objectives.

Signs of such subjugation include, among other things, a perverted epistemology, or understanding of knowledge, as well as the pursuit of systemic goals at personal expense. The fact that both are necessary conditions for contemporary Western life, as well as enablers for anti-human ideologies like Marxism, suggests that there are many aspects of post-industrial society that must be revised if the project is to be sustainable.

For example, at some point in history, the ancestors of every living Westerner were convinced by a high modernist dreamer that the “new ways” were superior to the “old ways”, a process that is still ongoing in places like the South American Amazon. Often, the affordances of modern technology are presented as evidence that traditional knowledge is inferior or deficient, or, in cases such as King Mongkut’s Siam, scientific methods are used to make predictions about the natural world that are impossible for traditional practitioners to replicate.

Once traditional knowledge structures have been devalued, the targets of colonization can be acculturated to the Western world’s tradition of canonical literature, peer-reviewed science, and expert consensus, whose pronouncements are sometimes enforced as reality through instantiation in law. Thus, the consensus reality of Western thought, including and especially Christianity, spread throughout the Americas and Asia with tactics no more sophisticated than a stage magician’s.

The main issue with the West’s consensus reality, at least for the average layperson with a high school education and perhaps a university degree, is that the average person’s critical thinking and self-authorship capacities are so atrophied by systemic neglect that they are unable to think for themselves – at least not in the highly complex, ever-changing, globalized economy we now inhabit. Medical knowledge alone grows at a rate far greater than any professional could hope to keep up with, which means the West relies on networks of ever-more-focused specialists to make sense of reality. The overwhelming complexity of the entire Western project means that most people are completely reliant on the pronouncements of “experts” for almost every aspect of their lives, while having little to no ability to judge the reliability of those experts themselves.

This dangerous and precarious situation, reminiscent of the religious structures that atheists decry, has been the root cause of countless scandals over the past several decades. Indeed, experts with vested interests in certain societal outcomes have consistently manipulated the public’s trust by using science, expertise, or even simple authority as an intellectual weapon. Examples include the sugar industry’s selfish and malicious war on fat, waged to distract from sugar’s own contribution to poor health outcomes; the United States’ false assertion that Saddam Hussein was poised to unleash weapons of mass destruction; and a great deal of so-called “science” and “psychology” now used by the North American political left as justification for extreme policies.

With traditional knowledge – and even one’s own sense of intuition – replaced with Western knowledge and “ways of knowing”, or methods of knowledge acquisition, high modernist schemers find themselves in a position where they can influence people’s values – or at least exert tremendous influence on their behaviors. As will be discussed later, someone’s sense of “what is”, or their metaphysics, determines their sense of “what ought to be done”, or their ethics. This means that a perverted knowledge structure which centers authority figures and their pronouncements can have cascade effects on human behavior.

One tangible example of this phenomenon is the famous Milgram experiment, in which Western test subjects consistently administered “fatal” electric shocks to an actor. Sadly, all that was required was an authority figure giving the order, along with a scenario that placed the subject in a subordinate role. In effect, the perceived authority of the experimenter, and the unknown costs of noncompliance, proved to be more salient drivers of decision-making – even in a trivial situation – than the natural human aversion to murder. Further consider that soldiers in World War I were routinely asked to walk through machine gun fire and shrapnel and into bayonet combat – a feat so unpleasant that the history of that war is marked by spontaneous truces.

Modern nation-states, particularly those confronted with television coverage of the realities of war, have not only had to deploy national meta-narratives in service of their military goals, but also create entire systems of medals and commendations to incentivize destructive human behavior. In particular, the rituals and honours surrounding medal-bearers provide many with enough incentive to risk their lives in service of – they are told – their family, community, and country.

To a lesser degree, every North American can be said to be sacrificing themselves for the system’s benefit in one way or another. As previously discussed, rates of job dissatisfaction are extremely high, and end-of-life regrets tend to center on unexplored possibilities, unfulfilled relationships, and overwork. Yet the unwavering high modernist belief that the West is making progress, that all of the effort will be worth it, and that the future will somehow be better – together with the promise of lifetime achievement awards, material wealth, and other life outcomes designated as praiseworthy by the system – keeps the machine operational. The idea that there could be another way, as promised by books like Timothy Ferriss’ The Four Hour Workweek or Chris Guillebeau’s The $100 Startup, is dismissed as a dream or fairy-tale thinking by the more “realistic” experts, leaving many otherwise capable and talented people spending their lives pursuing systemic goals rather than authentic ones.


As a result of the consensus reality that has been constructed over the past several centuries, most Westerners live with some level of cognitive dissonance that they have rationalized away – usually with reference to expert prognostication. For example, the fiat monetary system employed by many countries, and particularly the fiscal policies of the central banks, can be blamed for much of the public’s wealth erosion from inflation – yet this subtle form of taxation is taken as a given.

Additionally, it is a matter of public record that most institutions currently occupying a position of public trust have significant corruption issues, rendering their pronouncements unreliable – yet they are still followed unquestioningly. Almost every North American will agree that the majority of politicians are corrupt, and even that substantial systemic changes are required – yet they continue to vote for the status quo.

These naked hypocrisies are signs of a degenerate and complacent population, yet those who point them out, like George Carlin did towards the end of his career, are met with laughter. Indeed, much of Western culture, especially its comedy, can be seen as forms of cognitive dissonance reduction – ways of distancing unpleasant or inconvenient ideas and reducing them to a sideshow, a curiosity, or a triviality. Such has been the approach with indigenous peoples, who sell dreamcatchers at Canadian festivals, yet whose elders are not present at municipal meetings.

In extreme cases, the high modernist narrative can become so fragile that it requires physical force to maintain. This was the case with the Soviet Union, whose extensive informant and prison network was designed to weed out all dissenting thought. This is also seen in various countries’ approach to the internet, which is heavily censored in China, North Korea, Russia, and soon Canada as well. Much of the acrimonious behavior present in the contemporary “culture war”, as it has been termed, is the result of two fragile realities, each with unacknowledged inconsistencies, attempting to erase opposing viewpoints from society or discredit them sufficiently to achieve narrative supremacy.

This has been, in large part, the project of the Marxists and neo-Marxists, who responded to the inhumanities of industrialization by seizing the means of production wherever possible and dictating their version of reality to captive populations. Unfortunately, certain versions of Marxist thought have come to dominate Western institutions and discourse, presenting a potentially fatal threat to an already-precarious project.


Chapter 4 Draft - "A War on Reality"

While the post-Renaissance period on the European continent and in its colonies involved the destruction of traditional lifestyles, the curtailment of human autonomy, and the perversion of folk knowledge, this era was also marked by a trend towards democracy, social egalitarianism, and universal human rights in Western thought. Indeed, the triumphs and achievements of Western intellectuals, and their impact on political and social developments within Western countries, are numerous – including the United States Declaration of Independence, a brilliantly elegant document that established the core metaphysics of many democratic countries.

However, as observed by luminaries like Dr. Martin Luther King Jr., the West’s stated values and principles have been, for racial minorities and other historically disadvantaged groups, like a cheque that has come back marked “insufficient funds”. Indeed, progress towards these ideals has been slow and marked by tremendous setbacks such as Apartheid, Jim Crow, and the Holocaust, all of which were, in a significant sense, “democratic”. Additionally, the post-industrial West is characterized by significant wealth disparities, which manifest in differentiated access to education, influence, and opportunity across social groups. The promises and dreams of technological utopia found in the World Fairs of the early 1900s have given way to a dystopian present marked by overwork, mental illness, and information overwhelm.

In such a world, particularly during the early stages of industrialization, when coal mining, unsafe construction methods, and heavy factory labour were claiming the lives of countless workers, some Western thinkers became disillusioned and began to question the viability of the entire project.

Over the last century, especially following the counter-cultural movements of the 1960s, these criticisms of Western traditions, values, economics, and socio-political structures became mainstream, eventually working their way into Western scholarship – and from there, into the minds of the last two generations of teachers, journalists, politicians, social workers, lawyers, and business leaders. These deeply cynical ideas have proven to be a destabilizing force in the West, leaving North America mired in a “culture war” characterized by political and social brinksmanship and a lack of dialogue between parties. Many in North America and Europe, particularly the younger generations most influenced by these ideas in their schooling, have come to reject Western society in part or in whole, the most prominent target of criticism being capitalism.

Despite the elements of validity contained within the writings of anti-Western theorists like Karl Marx, Michel Foucault, and Kimberlé Crenshaw, deeper investigation reveals that they represent a dangerous return to the premodern superstition, original sin doctrine, sacrificial atonement rituals, and anti-egalitarianism that claimed millions of lives throughout the twentieth century. Indeed, the philosophical tradition of Critical Theory, Postmodern Neo-Marxism, Critical Social Justice, or simply Theory is wickedly sophisticated, deeply resentful, and, much like Christianity, colonialism, and industrialization before it, dangerously out of touch with reality.


The beginnings of modern Western thought can be traced back to the fall of Constantinople in the fifteenth century, after which a wave of scholars from Byzantium emigrated to Europe. Carrying with them centuries of Islamic scientific and philosophical advancements, along with precious manuscripts of Greek philosophy long forgotten in Europe, these scholars broke the vicious cycle of Christian confusion by reintroducing the ideas of Socrates, Plato, Aristotle, and other intellectual giants into European discourse.

Indeed, much of European intellectual effort during the Middle Ages consisted of attempts to reconcile Christian doctrines, which are disconnected from reality, with the human faculty of reason and other sources of cognitive dissonance. Following the rediscovery of Greek thought in Europe, philosophers such as Francis Bacon and René Descartes laid the foundation of the modern Western project with their focus on epistemology, or how things are known. Their conclusions about reason and rationality, including Descartes’ famous Cogito Ergo Sum, led logically to the position that people must be free to think for themselves – an individualist and skeptical stance that questioned tradition and its purveyors.

As the Catholic Church’s influence in Europe continued to be challenged by Galileo’s astronomical discoveries and Martin Luther’s Wittenberg theses, philosophers such as John Locke – and, much later, John Rawls – developed systems of ethics and values that followed naturally from the Renaissance focus on reason. Although there were – and are – many schools of thought within modern philosophy, the general agreement was that logical and rational processes, as opposed to religious doctrine or authority, should determine the course of human action. This represented a radical change from the theocratic-monarchical systems that had dominated Europe up until the post-Renaissance period, established the ideological substructure for the West’s shift towards democratic governance, and proved to be a necessary precursor for the Declaration of Independence previously discussed.

Yet there were Western philosophers who had questions about the limits and utility of reason, most famously Immanuel Kant. In the centuries following the publication of his Critique of Pure Reason, Kant’s tradition of skepticism contributed to communist critiques of capitalist economic systems, then to postmodernist critiques of Western meta-narratives, and finally to contemporary critiques of reality itself, including the existence of biological sex. The end result, particularly because of the popularity of these ideas within education, has been mass confusion, orgiastic displays of collective guilt, and levels of civil unrest in North America that have been likened by some to the Bolshevik Revolution.


As previously discussed, the post-industrial populations of Europe and North America quickly found themselves in undesirable living conditions, working difficult jobs for little pay, and enjoying a quality of life much lower than the lucky few who happened to own the factories, corporations, resources, and land that generate wealth in capitalist economies. As many are aware, the solution proposed by Marx, Engels, and the intellectuals that followed them was to overthrow the capitalist system, by force if necessary, and institute a centrally-planned economy where the citizen-workers were the primary stakeholders of the factories, corporations, resources, and land.

On its face, this is not an entirely unreasonable proposition, given the colonial-industrial violence that has been committed against indigenous peoples, folk lifestyles, and even languages in the name of the Western utopian vision, as well as the naked economic exploitation that remains an unfortunate reality in even contemporary capitalist systems. Contemporary management research, as well as the interesting case studies of Canadian steelmakers Dofasco and Stelco, has also revealed that employees given autonomy, a genuine stake in the business, and the power to make meaningful decisions are more productive and deliver work of higher quality.

However, there are many issues with communist philosophy – the first is a computational one. There are so many things happening within an economy, with so many variables and minor setbacks requiring ad-hoc corrections, that it becomes mathematically impossible to measure the entire economy, process that raw information, and then make timely and effective decisions. Indeed, at the height of the Soviet attempt, it is estimated that there were 46,000 industrial enterprises and 60,000 agricultural collectives, with central planners desperately trying to maintain control over what is essentially a chaotic quasi-ecological system. Of course, after decades of delivering a substandard and anemic quality of life to citizens, many of whom were desperate to escape in some form, the system began to collapse in spectacular fashion in 1989, with the Soviet Union itself dissolving in 1991.

The second issue, which is actually a point of worthy discussion in Western society, is the relative value of product innovation versus the labour required to produce it. If someone spends ten hours and five hundred thousand dollars of their own money to design a widget and build a production facility, how much of the profit should go to them, as opposed to the workers they hire to make the product? Some thinkers, like Ayn Rand, prioritize the innovator and their intellectual labour, while Marx and communist philosophy locate the value in the production labour. As yet, best practices on this subject as they pertain to management praxis remain unclear; however, it seems obvious that well-compensated workers are generally happier and more productive.


Throughout the nineteenth and twentieth centuries, communist philosophies had a tremendous impact on the political and social landscape of not only the West, but the Asian continent as well. Most notably, the Bolsheviks of Russia overthrew the tsarist-capitalist system and instituted the Stalinist-Leninist version of communism, with death tolls in the tens of millions. The People’s Republic of China is estimated to have killed tens of millions during its “Great Leap Forward”. Over three million have died under North Korean communist rule, and the Khmer Rouge in Cambodia claimed over two and a half million lives. Vietnam, Yugoslavia, Cuba, and many other countries have similar stories.

Indeed, the overwhelming trend is that communist revolution leads, one way or another, to mass graves and overwhelming government oppression. By the latter half of the twentieth century, this had become apparent to Western intellectuals through the work of Aleksandr Solzhenitsyn and other whistleblowers from the Soviet regime, which presented communist-friendly thinkers, of whom there were many, with a series of very difficult challenges.

The first challenge, obviously, was to reconcile the genocidal outcomes of every single attempt at communism with its stated goals of equality, social engagement, and neighbour-love, a feat that many avoid even attempting by claiming that “no true version” of communism has yet existed. The second challenge, and a more achievable goal, was to somehow advance Marxist objectives despite growing awareness of its deficiencies.

Fundamentally, Marxism is a group-based philosophy that divides society roughly into the more populous working class, who are subjugated and exploited by the capitalist owning class. Therefore, to continue their project of overthrowing the capitalist system, Marxist intellectuals folded their ideas about group-based struggle into pre-existing conversations about sex, race, the environment, and other social issues. As the twentieth century continued, Marxist influences on social and artistic commentary gave birth to postmodernism.


The philosophical school of postmodernism, with early iterations developed by the Frankfurt School in the 1930s and intellectual giants like Michel Foucault advancing more sophisticated versions in the 1960s, can be most elegantly expressed as a skepticism of grand social narratives – especially the narratives that are most dominant in a culture or society.

By framing Western notions of progress, justice, and equality as bourgeois constructions designed to pacify an exploited populace, the postmodernists injected the Marxist class struggle into cultural discourse with explosive effect. In postmodernism’s artistic manifestations, which were some of the first salvos from the intellectual Bolsheviks, traditional Western assumptions about beauty and artistic value were boldly challenged by works like John Cage’s 4′33″ and Marcel Duchamp’s Fountain, a tradition which also included Andy Warhol and the daring Carolee Schneemann.

The intellectual vanguard of postmodernism, of which Michel Foucault, Jean-François Lyotard, Richard Rorty, and Jacques Derrida were the most influential proponents, launched assaults on the Western concept of reason, pointing out errors and limitations in scientific and rational ways of knowing – with significant help from phenomenologists like Edmund Husserl. This allowed them to cast the folk knowledge held by minority groups as distinct in structure and equal in value to dominant Western narratives, giving rise to the high levels of cultural relativism in the modern West as well as presenting a significant challenge for Western thinkers.

Much like the sophists of Ancient Greece, who believed that language was merely a tool for obtaining power, the postmodernists advanced the idea that reality was more socially constructed than anything else, pointing to the overwhelming cultural diversity around the world as evidence. Leveraging philosophical ideas like John Locke’s tabula rasa and the evidence for psychological malleability being accumulated by early psychologists, postmodernist ideas became quite popular in Western academic circles throughout the latter half of the twentieth century, eventually becoming the dominant paradigm in many white-collar circles due to their influence on postsecondary curriculums. Unfortunately, while the acceptance and consideration of diverse viewpoints represents a genuine developmental advance for Western society that can be credited to the postmodernists, the rejection of Western standards of reason, debate, and even belief in an objective and scientifically-discoverable reality has led to catastrophic outcomes.


One of the main problems with postmodernism is its emphasis on cultural and individual relativity, or the notion that different groups of people, or even different individuals, can have vastly different experiences of the world based on their personal circumstances. Thus, as Kimberlé Crenshaw noted in her foundational works, a black lesbian from a lower socio-economic stratum will experience womanhood, and therefore feminism, differently than an upper-middle-class white woman. The idea that different “intersections” of race, sex, class, religion, and other factors can affect someone’s life experience in unexpected and profound ways is known as intersectionality, and it has since become a philosophical pillar of the postmodernist movement.

The first problem with intersectionality, as noted by Dr. Jordan B. Peterson in his many lectures on the subject, is that the West already solved that issue by prioritizing the individual as the primary unit of consideration in a society. Crenshaw’s proposed innovation, while seizing a tremendous intellectual beachhead for Marxist thought, is actually a step backwards in terms of achieving true justice as it reduces individuals to their group characteristics – and therefore makes them collectively responsible for societal outcomes regardless of individual culpability.

The second problem with intersectionality, especially when combined with a radical acceptance of diverse viewpoints, is that anybody is able to approach a postmodernist-influenced organization and claim special status based on their unique “intersections”. Some of the most outrageous examples include an Ontario teacher wearing fetish gear to class as part of their protected right to gender expression, delusional men identifying as disabled women, white women masquerading as indigenous or mixed-race individuals, and “minor attracted persons” – the latest of many attempts to legitimize pedophilia as an orientation.

Because postmodernist intellectuals and activists have concocted a philosophy that cannot say no to even the most outrageous of claims, over the last few decades it has become almost completely disconnected from reality. As a result, postmodernist activists have had to spend more and more effort to “control the narrative” – the embodiment of their socially-constructed reality – and quash, cancel, silence, or censor opposing viewpoints. The fundamental alienation from reality itself was noted quite insightfully by Soviet escapee Ayn Rand, whose fundamental philosophical axiom was “A is A”, a repudiation of the violently delusional society she left behind.


Unfortunately, the problem with postmodernism is deeper than its accumulation of delusions. One line of investigation has revealed that Americans on the political left, who have generally embraced postmodernist philosophy, experience both a positive change in mood and a change in political attitude following a dose of testosterone. A completely different study found that white liberals – and not white conservatives – were prone to dumbing themselves down when communicating through email with people they thought were black. Various studies have found that postmodernists are less self-sufficient, more motivated towards interdependence, and just as capable of discriminatory behavior – as can be seen from the race quotas, segregation initiatives, and hiring requirements now in place at many North American institutions like Harvard or Dalhousie.

Given these tendencies, it is easy to see how people of a certain mindset or attitude can be seduced by a philosophy that allows them to blame their situation on a complex mixture of systemic factors, signal virtue by nominally championing the causes of highly specific marginalized groups, and develop complex arguments for the growth of government into a caretaker role. As someone becomes more involved with postmodern ideas, however, they are forced to accept – and publicly defend – increasingly outrageous claims and extreme positions.

Beyond its wholesale acceptance of all kinds of “identities”, postmodernist intellectuals and activists are often forced to warp or distort reality in order to maintain the political positions they take on behalf of the groups they claim to represent. For example, the transgender movement began with the relatively benign claim that certain individuals suffering from gender dysphoria should be granted certain allowances to alleviate the social aspects of their pain. Then, as the 2010s progressed and ground was gained, governments moved to protect these delusions in the same way that race and sex were protected, which brought Dr. Jordan B. Peterson to prominence and ignited the “culture war”.

Finally, teachers are allowed to wear fetish gear to class, male-bodied individuals identifying as women can undress in the same room as little girls, and a lesbian in Europe is facing jail time for saying that people with penises, by definition, cannot be lesbians. In this case, as with many other instantiations of Marxist philosophy, the distance between espousing neighbour-love and building gulags is very short.

However, the delusions accumulated by the postmodernist revolutionary project extend far beyond the recent phenomenon of transgenderism. Much of modern feminist philosophy, especially contemporary iterations, is founded on bad statistics, outrageous exaggerations, and a complete denial of the impact of biology on the male and female experience. This can be seen most clearly in the lack of women in technological fields, a longstanding issue for postsecondary institutions and businesses that hire their graduates. According to the feminists, the predominance of men in engineering and computer science can be attributed to a mixture of discrimination, a lack of role models, and cultural beliefs about femininity and women that discourage them from seeking technical occupations. However, the actual science on the issue is markedly different, and demonstrates that men tend to be “thing-oriented” and women tend to be “people-oriented”, which manifests not only in career differences, but the toy choices of toddlers, newborns, and even primates of other species.

Other core feminist grievances, like the gender pay gap, evaporate under more rigorous analysis and can be attributed to differences in career choice, working hours, and childcare decisions. Indeed, the issue of “work-life balance” is almost always a focus of women’s professional conferences, whereas male professional culture, subjected to colonial and industrial influences for much longer, would typically view such discussions as almost embarrassing or “counterproductive”.

Feminism’s departure from reality is documented extensively by gender and art scholar Camille Paglia, who laments the exclusion of the biological sciences from the formation of the first women’s studies departments. Indeed, women’s studies as a field was founded exclusively by postmodernist English professors, a historical decision that is nakedly evident in the overwhelming focus of academics on issues of culture and discourse instead of scientific matters. In fact, feminists are typically hostile to experts in the hard sciences, like the psychologist Jordan B. Peterson or the biologists Bret Weinstein and Heather Heying, as the hard facts presented by such scientists undermine many of the political and social gains that feminists have made, and therefore represent an existential threat to the consensus reality.

As will be discussed in the next two chapters, postmodernist-aligned professionals have even infiltrated and co-opted entire professional organizations in order to accomplish the goals of their beloved interest groups, most dangerously the American Psychological Association – one of the primary influencers of mental health practice in North America and perhaps the world. This extraordinarily dangerous move has already resulted in the sterilization and mutilation of North American children, as well as institutionalized sexual interference in schools and psychiatric practices, a crime to which this author can testify. However, we shall also examine the weaponization of science against the individual, a crime in which every doctor, medical researcher, and pharmacist is complicit.



Chapter 5 Draft - "Operation Mindcrime"

Although many aspects of modern society are highly systematized, one thing that high modernists cannot reliably replicate is the expertise possessed by trained professionals such as doctors, lawyers, and engineers. The kinds of decisions these experts make every day have extremely important outcomes for many people and are often the difference between life and death – this means that the expert class in modern society enjoys a level of respect and deference roughly equivalent to that of the high priests and prophets of Christendom. Problematically, they also wield an equivalent degree of power over the decisions made by groups, organizations, and even nations, often to everyone’s detriment.

Indeed, much like the “holy scriptures” that provided the high priests of the Dark Ages with their claim to authority, the increasingly complicated scientific literature assembled since the Enlightenment provides experts with tremendous powers and privileges within modern systems. The most obvious example at the time of writing is the international response to the COVID-19 pandemic, largely driven by top doctors in regulatory bodies as well as scientists and researchers in the pharmaceutical industry.

The lockdowns and mandates that were a feature of this response, unpopular with large segments of the general population as well as many in science and medicine, were ultimately justified not through a democratic process, but by complex mathematical models developed by British experts as well as the results of various scenario exercises executed by a consortium of bodies including the Johns Hopkins Center for Health Security, the Bill and Melinda Gates Foundation, and the World Economic Forum.

During the pandemic, and in many situations encountered in more regular circumstances, people skeptical of the unprecedented measures were told to “follow the science” and trust the judgement of the experts, who had gone to school for years, worked in the field for decades, and were very confident that their course of action was correct. Although the outcomes of the pandemic response will be discussed later, we shall first contemplate the complete suspension of personal autonomy on a global scale, even bodily autonomy in the case of some mandates.

As discussed in previous chapters, the overall trend of high modernist plans has been to sacrifice human freedoms and even lives in order to accomplish a military, economic, or social goal considered important. While many of these goals are generally considered to be worthy and important sacrifices, such as the Dieppe Raid, many are acknowledged as total wastes of blood, sweat, and tears, such as the entire First World War or the seven thousand migrant workers who died building the World Cup infrastructure in Qatar. Folk songs about railroad builders, millworkers, and working men resonate deeply in the Western consciousness, given the careless and exploitative attitudes those workers faced and the costs of progress that never show up in balance sheets.

While previous justifications for the destruction of autonomy were based on thinly-disguised greed and a desire for material abundance, this new iteration of exploitation is much more pernicious, wielding centuries of scientific development as its justification for why things must be a certain way. The true rot, unfortunately, goes much deeper.

Upon a comprehensive review of the kinds of science encountered only by graduate or postgraduate students, and even then only in passing, it becomes clear that the scientific process that tells humanity what is real has been subverted both by high modernist interests and by the neo-Marxist reactionaries seeking to subvert them. As will be discussed in this chapter and the next – which could be considered jagged little red pills – the world that most Westerners occupy is partially or entirely a delusion, supported by increasingly expansive taxation plans, erosion of the natural world, exploitation of third-world workers, and, in the case of the West, even the exploitation, mutilation, confusion, poisoning, and sterilization of children.

Indeed, two centuries of conditioning from the school system and regimented work environments, punctuated by pre-planned trips and glimpses of joy, has become the beloved ideal, ardently defended by the very people it takes advantage of. We shall begin with the scientific method.


If you ask a scientist what the best thing about science is, and their answer is anything other than the scientific method, that answer had better be uniquely insightful and thought-provoking. Indeed, the rigorous kinds of examination developed by Western civilization and its influences have facilitated a slow grind towards truth, which in turn has afforded us many modern luxuries, unbelievable creative capacities, and a genuine chance at achieving technological utopia.

The scientific method can be summarized as follows:

  1. You are humble enough to be unsure about something and willing to investigate it
  2. You establish what is currently known about the topic
  3. You develop a hypothesis about what might be true
  4. You design an experiment to test your hypothesis fairly, rigorously, and creatively
  5. You execute the experiment and collect your data
  6. You impartially and honestly examine the data to draw conclusions
  7. You share your results with others for review and replication
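As a minimal, playful sketch of these steps – the coin-flip “experiment”, the function names, and the tolerance threshold are all purely hypothetical, chosen for illustration – the method’s iterative and impartial character can be rendered in code:

```python
import random

def run_experiment(trials, p_heads=0.5):
    """Steps 4-5: execute a predetermined experiment and collect data.
    The 'experiment' here is a simulated coin flipped `trials` times."""
    rng = random.Random(42)  # fixed seed so others can replicate (step 7)
    return sum(rng.random() < p_heads for _ in range(trials))

def evaluate(heads, trials, expected=0.5, tolerance=0.05):
    """Step 6: judge the hypothesis against the data impartially --
    the verdict depends only on the numbers, not on our preferences."""
    observed = heads / trials
    return abs(observed - expected) <= tolerance

# Steps 1-3: unsure whether the simulated coin is fair, we hypothesize
# that it is, and commit to the test criteria before collecting data.
trials = 10_000
heads = run_experiment(trials)
print("Hypothesis supported:", evaluate(heads, trials))
```

The essential point the sketch makes is that the acceptance criteria are fixed before the data arrive, so the experimenter’s wishes cannot contaminate the verdict.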

One of the strengths of this method is its impartiality – by creating a hypothesis beforehand and then testing it with a predetermined experiment, the result is independent of the wishes or preferences of the experimenter, and is therefore considered to be a glimpse of unadulterated reality. The latter parts of the method, which include analysis of the data and the ongoing efforts to replicate and confirm novel discoveries, transcend any individual scientist and have, over the last century or two, coalesced into institutions that manage the production and professional use of particular spheres of knowledge. From Professional Engineers Ontario to the World Health Organization, it is generally assumed that the best and the brightest of all nations have worked their way into positions of profound trust and authority.

Indeed, this has been the justification for not only the entire pandemic response, which locked many Westerners in their homes for certain lengths of time, but also for the treatment of the mentally ill, who, for the past century, have been diagnosed with various mysterious and invisible maladies that manifest in destructive actions like psychotic violence or heroin addiction. The entire weight of the scientific project – which works through the method previously described, manifesting corporeally in the journals, governing bodies, policymaking organizations, and scientists who make decisions for other people – is behind the justification of selective violence for the sake of individual or public health. As a brief case study, the indigenous woman trampled by a police horse during the Canadian trucker protest, well within the age range for residential schools, is likely a victim of both types of violence – demonstrably so, even in her golden years.


In 1962, one of the most important works in the philosophy of science was produced by Thomas Kuhn, an American historian and philosopher who developed the concept of a paradigm in science, commonly understood in the popular lexicon as a framework or worldview that influences the entirety of someone’s thought on a topic. Through his historical analysis of the hard sciences, particularly the emergence of quantum physics from its Newtonian counterpart, Kuhn realized that there were two kinds of scientific research – so-called normal research, which constitutes the incremental gains made by experiments that yield predictable results, and research into anomalies, the experimental results or accidents that cannot be explained by current theories.

An example of this would be the photoelectric effect, the emission of electrons from a metal exposed to light. Once this became discernible following the development of vacuum technology and measurement equipment, it took Einstein’s work – an extension and revision of the work of physicists like Newton and Maxwell – to explain the effect, and in the process the scientific understanding of the universe was completely transformed.

This is called a paradigm shift: a process, catalyzed by an anomaly, that changes the agreed-upon laws of the universe, the acceptable methods of inquiry, the standards for what constitutes evidence, the concepts considered table stakes for entry into professional discourse, and the kinds of inquiry that take place. For example, nobody cared about uranium under earlier paradigms – indeed, the element was not even discovered until 1789, by Martin Klaproth.

Whereas mathematics, physics, chemistry, biology, neuroscience, and the computer sciences seem to have largely settled into a comprehensive understanding of their domains, the so-called “science” of psychology is a relative newcomer to the field, first gaining prominence with the release of Freud’s works in the 1890s and early 1900s. Indeed, upon close examination of its theories and the history of their development, it becomes clear that psychology lacks a coherent paradigm, and therefore cannot be considered a science despite its use of different aspects of the scientific method.

And, whereas mathematics, physics, chemistry, biology, neuroscience, and the computer sciences have rarely been the cause of institutionalized violence against people who do not fit easily into colonial-industrial society, this upstart prodigy, developed not two centuries ago, has the power of law in many countries, with the ability to confine “mentally ill” people on wards, physically restrain them to their beds, medicate them against their will, and do all sorts of other things – for their own good. As many mentally ill people can attest, and as is often documented in popular culture like Ken Kesey’s One Flew Over the Cuckoo’s Nest or Terry Goodkind’s The Law of Nines, the treatment from these systems is harsh, authoritarian, utterly rigid, and belligerent to the point of keeping patients with religious delusions in confinement without actively seeking support from religious practitioners – or even considering such support as a viable option.

These deeply ignorant and egregious oversteps on behalf of this so-called “science” have facilitated crimes as grotesque and barbaric as Mengele’s experiments on Jews or those of Unit 731, perpetrated on a similar scale and with similar scientific efficiency.


Some examples of safe and effective treatments, as claimed by yesteryear’s psychologists, include Freud’s recommendation of cocaine to patients, the lobotomy, and electroshock treatment, all of which were developed and approved within psychology’s attempt at the scientific method. Put simply, it boggles the mind and causes one to wonder how things could have gone so wrong. Moreover, the ongoing mental health crises, which are getting worse despite psychologists’ best efforts, are another indicator that psychology is lacking not just a paradigm, but perhaps a rudder and a moral compass.

Given these realities, it becomes necessary to review not only the discoveries that psychologists claim they have made, some of which have turned out to be massive frauds, but the history and context of the alleged discoveries.

Although the history of psychology does include many investigations and themes in Western culture that predate Josef Breuer’s therapeutic work with Anna O., the history of modern colonial psychology begins with those interactions, and the theories developed by Sigmund Freud based on those interactions. It is to this complicated history, full of devious subtleties that have spawned countless internal demons, that we shall now turn.


Modern psychology, as many are aware, begins with the psychoanalytic techniques and theories developed by Freud, Breuer, and O., which were based largely on the link that does exist between childhood traumas and adult discontent. Their school of thought was soon disputed by behaviorists like Ivan Pavlov and Alexander Luria, precursors to modern neuroscientists who were concerned primarily with brain activity and the behaviors it drove. The incompatibilities between these schools of thought, as well as other existing problems within psychology in the early twentieth century, catalyzed the development of Third Force psychology, the precursor of today’s positive psychology – an attempt by luminaries like Abraham Maslow, Erik and Joan Erikson, and Carl Rogers to reform and humanize the field.

As the twentieth century progressed and computing technology was invented, scientists and researchers began drawing many connections between the brain and their new machines. Both processed and stored information; both could make decisions, though the computers required explicit instruction; both used electricity to function. In addition to many exciting research opportunities, this also created many metaphysical problems for psychologists and philosophers, as some began wondering if computers were conscious – or could become conscious – and what the implications were for humans. This opened up a new branch of psychology, largely intertwined with philosophy and metaphysics, which sought to understand the causes and functions of human consciousness, develop an understanding of the human self, and determine the nature of human decision-making.

Indeed, as will be discussed in the next section, the resonances between the human brain and the computer go far deeper than many appreciate. Furthermore, contemplation of these resonances reveals some of the deepest laws and principles that can guide human behavior. However, each of these approaches to human psychology – psychoanalytic, behaviorist, third-force, and “computational” – can be metaphorically and poetically related to some of psychology’s deepest and most hidden problems.


Psychoanalysis, fundamentally, is rooted in and focused on the role of trauma in human life, a historical reality that has cascaded into the use of “trauma” – a clinical term – by many Westerners for things that do not fit the definition. Much of Western life, at least in North America, is focused on the discovery and healing of trauma or the alleviation of negative feelings: it is estimated that almost thirty percent of Americans saw a therapist during the pandemic, and that about one in ten Americans engage with a mental health professional in any given year. The psychological literature on trauma is tremendous, and includes not only many details about how subtle issues in the parent-child relationship can facilitate intergenerational trauma, but also how most people grow from traumatic events and become stronger. This final detail – post-traumatic growth – seems to have escaped general social discourse, where traumas are used as justification for reparations and apologies, implying that the negative aspects of the trauma outweigh the potential for growth. Aside from questions of justice, which are entirely salient, the general trend begun by Freud and Breuer following their collaboration with O. is a dangerous and subtle usurpation of human autonomy that robs unsuspecting laypeople of growth opportunities.

Generally speaking, the behaviorists discarded the introspective techniques used by the psychoanalysts and focused strictly on observable material outcomes of brain activity. Although this is generally not known, Pavlov and many other behaviorists were skilled surgeons, and often obtained their discoveries by way of subtle alterations to the bodies of their animal subjects.

This very quickly brought to the fore what is known as the “mind-body problem”, a central dispute within psychology which, taken alone, disqualifies it from being a paradigmatic science. Essentially, those in the behaviorist camp believed that there is nothing in the universe aside from matter, and that what people experience as the “mind” is largely irrelevant to the study of human behavior. This is known as monism: the belief that the universe contains only one kind of stuff.

Their opponents, who by definition include almost all followers of the Abrahamic faiths, believe in dualism, which holds that “mind” is something separate from matter and should be studied and treated differently. This issue has never been resolved in psychology, and many have simply stepped past it, adopting elements of both positions depending on their context and focus.

The development of Third Force psychology, in some respects, can be seen as a successful attempt to correct for some of the worst excesses of the baseline problem. In a 1968 memorandum to the Salk Institute for Biological Studies, Abraham Maslow shared his experience with psychological problems that were impossible to resolve within the values-free domain of traditional scientific inquiry. He expressed frustration and disappointment that psychology was predominantly focused on the sick and unwell rather than on the thriving and fulfilled. This memorandum, and the work of people like Maslow, Rogers, and the Eriksons, gave birth to what is now the positive psychology movement and its focus on becoming one’s best self.

However, there is a problem here – what is the self? Much like the mind-body problem, which led some psychologists to logically conclude that there is no such thing as mind in the universe, the self is a nebulous concept in psychology, with several definitions competing for dominance in the field. Even more problematically, neuroscientists engaging with Buddhist ideas have put forth the idea that there is no such thing as a “self”, citing resonances between Buddhist anatta doctrine and some discoveries in their domain. This is another problem that has been largely stepped over, with many therapeutic workers implicitly assuming that there must be a self – otherwise, their work to improve others would make no sense.

The advent of possibly-conscious computers created many more difficult questions for psychologists and philosophers. In some respects, the machines served as a mirror with which humanity could examine its own consciousness – and largely come away with no real answers. Indeed, psychologists do not know how we are conscious, why we are conscious, what consciousness is actually for, or even how to define consciousness and differentiate between “conscious” and “unconscious” brain activity. Put simply, anything to do with consciousness is wrapped up in complicated philosophy, and some of the brightest minds consider at least one of the problems – the “how” of consciousness – intractable.

The final conceptual problem of psychology is the deepest, and not poetically connected to any school in particular. Rather, it is the issue of the methods used by psychological researchers to obtain information, which are a mixture of introspective methods – like psychoanalytic therapy – and observational methods – like Ivan Pavlov’s dog experiments. Fundamentally, these methods are incompatible unless a highly advanced Neuralink-style device can be developed that confirms, beyond any shadow of a doubt, that the self-report of a subject or patient indeed reflects the activity going on in their brain or “mind”. Without that, the discoveries from these two types of inquiry will remain, at some level, disjointed – meaning that even if the baseline problem, the mind-body problem, the self problem, and the consciousness problems are solved, without a device to translate between introspection and observation, psychology will never have a hope of being a science.


Given psychology’s inability to gain a firm grasp on reality, combined with its history of barbaric practices, it is probably unsurprising to many that psychology is a field rife with all kinds of fraud, from individual research fraud to extensive institutional deception on important matters.

One area of reasonable suspicion is the close relationship between pharmaceutical companies, hungry for lifetime customers, and the psychological industry. Concerns have been raised, for example, over the fact that diagnoses of ADHD-like “disorders” have been skyrocketing among young boys over the past couple of decades, demonstrably because psychology-aware teachers notice “symptoms” in their classrooms. These symptoms include fidgeting, excessive talking, and impulsivity.

All of these are common behaviors in young boys full of energy, but the “solution” is a lifetime subscription to Ritalin or Adderall. Only in recent years have the actual experts in child psychology concluded that ADHD is over-diagnosed, and that even “milder” symptoms are being treated unnecessarily with pharmaceuticals. The treatment of depression with drugs is also a concern – recently, an extremely large and comprehensive study indicated that depression is not caused by a chemical imbalance, throwing decades of pharmacology and psychology into the trash and calling large segments of the literature into question.

In terms of outright research fraud, some of psychology’s biggest names have been involved in it. Philip Zimbardo’s internationally-famous Stanford Prison Experiment, which seemed to suggest that human beings were naturally prone to mistreating each other, has been found to be the result of undisclosed manipulations that created the famous outcomes. The Implicit Association Test, which purports to measure levels of subconscious bias towards people of different demographics, does not meet any of the standard definitions for diagnostic reliability in psychology, and even one of its founders has retracted the bold claims made about the test. Even the popular and seemingly intuitive concept of power poses, taught to young people everywhere as a confidence hack, is a placebo when the data is rigorously analyzed.


Perhaps the most unfortunate kinds of deception within psychology are the lies perpetrated over decades by governing bodies, which make claims about human nature that are then cascaded down into law and therapeutic practice. If a psychological untruth makes its way into a professional relationship, it can have catastrophic outcomes for patients and clients, and may even be dangerous, as neither the client nor the practitioner will be aware of the deception of the more insular governing body. This is, sadly, the case with homosexuality, officially considered since 1973 to be a positive and healthy expression of human sexuality, to be treasured and protected in the same way as heterosexual relationships.

The story of homosexuality begins in the 1950s, with early research on the psychological nature of homosexuality conducted by people like Evelyn Hooker and Alfred Kinsey, which formed the foundation for later discussions. In the decades that followed, a psychiatrist named Robert Spitzer led the development of the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM), a document published by the American Psychiatric Association that provides a centralized list of all known mental disorders and the symptoms most commonly associated with them. People who worked with Spitzer on the project relate that it was largely driven by his singular vision, with a process that was often opaque and autocratic, although the DSM-III was an undeniable success that provided professionals with a coherent language for describing many psychological maladies.

Spitzer’s leading role within the DSM made him an ideal figure to weigh in on the issue of homosexuality, which became increasingly contentious through the 1950s and 1960s, erupting into marches and riots after the Stonewall incident of 1969. At that time, homosexuality was classified in the DSM as a mental disorder, which created systemic barriers for queer people as well as tremendous prejudice. Indeed, some of the barbaric methods within psychology, such as morphine-induced nausea treatment and electroshock therapy, were then being trialed as potential cures for homosexuality, making it a burning-platform issue for the then-nascent gay pride movement.

As a result, the gay pride activists of the day began engaging in highly aggressive protest tactics at psychological conferences, shouting down speakers, being generally disruptive, and delivering the ultimatum that their lifestyle be removed from the DSM’s list of mental disorders. Enter Robert Spitzer, whose 1973 proposal to do exactly that was not a scientific one, but another opaque and autocratic move guided by his personal vision for the document.

As is documented plainly in the proceedings of the American Psychiatric Association, Spitzer elegantly redefined “mental disorder” as a psychological condition that impairs general functioning or is unwanted by the patient, thus excluding homosexuality from the definition and removing it from the DSM by fiat.

In all fairness to the gay pride movement, many of their concerns regarding the treatment of homosexuals at work and in society were quite valid. There are many stories of brutality and violence, countless stories of rejection at the hands of friends and family, and the kinds of invisible tragedies and hardships only made known to the mainstream through brilliant queer artistic works like RENT or Sense8. Indeed, although it is not well-known, the author of The Ugly Duckling, Hans Christian Andersen, was himself bisexual and wrote the famous parable as a kind of autobiographical tale.

Yet, a deeper look into the lives of Hans Christian Andersen, Robert Spitzer, Alfred Kinsey, and many others who were part of the poorly-defined “LGBTQ movement” reveals a number of troublesome patterns, which are reflected in a body of scientific literature that has been systematically hidden, discredited, and outright denied by the psychological mainstream since Spitzer’s 1973 redefinition of mental disorder.

Hans Christian Andersen, as related by popular sources, was born in Danish slums to humble parents and may have contended with alcoholism, the prostitution of family members, and potentially even sexual abuse during his childhood. In nonfictional autobiographical accounts, Andersen relates that he was abused at school for purposes of character improvement, and was discouraged from pursuing creative outlets by the faculty.

This mixture of experiences, juxtaposed with his later success as a weaver of tales and imaginations, is roughly the narrative espoused in The Ugly Duckling – though, sadly, Andersen’s love life remained unfulfilled, with persistent problems in finding a female mate as well as same-sex attractions that remained mostly unexplored.

There are resonances here with Robert Spitzer. During his childhood, Spitzer reportedly dealt with a “professional patient” for a mother and a “cold, remote” father. He attended therapy as a teenager for these issues, and used it as an outlet to talk about his fascination with women.

In fact, as noted by Sexual Personae author and lesbian dissident Camille Paglia, there is a consistent pattern of childhood disturbance in the male homosexual community; in one public opinion offered on the subject, she went so far as to say this pattern was ubiquitous across all the gay men she had ever known.

Juxtaposed against these allegations, which would likely be decried as genocidally homophobic in the post-Trump era, is the American Psychological Association’s website, which states that there is “no consensus about the exact reasons” and that “most people experience little or no choice” in their sexual orientation. The LGBTQ movement, for its part, has rallied around Lady Gaga’s smash hit Born This Way, claiming that orientations and identities are innate, biologically determined, a fundamental part of who people are, and therefore inviolable.

But what does the science say?

A 2019 study on the relationship between genetics and homosexuality, which involved almost half a million Europeans, found that genes can account for only between eight and twenty-five percent of the variation in homosexual behavior. Although this is not a perfect comparison, owing to differences between the studies, genetics have been found to drive forty percent of divorce behavior and the majority of bipolar predisposition. Investigations into intrauterine factors that might drive homosexuality, such as fetal hormone exposure, as well as theories about evolutionary origins, such as kin group optimization, remain inconclusive, unproven, weak, or implausible.

However, much more in line with Paglia’s estimation and the lives of Andersen and Spitzer, a study published in 2012 found that homosexuals were significantly more likely to have been sexually abused as children than their heterosexual counterparts, and that this likely plays some kind of causal role. These findings were foreshadowed by another high-quality study published in 2008, and seem relevant to the treatment of LGBTQ individuals given that such trauma would cascade into the many other mental health issues typical of victims of childhood sexual abuse.

Furthermore, research on over one thousand male homosexuals in the twentieth century found a pervasive trend in the male homosexual’s family dynamic, whereby the mother was inappropriately close, controlling, and smothering, often engaging in emotional incest and sometimes relating to her son in intimate ways she would otherwise reserve for female friends. Meanwhile, the father was cold, abusive, competitive, and rejecting – much like Robert Spitzer’s was reported to be. These findings, too, have been corroborated by studies of homosexual clergy members and by a 2005 study involving two million Danish subjects; yet, for understandable reasons, they have been flatly, aggressively, and even violently denied by activists who are adamant that there is nothing wrong whatsoever.

Put simply, these findings, given the context of homosexuality’s removal from the DSM-III and Spitzer’s own life history, are extremely troubling and indicate there is a tremendous amount of unrecognized and unresolved pain in the homosexual community. This hypothesis is supported by a significant amount of literature on LGBTQ mental health, including studies that find homosexual and bisexual men are nearly three times as promiscuous as heterosexual men, are more likely to have been paid for sex, and engage in extreme sexual acts significantly more often.

Non-heterosexuals are much more likely to be dependent on drugs and are more likely to have a diagnosed disorder such as bipolar, panic disorder, borderline personality disorder, or obsessive-compulsive disorder. This author, for example, is diagnosed with bipolar and chain-smokes like a Romanian veteran.

In the context of monogamous same-sex relationships, which are rarer than admitted despite the fierce fight for their recognition, homosexual relationships are less stable, more likely to be plagued by infidelity, and more violent: male-male partners are at least as likely to be violent as heterosexual couples, and lesbian pairings are the most violent of any sexual combination.

The metaphysical switcheroo perpetrated by Robert Spitzer in 1973 was not entirely a terrible thing, as it catalyzed the general acceptance not only of homosexuals, but of people with all kinds of diverse attributes, including disabilities and even mental illnesses. Today, homosexuals are celebrated for their many achievements – predominantly artistic, likely, though not confirmed, to be due to the gene-driven Openness trait – and even gender-diverse people are beginning to find acceptance. At the very least, there is more room for different kinds of masculine and feminine expression, and some of the rigidity of the formative years has been relaxed.

As disappointing as this series of lies has been, the rot goes much deeper within the psychological bodies. Following the depathologization of homosexuality in 1973, LGBTQ activists with a neo-Marxist agenda began infiltrating the field, eager to gain and wield professional power in service of a demographic that had long been poorly served by electroshock therapy and morphine. However, fifty years into the deception, what started as a push for mere acceptance has devolved into the sick and calculated destruction of entire generations of children, whose bodies and minds have been sacrificed at the altar of inclusion.


If Hans Christian Andersen, the original ugly duckling, were alive today – or this author, who displayed some gender non-conformity as a child, were born a decade later – there is a pretty high chance we would be either transgendered or nonbinary. This can be stated with such certainty because children across the West are being taught about the legitimacy of transgender identities, to the outrage of many conservative parents – a phenomenon so pervasive in the historical record that it needs no citation.

However, what does need a citation is the World Professional Association for Transgender Health’s definitive report on canonical transgender science and the standards of psychological and medical care for transgendered individuals, the seventh edition of which was released in 2011.

The document presents evidence that over seventy-seven percent of prepubescent boys referred to clinics for gender dysphoria before the current transgender craze ended up desisting – that is, no longer experiencing dysphoric feelings – by adolescence. The same is true for girls, seventy-three percent of whom desisted around puberty. The document also notes that autism spectrum disorder, anxiety, and depression are usually found to co-occur with the gender dysphoria.

Despite the evidence in this document, gender-affirming care is mandated by Canadian law should a child begin to express dysphoria to a parent or caregiver. In a move reminiscent of the residential school era, parental concerns about the legitimacy of the new transgender “science” are being overruled by way of guns and confinement in jail, as one father in British Columbia has found out.

But what of this new transgender science? Where did it come from?

Ultimately, this author personally witnessed the devolution of the LGBTQ movement during his own involvement throughout the 2010s. Even in universities pre-2014, transgendered people were quite rare, although most people on campus were familiar with them. In the mid-2010s, however, the ideas became more mainstream in LGBTQ thought and therefore in academia, which leans liberal in almost all disciplines except the hard sciences. During this time, the author joined an online group dedicated to transgender science, and after observing the academic discussions for about three years, it became clear that the group had begun seeding their research using each other – or even themselves – as subjects.

These activist-academics had also come to explicitly reject the standards for transgender care espoused by the World Professional Association for Transgender Health in favour of radical and unproven notions of gender, which have since become commonplace in social discourse by way of the deference afforded to experts in universities. By forming loose, largely informal networks of neo-Marxist academics who could all cite and reference one another, they – and other activist-academics – have been able to usurp the scientific method and terribly confuse many people. The constant threats and reminders of transgender suicide have helped accelerate their social gains, mentally whipping many into scared compliance.

The end result is that almost eighty percent of children, if not more, who are receiving puberty blockers and sex reassignment surgeries, or even gender-affirming care that does not first investigate childhood traumas like sexual abuse or narcissistic family dynamics, are being intentionally neglected by a corrupted discipline that cares more about repressing its own mental illness than it does about alleviating others’ suffering. Worse, their parents are being told that the only other option to gender-affirming care is a dead child, with the tremendously high suicide rates of gender dysphoric individuals provided as evidence that they are obliged to comply.

Furthermore, the transitioning process, which involves hormone treatments and surgeries to construct new sexual organs, is held by many transgender activists to be a panacea for the condition. Yet the suicide rate for transgendered people who have fully transitioned remains, in some studies, twenty times the population average. Additionally, the ranks of so-called “detransitioners”, or people who regret their transition, are growing. This demographic, many of whom are women on the autism spectrum, cite other causes for their dysphoric feelings and report having been influenced by the lies propagated by activists.

It would seem that instead of allowing their ugly ducklings to grow into swans, a colonized and impatient population, deluded by mentally ill activist-psychologists, has become obsessed with the defeathering and tearing-apart of these beautiful birds… in the name of diversity and inclusion.


The author’s own life experience is proof that this particular rabbit hole goes even deeper. After attending regular talk therapy for work stress in 2019, the author discovered that some of his LGBTQ orientations, which included open polyamory and closeted bisexual tendencies, were almost completely alleviated and replaced with a strong desire for a wife and a vegetable garden.

According to the American Psychological Association’s 2009 Task Force Report on conversion therapy, or therapy intended to change sexual orientation, this is not possible even when it is the explicit goal of the therapeutic relationship, let alone by complete accident.

The report also claimed to find “no credible evidence” for the efficacy of such practices, then concluded that people do not face a choice about their sexual orientation and that “affirmative treatments” are the only responsible option, much as would be said for transgenderism a decade later. However, this task force appears to have been composed of six activists for gay rights causes, with not a single actual practitioner of conversion therapy accepted to the committee, nor even a neutral party. Furthermore, their report conveniently dismissed every single paper documenting conversion therapy success as being methodologically flawed, allowing them to say there is no credible evidence while avoiding the inconvenient truth that evidence happens to exist.

In fact, there are many papers and studies that document the efficacy of conversion therapy, and they seem more reputable than the LGBTQ activists would like to admit. One paper, which accurately described the author’s family dynamic despite being published thirteen years before his birth, found that cognitive psychoanalysis had a 30-50% success rate. Another researcher found that a similar proportion of surveyed homosexuals accessing therapy or pastoral care experienced a change from predominantly homosexual to predominantly heterosexual, along with positive changes in their psychological, interpersonal, and spiritual well-being.

Infamously, Robert Spitzer, the man responsible for the revolution of the DSM, published a study with over two hundred former homosexuals who claimed to have been cured through therapy, which he later retracted after intense criticism from the ideologically captured psychological mainstream – he claims it is his only professional regret, seemingly unrepentant for his mindcrime to the very end.


Between LGBTQ fraud, the Stanford Prison Experiment, the Implicit Association Test, the collapse of the serotonin theory in depression, further fraud in Alzheimer’s research, and a general “replication crisis” in psychological fields, it would seem that modern society is fundamentally confused about many matters of human nature. Zimbardo’s fraud cast a deep cloud of pessimism for decades, its shocking revelations about human nature spreading like wildfire. Children are being sterilized, mutilated, lied to, and intentionally confused by deranged adults desperately seeking to legitimize their illnesses. The American Psychological Association has even been implicated in torture of United States prisoners.

This leaves Western society with a deep problem. If many things commonly believed to be true about human nature are in fact fictions, and the scientific bodies involved in perpetrating these lies have been captured for decades, then the true nature of human nature is very much an open question to be resolved. Once the answers to “what is?” have been generated, questions such as “what is important?” and “what ought to be done?” can be asked, which would begin to lead society out of its present dream and towards something more closely resembling a utopia.

It must also be said that psychology has been used by various powerful interests to control and influence human behavior on massive scales. Beyond propaganda, which is extremely subtle in the internet age and takes the form of “fact-checkers” funded by corporate media, funded in turn by pharmaceutical and financial interests, there is commercial psychology, also known as marketing. The insights gathered by researchers have long been used to identify leverage points in human behavior, with some marketers even carrying out behavioral tests involving human propensities to purchase items under certain conditions.

To paraphrase dissident psychologists specializing in narcissism, most modern Westerners live in a world of flickering images, denied the opportunity of looking in a mirror and seeing their true selves. Kept in an insulated kingdom of plenty, much like the Buddha before his journey began, children now suffer profound levels of anxiety, depression, gender confusion, and other mental illnesses, likely attributable to the odd mixture of overparenting and underdevelopment they are subjected to, in combination with the malicious lies propagated by institutions. Put more simply, children are so underdeveloped they don’t even know what gender they are, yet they are being overeducated and driven to perform by a state system that requires their intellectual and physical labor to maintain economic growth. All the while, one in ten Americans requires therapy, as was discussed, with LGBTQ persons heavily overrepresented in that demographic and, as the transgendered activists themselves point out, representing many of the more extreme and urgent cases.


Despite the rampaging technicolor catastrophe that psychology has degenerated into, there have been many genuinely useful discoveries over the years. Many of the aforementioned names, save for the fraudsters, have contributed at least several puzzle pieces to our self-understanding and have likely helped keep the Western psyche together despite the pernicious influences from high modernists and their subversive counterparts.

Some other examples include Internal Family Systems Therapy, an innovative kind of developmentally-supportive therapy that encourages dialogue between disparate “parts” of a person; the daring work of Bill Masters and Virginia Johnson, who “discovered” the female orgasm; Nancy Friday’s pulse-racing My Secret Garden, an ethnographic account of the exciting and diverse world of female sexual fantasy; and the wickedly tricky Derren Brown, a hypnotist, mentalist, showman, and stuntman who uses advanced psychological techniques to conduct experiments, teach people life lessons, and empower his fellow human beings.

As will become relevant in the second section, Abraham Maslow’s vision for third force psychology, and in his final years an even more exciting fourth force, has driven much of the optimistic tones now taken by mainstream psychology. His focus on values, needs, and self-actualization is one of the pillars of the modern self-help movement, forms standard knowledge for psychology undergraduates, and will mark important waypoints on the reconstruction of psychology from first principles in physics.



Chapter 6 Draft - "Idols, Isms, Ideologies"

One of the most famous European philosophers of recent memory is Friedrich Nietzsche, whose aggressive style, bombastic claims, and prescient analysis have made his intellectual legacy one of the most compelling in the Western canon. Writing in the second half of the nineteenth century, Nietzsche’s scathing critiques of traditional European value systems, and grave concern at what would come to replace them in the twentieth century, represent one of the most insightful diagnoses of Western metaphysical ailments ever to be produced.

Indeed, throughout many of his works, Nietzsche is primarily concerned with values and their expression in societies. One of his most provocative concepts, later appropriated by the Nazi regime, was the übermensch, or “overman”, someone who determined and pursued their own values through force of will. Someone who failed to do so was, in Nietzsche’s opinion, doomed to nihilism and decadence, essentially a self-destructive lifestyle.

One of Nietzsche’s main targets of criticism was Christianity, which he correctly intuited to lack a coherent metaphysical substructure for the values it had imposed on Europe for centuries. Writing in the wake of Darwin’s On The Origin of Species, as well as the momentous 1860 Oxford exchange between Bishop Samuel Wilberforce and Thomas Henry Huxley, Nietzsche foresaw what he called the Death of God, or the collapse of a widespread belief in Christianity, as well as the spread of nihilistic philosophies and value systems in its wake:

Are we not plunging continually? Backward, sideward, forward, in all directions? Is there still any up or down? Are we not straying, as through an infinite nothing? Do we not feel the breath of empty space? Has it not become colder? Is not night continually closing in on us? Do we not need to light lanterns in the morning? Do we hear nothing as yet of the noise of the gravediggers who are burying God? Do we smell nothing as yet of the divine decomposition? Gods, too, decompose. God is dead. God remains dead. And we have killed him…

What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it? There has never been a greater deed; and whoever is born after us – for the sake of this deed he will belong to a higher history than all history hitherto. (from The Gay Science)

Although the fictional and poetic exposition of these ideas can be somewhat hard to grasp at first glance, something Nietzsche took pains to discuss in some of his book introductions, what is being discussed here is the collapse of the European value structure following the Darwinian ideas prevailing over Christianity’s faulty metaphysics. The “festivals of atonement” and “sacred games” alluded to here turned out to be different from their face value interpretation, with orgiastic destruction made the rule in Unit 731, the Soviet gulags, and the Nazi death camps.

Given these atrocities, one of the most pressing questions for many modern thinkers is how human beings are so susceptible to genocidal behavior, both in the context of religious belief and in secular situations. As has been previously discussed, the cascade effects of Christianity, modern European statecraft, the Industrial Revolution, mandatory state education, and psychology have not only whipped Westerners into valuing compliance, which Nietzsche described as a slave morality, but have also corrupted their self-understanding and caused them to become decadent, or prone to pursuing self-destructive goals in service of modernist ideals and values.


In one of many bright points of consilience that will be encountered hereafter, Nietzsche also correctly intuited these false belief systems to be idols, something that he made explicit in the title of one of his more famous works. As will be seen, this is a very insightful Biblical reference that will turn out to have deep roots in neuroscience, thermodynamics, and even Orthodox Jewish thought largely hidden away until after the Holocaust.

Regarding its Biblical connotations, which would have been most obvious to Nietzsche’s readers, an idol is commonly associated with a statue of an alleged divine entity that receives veneration or charity in return for blessing. In modern contexts, this would include everything from Buddha statues that receive token veneration to Hindu statues of Ganesha, and even statues of Mary or Jesus. However, in ancient times, and particularly in the Near East, the deities du jour included Moloch and Baal, both of whom demanded various forms of child harm or sacrifice.

Sadly, the slaughter of children, virgins, and other blameless group members was a feature in ancient civilization, with monuments to Aztec bloodshed serving as tourist attractions, burial pits in Carthage indicating the presence of dark elements in that lost civilization, and evidence existing to corroborate Jewish accounts of pagan child sacrifice in the Levant. Although the mindset behind such a practice may seem completely alien to the Western reader, there are a number of clues hidden throughout the historical record as to why such things took place – and how similar patterns continue to manifest in Western society in the form of gender-affirming care and child drag.

The first clue comes from Jewish oral history, which shares that serving these idols provided followers with an incredible spiritual high. The second comes from an obscure-yet-influential book by psychologist Julian Jaynes about what he suspected to be a mechanism in ancient minds that literally gave people a voice in their heads, which was attributed to gods or deities. This mechanism, Jaynes hypothesized, was the precursor to the “inner voice” experienced by many people to one degree or another throughout their lives, and, as will be discussed in the next section, to the human sense of intuition, which is too often ignored in Western decision-making processes. Other relevant insights could include the still-reputable Milgram experiment and other documented atrocities in modern history, including the murder of Jewish children by policemen who said they were just following orders.

Although a comparison of religious and secular sources is not always an easy endeavour, the general consensus between rabbis and neuroscientists is that the statues themselves had no real power and were not physically communicating with their carved or graven mouths. Indeed, this can be seen plainly in Jewish scripture, which elevates its criticism of idolatrous practices to satire:

Neither do they know nor do they understand, for their eyes are bedaubed from seeing, their hearts from understanding. And he does not give it thought, and he has neither knowledge nor understanding to say, "Half of it I burnt with fire, and I even baked bread on its coals, I roasted meat and ate. And what was left over from it, shall I make for an abomination, shall I bow to rotten wood?" (from Isaiah 44)

Indeed, the phenomenon of these idols and the sacrifices they demanded, as well as the possible internal voices encouraging ancient peoples to slaughter children, was psychological in nature. Given this, as well as Nietzsche’s explicit comparison of Christian and European value systems to idols, the question arises of whether the ideologies of the twentieth and twenty-first centuries, known broadly as the isms, may also be idolatrous.


Although the dangers of totalitarian ideologies like communism and National Socialism are obvious given the tremendous number of bodies they create, the civilizational project that began with the European Enlightenment and continues worldwide to this day, which has been previously referred to as modernism or high modernism, is more subtle in its approach, yet just as devastating to human well-being. Indeed, as predicted by Nietzsche before the dawn of the twentieth century, the period following the collapse of Christian morality in the West has been extremely chaotic, with various political, religious, social, and special interest groups all chasing their unique vision of a utopia for as long as they can maintain the votes to enact their plans.

The results of this chaos have admittedly produced many potentially-useful innovations, such as modern digital technologies, the concepts of which have roots in the Cold War arms race. However, from the killing fields of Yugoslavia to the Black Lives Matter and January Sixth riots in America, the struggle for ideological supremacy between mutually-incompatible worldviews has been tremendously costly in terms of human life and the global progression towards peace. Even the Cold War, an ideological conflict long thought to be over following the destruction of the “antifascist” Berlin Wall, has re-erupted in Ukraine in the form of a standoff between NATO powers and Russia.

For his part, Vladimir Putin, the man who declared the Russian war on Ukraine, has accused the West of a number of crimes and transgressions. These include trying to turn Russia into a “weak dependent country”, turning Ukraine into a de facto colony following the color revolution almost a decade before Russia’s declaration of war, using Ukrainian territory to engage in prohibited biological research, and engaging in genocide against Russian-descended residents of the Donbass region. Putin has also spoken poorly of Western elites, generally describing them as selfish degenerates who loot the world for their gain – messaging with strong resonances in some Western self-criticisms.

While many of these criticisms could be dismissed to some degree as wartime propaganda, the fact remains that much of what has been previously discussed, from the spiritual destruction wreaked by Christianity to the swindles of state-run education systems, is a testament to the high modernist need to exploit and control in service of an objective. It is indeed the case that the wealth disparity in Western countries has only grown more pronounced over time, creating an underclass that has become increasingly dissatisfied with the status quo.

Yet, the machine grinds on, with exponential growth as the primary objective that implicitly underlies most national and commercial policy. Indeed, the idols of growth, science, and progress, meta-narratives from European statecraft and industrialization long decried by environmental advocates and postmodernists, seem to have largely supplanted the Christian concept of God – and to great harm, as previously seen. Sadly, the preponderance of audaciously thin simplifications made by technocrats and scientists, laundered to the public as legitimate knowledge, have left the modern West trapped in an exploitative dream that is taking the civilization dangerously close to the kind of techno-dystopia foretold by Huxley and Orwell.


One of the oldest talking points in the environmentalist movement is the relative environmental impact of beef versus vegetables. Although this seems to be sophisticated scientific thinking based on some insightful math, the reality is that cows, goats, chickens, rabbits, and other animals are used on homesteads and small farms to graze on and maintain non-arable land. Furthermore, the monocrop approach favored by the West, which invisibly provides for the environmentalists living in cities, is specifically designed to control the entire environment, kills all animals and insects in the area, and depletes the soil over time. Indeed, the reality is a great deal more complex than environmentalists currently believe it to be, as it is likely that a mixture of meat and produce would in fact be more sustainable than the idealistic vegan approach. This type of simplistic thinking is typical of leftist elites who have never worked or spent time on a farm and only know the food supply chain through the mainstream media.


A relatively new talking point among environmentalists and the political left is the importance of switching to electric vehicles. The “scientific” claims made to support these policy objectives, many of which are being instantiated throughout Europe and North America, have to do with the impact of fossil fuel emissions on the environment. Many might believe the case is so clear, and the need to act so great given the possibility of climate change, that there could be no question on this matter. However, this is also a thin simplification, as can be seen from the incredible environmental costs associated with mining the rare metals needed for the car batteries.

There are also human costs associated with the Western luxury of emissions-free vehicles, as many human rights groups discovered when over forty thousand children were found to be mining cobalt in Africa to feed the West’s green dreams. This is in addition to the known child labor issues associated with many consumer goods from China, as well as many other human rights abuses tucked away under the glitz and glam of city lifestyles often favored by the political left.


Perhaps the greatest test of Western rationality, ingenuity, scientific progress, and high modernist techniques of control was the COVID-19 pandemic, where global bodies achieved an unprecedented level of coordination across all facets of society in pursuit of a comprehensive goal of death reduction. As mentioned in the previous chapter, the entire weight of the scientific project was behind this coordinated response, with predictive models developed largely in Britain, vaccines rapidly developed by pharmaceutical companies, and additional insight provided by bodies like the World Economic Forum and the Gates Foundation, all of which resulted in the public being urged to “follow the science” at every new development.

Unfortunately, in hindsight, it seems that even the best of Western medicine and science turned out to be an idol that far too many have been sacrificed to, from people dying alone in retirement homes to the developmental and educational delays in young children. Indeed, high modernist bureaucrats, drunk on their own power and unaware of existential constraints on their ability to control reality, demonstrably destroyed large facets of society for little appreciable gain. Their thin simplification, which proved fatal for many, was the COVID-19 death count – a number to be minimized, as shall be seen, at any cost.


The extensive lockdowns in most Western countries have had many documented human costs, yet they are touted as an absolute necessity by health authorities and government policymakers. The truth of this controversy, considered unmentionable by many, is that bureaucrats exclusively concerned with the pandemic’s death toll, which turned out to be almost entirely concentrated among the elderly, obese, and infirm, contorted society and destroyed countless livelihoods unnecessarily. This is demonstrated by the comparison between the Swedish and North American approaches, which yielded similar case counts per capita despite starkly different approaches to social control.

The high modernist delusions of grandeur that drove these totalitarian measures remain, at least theoretically, possibilities in the future as new pandemics emerge. In the wake of the COVID-19 pandemic, many Western countries have also become signatories of various international agreements regarding travel regulations and coordination of future responses, signalling future restrictions of autonomy in a histrionic attempt to keep everyone maximally safe.


As anyone with access to YouTube can see, even particulate matter such as vape smoke can pass through surgical and cloth masks relatively easily, and even N95 masks can leave holes for airflow around the nose. Sustained use indoors, and even outdoors – despite gleeful emphasis from public health officials – seems to be of questionable benefit, given that masks stop spray at best. Furthermore, the health benefits of masking children and youth, who are not at risk for serious or lethal cases of COVID, are extremely questionable given the speech impediments and other developmental delays that have resulted from such a practice. Once again, the thin simplification of “death reduction” has created collateral damage that will require even more systemic effort to rectify. Indeed, the West’s approach to problem solving creates new problems with every solution.


Without question, the most hotly-debated topic of the entire pandemic has been the vaccines developed by pharmaceutical companies at Warp Speed. These were flatly and loudly refused by about twenty percent of the North American population, and since the introduction of booster rounds, the general population seems to have lost interest as well. Serious allegations of crimes against humanity have been swirling, with politicians making reference to the Nuremberg Code and commissioning investigations into the ethicality of the vaccine.

Furthermore, before and during the deployment of the vaccine, experts and scientists from around the world called its efficacy and safety into question, including one of the original contributors to mRNA technology, one of the world’s most-published cardiologists, and even doctors, nurses, embalmers, and other front-line workers concerned by the things they saw following the introduction of the vaccine into the arms of the public. The parallel realities occupied by people who follow the mainstream guidance and those who listen to alternative narratives have created some of the starkest political divisions in the West, complete with crimes motivated by vaccine status.

However, before continuing an investigation of this topic, which will be fraught with uncertainty and some conjecture, it is important to be clear on one thing – what a vaccine is. Unfortunately, much like mental disorder received a convenient redefinition fifty years ago, and much like racism was redefined over the last decade to include notions of systemic power, and much like woman was recently redefined to include transgender women, the word “vaccine” was redefined by the USA’s Centers for Disease Control to mean “protection” instead of “immunity” in September 2021 – shortly before the most aggressive mandates began.

This kind of language game is just one example of the thousands of subtle redefinitions, incomplete perspectives, and half-truths that have been sold to the public under the guise of science to justify the injections. No sustained debates whatsoever have taken place between vaccine skeptics and public health officials, and in fact the author’s own attempts to obtain answers about concerning data from his region were met with police visits for so-called “mental health checks”. A meeting with a hospital ethicist, who could not deny the data regarding risk-benefit for children or the ethical logic behind the concerns, was more fruitful in that it yielded an admission that the anti-vaccine arguments were “powerful”, but that significant bureaucratic pressures precluded action.

Even the majority of North Americans who assented to the first two injections out of concern for their neighbors and a desire to return to normalcy have become demonstrably skeptical. Booster uptake is lower than bureaucrats would like, with only twenty percent of Canadians having received four doses, and the child vaccination rate is also lower than hoped. Even general compliance with traditional vaccines has seen a significant drop, as the abuses of trust and the self-admitted overselling of the COVID vaccines have eroded confidence in public institutions.


Although it cannot definitively be said that the vaccine was a population control measure, as some of the hardened conspiracy theorists have suggested, it is not clear that the mRNA project has been a net benefit to the West, nor is it clear that ethical procedures were followed during this very expensive and expansive experiment. Whistleblowers from the clinical trials, which were accelerated from a decade-long process to three months, have identified ethical and reporting issues – a plausible allegation given the pharmaceutical industry’s long history of corruption. A child who was crippled by her side effects appears to have been listed by Pfizer as having “gastrointestinal distress”. Despite repeated assertions that vaccine side effects are rare and minor, a German insurance company’s data suggests significant reporting issues, and national data from Denmark and Canada suggest that the vaccines may even be a net harm. At any rate, it is starkly obvious to even the most committed modernists that these new “vaccines” do not confer long-lasting immunity, only limited-time protection – something even admitted by vaccine mogul Bill Gates in a 2023 interview.

Although the true costs of the vaccine endeavor will not be known until at least several years in the future, when the mid-term and long-term effects of mRNA on the human body become more clear, the fact remains that this extremely divisive and controversial public health measure, which cascaded into the violation of civil liberties in Canada and the curtailment of freedoms in most Western countries, seems to have been a bust – as booster uptake strongly indicates.

If anything, the entire endeavor, and particularly the insistence of public health bodies that children required these injections, represented a form of bodily sacrifice – even child or infant sacrifice reminiscent of Moloch – in the name of public safety. Even worse is the plight of the many thousands of people who felt pressured to take the vaccine because of work-related mandates, and whose very bodily autonomy was irreparably violated by a thin simplification purportedly made for their own benefit. This is aside from concerns about the influence that unelected global organizations are having on national health policy – an extremely dangerous form of high modernist control that could usurp the autonomy of entire nations without their realizing it.


Generally speaking, the ideas and values that drove the West’s COVID-19 response are outlined in a publicly-available book by World Economic Forum founder Klaus Schwab, who enjoys close relationships with many of the most aggressive pandemic bureaucrats, including Jacinda Ardern of New Zealand, Justin Trudeau of Canada, and Emmanuel Macron of France. In the book, called The Great Reset, Schwab essentially proposes taking advantage of the COVID-19 pandemic to instantiate a form of techno-communism on a global scale.

His ideology, broadly speaking, is predicated on three core pillars. The first is a recognition that industrial society has had many negative impacts on the natural world as well as the socio-economic situations of many countries. The second is an arrogant assumption that a better-managed and more efficient society can overcome its own structural flaws and achieve environmental sustainability. The third, final, and most Machiavellian pillar is a belief that a global emergency like a pandemic represents the perfect opportunity to make large-scale societal changes in service of this vision.

Although the new global system, which is already being assembled before the eyes of a crisis-weary populace, will never be called communism, it will have all the salient features of the modern Chinese iteration. Specifically, this will include strict ideological conformity, extensive government interference in economic affairs, the concentration of wealth and power into a selective elite, and significantly reduced personal freedoms facilitated by surveillance technology.

The next step for the World Economic Forum, and for many Western countries, is the curtailment of personal freedoms in the name of environmental sustainability. Now that large segments of the European and North American populations have been conditioned into radical compliance on account of one emergency, the thinking seems to be that new emergency measures, which already include restrictions on farming in Canada and the Netherlands, will be accepted without much fuss. Alongside the reduction of agricultural activity – especially meat production – insect matter is now being introduced to children in schools, as supplements in grocery items, and in other food settings. Unfortunately, a deeper investigation of the issue of eating bugs reveals that it comes with many health issues and that meat is much healthier.

Most unfortunately, the “far-right” claim that leftist elites dream of a society with an underclass who lives in pods, eats bugs, and is watched over by sophisticated technological systems is largely coming true. The entire endeavor has proven to be a total usurpation of Western metaphysics, with false narratives about environmental sustainability, nutrition, human nature, and public health being wielded as holy scripture against an undereducated and overmanaged population. Without the inquiry skills, the courage, and the mentorship to research these matters for themselves and realize that the truth is more nuanced than longstanding narratives, most people have no chance – and the elites likely know it.


As can be seen, upon a rigorous examination, modern society proves to be every bit as religious as Europe under the Catholic Church. The modern Western mind, colonized beyond belief, has been raised in an educational and home environment marked by emotional neglect, a lack of true critical thinking, and a crushing lack of autonomy. Fed a steady stream of pablum from mainstream news and other official sources, much of it demonstrably false, biased, or epistemically questionable, large swathes of society live in a dream reinforced by priest-like experts and an incomprehensible mountain of academic literature backing their claims.

Nietzsche’s predictions, unfortunately, have come true, as have Maslow’s concerns about the lack of values in human-focused fields like psychology. The Christian ideal, flawed as it was, gave way to the pleasure-pain principle writ large across society, with statues and odes to suffering now ubiquitous across leftist-dominated campuses, the pursuit of happiness exploited to justify unscientific notions about gender and sexuality, obesity and drug epidemics across several Western demographics, and general civil unrest marked by unprecedented levels of violence in many North American cities.

Although a Westerner familiar with media ecology and conspiracy theories may point to the unrelenting messages about consumption, credential acquisition, and traditional success as a corruption of what people want, the truth is that the rot in Western values lies much deeper. As might be said by dissident psychiatrist Alone, the correction that must be made is in how the wanting is done. This perspective helps illuminate the true challenges facing Nietzsche’s hypothetical overman, as well as the nihilistic and decadent society “he” is meant to represent. Indeed, it is not enough to simply want to be, do, and have different things – they must be acquired in a certain way to be truly satisfying.


Aside from Aleksandr Solzhenitsyn, one of the most famous critics of the Soviet system was Ayn Rand, who emigrated to America in the early twentieth century and became one of the most influential philosophers of her generation. One of the focuses of her work, which she saw as a continuation of the Romantic tradition, emphasized humankind’s free will and our ability to choose values, to work towards them, and to gain satisfaction from achieving those goals. The position that she took in her opus, which seemed bombastic or exaggerated to many until recently, was that anybody who even remotely espoused leftist and collectivist ideals was a villainous looter who refused to take part in the rational process of valuation and productive labor.

The heart of the issue, to Rand, was a metaphysical and ethical one. The looters in her stories wanted to deny reality and expected others to pick up the tab – much like the collectivist neo-Marxists are expecting society to do now. By denying reality, Rand said that they were attempting to steal from other people and exploit their good faith. The righteous person, in Rand’s philosophical framework, can be described as a trader, an inventor, a laborer, an artist, and an intellectual to whatever degree they are able – vocations that are often idealized or featured in her work. Underlying Rand’s ideas is a belief in a definite and knowable reality, a product of the Enlightenment and scientific project she championed in her work. This stood in stark contrast to the postmodernist position, which holds that reality is socially constructed and nobody can ever really know anything objectively.

In such a world, one’s ability to get one’s needs met is not based on productive labor, which would implicitly require interactions with reality, but rather on one’s ability to manipulate others into meeting those needs. This is the fundamental mindset behind communism, which rejects the outright competition of the free market and replaces it with social competition based on conformity to increasingly complex values. This is also the key driver behind the continual expansion of the queer demographic to include disturbingly specific sub-identities such as the “ampukodo”, someone who believes they are a child amputee. Failure to adhere to social conventions when addressing such demographics results in expulsion from the group, akin to the Soviet gulag treatment or the public confessions favored by the Chinese communists.

Thus, as Rand laid out in Atlas Shrugged in great detail, the honest producers of society are manipulated by shame and Nietzschean slave morality values into complying with people whose only way of providing for themselves is through this manipulation. Her proposed solution, a strike, would force everyone to face reality without the benefit of their privileged delusions. Elements of this can already be seen in Western society, as evidenced by noticeable drops in North American military recruitment, a lack of people willing to work minimum-wage jobs, and the growing popularity of alternative working arrangements, entrepreneurial activity, and homesteading lifestyles.


Although the Hunger Games story has reached hundreds of millions of people globally, many of whom certainly enjoyed it, the buzz over a female action protagonist seems to have eclipsed the fact that the Capitol – a decadent utopia that exists on the backs of the other districts and engages in ritual child sacrifice to satisfy the proverbial peace idol – is much more representative of the modern leftist than of the right-wing fascists they fret over.

Indeed, as has been shown over the past several chapters, the utopian visions characteristic of the West before the World Wars have been subverted by neo-Marxist intellectuals, maniac psychologists, and power-hungry elites who have corrupted human nature, stunted intellectual development, encouraged the development of decadent value structures, abandoned history, and proven themselves willing even to mutilate children and force-medicate adults.

Even as the West takes its final steps to emulate China’s dedication to electronic surveillance and population control, a measure of solace can be found in the fact that these ideologies, isms, idols, and other false narratives are inherently unsustainable. For example, a small cadre of scientists and activists have kept track of all relevant publications on homosexuality and compiled them on hidden corners of the internet – a single look at this mountain of evidence, or even a chance viewing of a video on YouTube, is enough to collapse fifty years of false homosexual narratives in minutes. Even the vaccine narratives relentlessly pushed by all levels of government, as well as their willing partners in the media, are proving to be unsustainable, as the majority of people in North America are declining further boosters – a collapse that took only a few years. The same is true for cheating spouses, whose efforts at deception are destroyed by a phone carelessly left on a countertop.


Aside from his attacks on traditional European values, Nietzsche lamented the rational structures of Western society and longed for a return to Dionysian, or intuitive, ways of living. Although this must be weighed against the obvious need to be connected to reality and the value of rational inquiry, the loss of the West’s intuitive capacities is of significant interest. All too often, people are encouraged to “be rational” and to mistrust their intuition, which, as will be shown, is often the very capacity they need to escape the high modernist dream. Indeed, the artists and visionaries of Western society have long been the ones to criticize it most thoroughly – George Carlin’s social commentary is an example, as is The Matrix, dystopian novels like Brave New World and 1984, and even more contemporary pieces such as the surprisingly thoughtful The Secret Life of Walter Mitty.

If society were to listen to George Carlin, rather than laugh at him, the end result would likely be a strike or an uprising of some kind. Yet, although his messages and comedy are wildly popular in North America, or at least were at one time, nothing has changed. Why is this the case? The problem, again, is with metaphysics, which determine values, which in turn determine behavior. If the behavior has not changed, it is because the values have not changed – and if the values have not changed, it means that the person’s understanding of reality has not been sufficiently altered. Given that a great deal of time and energy has been spent by established interests to corrupt the West’s understanding of human nature, it seems most helpful to begin there.

As Abraham Maslow lamented in his 1968 memorandum, too much time has been spent focusing on the negative aspects of the human experience – the Christian original sin, the psychoanalytic trauma – rather than on the positive aspects and the potentialities. Thus, in order to begin to address the damage that has been done by the Western project, it becomes necessary to reconstruct a new psychology from first principles in the hard sciences, expand that understanding to group and social dynamics, and finally investigate some of the fundamental claims made by evolutionary scientists to determine what is true about the species and the universe.

By developing a more correct understanding of human nature and its implications in this way, the West can begin to heal its metaphysical corruption, develop a value system that is not decadent and is finally in line with reality, and begin to develop the behaviors and systems needed to establish a sustainable utopia. As has been suggested throughout popular culture, perhaps most famously Disney’s The Lion King, once we understand who we are, we will be able to step into our full potential and become everything that we could be. Indeed, everything that the sun touches is ours.



Chapter 7 Draft - "First Principles & Boundary Conditions"

In much the same way that a headstrong teenager might learn a tough lesson in a schoolyard brawl, so too will Western civilization continue to be disappointed by its utopian attempts if it does not first develop an understanding of human nature that is in line with reality. Thankfully, as the result of tremendous advances in physics, neuroscience, evolutionary biology, and even some fields of psychology, it is possible to ground the complexities of human mental activity in the hard sciences, extrapolate those insights to social activity, and thereby develop a robust framework of boundaries and rules with which to rebuild Western civilization.

Beginning with the laws of physics is, from a scientific perspective, considered the strongest approach for theoretical work, as these laws are widely accepted to constrain all other branches of science, such as chemistry, geology, and biology. This makes such an approach ideal for handling fields like psychology and sociology, which blend subjective and objective methods in a warped social context and become far removed from reality as a result.

Although a physics-based approach is the strongest, it is also the most unforgiving – for example, a review of the universe’s laws may reveal that some much-loved aspects of Western civilization, such as various freedoms considered sacrosanct by Enlightenment liberalism, may not be tenable for individuals or allowable in groups. The existence of immutable facts about human nature may also be a sore point for many Westerners, for whom the ideas of being a “blank slate”, a “self-made person”, or “self-actualizing” are rather attractive. Indeed, the notion that there might be existential constraints on human power and freedom, despite our incredible technological advances, could be a bitter pill to swallow depending on what those constraints might be.


In addition to the behavioral constraints that may be yielded by this process, it is also possible that the realities of human nature imply, or even necessitate, the existence of certain values – things that humans “should” do in order to flourish. While such concerns have historically been the domain of philosophy, an evidence-based approach that grounds abstract moral ideas in the realities of human life should identify several important items that contribute to human flourishing.

This is the kind of work that Abraham Maslow, the founder of third-force psychology, was primarily concerned with, and the kind of research work that he implored his colleagues to support. The problems with incorporating moral ideas into psychology and the sciences, however, are many – not least of which is the fact that the scientific process is not designed to answer questions about values, as can be seen by the gain-of-function research in Wuhan which sparked a global health emergency or the systematic destruction of children by the modern medical system.

Indeed, whereas questions of values are rarely relevant in the affairs of chemists or geologists, the intersection of the scientific method, human needs, and human rights has been an extremely problematic area for both researchers and practitioners in psychology. Given the incredible diversity of behaviors, customs, beliefs, thinking patterns, and personalities in homo sapiens, the obvious fact that many different kinds of cultures and societies have achieved success, and tireless activism from certain minorities seeking to legitimize their conditions, the trend has been towards a pluralistic stance of non-judgementalism. Despite the positions taken by the mainstream, however, Maslow’s passionate advocacy for the inseparability of values and human research can be corroborated by a very simple appeal to the existence of the laws of physics themselves – revealing yet another tremendous oversight in modern psychology.

Consider, for example, that the universe exists and is governed by a set of laws, such as gravity and electromagnetism. Further consider that these laws, when applied to biological systems, seem to cause some organisms to thrive and others to perish. This means that in any given situation, a human being is faced with a range of choices that could be ranked by their contribution to the human’s environmental fitness, ability to reproduce, or even ability to survive.

If taken to its logical extreme, this would mean that there are “better” and “worse” courses of action, in terms of acting according to these laws or against them. It would also imply that there are “superior” ways of living that correspond with the laws of the universe, and moreover that every action a human being takes either corresponds to these ways of living, and therefore reality, or is divorced from them. This would be the case regardless of the desires, wishes, or beliefs of the organism – a conclusion that stands in stark contrast to the neo-Marxist claim that reality is socially constructed and therefore malleable.
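The thought experiment above can be expressed as a toy model. The following sketch is purely illustrative – the candidate actions and their “fitness contribution” scores are invented for demonstration, not claims about any real organism:

```python
# Toy model of the thought experiment: if laws of nature assign each action
# some contribution to survival and reproduction, actions can be ranked.
# All actions and scores below are hypothetical, for illustration only.

def rank_actions(actions):
    """Return (action, score) pairs sorted from most to least conducive to fitness."""
    return sorted(actions, key=lambda pair: pair[1], reverse=True)

candidate_actions = [
    ("eat nutritious food", 0.9),
    ("sleep adequately", 0.8),
    ("do nothing", 0.0),
    ("ingest a toxin", -0.7),
]

for action, score in rank_actions(candidate_actions):
    print(f"{score:+.1f}  {action}")
```

The point of the sketch is only structural: once a common scale exists, “better” and “worse” become well-defined orderings rather than matters of preference.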

Although the strong resonance between philosophical works and this simple thought experiment does not provide a “proof” of any specific ethical system, it does hint at the possibility that such an ethical system exists. This is an early indication that Abraham Maslow’s concern with values was not misplaced, and moreover that a careful analysis of the universe’s functioning could lead to the elucidation of values conducive to human flourishing.


Although this approach will be laborious and lengthy, and involves a kaleidoscope of references to different disciplines, events, and theories, it yields rapid and powerful insights, as can be seen – such as the potential existence of “superior” behaviors and societal structures. In much the same way, it can yield an understanding of what human nature is like, in the most general sense, and of how people grow and develop throughout their lives. Of particular interest will be the fundamental mechanisms that underlie mental activity, as they dictate the parameters of downstream phenomena like trauma and creativity.

Indeed, by beginning with the most basic and fundamental principles in physics and introducing elements of greater complexity thereafter, it becomes much easier to identify general mechanisms that govern individual psychology and group behavior. As opposed to experimental approaches popular today, where psychologists search for trends in data obtained from test subjects, moving from the fundamental principles of physics to known complexities of human behavior facilitates the identification of deeper and more obscure forces that would elude the kinds of hyper-focused studies being funded today.

A tangible example of these kinds of “hidden laws” of the universe, at least as they are expressed in human contexts, comes from modern economics and free market theory. As opposed to communist systems, where bureaucrats vainly attempt to defy computational limits by controlling every aspect of the economy, Adam Smith’s concept of the “invisible hand” of the free market is the philosophical expression of a distributed computation system in which everyone makes their own choices. The effects of Smith’s invisible hand are said to be the sum of the effects of individual choices – examples include viral sensations like Justin Bieber or Rebecca Black, made famous by millions of independent viewing choices and peer-to-peer sharing. Although the teenagers taking part in the elevation of these viral sensations may never have heard of Adam Smith, the concepts, forces, and “laws” at work in the music industry are independent of this fact and seem to “exist” in much the same way as gravity.
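The dynamic described above – many independent choices summing into an emergent outcome that no one planned – can be sketched with a minimal rich-get-richer simulation, a standard preferential-attachment toy model. The song names and parameters are invented for illustration, and this is not an account of any specific viral hit:

```python
import random

def simulate_shares(songs, listeners, seed=0):
    """Each listener picks a song with probability proportional to its current
    share count (peer-to-peer exposure) plus one (chance discovery)."""
    rng = random.Random(seed)
    counts = {song: 0 for song in songs}
    for _ in range(listeners):
        weights = [counts[song] + 1 for song in songs]
        choice = rng.choices(songs, weights=weights)[0]
        counts[choice] += 1
    return counts

result = simulate_shares(["song A", "song B", "song C"], listeners=10_000)
# Small early advantages compound: one song typically pulls far ahead,
# even though no central planner chose the winner.
print(result)
```

The “law” at work – disproportionate outcomes emerging from uncoordinated individual choices – holds regardless of whether any participant is aware of it, which is the sense in which such regularities can be said to “exist”.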


In a world where science and technology are unimaginably advanced, and humanity’s control of the natural world is near-complete, it can be difficult to imagine that there may be “laws of the universe” that remain undiscovered – at least by scientists. Yet even physics, the most venerable of the sciences, continues to push the boundaries of knowledge with nuclear fusion and high-energy particle collisions, meaning there is no reason why the rest of the sciences, and the humanities, cannot experience similar developments. Furthermore, it is not inconceivable that the kinds of unorthodox consiliences being sought are themselves innovations, at least in the sense of developing a more finely-tuned sense of reality. Here, the knowledge lies in the relationships between the disciplines at hand, and not necessarily within any one field.



Chapter 8 Draft - "Time & Motion"

Once the existence of the universe and a set of laws that governs its activities have been accepted, attention can be devoted to one of the most enduring human preoccupations – the passage of time, which was likely the inspiration for tools like the prehistoric Ishango Bone and is still considered one of the greatest unsolved mysteries of physics. Indeed, the phenomenon of time has proven to be one of the most fundamental and indescribable aspects of our universe, and the fact that living creatures only have a limited amount of it has been the cause of great concern and study throughout the ages.

Although many animals follow seasonal migration patterns and have been generally observed to have some sense of time, only humans have developed mechanisms for tracking time, only humans have developed explicit schedules for structuring their days, and only humans take great care in marking beginnings and conclusions. From new year celebrations to elaborate burial and grieving rituals, and from calendars to clocks, humans very much live in a world of time – especially since industrialization, the introduction of global time zones around the time of the First World War, and the quartz oscillator clocks installed at the United States Bureau of Standards in 1929.

But what is time? Physicists, and scientists more generally, would describe time as an independent variable by which everything else is measured. Given the unique properties of time, especially its predictable and inexorable progression “forward”, it serves as the perfect foundation by which to measure changes in physical systems on Earth and in the skies. Indeed, there are cause-and-effect relationships between past and present states of systems – for example, a teacup shatters on a concrete floor because it was dropped by accident, and it was the interaction between the floor and the teacup that caused the shattering, preceded by the gravitational acceleration between the teacup and Earth.
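The teacup example can be made quantitative with elementary kinematics, treating time as the independent variable: under constant gravitational acceleration and neglecting air resistance, the fall time from rest is t = √(2h/g) and the impact speed is v = g·t. The countertop height below is an illustrative assumption:

```python
import math

G = 9.81  # gravitational acceleration near Earth's surface, in m/s^2

def fall_time(height_m):
    """Time for an object dropped from rest to fall height_m metres: t = sqrt(2h/g)."""
    return math.sqrt(2 * height_m / G)

def impact_speed(height_m):
    """Speed at the moment of impact: v = g * t."""
    return G * fall_time(height_m)

h = 0.9  # assumed countertop height in metres, for illustration
print(f"fall time:    {fall_time(h):.2f} s")
print(f"impact speed: {impact_speed(h):.2f} m/s")
```

The cause-and-effect chain – release, acceleration, impact, shattering – unfolds along the time axis, which is exactly what makes time the natural parameter for describing change in physical systems.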

An example of these changes happening in a living context would be the growing seasons in agriculture, where light energy and chemicals are converted through photosynthesis into plant matter – a slow but beautiful process. These mechanisms, although similar in many ways to non-living processes like erosion and gravity, represent a very different kind of cause-and-effect relationship, one that cascades into new life over time. This general relationship, known to people as “evolution”, is rather special in that Earth is the only planet in the known universe where it has been observed – a source of great wonder and speculation since our species turned its awareness to the stars.


Before progressing to matters of the mind, however, a fundamental definition must be developed for this phenomenon known broadly as “life”. Intuitively and rationally, it can be surmised that there is something different between a human being and a table, or a human being and a rock, and that human beings share some similarities with dogs, trees, and shrimp. But what is the common element that differentiates “living” beings from “non-living” or “inanimate” objects?

Put generally, life can be defined as a process that moves something towards complexity, sophistication, and utility over time. The processes we describe as evolution, at a fundamental level, take existing life forms, combine them with environmental inputs, and create new life forms that are better suited to respond to those environmental inputs. One way that scientists describe living beings is by calling them self-organizing, which means that within the living being’s own system, matter and energy are organized in particular ways to further the survival and replication of the organism. Additionally, although individual organisms may stop engaging in these processes – what we call “dying” – they can propagate their genetic code to future generations through reproductive mechanisms. This, notably, is something that rocks, stars, and other objects classified as non-living cannot do.

This seemingly trivial detail is not only one of the greatest mysteries of life, given that the rest of the universe seems to tend towards disorder, but also the basis of a more coherent view of psychological phenomena. Indeed, it can be said that, in a very general and abstract way, the process we call “life” is involved in selectively propagating certain kinds of matter forward through time. This includes not only genetic material, the traditional concern of evolutionary biologists, but also things like language and culture, which are typically viewed as the concern of the social sciences. However, by reconciling these different types of ordered matter under the singular concept of information – ordered matter in the most general sense – it is possible not only to develop a coherent view of life, but to move towards an understanding of the function and form of human consciousness.


From the perspective of traditional biology, each living creature is said to have a genotype, or a specific set of attributes determined by the genetic information it inherits. Following Darwin’s initial publications, the work of Gregor Mendel and other early geneticists revealed that variation and selection mechanisms provided the “rules” by which this information is passed forward through generations and time, which was the primary focus of biologists for many decades afterwards. However, as researchers in other spheres of knowledge began to integrate these new concepts into their own paradigms, the concern of evolutionary biology progressively expanded – in addition to the genotype, or the genetic attributes of an organism, it became understood that organisms also had phenotypes, which encompassed environmental impacts, learned behaviors, and other elements of life relevant to survival and replication but largely independent of genetics.

Without doubt, the most famous addition to the West’s understanding of the role of non-genetic information in evolution is Richard Dawkins’ concept of the meme, a unit of information that is replicated and propagated much like a gene. Whereas genetic information is stored in cells, however, memes are composed of mental information stored in the brain, and are selected for reproduction based on their contribution to the survival of the organism. A tangible example of a meme would be the saying “an apple a day keeps the doctor away”, which is not only a simple rhyme that replicates easily, but an extremely sophisticated piece of dietary advice that supports the survival and genetic replication of the organism itself. More complicated memes include belief systems like Christianity, centered around the memes of sin, intercession, and neighbor-love, or the Western notions of “free speech” or “free trade”, all of which have effects on social systems and therefore individual evolutionary outcomes.

Although Dawkins’ concept of the meme has proven extremely useful for considering human evolutionary contexts, an excellent 2005 work by Eva Jablonka and Marion Lamb revisited this concept and more clearly defined the kinds of information at work in evolutionary processes. They proposed that evolution is driven by genetic, epigenetic, behavioral, and symbolic inheritance processes, which can be simplified to genetic and mental information for purposes of the present inquiry. Indeed, much like a computer’s hard drive can be said to store both data and software “made” of data, the human brain is the storehouse for things like behaviors, language, and culture, all of which support – or hinder – survival and replication. And, much like physical organisms seem to compete over scarce resources, with the strongest prevailing, the best ideas seem to be propagated to the next generation through childrearing and education.


In much the same way as an organism requires food to support its physical processes like metabolism, living creatures – especially human beings – require information from their environment as a necessity of living. The uncertain conditions faced by most people, animals, and plants mean that even the most basic of organisms requires a way to detect and avoid unfavorable outcomes while pursuing positive ones. Whether through something as simple as the flagellum of a prokaryote or as sophisticated as the eyes of an eagle, living things are in a constant state of exchange with their environment – not only in terms of nutrients, but also in terms of valuable information.

Whereas most animals, plants, and other organisms seem content with basic perception and communication, human beings have an extraordinary appetite for all kinds of information, from sitcom trivia to the secrets of the universe. Until the invention of writing, thought to have emerged thousands of years ago in the ancient Near East, the sum total of human knowledge resided almost entirely in the heads of the tribespeople then alive. Based on what is known about hunter-gatherer tribes that retain much of their traditional lifestyle, the information contained within the collective was largely for survival and cultural purposes, with many memes like stories, songs, or dramatizations serving both needs.

Once writing was invented, first on clay tablets, the information that humans could store and retrieve became much greater in scope and complexity. Things such as records of transactions or taxes could now be “remembered”, facilitating new kinds of economies and organization. The scroll and codex, and later the book, made it possible to faithfully store entire religious narratives like the Jewish Torah or the Buddhist Pali Canon, propagating them unerringly through time. Today, advanced storage technologies such as computer hard drives, as well as the proliferation of sensors and content creators, have made information as much of a commodity as wheat or barley – and, in some cases, one equally or even more valuable.


As can be seen, the phenotype of the human being is rather unique. It includes not only genetic information propagated through reproduction and selection, but also memes that exist in the minds of human beings and are propagated in much the same way, through communication and selection. In a 2021 book that studies, among other things, the environmental impact of information, astronomer Caleb Scharf notes that storing and sharing our information, especially in the age of vast server warehouses and cryptocurrency mining, has not only become more integral to our functioning as a society, but also occupies a larger share of human effort and energy expenditure than ever before.

Particularly as algorithms and artificial intelligences begin to perform more data processing work – and even creative work – on behalf of humans, the phenotype of the species is progressing towards profound complexity rather quickly. Instead of being occupied with survival-oriented concerns, which are “simple” enough to be mastered by even an illiterate caveperson, modern humans now spend most of their days swimming in a veritable sea of information, largely on social media and other websites. This information is processed, regurgitated, remixed, riffed on, and sent back into the sea, where a combination of human signalling activity such as “likes”, as well as algorithms trained to detect patterns in those activities, determines the reach of the idea. To even engage with much of the information available online requires at least a decade of education, if not more, a far cry from the cuneiform cattle counts of Mesopotamia.

It could be very easily argued that much of this activity is useless or wasteful, especially as the electricity requirements of digital communications technology continue to rise. However, the human tendency towards inexhaustible curiosity – and therefore the need for information – is not only one of the defining features of the species, but perhaps the fundamental process of human consciousness and the reason for the dominance of homo sapiens in almost every environment on Earth.



Chapter 9 Draft - "An Ever-Finer Quality"

With the biological mechanisms of selection and reproduction grounded in fundamental concepts in physics, it is now possible to further examine the role of information in the processes of life. Consider, as was previously mentioned, that life processes tend towards complexity and specificity, whereas the rest of the universe tends towards disorder. Indeed, proponents of evolutionary theories are known for their expansive view of life on Earth, which seems to have begun with simpler life forms and moved towards more sophisticated organisms over millions of years.

Given that neuroscience and psychology are themselves subsets of the more expansive domain of biology, it would stand to reason that not only would we expect to see a trend towards genetic complexity, but also a trend towards psychological sophistication. To some degree, this would seem to be the case given the progression of world history from small tribal groups to international alliances of hundreds of millions of people. However, before moving immediately to issues of meme propagation, we must first consider how the laws of physics might shape the fundamental mechanisms that guide human thought.


Karl J. Friston is a member of the Royal Society, a recipient of the Golden Brain Award, and hailed by some as the “genius neuroscientist” whose theories may unlock true artificial intelligence. Among his many achievements is the provocative, insightful, and surprisingly simple Free Energy Principle, which Friston has proposed as a kind of unified brain theory. Despite the exciting potential of his ideas, they are notorious for their complexity, difficult mathematical style, and almost-tautological nature, and are infamous for being fully understood only by Friston himself and perhaps a select few others around the world:

At Columbia’s psychiatry department, I recently led a journal club for 15 PET and fMRI researchers, PhDs and MDs all, with well over $10 million in NIH grants between us, and we tried to understand Friston’s 2010 Nature Reviews Neuroscience paper – for an hour and a half. There was a lot of mathematical knowledge in the room: three statisticians, two physicists, a physical chemist, a nuclear physicist, and a large group of neuroimagers – but apparently we didn’t have what it took. I met with a Princeton physicist, a Stanford neurophysiologist, a Cold Spring Harbor neurobiologist to discuss the paper. Again blanks, one and all.

Although there are many complexities to Friston’s theories, one of the observations made in a key 2010 paper is that biological systems, including neurological systems like the human brain, maintain and even increase their order over time – behavior that appears to violate the fluctuation theorem, the physics result that generalizes the Second Law of Thermodynamics and its prediction that entropy, or disorder, tends to increase. Precisely how the brain does this, however, is a matter involving several details that are themselves still hotly debated by philosophers, physicists, and neuroscientists.

In much the same way that René Descartes’ cogito ergo sum postulated that the only thing we can be totally sure of is that we think, and therefore exist, Friston’s Free Energy Principle begins with the problem that the brain is a system separate from the world – at least to some extent. Put more simply, there is information that exists “out there” in the world, as well as information contained within the brain’s memory, and the two remain separate from each other. The senses, which provide information about the external world to the brain by way of electrical signals, are the intermediary by which the brain updates its understanding of what is happening “out there”.


Physicists and mathematicians represent this kind of dynamic – a brain and environment, with senses as an intermediary – using a concept called the Markov Blanket. This idea, although considered tautological by some and a “trick” by others, is essentially a mathematical formulation of the underlying physical realities governing the brain and can be reasonably taken as a given. The implications of the Markov Blanket, however, suggest that the brain must be involved in some kind of iterative or cyclical process to match its impression of the external world with the information it gets from its senses:

See image here
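To make the Markov Blanket slightly more concrete, its central conditional-independence property can be checked numerically. The toy model below is a deliberate simplification (a full blanket also includes active states, and every probability table here is invented purely for illustration): an external state influences an internal state only through a sensory “blanket”, so once the blanket’s state is known, the external world adds no further information.

```python
import itertools

# Toy chain: external state X -> sensory (blanket) state B -> internal state I.
# All variables are binary; all probability tables are invented for illustration.
p_x = {0: 0.6, 1: 0.4}
p_b_given_x = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}   # p_b_given_x[x][b]
p_i_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.25, 1: 0.75}}  # p_i_given_b[b][i]

def joint(x, b, i):
    """Joint probability p(x, b, i) under the chain structure."""
    return p_x[x] * p_b_given_x[x][b] * p_i_given_b[b][i]

def p_i_given_bx(i, b, x):
    """Probability of the internal state given BOTH the blanket and the world."""
    total = sum(joint(x, b, j) for j in (0, 1))
    return joint(x, b, i) / total

# Conditioning on the blanket and the external state together gives exactly
# the same answer as conditioning on the blanket alone:
for x, b, i in itertools.product((0, 1), repeat=3):
    assert abs(p_i_given_bx(i, b, x) - p_i_given_b[b][i]) < 1e-12
print("internal state is independent of the external state, given the blanket")
```

This conditional independence is what licenses the brain to work only with its sensory states and its own model, rather than with the external world directly.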

Although this arrangement is nice in theory, it becomes decidedly more complex when applied to real-world systems like brains and computers, which have limited memory and limited time to compute. Indeed, with an estimated processing power of trillions of calculations per second, the human brain is tremendously powerful yet still faces constraints on its capacities. Additionally, Friston’s Free Energy Principle would imply that the brain is making vastly complex comparisons between its memory and the world every second, which is true on some level but faces similar mathematical constraints in real-world scenarios.

The brilliance of the Free Energy Principle, and the reason it has proven so exciting for neuroscience and psychology, is that several pre-existing constructs in mathematics and physics simplify these computations to such a degree that they could plausibly be handled by a human brain. Put simply, instead of having to make calculations across all possible environmental states, including extremely unlikely ones like unexpected meteor strikes or ghost apparitions, the human brain instead makes a series of guesses or hypotheses, and then calculates the amount of surprise generated by its sensory inputs.

Once a level of surprise is ascertained, the mental model can be progressively revised to fit the incoming sensory data, or action can be taken to change the external world to match the model. This can be represented visually as two converging probability distributions, corresponding to the expected and actual states as expressed by the brain’s electrical signals:

See image here
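The revision process just described can also be sketched numerically. The following is a minimal illustration, not Friston’s actual formalism: a hypothetical agent holds a Gaussian belief about some quantity “out there”, measures its surprise at each sensory report, and then revises the belief with a precision-weighted Bayesian update. The prior width, sensory noise, and “true value” are all invented for illustration.

```python
import math

def gaussian_surprisal(x, mu, sigma):
    """Negative log-probability (surprise, in nats) of x under N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

# A vague initial belief: almost anything seems possible.
mu, sigma = 0.0, 10.0
sigma_s = 1.0          # assumed sensory noise
true_value = 7.0       # the state of the world, repeatedly observed

surprisals = []
for step in range(5):
    surprisals.append(gaussian_surprisal(true_value, mu, sigma))
    # Precision-weighted Bayesian update: pull the belief toward the data,
    # weighting prior and senses by their respective reliabilities (precisions).
    precision = 1 / sigma**2 + 1 / sigma_s**2
    mu = (mu / sigma**2 + true_value / sigma_s**2) / precision
    sigma = (1 / precision) ** 0.5

print([round(s, 2) for s in surprisals])  # surprise shrinks with each revision
```

Each pass through the loop is one turn of the perceptual cycle: compare the model against the senses, register the surprise, and tighten the model accordingly.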


With its emphasis on an iterative progression towards a sophisticated and expansive perspective on the world, the Free Energy Principle implies that the human brain begins at a low level of sophistication and grows in response to the sensory information it acquires. It would also imply that the more data someone has access to, the more sophisticated and expansive their hypotheses about the world should be – and the less often they will be surprised. So far, this seems to map to our reality, where people with more experience with something, like expert pilots, are less likely to experience surprise or confusion when performing related tasks. But what of the physics involved in these processes, and the information that exists within the system? How might the brain change over time as a result of sensory data and an iterative truth-finding process?

The key to understanding these matters lies in Friston’s mathematical formulation of the Free Energy Principle, which represents the comparison the brain makes between its generative model and the sensory information it receives:

High School: Surprise equals the difference between generative model and sensory input.

Undergraduate: F = - [ln(model)] + [ln(sensory)]

Hard Mode: F = -<ln p(s,ϑ|m)>q + <ln q(ϑ|µ)>q

Although the Free Energy Principle only involves a simple subtraction operation, what the symbols in the equation actually represent is a matter of great complexity. For example, the presence of logarithms is due to Friston’s work being founded on the physics of information itself – mathematically, unlikely events contain more information, while events that are entirely unsurprising provide observers with little to no data. In addition, the concept of information in physics is very similar to computer binary, with each bit of data answering a yes-or-no question about the world and thus constraining possibilities. How these dynamics actually play out within the brain has been a rather complicated matter for many researchers, but it can be largely explained through a simple thought experiment that follows a hypothetical human throughout their lifespan.
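The claim that unlikely events carry more information can be stated exactly: in information theory, an event with probability p carries −log₂(p) bits, each bit corresponding to one resolved yes-or-no question. A quick sketch:

```python
import math

def surprisal_bits(p):
    """Information content (surprise) of an event with probability p, in bits."""
    return -math.log2(p)

# A coin flip resolves one yes/no question; rarer events resolve more,
# while near-certain events tell the observer almost nothing.
print(surprisal_bits(0.5))     # 1.0 bit
print(surprisal_bits(1 / 64))  # 6.0 bits
print(surprisal_bits(0.999))   # ~0.0014 bits: the fully expected is uninformative
```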

Consider, for example, a newborn baby that is only beginning to acquire information from their environment. At this point in their lifespan, the information contained within the baby’s body and brain is almost entirely of a genetic nature, and the contents of their mind consist solely of the core “programming” that is a consequence of their particular iteration of the human genotype. Mathematically, this corresponds to a nearly flat generative model, with probability spread thinly across countless possibilities – from the baby’s perspective, almost everything is entirely possible, and most sensory inputs yield a great deal of surprise.

Indeed, this would seem to be the case for observable behavior, as it is known that babies are easily disturbed, cry in anguish a great deal of the time, and can be soothed by predictably rhythmic sensory inputs like rocking or gentle singing. It is also well-known that infant minds are in a state of almost constant learning, and that more neural connections are formed in the early stages of life than at any other time. From the perspective of Friston’s mathematics, the infant’s brain is acquiring information and developing reasonable expectations of the environment based on that information, which corresponds to more information being stored in the brain and used to constrain the generative model. So far, this makes sense.
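The notion of a model being “constrained” by experience can be illustrated with Shannon entropy, the average surprise a model expects per observation. A flat, infant-like model expects maximal surprise; a model sharpened by learning expects far less. The distributions below are invented for illustration:

```python
import math

def entropy_bits(dist):
    """Expected surprisal (Shannon entropy) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A "blank slate" model: four outcomes, all equally plausible.
flat_prior = [0.25, 0.25, 0.25, 0.25]

# A model sharpened by experience: one outcome is now strongly expected.
learned_model = [0.85, 0.05, 0.05, 0.05]

print(entropy_bits(flat_prior))     # 2.0 bits of expected surprise
print(entropy_bits(learned_model))  # ~0.85 bits: far less surprise on average
```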

Moving from infancy to early childhood, the Free Energy Principle would suggest that the brain in this stage has the beginnings of a generative model, but is still reasonably “unconstrained” as it lacks a great deal of information about reality. Indeed, the phenomenon of fantasies and imaginary friends in childhood further corroborates Friston’s model, as it is clear from any sustained interaction with a young child that they live in a world of near-unlimited possibilities.

This was the general trajectory identified by child psychologist Jean Piaget, famous for his observations of the development of logical and “rational” capacities in children as they approached adolescence. The findings of Piaget and those who followed him include children’s inability to understand logical puzzles until certain stages of development, a tendency away from fantasy and whimsy towards coherence in self and cognition, and a steady progression towards full rational capacity by puberty for most. Piaget’s work can be summarized in four stages – sensorimotor, preoperational, concrete operational, and formal operational – which, as would be expected from the work so far, represent increasing levels of sophistication and a metaphorical “awakening” or “activation” of the mind’s capacities.

Adulthood and old age are typically associated with the acquisition of knowledge, as well as an elusive capacity known generally as wisdom. However, based on the tendency towards complexity dictated by evolutionary dynamics and Friston’s work on generative models and free energy, it should be expected that people in old age would have fairly expansive and well-developed models of what they think should and should not be true. Also, from a computational perspective, to “rewrite the code” would take an extraordinary amount of time and energy, making it perhaps unattractive. Indeed, this would seem to be the case – fluid intelligence, or the ability to solve novel problems, begins declining noticeably after early middle age. Conversely, crystallized intelligence, or the ability to draw upon what one already knows, is maintained into elderhood for most.

Researchers have also found that elderly people in the West tend to be perceived as stubborn to the point of riskiness by their adult children, another subtle indication that Friston’s concept of a generative model is accurate. And although such stories are, of course, anecdotal, business and art history is replete with executives who turned down incredible deals and became famous for their commitment to preconceived notions. Examples of so-called “famous rejections” include The Beatles and J.K. Rowling, and the rejection letters involved are sometimes clearly based on a well-developed and rigid generative model of reality:

Not to mince words, Mr. Epstein, we don’t like your boys’ sound. Groups of four guitarists are on the way out.

On the other hand, the capacity for wisdom, an elusive and ill-defined quality even today, is a feature of elderhood that supports both society and survival. Researchers studying family dynamics have found that grandparents involved in childcare, something almost unheard of in the animal kingdom, play an important role in the acculturation and development of grandchildren. Indigenous elders such as chiefs, medicine people, shamans, and other leaders are known by researchers to be stores of incredible knowledge and wisdom, carried in their brains and shared with the community as needed. Thus, an examination of different aspects of the lifespan, from Piaget’s model of childhood development to the tribulations of elderhood, reveals a trajectory that supports the Free Energy Principle.


Another reason that Karl Friston’s work is so exciting is related to the structure of the Free Energy Principle’s primary equation. Consider the similarities between Friston’s formulation and a well-known equation in thermodynamics related to entropy and heat energy:

Surprise = “Free Energy” = - [ln(model)] + [ln(sensory)]

Free Thermodynamic Energy (Gibbs) = H − T*S, or enthalpy minus temperature times entropy
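For readers who wish to see the “Hard Mode” formulation in action, the toy computation below evaluates F = −&lt;ln p(s,ϑ|m)&gt;q + &lt;ln q(ϑ|µ)&gt;q directly for an invented two-state generative model (all probabilities are made up for illustration). It exhibits the key mathematical fact Friston exploits: free energy, like its thermodynamic cousin, decomposes into energy minus entropy, always bounds surprise from above, and becomes exact when the approximate beliefs q match the true posterior.

```python
import math

# Toy world: hidden cause theta in {0, 1}, one observed sensation s.
# Generative model p(s, theta) = p(s | theta) * p(theta); numbers are invented.
p_theta = [0.5, 0.5]
p_s_given_theta = [0.2, 0.9]   # likelihood of the observed s under each cause
p_joint = [p_s_given_theta[t] * p_theta[t] for t in (0, 1)]
p_s = sum(p_joint)             # model evidence p(s | m)

def free_energy(q):
    """F = <-ln p(s, theta)>_q + <ln q(theta)>_q : energy minus entropy."""
    energy = -sum(q[t] * math.log(p_joint[t]) for t in (0, 1))
    neg_entropy = sum(q[t] * math.log(q[t]) for t in (0, 1) if q[t] > 0)
    return energy + neg_entropy

surprise = -math.log(p_s)               # the quantity the brain wants to bound
posterior = [p / p_s for p in p_joint]  # exact Bayesian posterior over causes

print(free_energy([0.5, 0.5]))  # ~0.8574 nats: above the surprise
print(free_energy(posterior))   # ~0.5978 nats: the bound is tight
print(surprise)                 # ~0.5978 nats
```

Minimizing free energy over q is therefore a tractable stand-in for the intractable task of computing surprise directly, which is the mathematical heart of the Principle.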

Indeed, the relationship between physics and biology, already discussed extensively by past luminaries, may be extended to neuroscience and consciousness by way of Friston’s work. This tremendous achievement, if it holds up under the ongoing scrutiny, would solidify an area of extremely profound consilience that has been already demonstrated to involve physics, evolutionary dynamics, information, and several human phenomena observable in development and behavior.

Another area of consilience is the fact that Friston’s work strongly suggests that consciousness does, in fact, become more complex and sophisticated over time. This mirrors the genetic evolutionary process from the perspective of physics, and would imply in a metaphorical sense that each individual human being is taking part in a personal evolutionary process just by existing, sensing, and thinking. As will be seen, these insights constitute the seed that will germinate into a comprehensive and insightful perspective on the social sciences, and it is Friston’s unique genius that is largely to thank for them.


Given the totally unique genetic and socio-cultural-economic-geographic conditions inherited by each child upon birth, as well as the unique life path they follow as a result of their choices and circumstances, this progression towards sophistication implies a unique phenotype for each human being. As will be discussed, this phenotype includes not only the accumulated skills, knowledge, and abilities of each individual, but also the information they propagate into their social environment. The deep interdependencies that drive human growth at the social level, however, must be reserved for the next section – as well as their profound implications for the individualistic West.


Although the debates over the viability of Friston’s work continue within the academic circles that can understand them, if his theories are accurate, one should expect to find a primal human drive to reduce surprise documented in the neuropsychological literature. One should also expect an instinctual response to surprises encountered in the environment, as well as instinctual responses to deny or push away information that would require a computationally expensive “rewrite” of the generative model. Ideally, the neuroscientific evidence should be quite clear and well-documented in the literature.

Thankfully, some key discoveries made by a group of pioneering Soviet behaviorists offer another layer of consilience with exciting and powerful implications, suggesting not only that Friston’s core concepts map closely to reality, but that the kaleidoscopic works of Soviet-influenced scholars – such as Jordan B. Peterson’s Maps of Meaning – can also be incorporated into the growing body of consilient literature being developed.



Chapter 10 Draft - "Maps of Meaning"

Aside from his many insights into the fantastical and wondrous worlds of children, Jean Piaget contributed a very important sense of the hard limits and boundaries encountered by humans in their youth, some of which can be directly traced back to biology. A self-styled genetic epistemologist, Piaget looked for the origins of human knowledge in human biology, and his work with children can be seen as an investigation into how our most fundamental knowledge structures are acquired. Piaget’s early investigations continue to be valuable for parents, educators, and researchers today, along with the work of other less-appreciated child psychologists such as Kazimierz Dąbrowski and Maria Montessori.

Among the plentiful ranks of underappreciated psychologists are the Soviet behaviorists, skilled surgeons and neuroscientists in the same style as Ivan Pavlov, whose work under the collectivist regimes was subject to severe limitations on publishing and dissemination, as well as a strong bias towards atheist materialism. Their concern with observable behavior and tangible observations of brain or body activity that could be connected to that behavior, much like Pavlov’s famous dog salivation experiment, betrays to some degree the Soviet priorities on research funding, but in retrospect yielded some key discoveries that support Friston’s work while predating it by several decades.

Perhaps as a result of their unique set of challenges, the works of E.N. Sokolov, O. Vinogradova, and A.R. Luria remained relatively unknown to the Western public until the rise of Jordan B. Peterson and his keystone work Maps of Meaning brought their research on the human response to novelty to the attention of many thousands of readers. One of the foundational insights of Peterson’s whole academic career, in fact, is derived from the Soviet literature on what they called the orienting reflex: the near-instantaneous reflex of living creatures, both human and animal, to pay attention to things in the environment that don’t belong.


Noticing something brown on the sidewalk a few feet ahead, hearing a wrong note in a live musical performance, or hearing an odd noise in the forest at night are the kinds of things that evoke an immediate refocusing of attention, an emotional response, and even survival mechanisms like the fight-or-flight response. The orienting reflex is, essentially, the neurological circuitry involved in comparing incoming sensory information to the brain’s ongoing generative model, and it involves a very old section of the brain, the thalamus, which is partially responsible for sensory processing and the direction of goal-oriented activity.

One of the most striking aspects of the orienting reflex is that it happens faster than conscious thought, usually within two hundred to five hundred milliseconds, making it a potentially life-saving mechanism in the event of a falling toddler, a car veering out of control, or a physical confrontation. Another notable feature of this neurological mechanism is that it is involuntary, meaning that the orienting reflex is about as deep as the fight-or-flight response, if not deeper. When this is placed within the context of Friston’s theories, the Soviets’ work indicates that the brain is indeed wired to attend to things that are surprising – as a matter of basic survival.

Another point of significant consilience between the Soviets’ work and the Free Energy Principle is the Soviet hypothesis that the brain creates internal representations, mental models, or maps of what is happening in the environment – mirroring almost precisely Friston’s concept of the generative model while preceding it by decades. The Soviets, however, believed that these representations were strictly of facts, events, or other materialistic things, an assumption taken as a given by many psychologists and irrelevant to Friston’s scope of work within the Free Energy Principle. The key insight that drives much of Peterson’s Maps of Meaning is that humans are primarily concerned with the significance of those facts to their goals, experienced as emotional significance or value.

From the consiliences that have been drawn so far about life and its conceptualization within the realm of physics, Peterson’s hypothesis is completely consilient with the fact that living creatures pursue desirable states and avoid undesirable states as a consequence of having to exist. Regardless of what some corrupt social scientists might say, each human being is born into a goal-oriented perspective against which all information is judged – even if that goal is only obtaining the next breastfeeding, as expected by the initial generative model dictated by the mammalian genotype. Indeed, the grand work of consilience constituted by Maps of Meaning, released only one year after E.O. Wilson’s Consilience called for precisely such unifications of knowledge, indicated that much, if not all, human effort is motivated by the pursuit of increasingly complex goals. When Maps of Meaning is placed within the context of existing literature on evolutionary processes, consciousness, and developmental psychology, it becomes abundantly clear that this goal-driven activity is predicated on the acquisition of information, as evidenced by the reflexive human response to anomaly and the deep relationships between surprise and consciousness.

It would seem thus far that the human drive to know and understand is deeply ingrained across individual and social contexts. However, the nuances of how humans come to know things, aside from sensory input being compared to pre-existing mental models, are also important – beyond the corruption of Western civilization’s metaphysics, its epistemology, or the methods by which that metaphysics has been derived, is also compromised. Thus, it becomes necessary to ground human ways of knowing – perhaps the key focus of Maps of Meaning – within the principles of physics that govern life and human mental activity.


Aside from his positions on various political topics, Peterson is perhaps most famous for his emphasis on the dynamic between chaos and order, which he has demonstrated to be recurring themes in human culture linked to the psychological experiences of individual humans. In the context of the Free Energy Principle, the orienting reflex, and the role of information in human life, chaos and order correspond to the domain of the unknown, or things that are surprising, and the known – things that correspond with a given mental model.

Of particular interest to Peterson are the implications of the brain’s hemispheric structure on the interpretation of anomaly and surprise, given that information is processed in tandem by these structures in most cases. As he relates in Maps of Meaning, the right hemisphere is specialized for pattern recognition, holistic thinking, making general conclusions and navigating “unknown” situations, whereas the left hemisphere is better-suited for symbolic processing, linear thinking, details, and familiar situations. Peterson also notes that the brain seems to process information in a right-to-left manner, where global hypotheses and general conclusions are made by the right hemisphere, then details are worked out by the left.

Although this description of brain activity is rather simplified, as cognition involves the simultaneous activation of many areas of the brain, points of striking consilience can be found throughout the neurological literature, including an underground classic by Julian Jaynes on the possible origins of modern consciousness. Working, like Peterson, at the intersection of ancient mythology and modern neuroscience, Jaynes observed that ancient Greek epics like the Iliad contained no references to psychological phenomena, no side-stage soliloquies, and, most notably, no introspection on the part of the characters, with actions being commanded by the gods and carried out immediately. Drawing on the archaeological and historical record, Jaynes further observed that hearing voices from gods or spirits, often “through” or “from” statues or idols, was relatively commonplace in the ancient world, and that the Iliad appeared to be a sophisticated dramatization of this psychological reality.

The existence of such phenomena, at least in the ancient world, is explained by Jaynes as a result of the brain’s hemispheric structure, separated into global hypothesis formation on the right and symbolic processing on the left. Joining these two hemispheres is the corpus callosum, a mass of roughly two hundred million nerve fibres that serves as the primary connection point and communications channel, and which happens to connect with left-hemispheric areas responsible for speech. It is these connections, Jaynes suggests, that were responsible for ancient people’s experience of hearing voices, as this was likely the brain’s best way of processing right-hemispheric activity in ancient times.

Regardless of whether Jaynes is correct about the details of his theories, however, the general points of consilience remain – that the brain’s hemispheric structure plays a significant role in how humans interpret and respond to anomaly. In particular, the role of the right hemisphere in developing global hypotheses and the left’s role in putting words to those hypotheses is extremely salient, as it is this progression of understanding – nonverbal and intuitive to verbal and formalized – that constitutes the essence of human epistemology, human expertise, and even the faculty of intuition.


As the industrialized world became more complex, it required people with specialized training and experience – experts – to manage its systems. The field of expertise studies, born out of an interest in replicating and training experts, exists at the boundary of neuroscience and is partially responsible for influencing pedagogy in domains like chess, sports, and medicine. Decidedly one of the more realistic social sciences as a result of its commitment to working with excellence, expertise studies has replicated the orienting reflex in its own context by way of studying athletic reflexes, identified the use of mental models in expert practices across many domains, and has also studied, in its own way, the implications of the brain’s hemispheric structure on human ways of knowing.

Indeed, of great interest to expertise researchers is the phenomenon known as situation awareness or expert judgement, which is generally the ability of experts like chess grandmasters or pilots to draw accurate conclusions about situations within their realm of expertise – usually within seconds or hundreds of milliseconds. These almost-superhuman abilities are well documented throughout the literature, and include radiologists making accurate diagnoses within the timeframe of the orienting reflex, tennis players responding to subtle cues in their opponent’s body language, and chess players memorizing game states after a single glance at the board.

Within the context of the consiliences that have been drawn thus far, one would expect that these incredible displays of skill are predicated on a sophisticated mental model – this is corroborated extensively by expertise researchers. One would also expect that the exceptional situation awareness displayed by experts, which is demonstrably faster than conscious thought in most cases, is a result of primarily right hemispheric activity – something that has also been documented by neuroscientists. Finally, one would expect that experts would have trouble explaining, in left hemispheric and symbolic fashion, how they came to conclusions that were primarily the result of global hypothesis formation in other parts of the brain – this is also the case for most experts, as is well-documented in the literature.

Another word for these kinds of phenomena is intuition, generally defined as the ability to understand something immediately and without conscious reasoning. Although it is usually dismissed by modern Western systems as an inferior form of knowledge, the consiliences drawn thus far, largely from physics and neuroscience, indicate that intuition is, in fact, the most fundamental method of knowing that humans have access to. Furthermore, the prevalence of various kinds of expertise in Western society, from medicine to baking and trend-spotting, indicates that intuition can be meaningfully developed and used for personal, economic, or systemic benefit.

From the perspective of the human brain and the generative model it employs, there is no distinction between what constitutes “an expert” or “a novice” aside from the sophistication of that model in a given domain of activity and the amount of surprise generated when performing related tasks. Indeed, the literature on situation awareness indicates that an individual’s level of awareness is a function of the ability of their generative model to perceive, interpret, and anticipate events in the environment. In terms of the information that is being acquired and processed by the brain, what expertise researchers have found is that low levels of surprise correspond to faster and more effective action, which makes sense given the computational limits of the human brain.
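The relationship this passage leans on – lower surprise means less information to process, and hence faster and more effective action – can be made concrete with the information-theoretic definition of surprise (surprisal). The sketch below is illustrative only; the probabilities are hypothetical stand-ins for how strongly an expert’s versus a novice’s generative model anticipates the same situation:

```python
import math

def surprisal(p: float) -> float:
    """Information-theoretic surprise of an event, in bits: -log2(p)."""
    return -math.log2(p)

# Hypothetical probabilities each generative model assigns to the same
# chess position: the grandmaster's model finds it highly familiar,
# the novice's model finds it nearly unpredictable.
expert_p, novice_p = 0.5, 0.01

print(surprisal(expert_p))   # 1.0 bit - little information left to absorb
print(surprisal(novice_p))   # ~6.64 bits - far more processing required
```

The asymmetry is the point: the expert’s well-fitted model leaves little residual information in the scene, which is consistent with the sub-second responses documented in the situation awareness literature.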

Taken together, these consiliences suggest that the specific mechanisms of expertise that have been identified in the real world correspond almost precisely to the neurological mechanisms that drive learning, intuition, and problem-solving. Furthermore, it would seem that the acquisition of expertise, expert intuition, or even general intuition can be simplified to the development of a sophisticated mental model that can quickly interpret the sensory information provided in domain-related situations.


Closely related to the phenomenon of intuition, however, is the experience some people have of a literal “inner voice” in the style of Jaynes’ Iliad characters. Indeed, modern studies indicate that many people have rich inner monologues or even dialogues, with some of those conversations experienced as auditory events in the mind. History is replete with famous figures who described having such a companion, most notably Socrates’ daimonion, which by his account and the philosophical literature may have been one of the driving forces behind Western civilization.

The best hypothesis currently available to explain these phenomena is Jaynes’ work on interhemispheric connections and their relationship to the speech areas of the left hemisphere. Additionally, the other literature on the topic, as well as the consiliences already gathered, suggests that in most cases this is not a mental aberration or a disorder – on the contrary, people relate, like Socrates, that their inner voice can serve as an important guide in difficult or complex situations, sometimes even presenting them with solutions to problems or guidance in difficult conversations. This strongly suggests that the development of intuition should be one of the primary goals of education, and that the sophistication of a student’s mental model is the real measure of their learning.



Chapter 11 Draft - "More Precious Than Rubies"

Following the popularization of Freud’s ideas in the West, there was an explosion of research activity in the twentieth century dedicated to understanding the human condition. Among the many sub-disciplines in this new field was developmental psychology, which investigated the changes that humans experienced as they moved through their life cycle. Originally focused on infants and children through the works of researchers like Jean Piaget, the discipline quickly expanded to include issues encountered by adolescents and adults.

Although much of developmental psychology is observational and therefore subject to individual and cultural biases, a comprehensive review of the literature reveals a noticeable trend towards complexity and sophistication as humans age. This trend occurs regardless of what kind of development is being analyzed, and much as expertise researchers have uncovered links between certain practices and expert performance, developmental psychologists have been able to demonstrate that certain kinds of growth are correlated with higher task performance.


One of the earliest and most famous developmental psychologists was Piaget himself, whose work on the stages of development that children progress through on their way to adolescence has become a mainstay in education and childrearing. As previously discussed, one of his primary findings was that children develop the capacities for “formal logic” and “rationality” at a rather gradual pace, spending much of their childhood immersed in fantasy and possessed of an incredible capacity for imagination.

As was discovered by psychological researchers following Piaget, however, the growth and development of a human being’s mental model does not stop with adolescence. Indeed, the works of Lawrence Kohlberg, who studied moral reasoning abilities, and intelligence researchers like Alfred Binet, suggested that there were measurable differences in the mental capacities of individual humans, that these differences were linked to real-world outcomes, and that these differences could be ordered along trajectories of development.

For example, Kohlberg’s work on moral reasoning, built on Piaget’s work with children, centered largely on providing people with case studies involving ethical dilemmas and recording their responses. By focusing on the reasoning behind the responses rather than the responses themselves, Kohlberg found that people generally progress through three stages of moral reasoning capability, each with unique motivating factors that drive “ethical” behavior.

Among Kohlberg’s findings is the fact that until about age nine, children operate under what he calls “preconventional morality”, a fairly self-centered paradigm focused largely on avoiding punishment and gaining reward. Through early puberty, however, Kohlberg found that children begin internalizing the ethical systems taught to them by their parents and teachers, and when explaining their responses to ethical dilemmas, tend to cite external rule systems as existential constraints rather than arbitrary conventions. This he called “conventional morality”. The final stage of moral development, as identified by Kohlberg, was generally reached only by adults and was called “postconventional morality”. At this stage, people were found to offer their own thinking on ethical issues, tied to more universal moral principles such as charity and fairness – or at least their personal understanding of those principles.

One very intriguing finding from Kohlberg’s work is that the adult population is distributed along these three different stages, implying that not all people are equally developed. In particular, he found that only ten to fifteen percent of adults operate at the postconventional stage and make appeals to universal principles in their moral reasoning – most of the population, in fact, operates according to conventional morality and behaves generally how an average person in their society is expected to behave. This suggests that the adult population is actually significantly less autonomous than one would believe from observing Western societies, and that the prevailing culture is a more powerful driving force than one might expect.

Another pioneer in developmental psychology was Erik Erikson, who with his wife and research partner Joan Erikson identified, albeit more qualitatively than quantitatively, eight different life stages that each human being progresses through from birth to death. These stages also signify a trend towards complexity, as earlier stages grapple with problems like trust or mistrust in the environment – a consequence of caregiver attachment – while the elderly are faced with the struggle of finding meaning in their life journey, a task decidedly too deep for infants and most children.


Owing to Freud’s conceptualization of the human psyche as containing an id, an ego, and a superego, many developmental psychologists referred to the part of the mind responsible for moral reasoning, problem solving, and meaning-making as the ego. Generally described as the central processing unit of the mind, the concept of the ego represents the system through which all brain activity is channeled. It is the “cause”, so to speak, of personal goals and subjective biases, of the internal narrative to which all environmental stimuli are eventually related, and it could perhaps be likened to the entity responsible for the “maps of meaning” spoken of in Peterson’s opus of the same name.

One of the earlier researchers to put forth these lines of thinking was Erik Erikson, who was, generally speaking, more concerned with “ego identity” and the individual’s relationship to society. However, aside from domain-specific investigations like Kohlberg’s, research into the overall developmental trajectory of this “ego” was not conducted until one of Erikson’s students and collaborators, Jane Loevinger, discovered anomalies in her work with the maturity of mothers.

Loevinger found, curiously, that some mothers simultaneously endorsed corporal punishment while agreeing with statements like “a mother should be her child’s best friend”, implying a lack of sophistication regarding the use of physical force and its implications for relationships. Upon further investigation, which included pioneering work in the use of statistics in psychology, Loevinger discovered that not only mothers, but all adults, existed along a trajectory of “ego development” that influenced how they thought about problems, how they related to others, their views on social issues and proposed solutions, and even the sophistication of their internal narratives.

Much like Kohlberg’s analysis of qualitative responses, which indicated people operated according to preconventional, conventional, or postconventional moral reasoning frameworks, Loevinger used a statistical approach combined with language analysis to demonstrate that this stage model represented the general trajectory of human lifespan development. By having respondents complete question stems such as “I am…” or “My biggest problem is…”, Loevinger showed that the syntax and structure of these responses, and not necessarily the content, revealed a great deal about how the respondent processed information and how they approached life.

By performing these sentence-completion tests with thousands of respondents across different population demographics, and analyzing the responses in aggregate – a heroic undertaking in the age before modern computing – Loevinger elucidated approximately eight stages of increasing sophistication that humans could occupy, with a statistical distribution suggesting that only ten to fifteen percent of the population was operating at “postconventional” stages – much like Kohlberg’s conclusions. These findings have since been replicated and extended by Susanne R. Cook-Greuter, whose decades of research on postconventional stages have yielded an additional two stages marked by the kinds of extreme sophistication one might typically expect of a sage, guru, or world-changing leader.



What is found in the works of Loevinger, Cook-Greuter, and other ego development researchers like Harvard’s Robert Kegan and the workplace-focused Bill Torbert is indeed a trajectory of increasing capacity for sophistication and nuance. For example, people at earlier stages like the Conformist tend to accept inherited traditions unquestioningly and struggle when placed in situations that require pluralistic stances, such as cross-cultural interactions or political discussions involving many different points of view. They also have a great deal of difficulty thinking outside those traditions to develop solutions for their lives. People at postconventional stages like the “Alchemist”, however, are capable of finding elegant and useful solutions to paradoxes and conflicts, much as Nelson Mandela famously wore the Springboks jersey during a key moment in South Africa’s reconciliation process.

Additionally, in much the same way that Kohlberg’s postconventional stage demonstrated that a certain subset of the adult population can develop their own unique relationship to the principles enumerated in laws, traditions, and culture, this developmental journey is reflected in the transition from the “Achiever” stage, occupied by many modern managers, to the “Individualist” stage, which might be more reflective of a solopreneur or a lifestyle-focused digital nomad who has their own meaning of work and success.

In general, people who test at the postconventional stages display a level of appreciation for paradox, ideological tension, and nuance that other stages do not possess, which allows them to find unexpected and unique solutions to national issues – like Mandela’s jersey maneuver. On a less grand scale, however, research indicates that postconventional development within leaders is virtually required to lead change successfully within organizations, suggesting that tangible and high-stakes outcomes can be tied to this developmental trajectory.

In her work, Cook-Greuter has noted that the “Achiever” stage, the most sophisticated of the conventional stages, is currently viewed by the West as the highest level of development. Indeed, much of education is spent on the promulgation of national values at a conventional level combined with so-called “critical thinking” education, with corrupt Marxist overtures to diversity and inclusion constituting the postformal portion of learning. Children and adults struggling with developmental issues further along this trajectory must turn to therapists, coaches, and pastoral care workers for assistance in journeys that become increasingly personal and specific.


When this exciting and powerful work is placed within the context of the physical, biological, and neuropsychological realities discussed previously, it becomes apparent that the anomalies and novelties humans encounter throughout their lifespans drive their development along specific and measurable trajectories. Whereas some of this development can be understood qualitatively, such as through Erikson’s developmental stage model, the “central processing capabilities” of the human brain – referred to in psychological shorthand as the ego and in this context as the mental model – are reflected in language syntax, problem-solving approaches, relational ability, and personal narratives.

Indeed, what Loevinger, Cook-Greuter, and others seem to have uncovered are some of the parameters by which mental models grow and develop. These parameters, unlike the domain-specific findings of Kohlberg and the expertise researchers, extend across domains, sexes, and cultures, representing something approaching a human universal.

Could this kind of development be considered a form of generalized “life expertise”, or something approaching it? Given that both the developmental psychologists’ concept of ego development and the expertise researchers’ understanding of expertise are essentially based on the sophistication of mental models, which can themselves be understood in the context of surprise, anomaly, and information, it certainly seems this way. It is also notable that the highest stages in the Loevinger-Cook-Greuter model are often described in almost spiritual language:

Consciousness or rational awareness is no longer perceived as a shackle, but as just another phenomenon that assumes foreground or background status depending on one’s momentary attention. Persons at the Unitive stage can see a world in a grain of sand, that is, they can perceive the concrete, limited, and temporal aspects of an entity simultaneously with its eternal and symbolic meaning.

Such conceptualizations of the higher stages evoke the concept of wisdom, which is defined in the English language as the capacity for good judgement based on experience and knowledge. For its part, the academic literature on wisdom offers a nine-part conceptualization of this elusive human quality which further corroborates the general understanding:

The most commonly included subcomponents [of wisdom], which appeared in more than half of the definitions are (1) social decision making and pragmatic knowledge of life, which relates to social reasoning, ability to give good advice, life knowledge, and life skills; (2) prosocial attitudes and behaviors, which include empathy, compassion, warmth, altruism, and a sense of fairness; (3) reflection and self-understanding, which relates to introspection, insight, intuition, and self-knowledge and awareness; (4) acknowledgement of and coping effectively with uncertainty; and (5) emotional homeostasis, which relates to affect regulation and self-control.

Finally, subcomponents included in fewer than half of the reviewed definitions include (1) value relativism and tolerance, which involves a nonjudgmental stance and acceptance of other value systems; (2) openness to new experience; (3) spirituality; and (4) sense of humor.

One noticeable consilience in the wisdom literature, appearing in more than half of the definitions reviewed, is the acknowledgement and acceptance of uncertainty – a key feature of postconventional stages as described in the Loevinger-Cook-Greuter model, and a quality strongly reminiscent of situation awareness in the expertise literature. Additionally, when compared to the literature on ego development, and particularly Cook-Greuter’s work on the later stages of the general trajectory, the wisdom literature’s emphasis on reflection, self-knowledge, and value relativism is strongly reminiscent of the qualities measured in postconventional thinkers, who are demonstrably more capable of holding space for value conflicts, have an awareness and understanding of their own biases, and can acknowledge blind spots more readily. This kind of perspective-taking – which involves nuance, conditional if-then thinking, and a “yes-and” rather than “either-or” mindset – seems to be where the wisdom literature, the expertise literature, the ego development literature, and the foundational work previously discussed all align.


One of the fundamental struggles of the ego development literature has been with the fact that people at so-called “later” stages often experience higher levels of career success, more enjoyable personal relationships, and more influence and impact in their communities. Yet, at the same time, many people operating at the “conformist” level can live perfectly happy lives, be productive and happy employees, and engage fruitfully with their communities without ever having to progress beyond that level. This raises the question of whether one stage is objectively “better” than another, particularly in the context of state-mandated education and systems that may seek to push people along this trajectory regardless of its benefit to them.

However, when grounded within the assembled consilience of the evolutionary and neuroscientific literature previously discussed, it is obvious that human beings will be encountering anomalies and novelties throughout their lifetimes almost as a condition of existing and will thus be in the process of reflexively and involuntarily developing more sophisticated mental models. This indicates that the kind of ego development being observed by Kohlberg, Loevinger, Cook-Greuter, Kegan, and Torbert is a function of the kinds of anomalies being encountered, not the presence or non-presence of anomaly itself.

Moreover, a consistent feature of the ego development literature, as well as of the academic and spiritual literature concerning wisdom, is the appearance of very young individuals – in their twenties or thirties – who test at the highest levels of ego development, or who have been documented in scriptures and history books as displaying extraordinarily high levels of wisdom and intuition for their age. This suggests, at least in principle, that it is possible to accelerate development to the point that a young adult could operate, at least in some capacities, with the wisdom of an octogenarian. But how might this be the case?

A simple explanation would involve traits or circumstances that cannot be scaled across populations, such as intellectual giftedness, socio-economic affordances, or other similar factors. However, a great deal of literature suggests that much of the growth along this fundamental trajectory is driven by life circumstances that do not require money or intellect to benefit from, indicating that transformative growth opportunities are quite plentiful for those who seek them.



Chapter 12 Draft - "Mechanisms of Lifespan Development"

Although the existence of prodigies, young gurus, and old souls represents an interesting anomaly within the developmental psychology literature, the phenomenon is, in fact, rather widespread. The education of gifted children, for example, is considered a special vocation within teaching and has its own niche of associated literature. Some expertise researchers focus their efforts on understanding how children can be taught complicated skills quickly, while enterprising parents have developed methods for raising chess grandmasters and polyglots without the “help” of colonial education systems. While it is indisputable that genetic traits and life events, such as traumatic brain injury, place hard limits on the growth and development of each individual human, the consistent appearance of prodigies across multiple fields pertaining to human development suggests that there is some mechanism or set of circumstances that allows one person to advance along developmental trajectories – either in a specific skill or in terms of ego sophistication – faster than their peers.

Some of these mechanisms will likely be unique to individuals, in the sense that a combination of personal traits and life circumstances made for specific kinds of breakthroughs. However, if it is the case that some mechanisms can be generalized across the population, or are otherwise accessible by the average person, then the trajectory outlined by the assembled biological, neuropsychological, and psychological literature can be fashioned into something more closely resembling a valuable roadmap.


Framed within the terms of what has been assembled so far, learning could be generally described as the acquisition of information by the brain and the subsequent reorganization of its mental model to more closely conform to reality. It can also be reasonably inferred that the brain’s time and energy will be focused on the perception and resolution of anomalies – the very things that provide the information required to update the mental model.
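This description – information arrives, and the mental model reorganizes to conform more closely to reality – has a loose formal analogue in Bayesian belief updating, sketched below in Python. The reduction of a “mental model” to a two-entry probability distribution, along with the hypotheses and numbers, are illustrative assumptions of this sketch, not claims about neural implementation:

```python
# Illustrative analogy only: a "mental model" reduced to beliefs over two
# hypothetical hypotheses about the world, reorganized in light of evidence.

def bayes_update(prior: dict, likelihood: dict) -> dict:
    """Reweight beliefs by how well each hypothesis predicts the evidence."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Prior: the model strongly expects the situation to be familiar.
beliefs = {"familiar": 0.9, "anomalous": 0.1}

# An observation that is far more probable if the situation is anomalous.
likelihood = {"familiar": 0.05, "anomalous": 0.8}

beliefs = bayes_update(beliefs, likelihood)
print(beliefs)  # belief mass shifts sharply toward "anomalous"
```

An anomaly, in these terms, is simply evidence that the current model assigns low probability to; resolving it means shifting belief, and with it attention and energy, toward hypotheses that explain the evidence better.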

Although this is a technically accurate description, the problem of human development is not merely a technical one, but an emotional one as well. As anybody who has been subjected to colonial education systems will know, the process of education can be emotionally laborious as well as intellectually taxing, and often involves many difficult reconstructions of one’s worldview and attitude. Thus, the concern of what learning feels like to individuals, or what the process of learning appears like from the individual perspective, becomes relevant.

Such questions fall within the domain of education studies, which is adjacent to, and intermingles freely with, psychology. Indeed, the colonial and industrial drive to optimize and accelerate often makes the fields indistinguishable from each other, with psychological principles developed by pioneers like Jean Piaget and John Dewey operationalized and weaponized to maximize information transfer from blackboard to skull. Yet even “educational psychology” has bright points and breakthroughs that help build bridges to the harder sciences – one such breakthrough is Experiential Learning, a groundbreaking and influential book by David A. Kolb published in 1984.


Written, in part, as a gentle critique of colonial education systems, Kolb’s Experiential Learning makes an important distinction between symbolic information, which could be thought of as anything involving letters, words, or numbers, and experiential information, which encompasses all of the things that can be easily and immediately understood by an individual, yet can be difficult to put into words.

For example, there is a significant difference between reading a book about how to drive and learning to drive by actually practicing. The first involves mental exercises that are not tied to real-world actions and outcomes, while the second involves motor coordination, situation awareness, and real-time decision making that can only be developed through doing and experiencing. The same is true of many physical activities, including sports, sexual activity, and dancing, indicating that there is indeed a difference between the kinds of information that can be transmitted through symbols and language and the kinds of behavioral information that are “embodied” within the individual’s body-brain system.

Thus, experiential learning becomes the act of learning through experience, or acquiring information about a situation by putting oneself in that situation. Kolb’s work serves as a deep investigation of the dynamics of this process, concluding from a set of its own consiliences that there are four steps in any experiential learning process, involving – as the biological and neuropsychological evidence also suggests – a deep interplay between humans and their environment.

Step 1: Concrete Experience

The beginning of the experiential learning process involves the transfer of information from the environment to the individual. This happens by way of the senses, and can include a combination of symbolic-linguistic information and experiential information that cannot be easily verbalized. This is called “concrete” as it represents the real and tangible feedback that the individual gets from their environment and situation.

Step 2: Reflective Observation

Following the acquisition of new information, Kolb observes that humans invariably spend time reflecting on what they have gained and making sense of it. In traditional educational scenarios, such as training workshops or schools, this is usually done by way of group discussion; however, in some scenarios, like certain performing arts and sports, this reflection may be done alone or through conversations with mentors, coaches, producers, or other creative facilitators.

Step 3: Abstract Conceptualization

After the experience itself is sufficiently grasped, Kolb notes that learners then begin to relate the anomaly to their pre-existing mental models and bodies of knowledge. This is when people manage to make sense of what has happened, either by grouping it with similar events or by establishing a theory or understanding of the anomaly. New or unique language, even jargon, can be developed at this stage as the experience is more fully incorporated into the mental model.

Step 4: Active Experimentation

The final step in the learning cycle, which in turn triggers the first, is taking action in the environment or situation based on what has been learned. This is a crucial step in the learning process, as it allows an individual to “test” their new hypotheses and understandings against the environment itself, thus generating authentic and immediate feedback on things like situation awareness, understanding, and performance.
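Kolb’s four steps can be read as a feedback loop in which each round of action generates the next concrete experience. The sketch below renders that control flow in Python; the function name, the dictionary-based “mental model”, and the toy environment are hypothetical placeholders of this sketch, not constructs from Kolb’s book:

```python
# A minimal sketch of Kolb's experiential learning cycle as a feedback loop.
# The names and data structures here are illustrative placeholders.

def experiential_learning_cycle(mental_model, environment, iterations=3):
    action = None
    for _ in range(iterations):
        # Step 1: Concrete Experience - gather feedback from the environment.
        experience = environment(action)
        # Step 2: Reflective Observation - compare what happened to what
        # the current model expected.
        reflection = {"observed": experience,
                      "expected": mental_model.get("expectation")}
        # Step 3: Abstract Conceptualization - fold the observation into
        # the model's body of knowledge.
        mental_model["expectation"] = reflection["observed"]
        # Step 4: Active Experimentation - act on the updated model,
        # which triggers the next concrete experience.
        action = mental_model["expectation"]
    return mental_model

# Toy environment: always reports the same fact, whatever the action.
model = experiential_learning_cycle({}, lambda action: "road is icy")
print(model)  # {'expectation': 'road is icy'}
```

The essential structural point survives even this drastic simplification: the loop has no terminal state, because each experiment produces fresh experience to be reflected upon.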


When framed within the context of experiential learning, all the information an individual gains throughout their lifespan can be understood as an opportunity for this kind of education, serving to build or refine their mental model. This refinement, in turn, can be understood as a tendency towards sophistication, wisdom, and expertise regardless of the domain or topic. However, the kinds of perspective transformations that take place between preconventional, conventional, and postconventional stages of development – and even the transformations that take place within those stages – involve profound psychological changes that do not fall easily within the domain of experiential education.

One example of this might be the changes that take place between conventional and postconventional stages, which include an acceptance of plurality and viewpoint diversity as well as a habitual recognition of the relativity of one’s own viewpoint. The profundity of these realizations can, of course, be encompassed within Kolb’s experiential learning model; however, the experiences, reflections, and abstractions involved must be rather significant and complex in order to catalyze such growth.

How these kinds of changes happen is the focus of Transformative Dimensions of Adult Learning, a 1991 work by Jack Mezirow that offers a perspective on adult development distinct from, yet consilient with, both Loevinger’s trajectory-focused work and Kolb’s experiential learning cycle. Just as Kolb makes an important distinction between symbolic and experiential information, Mezirow distinguishes between different kinds of learning that result in different kinds of changes to an individual’s mental model.

The first kind of learning can be described as non-transformative in the sense that the anomalies encountered through experience can be easily accommodated using pre-existing concepts, words, narratives, and memories. An example of this might be encountering road rage during one’s daily commute – although an angry driver represents a significant, even life-threatening anomaly, road rage incidents are commonplace enough that most people can easily write them off as someone else having mental problems or a very bad day, without requiring extensive therapy.

Much of Mezirow’s Transformative Dimensions, however, focuses on a kind of transformative learning that he often frames as liberating or emancipatory, most often in the sense of an individual being liberated from societal conditioning such as restrictive gender stereotypes or a false religion like Christianity. Just as Loevinger’s insights into lifespan development were catalyzed by anomalous experimental results with mothers, a great deal of the literature on this type of perspective-changing education comes from researchers and practitioners working with women entering education or the workforce for the first time.

Working from the experiences of women, survivors of trauma, and other people who have experienced tremendous events, Mezirow and the researchers he was influenced by identified several salient characteristics in a transformative learning process that differentiate it from more straightforward kinds of learning. The first and most pertinent is a dilemma or problem that cannot be resolved within the current mental model – for example, a lifelong Christian recognizing that their New Testament is replete with controversies and scandals.

The second most salient characteristic of a transformation process is an honest self-evaluation of one’s own understandings and preconceived notions, combined with an acceptance that they are insufficient to resolve the anomaly or address the situation. Following this, the search for new perspectives begins, which is often done in a social context with friends, managers, mentors, spiritual leaders, experts, or Google. Finally, the development of new perspectives, attitudes, and ways of knowing happens in an experiential context, usually through role exploration, active experimentation, and iteration.

From the perspective of neuropsychology, it could be said that what Mezirow describes in his transformative learning cycle is a change in the fundamental structure of the mental model, whereby it begins to interpret old information in completely new ways, and new information is contextualized differently than it was before the transformation. This is, functionally, both a reorganization and a revaluation of values, which is observed in the literature to be a tremendously laborious process both intellectually and emotionally, one that almost always requires focused external support in the form of mentorship or therapy.


As can be found throughout the observational data of the developmental and educational psychologists, one of the most difficult things for any individual to do is to grow into a new perspective, especially under circumstances that are not necessarily supportive or amenable to growth. Indeed, many women who entered education and the workforce during the sixties and seventies found themselves questioning what being a woman meant, which amounts to a profound reorganization and revaluation within their mental models, one that had cascading effects on their families, children, and societies.

Similarly, Robert Kegan and Bill Torbert, working with a more diverse population within developmental psychology and adult education, have both noted the difficulty that many star performers encounter when they are promoted to managerial positions and become responsible for directing and influencing others while maintaining strong relationships. This requires, for many, a move from the “Expert” and “Achiever” stages towards more postconventional stances, and often requires executive coaching or specialized training to happen within workplace contexts.

Given the difficulties inherent in these kinds of transformations, and the obvious disparities in development and general performance between large sections of the Western population, it becomes reasonable to inquire about why some individuals might experience profound growth in response to an anomaly, while others do not – or, worse yet, become bitter and disillusioned by something they don’t want to put in the effort to understand. Gaining an understanding of what mechanisms control this most fundamental bifurcation in development should, in theory, allow for an ever deeper understanding of human growth, and indeed will prove to be the “line in every human heart” alluded to by writers like Aleksandr Solzhenitsyn that separates evil from good.



Chapter 13 Draft - "Terror, Stagnation, Possession"

To understand why some human beings develop faster or farther than others, it is necessary to return to the phenomenon of novelty, which is the driver of all human learning activity. As documented by the Soviet neuropsychologists, the orienting reflex is an instinctual and faster-than-thought response to the presence of new things. As documented meticulously by Karl Friston, Jordan Peterson, and the expertise researchers, much of the brain’s structure and function is geared towards the perception and reconciliation of anomalies. Clearly, the human organism is wired to detect and minimize surprises and shocks. But how might it feel to encounter an anomaly, or even a chaotic environment?
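Friston’s claim can be stated compactly. In his free-energy formulation (sketched here in standard notation rather than quoted from any particular paper), the “surprise” of a sensory observation is its negative log-probability under the organism’s internal model, and the variational free energy the brain is said to minimize is an upper bound on that surprise:

```latex
% Surprisal of an observation o under the organism's generative model p:
\mathrm{surprise}(o) = -\ln p(o)

% Variational free energy F, where q(s) is the brain's approximate
% belief over hidden causes s, bounds surprise from above because the
% Kullback-Leibler divergence is non-negative:
F = D_{\mathrm{KL}}\big(q(s)\,\|\,p(s \mid o)\big) - \ln p(o) \;\geq\; -\ln p(o)
```

Minimizing F thus means either revising the internal model to better fit what was observed (perception and learning) or acting on the world to make observations match the model (action), which is one formal reading of the “reconciliation of anomalies” described above.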


Among many other memorable aspects of Peterson’s work is the juxtaposition of mythological elements and archetypes with psychological phenomena, a consilience-based demonstration of the fact that many of the things now considered folklore by the West are actually expressions of deeply-wired biological and psychological instincts. One of the most famous of these juxtapositions is the “dragon of chaos”, alternatively described by Peterson as the devouring aspect of Mother Nature, the evil witch queens in Disney films, the monster under the bed, and other highly variable and dangerous unknowns that pose threats to human life.

Much like rats, which can scream for hours after smelling or seeing a feline predator, humans have deeply instinctual responses to novelties perceived as possible or actual threats. The most well-known of these is the so-called “fight-flight-freeze” response, which has become shorthand for the kinds of urgent responses humans can have to threats in their immediate vicinity. Much like how the knights of mythology can either run from the dragon or face it valiantly, Peterson frames many psychological phenomena as being rooted in this constant battle with the unknown.

From this, it is perfectly reasonable to conclude, as many have done, that some people might be afraid of something new simply because they perceive it to be a threat.


Contrasted with the dragon of chaos and the evil witch queen is the “walled garden”, characterized throughout Peterson’s work as “known territory”. Unlike situations that present significant amounts of new information, known territory is the domain of culture, of rules, of predictability. From the perspective of expertise researchers and neuroscientists, it could be said that this represents situations where someone has high situation awareness or a well-developed (and accurate) mental model.

In Maps of Meaning, Peterson characterizes culture and known territory – the “Great Father” – as both protective and controlling. Positively expressed, culture provides a safe foundation for achievement and positive encouragement for performance, much like a father would. Negatively expressed, however, culture can become tyrannical and rigid, resulting in the kinds of oppression experienced by the subjects of Stalin, Mao, and other dictators:

Group membership, social being, represents a necessary advance over childish dependence, but the spirit of the group requires its pound of flesh. Absolute identification with a group means rejection of individual difference: means rejection of “deviation,” even “weakness,” from the group viewpoint; means repression of individuality…

Similar tendencies can be observed at an individual level as well, expressed mythologically by Peterson as the courageous hero and the tyrannical villain. The first archetype, characterized by a brave going-forth into the unknown, willingly engages with the proverbial dragons of chaos. The second, however, rejects the presence of anomaly and refuses the hero’s journey:

The lie is willful adherence to a previously functional schema of action and interpretation – a moral paradigm – in spite of new experience, which cannot be comprehended in terms of that schema; in spite of new desire, which cannot find fulfillment within that previous framework. The lie is willful rejection of information apprehended as anomalous on terms defined and valued by the individual doing the rejection. That is to say: the liar chooses his own game, sets his own rules, and then cheats. This cheating is failure to grow, to mature; is rejection of the process of consciousness itself.

Although Peterson’s final comment may sound extreme, given that consciousness is ever-updating in the presence of new information, cultivating a willful ignorance of things that one instinctually knows require investigation – things that trigger the orienting reflex – is indeed a rejection of consciousness, or at least of an updated and more realistic version of it. In the psychological literature, the discomfort that drives humans to rationalize, deny, and explain away is known as cognitive dissonance, and it is among the most well-documented and well-known psychological phenomena in existence.

Less well-known than cognitive dissonance is the domain of Terror Management Theory, a sub-field of social psychology that has spent decades documenting and understanding the relationship between death and human behavior. Among its findings is the relationship between mortality salience, or how mindful someone is of their impending demise, and adherence to a worldview. More generally, researchers have found that people adopt and cling to worldviews offered by societies and religions to achieve either literal immortality, as promised by certain religions, or symbolic immortality in the sense of making a memorable contribution to society.

When these insights are placed within the context of the consiliences developed between biology and neuroscience, it becomes apparent that the brain’s mental model is, in effect, the fundamental survival mechanism of the entire human biosystem. Indeed, the “worldview” identified as a pillar of psychological stability by terror management researchers can be understood as the survival mechanism by which someone operates in “known” territory, as well as the paradigm within which they approach novelty. Anomalies that threaten changes to the mental model can therefore be perceived, or felt, as a threat to someone’s way of life, or even to their life itself, which suggests that disparities in growth between humans can be partially chalked up to fear and avoidance – even “terror”.


Another perspective on disparities in human growth comes from Kazimierz Dąbrowski, a Polish psychologist and physician who wrote extensively on something he called the third factor. This human attribute, described in general and phenomenological terms based on Dąbrowski’s patient work, encompasses one’s internal drive towards growth and often manifests in feelings of disquietude, guilt, or shame, usually existing in relation to a personality ideal or a desired future.

Dąbrowski’s most famous theory, the theory of positive disintegration, outlines a kind of psychological hero’s journey by which someone grows into more nuanced and sophisticated worldviews. Changes are not only inherently frightening; Dąbrowski notes that the psychological transformations required in the face of certain events or anomalies cause one’s existing worldview to become fragmented. The resulting negative psychological effects of this fragmentation, often manifesting as symptoms of mental illness, are understood by Dąbrowski to be signs of positive and more comprehensive change.

The third factor, or the strength of one’s attachment to an ideal over the uncomfortable present and an even more uncomfortable change process, is what Dąbrowski believed to be the primary mediator of growth in the average human. Indeed, much like organizations and societies stagnate in the absence of a strong future vision, it would seem that human beings also face a fundamental choice between pursuing an ideal in the face of the unknown and ignoring opportunities for growth.


While Peterson’s characterization of anomalous information as a “dragon of chaos” has received ridicule from some skeptics of his work, upon a deeper investigation of the psychological phenomena at play, it would indeed seem that human beings often display a strong tendency to ignore or rationalize novelty. This leads to stagnation, generally defined as a failure to grow and adapt, which Peterson associates with individual and social tyranny, a rejection of the unknown, and an out-of-date mental model.

Based on his own work with patients and research subjects, developmental psychologist Erik Erikson identified stagnation as a failure to progress fully through adulthood, and associated it with self-centered neuroticism, a lack of self-improvement, and hedonistic concerns. Abraham Maslow, for his part, identified a human “growth motivation” that he associated with the pursuit of personal ideals, social goals, and a “higher pleasure of production”, also connected to productivity in adulthood.

However, despite the necessity of growth and the many pleasures associated with it, most notably Maslow’s elusive state of self-actualization, the costs seem too great for many – only about fifteen percent of the adult population operates from a postformal worldview, meaning that the remainder genuinely struggle to appreciate other cultural paradigms, many difficult social problems, and even parts of their own personality. Moreover, several social problems that characterize the modern West, such as extreme political tribalism, can be understood as failures to develop as individuals and as a society, resulting in breakdowns of communication, better understood as the exchange of new information. This failure to develop has also resulted in the rise of xenophobic behavior between social groups that are otherwise demographically undifferentiated, as well as the development of genuinely fringe conspiracies like QAnon and of the “follow the science” movement, which after years of conflicting scientific pronouncements is just as disconnected from reality.


Whereas the phenomenon of stagnation is related to a mental model that isn’t updating due to fear, the problem of ideological possession, generally known to the public as cult indoctrination, poses a far greater threat to the Western utopian project given people’s obvious commitment to their political stances. From a developmental standpoint, possession happens when someone with an unsophisticated mental model, most likely operating in the Conformist or Expert stages, acquires a set of ideas or theories that provide easy explanations for anomalies that would otherwise be difficult to resolve without a postconventional mindset.

One easy example is antisemitic ideology, which uses a variety of arguments and data points to convince young men that a cabal of Jewish people and their associates manage many world affairs. While these perspectives are not in line with reality, they can be rather sophisticated, and in combination with the perplexing phenomenon of Jewish success, they give some misguided people a ready-made mental model with which to interpret many complex world events – most recently the FTX cryptocurrency scandal, whose main culprit was a Jewish man. Indeed, events such as this provide “false positives” to these ideologies, allowing them to become ever more deeply ingrained in their hosts.

Generally speaking, the entire concept of “science” has become weaponized and deployed as an ideology, as evidenced by the rampant corruption in psychology, the many ethical problems with the COVID-19 vaccination program, the Western left’s simplistic understanding of food production, and the misguided effort to reduce climate change through cobalt mining in the Congo. For most people, mentally and emotionally castrated across generations by the Western colonial systems, an expert’s opinion – or a peer-reviewed article for the studious – is enough to accept something unquestioningly as truth, and even demonize people who disagree.

For people subjected to ideological possession, it cannot be overstated that their corrupted mental model becomes not only their survival mechanism, but their whole life. In the modern world, gender-confused teens and angry young men alike will spend countless hours creating content on the internet to ingratiate themselves into their ideological communities, express their creativity through memes and TikToks, and make sense of events through the lens that they were provided. Moreover, much like many cults are known to do, many modern ideologies contain very strong “us versus them” dynamics, with the Marx-influenced leftists affirming that white men are the cause of society’s problems and conservative ideologues framing their political opponents as Satanic or demonic.

Unfortunately, in addition to the fear of novelty that stifles personality and perspective change, people mentally trapped in such ideologies also become attached to their group status, their friends, and their communities, all of which are contingent on supporting the cause and aligning with the group’s conceptualization of reality. This presents additional barriers to personality and perspective change whose removal, as evidenced by the literature on deradicalizing cult members, often requires focused therapeutic support as well as family intervention, and at a society-wide level a reconciliation process.


Interestingly enough, the folkloric references to “societal programming” or “mind viruses” that exist in conspiracy theory circles, or even the concept of being plugged into the Wachowskis’ Matrix, have strong grounding in neuroscience, psychology, and information theory. At a very literal level, what happens during an ideological indoctrination is the acquisition of a mental model, which functions much the same way as a computer program in the sense that it helps the brain process data. Since all events pertaining to the mental model are processed within that paradigm, the ideology is, in effect, a kind of program-like meme that has taken over its host’s brain resources in order to survive and replicate to new hosts.
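The program analogy can be illustrated with a deliberately toy sketch (the events and schema entries below are invented for illustration, not drawn from any real ideology’s materials): identical input data yields incompatible interpretations depending on which schema is installed, and every event, even an ambiguous one, is absorbed into the installed paradigm.

```python
# Toy illustration (invented events and schemas): a mental model as an
# interpreter. The same events yield different "meanings" depending on
# which schema is installed, and every event is absorbed into that paradigm.

def interpret(schema, events):
    """Run each event through the installed schema's interpretive rules."""
    return [schema.get(event, schema["default"]) for event in events]

conspiratorial = {
    "market crash": "proof the cabal is manipulating markets",
    "media report": "cover story planted by the elites",
    "default": "another sign of the hidden agenda",
}
conventional = {
    "market crash": "correction after overleveraged speculation",
    "media report": "ordinary journalism, possibly imperfect",
    "default": "an event requiring further investigation",
}

events = ["market crash", "media report", "unexplained outage"]

# Identical input data, incompatible outputs: the installed paradigm,
# not the events themselves, determines the interpretation.
print(interpret(conspiratorial, events))
print(interpret(conventional, events))
```

Note that the "default" rule does the most work: whatever cannot be matched is still assimilated, which is the sense in which the schema leaves no anomaly unclaimed.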

From this understanding, it can be seen that the postmodern and Marxist ideas that are currently gripping the West, along with the false and barbaric psychological doctrine inflicted on its peoples, are dangerous precisely because they are so sophisticated. Concepts such as having respect for diversity, redressing societal inequalities, and finding one’s true self were truncated, packaged, and sold to younger people through universities and schools, giving them an extremely advanced-yet-false understanding of postconventional themes that has proven too sophisticated to “deprogram” through traditional societal discourse. Such discourse has broken down precisely because these ideas about diversity, sexuality, race, and gender have been framed as intrinsic human qualities that must be respected, meaning that no conversation with confused leftists can take place until their “third factor” overpowers their fear of change – and reality.


Given the factors that seem to influence an individual’s attitude towards novelty, growth, and change, it would indeed seem that the choice between tyrannical avoidance and courageous curiosity is perhaps the most fundamental choice that many human beings have. Arguably, it is at the root of many social problems, almost all political problems, and even individual psychological difficulties. Even major life choices, such as getting married or having a child, inherently contain many unknowns that are documented, by Erikson and other psychologists, to be tremendously difficult for many people to make and live with. Additionally, the brain is constantly encountering new information, and thus is obligated to make the best possible use of that information for purposes of survival.

Unfortunately, in the absence of courage, which can be the result of many factors including the absence of encouragement, humans seem to prefer the status quo, the walled garden, the familiar perspective. The rigidity of that perspective across time, even in the face of new information, leads to tyrannical behavior, social problems, psychological problems, and even society-wide catastrophes like genocide, as will be explained in the next section.



Chapter 14 Draft - "Eigenvectors of Individual Psychology"

Although his work is considered esoteric by most mainstream psychologists, Carl Jung’s efforts to understand human behavior and personality are among the most popular and enduring in all of psychology, with his pioneering thoughts on personality types forming the basis of personality psychology and his concept of “archetypes” now ubiquitous among writers, branding professionals, and other creatives. While Jung’s ideas, especially regarding archetypes, have been largely dismissed by the mainstream, when viewed within the context of physics, biology, and information flow, the notion that there might be recurring themes or patterns within human behavior becomes much more plausible.


In creative fields such as writing or visual art, an archetype is widely understood to be a recurring symbol or motif, such as the dragons and serpent-like creatures usually associated with powerful evil forces or the wise and kind wizard, often eccentric, who provides help to the story’s heroes. Indeed, an examination of story and myth from around the world, and across time, reveals that there seem to be patterns to not only the types of motifs featured in those stories, but even the structure of the stories themselves.

For most psychologists, and even many scholars of culture and fiction, these recurring themes and character typologies are chalked up to happenstance. However, in the earlier years of psychology, when psychoanalysis and dream interpretation were among the dominant therapeutic interventions, Jung found himself fascinated by the symbolism of his patients’ dreams, as well as the hidden insight these dreams often contained. Unlike his teacher and collaborator Sigmund Freud, who believed that dreams were primarily windows into his patients’ sexual complexes and family dynamics, Jung began to perceive deeper meaning in many dreams, and began to see patterns and motifs that were common between different dreams.

For his part, Jung believed that the origins of these archetypes were primarily biological, and that the motifs and patterns experienced in dreams and portrayed in art were a natural consequence of the unique particulars of human life. Alternatively referring to archetypes as “archaic remnants” or “primordial images”, Jung became famous for seeing his patients’ dreams as more than mental noise, instead characterizing them as deep expressions of what he called the “collective unconscious”, perhaps most accurately described as the biological-neurological substrate that all humans inherit at birth.


Although the terms were coined by the German mathematician David Hilbert, the eigenvalue and eigenvector can be generally defined as properties of mathematical systems: quantities and directions that remain stable when the system’s transformation is applied, and around which its patterns organize. The terms have also found purchase in the humanities, used by the sociologist and systems theorist Niklas Luhmann to describe recursive patterns he observed in the mass media, such as fixations on scandal or tragedy, which he believed to be a result of the organization and function of the media in society.

Much in the same way that Luhmann likened the mass media to a chaotic mathematical system, and compared certain themes or motifs in that media to eigenvalues of that system, it is possible to view Jung’s concept of archetypes as eigenvectors of human psychology. Put simply, an eigenvector is a direction that a system’s own transformation preserves: vectors along it are stretched or shrunk, but never redirected, making it a recursive, self-perpetuating pattern that emerges from the makeup of the system and can be rescaled without losing its form. Juxtaposed against Jung’s idea of an archetype, which is a pattern or motif that can change superficially while maintaining a central “essence” or “form”, it becomes clear that what Jung is proposing can be more easily reconciled with biology and neuroscience than is currently believed.
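This mathematical intuition can be made concrete with a short sketch (pure Python, with an illustrative 2×2 matrix invented for the example): repeatedly applying a system’s transformation to an arbitrary starting vector converges on the system’s dominant eigenvector, the one “pattern” the system preserves, rescaling it by its eigenvalue at each step.

```python
# Power iteration: repeatedly applying a transformation to an arbitrary
# starting vector converges on the system's dominant eigenvector, a
# direction the system preserves (merely rescaled by its eigenvalue).

from math import hypot

def apply_matrix(matrix, vec):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [matrix[0][0] * vec[0] + matrix[0][1] * vec[1],
            matrix[1][0] * vec[0] + matrix[1][1] * vec[1]]

def power_iteration(matrix, vec, steps=50):
    """Iterate matrix * vec, renormalizing each step, until the direction stabilizes."""
    for _ in range(steps):
        vec = apply_matrix(matrix, vec)
        norm = hypot(vec[0], vec[1])
        vec = [vec[0] / norm, vec[1] / norm]
    return vec

A = [[2.0, 1.0],
     [1.0, 2.0]]  # illustrative symmetric system; eigenvalues are 3 and 1

v = power_iteration(A, [1.0, 0.0])  # arbitrary starting direction

# The surviving direction is (1, 1) normalized; applying A again only
# rescales it by the eigenvalue 3, leaving its "form" intact.
Av = apply_matrix(A, v)
eigenvalue = hypot(Av[0], Av[1])
```

The starting vector’s particulars wash out; only the stable pattern inherent to the matrix survives, which is the sense in which an eigenvector, like an archetype, is a property of the system rather than of any individual input.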

Consider, for example, that the wise wizard, known in Jungian terms as The Sage, is found throughout myth and story, almost always serving the same role. From Tolkien’s Gandalf to Rowling’s Dumbledore, Baum’s Good Witch of the North, Lucas’ Obi-Wan Kenobi, and even Collins’ misanthropic and hard-drinking Haymitch, it is consistently the case that in heroic stories, there is an older, worldly, experienced, and helpful individual who provides advice to the protagonist at crucial junctures.

Although this could be hand-waved away as mere happenstance, or perhaps an emergent property of stories themselves and not of human life, when the concept of The Sage is considered within the perspective of evolution and information propagation, it becomes clear that what is being depicted in these “mere stories” is a fundamental pattern that is a function of the human phenotype. Indeed, if it is the case that expertise and “wisdom” are acquired through time and experience, which is demonstrably the case from every analysis conducted on the subject, then it stands to reason that the oldest generations of any society have the greatest capacity for wisdom given the longer timeframes that they have had to develop and hone their mental models.

Thus, while prodigies and “young gurus” can and do exist, the limitations of information transfer and learning inherent to human life dictate that the greatest levels of wisdom and the best kinds of advice should come from those who have had the most time to accumulate them – elders. Thus, the consistent and persistent relationship between age, wisdom, and helpful advice in mythology and story, as embodied by The Sage, can be understood to be consequences of the structure of human life, and not simply popular themes latched onto by storytellers and artists.


Although it is not a Jungian idea per se, Joseph Campbell’s famous conceptualization of the “hero’s journey” undertaken by most fictional protagonists has proven to be just as popular and influential as Jung’s ideas about archetypes. Much like Jung’s archetypes point to deeply-rooted patterns in the human psyche that are a consequence of the human phenotype, Campbell’s observation that most stories follow the same general formula or progression represents yet another point of consilience regarding the realities of not only human psychology, but the process of growth and development outlined in previous chapters.

Within Campbell’s formulation of the hero’s journey, stories are seen to follow the same basic progression – a disruption of the familiar followed by a call to adventure, an encounter with a mentor figure who provides aid or magical items, a symbolic or literal death and rebirth where the hero is forever transformed, a confrontation with the evil antagonist, and finally a return to “normal life” or to “the village”. While a surface-level analysis conducted within the sphere of literary criticism might indicate that this progression of events is merely the most exciting pattern for a story, a comparison of these dynamics to the processes of learning identified by Kolb, Mezirow, Peterson, and Friston reveals profound consiliences.

Consider, for example, that the process of transformative learning developed by Jack Mezirow begins with a “disorienting dilemma” that cannot be resolved within someone’s current knowledge structures. Much like mythical heroes face unignorable calls to adventure, the presence of an unresolvable anomaly is an invitation to learning that is intrinsically salient given its novelty and the existence of the orienting reflex. Also, just as heroes encounter mentors along their journey, Mezirow’s process of transformative learning usually involves the acquisition of new skills and knowledge, usually through people who have faced similar challenges in the past. The mythical motif of death and rebirth is inherent to Mezirow’s entire theory, but can also be tied to later stages involving personal transformation and experimentation with new roles and skills.

Based on his own analysis of human neuropsychology and its relation to elements in myth and story, Jordan Peterson proposed a “map” of human experience that emphasizes the presence of anomaly, and indeed a great deal of Maps of Meaning is concerned with Peterson’s conceptualization of the hero’s journey and the courageous confrontation with the unknown that is inherent to human life.

Thus, it can be reasonably concluded that the progression of events outlined by Campbell’s heroic journey, Mezirow’s process of transformative learning, Peterson’s map of human experience, and Kolb’s experience-centered learning process all point towards the same fundamental pattern that is a consequence of our biology, our ongoing relationship with the environment, and even the structure of the human brain. Perhaps more interestingly, however, it is possible to further conclude, as Peterson does in Maps of Meaning and as tale-weavers throughout history have done in their own works, that it is willful engagement with the process that we call “heroic” which constitutes the optimal attitude for growth and development.

Following this conclusion, however, it becomes necessary to delineate what elements are central to the heroic archetype, or heroic eigenvector, and therefore what attitudes and approaches human beings should adopt for best results in their own lives. From the consiliences identified previously, and from Peterson’s exhaustive exploration of the subject in Maps of Meaning, it becomes evident that courageous engagement with anomaly or novelty constitutes engagement with the process, and therefore “heroism”. Thus, it can be concluded that heroes, or courageous people, do not shy away from that which they do not understand. In mythology, Campbell has noted that protagonists will sometimes refuse the initial call to adventure, perhaps indicating that even creatives without any knowledge of human psychology acknowledge that learning adventures can be scary.

Even when operating from the perspective of physics, biology, and information theory, it is remarkable that courage and curiosity can be identified as admirable human qualities, and even necessary ones from the perspective of human development. Unfortunately, both the fictional and historical record demonstrate that these qualities are often in short supply, particularly when they are most needed. Thus, it becomes necessary to investigate what happens when a human being encounters novelty and refuses the “call to adventure”, and the personal consequences that result from such a choice.


Although modern society has become rather enamored with supervillains on the silver screen, the fictional backstories of these villains are often poor representations of the kinds of dynamics that can lead to malevolent behavior. Consider, for example, the case study of Regina Mills of Once Upon a Time, who was depicted as falling into witchcraft following profound heartbreak and parental trauma. Although this is certainly satisfactory from the perspective of the average television viewer, and in fact one of the most empathic representations of evil available in the medium, it does not fully explore the developmental failure that led to Mills’ dastardly deeds.

Consider, for example, that from the perspective of information theory, heartbreak and parental trauma simply represent anomalies that cannot be easily reconciled. Within Mezirow’s transformative learning theory, the personal catastrophes that Mills suffered could be characterized, perhaps a bit coldly, as a disorienting dilemma. Now, when faced with such a dilemma or novelty, regardless of what it might be, the optimal response for growth and development would be to engage with it. However, the particular circumstances faced by Mills were not easily reconcilable and would have required significant time and effort to explore and grow from, which would represent not only a lot of emotional labor, but a profound investment of caloric energy.

When faced with such a choice, between a seemingly easy conventional path forward and the monumental task of reconciling one’s history of family trauma, it is easy to see why the conventional path appears more attractive. However, just as Peterson insists that the villainous archetype rejects novelty, rejects the process of growth, and therefore rejects consciousness itself, someone like Mills who has suffered such trauma has no real choice but to engage with it. In layperson’s terms, this is baggage that Mills was handed by someone else and must now carry, although it also represents a profound opportunity for growth.

Although this case study is entirely fictional, and indeed drawn from a series called Once Upon a Time, it bears very close resemblance to the actual processes of catastrophe and growth documented by mainstream psychology. For example, the phenomenon of post-traumatic stress disorder, assumed by many laypeople to be the default response to trauma, is significantly overplayed; the phenomenon of post-traumatic growth appears to be the default human response, and is reflected in many inspiring stories of cancer patients, refugees, and other trauma survivors becoming better versions of themselves as a result of what happened to them.

Moreover, it is notable that one of the most prominent psychologists, Carl Rogers, proposed that the only necessary characteristic of a successful therapeutic relationship is empathic and supportive listening on the part of the therapist. Although the significance of this insight is not necessarily obvious from within the psychological mainstream, when placed within the context of information theory it becomes clear that Rogers is proposing that people carrying trauma simply require processing time, sometimes with some outside direction, to work through the complicated and traumatic anomalies they have encountered. This is further corroborated by the executive coaching industry, which often provides value to corporate clients by acting as qualified and confidential sounding boards, often one of the only resources of their kind that executives can access outside of therapy.

Therefore, just as the heroic eigenvector can be distilled down to the essential characteristic of courageous curiosity, its counterpart, the villainous eigenvector, can be reduced to some combination of cowardice, denial, and laziness. While this framing may seem merciless or unforgiving toward those who are carrying trauma, especially trauma acquired through unjust circumstances, given the realities of human life that follow from the universe’s laws, growth is the only option.


Although the other salient characteristics of Jung’s Sage archetype are easily reconcilable within an information-centric perspective, it is not immediately clear why these characters are so often portrayed as having eccentric qualities. Consider, for example, the case of Zeddicus Zu’l Zorander, a “Wizard of the First Order” in Terry Goodkind’s Sword of Truth series who often consulted his spiritual guides while naked outside.

Although this certainly makes for an amusing characteristic of a key character, it is not obvious why it was important, from an archetypal perspective, to have Zedd be so eccentric. Similarly, sage-like characters in other fictional works are eccentric or aloof in their own ways – Haymitch, in The Hunger Games, was an unpleasant drunkard. The Star Wars fan favorite, Yoda, spoke exclusively in backwards riddles. Mister Miyagi of The Karate Kid confused the protagonist with seemingly menial housekeeping tasks, which the young man only later connected to martial arts practice.

The most eccentric characters of this archetype transcend mere wizardry or sagacity and become what are generally referred to as shamans or mystics. Perhaps most famously exemplified by Rafiki in Disney’s The Lion King, these characters typically possess profound wisdom or even magical powers of great utility; however, they are often difficult to approach or understand because of their eccentricities. Moreover, these eccentricities are usually connected, either explicitly or implicitly, to their power, much in the same way as it is commonly said that genius and madness are separated by a fine line.

While it is not immediately apparent why this is the case for shamans and mystics specifically, when the literature on these issues is contextualized within an information-centric paradigm, these “eccentricities” are revealed to be anything but. In fact, the anthropological and psychological consensus is that the shamanic and mystical people within communities are often great assets when socialized correctly, and that there is much to learn from their oblique methodologies.
