The Jurassic Park Principle: In the Age of AI, the Poets Inherit the Earth
The data scientists were so preoccupied with whether they could, they didn’t stop to think if they should.
Welcome to Jurassic Park.
While the tech world races to build the next generation of AI, a critical question looms: are we building a future we truly understand? For decades, the business world has prioritized the "hard" sciences. The heroes of our modern creation story have been the engineers, the data scientists, and the coders—the brilliant minds who, through sheer force of technical will, have built our digital world. Even so-called "soft" disciplines like marketing are now full of formulas and quantitative analyses. And with the explosion of generative AI, the scientists have reached their apex: they have, effectively, achieved a creation of the same magnitude as the dinosaurs of Jurassic Park.
Jeff Goldblum’s iconic line from the film lays out a chaotic, cascading narrative: "God creates dinosaurs; God destroys dinosaurs; God creates man; man destroys God; man creates dinosaurs..." He trails off, but Laura Dern predicts the chilling conclusion: "Dinosaurs eat man." The creators are consumed by their creation.
But the story doesn't end there. It's easy to maintain the illusion of order while looking at the world through a computer screen, but chaos inevitably ensues. When every essential system collapses, the park is overrun, and dinosaurs once again rule the earth. Or the island, at least.
In the film, Dern predicts 'woman inherits the earth,' but I propose a variation for our current moment: the poets inherit the earth.
Man Creates Dinosaurs: AI as an Unpredictable New Life Form
To understand why the "poets" are essential, we must first confront the nature of the "dinosaurs" we've created. Like those dinosaurs, generative AI is a breathtaking achievement of science. It is enormously powerful, and yet its core programming is shaped entirely by human hands.
In a direct parallel to the film's scientists filling gaps in the dinosaur genome with frog DNA, AI creators make critical choices every day as they select and label training data. These interventions, meant to complete the system, instead make the technology fundamentally unpredictable, in effect creating a new species of intelligence whose behavior we cannot fully anticipate. That species can write original narratives, design products, and analyze data at a scale beyond human comprehension.
The STEM fields have given us a new force of nature, and we are, rightly, in awe. We have built the ultimate theme park, with market analysts at Statista projecting the generative AI market will grow at a CAGR of almost 37% over the next six years, and the world is lining up for tickets.
But as with any powerful creation that lacks inherent judgment, the park has security flaws.
Dinosaurs Eat Man: The Limits of a Purely Technical Mindset
The problems with AI are not bugs in the code; they are reflections of the messy, human world that trained it. AI bias, model hallucinations, copyright infringement, and the potential for mass misuse are not technical problems in search of a purely technical solution.
And these are not abstract flaws; they are liabilities that can manifest as reputational damage, legal battles, and catastrophic failures in the wrong hands. In fact, a 2024 KPMG survey revealed that 61% of CEOs are concerned about the ethical challenges of generative AI.
Ultimately, these are problems of history, sociology, ethics, and language: problems that are, at their core, rooted in the liberal arts. While the technical brilliance behind "Can we?" has propelled us forward, the complex questions of "Should we?" often fall outside the scope of a purely technical framework. A system designed to predict the next word in a sequence cannot, by itself, understand the meaning or real-world consequences of those words. When the dinosaurs get loose, the engineers who built the fences are often the first to be surprised by their behavior.
The Poets Inherit the Earth: The New Value of Critical Reasoning
So, who thrives in this new, chaotic, AI-driven ecosystem?
The "poets."
By "poets," I mean the historians, the philosophers, the sociologists, the classicists, the linguists—all those trained in the liberal arts. These are the people equipped with the skills not to build the AI, but to manage it, to provide the judgment it lacks.
Their value is no longer "soft"; it is among the most critical business functions of the 21st century. This isn't just a theory; it's a documented shift in business priorities. In its 2025 Future of Jobs Report, the World Economic Forum identified analytical thinking and creative thinking as the two most important skills for workers today; a majority of companies now consider these distinctly human abilities more crucial than purely technical literacy.
They are the ones who can:
Provide Context: They understand the historical and societal context in which AI operates, allowing them to anticipate second-order effects.
Apply Ethical Frameworks: They are trained to deconstruct complex ethical dilemmas and guide AI development and implementation responsibly.
Communicate with Nuance: They can explain the risks and benefits of AI to diverse stakeholders—customers, regulators, and boards—in clear, human terms.
Think Critically: They can identify the flaws in an argument, question the assumptions behind the data, and reason from first principles—essential when an AI provides a confident but incorrect answer.
In the age of AI, the ability to generate content is becoming a commodity. The ability to exercise judgment, wisdom, and critical reasoning, however, has never been more valuable. The most successful and resilient companies will be those that pair their brilliant engineers with the contextual wisdom of their "poets." They will be the ones who not only create the dinosaurs, but also possess the wisdom to know how to live alongside them.