02/05 2026
When I first started working in tech media, technicians were usually very considerate to us writers during interviews. When discussing technical details, they would habitually add, 'Let me put it simply.' This instinctive simplification was like a mirror reflecting the marginal position of a liberal arts background in the tech field.
This sense of marginalization is even more pronounced for those working in tech PR within tech companies.
Among the complaints from friends in corporate PR, one word frequently comes up: frustration. Their skills in communication, creativity, and interpersonal interaction, honed through liberal arts education, are often relegated to auxiliary roles in tech-dominated organizations. Internally, they must accommodate and align with R&D rhythms, lacking core involvement and influence. Externally, they must maintain media relationships and clearly explain complex technical concepts, navigating a difficult balancing act.
Now, the winds are quietly shifting. A peculiar fascination with liberal arts is spreading among STEM groups.
In industry chat groups, coders joke that the old mantra 'talk is cheap, show me the code' has flipped into 'code is cheap, show me the talk.' Knowing how to communicate with AI and spin a story is becoming more important than knowing how to code.
Scrolling through social media, I saw a programmer friend who works on algorithms at a major company remark that liberal arts students who clear the basic IT threshold often hold the stronger advantage in the long run.
It seems that, overnight, the entire STEM community has rediscovered the value of the liberal arts.
Historian Ge Zhaoguang once observed that the humanities, in China and abroad alike, always seem to be on the brink, and that their self-defense has been rehearsed so many times over the years that even its soundest truths have worn into clichés.
However, when these 'clichés' of self-defense come from STEM professionals, they take on a humorous and thought-provoking tone.
STEM's shift in attitude toward the liberal arts, from dismissal to admiration, essentially reflects AI's sweeping takeover of foundational roles of every kind. Humanities and sciences alike are being submerged by technology, and STEM is reaching for humanistic strengths to cover what its technical training lacks and to avoid being replaced by AI. Unfortunately, the liberal arts can neither serve as a lifeline for STEM nor mount a counterattack against AI.
November 2025, Beijing 798 Art District, Fuguang Hall.
I attended a tech launch event in this space imbued with humanistic aesthetics. A traditional B2B company announced its entry into an AI-driven new phase, launching a consumer-facing product: an emotional companion AI. Endowing the AI with a warm personality requires a touch of imagination. A journalist at the event asked, 'Can STEM employees handle tasks that are typically the forte of artistic and creative individuals?'
The company's executive smiled and revealed that the two team members who excelled in debugging the emotional companion AI were, in fact, liberal arts graduates.
He even suggested that assigning artistic and creative individuals as AI product managers while letting engineers focus on their technical roles might be a more suitable division of labor.
The rising importance of liberal arts skills is closely tied to the reality of companies' large-scale adoption of large models.
From 2023 to 2025, corporate AI transformation has accelerated. QuantumBlack's 'The State of AI in 2025: Agents, Innovation, and Transformation' surveyed nearly 2,000 respondents across 105 countries, spanning various industries, company sizes, professional fields, and experience levels. It found that AI usage has increased in nearly all sectors.
During their AI transformation, companies, in addition to technical R&D roles, have created a significant number of new positions explicitly requiring humanities and social science skills. Several of these new roles seem well-suited for liberal arts graduates:
1. **AI Storytellers**: Common in gaming, AI companions, and short-video industries. For example, when an IT company builds an emotional companion AI, this role involves creating complete character backstories, dialogue logic, and narrative arcs. These positions favor individuals with creative abilities, such as those with backgrounds in art and design, psychology, or experience in amusement park creative planning or public opinion analysis.
2. **AI Annotators**: Large language models require high-quality corpora, necessitating human 'language teachers' to help AI understand the subtext in human language. Annotators do more than just tag data; they need sociological and linguistic literacy to grasp emotional nuances and cultural context in texts.
3. **AI Ethicists**: These professionals conduct moral reviews for AI to prevent it from making offensive remarks toward vulnerable groups or assisting in unethical activities like writing extortion letters. Companies like Google and Anthropic have introduced such roles, requiring sensitivity to philosophy, law, and sociology.
4. **AI Art Designers**: Responsible for refining the aesthetic quality of AI-generated content and ensuring artistic compliance. OpenAI's 'red team' for the Sora model falls into this category.
The preference for humanities and social science skills in these roles stems from AI's technical limitations aligning with the strengths of liberal arts students. This has fueled the narrative of a 'liberal arts comeback,' making it one of the hottest topics in the tech industry.
Tech influencers like Zhou Hongyi have argued that liberal arts students may have an edge in the AI era. Li Feifei, director of the Stanford AI Lab, has called for every top AI lab to include anthropologists, sociologists, and psychologists. There's even a rumor that a college dropout with an art background built crucial AI infrastructure.
For decades, the liberal arts were seen as a burden to technological progress, steadily declining. For instance, Europe's lag in AI development is often attributed to its strict adherence to ethical principles like data security and privacy protection, which are perceived as humanistic constraints hindering technological innovation.
Why, then, are the liberal arts now being sought after by technologists, seemingly gaining an edge over STEM?
At the recent Davos World Economic Forum, Elon Musk predicted that by 2030, AI's intelligence would surpass that of all humanity combined. Only 2,000 days remain for the old world.
Under such dire predictions, the liberal arts are hailed as humanity's last bastion, capable of launching a counterattack against AI with humanistic spirit.
Yet the reality is that the liberal arts, submerged by AI, cannot construct a fortress.
The rising tide of AI's capabilities is quietly encroaching, eroding career opportunities for liberal arts students inch by inch.
Those at the bottom are submerged first.
In cheap apartments in Kenya and Uganda, young people trained in literature, education, and publishing click through options like 'anger,' 'joy,' or 'neutrality' on their screens. They are AI annotators hired by Meta through outsourcing firms like Appen, tasked with teaching algorithms to understand sarcasm, slang, and cultural subtleties in human language.
Their literary skills fuel large language models. This data-labor job has nothing to do with metaphysics, let alone humanistic value.
By 2025, AI's tide has lapped at white-collar desks.
While Selina and Evon from Beijing and Shanghai were celebrating the Lunar New Year back home, DeepSeek-R1 trended on social media. Overnight, roles in brand copywriting, PR planning, and government secretarial work—once considered respectable careers for liberal arts graduates—were collectively disrupted.
These clerical tasks demand linguistic precision and format compliance but lack creative barriers, making them prime targets for AI.
'My world has collapsed. What was the point of my 20+ years of effort?' a distraught writer posted on social media.
AI scientists and tech leaders agree that liberal arts skills are important. But the importance of liberal arts skills does not translate into the importance of liberal arts graduates, and that is the harsh reality facing the 99% of ordinary liberal arts students.
What about the remaining 1%?
In April 2024, when OpenAI's video generation model Sora made its entrance into the film industry, rumors swirled about 'AI replacing directors' and 'AIGC coming for extras first.' OpenAI hurriedly recruited human artists to form a 'red team' to refine AI-generated content. For a moment, it seemed artists could coexist with AI.
Ironically, these artists later collectively sued OpenAI, claiming their works were used to train models without authorization. Even high-level liberal arts skills were exploited and discarded by AI.
Liberal arts students are being submerged by AI. So what exactly are the 'liberal arts skills' that tech companies covet?
No one can clearly say.
An AI companion company once hired a veteran game world architect to build the story engine for their emotional companion AI. While his writing skills were undeniable, the actual performance fell short.
The core issue was that beyond creative writing, the role required integrating AI's technical strengths and weaknesses, translating creativity into practical steps like prompt engineering. As a product manager, one must independently complete the loop from idea to execution. Relying solely on humanistic literacy is insufficient.
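What 'translating creativity into prompt engineering' means in practice can be made concrete with a minimal, hypothetical sketch in Python. The character name, persona fields, and model are illustrative assumptions rather than any company's actual setup; the call uses the standard openai SDK's chat-completions interface. The point is simply that a backstory only becomes a product once it is encoded as a structured system prompt and wired into code.

```python
# Hypothetical sketch: turning a writer's character backstory into a system prompt
# for an emotional-companion chatbot. Names, fields, and the model are illustrative.
from openai import OpenAI  # assumes the official openai Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

persona = {
    "name": "Xiaonuan",  # hypothetical companion character
    "backstory": "A former late-night radio host who listens more than she talks.",
    "tone": "warm, unhurried, never clinical",
    "boundaries": "no medical or legal advice; encourage seeking professional help when needed",
}

# The 'creative' material becomes a structured instruction the model can follow.
system_prompt = (
    f"You are {persona['name']}, an emotional companion. "
    f"Backstory: {persona['backstory']} "
    f"Speak in a tone that is {persona['tone']}. "
    f"Hard boundaries: {persona['boundaries']}."
)

def reply(user_message: str) -> str:
    """Return one in-character response to the user's message."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        temperature=0.8,  # a higher temperature keeps replies from sounding scripted
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(reply("I had a rough day at work and I don't really know why."))
```

A writer can draft the backstory, but turning it into a loop like this, and iterating when the replies drift out of character, is where the product-manager half of the job begins.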
Whether these new opportunities and roles at the intersection of humanities and technology are viable or sustainable remains uncertain. They cannot serve as an outlet for liberal arts graduates.
The technological tide rises silently, drowning the silent majority. The notion of a liberal arts counterattack is merely a well-intentioned illusion.
When the humanities must prove their utility to AI to gain recognition and survive in the technological age, their intrinsic value is already lost.
Some scholars argue that the humanities flourished from the 16th to the 18th centuries because their practitioners critically examined and gave voice to questions that people wanted to raise but could not yet articulate. In that era, the humanities shaped the public and, in doing so, established their disciplinary value.
Today, when the liberal arts must prove their worth to AI, they have lost their most precious transcendence.
At this moment of humanistic self-dissolution, STEM suddenly begins to call for humanistic values and liberal arts education. Why?
Because AI's tide has finally breached their defenses, reaching the gates of the technical community.
Silicon Valley is experiencing its most intense wave of tech layoffs. Google, Meta, and Amazon have cut over 200,000 jobs in three years, mostly affecting junior programmers and test engineers. In India, the once-thriving code outsourcing industry, which relied on cheap labor, is now losing ground to AI. For $200 a month, large models can accomplish what an entire team did in a week.
One company even posted a radical tweet announcing the 'last line of human-written code.'
This survival crisis has forced STEM professionals to reflect: When programming becomes automated and human craftsmanship holds no cost advantage over AI, how can they remain relevant?
They are turning to the liberal arts, seeking humanistic skills that AI cannot yet replicate, such as empathy, creativity, and critical thinking. This fascination with the liberal arts is STEM's self-redemption amid crisis.
In 1952, Kurt Vonnegut predicted in *Player Piano* that 'machines will first replace physical labor, then routine work, and finally, perhaps, genuine intellectual labor.'
The process may differ, but the outcome seems to align: AI does not favor liberal arts or STEM; it is an equalizer, submerging every discipline.
Liberal arts lie on one shore, STEM on the other, with a clear divide—a legacy of the 19th-century industrial era. This division efficiently cultivated specialists for industrialization but appears rigid and fragile in the face of AI's onslaught.
Traditional disciplines are now jumping into the water, learning to navigate the new era.
One day in 2021, Professor Lu Peng from Lanzhou University's School of Sociology sent a message to an AI company's ecosystem lead: 'Your AI training programs focus heavily on computer science teachers. Why not consider our social science faculty, a larger group?'
Their discussion revealed that many areas of social science, such as sociological surveys and data analysis, could benefit from AI. Professor Lu proposed: 'Why not offer AI training for social science teachers?'
Thus, 'The First AI Course for Social Science Teachers' quietly began.
On graduation day, Professor Lu said, 'The goal isn't to turn you into coders but to integrate social contexts and knowledge into code.'
Meanwhile, STEM is quietly becoming more 'humanistic.'
In 2013, Rhode Island School of Design President John Maeda launched the STEAM Core Group, adding Art and Design (the 'A') to STEM (Science, Technology, Engineering, Mathematics). He argued that we must move beyond the binary view of STEM versus liberal arts education.
The public school system in Andover, Massachusetts, prioritized STEAM education, while DeSoto West Middle School in Texas opened the iSTEAM3D Magnet Academy, where students design cities in *Minecraft*, learn planning, use 3D printers for prototyping, and study chemistry through protein-folding games like *Foldit*, contributing new ideas to scientific research.
More radically, some initiatives bypass universities entirely.
Palantir, an AI company specializing in big data analytics, directly recruits high school students, training them in high-intensity liberal arts programs to cultivate value judgment.
These efforts point to a single truth: the disciplinary framework inherited from the 19th-century industrial era can no longer answer 21st-century questions. Change is inevitable.
Yet change is slow. The journey from popularizing ideas to reaching consensus, and then to institutional arrangements and practical implementation, is long.
The late recognition of liberal arts value and the slow progress of integrated education may never catch up to AI's evolution.
As we begin to reexamine the meaning of humanities, AI's iteration has already accelerated.
In January 2026, Anthropic CEO Dario Amodei published a 20,000-word essay, *The Adolescence of Technology*, arguing that 2027 could mark AI's 'coming of age' and the first window for systemic societal disruption.
He emphasized that the liberal arts and humanistic literacy will remain crucial even as AI's capabilities grow exponentially, ending its 'adolescence' and reaching human-level or superhuman general intelligence. Yet, as we have seen, neither liberal arts students nor the liberal arts themselves can lead a counterattack. The time left for the liberal arts to reinvent themselves and for the humanities and technology to integrate may be Musk's 2,000 days or Amodei's year or two. Either way, it is running out.
Expecting a liberal arts counterattack may already be too late, but it is not meaningless. The hope for humanistic value stems from an eternal impulse: Through liberal arts education, we can understand who we are, what we want, and why we matter. Ultimately, we should define our own value.