AI Lowers Learning Costs, Yet Laziness Persists: A Cultural and Technological Analysis


Despite AI's cost-reducing benefits, people remain reluctant to learn, and even shun new tools that could boost their productivity.

Text / Shuhang 2025.1.22

A randomized controlled trial by a World Bank team in Nigeria found that six weeks of AI-assisted after-school tutoring yielded learning gains equivalent to two years of conventional schooling. This outperformed 80% of the developing-country education interventions recorded in a database of randomized controlled trials.

Recent Pew Research data indicates that 26% of American teenagers say they use ChatGPT for schoolwork. Awareness of AI tools varies by income and ethnicity: 13-to-17-year-olds from high-income families are more likely to know about ChatGPT than their low-income peers, and awareness among White, Black, and Hispanic teenagers stands at 83%, 73%, and 74%, respectively.

AI has significantly disrupted traditional teaching methods. Without adjustments to curricula and assessment standards, courses that lean heavily on rote memorization and unassisted recall may no longer yield accurate assessments. A cycle has emerged in which students use large models to complete assignments while teachers rely on "intelligent grading" to mark them.

Peter Kyle, the UK's Minister for Science, Innovation, and Technology, believes that students should not be strictly prohibited from using AI, likening it to past calculators (notably, there remains a disparity in calculator usage in basic education between large and small cities).

In an interview, Sam Altman of OpenAI stated that AI may gradually diminish the importance of "raw intelligence," with the ability to ask the right questions becoming more crucial than finding answers.

Discussions tend to center on general-purpose chat products like ChatGPT, Doubao, Kimi, and ERNIE Bot rather than on developers' painstaking efforts to build purpose-made "learning machines." In fact, it is these plain chat boxes that have most disrupted the old market for educational devices and after-school tutoring.

ChatGPT holds roughly 70% of the global AI chatbot market, with about 200 million weekly active users. In September 2024 it exceeded 3 billion monthly visits, widening its traffic lead over Bing and overtaking TikTok's monthly visits that month. It topped App Store download charts for over a year until recently being displaced by apps such as Xiaohongshu.

ChatGPT Search attracted over 10 million users in its first month. The proportion of American respondents using ChatGPT as their primary search engine increased from 1% in June 2023 to 8% in September 2023, while Google's share fell from 80% to 74%.

These findings may not bode well for AI entrepreneurs specializing in companion software and hardware, as well as AI for specific purposes like teaching. Once an AI companion app acquires customers and offers stable service, revenue generation becomes less of a concern. However, "acquiring customers" has emerged as the primary challenge.

Discussions of "AI companions" likewise feature the top products: ChatGPT, Character.ai, and Replika. As The New York Times reported, even though ChatGPT resets its memory weekly, some users persist in retraining their "amnesiac" virtual companions.

People prefer established, well-known products and show little interest in new names, even if smaller products might better meet their needs.

Other widely used practical applications, often overlooked as typical "AI applications," include voice assistants like Siri and Xiaoai, algorithmic recommendations on streaming media and e-commerce platforms, and spam filters. Their commonality lies in being embedded within existing products rather than standalone entities.

Even corporate clients who can afford dedicated applications now hesitate to commission them. I have observed that among vendors serving both business and consumer customers, most people who actually use the product for work prefer individual memberships over corporate subscriptions.

In 2024, corporate clients for AI products prioritized simplicity and out-of-the-box functionality over customization, with less demand for control than anticipated. Companies are even willing to stick with public chatbots, despite potential security risks, rather than deploying their own, which could be more secure.

This explains why despite the availability of free and open-source alternatives like Meta's Llama and Mistral, OpenAI and Anthropic's revenues surged more than fivefold last year.

Meta says Llama is widely used by startups and enterprises, with over 770 million downloads. However, uptake on hosting platforms such as AWS, where Llama can be deployed, has been mixed; for many companies, simply using ChatGPT is more straightforward.

A year ago, enterprise software providers like Databricks and Snowflake expected customers to fine-tune models to understand their specific business vocabulary for better answers. However, achieving a truly effective model requires cleaning and formatting vast amounts of data. Baris Gultekin, AI Director at Snowflake, told The Information, "We are seeing the market increasingly turning to packaged, ready-to-use solutions."

Both companies have essentially ceased customizing open-source models for users. Retrieval-Augmented Generation (RAG) can enhance model performance to meet specific needs without altering the model itself.
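The RAG pattern mentioned above is straightforward: retrieve the documents most relevant to a query, then prepend them to the prompt, leaving the model's weights untouched. Below is a minimal, illustrative sketch in Python. It uses a toy bag-of-words cosine similarity in place of a real embedding model, and builds a prompt string in place of an actual API call; the documents and function names are invented for illustration.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-frequency vector.
    # A production system would use a learned embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank the document store by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Augment the prompt with retrieved context; the model itself is unchanged.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 14 days of a return request.",
    "Our headquarters are located in Berlin.",
    "Premium support is available on the Enterprise plan.",
]
print(build_prompt("How long do refunds take?", docs))
```

Because all company-specific knowledge lives in the document store rather than in the model, updating the knowledge base is a data operation, not a fine-tuning run, which is precisely why it appeals to buyers of packaged solutions.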

Established enterprise application developers will welcome users' preference for closed-model solutions. This mirrors Microsoft's historical lobbying against Linux server solutions in favor of its own: the aggressively marketed, nominally free Linux alternative ultimately failed to displace Windows servers.

This could also be good news for companies like Salesforce and Oracle, which integrate AI capabilities into existing applications and appropriately raise prices to cover costs, rather than developing new AI products.

The centralization of AI capabilities, rather than decentralization, has sparked another notable trend—concentrated computing places higher demands on infrastructure like computing power and data centers.

UK Prime Minister Starmer has announced significant investments in AI data centers and other infrastructure. Shortly thereafter, Biden signed an executive order before leaving office to open federal lands for data center construction.

Bloomberg analysis suggests that if the global AI arms race of the past two years was defined by access to cutting-edge chips and talent, the next phase may be defined by data centers. After Chinese models such as DeepSeek demonstrated how "cost-effective" computing power can be, the importance of concentrated computing has only grown.

A Dubai-based developer has announced a $20 billion investment in US data center projects, and SoftBank's Masayoshi Son has also expressed plans to invest $100 billion in US AI infrastructure. This is part of a broader strategy by international businesspeople to curry favor with the Trump administration.

In a "blueprint" released this month, OpenAI urged the US government to accept foreign investment and collaborate closely with the private sector to build AI infrastructure, warning that "China will take the lead" otherwise. Some anticipate that Trump may relax regulations on data centers powered by coal or natural gas, despite the serious environmental concerns this raises.
