Since early 2022, the big buzz in the tech industry and among the general public has been “artificial intelligence.” The concept isn’t new (the term has described how computers play games since at least the 1980s), but it has once again captured the public’s imagination. Before diving in, a brief primer is in order. AI can be broken down into seven broad categories, most of which are hypothetical and do not exist. The type everyone is excited about falls under Limited Memory AI, the category where large language models (LLMs) reside.
Think of LLMs as complex statistical guessing machines. You type in a sentence, and the model outputs whatever its training data suggests statistically aligns with your request. For example, ask ChatGPT 4.0 to solve a simple logic puzzle: “This is a party: {} This is a jumping bean: B The jumping bean wants to go to the party.” It will output, with some word flair, “{B}.” That may seem impressive, but it also exposes the limits of the approach: given a slightly different yet still simple puzzle, ChatGPT produced incorrect answers. AI handles simple, predictable tasks well and falters at anything more nuanced.
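To make the “statistical guessing machine” framing concrete, here is a minimal Python sketch of the mechanism. The vocabulary and probabilities are entirely made up for illustration; a real LLM derives its next-token distribution from billions of learned parameters, not a hard-coded table.

```python
import random

# Toy "model": for a given context string, invented probabilities for the
# next token. These numbers are illustrative only, not from any real model.
NEXT_TOKEN_PROBS = {
    "The jumping bean wants to go to the": {"party": 0.90, "store": 0.07, "moon": 0.03},
    "The jumping bean wants to go to the party": {".": 0.95, "!": 0.05},
}

def sample_next_token(context: str) -> str:
    """Pick the next token by sampling from the model's probability distribution."""
    probs = NEXT_TOKEN_PROBS.get(context, {"<end>": 1.0})
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

context = "The jumping bean wants to go to the"
print(context, sample_next_token(context))  # usually prints "... the party"
```

Note that nothing in this loop reasons about parties or jumping beans; the output is simply whichever token the statistics favor, which is why a small change to the puzzle can send the answer off the rails.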
Despite these advancements, the technology remains ludicrously expensive. OpenAI, the leader in LLMs, is on track to lose $5 billion this year, roughly half of its total capital investment, and its losses grow as it adds customers and as its models improve. Viable applications are also surprisingly scarce: AI implementations such as Air Canada’s customer-service chatbot and AI-assisted legal case preparation have backfired spectacularly. The energy required to operate AI is likewise immense and growing.
The hardware industry, crucial for AI, is nearing the end of its room for advancement, making further development not just more complex but also more expensive. Processor designers have exhausted their speed levers, and single-thread performance peaked in 2015. Raising core counts by shrinking transistors is expected to hit its limit next year. This spells trouble for AI: efficiency gains from hardware will no longer offset costs, so every new customer will require new capacity, continuously driving costs up. Given these factors, a prudent businessman might cut his losses in the AI space, considering its rapidly expanding costs and questionable utility.
Yet AI investments continue to expand. This ongoing investment is a lingering consequence of the long easy-money era, which persists in spirit despite the Federal Reserve’s formal interest rate hikes. The tech industry in particular has reaped the benefits: easy money has flowed for so long that entire industries, tech included, are built around it, and tech companies pour billions into questionable business plans merely because they have a software component. The AI boom shows patterns similar to the WeWork fiasco. Both offer mundane solutions that don’t scale well with the customer base, both are heavily exposed to variable operating costs, and both add an extra layer of expense without significant innovation.
Major tech firms like Google and Microsoft pour resources into AI because, relative to their massive reserves, the costs seem trivial. The fear of missing out outweighs the potential loss. However, even easy money has limits. Estimates suggest a 2025 investment in AI of $200 billion, which isn’t insignificant even for tech giants like Alphabet. Some projections, like global AI revenues reaching $1.3 trillion by 2032, seem far-fetched. Investors may eventually question the sanity of pouring money into AI without corresponding returns. Losses can’t be sustained indefinitely. While big players like Microsoft and Nvidia may weather the storm, lower profits and layoffs in the tech sector seem likely.
Of course, there’s always a chance that AI will prove its worth and actually pull in that projected $1.3 trillion in consumer dollars, succeeding where other tech innovations have failed. Given the patterns and the historical context, however, optimism seems misplaced. The tech industry is in the midst of an easy-money-fueled party, and once the cash flow stops, the landscape will likely be littered with failed startups and substantial layoffs. My proof? The last truly disruptive tech product, the iPhone, turned 17 not long ago. Since then, the industry has chased the next big thing with little success, sustained only by easy money.