AI Data Center Demand Is Grossly Underestimated. Takeaways from Gen AI Leaders

Tech industry leaders from Google, Salesforce, Adobe, Anthropic, CoreWeave and Amazon Web Services met at Bloomberg Intelligence’s annual AI summit at Bloomberg’s New York City headquarters to discuss advancements in generative AI, such as the rise of autonomous agents and the soaring demand for data centers.

Enterprise spending on gen AI is expected to reach an eye-popping $151.1 billion by 2027, per eMarketer, and the technology is advancing at a breakneck pace. That speed is spurring tech and marketing leaders to debate topics such as large language models (LLMs) for emerging modalities, including video integration, as well as copyright risks in text-to-video tools.

Here are some of the key takeaways for marketers.

AI use cases are only just emerging

Neerav Kingsland, head of global accounts at Anthropic, which snagged a $2.75 billion investment from Amazon last week, said tangible enterprise applications are only just beginning: LexisNexis is using Anthropic’s LLMs on Amazon Bedrock for legal analysis.

Anthropic’s suite of LLMs, Claude, debuted in March last year.

“By April, we had Fortune 100 companies knocking on our door,” said Kingsland. “But it was all proof of concept. Nothing was really getting deployed in 2023.”

Brian Venturo, co-founder and chief strategy officer at Nvidia-backed CoreWeave, said that consumer LLMs are currently in a “novelty phase,” and that it will take approximately five years to fully understand their implications for companies.

“Nobody really knows how this is going to permeate our daily lives,” he said.

Demand for data centers is grossly underestimated

CoreWeave’s Venturo said that the daily data center requests he receives are “absurd,” with some clients seeking entire campuses exclusively for their use.

“The market is moving a lot faster than supply chains that have historically supported a very physical business have been set up to do,” he said, predicting more mega campuses, which will stress power grids and potentially lead to political tension.

LLMs depend on the quality of the data, chips, and algorithms they are trained on, and that training takes place in data centers around the world, some powered by cleaner energy than others depending on their location.

The demand for AI, coupled with limited land availability, has pushed U.S. data center prices up by 19%, according to real estate firm JLL. Nvidia CEO Jensen Huang estimates that $250 billion will be spent annually on data center equipment.
