AI Adoption Still in ‘Very Early’ Stages, BlackRock Exec Says
Investment giant BlackRock expects 2025 to be a big year for infrastructure and cybersecurity.
And the artificial intelligence (AI) boom will play a major role in those investments, Jay Jacobs, BlackRock’s U.S. head of thematic and active ETFs, told CNBC in a report published Saturday (Dec. 7).
“It’s still very early in the AI adoption cycle,” he said.
He added that AI firms need to build out their data centers, and that protecting the data they hold will likely be a wise investment in the new year.
“If you think about your data, you want to spend more on cybersecurity as it gets more valuable,” he said. “We think this is really going to benefit the cybersecurity [and the] software community, which is seeing very rapid revenue growth based off of this AI.”
In addition, Jacobs sees a broader impact when it comes to the infrastructure that supports AI. As “magical” as the tech might be, it’s important to remember the physical things — power, chips, real estate — that make it possible.
“It’s not just something that lives in the ether, in the cloud, there’s real physical things that have to happen, and that means energy, that means more materials like copper, that means more real estate. You really have to think about kind of the physical infrastructure that underlies it.”
And as PYMNTS wrote last week, that infrastructure is costly, as running advanced AI programs requires massive data centers and specialized processors.
For example, Microsoft’s partnership with OpenAI required building several AI supercomputers, each powered by thousands of Nvidia A100 GPUs. These installations eat up significant amounts of power, as training a large language model (LLM) can use the same amount of energy as thousands of households.
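For a rough sense of that scale, the sketch below converts an assumed training-run energy figure into household-equivalents. Both constants are illustrative assumptions chosen for the arithmetic, not figures reported by PYMNTS, Microsoft or OpenAI.

```python
# Back-of-envelope arithmetic behind the "thousands of households" comparison.
# Both constants are illustrative assumptions, not reported figures; actual
# training energy varies widely by model, hardware and data center.
TRAINING_ENERGY_MWH = 1_300       # assumed energy for one large training run
HOUSEHOLD_MONTHLY_MWH = 0.9       # assumed monthly electricity use per household

household_months = TRAINING_ENERGY_MWH / HOUSEHOLD_MONTHLY_MWH
print(f"~{household_months:,.0f} households' worth of electricity for a month")
```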
“This pressure has sparked innovation in software architecture,” that report said. “Google has pioneered various optimization techniques, such as quantization, which reduces the precision of numbers required in calculations while maintaining model performance. Meta achieved efficiency gains with its Llama AI models through architectural improvements, allowing smaller models to perform strongly while using fewer parameters.”
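As a concrete illustration of what quantization means, here is a minimal, self-contained Python sketch of symmetric int8 post-training quantization. It is a generic example of the technique, not Google’s or Meta’s actual implementation; the function names are my own and the weight matrix is a random stand-in for real model weights.

```python
# Minimal sketch of post-training quantization: store weights at lower
# precision (int8) instead of float32, cutting memory while keeping the
# values approximately intact.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric int8 quantization: map floats into the range [-127, 127]."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values from the int8 representation."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)  # stand-in weight matrix
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

print("storage: float32 =", weights.nbytes, "bytes, int8 =", q.nbytes, "bytes")
print("max rounding error:", np.abs(weights - approx).max())
```

Holding the same weights in a quarter of the memory is where the efficiency gains described in the quote come from; the rounding error printed at the end is the accuracy cost that optimization work tries to keep small.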
With this in mind, PYMNTS wrote, major tech companies are racing to shrink their AI systems, marking a major shift in the industry.
“The process, known as AI optimization, involves refining complex software systems to improve their performance while reducing the computing power they need to run,” PYMNTS wrote. “These efficiency improvements can transform challenging economics into sustainable operations for companies that rely on massive computing systems.”