A new survey from cnvrg.io, an Intel company, reveals that enterprise adoption of artificial intelligence (AI) solutions remains low, even as generative AI grabs headlines. Despite the buzz, the pathway to production remains fraught with hurdles, from infrastructure constraints to skill gaps.
The 2023 ML Insider survey, now in its third year, draws insights from a global panel of data scientists and AI professionals, underscoring a cautious approach to generative AI adoption. The survey indicates that only 10% of organizations have successfully launched gen AI solutions to production, a figure that starkly contrasts with the high hopes and expectations for the technology.
Financial services, banking, defense and insurance emerge as the leaders in AI adoption, drawn by the promise of enhanced efficiency and improved customer experiences. On the opposite end, education, automotive and telecommunications appear hesitant, with AI initiatives still in their infancy.
“The survey suggests organizations may be hesitant to adopt gen AI due to the barriers they face when implementing LLMs,” said Markus Flierl, corporate VP for the developer cloud at Intel. “With greater access to cost-effective infrastructure and services, such as those provided by cnvrg.io and the Intel Developer Cloud, we expect greater adoption in the next year as it will be easier to fine-tune, customize and deploy existing LLMs without requiring AI talent to manage the complexity.”
Additional findings from the survey include:
- 46% cited infrastructure as the top barrier to deploying large language models that power generative AI. Compute-intensive models strain IT resources.
- 84% of respondents said they need to improve their skills to support the growing interest in language models, and just 19% felt fully proficient in understanding how the models generate content.
- Top AI use cases were chatbots and translation, likely reflecting generative AI advances in 2023. However, only 25% of organizations had deployed any generative models to production.
- 58% of organizations have low AI integration, running five or fewer models, a share that has not grown substantially since 2022. Larger companies are more likely to run 50 or more models.
- 62% still rate executing successful AI projects as difficult. The larger the company, the harder it is to deploy AI.
The findings indicate that despite the buzz generated by tools like ChatGPT, enterprise adoption faces real challenges. Companies are still experimenting with generative AI rather than fully integrating it into operations, as factors like skills, regulation, reliability and infrastructure create hurdles to scaling AI rapidly.
“The 2023 ML Insider Survey shows that a majority of AI developers say lack of technical skills is slowing down their organization’s adoption of ML and Large Language Models, which creates pressure in a business world racing to implement gen AI capabilities,” said Tony Mongkolsmai, software architect and technical evangelist at Intel. “As an industry, we need to do everything we can to remove complexity and simplify tasks to make it easier for developers.”
To see additional insights, check out the full ML Insider 2023 report on the company website.