
Advice for deploying AI in production environments

Artificial intelligence (AI) is steadily making its way into the enterprise mainstream, but significant challenges remain in getting it to a place where it can make a meaningful contribution to the operating model. Until that happens, the technology risks losing its cachet as an economic game-changer, which could stifle adoption and leave organizations with no clear way forward in the digital economy.

This is why issues surrounding AI deployment have taken center stage this year. Getting any technology from the lab to production is never easy, but AI can be particularly problematic considering it offers such a wide range of possible outcomes for every problem it is directed to solve. This means organizations must proceed both carefully and quickly so as not to fall behind the curve in an increasingly competitive landscape.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2792106,"post_type":"story","post_chan":"ai","tags":"category-business-industrial,category-computers-electronics,category-science-computer-science","ai":true,"category":"ai","all_categories":"ai,","session":"D"}']

Steady progress deploying AI into production

According to IDC, 31 percent of IT decision-makers say they have pushed AI into production, but only a third of that group considers their deployments to be at a mature stage. IDC defines maturity as the point at which AI starts to benefit enterprise-wide business models by improving customer satisfaction, automating decision-making or streamlining processes.

As can be expected, dealing with data and infrastructure at the scale AI requires to deliver real value remains one of the biggest hurdles. Building and maintaining data infrastructure at this scale is no easy feat, even in the cloud. Equally difficult is properly conditioning data to weed out bias, duplication and other factors that can skew results. While many organizations are taking advantage of pre-trained, off-the-shelf AI platforms that can be deployed relatively quickly, these tend to be less adaptable and more difficult to integrate into legacy workflows.
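To make that conditioning step concrete, the snippet below is a minimal sketch of an early data-cleaning pass over a tabular training set using pandas. The file and column names are hypothetical placeholders, not a prescription from IDC or any vendor.

```python
# Minimal sketch of a pre-training data-conditioning pass: deduplicate records,
# drop rows with missing labels, and surface class imbalance before the data
# reaches the model. File and column names ("customer_id", "label") are
# hypothetical placeholders.

import pandas as pd

df = pd.read_csv("training_data.csv")  # assumed source file

before = len(df)
df = df.drop_duplicates(subset=["customer_id"])  # duplicates can overweight some customers
df = df.dropna(subset=["label"])                 # unlabeled rows can't be used for supervised training
print(f"Dropped {before - len(df)} duplicate or unlabeled rows")

# A heavily skewed label distribution is one common source of biased outcomes;
# flag it here rather than discovering it after deployment.
label_share = df["label"].value_counts(normalize=True)
print(label_share)
if label_share.min() < 0.10:
    print("Warning: minority class under 10% of records -- consider rebalancing or reweighting")
```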


Scale is not just a matter of size, however, but coordination as well. Sumanth Vakada, founder and CEO of Qualetics Data Machines, says that while infrastructure and lack of dedicated resources are key inhibitors to scale, so are issues like the siloed architectures and isolated work cultures that still exist in many organizations. These tend to keep critical data from reaching AI models, which leads to inaccurate outcomes. And few organizations have given much thought to enterprise-wide governance, which not only helps to harness AI toward common goals but also provides crucial support to functions like security and compliance.

The case for on-premises AI infrastructure

While it might be tempting to leverage the cloud to provide the infrastructure for large-scale AI deployments, a recent white paper by Supermicro and Nvidia pushes back against that notion, at least in part. The companies argue that on-premises infrastructure is a better fit under certain circumstances, namely:

  • When applications require sensitive or proprietary data
  • When infrastructure can also be leveraged for other data-heavy applications, like VDI
  • When data loads start to push cloud costs to unsustainable levels
  • When specific hardware configurations are not available in the cloud or adequate performance cannot be assured
  • When enterprise-grade support is required to supplement in-house staff and expertise

Clearly, an on-premises strategy only works if the infrastructure itself falls within a reasonable price structure and physical footprint. But when the need for direct control exists, an on-prem deployment can be designed along the same ROI factors as any third-party solution.
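As a rough illustration of that ROI comparison, the back-of-the-envelope sketch below weighs amortized on-premises costs against sustained cloud usage. Every figure is a hypothetical placeholder to be replaced with real quotes and utilization estimates; it is not drawn from the Supermicro/Nvidia paper.

```python
# Back-of-the-envelope comparison of cloud vs. on-premises cost for a sustained
# AI workload. All figures are hypothetical placeholders -- substitute your own
# quotes and utilization estimates.

CLOUD_GPU_HOUR = 3.00          # $/hour for a comparable cloud GPU instance (assumed)
GPU_HOURS_PER_MONTH = 4000     # sustained utilization across the team (assumed)
CLOUD_EGRESS_PER_MONTH = 1500  # data transfer and storage charges (assumed)

ONPREM_CAPEX = 250_000         # servers, GPUs, networking (assumed)
ONPREM_OPEX_PER_MONTH = 4000   # power, cooling, support contracts, admin time (assumed)
AMORTIZATION_MONTHS = 36       # planned hardware lifetime

cloud_monthly = CLOUD_GPU_HOUR * GPU_HOURS_PER_MONTH + CLOUD_EGRESS_PER_MONTH
onprem_monthly = ONPREM_CAPEX / AMORTIZATION_MONTHS + ONPREM_OPEX_PER_MONTH

print(f"Cloud:   ${cloud_monthly:,.0f}/month")
print(f"On-prem: ${onprem_monthly:,.0f}/month (capex amortized over {AMORTIZATION_MONTHS} months)")

# Months of sustained usage before the on-prem capex is recovered relative to cloud.
savings_per_month = cloud_monthly - ONPREM_OPEX_PER_MONTH
if savings_per_month > 0:
    breakeven = ONPREM_CAPEX / savings_per_month
    print(f"Break-even on capex after ~{breakeven:.0f} months of this workload")
else:
    print("At this utilization, cloud remains cheaper")
```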

Still, in terms of both scale and operational proficiency, it seems that many organizations have put the AI cart before the horse – that is, they want to garner the benefits of AI without investing in the proper means of support.

Jeff Boudier, head of product and growth at AI language developer Hugging Face, noted to VB recently that without proper backing for data science teams, it becomes extremely difficult to effectively version and share AI models, code and datasets. This, in turn, adds to the workload of project managers as they strive to implement these elements in production environments, and it contributes to disillusionment with the technology, which is supposed to make work easier, not harder.

Many organizations, in fact, are still trying to force AI into the pre-collaboration, pre-version-control era of traditional software development rather than use it as an opportunity to create a modern MLOps environment. Like any technology, AI is only as effective as its weakest link, so if development and training are not adequately supported, the entire initiative could falter.
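As one illustration of what that kind of tooling can look like, the sketch below versions and shares a trained model through the Hugging Face Hub client so that a deployment can pin an explicit revision. The repository name, file names and authentication setup are hypothetical, and this is simply one possible approach rather than a workflow recommended by Boudier or Hugging Face.

```python
# Minimal sketch: version and share a trained model so that production
# deployments pin an explicit revision instead of "whatever is on the shared
# drive". Assumes `pip install huggingface_hub` and a write token configured
# via `huggingface-cli login`; the repo and file names below are hypothetical.

from huggingface_hub import HfApi, create_repo, hf_hub_download

REPO_ID = "acme-ml/churn-classifier"  # hypothetical org/model name

api = HfApi()
create_repo(REPO_ID, repo_type="model", private=True, exist_ok=True)

# Each upload becomes a commit on the Hub, so the weights, config and training
# metadata stay versioned together.
api.upload_folder(
    folder_path="./release_candidate",  # assumed to contain weights, config, metrics
    repo_id=REPO_ID,
    commit_message="Release candidate: training run 2024-06-01 (example)",
)

# Downstream, an inference service can pin an exact revision (a commit hash,
# tag or branch) rather than silently tracking the latest upload.
weights_path = hf_hub_download(
    repo_id=REPO_ID,
    filename="model.safetensors",
    revision="main",  # replace with a specific commit hash for reproducible deploys
)
print("Pinned model weights at", weights_path)
```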

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2792106,"post_type":"story","post_chan":"ai","tags":"category-business-industrial,category-computers-electronics,category-science-computer-science","ai":true,"category":"ai","all_categories":"ai,","session":"D"}']

Deploying AI into real-world environments is probably the most crucial stage of its evolution, because this is where it will finally prove itself to be a boon or a bane to the business model. It may take a decade or more to fully assess its worth, but for the moment at least, there is more risk in implementing AI and failing than in holding back and risking being outplayed by increasingly intelligent competitors going forward.
