Presented by MongoDB
Generative AI is a critical component of today's applications, and in the future it will live within every application and every piece of software, delivering groundbreaking, personalized experiences. It is also key for the developers and businesses building and delivering world-class applications.
"Generative AI augments and accelerates developer productivity and enhances their experience across the entire development cycle," Gregory Maxson, global lead of AI GTM at MongoDB, told VentureBeat. "AI gives developers so much more time to focus on meaningful innovation, so they can build these cool applications faster."
Recent studies underline the extent of these productivity and experience gains. McKinsey found that developers can handle coding tasks up to twice as fast with gen AI. Writing code and documenting functionality takes half the time, while code refactoring can be done in about two-thirds the time. And that has translated into significant gains in job satisfaction, as developers using generative AI-based tools are more than twice as likely to report overall happiness, fulfillment and a state of flow.
Meanwhile, recent research from GitHub shows that developers using AI-powered application development tools feel 75% more fulfilled in their roles, while 87% said they were more likely to solve high-value problems rather than just spending time on repetitive tasks — which in turn helps companies retain their best talent.
MongoDB is uniquely positioned to deliver this opportunity to organizations of every size and their developers, Maxson added. The company’s multi-cloud platform, MongoDB Atlas, helps developers build and scale applications in one place. Combined with MongoDB’s integrated operational database, Atlas Vector Search can help organizations build applications powered by generative AI faster and with less complexity — while taking advantage of MongoDB’s industry-leading partner integrations.
Subtracting complexity, adding choice and flexibility
There is currently a broad array of frameworks available for developers to work with, depending on the programming language they prefer and how they build their applications. To provide developers with choice and flexibility, MongoDB believes it's important to ensure that all of its features and functionality are easily accessible through these frameworks, meeting developers where they are throughout the development journey and enabling seamless interoperability with the tools they rely on.
“We think it’s important to give developers the flexibility to use whatever framework they like when using MongoDB Atlas, to help them build modern applications more quickly and seamlessly,” Maxson explained.
For example, MongoDB has been investing in partnerships with leading frameworks, including LlamaIndex and LangChain, for over a year. These frameworks work closely with Atlas Vector Search, removing much of the friction involved in building AI-enabled applications by bringing solutions for operational and vector data together onto a single platform.
Both LangChain and LlamaIndex recently launched enterprise products, and MongoDB ensured it was a partner from day one. MongoDB also launched semantic caching with LangChain, enabling developers to build advanced applications even more efficiently. Maxson explained that as the market evolves, so does the nature of MongoDB's existing partnerships. He added, "We are deeply committed to bringing current innovation to our customers, which means constantly evolving our existing partnerships to meet their demands."
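The idea behind semantic caching is simple: if a new prompt is semantically close enough to one the application has already answered, the cached answer can be returned instead of making another LLM call. The sketch below illustrates that concept in plain Python with cosine similarity; it is a toy model of the idea, not the LangChain/MongoDB API, and the `embed` function and `threshold` value are assumptions supplied by the caller.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class SemanticCache:
    """Toy semantic cache: returns a stored answer when a new prompt's
    embedding is close enough to a previously cached prompt's embedding."""

    def __init__(self, embed, threshold=0.95):
        self.embed = embed          # caller-supplied function: text -> vector
        self.threshold = threshold  # minimum similarity to count as a hit
        self.entries = []           # list of (embedding, answer) pairs

    def lookup(self, prompt):
        query = self.embed(prompt)
        best = max(self.entries,
                   key=lambda e: cosine_similarity(query, e[0]),
                   default=None)
        if best and cosine_similarity(query, best[0]) >= self.threshold:
            return best[1]  # cache hit: the LLM call can be skipped
        return None         # cache miss: call the model, then store()

    def store(self, prompt, answer):
        self.entries.append((self.embed(prompt), answer))
```

In a production setting, the embeddings and answers would live in a vector store such as Atlas Vector Search rather than an in-memory list, which is what the LangChain integration provides.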
Partnerships in action
Integrating Vector Search with LangChain and LlamaIndex lets developers access and manage LLMs from a wide variety of MongoDB partners, such as AWS, Google Cloud and Microsoft Azure, or model providers like OpenAI, Cohere and Anthropic, to generate vector embeddings and build AI-powered applications on MongoDB Atlas.
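In practice, this workflow means embedding a user's query with a model provider and then querying Atlas with the `$vectorSearch` aggregation stage. The sketch below builds such a pipeline; the index name `vector_index` and field name `embedding` are placeholder assumptions, and actually executing the pipeline requires a live Atlas cluster with a configured vector index.

```python
def build_vector_search_pipeline(query_vector, index="vector_index",
                                 path="embedding", limit=5):
    """Build an Atlas $vectorSearch aggregation pipeline.

    `index` and `path` are hypothetical names for the Atlas vector
    index and the document field that stores the embeddings.
    """
    return [
        {
            "$vectorSearch": {
                "index": index,               # name of the Atlas vector index
                "path": path,                 # document field holding embeddings
                "queryVector": query_vector,  # embedding of the user's query
                "numCandidates": limit * 20,  # candidates scanned before ranking
                "limit": limit,               # number of results to return
            }
        },
        # Keep only the text and the relevance score for downstream use.
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

# Against a live cluster this would run as, roughly:
#   results = db.docs.aggregate(build_vector_search_pipeline(query_embedding))
```

The `queryVector` would come from whichever embedding provider the application uses, which is where the provider-agnostic framework integrations come in.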
“LlamaCloud enables businesses to orchestrate production LLM and RAG workflows over their unstructured, semi-structured and structured data sources with MongoDB Atlas Vector Search as a key storage component for this data,” said co-founder and CEO of LlamaIndex, Jerry Liu.
“Our collaboration will ultimately enable developers to more efficiently and quickly translate the fundamental reasoning capabilities of LLMs into world-class, innovative applications and overall business impact,” he explained.
The rapid pace of change creates significant lock-in risk when building solutions around specific model providers or architectures, added Erick Friis, founding engineer at LangChain. “Our partnership with MongoDB offers our users an exceptional solution that adds vector store capabilities to a trusted NoSQL database to support virtually all advanced retrieval methods. We’re pleased to see an increasing number of users choose MongoDB for its flexibility and advanced feature set, enabling strategies ranging from LLM-based query analysis to traditional filtering systems. This allows for precise document retrieval based on content similarity as well as specific metadata criteria, such as source website or original file types.”
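Combining content similarity with metadata criteria, as Friis describes, maps onto the optional `filter` clause of a `$vectorSearch` stage. The sketch below shows the shape of such a stage; the field names `source` and `file_type`, the index name, and the query vector are all hypothetical, and in Atlas the filtered fields would need to be indexed as filter fields.

```python
# A $vectorSearch stage combining semantic similarity with a metadata
# pre-filter. All concrete names and values here are stand-ins.
stage = {
    "$vectorSearch": {
        "index": "vector_index",             # placeholder index name
        "path": "embedding",                 # field holding the embedding
        "queryVector": [0.12, -0.07, 0.33],  # stand-in query embedding
        "numCandidates": 100,                # candidates scanned before ranking
        "limit": 10,                         # number of results returned
        "filter": {                          # only consider matching documents
            "source": {"$eq": "docs.example.com"},
            "file_type": {"$in": ["pdf", "html"]},
        },
    }
}
```

The filter narrows the candidate set before similarity ranking, which is what enables retrieval "based on content similarity as well as specific metadata criteria."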
MongoDB is already seeing this strategy succeed. Maxson told VentureBeat that two emerging AI frameworks recently built their own integrations with MongoDB from day one. In other words, MongoDB is seen as so critical that framework developers prioritize supporting it from the ground up.
The impact of partnerships on the future of AI
As generative AI becomes top of mind for every organization, Maxson said, the need to partner with both the major cloud providers and up-and-coming AI startups has only become more apparent and necessary.
“Deeply integrated, seamless experiences and strong relationships with the cloud providers, the model providers, the frameworks and the other tools in a developer’s stack give a developer a complete end-to-end seamless solution instead of a bunch of moving parts that they have to think about connecting,” he explained.
MongoDB has extensive relationships with the three major cloud providers — AWS, Microsoft Azure and Google Cloud — and continues to deepen those partnerships as the market evolves. Recently, the company has partnered with the cloud providers on training models for natural-language code completion and generation, and has developed integrations with their managed LLM services, including deeper integrations across Amazon Bedrock, Google Cloud Vertex AI and Microsoft Azure OpenAI. For example, MongoDB recently announced the general availability of MongoDB Atlas Search Nodes on both AWS and Google Cloud, which provide dedicated infrastructure to scale generative AI and relevance-based search workloads with up to 60 percent faster query times.
In addition, MongoDB is working with AI trailblazers like Cohere, Codeium, Patronus, Together AI, Fireworks AI and Voxel51, and the company expects its list of AI partners to only grow.
“We believe that capturing that AI opportunity will not only require a collective effort from MongoDB and our current partner ecosystem, but also an investment in new partners to bring all the cutting-edge innovation happening in the AI space to our customers through seamless experiences,” said Maxson.
MongoDB is also helping customers keep up with the ever-changing world of AI by tracking models and providers on their behalf. By working closely with its AI partners, it's able to offer customers a complete, integrated MongoDB solution.
“We want to bring our customers a broad ecosystem of offerings, showing our customers that no matter what other generative AI tools you want to use in your stack, MongoDB runs with anyone,” Maxson said. “That’s our goal, to ensure that no matter where developers are in their journey, no matter what tools they use, we work well with them.”
Developer response and innovative use cases
MongoDB is already seeing its platform resonate not only with larger, more mature companies integrating AI into their operations, applications and workflows, but also with innovative, well-financed startups like Hugging Face, Tekion, OneAI and Neuro, which are delivering the next wave of AI-powered applications.
Use cases include real-time patient diagnostics for personalized medicine from Inovaare; VISO TRUST’s cyber threat data analysis for risk mitigation; CetoAI’s predictive maintenance for maritime fleets — and more.
UKG, a human capital and workforce management technology company with more than 80,000 customers across the globe, leverages MongoDB Atlas Vector Search for AI-powered assistants because of its minimal added architectural complexity, flexibility to handle rapidly changing data as applications evolve and ability to scale to handle large workloads.
There's also ENI, one of Europe's largest energy companies, which has terabytes of geospatial data spanning several countries and had struggled with its relational database's inability to create comprehensive data sets and run complex queries. ENI turned to MongoDB Atlas to make working with unstructured data easier and more accessible. It also developed a chatbot-based document management platform so that its scientists and business owners can take full advantage of generative AI apps with Atlas Vector Search to retrieve exactly the data they need.
“A year ago, there were some questions — will generative AI live up to the hype? Now everyone from the startups to the largest enterprises in the world is recognizing that they need to incorporate gen AI into their operations and applications to stay relevant,” said Maxson. “Developers have appreciated the continued investment and innovation that MongoDB has provided to enable them to build these experiences with reduced complexities.”
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact