MongoDB today announced at AWS re:Invent 2023 plans to integrate MongoDB Atlas Vector Search with Amazon Bedrock, enabling organizations to build next-generation applications on Amazon Web Services (AWS) and its industry-leading cloud infrastructure. MongoDB Atlas Vector Search uses an organization’s operational data to simplify adding generative AI and semantic search capabilities to applications, delivering highly engaging and customized end-user experiences.
This integration will make it easier for developers to create applications on AWS that use generative AI to complete complex tasks for a wide range of use cases and deliver up-to-date responses based on proprietary data processed by MongoDB Atlas Vector Search. To learn more about building AI-powered applications on MongoDB Atlas, visit mongodb.com/use-cases/artificial-intelligence.
“Customers of all sizes from startups to enterprises tell us they want to take advantage of generative AI to build next-generation applications and future-proof their businesses. However, many customers are concerned about ensuring the accuracy of the outputs from AI-powered systems while protecting their proprietary data,” said Sahir Azam, Chief Product Officer at MongoDB. “With the integration of MongoDB Atlas Vector Search with Amazon Bedrock, we’re making it easier for our joint AWS customers to use a variety of foundation models hosted in their AWS environments to build generative AI applications that can securely use their proprietary data to improve accuracy and provide enhanced end-user experiences.”
Amazon Bedrock is a fully managed service from AWS that offers a choice of high-performing foundation models (FMs) via a single API, along with a broad set of capabilities to build generative AI applications with security and privacy.
This new integration with Amazon Bedrock allows organizations to quickly and easily deploy generative AI applications on AWS that can act on data processed by MongoDB Atlas Vector Search and deliver more accurate and relevant responses. Unlike add-on solutions that only store vector data, MongoDB Atlas Vector Search powers generative AI applications by functioning as a highly performant and scalable vector database, with the added benefit of being integrated with a globally distributed operational database that can store and process all of an organization’s data.
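To illustrate that combined model, the sketch below (Python, using the pymongo driver) stores a product document whose operational fields and vector embedding live in the same Atlas collection. The connection string, database, collection, and field names, the 1536-dimension embedding size, and the index definition are assumptions made for this example, not details from the announcement.

```python
# Illustrative sketch: operational data and its vector embedding in one Atlas document.
# Connection string, names, and the index definition below are assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
products = client["retail"]["products"]

# A single document carries both operational fields and the embedding used by
# Atlas Vector Search, so no separate vector store has to be kept in sync.
products.insert_one({
    "sku": "JKT-0042",
    "name": "Waxed canvas jacket",
    "category": "outerwear",
    "in_stock": 118,
    "description_embedding": [0.013, -0.092, 0.041],  # e.g. a 1536-dim embedding, truncated here
})

# An Atlas Vector Search index over the embedding field (created in the Atlas UI
# or via the Atlas Admin API) might be defined like this:
# {
#   "fields": [
#     { "type": "vector", "path": "description_embedding",
#       "numDimensions": 1536, "similarity": "cosine" }
#   ]
# }
```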
Using the integration with Amazon Bedrock, customers can privately customize FMs—from AI21 Labs, Amazon, Anthropic, Cohere, Meta, and Stability AI—with their proprietary data, convert data into vector embeddings, and process these embeddings using MongoDB Atlas Vector Search. Leveraging Agents for Amazon Bedrock for retrieval augmented generation (RAG), customers can then build applications that respond to user queries with relevant, contextualized responses—without needing to manually code.
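The announcement positions Agents for Amazon Bedrock as managing this retrieval-augmented flow end to end; the hedged sketch below only illustrates the underlying RAG pattern that gets automated, using boto3 and pymongo directly. The model IDs (amazon.titan-embed-text-v1, anthropic.claude-v2), the index name, and the field names are assumptions chosen for the example.

```python
# Minimal RAG sketch of the pattern the Bedrock integration automates.
# Model IDs, index name, and field names are illustrative assumptions.
import json
import boto3
from pymongo import MongoClient

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
products = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["retail"]["products"]

def embed(text: str) -> list[float]:
    """Convert text into a vector embedding with a Bedrock embedding model (Titan assumed)."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(resp["body"].read())["embedding"]

def answer(question: str) -> str:
    # 1. Retrieve: semantic search over operational data with Atlas Vector Search.
    docs = products.aggregate([
        {"$vectorSearch": {
            "index": "description_vector_index",   # assumed index name
            "path": "description_embedding",
            "queryVector": embed(question),
            "numCandidates": 100,
            "limit": 3,
        }},
        {"$project": {"_id": 0, "name": 1, "category": 1, "in_stock": 1}},
    ])
    context = "\n".join(json.dumps(d) for d in docs)

    # 2. Generate: ground a Bedrock foundation model (Claude assumed) in the retrieved context.
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": f"\n\nHuman: Using only this inventory data:\n{context}\n\n{question}\n\nAssistant:",
            "max_tokens_to_sample": 300,
        }),
    )
    return json.loads(resp["body"].read())["completion"]

print(answer("Suggest in-stock alternatives to a waxed canvas jacket."))
```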
For example, a retail apparel organization can more easily develop a generative AI application to help employees automate tasks like processing inventory requests in real time or to help personalize customer returns and exchanges by suggesting similar styles of in-stock merchandise.
With fully managed capabilities, this new integration will enable joint AWS and MongoDB customers to securely apply generative AI to their proprietary data across the organization and realize business value more quickly, with less operational overhead.
The integration of MongoDB Atlas Vector Search with Amazon Bedrock will be available on AWS in the coming months.