Elasticsearch Open Inference API and Playground Now Support Amazon Bedrock

Developers now have more LLMs to choose from when iterating and building production-ready RAG applications

Elastic (NYSE: ESTC) announced support for Amazon Bedrock-hosted models in the Elasticsearch Open Inference API and Playground. Developers now have the flexibility to choose any large language model (LLM) available on Amazon Bedrock to build production-ready RAG applications.
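As a rough illustration of how the Open Inference API exposes Bedrock-hosted models, the sketch below registers a completion endpoint against an Elasticsearch cluster's REST API. The cluster address, credentials, endpoint name, the "amazonbedrock" service identifier, and the exact service_settings fields (access_key, secret_key, region, provider, model) are assumptions for illustration only; consult the Inference API documentation for the authoritative request shape for your Elasticsearch version.

```python
# Minimal sketch, assuming an Elasticsearch cluster reachable at ES_URL and
# an Inference API service named "amazonbedrock". Field names below are
# illustrative and may differ by version -- check the Inference API docs.
import requests

ES_URL = "http://localhost:9200"           # hypothetical cluster address
AUTH = ("elastic", "changeme")             # hypothetical credentials

endpoint_id = "bedrock-completion"         # arbitrary name for this endpoint
body = {
    "service": "amazonbedrock",            # Bedrock integration in the Inference API
    "service_settings": {
        "access_key": "<AWS_ACCESS_KEY>",  # AWS credentials with Bedrock access
        "secret_key": "<AWS_SECRET_KEY>",
        "region": "us-east-1",
        "provider": "amazontitan",         # model provider on Bedrock (assumed value)
        "model": "amazon.titan-text-premier-v1:0",  # hypothetical Bedrock model ID
    },
}

# PUT _inference/completion/<endpoint_id> creates the inference endpoint.
resp = requests.put(
    f"{ES_URL}/_inference/completion/{endpoint_id}",
    json=body,
    auth=AUTH,
)
resp.raise_for_status()
print(resp.json())
```

Once registered, the endpoint can be referenced by name from other Elasticsearch features, so the Bedrock credentials and model choice live in one place rather than in application code.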

“Our latest integration with Amazon Bedrock continues our focus on making it easier for AWS developers to build next-generation search experiences,” said Shay Banon, founder and chief technology officer at Elastic. “By leveraging Elasticsearch and Amazon Bedrock’s extensive model library, developers can deliver transformative conversational search.”

Developers using Elasticsearch and models hosted on Amazon Bedrock can now store and use embeddings, refine retrieval to ground answers with proprietary data, and more; a sketch of that workflow follows below. Amazon Bedrock models are also available in the low-code Playground experience, giving developers more choice when A/B testing LLMs.
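The sketch below shows one way the embedding workflow could look: register a Bedrock-hosted embedding model as a text_embedding endpoint, map a semantic_text field to it so embeddings are generated and stored at index time, and run a semantic query to retrieve grounding passages for a RAG prompt. The model identifier, index name, field names, and cluster details are assumptions for illustration, and the semantic_text field type and semantic query depend on the Elasticsearch version in use.

```python
# Minimal sketch, assuming a Bedrock embedding model, the semantic_text field
# type, and the semantic query. All names below are hypothetical.
import requests

ES_URL = "http://localhost:9200"           # hypothetical cluster address
AUTH = ("elastic", "changeme")             # hypothetical credentials

# 1. Register a Bedrock embedding model as a text_embedding endpoint.
requests.put(
    f"{ES_URL}/_inference/text_embedding/bedrock-embeddings",
    json={
        "service": "amazonbedrock",
        "service_settings": {
            "access_key": "<AWS_ACCESS_KEY>",
            "secret_key": "<AWS_SECRET_KEY>",
            "region": "us-east-1",
            "provider": "amazontitan",
            "model": "amazon.titan-embed-text-v2:0",  # hypothetical embedding model
        },
    },
    auth=AUTH,
).raise_for_status()

# 2. Map a semantic_text field to that endpoint so documents are embedded
#    automatically when they are indexed.
requests.put(
    f"{ES_URL}/docs",
    json={
        "mappings": {
            "properties": {
                "content": {
                    "type": "semantic_text",
                    "inference_id": "bedrock-embeddings",
                }
            }
        }
    },
    auth=AUTH,
).raise_for_status()

# 3. Retrieve grounding passages with a semantic query over the same field.
hits = requests.post(
    f"{ES_URL}/docs/_search",
    json={"query": {"semantic": {"field": "content", "query": "How do I rotate API keys?"}}},
    auth=AUTH,
).json()
print(hits)
```

The retrieved passages can then be passed, together with the user's question, to a Bedrock-hosted LLM to ground its answer in the proprietary data stored in Elasticsearch.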

Support for Amazon Bedrock is available today; read the Inference API and Playground blogs to get started.

About Elastic

Elastic (NYSE: ESTC), the Search AI Company, enables everyone to find the answers they need in real time using all their data, at scale. Elastic's solutions for search, observability and security are built on the Elastic Search AI Platform, the development platform used by thousands of companies, including more than 50% of the Fortune 500. Learn more at elastic.co.

Elastic and associated marks are trademarks or registered trademarks of Elastic N.V. and its subsidiaries. All other company and product names may be trademarks of their respective owners.
