Vector Search Overview
“GSI’s OpenSearch plugin made building a multilingual and multimodal search application super easy. If you’re already using OpenSearch, the plugin allows you to implement neural search without having to move to a completely new vector search database. This is a big boost to quickly launching Neural Search in production at scale. Also, it allowed us to achieve low latency similar to keyword search but in our vector search application, which is quite cool.”
Dmitry Kan, Host of Vector Podcast and CEO & Co-founder, Muves Ltd
Neural Search is quickly surpassing traditional keyword search in many NLP applications because it can better understand user intent, and that means a better user search experience. But how do you add neural search to your search stack without having to invest the time and resources needed to learn a new software platform?
We Have The Solution
GSI’s OpenSearch neural search plugin allows you to extend your current OpenSearch software investment with a production-grade neural search solution that searches billions of items in milliseconds with high recall. That means you don’t have to learn a new software platform to add neural search to your existing OpenSearch workflow, saving you valuable time and resources. So now you can focus on what you do best: creating great applications.
Easy Integration
You don’t have to learn new APIs: just connect the plugin to your OpenSearch installation to add neural search and a vector database to your application. This saves valuable time and resources and enables rapid deployment and use.
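As an illustration of how little application code changes, the sketch below indexes and queries vectors through the standard OpenSearch k-NN conventions using the opensearch-py client. The index name, field names, vector dimension, and toy vectors are hypothetical placeholders, and any GSI plugin-specific settings are omitted; the point is that the familiar OpenSearch query workflow stays the same.

```python
# Minimal sketch using opensearch-py and the standard knn_vector field type.
# Index/field names and the 4-dimensional toy vectors are hypothetical; a real
# deployment would use embeddings from a model and whatever plugin-specific
# settings GSI documents for its backend.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

# Create an index with a knn_vector field so documents can carry embeddings.
client.indices.create(
    index="products",
    body={
        "settings": {"index.knn": True},
        "mappings": {
            "properties": {
                "title": {"type": "text"},
                "title_embedding": {"type": "knn_vector", "dimension": 4},
            }
        },
    },
)

# Index a document along with its embedding.
client.index(
    index="products",
    id="1",
    body={"title": "red running shoes", "title_embedding": [0.1, 0.3, 0.2, 0.9]},
    refresh=True,
)

# Run an approximate nearest-neighbor query against the embedding field.
response = client.search(
    index="products",
    body={
        "size": 5,
        "query": {
            "knn": {"title_embedding": {"vector": [0.1, 0.2, 0.3, 0.8], "k": 5}}
        },
    },
)
print(response["hits"]["hits"])
```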
Flexible
GSI’s neural search is offered in both on-premises and SaaS deployments. We offer an on-premises solution if physical control of your vector database is critical: just add our plugin and neural search hardware to your OpenSearch installation, turning your server into a vector database. If you do not want to manage the infrastructure, our SaaS offering makes it easy to add neural search to your existing OpenSearch cloud deployment with a simple plugin. The SaaS solution offers flexible, usage-based hourly pricing based on the amount of APU (Associative Processing Unit) resources needed to support your application.
Built Specifically for High-Performance, Scalable Search
Unlike other solutions that rely on general-purpose CPUs, GSI’s neural search solutions are based on in-memory computing technology built specifically for vector search at scale. The result: a cost-effective solution that searches billions of items in milliseconds, leading to a better user experience and a lower cost per query.
Vector Search Products
Product | Availability
Dual Leda-E 2U Server | Contact for Purchase
8 Leda-E 2U Server | Contact for Purchase
16 Leda-S 1U Server | Contact for Other Configurations
Leda-E PCIe Card | Contact for Purchase
Hosted Service | Beta Trial on Searchium.ai
FVS Plug-In | Contact for Details
For additional products, such as smaller, lower-power variants, or for IP, please contact GSI for details.
Vector Search Resources
How Developers Are Using GSI’s Neural Search Solution