Cohere Enables Enterprises to Connect Third-Party Datastores to Command

Cohere’s enterprise AI platform took a major step forward today with the launch of build-your-own connectors, allowing companies to securely and quickly connect their datastores to Cohere’s large language model (LLM), Command. This enables businesses to build AI assistants that can answer highly specific questions about their own business and sector based on their own data, even if it sits on third-party applications like Slack or Google Drive.

The open-beta release helps companies ask far more nuanced questions, and receive better-sourced and more verifiable answers, than off-the-shelf models can support. With connectors, companies can leverage information from hundreds of tools that they use in their daily work. It’s the next step in Cohere’s philosophy of meeting companies where their data lives, regardless of which cloud or third-party application they use.

Traditional LLMs struggle with questions that require information outside of their training dataset, like “What are the key takeaways from the annual report?” or “What’s the latest on our supply-chain optimization project?” In contrast, by leveraging your own data through custom connectors, Command can provide contextually relevant and accurate responses. 

Cohere’s upgraded AI platform is able to provide this by accessing information spread out across disparate proprietary datasets. Connectors allow companies to plug this data into the models, securely, within their own cloud, and use retrieval-augmented generation (RAG) to see exactly where the answers come from, improving the accuracy of responses and minimizing hallucinations.
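
The overall flow described above can be sketched in a few lines: retrieve the most relevant proprietary documents first, then ground the model's answer in them. The document store and keyword scoring below are toy stand-ins for illustration, not Cohere's actual retrieval implementation.

```python
# Illustrative RAG flow: retrieve relevant documents, then build a prompt
# that grounds the model's answer in those sources. The scoring here is a
# naive keyword overlap, a stand-in for a real search backend.

def retrieve(query: str, documents: list[dict], top_k: int = 2) -> list[dict]:
    """Rank documents by keyword overlap with the query (toy retriever)."""
    terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(terms & set(d["text"].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_grounded_prompt(query: str, docs: list[dict]) -> str:
    """Assemble a prompt that asks the model to answer only from the sources."""
    sources = "\n".join(f"[{d['id']}] {d['text']}" for d in docs)
    return f"Answer using only these sources:\n{sources}\n\nQuestion: {query}"

docs = [
    {"id": "doc_0", "text": "Q3 revenue grew 12 percent year over year."},
    {"id": "doc_1", "text": "The office cafeteria menu changes weekly."},
]
query = "What was Q3 revenue growth?"
prompt = build_grounded_prompt(query, retrieve(query, docs))
```

Because the generation step only sees the retrieved snippets, the answer can cite exactly which document each claim came from, which is what enables the verifiable citations described below.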

Seamless RAG on Enterprise Data

To understand the significance of connecting data from third-party datastores to Command, and then using RAG to retrieve answers, it is important to understand how RAG works. Alongside AI-generated answers to queries, RAG enables Cohere to display citations, so that you know the source of key pieces of information. Users can click on those source links and read the original document for themselves for verification or additional context.
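
As a rough sketch of how such citations can be rendered, the snippet below attaches source markers to the cited spans of a generated answer. The citation shape (character offsets plus source document ids) mirrors the kind of structure a RAG response carries, but the field names and helper here are illustrative assumptions, not a definitive client.

```python
# Sketch: render inline source markers from RAG-style citations.
# Each citation records the character span of a claim and the ids of the
# documents that support it (illustrative shape, not an official schema).

def annotate(answer: str, citations: list[dict]) -> str:
    """Append [source_id] markers after each cited span, working right to
    left so earlier character offsets remain valid as text is inserted."""
    out = answer
    for c in sorted(citations, key=lambda c: c["start"], reverse=True):
        marker = "".join(f"[{doc_id}]" for doc_id in c["document_ids"])
        out = out[: c["end"]] + marker + out[c["end"]:]
    return out

answer = "Revenue grew 12 percent."
citations = [{"start": 13, "end": 23, "document_ids": ["annual-report"]}]
annotated = annotate(answer, citations)  # "Revenue grew 12 percent[annual-report]."
```

In a real interface each marker would link back to the source document so the user can verify the claim in context.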

RAG lifts the veil on the “black box” of so many LLMs, where answers appear to be generated out of thin air, and in the case of hallucinations, may in fact be. This added verification gives users the confidence that the information they are receiving is grounded, and allows them to learn more on their own.

The seamless integration of RAG in enterprise use cases – and now leveraging data located in third-party datastores via connectors – is a dramatic improvement in the usefulness of conversational AI solutions for businesses. After all, quick generation of answers is useless without the confidence that the answers are grounded in fact.

The upgraded Command framework allows enterprises to develop a connector to any datastore that offers an accompanying search API. In other words, companies can now ground their answers in virtually any proprietary information that they have, regardless of what third-party app they are using for corporate collaboration.
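
In essence, a connector is a small service that accepts a search query and returns matching documents as JSON. The sketch below shows that contract as a plain handler function; the request and response field names (a "query" in, a "results" list out) follow the pattern Cohere's quickstart connectors use, but treat the exact schema here as an assumption and consult the official documentation for the authoritative shape.

```python
# Minimal sketch of a custom connector's search contract: parse the query,
# search the backing datastore, and return results in a JSON envelope.
# In production this handler would sit behind an authenticated HTTP endpoint;
# the in-memory datastore below is a placeholder.
import json

FAKE_DATASTORE = [
    {
        "title": "Supply-chain update",
        "text": "Lead times improved in Q3.",
        "url": "https://example.internal/doc/1",
    },
    {
        "title": "Cafeteria menu",
        "text": "Tacos on Tuesdays.",
        "url": "https://example.internal/doc/2",
    },
]

def search_handler(request_body: str) -> str:
    """Handle a connector search request and return matching documents."""
    query = json.loads(request_body)["query"].lower()
    hits = [
        d for d in FAKE_DATASTORE
        if query in d["text"].lower() or query in d["title"].lower()
    ]
    return json.dumps({"results": hits})

response = search_handler(json.dumps({"query": "supply-chain"}))
```

Because the model only needs this one search interface, the same pattern applies whether the data lives in a wiki, a ticketing system, or a vector database.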

Additionally, for some of the most popular enterprise datastores, including GitHub, Asana, Slack, Dropbox, Google Drive, Pinecone, and a range of others, Cohere is releasing around 100 quickstart connectors to help in-house developers hit the ground running. Cohere can also provide full support for enterprises looking to set up connectors across their various third-party datastores.

Focus on Secure Deployment

Privacy and security are among the biggest concerns for companies looking to deploy AI solutions. Reports of data leakage, where other customers or consumers gain access to proprietary corporate information, have rightly been a primary area of focus for corporate security officers.

Simply put, connectors are only useful insofar as data remains private and secure. 

Cohere has consistently prioritized secure deployment as the only proprietary LLM available on all four major cloud platforms (AWS, Azure, GCP, and OCI), as well as offering private deployment.

We have been just as diligent when it comes to connectors, which can be configured with user-level permissions. This allows individuals to inherit access levels from connected third-party datastores. 
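
One way to picture permission inheritance: the connector carries the requesting user's identity through to the datastore and only surfaces documents that user could already see there. The access-control fields and helper below are illustrative assumptions, not Cohere's internal mechanism.

```python
# Sketch of user-level permission inheritance: filter search results down to
# documents whose access-control list overlaps the requesting user's groups,
# mirroring the permissions already configured in the source datastore.
# Field names ("allowed_groups") are illustrative assumptions.

def filter_by_access(results: list[dict], user_groups: set[str]) -> list[dict]:
    """Keep only documents the user is entitled to see in the source system."""
    return [d for d in results if user_groups & set(d["allowed_groups"])]

docs = [
    {"title": "Board minutes", "allowed_groups": ["executives"]},
    {"title": "Eng handbook", "allowed_groups": ["engineering", "executives"]},
]
visible = filter_by_access(docs, {"engineering"})  # only the handbook
```

The key property is that the connector never grants access the datastore itself would deny, so existing governance policies carry over unchanged.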

Start Building Your Own Connectors

If you are a developer who wants to start building, please refer to our technical documentation. We have also open-sourced code for around 100 popular enterprise datastores (available within this GitHub repository) to help you get started. We’re excited to hear your feedback.

Cohere works with the world’s largest enterprises to deploy our state-of-the-art AI offerings at scale. If your company is working on a project where this technology might be relevant, please reach out to us to discuss your needs.