Three New Modules on LLM University: Semantic Search, Prompt Engineering, and Building With the Cohere Platform

Since its launch a few months ago, Cohere’s LLM University (LLMU) has become a trusted learning resource, offering a comprehensive NLP curriculum that covers both the essentials of large language models (LLMs) and advanced topics such as generative AI and model deployment. The program serves developers, machine learning engineers, researchers, and enterprise technical leaders, and for some, LLMU has become the go-to destination for mastering large language models and building AI-powered applications for their businesses.

The world of AI moves at blazing speed, constantly pushing the boundaries of knowledge and innovation. Cohere is at the heart of this whirlwind, and the LLMU team is always eager to learn and create new content. To help you stay at the forefront of these exciting developments, my colleagues Jay Alammar and Meor Amer and I are thrilled to introduce three new LLMU modules: Semantic Search, Prompt Engineering, and The Cohere Platform. Here’s a short synopsis of what you can expect from each module.

Semantic Search: Find Information by Meaning

Search systems have been crucial in computing since before the birth of the internet, enabling us to efficiently navigate and retrieve information within vast datasets. In this module, you'll learn how to enhance the performance of search systems with LLMs, and also how to use search to improve the results of a generative model, reducing hallucinations and producing more accurate responses. You'll first learn about basic search systems, such as keyword search. Then you'll move on to more powerful systems that retrieve information based on semantic meaning, such as dense retrieval and rerank. Finally, you'll put what you learned into practice in our coding labs, where you'll search for answers to queries over a large dataset of Wikipedia articles.
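
To give you a taste of what the coding labs cover, here is a minimal dense-retrieval sketch using the Cohere Python SDK: documents and a query are embedded, then ranked by cosine similarity. The API key, model name, and toy documents are placeholders, and exact model names and response fields may differ across SDK versions and from what the labs use.

```python
# Minimal dense-retrieval sketch: embed documents and a query, rank by cosine similarity.
# API key, model name, and documents below are illustrative placeholders.
import numpy as np
import cohere

co = cohere.Client("YOUR_API_KEY")  # assumption: supply your own API key

documents = [
    "The Eiffel Tower is located in Paris, France.",
    "Python is a popular programming language for data science.",
    "The Great Wall of China stretches thousands of kilometers.",
]

# Embed the documents once, and the query at search time.
doc_embeddings = np.array(co.embed(texts=documents, model="embed-english-v2.0").embeddings)
query = "Where is the Eiffel Tower?"
query_embedding = np.array(co.embed(texts=[query], model="embed-english-v2.0").embeddings[0])

# Rank documents by cosine similarity to the query and print best matches first.
scores = doc_embeddings @ query_embedding / (
    np.linalg.norm(doc_embeddings, axis=1) * np.linalg.norm(query_embedding)
)
for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {documents[idx]}")
```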

Prompt Engineering: Get the Best Out of Command

Generative LLMs are powerful and are making a deep impact in many areas. But to get the best performance out of them, you need to give them effective prompts. In this module, you'll learn best-practice techniques for constructing prompts that elicit the intended output from a generative model. You'll then learn how to apply these techniques to various use cases, as well as how to chain multiple prompts together to unlock even more opportunities for building innovative applications. Finally, you'll learn how to validate and evaluate the outputs generated by an LLM.
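
As a simple illustration of prompt chaining, the hedged sketch below feeds the output of one prompt into a second prompt using the Cohere Python SDK's generation endpoint. The model name, prompts, and sample email are illustrative, not the module's exact exercises.

```python
# Prompt chaining sketch: the output of a first prompt becomes input to a second prompt.
# Model name, prompts, and sample text are illustrative placeholders.
import cohere

co = cohere.Client("YOUR_API_KEY")  # assumption: supply your own API key

def generate(prompt: str) -> str:
    """Call the generation endpoint and return the text of the first generation."""
    response = co.generate(model="command", prompt=prompt, max_tokens=200, temperature=0.3)
    return response.generations[0].text.strip()

# Step 1: extract the key points from a piece of text.
email = ("Hi team, the launch moved to Friday, the venue changed to Room 2B, "
         "and we still need a volunteer to run the demo.")
key_points = generate(f"List the key points in this email as short bullet points:\n\n{email}")

# Step 2: chain the first output into a second prompt to draft a reply.
reply = generate(f"Write a brief, friendly reply acknowledging these points:\n\n{key_points}")
print(reply)
```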

The Cohere Platform: Learn How to Build and Deploy Models

Cohere provides a fully managed LLM platform that lets teams leverage the technology without having to build infrastructure or manage deployments. In this module, you'll get a solid overview of the Cohere platform, including its serving framework, the types of foundation models it serves, the available endpoints, and the range of applications that can be built on top of it. This module will get you ready to build and deploy LLM-powered applications in an enterprise setting.
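
To show the kind of application you can build by combining the platform's endpoints, here is a hedged sketch of a tiny retrieval-augmented question answerer: embeddings pick the most relevant snippet, and the generation endpoint answers grounded in it. The API key, model names, and snippets are placeholders, and exact SDK signatures may vary by version.

```python
# Sketch of a small app on top of the platform: retrieve a snippet with embeddings,
# then answer a question grounded in that snippet with the generation endpoint.
import numpy as np
import cohere

co = cohere.Client("YOUR_API_KEY")  # assumption: supply your own API key

snippets = [
    "The Rerank endpoint reorders candidate documents by relevance to a query.",
    "The Embed endpoint turns text into vectors for search, clustering, and classification.",
]

def answer(question: str) -> str:
    # Retrieve: embed snippets and question together, pick the closest snippet.
    vectors = np.array(co.embed(texts=snippets + [question], model="embed-english-v2.0").embeddings)
    vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)
    docs, query_vec = vectors[:-1], vectors[-1]
    best = snippets[int(np.argmax(docs @ query_vec))]
    # Generate: answer the question using only the retrieved snippet as context.
    prompt = (f"Context: {best}\n\nAnswer the question using only the context.\n"
              f"Question: {question}\nAnswer:")
    return co.generate(model="command", prompt=prompt, max_tokens=100).generations[0].text.strip()

print(answer("What does the Embed endpoint do?"))
```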

Ready to Dive In?

Don’t hesitate — start exploring the new LLMU modules today. And if you haven’t yet done so, be sure to check out the original curriculum and embark on an exciting learning journey with us!

Join Our LLMU Community

If you’d like to go through the course material with other enthusiasts, join our Discord LLMU Community! For specific questions regarding LLMU, please visit our #llmu-announcements channel on Discord, where you can connect with fellow learners, share ideas, and receive support.
