Introducing Mistral Small: Empowering Developers with Efficient LLMs on Azure AI Models as a Service
Microsoft’s collaboration with Mistral AI continues to accelerate AI innovation. After the successful launch of Mistral Large, Mistral AI’s flagship model, we’re thrilled to unveil Mistral Small – a compact yet powerful language model designed for efficiency.
Available in the Azure AI model catalog, Mistral Small joins our growing collection of LLMs. Developers can access it through Models as a Service (MaaS), enabling seamless API-based interactions.
Mistral Small
According to Mistral AI, Mistral Small is its smallest proprietary Large Language Model (LLM). It is suited to any language-based task that requires high efficiency and low latency.
Mistral Small is:
A small model optimized for low latency: Very efficient for high-volume, low-latency workloads. Mistral Small is Mistral AI's smallest proprietary model; it outperforms Mixtral 8x7B while offering lower latency.
Specialized in RAG: Crucial information is not lost in the middle of long context windows. Supports up to 32K tokens.
Strong in coding: Code generation, review and comments with support for all mainstream coding languages.
Multi-lingual by design: Best-in-class performance in French, German, Spanish, and Italian – in addition to English. Dozens of other languages are supported.
Built for safety: Efficient guardrails are baked into the model, with an additional safety layer available through the safe prompt option.
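On Mistral AI's platform API, the extra safety layer is enabled per request with the `safe_prompt` flag, so a chat request body might look like the sketch below. The flag name follows Mistral AI's own API reference; verify against the Azure documentation that it applies to your deployment.

```json
{
  "messages": [
    {"role": "user", "content": "Summarize this document."}
  ],
  "safe_prompt": true,
  "max_tokens": 256
}
```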
Get started with Mistral Small on Azure AI
Provision an API Endpoint: Create your Mistral Small API endpoint in seconds.
Experiment: Try it out in the Azure AI Studio playground or integrate it with popular LLM app development tools.
Build Safely: Leverage dual-layer safety mechanisms to create reliable and secure Generative AI applications.
Here are the prerequisites:
If you don’t have an Azure subscription, get one here: https://azure.microsoft.com/en-us/pricing/purchase-options/pay-as-you-go
Create an Azure AI Studio hub and project. Make sure you pick East US 2 or Sweden Central as the Azure region for the hub.
Next, you need to create a deployment to obtain the inference API and key:
Open the Mistral Small model card in the model catalog: https://aka.ms/aistudio/landing/mistral-small
Click on Deploy and select the Pay-as-you-go option.
Subscribe to the Marketplace offer and deploy. You can also review the API pricing at this step.
In less than a minute, you should land on the deployment page, which shows you the API endpoint and key. You can try out your prompts in the playground.
The prerequisites and deployment steps are explained in the product documentation: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral.
You can use the API and key with various clients. Review the API schema if you are looking to integrate the REST API with your own client: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral#reference-for-mistral-large-deployed-as-a-service. Let’s review samples for some popular clients.
Basic CLI with curl and Python web request sample: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/webrequests.ipynb
Mistral clients: Azure APIs for Mistral Small are compatible with the API schema offered on Mistral AI's platform, which allows you to use any of the Mistral AI platform clients with the Azure APIs. Sample notebook for the Mistral Python client: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/mistralai.ipynb
LangChain: API compatibility also enables you to use Mistral AI's Python and JavaScript LangChain integrations. Sample LangChain notebook: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/langchain.ipynb
LiteLLM: LiteLLM is easy to get started with and offers a consistent input/output format across many LLMs. Sample LiteLLM notebook: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/litellm.ipynb
Prompt flow: Prompt flow offers a web experience in Azure AI Studio and a VS Code extension to build LLM apps, with support for authoring, orchestration, evaluation, and deployment. Learn more: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/prompt-flow. Out-of-the-box support for Mistral AI APIs on Azure is coming soon, but you can create a custom connection using the API and key, and call the SDK of your choice from the Python tool in prompt flow.
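As a minimal illustration of the basic web-request pattern covered by the curl/Python sample above, the sketch below builds a chat-completions request against a deployed endpoint using only the Python standard library. The endpoint URL shape and payload fields are assumptions based on the deployment page and the linked API schema; replace the placeholder URL and key with the values shown for your own deployment.

```python
import json
import urllib.request

# Hypothetical placeholders: substitute the endpoint URL and key from your
# deployment page in Azure AI Studio.
ENDPOINT_URL = "https://<your-deployment>.<region>.inference.ai.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

def build_chat_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Build a chat-completions POST request for the deployed endpoint."""
    body = json.dumps({
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

# To actually send the request (requires a live endpoint and a valid key):
# with urllib.request.urlopen(build_chat_request("Say hello in French.")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same request can be reproduced with curl by passing the JSON body via `-d` and the key via an `Authorization: Bearer` header, as in the linked notebook.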
Explore the power of Mistral Small – where efficiency meets innovation!