Mistral Large, Mistral AI’s flagship LLM, debuts on Azure AI Models-as-a-Service
Microsoft is partnering with Mistral AI to bring its Large Language Models (LLMs) to Azure. Mistral AI’s OSS models, Mixtral-8x7B and Mistral-7B, were added to the Azure AI model catalog last December. Today, we are excited to announce the addition of Mistral AI’s new flagship model, Mistral Large, to the Mistral AI collection of models in the Azure AI model catalog. Mistral Large is available through Models-as-a-Service (MaaS), which offers API-based access and token-based billing for LLMs, making it easier to build generative AI apps. Developers can provision an API endpoint in a matter of seconds and try out the model in the Azure AI Studio playground, or use it with popular LLM app development tools such as Azure AI prompt flow and LangChain. The APIs support two layers of safety: first, the model has built-in support for a “safe prompt” parameter; second, Azure AI content safety filters are enabled to screen for harmful content generated by the model, helping developers build safe and trustworthy applications.
The Mistral Large model
Mistral Large is Mistral AI’s most advanced Large Language Model (LLM), available first on Azure and on Mistral AI’s own platform. It can be used for a full range of language-based tasks thanks to its state-of-the-art reasoning and knowledge capabilities. Key attributes:
Specialized in RAG: Crucial information is not lost in the middle of long context windows. Supports a context window of up to 32K tokens.
Strong in coding: Code generation, review, and commenting, with support for all mainstream coding languages.
Multi-lingual by design: Best-in-class performance in French, German, Spanish, and Italian – in addition to English. Dozens of other languages are supported.
Responsible AI: Efficient guardrails baked into the model, with an additional safety layer available through the safe prompt option.
Benchmarks
You can read more about the model and review evaluation results on Mistral AI’s blog: https://mistral.ai/news/mistral-large. The Benchmarks hub in Azure offers a standardized set of evaluation metrics for popular models including Mistral’s OSS models and Mistral Large.
Using Mistral Large on Azure AI
Let’s take care of the prerequisites first:
If you don’t have an Azure subscription, get one here: https://azure.microsoft.com/en-us/pricing/purchase-options/pay-as-you-go
Create an Azure AI Studio hub and project. Make sure you pick East US 2 or France Central as the Azure region for the hub.
Next, you need to create a deployment to obtain the inference API and key:
Open the Mistral Large model card in the model catalog: https://aka.ms/aistudio/landing/mistral-large
Click on Deploy and pick the Pay-as-you-go option.
Subscribe to the Marketplace offer and deploy. You can also review the API pricing at this step.
Within a minute, you should land on the deployment page, which shows the API endpoint and key. You can try out your prompts in the playground.
The prerequisites and deployment steps are explained in the product documentation: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral.
You can use the API and key with various clients. Review the API schema if you are looking to integrate the REST API with your own client: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral#reference-for-mistral-large-deployed-as-a-service. Let’s review samples for some popular clients.
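If you are integrating the REST API with your own client, the call is a plain HTTPS POST. Below is a minimal sketch using only the Python standard library; the endpoint URL and key are placeholders to be copied from your deployment page, and the payload shape follows the chat completions schema in the API reference linked above.

```python
import json
import urllib.request

# Placeholders - copy the real values from your deployment page.
ENDPOINT = "https://<your-deployment>.<region>.inference.ai.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

def build_payload(prompt, max_tokens=128, temperature=0.7):
    # Payload shape follows the chat completions schema linked above.
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def chat(prompt):
    # POST the prompt to the deployment and return the first completion.
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["choices"][0]["message"]["content"]

# Example usage: print(chat("Write a haiku about the sea."))
```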
Basic CLI with curl and Python web request sample: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/webrequests.ipynb
Mistral clients: Azure APIs for Mistral Large are compatible with the API schema offered on Mistral AI’s platform, which allows you to use any of the Mistral AI platform clients with the Azure APIs. Sample notebook for the Mistral Python client: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/mistralai.ipynb
LangChain: API compatibility also enables you to use the Mistral AI’s Python and JavaScript LangChain integrations. Sample LangChain notebook: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/langchain.ipynb
LiteLLM: LiteLLM is easy to get started with and offers a consistent input/output format across many LLMs. Sample LiteLLM notebook: https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/litellm.ipynb
Prompt flow: Prompt flow offers a web experience in Azure AI Studio and a VS Code extension for building LLM apps, with support for authoring, orchestration, evaluation, and deployment. Learn more: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/prompt-flow. Out-of-the-box support for Mistral AI APIs on Azure is coming soon, but you can create a custom connection using the API and key, and use the SDK of your choice with the Python tool in prompt flow.
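Because the Azure APIs are schema-compatible with Mistral AI’s platform, pointing an existing Mistral client at your Azure endpoint is mostly a configuration change. The sketch below is illustrative only: it assumes the mistralai Python package and its v0.x client names, and the endpoint, key, and "azureai" model value are placeholders you should verify against the sample notebook linked above.

```python
def ask_mistral_on_azure(prompt, endpoint, api_key):
    # Deferred imports so the sketch can be read before `pip install mistralai`;
    # class names assume the v0.x mistralai client.
    from mistralai.client import MistralClient
    from mistralai.models.chat_completion import ChatMessage

    client = MistralClient(endpoint=endpoint, api_key=api_key)
    response = client.chat(
        model="azureai",  # placeholder: the Azure deployment determines the model
        messages=[ChatMessage(role="user", content=prompt)],
    )
    return response.choices[0].message.content
```

The same endpoint and key also plug into the LangChain and LiteLLM integrations listed above.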
Develop with integrated content safety
Mistral AI APIs on Azure come with a two-layered safety approach: instructing the model through the system prompt, and an additional content filtering system that screens prompts and completions for harmful content. Using the safe_prompt parameter prefixes the system prompt with a guardrail instruction as documented here. Additionally, the Azure AI content safety system, which consists of an ensemble of classification models, screens for specific types of harmful content. This external system is designed to be effective against adversarial prompt attacks, such as prompts that ask the model to ignore previous instructions. When the content filtering system detects harmful content, you will receive either an error (if the prompt was classified as harmful) or a partially or completely truncated response with an appropriate message (if the generated output was classified as harmful). Make sure you account for these scenarios, where the content returned by the APIs is filtered, when building your applications.
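Both failure modes are worth handling explicitly in application code. The following standard-library sketch shows one way to do it; the 400 status for filtered prompts and the content_filter finish reason are assumptions based on the content safety behavior described above, so check them against the product documentation.

```python
import json
import urllib.error
import urllib.request

def classify_choice(body):
    # Inspect a chat completions response for content-safety truncation.
    choice = body["choices"][0]
    if choice.get("finish_reason") == "content_filter":
        # The generated output was flagged; the text may be truncated.
        return "completion filtered", choice["message"].get("content", "")
    return "ok", choice["message"]["content"]

def safe_chat(endpoint, api_key, prompt):
    payload = {"messages": [{"role": "user", "content": prompt}]}
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return classify_choice(json.loads(resp.read()))
    except urllib.error.HTTPError as err:
        if err.code == 400:
            # The prompt itself was classified as harmful.
            return "prompt filtered", ""
        raise
```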
FAQs
What does it cost to use Mistral Large on Azure?
You are billed based on the number of prompt and completion tokens. You can review the pricing on the Mistral Large offer in the Marketplace offer details tab when deploying the model. You can also find the pricing on the Azure Marketplace: https://azuremarketplace.microsoft.com/en-us/marketplace/apps/000-000.mistral-ai-large-offer
Do I need GPU capacity in my Azure subscription to use Mistral Large?
No. Unlike the Mistral AI OSS models that deploy to VMs with GPUs using Online Endpoints, the Mistral Large model is offered as an API. Mistral Large is a premium model whose weights are not available, so you cannot deploy it to a VM yourself.
This blog talks about the Mistral Large experience in Azure AI Studio. Is Mistral Large available in Azure Machine Learning Studio?
Yes, Mistral Large is available in the Model Catalog in both Azure AI Studio and Azure Machine Learning Studio.
Does Mistral Large on Azure support function calling and JSON output?
The Mistral Large model can do function calling and generate JSON output, but support for those features on the Azure platform will roll out soon.
Mistral Large is listed on the Azure Marketplace. Can I purchase and use Mistral Large directly from Azure Marketplace?
Azure Marketplace enables the purchase and billing of Mistral Large, but the purchase experience can only be accessed through the model catalog. Attempting to purchase Mistral Large from the Marketplace will redirect you to Azure AI Studio.
Given that Mistral Large is billed through the Azure Marketplace, does it retire my Azure consumption commitment (aka MACC)?
Yes, Mistral Large is an “Azure benefit eligible” Marketplace offer, which indicates MACC eligibility. Learn more about MACC here: https://learn.microsoft.com/en-us/marketplace/azure-consumption-commitment-benefit
Is my inference data shared with Mistral AI?
No, Microsoft does not share the content of any inference request or response data with Mistral AI.
Are there rate limits for the Mistral Large API on Azure?
The Mistral Large API comes with limits of 200k tokens per minute and 1k requests per minute. Reach out to Azure customer support if these limits don’t suffice.
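If your workload approaches these limits, a client-side backoff loop is a common mitigation. This is a generic sketch, not Azure-specific: it assumes your client wrapper raises a RateLimitError when the API signals rate limiting (typically HTTP 429).

```python
import random
import time

class RateLimitError(Exception):
    """Raised by your client wrapper when the API signals rate limiting."""

def with_retries(call, max_attempts=5, base_delay=1.0):
    # Retry `call` with exponential backoff plus jitter: ~1s, ~2s, ~4s, ...
    for attempt in range(max_attempts):
        try:
            return call()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

For example, `with_retries(lambda: chat(prompt))` wraps a single API call.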
Are Mistral Large Azure APIs region specific?
Mistral Large API endpoints can be created in Azure AI Studio projects or Azure Machine Learning workspaces in the East US 2 or France Central Azure regions. If you want to use Mistral Large in prompt flow in projects or workspaces in other regions, you can manually configure the API and key as a connection in prompt flow. Essentially, once you create the API in East US 2 or France Central, you can use it from any Azure region.
Can I fine-tune Mistral Large?
Not yet, stay tuned…
Supercharge your AI apps with Mistral Large today. Head over to the AI Studio model catalog to get started.