Microsoft Unveils Phi-3 Small Language Model Family

These models offer capabilities comparable to LLMs but are smaller and trained on less data.

Published on Apr. 24, 2024

Microsoft has developed a new class of small language models, or SLMs, called Phi-3, which offer capabilities similar to those of large language models but are smaller and trained on less data. The first model in the family, Phi-3-mini, has 3.8 billion parameters and outperforms models twice its size on benchmarks that evaluate language, coding and math capabilities.

LLMs typically require significant computing resources to run. While SLMs are not designed for the kind of in-depth knowledge retrieval that LLMs handle, they are well suited to simpler tasks and can work offline, which makes them useful for organizations building applications that run locally on a device rather than in the cloud.

Microsoft is making Phi-3-mini available in the Microsoft Azure AI Model Catalog. It will also be available on Hugging Face and Ollama, and as an NVIDIA NIM microservice. Additional models in the Phi-3 family, such as Phi-3-small and Phi-3-medium, will be released soon, according to a company blog post.
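
For developers who want to experiment once the model is live on Hugging Face, a minimal sketch of loading Phi-3-mini with the Transformers library might look like the following. The model ID "microsoft/Phi-3-mini-4k-instruct", the prompt and the generation settings are illustrative assumptions, not details from Microsoft's announcement.

```python
# Minimal sketch: run Phi-3-mini locally via Hugging Face Transformers.
# The model ID and generation settings below are assumptions, not details
# from Microsoft's announcement.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Build a chat-style prompt and generate a short completion on the local device.
messages = [{"role": "user", "content": "Explain what a small language model is in one sentence."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
output_ids = model.generate(input_ids, max_new_tokens=64)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

On a machine with Ollama installed, a comparable local test would presumably be a single command along the lines of `ollama run phi3`, though the exact model tag used here is an assumption.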

This article was written by Writer, a generative AI tool, using information from press releases and company blogs provided by our staff. All content was reviewed by a Built In editor and went through a fact-checking process to ensure accuracy. Errors can be reported to our team at [email protected].
