DeepSeek, founded in 2023, is dedicated to researching the world's leading foundation models and technologies for artificial general intelligence and to tackling the frontier problems of AI. Building on a self-developed training framework, self-built intelligent-computing clusters, and tens of thousands of accelerator cards, the DeepSeek team released and open-sourced multiple large models with hundreds of billions of parameters in just half a year, including the DeepSeek-LLM general-purpose large language model and the DeepSeek-Coder code model. In January 2024, it open-sourced DeepSeek-MoE, the first domestically developed open-source MoE large model. Beyond public benchmark leaderboards, these models have also generalized well on real-world samples, outperforming models of comparable scale. Talk to DeepSeek AI and easily access the API.