Alibaba Cloud CTO Zhou Jingren: Tongyi Qianwen has closed the gap between open-source and closed-source models | WAIC 2024

Today at the Shanghai World Artificial Intelligence Conference, Alibaba Cloud presented its report card.

According to Alibaba Cloud, over the past two months downloads of the Tongyi Qianwen open-source models have tripled to more than 20 million, while the customer base of Alibaba Cloud's Bailian service has grown from 90,000 to 230,000, an increase of more than 150%.

Citing these figures, Alibaba Cloud CTO Zhou Jingren reiterated the company's firm commitment to open source:

Two years ago, we released the Tongyi large model series at the World Artificial Intelligence Conference and announced that the core Tongyi models would be open source. As of today, Tongyi Qianwen has achieved truly full-size, full-modality open source, flattening the gap between open-source and closed-source models.

Over the past year, the Tongyi model series has continued to evolve, and the performance of its base models has steadily improved.

On the OpenCompass benchmark, Tongyi Qianwen-Max scored on par with GPT-4 Turbo, the first time a domestic large model has reached that level on this benchmark.

Last August, Tongyi took the lead in going open source, and has since launched dozens of models along a "full-modality, full-size" open-source route.

For example, Qwen2-72B, Tongyi Qianwen's latest open-source model, scored 1,090 in Compass Arena, a Chinese large-model arena where models compete in anonymous head-to-head matchups. That was the highest score among domestic large models, second overall only to GPT-4o.

In the Open LLM Leaderboard evaluation run by the international open-source community Hugging Face, Qwen2-72B-Instruct again topped the list, beating overseas models such as Llama-3, Mixtral, and Phi-3.

Hugging Face co-founder and CEO Clem Delangue posted: "Qwen2 is the king, and China is the leader in the global open-source large model field."

Training and iterating on large models is extremely expensive, beyond the reach of most AI developers and small and medium-sized enterprises.

Against this backdrop, the Bailian large model platform was fully upgraded in May this year to become the key platform carrying Alibaba Cloud's cloud + AI capabilities, providing one-stop, fully managed large model customization and application services.

On Bailian, developers can build RAG (retrieval-augmented generation) applications in as few as 5 to 10 lines of code, giving large models their "most powerful plug-in."
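The "5 to 10 lines" figure refers to Bailian's managed service; as a rough illustration of what a RAG pipeline does under the hood, here is a minimal, self-contained sketch. The word-overlap retriever and all function names are hypothetical stand-ins, not Bailian's actual API:

```python
import re

# Toy RAG sketch (hypothetical, not the Bailian SDK): retrieve the most
# relevant document by word overlap, then assemble an augmented prompt.
# A managed platform would handle embeddings, indexing, and the model
# call behind a single API.

def tokenize(text: str) -> set[str]:
    # Lowercased word set; real systems use vector embeddings instead.
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by words shared with the query; return the top k.
    q = tokenize(query)
    return sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)[:k]

def build_prompt(query: str, context_docs: list[str]) -> str:
    # Prepend the retrieved context so the model answers from it.
    context = "\n".join(context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Qwen2-72B: an open-source large language model from Tongyi.",
    "RAG augments a prompt with retrieved documents before the model answers.",
]
query = "What is RAG?"
top = retrieve(query, corpus)      # picks the RAG document
prompt = build_prompt(query, top)  # ready to send to any chat model
```

The same shape — retrieve, augment, generate — is what a hosted platform compresses into a few SDK calls.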

On the model-service side, Alibaba Cloud Bailian also insists on building an open ecosystem.

At present, the Bailian platform has integrated hundreds of large model APIs. Besides the Tongyi, Llama, and ChatGLM series, it was also the first to host models from 01.AI and Baichuan Intelligence, covering mainstream vendors at home and abroad. It is linked to the ModelScope open-source community and also lets enterprises list general-purpose or industry models, giving developers a sufficiently diverse choice of models.

To lower the barrier to using models and accelerate the explosion of AI applications, Alibaba Cloud sharply cut the prices of the Tongyi Qianwen model series on May 21: its flagship GPT-4-class model dropped by 97%, to as little as 0.5 yuan per million tokens.
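As a back-of-the-envelope check on those figures, the pre-cut price implied by a 97% reduction (which the article does not state directly) works out as follows:

```python
# Back-of-the-envelope check of the quoted pricing. The pre-cut price is
# inferred from the 97% figure; it is not stated in the article.
new_price_yuan_per_m = 0.5                                # 0.5 yuan per million tokens
old_price_yuan_per_m = new_price_yuan_per_m / (1 - 0.97)  # ~16.7 yuan per million tokens

# Cost of processing 100 million tokens at the new price.
tokens = 100_000_000
cost_yuan = tokens / 1_000_000 * new_price_yuan_per_m     # 50.0 yuan
```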

Zhou Jingren emphasized that Alibaba Cloud will keep embracing open source and openness to build "the most open cloud in the AI era."


Ai Faner