Pry open the Snapdragon 8 and you'll find it stuffed with top-conference papers

"Besides photo optimization and voice assistants, what else does mobile phone AI have?"

When the new-generation Snapdragon 8 mobile platform was released this year, Qualcomm once again came up with some surprisingly imaginative ideas:

Teach the phone to "listen like a doctor", recognizing possible health conditions such as depression and asthma from the user's voice;

Give the phone an "anti-peeping" mode that automatically locks the screen when it detects an unfamiliar user's gaze;

Bring super-resolution to mobile games, delivering image quality that previously only ran on PCs…

More importantly, the Snapdragon 8 can run these AI features at the same time!

Qualcomm claims the 7th-generation AI Engine in the Snapdragon 8 delivers up to 4x the performance of the previous generation.

That means running several AI applications at once on the phone is no problem. And it is not just a raw performance gain; it translates into a smoother experience for users.

At a time when process-node upgrades are getting harder and harder, how did Qualcomm pull so many new tricks out of the 7th-generation AI Engine, in both performance and applications?

We looked through research papers and technical documents published by Qualcomm and found some clues:

The documentation for AIMET, Qualcomm's open-source toolkit, explains how to compress AI super-resolution models;

A technical blog post related to the "anti-peeping" feature describes how to apply object detection while preserving user privacy…

These documents, and the top-conference papers behind the technical blogs, all come from one organization: Qualcomm AI Research.

You could say Qualcomm has "hidden" many of the lab's published AI papers inside the 7th-generation AI Engine.

Top-conference papers "hidden" in mobile AI

Let's start with the camera algorithm improvements in the 7th-generation AI Engine.

For intelligent recognition, Qualcomm has increased the number of facial landmark points to 300 this year, capturing more subtle changes in expression.

At the same time, Qualcomm has made face detection 300% faster. How?

We found the answer in a paper Qualcomm published at CVPR.

In it, Qualcomm proposes a new convolutional layer called Skip-Convolutions, which subtracts consecutive frames and convolves only the parts that changed.

Much like the human eye, it is far more attentive to the "moving parts".

This lets the Snapdragon 8 focus on the target object itself when running real-time video algorithms such as object detection and image recognition, and spend the spare compute on improving accuracy.
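
The gist can be captured in a few lines. Below is a minimal sketch of the idea in PyTorch, not Qualcomm's implementation: because a bias-free convolution is linear, each new frame can be processed by convolving only its residual against the previous frame and reusing the cached output.

```python
import torch
import torch.nn as nn

class SkipConv2d(nn.Module):
    """Toy skip-convolution: convolve only the frame-to-frame residual.

    Since conv(x_t) = conv(x_{t-1}) + conv(x_t - x_{t-1}) for a bias-free
    convolution, pixels whose residual is (near) zero can simply reuse the
    previous output. This sketch zeroes out small residuals; a real kernel
    would skip those positions entirely to save compute.
    """
    def __init__(self, in_ch, out_ch, k=3, threshold=0.05):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)
        self.threshold = threshold
        self.prev_in = None
        self.prev_out = None

    def forward(self, x):
        if self.prev_in is None:
            y = self.conv(x)                       # first frame: full convolution
        else:
            residual = x - self.prev_in
            changed = (residual.abs().amax(dim=1, keepdim=True) > self.threshold).float()
            y = self.prev_out + self.conv(residual * changed)   # update changed regions only
        self.prev_in, self.prev_out = x.detach(), y.detach()
        return y

layer = SkipConv2d(3, 16)
video = torch.rand(8, 3, 64, 64)                   # a short synthetic "clip"
outputs = [layer(frame.unsqueeze(0)) for frame in video]
```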

You might ask: what good is face detection in such detail when taking photos?

Well, this time Qualcomm and Leica jointly launched the Leica Leitz filter, driven by an AI engine that includes face detection among other algorithms, so users can shoot artistic, stylized photos without having to think about it.

Beyond face detection, Qualcomm's smart shooting features also include super-resolution, multi-frame noise reduction, local motion compensation…

But the video stream in high-resolution shooting usually has to be processed in real time. How does the AI Engine handle that much data intelligently?

The answer is in another CVPR paper: Qualcomm proposes a neural network built from multiple cascaded classifiers, which adjusts how many of the model's neurons are used based on the complexity of each video frame, controlling its own compute cost.
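
Here is a minimal early-exit cascade in PyTorch to illustrate the principle of scaling compute with frame complexity; the architecture and thresholds are invented for the sketch and are not Qualcomm's.

```python
import torch
import torch.nn as nn

class CascadeClassifier(nn.Module):
    """Toy cascade with early exits: easy frames leave at the first stage,
    hard frames pay for deeper stages."""
    def __init__(self, num_classes=10, exit_threshold=0.9):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8)),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4)),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1)),
        ])
        self.heads = nn.ModuleList([
            nn.Linear(16 * 8 * 8, num_classes),
            nn.Linear(32 * 4 * 4, num_classes),
            nn.Linear(64, num_classes),
        ])
        self.exit_threshold = exit_threshold

    @torch.no_grad()
    def forward(self, frame):
        x = frame
        for i, (stage, head) in enumerate(zip(self.stages, self.heads)):
            x = stage(x)
            probs = head(x.flatten(1)).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() > self.exit_threshold or i == len(self.stages) - 1:
                return pred, i                     # prediction plus the exit stage

model = CascadeClassifier()
pred, exit_stage = model(torch.rand(1, 3, 64, 64))
print(f"predicted class {pred.item()}, exited at stage {exit_stage}")
```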

Even faced with the heavy, complex workload of intelligent video processing, the AI can now keep up.

Beyond smart photography, Qualcomm's voice technology is another highlight this time.

As mentioned at the beginning, the 7th-generation AI Engine can accelerate the analysis of a user's voice patterns on the phone itself to assess the risk of health conditions such as asthma and depression.

So how does it analyze the user's voice accurately without harvesting their data?

Specifically, Qualcomm has proposed an on-device federated learning approach: the model can be trained on phone users' voice data while ensuring that the raw voice recordings never leave the device.
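
The pattern behind federated learning is simple to sketch. The toy federated-averaging loop below is illustrative only; the model, the "phones", and their data are all stand-ins, and Qualcomm's actual training setup is not public in this form.

```python
import copy
import torch
import torch.nn as nn

def local_update(global_model, features, labels, lr=0.01):
    """Train a copy of the global model on one phone's private data."""
    local = copy.deepcopy(global_model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss = nn.CrossEntropyLoss()(local(features), labels)
    opt.zero_grad(); loss.backward(); opt.step()
    return local.state_dict()                     # only weights leave the device, never audio

def federated_average(state_dicts):
    """Server-side FedAvg: average the weight updates uploaded by each device."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg

# Toy setup: three "phones", each holding a few private voice-feature vectors.
global_model = nn.Sequential(nn.Linear(40, 64), nn.ReLU(), nn.Linear(64, 2))
phones = [(torch.randn(8, 40), torch.randint(0, 2, (8,))) for _ in range(3)]

for _ in range(5):                                # a few federated rounds
    updates = [local_update(global_model, x, y) for x, y in phones]
    global_model.load_state_dict(federated_average(updates))
```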

Many of these AI features can be traced back to papers published by Qualcomm AI Research.

The papers also reveal the theoretical underpinnings of the AI performance gains mentioned at the beginning, which raises a question:

With so many AI models running at once, how does Qualcomm squeeze more processing performance out of the hardware?

This is where one of Qualcomm's key research directions in recent years comes in: quantization.

Judging from Qualcomm's latest technology roadmap, model quantization has been one of the core topics at Qualcomm AI Research for the past few years. Its purpose is to "slim down" AI models.

Because phones are limited in power, compute, memory, and heat dissipation, the AI models they run are very different from those on PCs.

On a PC, the GPU routinely draws hundreds of watts, and models can compute in 16- or 32-bit floating point (FP16, FP32). A phone SoC has only a few watts to work with and struggles even to store large models.

So the FP32 model has to be reduced to 8-bit integers (INT8) or even 4-bit integers (INT4), while keeping the loss in accuracy acceptably small.

Take AI image matting as an example: with the compute of a desktop processor we can usually afford a very accurate matting model, but to get nearly the same effect on a phone, model quantization is a must.
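
To make the idea concrete, here is a minimal sketch of asymmetric (affine) quantization, mapping an FP32 tensor onto an integer grid with a scale and zero point. It is a textbook illustration, not Qualcomm's quantizer.

```python
import torch

def quantize_tensor(x, num_bits=8):
    """Map the float range [min, max] onto the integers [0, 2^bits - 1]."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()).clamp(min=1e-8) / (qmax - qmin)
    zero_point = (qmin - x.min() / scale).round().clamp(qmin, qmax)
    q = (x / scale + zero_point).round().clamp(qmin, qmax)
    return q, scale, zero_point

def dequantize_tensor(q, scale, zero_point):
    """Recover an approximation of the original float tensor."""
    return (q - zero_point) * scale

w = torch.randn(64, 64)                            # stand-in FP32 weight matrix
for bits in (8, 4):
    q, s, z = quantize_tensor(w, bits)
    err = (dequantize_tensor(q, s, z) - w).abs().mean()
    print(f"INT{bits}: mean absolute error {err:.4f}")   # INT4 loses noticeably more than INT8
```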

To get more AI models onto phones, Qualcomm has done a great deal of quantization research. Its top-conference papers include data-free quantization (DFQ), the AdaRound rounding mechanism, and Bayesian Bits, a technique for joint quantization and pruning, among others.

DFQ is a data-free quantization technique that cuts the time spent on training and improves post-quantization accuracy. On MobileNet, the most common vision model on phones, DFQ outperformed all other methods:

AdaRound can quantize the weights of networks as complex as ResNet-18 and ResNet-50 down to 4 bits, greatly shrinking model storage while losing less than 1% accuracy:

Bayesian Bits, a new quantization operation, works in doubling bit widths: at each new bit width it quantizes the residual error between the full-precision value and the previously rounded value, providing a better trade-off between accuracy and efficiency.
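
A tiny numerical illustration of that residual decomposition follows; it is our simplified reading of the core idea and leaves out the learned gates that Bayesian Bits uses to decide which higher-precision terms to keep.

```python
import torch

def residual_decomposition(x, bit_widths=(2, 4, 8)):
    """Build up a quantized approximation term by term: start with a coarse
    2-bit grid, then at each doubled bit width quantize the residual that the
    previous terms left unexplained."""
    x_min, x_max = x.min(), x.max()
    approx = torch.zeros_like(x)
    for bits in bit_widths:
        step = (x_max - x_min) / (2 ** bits - 1)   # grid step at this bit width
        residual = x - approx                      # error left by coarser terms
        approx = approx + (residual / step).round() * step
        print(f"after {bits}-bit term: mean abs error {(x - approx).abs().mean():.5f}")
    return approx

residual_decomposition(torch.rand(1000))
```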

These techniques let more AI models run on phones at lower power. Game AI super-resolution (similar to DLSS), which used to be possible only on PCs, can now run on the Snapdragon 8;

Some of these AI models can even run simultaneously, such as gesture detection and face recognition:

But papers are only the first step.

Getting AI capabilities into more applications quickly also takes platforms and open-source tools.

Opening up more AI capabilities to applications

Here, Qualcomm has taken an open approach.

Through collaborations and open source, Qualcomm AI Research has shared the methods and models from these papers for building AI applications efficiently with developer communities and partners, which is why we get to see so many interesting features and applications on the Snapdragon 8.

On one hand, Qualcomm has partnered with Google to give developers the tools to build more AI applications quickly.

The Snapdragon 8 supports Google's Vertex AI NAS service, which is updated monthly, so AI applications built on the 7th-generation AI Engine can keep improving their models' performance rapidly.

With NAS, developers can let AI automatically generate a suitable model. The smart camera algorithms, voice translation, and super-resolution that Qualcomm presented at top conferences can all go into the search space, and the best-matching model is found automatically for the developer.

Qualcomm's motion compensation and frame interpolation algorithms are a case in point: developers can implement AI techniques like these through NAS and get models that adapt well to the Snapdragon 8, without the frustration of tuning that goes nowhere.

Imagine playing games on a Snapdragon 8 phone in the future: the picture feels smoother, yet power consumption does not go up:

Maintaining the AI models also becomes simpler. According to Google, training a model with Vertex AI NAS can take nearly 80% fewer lines of code than on other platforms.

On the other hand, Qualcomm has also open-sourced the quantization tools it has built up over years of research.

Last year, Qualcomm open-sourced a model efficiency toolkit called AIMET (AI Model Efficiency Toolkit).

It includes a wide range of compression and quantization algorithms, such as neural network pruning and singular value decomposition (SVD), many of them the results of Qualcomm AI Research's top-conference papers. With AIMET, developers can apply these algorithms directly to their own AI models and make them run more smoothly on phones.
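
A rough sketch of the typical AIMET post-training quantization workflow for a PyTorch model is shown below, following the flow described in its public documentation; exact argument names can differ between AIMET releases, and the calibration data here is random only to keep the sketch self-contained.

```python
import torch
from torchvision.models import mobilenet_v2
from aimet_torch.quantsim import QuantizationSimModel   # AIMET's quantization simulator

model = mobilenet_v2(pretrained=True).eval()
dummy_input = torch.rand(1, 3, 224, 224)

# Wrap the model with simulated INT8 quantizers for weights and activations.
sim = QuantizationSimModel(model, dummy_input=dummy_input,
                           default_param_bw=8, default_output_bw=8)

def calibrate(sim_model, _):
    """Run representative inputs so AIMET can compute quantization ranges."""
    with torch.no_grad():
        for _ in range(8):
            sim_model(torch.rand(1, 3, 224, 224))

sim.compute_encodings(forward_pass_callback=calibrate, forward_pass_callback_args=None)

# Evaluate or fine-tune the simulated-quantization model, then export the
# model plus its encodings for deployment on the device runtime.
sim.export(path='./', filename_prefix='mobilenet_v2_int8', dummy_input=dummy_input)
```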

Qualcomm's quantization capabilities are not just open-sourced for individual developers; they also help leading AI companies bring more of their applications to the Snapdragon 8.

On the new Snapdragon 8, Qualcomm worked with Hugging Face, the well-known NLP company, so the phone's smart assistant can analyze notifications and recommend which ones to prioritize, letting users spot the most important ones at a glance.

Running their sentiment analysis model on the Qualcomm AI Engine is up to 30 times faster than on an ordinary CPU.

It is this accumulation of research, and the openness with which it is shared, that lets Qualcomm keep bringing fresh AI ideas to the phone industry:

From earlier features such as smart object removal in video and smart meeting mute, to this year's privacy screen and on-phone super-resolution…

More AI applications, built from papers, platforms, and open-source tools, are all carried by this AI Engine.

Qualcomm AI Research, long working behind the scenes of this research, has surfaced once again with the arrival of the 7th-generation AI Engine.

Qualcomm AI's "hard" and "soft" sides

Most of the time, our impression of Qualcomm AI seems to stop at the "hardware performance" of its AI Engine.

After all, ever since its first AI project launched in 2007, Qualcomm has kept improving its hardware's ability to process AI models.

But Qualcomm's research on AI algorithms has also been in the works for a long time.

In 2018, Qualcomm established its AI research lab, headed by Max Welling, a well-known theorist in the AI field and a student of Hinton, the father of deep learning.

By a rough count, since the lab was founded Qualcomm has published dozens of papers at top AI conferences such as NeurIPS, ICLR, and CVPR.

At least four of its model compression papers have made their way into phone-side AI, and there are many more on computer vision, speech recognition, and privacy-preserving computation.

The 7th-generation AI Engine described above is really just a snapshot of Qualcomm's AI algorithm research in recent years.

Building on that research, Qualcomm has also extended its AI models into many cutting-edge application scenarios.

In autonomous driving, Qualcomm has launched the Snapdragon Automotive Digital Platform, a one-stop solution spanning chips to AI algorithms. It has partnerships with more than 25 automakers, and the number of connected cars using its solutions has reached 200 million.

Among them, BMW's next-generation driver-assistance and autonomous driving systems will adopt Qualcomm's autonomous driving solution.

In XR, Qualcomm released the Snapdragon Spaces XR development platform for building devices and applications such as head-mounted AR glasses.

Through a partnership with Wanna Kicks, the Snapdragon 8 also brings the 7th-generation AI Engine's capabilities to AR try-on apps.

For drones, Qualcomm released the Flight RB5 5G platform this year. Features such as 360° obstacle avoidance, aerial photography, and image stabilization can all be implemented with the platform's AI models. Notably, Ingenuity, the first drone to fly on Mars, carries processors and related technologies supplied by Qualcomm.

Looking back, it is clear that this time Qualcomm is no longer leading with raw compute (TOPS) when it talks about AI performance. Instead, it treats software and hardware as a whole, arrives at the 4x AI performance figure that way, and pushes harder on landing AI experiences across the board.

This shows not only that Qualcomm cares more about users' real-world experience, but also its confidence in its own software, because hardware alone is no longer the full measure of Qualcomm's AI capability.

You could say the 7th-generation AI Engine in the Snapdragon 8 marks the start of Qualcomm's integration of AI software and hardware.

Recently, Qualcomm has also presented several new pieces of codec research, published at ICCV 2021 and ICLR 2021.

In these papers, Qualcomm again uses AI algorithms to demonstrate new approaches to codec optimization.

In a study built on GANs, Qualcomm's latest codec makes each frame not only clearer but also smaller, down to just 14.5 KB:

By contrast, when a conventional codec compresses the same frame to 16.4 KB, the forest becomes extremely blurry:

In another paper, which brings the idea of frame interpolation into a neural codec, Qualcomm combines neural-network-based P-frame compression with interpolation-based compensation, using AI to predict the motion compensation that still needs to be applied after interpolation.
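
The general shape of a neural P-frame codec is easy to sketch: predict the current frame by warping the previous reconstruction with an estimated motion field, then transmit only the motion and the residual. The toy below is drastically simplified and is not the architecture from the paper; real codecs add learned quantization and entropy coding of the flow and residual.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyPFrameCodec(nn.Module):
    """Minimal neural P-frame idea: estimate flow, warp the previous frame,
    and treat the leftover residual as the payload to compress."""
    def __init__(self):
        super().__init__()
        self.flow_net = nn.Sequential(             # predicts a dense 2-channel flow field
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),
        )

    def warp(self, frame, flow):
        b, _, h, w = frame.shape
        ys, xs = torch.meshgrid(torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
        base = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
        offsets = torch.stack((flow[:, 0] / (w / 2), flow[:, 1] / (h / 2)), dim=-1)
        return F.grid_sample(frame, base + offsets, align_corners=True)

    def forward(self, prev_recon, cur_frame):
        flow = self.flow_net(torch.cat([prev_recon, cur_frame], dim=1))
        prediction = self.warp(prev_recon, flow)
        residual = cur_frame - prediction          # a real codec would quantize and entropy-code this
        return prediction, flow, residual

codec = ToyPFrameCodec()
prev_f, cur_f = torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64)
prediction, flow, residual = codec(prev_f, cur_f)
```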

In tests, this algorithm beats the previous state of the art that Google set at CVPR 2020, and also outperforms the compression of current open-source codecs based on the H.265 standard.

This is not Qualcomm's first attempt to apply AI models to new areas, but video codecs are a new direction.

If these models make it onto the platform, or even into applications, watching video on our devices could become a genuinely unconstrained experience.

As the "soft and hard integration" program is continued, we may actually see these latest AI results being applied to smartphones in the future.

Combining Qualcomm's "muscle show" in PC, automobile, XR and other fields…

It is foreseeable that the Qualcomm you are familiar with and the Snapdragon you are familiar with will definitely not stop at the mobile phone, and its AI capabilities will not stop at the mobile phone.
