Demystifying the iPhone 13's Cinematic mode: this is how Apple made it

I remember watching a documentary as a kid that explained the division of labor behind a film. Cinematographers are usually accompanied by a "focus puller", much as every golfer has a caddie and every rally driver has a co-driver; the role has become all but standard in the film industry.

The focus puller's main job is to make sure the lens's focal point (the focal plane) lands where the director of photography wants it, that is, to keep the subject sharp.

It sounds like a fairly simple job, but not everyone can do it. It demands both speed and precision when shifting focus; without years of practice building muscle memory, it is hard to become proficient and to move accurately between focus points.

Even now that camera autofocus technology has made great strides, the film industry still cannot do without the role: focus is still pulled entirely by hand, and the cinematographer still has a focus puller at their side.

A focus shift can foreground a character, or it can foreground the environment. The human eye instinctively tracks whatever is sharp, so a change of focus guides the audience's attention along with it, serving narrative or atmosphere. It has become part of the language of cinema; Zack Snyder would surely have something to say about it.

If ordinary people want a similar "cinematic" look without resorting to post-production, a full-frame camera and a fast, large-aperture lens will do the trick, but judged on cost and accessibility that option scores close to zero.

This year's Cinematic mode, available across the entire iPhone 13 series, may be a new option.

Cinematic mode is Portrait mode for video

Of all the iPhone 13 TVCs, the ones that impressed me most were "Whodunnit" and "Who stole my iPhone". Cinematic mode greatly strengthens the storytelling of both films; of course, the latter's cast is a highlight in its own right.

▲ The full clip is well worth watching

In fact, even before Apple unveiled the iPhone 13 on September 15, rumors had pointed to a "video portrait" feature, and at the time I didn't expect it to be particularly "useful".

But after watching the "Whodunnit" demonstration and spending time trying to "reproduce" it once we had the iPhone 13 series in hand, I gradually came to understand that Cinematic mode matters in the same way "Portrait mode" did on the iPhone 7 Plus.

Neither is built on classic optics; both find another way. The earlier Portrait mode and today's Cinematic mode are based on multi-lens parallax, not LiDAR, which explains why the whole iPhone 13 lineup supports it rather than just the Pro models.

The intent is obvious: to popularize it the way Portrait mode was popularized back then.

▲ The bigger face wins.

It is on this same basis that Cinematic mode behaves the way it does across the iPhone 13 lineup: the subject recognition that drives the simulated depth of field keys on faces, and when several faces are in frame, the larger one gets priority.

It cannot recognize objects or animals on its own; switching to them takes a manual tap. In a crowded scene the iPhone 13 can recognize several faces at once, but it will then hop between focus points frequently, so here too you need to step in and tap the person you want to follow, or press and hold to lock focus on them.
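Apple hasn't published how this subject selection works, but the "bigger face wins" behavior can be sketched with the public Vision framework: detect faces, then rank the candidates by bounding-box area. A minimal sketch, with Vision's face detector standing in for whatever Apple actually uses:

```swift
import Vision

/// A minimal sketch of "bigger face wins" subject selection.
/// Vision's face detector stands in here for Apple's unpublished
/// Cinematic pipeline; this is an illustration, not their method.
func focusCandidate(in pixelBuffer: CVPixelBuffer) -> VNFaceObservation? {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])

    // Rank detected faces by normalized bounding-box area;
    // the largest face becomes the default focus subject.
    return request.results?.max {
        $0.boundingBox.width * $0.boundingBox.height <
        $1.boundingBox.width * $1.boundingBox.height
    }
}
```

A tap-to-follow override would then simply replace this default with the face whose bounding box contains the tapped point.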

In addition, this is Apple's first attempt at Cinematic mode, so restrictions naturally abound. The matting is not detailed enough, so the edges of people sometimes "break up", and recording tops out at 1080p/30fps, even on the 1TB iPhone 13 Pro Max.

Even under those restrictions, Cinematic mode across the whole iPhone 13 series still manages Dolby Vision HDR encoding, a trade-off Apple chose deliberately.

"Movie Effect Mode" was actually born out of curiosity

In the past, many new features came from insight into user needs and from user feedback, but Apple, sitting at the top of the pyramid, builds products more according to its own understanding and its own rhythm.

▲ Johnnie Manzari, a designer on Apple's Human Interface team. Picture from: Getty Images

In an interview with TechCrunch editor-in-chief Matthew Panzarino, Apple Human Interface designer Johnnie Manzari explained how Cinematic mode came to be.

"Film effect mode" does not start from the function itself, but only because the design department is curious about the process of film making, using this as a starting point to study and learn film photography techniques, so as to achieve close-to-real focus conversion and some optical characteristics.

Manzari said the development of Cinematic mode somewhat resembled that of Portrait Lighting, the studio-light portrait feature that appeared with the iPhone X.

The team studied and analyzed the portraiture of Andy Warhol, the Baroque painter Rembrandt, and Chinese gongbi (fine-brush) painting, then applied the lessons in the product's algorithms.

▲ Andy Warhol's artwork (that's me at the bottom left).

The "movie effect mode" is a similar process. The Apple team first discussed with world-class videographers and watched a large number of movies.

In the process, we discovered some things about filmmaking that haven't changed over time. Shifting between focus points is a shared language of the film industry, and we needed to understand precisely how and when it is used.

▲ The focus shift in Cinematic mode.

So Manzari and his team worked closely with directors of photography and first assistant camera operators to understand the mechanics behind it all: shallow depth of field guides the audience's attention and helps carry the story.

However, the "zoom" is for professionals, and it is difficult for ordinary people to grasp the accuracy.

After learning that it takes a focus puller years of practice to hold focus in real time as the camera and subject move, Manzari, the design team, and Apple all came to believe that bringing this capability to market would make a very competitive feature.

▲ A simple follow-focus rig

To get there, Apple's research broke Cinematic mode into two parts: finding the focus point, and moving to it smoothly.

Finding the focus point was ultimately framed as "gaze detection", which in practice is closer to face detection: keep the lens on the protagonist's movement so the audience is guided to grasp the story quickly.

The smooth hand-off grew out of Manzari's long observation of the follow-focus wheel a focus puller operates. A skilled focus puller uses the wheel to move focus naturally and steadily, adjusting in real time to the lens's focal length and the distance to the subject.

Apple spent a great deal of time in Cinematic mode trying to simulate that handwork, which is why switching between focus points while shooting in Cinematic mode is not a mechanical snap but a visibly eased pull, like a focus puller's.
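The article doesn't say how Apple shapes its transition curve, but the basic idea of easing the simulated focus distance between two subjects instead of snapping can be sketched in a few lines. A minimal sketch, assuming a smoothstep curve and a 0.6-second pull, neither of which is Apple's actual parameter:

```swift
import Foundation

/// A minimal sketch of a simulated rack focus: instead of snapping the
/// focus distance to the new subject, ease it over time. The smoothstep
/// curve and the duration are illustrative assumptions, not Apple's.
struct FocusTransition {
    let from: Double      // current focus distance, metres
    let to: Double        // new subject's distance, metres
    let duration: Double  // seconds

    /// Simulated focus distance at `t` seconds into the transition.
    func distance(at t: Double) -> Double {
        let x = min(max(t / duration, 0), 1)
        let eased = x * x * (3 - 2 * x)   // smoothstep: slow in, slow out
        return from + (to - from) * eased
    }
}

// Example: pull focus from 1.2 m to 3.5 m over 0.6 s, sampled at 30 fps.
let rack = FocusTransition(from: 1.2, to: 3.5, duration: 0.6)
for frame in 0...18 {
    let t = Double(frame) / 30.0
    print(String(format: "frame %2d: %.2f m", frame, rack.distance(at: t)))
}
```

The slow-in, slow-out shape is what separates a focus pull that feels hand-operated from one that feels switched.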

Of course, Cinematic mode has only just been released, and like Portrait mode before it, it still has a great deal to optimize.

It's like watching Lang Lang play the piano with apparent ease while knowing that no algorithm can reach his level of mastery.

The hero behind Cinematic mode: the A15

Since it is "guessed" through an algorithm, in theory the "movie effect mode" can be downloaded to other iPhones (such as iPhone 12) via OTA, but real-time preview may be sacrificed. But with Apple's thinking, it is difficult to decentralize new features by sacrificing some experience.

▲ The A15's upgrade points; note that the CPU and GPU gains are measured against competitors.

Therefore, the "movie effect mode" will be a temporarily exclusive feature of the entire iPhone 13 series, and it will also be a threshold for the computing power of the A-series chips, and it may also become a new generation of iPhone "nail users."

Kaiann Drance, Apple's vice president of iPhone marketing, said in the same interview that rendering video with a simulated depth of field is more "challenging" than doing it for portrait photos.

▲ Kaiann Drance, vice president of iPhone marketing.

Video has to follow subjects as they move, apply real-time stabilization (digital plus optical), and accurately recognize different scenes, people, animals, and objects. The neural engine and machine learning therefore demand higher-quality depth data, rendered in real time.
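The piece doesn't describe Apple's renderer, but at the heart of any simulated depth of field is a mapping from each pixel's depth to a blur radius. A minimal sketch using the standard thin-lens circle-of-confusion formula, with a placeholder focal length and aperture rather than anything Apple has disclosed:

```swift
import Foundation

/// A minimal sketch of the optics behind simulated depth of field:
/// map a pixel's depth to a circle-of-confusion diameter with the
/// thin-lens formula. The 26 mm focal length and f/1.5 aperture are
/// placeholders for illustration, not Apple's renderer parameters.
func circleOfConfusion(subjectDepth d: Double,    // metres
                       focusDistance s: Double,   // metres
                       focalLength f: Double = 0.026,
                       fNumber n: Double = 1.5) -> Double {
    let aperture = f / n
    // CoC diameter on the sensor: c = A * f * |d - s| / (d * (s - f))
    return aperture * f * abs(d - s) / (d * (s - f))
}

// With focus held at 1.5 m, a pixel 5 m away earns a much larger blur
// than one at 2 m; a renderer blurs each pixel by its CoC.
print(circleOfConfusion(subjectDepth: 2.0, focusDistance: 1.5))
print(circleOfConfusion(subjectDepth: 5.0, focusDistance: 1.5))
```

Doing this for every pixel of every frame, on top of recognition and stabilization, is what makes the quality of the depth data and the chip's throughput matter.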

Add to that real-time rack focus and Dolby Vision HDR encoding. All of it rests on the A15; you could say that, young as it is, the chip carries the whole family's weight.

In real-world tests, the A15's GPU shows a much clearer gain than its CPU; it may adopt a new architecture and stacks on one more core. The A15's neural engine, meanwhile, has been raised to 15.8 TOPS of computing power.

These hardware improvements are likely tied to this year's video upgrades across the iPhone 13 series. Cinematic mode poses machine learning problems different from those of still photos, and it is a safe bet that Manzari's human interface team was already in deep conversation with the chip developers while the A15 was being designed.

In TechCrunch's test video, they also found that while identifying people, the iPhone 13's Cinematic mode uses its spare sensors to predict the subject.

The moment the back of Matthew's daughter's head appeared on screen, the iPhone 13 in his hand switched focus to her at once, like the director of a stage play who controls the whole scene: the protagonist has not yet stepped on stage, but the lights are already set, leading the audience to anticipate her entrance.

On this detail of Cinematic mode, Apple human interface designer Manzari said:

While observing film production, we found that the focus puller doesn't wait for the protagonist before moving focus; there is a process of anticipation. We likewise use the spare sensors and machine learning to predict the movement of people outside the frame, so that by the time they appear on screen, focus is already on them.
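The interview doesn't reveal how the prediction works beyond machine learning on the extra sensors. One of the simplest ways to express the idea is a constant-velocity extrapolation of a subject tracked in a wider camera's field of view; the sketch below is that stand-in, not Apple's model:

```swift
import CoreGraphics

/// A minimal sketch of anticipating an off-frame subject: track it in a
/// wider camera's field of view and extrapolate with constant velocity.
/// A stand-in for illustration; Apple's actual prediction is a
/// machine-learning model, per the interview.
struct SubjectTrack {
    var position: CGPoint   // normalized coordinates in the wide frame
    var velocity: CGVector  // normalized units per second

    /// Predicted position `dt` seconds from now.
    func predicted(after dt: CGFloat) -> CGPoint {
        CGPoint(x: position.x + velocity.dx * dt,
                y: position.y + velocity.dy * dt)
    }
}

/// Pre-focus if the subject is expected to enter the main camera's
/// narrower field of view within the lookahead window.
func shouldPreFocus(on track: SubjectTrack,
                    mainFieldOfView: CGRect,
                    lookahead: CGFloat = 0.5) -> Bool {
    mainFieldOfView.contains(track.predicted(after: lookahead))
}
```

That is the "lights set before the protagonist enters" effect in miniature: the decision to move focus is made before the subject is visible in the recorded frame.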

Jobs liked to say that Apple stands at the intersection of technology and the liberal arts.

Before the iPhone 13 series, "Portrait mode" read like a technical term, and it felt cold. Cinematic mode, as presented in the TVCs, no longer feels cold but carries a certain warmth, and digging into the story behind it adds a touch of "the art of shooting".

For ordinary people, Cinematic mode may not carry such deep meaning. It is more like the corkscrew on a Swiss Army knife: perhaps not the most refined or effortless tool, but one that works in a pinch.

Movies show us human emotions and stories; use the right language and they come through. We have worked hard for a long time so that the story of your life can be recorded and told on your phone. I can't wait to see the stories people write with Cinematic mode.
