Apple is playing an “imitation game” that deceives the senses

The feature of the 2024 iPad Pro that interests me most is not the new M4 chip or the new OLED display, but a tiny "chemical reaction" between the Apple Pencil Pro and the iPad Pro: when you draw or write with the Apple Pencil Pro on the 2024 iPad Pro, the screen shows the shadow of the corresponding brush, so users know which mode the Apple Pencil Pro is currently in.

Soon afterwards, people found ten 3D brush models that Apple built for this feature inside its PencilKit framework.

What's interesting is that this shadow is not a simple texture: it changes with the Apple Pencil Pro's position in the real world, its distance from the screen, and its tilt angle. In short, it feels remarkably authentic, spatial, and interactive.
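To make the idea concrete, here is a toy model (invented for illustration, not Apple's actual implementation) of how a hover shadow could be derived from a pencil's pose: the shadow slides along the pencil's azimuth as it tilts, and softens as the tip rises from the glass.

```swift
import Foundation

// Hypothetical model of a hover shadow. All names and constants here are
// made up for illustration; Apple's real renderer is not public.
struct PencilPose {
    var hoverHeight: Double   // metres above the screen
    var altitude: Double      // radians; .pi/2 = perpendicular to the screen
    var azimuth: Double       // radians; direction the pencil leans toward
}

struct Shadow {
    var offsetX: Double
    var offsetY: Double
    var blurRadius: Double
}

func shadow(for pose: PencilPose, lightHeight: Double = 0.3) -> Shadow {
    // Project the tip onto the screen as if lit from above: the flatter
    // the pencil (smaller altitude), the farther the shadow slides.
    let slide = pose.hoverHeight / tan(max(pose.altitude, 0.1))
    let offsetX = cos(pose.azimuth) * slide
    let offsetY = sin(pose.azimuth) * slide
    // Soft shadows: blur grows linearly with distance from the surface.
    let blur = pose.hoverHeight / lightHeight * 0.01
    return Shadow(offsetX: offsetX, offsetY: offsetY, blurRadius: blur)
}
```

A pencil resting perpendicular on the glass produces no offset and no blur; tilting it while hovering slides and softens the shadow, which is exactly the "spatial" feel the feature conveys.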

More details about the Apple Pencil Pro have also emerged: Apple packed it with a new haptic feedback engine, a new pressure sensor, a gyroscope, and a U2 chip. It has the potential to simulate a real writing experience on the iPad Pro, and also the potential to work together with Vision Pro.

Apple's imitation game

Apple may be the company that most enjoys simulating real-world sensory experiences through hardware.

The Taptic Engine introduced on the 2015 MacBook blurs the line between fake and real through its delicate simulation of vibration and touch. Even now, many people may not realize that the MacBook's Force Touch trackpad is actually a flat surface that cannot physically be pressed down.

Whether with the earlier 3D Touch or today's non-pressure-sensitive Haptic Touch, the iPhone uses precisely tuned vibration feedback on that flat sheet of glass to simulate the magical rebound of a pressed button; combined with the click sound effect, it makes people want to press it a few more times.

The same technology is used in the iPhone 7's Home button and the Apple Watch's Digital Crown, both of which achieve a feel very close to that of mechanical buttons and gears.

Beyond using the Taptic Engine to "fool" users' fingers, iOS 14 and AirPods introduced "spatial audio", which simulates sound from every direction in 3D space and, with head tracking, gives users an immersive sense of sound. On social platforms, many users said that after turning on spatial audio and putting on their headphones, they thought the sound was playing out loud from their phones.

Before the Apple Pencil Pro was released, many sources predicted that Apple would equip the new pen with its own Taptic Engine, so that the Apple Pencil Pro and iPad could provide a realistic friction sensation close to that of pen on paper.

This sounds a bit "sci-fi", but it is not unreasonable: Apple is a manufacturer that enjoys, and excels at, playing with vibration motors. Even though the current Pencil has not yet implemented such a feature, it remains promising.

Another company that also likes to use hardware to deceive users' senses is Nintendo.

The Joy-Con controllers on the Nintendo Switch support "HD Rumble", letting players distinguish the subtle difference between ice cubes and water being poured into a cup purely through vibration. Combined with the Labo cardboard kits, the gyroscope and vibration motor in the controller can even recreate the feeling of pulling a fish out of the water.

Whether it's Apple's vibration magic or Nintendo's Labo kits and controllers, they all use "one piece of hardware" to simulate "countless possibilities."

For example, the Taptic Engine in the iPhone 7 simulates more than the Home button and 3D Touch: the "head shake" vibration when a wrong passcode is entered, as well as every other piece of feedback throughout the system, is produced by this single large internal Taptic Engine.
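The "one actuator, countless feelings" trick comes down to parameterization: every distinct sensation is just a different list of timed events. The event model below is invented for illustration, in the spirit of Apple's Core Haptics API (`CHHapticEvent` with intensity and sharpness parameters), not its actual types.

```swift
import Foundation

// An invented event model showing how one actuator plays many feelings:
// each pattern is a different list of timed intensity/sharpness events.
struct HapticEvent {
    var time: Double       // seconds from pattern start
    var intensity: Double  // 0...1, vibration strength
    var sharpness: Double  // 0...1, dull thud vs. crisp tick
}

// A crisp single click, like the simulated Home button press.
let click = [HapticEvent(time: 0, intensity: 1.0, sharpness: 0.9)]

// The "head shake" for a wrong passcode: three quick dull buzzes.
let wrongPasscode: [HapticEvent] = (0..<3).map { i in
    HapticEvent(time: Double(i) * 0.12, intensity: 0.7, sharpness: 0.3)
}

// Pattern length, taken as the start time of the last event.
func duration(of pattern: [HapticEvent]) -> Double {
    pattern.map(\.time).max() ?? 0
}
```

The hardware never changes; only the event list does, which is why a single motor can stand in for a button, a warning shake, and everything in between.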

Spatial audio works similarly: through gyroscopes and accelerometers, AirPods Max can simulate complex sound with a convincing sense of spatial direction.
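The core idea behind head tracking can be sketched in a few lines. This is a minimal illustration of the principle, not Apple's actual algorithm: the virtual source stays fixed in the room, so as the head turns, the source is counter-rotated in the listener's head-relative frame and the per-ear levels follow.

```swift
import Foundation

// Angles are radians; positive = to the listener's left.
// Counter-rotate the source by the head's yaw so it stays put in the room.
func headRelativeAngle(sourceAngle: Double, headYaw: Double) -> Double {
    var a = (sourceAngle - headYaw).truncatingRemainder(dividingBy: 2 * .pi)
    if a > .pi { a -= 2 * .pi }
    if a < -.pi { a += 2 * .pi }
    return a
}

// Crude constant-power pan: a source on the left boosts the left ear.
// Real spatial audio uses HRTF filtering, far beyond simple panning.
func earGains(relativeAngle: Double) -> (left: Double, right: Double) {
    let pan = sin(relativeAngle)        // -1 (full right) ... +1 (full left)
    let theta = (1 - pan) * .pi / 4     // 0 = full left, .pi/2 = full right
    return (left: cos(theta), right: sin(theta))
}
```

Turn your head 90° to the left and a sound that was straight ahead now sits at your right ear, which is exactly the "it's coming from the room, not the headphones" illusion users describe.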

If Nintendo's goal is to bring players a more realistic gaming experience, then Apple's "imitation game" is a full-spectrum simulation covering touch, hearing, and vision.

Therefore, the Apple Pencil Pro's shadow is by no means an isolated little Easter egg.

The Vision Pro completion plan

During visionOS testing last year, some developers found that when a user held an Apple Pencil, Vision Pro could not recognize finger-pinch gestures normally. Apple's official response was that "this is a designed feature, not a bug", which suggests that recognition of the Apple Pencil is likely already built into Vision Pro.

With the addition of hardware such as a haptic feedback engine, new pressure sensor, gyroscope, and U2 chip, the Apple Pencil Pro looks more like a miniature VR controller, and the logic is quite similar.

Coincidentally, Logitech has launched MX INK, a stylus for VR devices, aimed at the Meta Quest 2 and 3.

This stylus, about 10 grams heavier than the Apple Pencil 2, can not only create flat content in a mixed-reality environment like a traditional stylus, but also supports six-degrees-of-freedom tracking for creating freely in 3D space. The body carries multiple buttons, and the tip is pressure-sensitive.

▲Logitech MX INK

So, does Vision Pro need a pen? In other words, does it need "peripherals" beyond eyes and fingers?

Before tackling this open question, we can look at the series of explainer videos Apple prepared for developers at WWDC24, such as "Creating an Engaging Spatial Photo and Video Experience."

These videos introduce a number of concepts. Apple divides the stereoscopic video experience into three types: 3D video, spatial video, and Apple immersive video:

  • 3D video is presented on a flat plane, with content that has a sense of depth, similar to the 3D movies we watch in cinemas.
  • Spatial video adds to that stereoscopic effect: the content is scaled to the true size of the objects, and its boundaries are blurred.
  • Immersive video completely surrounds the user; the user is not watching a video but is immersed in it.

Among them, "spatial video" can mimic a sense of presence in the real world and allows the viewing angle to change within a certain range, while "immersive video" goes further still: everything the user sees is the video, and the user becomes part of it.

Moreover, Apple has laid out additional requirements; for example, immersive video needs to include interactive 3D content.

▲ The "What If…?" immersive story Marvel launched for Vision Pro lets users reach out and participate (Source: YouTube @Adam Savage's Tested)

The explainer videos also mention the concept of "spatial metadata", along with an explanation of "projection": projection defines the relationship between objects in the world and pixels in the image.
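That definition can be made concrete with the textbook pinhole-camera model: projection maps a 3D point in the world to a 2D pixel in the image. The focal length and image size below are made-up values for illustration, not Vision Pro specifications.

```swift
import Foundation

// A toy pinhole camera: world point -> image pixel.
struct Camera {
    var focal: Double            // focal length expressed in pixels
    var cx: Double, cy: Double   // principal point (image centre)
}

func project(point: (x: Double, y: Double, z: Double),
             camera: Camera) -> (u: Double, v: Double)? {
    guard point.z > 0 else { return nil }  // behind the camera: not visible
    // Perspective divide: farther points land closer to the image centre.
    let u = camera.focal * point.x / point.z + camera.cx
    let v = camera.focal * point.y / point.z + camera.cy
    return (u: u, v: v)
}
```

A point straight ahead lands on the image centre, and doubling its distance halves its offset from the centre; recording this world-to-pixel relationship alongside the footage is what lets a player reconstruct depth and scale later.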

The little Easter egg in the Apple Pencil Pro is also explained in these Vision Pro developer videos: Apple has never treated the 2024 iPad Pro and the Apple Pencil Pro as separate devices; they are connected through "projection."

In other words, an Apple Pencil may one day be "projected" into pens of various shapes in Vision Pro's virtual world.

Since Vision Pro can explain the questions surrounding the Apple Pencil Pro, and Apple is the master of this "imitation game", we have reason to believe the following: the current Vision Pro is still an early-stage product. It can already achieve a remarkable degree of audiovisual "realism", but it falls far short in terms of touch, which comes down to the fact that Vision Pro has no specially adapted peripherals. We can't expect Vision Pro alone to fool our sense of touch, can we?

Now, we can be almost certain that an Apple Pencil Pro equipped with a vibration engine, gyroscope, and U2 chip can work with Vision Pro, and that it can be "projected" into other appearances, such as a fountain pen. Could this pen then be used like the controllers of other VR devices, becoming a lightsaber in Beat Saber or a wand in a Harry Potter game?

▲ Dexmo force feedback gloves

If we stretch our imagination a little further, perhaps Apple will release a "force feedback glove": a glove covered with vibration motors that, with Apple's tuning, lets users truly "touch" the virtual world Vision Pro presents.

It is almost certain that Apple is already looking for ways to add more "interactions" to Vision Pro, because Vision Pro is not yet complete, and Apple's "imitation game" of substituting the virtual for the real is far from over.

The importance of “realism”

In previous articles, we discussed the importance of buttons. For Vision Pro, a device where users experience essentially no physical controls, interaction and feedback matter even more. The editor-in-chief of the technology magazine Wired once put it this way:

All devices require interaction. If something doesn't feel interactive, it's considered broken.

This perfectly matched my first experience with Vision Pro: no matter how or where I tried to gesture, I couldn't click the button I wanted, and I couldn't even tell whether the problem was Vision Pro's or my own.

Therefore, even at the cost of cutting the 3.5mm headphone jack and shrinking the battery, Apple made room for a huge Taptic Engine, which not only extends the component's life by abandoning the mechanical structure, but also ensures the realism of interactive feedback.

A realistic sense of interaction is not just a signal of whether the device is working. Going deeper, we have moved completely from the "button" era to the "touch" era, and are now entering the era of mid-air operation. In the future, perhaps we will no longer need to touch any device at all: holographic UIs will surround us, and all we will need to do is point in the air or speak to get what we want done.

This blurred future between virtual and reality will bring us all kinds of uncertainty. For example, how do you know that you are a living person and not a brain in a vat?

Perhaps Apple's "imitation game" will one day come to an end as interaction changes, but until then, we can look forward to the many ways Apple will deceive our fingers and brains.


Ai Faner