Many of Apple's wildest feature ideas come from this department

Lately, my commute has become "magic practice".

No wand required: a light pinch between thumb and forefinger is enough to control my Apple Watch.

Every year, I look forward to May.

Not only does it bring a long holiday, but its third Thursday is Global Accessibility Awareness Day, which means Apple will unveil some unexpected new ways of interacting: new "magic".

This year, we were delighted to speak again with Sarah Herrlinger, Apple's Senior Director of Global Accessibility Policy and Initiatives, to find out how the newly announced "magic" was conjured up.

▲ Sarah Herrlinger

However big Apple gets, its new features can still be remarkably "personal"

Among the new accessibility features announced this year, the one that surprised me the most was Apple Watch Mirroring.

Simply put, it "projects" the Apple Watch's interface onto the iPhone.

▲ Mirror the Apple Watch interface on the iPhone

From there, besides tapping directly on a slightly enlarged version of the watch interface on the iPhone, users can control the watch through all the assistive features the iPhone already supports: Voice Control, head tracking, and Switch Control.

This way, users who have difficulty touching the Apple Watch with their fingers can find a way of using it that suits them.

▲ Switch Control demo on iPhone at WWDC 2017

It sounds obvious in hindsight, but I was really curious how Apple came up with the idea in the first place.

As it turns out, it all started with one user:

We have a user with cerebral palsy. He told us that every time a new Apple Watch comes out and he hears about its new features, he wants to use them, especially the health features.

However, because of his limited mobility, he couldn't use the Apple Watch the way everyone else does.

Since he was already using his iPhone through Switch Control, we naturally wanted to let him use the accessibility features he was familiar with on the Apple Watch as well.

Once Apple ships the feature in a system update later this year, he will finally be able to use the Apple Watch's heart rate and blood oxygen measurements, Mindfulness, sleep tracking, and other health features on his own.

But Apple's accessibility team doesn't draw inspiration from individuals alone.

Herrlinger said that good accessibility design means listening to user letters, customer-service feedback, and the voices of disability communities, but that another crucial element is hiring employees with disabilities:

(We) don't just build (accessibility features) for them; we build features with them.

Only through such close contact with users can the team keep uncovering real pain points.

Unusual "crossover" to create innovation that only Apple can have

Another important new feature this year is Door Detection.

As a mode in the Magnifier app, Door Detection helps blind and low-vision users solve the "last few steps" problem.

Using the iPhone or iPad camera, the feature identifies doors and tells the user how far away the door is, whether it is open or closed, and how to open it (push, pull, or turn a handle).

▲ Detection readout: door closed, 8 feet away, sign text "MUFFIN TO WRITE HOME ABOUT BAKERY"

The mode also reads out text on the door and nearby signs, such as room numbers or accessibility symbols.

It may not sound complicated, but beyond the camera, the feature relies on the LiDAR scanner and on-device machine learning.

(The LiDAR and machine learning) teams actually have many projects beyond accessibility, but they work together with us on this one to create features that really only Apple can do.
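Door Detection itself ships only inside Magnifier and exposes no public API, but its ingredients (camera, LiDAR depth, on-device recognition) are available to any developer. The sketch below is a rough illustration, not Apple's implementation: it samples ARKit's LiDAR scene depth to estimate the distance to whatever sits at the center of the frame, and runs the Vision framework's on-device text recognizer over the camera image to read signage. The class name and logic are our own; actually recognizing a door would need a trained model Apple has not published.

```swift
import ARKit
import Vision

/// Illustrative sketch only: reports the LiDAR distance to the center of
/// the frame and reads sign text on-device. Door recognition itself would
/// require a trained model, which Apple has not published.
final class DoorishScanner: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only available on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // 1. Distance: sample the LiDAR depth map (Float32 meters) at its center.
        if let depthMap = frame.sceneDepth?.depthMap {
            CVPixelBufferLockBaseAddress(depthMap, .readOnly)
            let w = CVPixelBufferGetWidth(depthMap)
            let h = CVPixelBufferGetHeight(depthMap)
            let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
            let row = CVPixelBufferGetBaseAddress(depthMap)!
                .advanced(by: (h / 2) * rowBytes)
                .assumingMemoryBound(to: Float32.self)
            print(String(format: "Center of frame is %.1f m away", row[w / 2]))
            CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)
        }

        // 2. Signage: on-device OCR over the camera image.
        //    (A real app would throttle this rather than run it every frame.)
        let request = VNRecognizeTextRequest { req, _ in
            let lines = (req.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            if !lines.isEmpty { print("Sign text:", lines.joined(separator: " ")) }
        }
        request.recognitionLevel = .fast
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage)
            .perform([request])
    }
}
```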

Beyond this project, Apple's accessibility team routinely brokers such "crossovers".

Herrlinger told us that Apple maintains a sizable in-house accessibility team of "experts in building assistive technologies", who created features such as VoiceOver, Switch Control, and Voice Control that run across Apple's product lines.

The team also collects ideas from across the company, collaborates with different engineering teams within Apple on new features, and roots accessibility in the entire ecosystem:

Our Accessibility team works closely with the rest of Apple.

So we don't see accessibility as a single application or a single operating system; we integrate it fully into our entire hardware and software ecosystem. That is our unique advantage at Apple.

Inside Apple's "imagination center", an early look at tomorrow's interactive "magic"

Some say the meaning of art lies not in the pieces left behind in museums, but in the new perspectives and ways of thinking it gives humanity.

I think accessibility design has the same effect.

It takes us to perspectives we seldom imagine and opens up interactions we never thought possible, constantly expanding the collective imagination of technological innovation.

At Apple, the accessibility team is like a center of imagination: mining "extreme" needs, connecting different product teams to tackle "impossible" tasks, and driving innovation.

Remember the "magic gesture" at the beginning of the article?

It's actually AssistiveTouch, an accessibility feature Apple introduced for the Apple Watch last year.

By clenching a fist or pinching our fingers together, we can control the Apple Watch with one hand. That gives users who can't easily tap the screen with their other hand, or whose hands tremble, one more way to use the watch.
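Apple has described the underlying detection as reading the gyroscope, accelerometer, and optical heart rate sensor and classifying subtle muscle and tendon movement with on-device machine learning; none of that pipeline is public. Purely to illustrate the sensing side, here is a minimal watchOS sketch using CoreMotion, with an arbitrary threshold of our own standing in for Apple's classifier:

```swift
import CoreMotion

/// Illustrative sketch only. Apple's real gesture detector is an on-device
/// ML model over multiple sensors; this toy version merely watches the
/// accelerometer for a sharp impulse, as a crude stand-in for a "clench".
final class ToyGestureWatcher {
    private let motion = CMMotionManager()

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 50.0  // sample at 50 Hz
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Magnitude of acceleration; ~1 g when the wrist is at rest.
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if abs(magnitude - 1.0) > 0.6 {  // arbitrary, hand-picked threshold
                print("Possible clench-like impulse detected")
            }
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}
```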

Although it was originally built as an accessibility feature, it proved popular enough that Apple announced this year that it would launch "Quick Actions" based on the same technology.

Later this year, everyone will be able to answer or end calls, take photos, control music playback, and start or pause a workout with a "double pinch" gesture.

Now we also hope to develop more Quick Actions, so that more people, and not necessarily people with disabilities, can use this quick and convenient way of interacting.

Perhaps, as the range of Quick Actions grows, these "magic gestures" will become as commonplace as tapping AirPods to control playback.

Of course, this isn't the first time accessibility has entered the "mainstream".

iPadOS's support for Bluetooth and wired mice was likewise introduced first as an accessibility feature, and it too has become very popular.

As for Live Captions, launching later this year, I believe it will also become an accessibility feature that many people use.

On iPhone, iPad, and Mac, the feature transcribes any audio in real time, whether from phone calls, FaceTime, videos, or social media.

Moreover, these real-time captions are generated entirely on-device, keeping the audio private and secure.
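Live Captions itself has no third-party API, but the on-device transcription idea behind it can be illustrated with Apple's Speech framework, which since iOS 13 can be forced to keep recognition on the device. A minimal sketch, assuming microphone and speech-recognition permission have already been granted (Live Captions goes further, captioning system-wide audio that apps cannot tap):

```swift
import Speech
import AVFoundation

/// Illustrative sketch: on-device live transcription of microphone audio
/// with the Speech framework. Assumes authorization was already requested
/// via SFSpeechRecognizer.requestAuthorization and AVAudioSession setup.
final class OnDeviceTranscriber {
    private let engine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start() throws {
        guard let recognizer, recognizer.supportsOnDeviceRecognition else { return }
        // Keep all audio processing on the device, as Live Captions does.
        request.requiresOnDeviceRecognition = true
        request.shouldReportPartialResults = true

        // Feed microphone buffers into the recognition request.
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        engine.prepare()
        try engine.start()

        recognizer.recognitionTask(with: request) { result, _ in
            if let result {
                print("Caption:", result.bestTranscription.formattedString)
            }
        }
    }
}
```

The `requiresOnDeviceRecognition` flag is the key line: with it set, the Speech framework performs recognition locally instead of sending audio to Apple's servers, mirroring the privacy property the article describes.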

That's why I always pay attention to accessibility updates.

Hardware breakthroughs have their own "hardcore romance", while the solutions that spring from accessibility needs reflect the boundless possibilities of imagination.

And it always gives me the earliest taste of the interactive "magic" that will change tomorrow's life.
