Apple Is Now Using AirPods to Completely Capture Your Mind

Nearly four years after its release, Apple has revealed its ambitions for the AirPods product line.

Before WWDC20, the AirPods Pro was simply a true wireless headset with active noise cancellation. In the half year since it went on sale, it has been the most popular product in its category, with sales far ahead of every competitor.

But Apple is not satisfied with this. With iOS 14, just announced at WWDC20, Apple is giving AirPods Pro new capabilities. With the new software, AirPods Pro will be able to produce a simulated surround sound field, switch automatically between devices, and perform other sound-related functions; it will also be able to sense the user’s movement through its built-in accelerometer to support richer motion detection.

AirPods Pro is no longer just a pair of headphones; it is becoming a device for filtering and enhancing sound, as well as a wearable sensor.

Different sounds

From the day of its release, AirPods Pro won near-unanimous praise from users.

Before AirPods Pro, most Bluetooth headsets had functional defects of one kind or another: no noise cancellation, poor portability, an uncomfortable fit, mediocre call quality, no water or sweat resistance, and so on. AirPods Pro is not exceptional in every respect, but it is at least the most well-rounded headset of its kind, able to handle most usage scenarios.

Previously, Apple’s headphones, like most audio products, could produce only one standard sound. AirPods Pro is different: it offers two listening modes, noise cancellation and Transparency, and uses its external microphones to pick up ambient sound, adapting to users’ needs in different scenarios.

However, this is only the first step of Apple’s ambitions in audio. In the iOS 14 update, Apple added a simulated sound field to AirPods Pro. Through the AirPods Pro drivers, Apple can simulate 5.1 and 7.1 surround sound, and even a Dolby Atmos sound field.

Many gaming headsets can also simulate multi-channel stereo, but those are all over-ear headphones; before AirPods Pro, no in-ear earbuds had achieved this. Moreover, thanks to its built-in gyroscope and accelerometer, AirPods Pro can detect the movement and rotation of the user’s head and automatically correct the orientation of the sound field by the angle of deflection, so that the sound always seems to come from the right position.
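To get a feel for why the gyroscope matters, consider a minimal sketch of the underlying idea (the names and the crude pan law are illustrative assumptions, not Apple’s actual spatial audio pipeline): keeping a virtual speaker anchored in the room amounts to subtracting the listener’s head yaw from the speaker’s azimuth before rendering the signal.

```swift
import Foundation

// Minimal sketch of head-tracked audio: a virtual speaker stays fixed in
// the room by compensating for the listener's head yaw. Illustrative only;
// real spatial audio uses HRTF filtering rather than simple panning.
func relativeAzimuth(speakerAzimuth: Double, headYaw: Double) -> Double {
    // Angle of the virtual speaker as the listener hears it, in radians.
    var angle = speakerAzimuth - headYaw
    // Wrap into (-pi, pi] so the angle changes continuously as the head turns.
    while angle > Double.pi { angle -= 2 * Double.pi }
    while angle <= -Double.pi { angle += 2 * Double.pi }
    return angle
}

// A crude stereo pan derived from that angle: 0 = hard left, 1 = hard right.
func stereoPan(for azimuth: Double) -> Double {
    (sin(azimuth) + 1) / 2
}
```

If the listener turns their head 30 degrees to the left, the virtual speaker’s relative azimuth shifts 30 degrees to the right, so the sound appears to stay put in the room.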

Before AirPods Pro, Apple’s approach to making headphones was relatively simple. From 2001 to 2019, across 18 years, Apple used essentially two earphone designs. In 2001, Apple introduced the classic white earbuds alongside the original iPod, then bundled them with the iPod and iPhone for a decade, making only small adjustments to the stem and inline remote. Then in 2012, Apple scanned the ears of hundreds of users and used the data to design EarPods; the same design later carried over to AirPods.

The sound these headphones deliver is much the same. For years, users described the sound of Apple’s audio devices as “plain boiled water”: neutral, without pronounced tonal character. Apple wanted to provide a standardized sound experience, so that audio would be consistent across devices.

But in recent years, the design philosophy of Apple’s audio products has been quietly changing. On the HomePod, Apple introduced the idea of sensing the environment and optimizing sound accordingly; starting with the iPhone XS, Apple added a wider-stereo feature that simulates a broad sound field with just two speakers; and on the MacBook Pro, Apple began managing audio playback through the built-in T2 chip. Since 2018, almost every MacBook Pro update has brought improvements to the speakers and microphones.

Clearly, Apple’s audio team is applying these innovations to improve the sound experience across all of its product lines. The most important improvements are a larger and more accurate sound field, and better separation between bass, midrange, and treble. Now AirPods Pro will benefit from this work as well.

Beyond that, iOS 14 lets users personalize the headset’s sound for the first time. Apple has built a guided setup flow into the new system to help users pick their favorite among several sound profiles. These settings can make particular musical styles more prominent, and can also make voices clearer on phone calls and in podcasts.
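As a rough illustration of what such a sound profile does under the hood (a hypothetical app-level sketch using AVFoundation’s AVAudioUnitEQ, not Apple’s system feature), a “clearer vocals” profile might boost the upper midrange and trim low-frequency rumble:

```swift
import AVFoundation

// Hypothetical "clearer vocals" profile built with AVAudioUnitEQ.
// The band choices are illustrative assumptions, not Apple's settings.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 2)

// Boost the upper midrange, roughly where vocal presence sits.
eq.bands[0].filterType = .parametric
eq.bands[0].frequency = 3000   // Hz
eq.bands[0].bandwidth = 1.0    // octaves
eq.bands[0].gain = 4.0         // dB
eq.bands[0].bypass = false

// Cut rumble below the voice with a high-pass filter.
eq.bands[1].filterType = .highPass
eq.bands[1].frequency = 80     // Hz
eq.bands[1].bypass = false

// Route audio: player -> EQ -> output.
engine.attach(player)
engine.attach(eq)
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)
```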

And these experience improvements do not stop at the device level. As early as 2018, Eddy Cue, the senior vice president in charge of internet services such as Apple Music, said in an interview that Apple Music would adjust its EQ (equalizer) settings for individual songs based on analysis of the music, working together with the hardware to make it more expressive.

With the AirPods Pro software update, plus a large library of content that supports surround sound standards, Apple is no longer content to offer a single sound; it wants to deliver a better, richer listening experience for every scenario.

Wearable sensors

Beyond sound, Apple has also planted more potential in AirPods Pro.

On iOS 14, developers can access AirPods Pro’s built-in motion sensors through a new API, opening the door to games and fitness features. For example, the sensors can detect movements such as burpees and squats, or feed the user’s head motion into a game’s controls.
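That new API is Core Motion’s CMHeadphoneMotionManager, introduced in iOS 14. A minimal sketch of reading head orientation follows; a real app would also declare motion-usage permission in its Info.plist and handle errors properly.

```swift
import CoreMotion

// Minimal sketch: stream head-motion data from AirPods Pro on iOS 14.
let motionManager = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard motionManager.isDeviceMotionAvailable else {
        print("Headphone motion data is not available")
        return
    }
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Head orientation in radians; rotationRate, userAcceleration,
        // and gravity are also available for movement detection.
        let a = motion.attitude
        print("yaw: \(a.yaw), pitch: \(a.pitch), roll: \(a.roll)")
    }
}

func stopHeadTracking() {
    motionManager.stopDeviceMotionUpdates()
}
```

Detecting a squat or a burpee is then a matter of interpreting these raw acceleration and attitude streams; the API itself only delivers the sensor data.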

Over the past two years, wearables have become Apple’s new growth engine. Since the second quarter of 2017, driven by sales of AirPods and Apple Watch, Apple’s wearables business has sustained annual growth of more than 30%.

But Apple’s plans for wearables go far beyond selling watches and headphones. With AirPods and Apple Watch now favored by users, Apple’s larger long-term goal is to use these products, worn on the body every day, to create experiences nothing else can offer.

As early as 2017, when AirPods first came out, the biomedical expert Steven LeBoeuf told CNBC that AirPods could be a better biosensor than the Apple Watch: because it is worn in the ear, it can more accurately sense data such as body temperature and heart rate. He also noted that AirPods’ usefulness to the hearing impaired would become a very important feature.

He accurately predicted the direction of Apple’s product development. AirPods did gain the ability to work with the iPhone to serve as a hearing aid for the hearing impaired, and on iOS 14 this family of features continues to evolve. Anyone can now fine-tune AirPods Pro’s noise control: in Transparency mode, the level of ambient sound can be adjusted separately, helping users filter out the sounds they don’t want to hear.

Apple was not the first to enter the wireless headphone market, but the rapid development of AirPods reflects the effort Apple poured into the product. In just three years, AirPods has grown from a simple pair of true wireless earbuds into a noise-cancelling headset that can meet a wide range of needs.

Now its blueprint extends even further, into territory it has never touched, where it may prove disruptive once again.
