HOW AI AND ML HAVE BENEFITED APPLE

Abhya Singh
2 min read · Oct 28, 2020


1. Machine learning helps the iPad's software distinguish between a user accidentally resting a palm against the screen while drawing with the Apple Pencil and an intentional press meant to provide input. It monitors usage habits to optimize battery life and charging, both to extend the time between charges and to protect the battery's long-term health. It's also used to make app recommendations.
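
To make that concrete, here's a toy Swift sketch of touch classification. Apple's real palm-rejection model and its input features aren't public, so the feature names and thresholds below are purely illustrative:

```swift
import Foundation

// A toy stand-in for the kind of touch classification Apple describes.
// The real palm-rejection model is private; these features and
// thresholds are invented for illustration.
struct TouchSample {
    let contactRadius: Double   // estimated contact radius (mm)
    let pressure: Double        // normalized 0...1
    let velocity: Double        // points per second at touch-down
}

enum TouchKind { case pencilInput, restingPalm }

func classify(_ touch: TouchSample) -> TouchKind {
    // Palms tend to present a large, low-pressure contact; a trained
    // model would weigh far more signals than this simple heuristic.
    if touch.contactRadius > 8.0 && touch.pressure < 0.3 {
        return .restingPalm
    }
    return .pencilInput
}

let palm = TouchSample(contactRadius: 12.0, pressure: 0.1, velocity: 5)
print(classify(palm))  // restingPalm
```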

2. Then there's Siri, which is perhaps the one thing any iPhone user would immediately perceive as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to its attempts to offer useful answers.
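
Developers can reach the same kind of on-device speech recognition through the public Speech framework (Siri's own pipeline is private). A minimal sketch for transcribing an audio file:

```swift
import Speech

// Transcribe an audio file with Apple's public Speech framework,
// keeping the audio on the device when the OS supports it.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer() else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        // Prefer on-device recognition when available (iOS 13+).
        if recognizer.supportsOnDeviceRecognition {
            request.requiresOnDeviceRecognition = true
        }
        recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```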

3. Machine learning also powers the Photos app's ability to automatically sort pictures into pre-made galleries, or to accurately surface photos of a friend whose name you've entered into the app's search field.
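
Vision's public feature-print API gives a feel for how on-device photo similarity can work; whether Photos uses exactly this internally isn't documented. A minimal sketch comparing two images:

```swift
import Vision

// Compute a Vision "feature print" (an on-device image embedding)
// and compare two photos by embedding distance.
func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

func distance(_ a: CGImage, _ b: CGImage) throws -> Float {
    guard let fpA = try featurePrint(for: a),
          let fpB = try featurePrint(for: b) else { return .infinity }
    var d: Float = 0
    try fpA.computeDistance(&d, to: fpB)  // smaller = more similar
    return d
}
```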

4. The iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each into a single result.
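
Here's a toy Swift version of "keep the best parts of each frame": given per-tile sharpness scores for several burst frames (assumed as inputs, since Apple's real pipeline, e.g. Deep Fusion, isn't public), pick the sharpest frame for each tile:

```swift
import Foundation

// scores[frame][tile] = sharpness of that tile in that frame.
// Returns, for each tile, the index of the frame whose tile scored highest.
func bestFramePerTile(scores: [[Double]]) -> [Int] {
    let tileCount = scores.first?.count ?? 0
    return (0..<tileCount).map { tile in
        scores.indices.max(by: { scores[$0][tile] < scores[$1][tile] })!
    }
}

let burst = [
    [0.9, 0.2, 0.5],   // frame 0
    [0.4, 0.8, 0.6],   // frame 1
    [0.3, 0.7, 0.9],   // frame 2
]
print(bestFramePerTile(scores: burst))  // [0, 1, 2]
```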

5. Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company's custom-designed GPUs (graphics processing units).
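
With the public Core ML API, an app opts into that hardware simply by choosing compute units when loading a model; the framework then schedules work across the CPU, GPU, and ANE. A minimal sketch, assuming the URL points at a compiled model in the app bundle:

```swift
import CoreML

// Load a compiled Core ML model and let the framework dispatch
// work to the CPU, GPU, and Apple Neural Engine as it sees fit.
func loadModel(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // .all permits ANE/GPU dispatch; .cpuOnly would force CPU execution.
    config.computeUnits = .all
    return try MLModel(contentsOf: url, configuration: config)
}
```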

6. Facial recognition for HomeKit. HomeKit-enabled smart cameras will use photos you’ve tagged on your phone to identify who’s at your door and even announce them by name.

7. Native sleep tracking for the Apple Watch. This uses machine learning to classify your movements and detect when you're sleeping. The same mechanism also lets the Apple Watch track new activity types, such as dancing.
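
Apple's actual sleep model is private, but a crude stand-in shows the flavor: sample the accelerometer with Core Motion and flag sustained stillness. The 0.02 g threshold below is an illustrative guess:

```swift
import CoreMotion

// Sample the accelerometer once per second and flag stillness.
// A real sleep classifier would learn this from data rather than
// use a hand-picked threshold.
let motion = CMMotionManager()

func startStillnessMonitor() {
    guard motion.isAccelerometerAvailable else { return }
    motion.accelerometerUpdateInterval = 1.0
    motion.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Deviation of total acceleration from 1 g (gravity only = at rest).
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        let stillness = abs(magnitude - 1.0)
        print(stillness < 0.02 ? "still" : "moving")
    }
}
```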

8. Handwashing. The Apple Watch detects not only the motion but also the sound of handwashing, starting a countdown timer to make sure you're washing for as long as needed.
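
Here's a tiny sketch of that gating logic, assuming the motion and water-sound flags arrive from separate classifiers (Apple's own models are private):

```swift
import Foundation

// Start a 20-second countdown only when both signals agree:
// a scrubbing-like motion AND the sound of running water.
final class HandwashTimer {
    private var secondsLeft = 0
    private var timer: Timer?

    func update(motionDetected: Bool, waterSoundDetected: Bool) {
        guard motionDetected && waterSoundDetected, timer == nil else { return }
        secondsLeft = 20
        timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { [weak self] t in
            guard let self = self else { t.invalidate(); return }
            self.secondsLeft -= 1
            print("Keep washing: \(self.secondsLeft)s left")
            if self.secondsLeft <= 0 {
                t.invalidate()
                self.timer = nil
                print("Done!")
            }
        }
    }
}
```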

9. App Library suggestions. A folder in the new App Library layout will use "on-device intelligence" to show apps you're "likely to need next." It's small but potentially useful.
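
A simple frequency-plus-recency score is one plausible shape for such a ranking. Apple hasn't published App Library's actual method, so the weights and half-life below are invented for illustration:

```swift
import Foundation

// Rank apps by a blend of how often and how recently they were launched.
struct AppUsage {
    let bundleID: String
    let launchCount: Int
    let lastLaunch: Date
}

func suggestions(from usage: [AppUsage], now: Date = Date(), top: Int = 4) -> [String] {
    let halfLife: TimeInterval = 6 * 3600  // recency decays over ~6 hours
    return usage
        .map { app -> (String, Double) in
            let age = now.timeIntervalSince(app.lastLaunch)
            let recency = pow(0.5, age / halfLife)  // 1.0 just now, 0.5 after 6h
            let score = Double(app.launchCount) * 0.3 + recency * 0.7
            return (app.bundleID, score)
        }
        .sorted { $0.1 > $1.1 }
        .prefix(top)
        .map { $0.0 }
}
```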

10. Translate app. This works completely offline, thanks to on-device machine learning. It detects the languages being spoken and can even do live translations of conversations.
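
Language identification, at least, has a public on-device API in the NaturalLanguage framework; the Translate app's own translation models had no public API at the time of writing. A minimal sketch:

```swift
import NaturalLanguage

// Identify the dominant language of a piece of text, entirely on device.
func detectLanguage(of text: String) -> NLLanguage? {
    let recognizer = NLLanguageRecognizer()
    recognizer.processString(text)
    return recognizer.dominantLanguage
}

print(detectLanguage(of: "Bonjour tout le monde")?.rawValue ?? "unknown")  // fr
```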

11. Sound alerts in iOS 14. This accessibility feature wasn't mentioned onstage, but it will let your iPhone listen for things like doorbells, sirens, dogs barking, or babies crying.
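
The public SoundAnalysis framework supports exactly this kind of streaming sound classification. Apple's own alert models are private, so the MLModel passed in below stands in for any sound classifier (for example, one trained with Create ML):

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Receive classification results as audio is streamed into the analyzer.
final class AlertObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let best = result.classifications.first else { return }
        if best.confidence > 0.8 {
            print("Heard: \(best.identifier)")  // e.g. doorbell, siren, dog bark
        }
    }
}

// Build a stream analyzer for a given audio format and classifier model.
// Keep strong references to the analyzer and observer while streaming.
func makeAnalyzer(model: MLModel, format: AVAudioFormat) throws -> SNAudioStreamAnalyzer {
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(mlModel: model)
    try analyzer.add(request, withObserver: AlertObserver())
    return analyzer
}
```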

Thanks for reading!
