Apple is updating its accessibility features
Degenerative diseases can steal not just mobility, but also the ability to communicate. It’s estimated that 80% to 90% of people with ALS – amyotrophic lateral sclerosis, more commonly known as Lou Gehrig’s Disease – will suffer some form of speech impairment. Soon, however, their iPhone may be able to give them not just the power of speech, but the ability to communicate with loved ones in their own voice.
In recognition of Global Accessibility Awareness Day on May 18, Apple has announced a collection of new iPhone, iPad, and Mac accessibility features, including the ability to have a LiDAR-equipped iPhone or iPad use the Magnifier app to read aloud any text you point it at (think greeting cards or instructions); Live Speech, which reads out whatever you type on the phone; and Personal Voice.
Personal Voice, which has been developed in cooperation with Team Gleason, a US-based non-profit ALS awareness foundation, could be the most exciting of these: instead of the iPhone using one of its pre-made Siri voices (Australian, English, different genders), it uses a synthesized version of your own voice to say whatever text you input.
A quick training
To train the system, which Apple plans to ship later this year, you position yourself about six to 10 inches from the iPhone’s microphone and repeat a series of randomly selected sentences. That’s apparently enough to train the iPhone’s onboard machine learning (ML) model and enable the handset to repeat whatever you type in your synthetically generated voice.
Since the system is designed to help those who are losing their voices due to motor or cognitive impairment, the training is also flexible. If you can’t complete the roughly 15-minute training session in one sitting, you can stop and start until you’ve made it through all the sentences. In addition, the training system is self-guided, so no screen-tapping is necessary.
While it isn’t designed as a voice-over system, Personal Voice lets you save often-used phrases such as “How are you?”, “Thank you”, and “Where is the bathroom?”
Personal Voice will live under Settings/Accessibility on the iPhone, iPad, and Mac, and will work on any of these devices running Apple silicon. For now, it only supports English.
The AI-powered accessibility feature joins a host of other on-device Assistive Access features, including redesigned and customizable home screens with larger buttons and text, a combined voice call and FaceTime screen that lets users choose the easiest way to communicate, and streamlined interfaces in Music, Camera, and Photos.
Announcing the new features, ALS advocate and Team Gleason board member Philip Green said, “At the end of the day, the most important thing is being able to communicate with friends and family.”
Green, who was diagnosed with ALS in 2018, added, “If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world.”