Apple iPhone Will Soon Speak In Your Voice After 15 Min Of Training: Here's How It Works
In addition to Personal Voice, Apple is introducing Live Speech on the iPhone, iPad, and Mac to enable those who have speech impairments to communicate.
New Delhi: As part of its global accessibility awareness campaign, Apple has unveiled new features for customers with cognitive, vision, and hearing impairments. The most significant iPhone additions on the way are Assistive Access, Personal Voice, and Point and Speak in Magnifier. Apple is also rolling out curated collections, additional software features, and more in select regions.
The company says the new tools draw on hardware and software advances, including on-device machine learning, to protect user privacy.
Personal Voice, an advance speech feature for users at risk of losing their ability to speak, such as those with a recent diagnosis of ALS or another condition that can affect speech, is arguably the most significant of the new functions. The tool lets users generate a voice that sounds like their own, entirely on the iPhone.
Apple describes in a blog post how users can build a Personal Voice: "Users can create a Personal Voice by reading along with a randomly generated set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users' information private and secure, and it integrates seamlessly with Live Speech so users can communicate with loved ones using their Personal Voice."
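For developers wondering how Personal Voice surfaces in software, the rough Swift sketch below is one illustration. It uses the AVFoundation authorisation call Apple later opened to third-party apps with iOS 17; the flow shown here is an assumption about typical usage, not Apple's own implementation.

```swift
import AVFoundation

// Hedged sketch: an app asks permission to use the owner's Personal Voice.
// requestPersonalVoiceAuthorization is a real iOS 17 AVFoundation API, but
// the surrounding flow is illustrative only.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }
    // Once authorized, personal voices appear alongside the system voices.
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }
    print(personalVoices.map(\.name))
}
```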
In addition to Personal Voice, Apple is introducing Live Speech on the iPhone, iPad, and Mac to help people with speech impairments communicate. During phone and FaceTime calls, as well as in-person conversations, users can type what they want to say and have it spoken aloud.
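Conceptually, Live Speech is a type-to-speak pipeline. The hypothetical snippet below sketches that idea with the standard AVSpeechSynthesizer API; the real feature is built into the operating system, so this is a minimal illustration of the concept rather than Apple's code.

```swift
import AVFoundation

// Minimal type-to-speak sketch in the spirit of Live Speech: text the user
// types is handed to the system synthesizer and spoken aloud.
let synthesizer = AVSpeechSynthesizer()

func speak(_ typedText: String, voice: AVSpeechSynthesisVoice? = nil) {
    let utterance = AVSpeechUtterance(string: typedText)
    // Fall back to a default system voice when no Personal Voice is passed in.
    utterance.voice = voice ?? AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

speak("I'll be there in ten minutes.")
```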
Assistive Access, meanwhile, is designed for users with cognitive disabilities. By stripping away extraneous information, the feature offers a personalised, pared-down app experience and helps users find the option most relevant to them.
For example, for users who prefer communicating visually, Messages offers an emoji-only keyboard and the option to record a video message to send to loved ones. Users and their trusted supporters can also choose between a row-based layout for those who prefer text and a more visual, grid-based layout for the Home Screen and apps.
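Assistive Access itself is a system mode rather than a developer API, but a short SwiftUI sketch can convey the grid layout idea: a handful of large, high-contrast, plainly labelled buttons. The view and app names below are hypothetical.

```swift
import SwiftUI

// Illustrative only: mimics the grid-based Assistive Access layout described
// above with large, high-contrast, clearly labelled buttons.
struct SimplifiedHomeView: View {
    let apps = ["Calls", "Messages", "Camera", "Photos", "Music"]

    var body: some View {
        LazyVGrid(columns: [GridItem(.adaptive(minimum: 150))], spacing: 20) {
            ForEach(apps, id: \.self) { name in
                Button(name) {}  // tap action omitted in this sketch
                    .font(.largeTitle.bold())
                    .frame(maxWidth: .infinity, minHeight: 120)
                    .background(Color.black)
                    .foregroundColor(.white)
                    .cornerRadius(16)
            }
        }
        .padding()
    }
}
```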
Simply put, Assistive Access gives iPhones and iPads a straightforward interface with high-contrast buttons and large text labels. Separately, a new Point and Speak feature in Magnifier, available on iPhones equipped with a LiDAR scanner, will help users with vision impairments interact with physical objects.
As users run a finger across a keypad, Apple says, Point and Speak reads out the text on each button, drawing on input from the camera, the LiDAR scanner, and on-device machine learning.
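Apple has not detailed how Point and Speak is put together, but one plausible building block, on-device text recognition, is available to any app through the Vision framework. The sketch below shows that piece alone; the camera capture and LiDAR-based finger tracking that decide which text to read are assumed and omitted.

```swift
import Vision

// Hedged sketch: recognise text in a camera frame entirely on device.
// This is generic Vision usage, not Apple's Point and Speak implementation.
func recognizeText(in pixelBuffer: CVPixelBuffer,
                   completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate  // runs locally, no network needed

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
    try? handler.perform([request])
}
```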
Alongside the new tools, Apple will launch SignTime in South Korea, Germany, Italy, and Spain on May 18, connecting Apple Store and Apple Support customers with on-demand sign-language interpreters.
To help customers discover accessibility features, select Apple Store locations around the world also offer educational sessions every day of the week.