
Apple 'Restricts' Internal Use Of ChatGPT, GitHub's Copilot Over Data Leak Risk

ChatGPT has reportedly been on Apple's list of restricted software for months.


New Delhi: Apple has reportedly restricted internal use of the AI chatbot ChatGPT and GitHub's Copilot over concerns that confidential data could end up with the developers who train these AI models on user data. According to The Wall Street Journal, the iPhone maker is "concerned workers could release confidential data as it develops its own similar technology".

Apple has restricted the use of ChatGPT and other external AI tools for some employees "as it develops its own similar technology," according to a document reviewed by the WSJ.

The tech giant is developing its own generative AI models but didn't expand on what they may be used for, according to the report. In March, The New York Times reported that multiple teams at Apple, including the one working on Siri, are experimenting with language-generating AI.


Samsung has also reportedly blocked the use of generative AI tools like ChatGPT on company-owned devices as well as non-company-owned devices running on internal networks.

The South Korean giant is said to be developing its own in-house AI tools for "software development and translation". The decision was reportedly taken after sensitive data from Samsung was accidentally leaked to ChatGPT last month.

Organisations such as Bank of America, Citi, Deutsche Bank, Goldman Sachs, Wells Fargo, JPMorgan, Walmart and Verizon have also restricted their employees' access to ChatGPT.