Apple Addresses Concerns Regarding YouTube Data in AI Development

Apple clarifies that its OpenELM AI model was trained on a diverse dataset that included YouTube subtitles, but is not used in consumer-facing AI features like Apple Intelligence.

In response to recent allegations about the use of YouTube content in AI model training, Apple has clarified its position. The tech giant confirmed that while a dataset that included YouTube subtitles was used to train its open-source OpenELM language model, that model does not contribute to any consumer-facing AI or machine learning features.

Apple’s OpenELM: A Research Tool, Not a Consumer Product

Apple emphasized that OpenELM was developed as a research tool and does not power any of its customer-facing AI features, including Apple Intelligence. This clarification comes in response to a Wired report, based on a Proof News investigation, which revealed that several tech companies, including Apple, had utilized subtitles from thousands of YouTube videos in their AI training processes.

YouTube Data: A Small Part of a Diverse Training Set

While YouTube subtitles were included in the training dataset, they constituted only a fraction of the data used. The dataset encompassed a wide range of content, including transcripts from educational institutions like MIT and Harvard, news outlets like The Wall Street Journal and NPR, and even content from popular YouTubers. This diverse dataset aimed to provide a comprehensive training ground for the AI models.

Balancing Innovation and Privacy: Apple’s Approach to AI

Apple reiterated its commitment to user privacy, stating that Apple Intelligence models are trained on licensed data and publicly available data collected by its web crawler. The company maintains that it does not use users’ private personal data or user interactions for training its AI models.

OpenELM: Advancing Open-Source AI Development

Apple’s OpenELM language model uses a layer-wise scaling strategy that allocates parameters non-uniformly across the layers of the transformer, improving accuracy for a given parameter budget. By open-sourcing the model, Apple aims to contribute to the broader AI research community and foster advances in open large language model development.
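
For readers curious what “layer-wise scaling” means in practice, here is a minimal Python sketch of the general idea: rather than giving every transformer layer the same number of attention heads and the same feed-forward width, both are interpolated across depth so earlier layers are narrower and later layers wider. All function names and numbers below are illustrative assumptions for this sketch, not Apple’s published OpenELM configuration.

```python
# Sketch of layer-wise scaling: per-layer head counts and FFN widths
# grow linearly with depth instead of being uniform across all layers.
# The ranges and dimensions here are made-up examples.

def layer_wise_scaling(num_layers: int,
                       d_model: int,
                       head_dim: int,
                       alpha=(0.5, 1.0),   # assumed range for the head-count scaling factor
                       beta=(2.0, 4.0)):   # assumed range for the FFN width multiplier
    """Return a (num_heads, ffn_dim) pair for each layer, growing with depth."""
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)              # 0.0 at the first layer, 1.0 at the last
        a = alpha[0] + (alpha[1] - alpha[0]) * t    # interpolated head scaling factor
        b = beta[0] + (beta[1] - beta[0]) * t       # interpolated FFN multiplier
        num_heads = max(1, round(a * d_model / head_dim))
        ffn_dim = round(b * d_model)
        configs.append((num_heads, ffn_dim))
    return configs

if __name__ == "__main__":
    for layer, (heads, ffn) in enumerate(
            layer_wise_scaling(num_layers=8, d_model=1024, head_dim=64)):
        print(f"layer {layer}: heads={heads}, ffn_dim={ffn}")
```

Under this kind of scheme, the total parameter count stays comparable to a uniform model, but more capacity is concentrated in the deeper layers, which is the trade-off the layer-wise scaling approach is designed to exploit.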

Apple’s clarification underscores its commitment to transparency and responsible AI development. While the use of YouTube data in AI training raises questions about data sourcing and privacy, Apple’s emphasis on OpenELM’s research purpose and its limited role in consumer-facing AI features aims to alleviate concerns. The company’s ongoing efforts to balance innovation with user privacy will likely remain a focal point as AI technology continues to evolve.

About the author


Hardik Mitra

With 8 years of digital media experience and a Digital Marketing degree from Delhi University, Hardik's SEO strategies have significantly grown PC-Tablet's online presence, earning accolades at various digital marketing forums.
