
Michael Tsai’s Blog: Exploring Radar AI Training


Apple has introduced a new opt-in training program for Apple Intelligence that allows users to contribute content from their iPhones for AI model development. The training occurs on-device and uses Differential Privacy to protect user data. However, concerns have arisen because there is no dedicated opt-out: the only way to decline is to disable Device Analytics sharing entirely.
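To give a rough sense of what local Differential Privacy means in this kind of on-device setting, here is a minimal sketch using randomized response. This is not Apple's implementation; the function names, the epsilon value, and the "candidate phrase" scenario are all illustrative assumptions. The idea is simply that each device perturbs its own report before anything leaves the phone, and the server can only recover aggregate statistics.

```swift
import Foundation

// Minimal sketch of local differential privacy via randomized response.
// NOT Apple's actual mechanism; names and parameters are illustrative only.
// Each device reports whether it matched some candidate phrase, but flips
// its answer with some probability, so no single report reveals the truth.

/// Report the true value with probability exp(eps)/(exp(eps)+1), else lie.
func randomizedResponse(trueValue: Bool, epsilon: Double) -> Bool {
    let p = exp(epsilon) / (exp(epsilon) + 1)   // chance of telling the truth
    return Double.random(in: 0..<1) < p ? trueValue : !trueValue
}

/// Server-side aggregation: estimate the true positive rate from noisy reports.
func estimateTrueRate(reports: [Bool], epsilon: Double) -> Double {
    let p = exp(epsilon) / (exp(epsilon) + 1)
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    // observed = p * rate + (1 - p) * (1 - rate)  =>  solve for rate
    return (observed - (1 - p)) / (2 * p - 1)
}

// Example: 10,000 simulated devices, 30% of which truly match the phrase.
let epsilon = 1.0
let reports = (0..<10_000).map { _ in
    randomizedResponse(trueValue: Double.random(in: 0..<1) < 0.3, epsilon: epsilon)
}
print("Estimated match rate:", estimateTrueRate(reports: reports, epsilon: epsilon))
```

Any individual device's report is plausibly deniable, yet the aggregate estimate converges on the true rate as the number of participating devices grows; the epsilon parameter trades privacy for accuracy.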

In Feedback Assistant, users must now consent to letting their submitted bug report attachments be used for AI training, which has angered developers. They feel the practice abuses the trust they place in Apple when sharing sensitive data intended to help resolve issues. The broad wording of the consent and past experience suggest the data may feed Apple's wider AI models rather than just internal bug triage.

Overall, developers express frustration over the inadequate feedback Apple gives on bug reports, feeling their time and contributions are undervalued. The situation raises serious privacy concerns and questions about the transparency of Apple's data handling.
