AI Applications on the Radar of Data Protection Authorities
How artificial intelligence (AI) systems use and store the data they collect from users has recently become a frequent topic of debate among data protection authorities and professionals working on data security and privacy. These discussions are kept in the public eye through decisions, publications, and press releases on the subject.
The Italian Data Protection Authority, the Garante per la protezione dei dati personali ("Garante"), placed ChatGPT at the center of these discussions by temporarily restricting the service and later announcing that the restriction had been lifted. ChatGPT, developed by US-based OpenAI, is an AI-based chatbot built on a large language model: it generates human-like language using deep learning algorithms and draws on large data sets to produce meaningful responses through natural language processing.
Garante's Evaluation of ChatGPT
ChatGPT gives users control over whether their chat history data is used for the application's development; the chat histories of users who withhold permission are deleted from the systems within 30 days. The absence of an active age-verification system during registration is a critical detail that the Turkish Personal Data Protection Board ("Board") is expected to address for users accessing the service from Turkey.
On the other hand, countries such as the United Kingdom, which approach the subject with a focus on innovation, point to the need to regulate generative artificial intelligence by adopting common principles through a multidisciplinary approach that addresses both the opportunities and the risks of AI.
AI applications like ChatGPT, and the automated processing of personal data they entail, will undoubtedly remain on the agenda for a long time and continue to be discussed in all their aspects. AI-based applications like ChatGPT must, for example, provide clear and transparent information about their privacy policies and take the necessary steps to protect users' data. Data protection authorities are also expected to audit these systems regularly to ensure they operate correctly. More generally, all next-generation applications and solutions should undergo a preliminary audit for personal data protection and privacy, with compliance checks and analyses conducted before products and applications are made available for use.
Special thanks to Aslı Naz Güzel for her contributions.