Mistral AI’s Right to Respond:
“User queries are essential for enhancing the accuracy of our assistant’s responses. Mistral AI has always provided its users the ability to opt-out of using data from their interactions with Le Chat.”
No Opt-Out for Free Users
Lawyer Jérémy Roche filed a complaint with the CNIL over a specific issue: the free version of Le Chat logs everything users type, along with the responses it gives, and uses this data to refine the AI. Free users have no way to opt out, unlike "Pro" subscribers ($15/month), who can deactivate this feature in their settings. "Team" and "Enterprise" subscribers' data is not used in this way at all.
The core issue? According to Roche, Mistral AI makes the fundamental right to object to the use of one's data conditional on paying for a subscription. This approach may violate the GDPR, the European data protection regulation: Article 12 states that exercising user rights must be free of charge, except in special circumstances (such as manifestly excessive or repetitive requests), which do not apply here.
A Common Yet Criticized Practice
Mistral AI is not the only company relying on user data to train its AI. OpenAI, with ChatGPT, engaged in the same practice until the Italian data protection authority intervened. Since then, OpenAI has introduced an option to reject this data collection. Even Elon Musk, through his AI Grok integrated into X (formerly Twitter), allows users to adjust their settings to prevent the use of their conversations for training purposes. Perplexity, an AI-based search engine, also offers a toggle to disable this option.
Other players have thus already been forced to amend their methods on this issue, and Mistral AI may have to do the same under pressure from the CNIL.
So far, neither Mistral AI nor the CNIL has officially responded to this complaint. However, if the French authority decides to step in, the company may need to revise its practices… and potentially offer all users a real choice regarding the use of their data.
