Gujarat HC flags deepfake misuse, seeks accountability from platforms

The Gujarat High Court on Tuesday raised serious concerns over the misuse of artificial intelligence (AI) and deepfake content, observing that no video or material should be uploaded on any platform without proper authentication.
Court flags unchecked spread of AI-generated content
Hearing a Public Interest Litigation (PIL) on the issue, a bench of Chief Justice Sunita Agarwal and Justice D N Ray said objectionable or manipulated content cannot be allowed to circulate freely, including on social media, without verification.
The court underscored the need for “strict action” against platform providers and indicated that it may issue directions to ensure compliance.
PIL raises concerns over reputational harm, misinformation
The petition, filed by advocate Amit Panchal, highlighted the growing use of AI-generated images and videos, including those involving constitutional authorities. It argued that such content harms “reputation, dignity and public image” and misleads the public.
The plea urged both the state and Central governments to introduce specific laws and regulatory measures to curb the misuse of deepfakes.
Government responses and compliance gaps
During the hearing, the state government placed its suggestions on record, while the Centre submitted an affidavit. A key concern flagged was the failure of platform providers to act even after being instructed to remove objectionable or unauthorised content.
In several instances, companies reportedly responded that the URL in question was “unavailable”.
The court was also told that children and lay users often cannot distinguish between real and AI-generated content, heightening the risk of misinformation.
Court questions enforcement against platforms
Addressing enforcement, the Central government said revised rules now require platforms to remove fake or objectionable content within three hours of being directed to do so, down from the earlier 36-hour window.
However, the bench questioned what action would follow if platforms failed to comply with such directives.
The Centre responded that action could be taken under provisions of the Information Technology Act, 2000, including Sections 66C, 66D and 66E.
The court, however, noted that these provisions apply to individuals uploading such content, not necessarily to platform providers, and sought clarity on penalties for non-compliant intermediaries.
Low response rate raises concern
In a significant disclosure, the government informed the court that of 1,000 complaints filed, platform providers had responded to only 14.
The High Court stressed the urgent need for a robust regulatory mechanism to ensure stricter oversight and accountability, signalling that stronger directions may follow to address the issue.