New Delhi: A reel of a woman delivering a sharp Urdu couplet has gone viral on Instagram, clocking about 28 million views in a day. The twist is that the performer, introduced as “Tanvi Joshi”, is not a real person.
The video, posted by the handle @tanvijoshii_, shows a woman in a yellow kurta walking onto a stage and reciting, “Khaandani raees rakhte hain mijaaz naram apna… tumhara lehja bata raha hai ki tumhari daulat nayi nayi hai” (roughly: “Those born to wealth keep their temperament gentle… your tone gives away that your money is newly made”).
Viewers praised her delivery and diction, and the comments section quickly filled with debates on pronunciation and meaning.
Soon after, users began flagging what they described as unnaturally “perfect” facial movements, along with visual inconsistencies. Online chatter and trend reports said the account was built around an AI-generated model designed to look and move like a human creator.
The controversy widened when users traced the audio to a real individual. Reports identified the voice as belonging to Marziya Shanu Pathan, a corporator from Thane, who had recited the same lines earlier. Pathan also commented on the viral post, asserting it was her voice, which triggered fresh debate on consent and credit in the AI era.
The episode underlines a fast-growing platform challenge. AI-made “creators” can now pass as ordinary influencers, with realistic lip-sync, stage ambience and camera presence. For audiences, that raises the risk of being misled about who is speaking. For creators, it increases the likelihood of their voice, likeness, or performance being repurposed without permission.
For brands and publishers, the risk is twofold. Synthetic personas can be used to run paid promotions while masking who operates the page. At the same time, borrowed voices and repackaged performances can trigger ownership disputes, takedown requests and reputational blowback, especially when the source is identified publicly.
The timing is also significant. On February 10, the Ministry of Electronics and Information Technology (MeitY) notified amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules to explicitly bring “synthetically generated information” (SGI) into the due diligence framework, with enforcement from February 20, 2026.
Under the amended framework, platforms that enable the creation or distribution of synthetic content are required to follow enhanced due diligence requirements around SGI, including measures aimed at curbing harms linked to deepfakes and related misuse. MeitY has also issued FAQs outlining the intent and scope of the changes.
The amendments also tighten takedown timelines. After “actual knowledge” through a court order or a reasoned government intimation, intermediaries must act within sharply reduced time windows, which legal experts have said apply broadly across social media content categories.
For pages like @tanvijoshii_, the practical impact is that synthetic creator content is more likely to face platform-level labelling and stricter scrutiny when complaints are raised. If a post is alleged to be deceptive, impersonating, or unlawfully using someone’s identity markers, the revised compliance timelines increase the chance of quicker enforcement action once a platform receives valid notice.
For the creator economy, the “AI that looks real” moment is likely to reset norms on trust. Agencies and influencer managers may start treating disclosure, provenance and rights clearances as core deliverables, not optional hygiene.
That could raise production and compliance costs for synthetic content in the short term, but it also creates new monetisation paths for human creators through licensing, authorised voice partnerships, paid usage rights and hybrid formats where creators openly collaborate with AI tools.
Over time, brand contracts are expected to harden around authenticity, indemnities, and takedown risk, with disclosure compliance becoming a prerequisite for influencer spends.