New Delhi: A viral music cover posted by creator Pragati Naagar has set off a wider conversation around artificial intelligence, identity and authenticity online, after a wave of AI-generated recreations began circulating across social platforms.
The development has drawn attention within creator communities and prompted questions about consent, attribution and platform safeguards in an era of rapidly improving generative tools.
The cover, a rendition of the Japanese track Just a Boy, gathered millions of views within days of being posted. Soon after, multiple accounts, some newly created, began publishing hyper-real videos that closely replicated the singer's voice, expressions and performance style.
Several of these videos also amassed significant engagement, with audiences often unable to distinguish between the original creator and AI-generated recreations.
Recently, the original creator of the cover, Pragati Naagar, shared a post in collaboration with IFP.world (formerly India Film Project) outlining how the content began circulating and how multiple recreations started appearing across feeds. The post detailed the spread of the videos and noted that links to both the original and derivative clips would be added for context.
According to the information shared, several accounts uploading recreated versions had been set up only days earlier but quickly drew views, followers and interactions. In some cases, viewers appeared to treat the AI-generated personas as real creators, reaching out to them for collaborations or financial exchanges.
The incident has highlighted how generative AI tools can replicate not only voice and visual likeness but also performance patterns and micro-expressions. For many viewers, the difference between an original performance and a synthetic recreation becomes harder to detect as the technology improves. The result is a blurred boundary between real creators and algorithmically generated identities.
The conversation extends beyond imitation. Questions are emerging around accountability if synthetic personas are used to solicit money, brand deals or interactions under the assumption that they are real individuals. Concerns have also been raised about whether platforms should introduce clearer labelling for AI-generated content, stricter monetisation controls and improved verification mechanisms.
For creators, the implications are broader. When audiences struggle to verify authenticity, genuine performers may find themselves needing to prove their identity more frequently. The spread of realistic AI recreations could shift trust dynamics across social media, affecting how audiences interpret and engage with content.
The situation also brings into focus the absence of clear frameworks around consent and likeness replication. None of the recreated versions referenced in the shared material sought permission from the original performer, raising questions about ownership and the commercial use of replicated identities.