Growing AI-enabled scams across the world: Cybercriminals misuse AI technology to commit voice scams in US


In a report titled "The Artificial Imposter" by McAfee, an online security firm, the participants who took part in the survey were not able to distinguish between the real and the cloned voice.

WEBDESK
Jun 13, 2023, 10:30 am IST
in USA, World, Sci & Tech

Fraudsters are employing remarkably convincing AI-enabled voice cloning technologies, which are publicly accessible online, to steal from victims by impersonating family members in a new generation of schemes that have alarmed US authorities.

Cybercriminals create deepfakes, which refer to the “broad range of generated or manipulated digital media (e.g., images, videos, audio, or text; collectively referred to as “synthetic content” or “synthetic media”) created using artificial intelligence and machine learning processes. Deepfakes can depict the alteration or impersonation of a person’s identity to make it appear as if they are doing or saying things they never did”.

Experts said the greatest threat lies in artificial intelligence’s ability to blur the line between reality and fiction, since it gives cybercriminals a simple and efficient tool for spreading misinformation.

While talking to the media about AI-enabled voice cloning scams, the chief executive of Blackbird.AI, Wasim Khaled, said, “AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively”.

He said, “With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls”. Khaled added, “Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes”.
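Why a convincing clone defeats both listeners and simple automated checks can be illustrated with a toy model. Speaker-verification systems typically compare numeric “voice embeddings” using a similarity score; the sketch below is purely illustrative (the embeddings, their tiny dimension, and the acceptance threshold are all hypothetical), but it shows how a clone that sits close to the real voice in embedding space passes the same check a stranger fails:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional "voice embeddings" (real systems use
# hundreds of dimensions produced by a neural network).
real_voice   = [0.90, 0.10, 0.40, 0.30]
cloned_voice = [0.88, 0.12, 0.41, 0.29]  # deliberately close to the real one
other_voice  = [0.10, 0.90, 0.20, 0.70]  # an unrelated speaker

THRESHOLD = 0.95  # hypothetical acceptance threshold

print(cosine_similarity(real_voice, cloned_voice) > THRESHOLD)  # True: clone passes
print(cosine_similarity(real_voice, other_voice) > THRESHOLD)   # False: stranger fails
```

The point of the sketch is only that similarity-based checks cannot separate a good clone from the original, which is why the FTC and the McAfee report both recommend out-of-band verification (calling the person back) rather than trusting the voice itself.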

In a report titled “The Artificial Imposter”, the online security firm McAfee found that survey participants were unable to distinguish between a real voice and a cloned one.

The global study covered 7,054 people from nine nations, including the United States and India, with respondents indicating they had fallen victim to an AI voice-cloning fraud themselves or knew someone who had. Researchers said that cloning someone’s voice is now a major tool in cybercriminals’ arsenal, which makes this latest AI scam especially risky.

Officials in the United States have expressed concern about an increase in the “grandparent scam,” in which a fraudster assumes the identity of a grandchild who is in desperate need of money.

In March, the US Federal Trade Commission described the modus operandi of the AI voice scam: “You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble: he wrecked the car and landed in jail. But you can help by sending money”. The advisory recommends verifying such calls by phoning the real person directly to confirm the story before sending anything.

From India, 1,010 respondents participated in the survey. The report said, “The survey reveals that more than half (69 per cent) of Indians think they don’t know or cannot tell the difference between an AI voice and real voice”.

About 83 per cent of Indian victims reported financial losses from these new AI-based voice scams. The report said, “About half (47 per cent) of Indian adults have experienced or know someone who has experienced some kind of AI voice scam, which is almost double the global average (25 per cent). 83 per cent of Indian victims said they had a loss of money, with 48 per cent losing over Rs 50,000”.

More than 80 per cent of Indians share their voice data online at least once a week, mainly through recorded audio notes on social media, voice notes, and other channels. From these sources, the audio can fall into the hands of cybercriminals, who use it to clone a person’s voice; according to the report, just three seconds of audio is enough.
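The three-second figure is easier to appreciate in raw numbers. The arithmetic below (the sample rates are illustrative, not taken from the report: 8 kHz is telephone quality, 16 kHz is typical for messaging-app voice notes) shows how little data an ordinary voice note already contains:

```python
def audio_samples(duration_s: float, sample_rate_hz: int) -> int:
    """Number of raw audio samples in a clip of the given length."""
    return int(duration_s * sample_rate_hz)

# A three-second clip at common speech sample rates:
print(audio_samples(3, 8_000))   # 24000 samples (telephone quality)
print(audio_samples(3, 16_000))  # 48000 samples (typical voice note)
```

In other words, a single short voice note shared publicly carries far more audio than the minimum the report says a cloning tool needs.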

The report said respondents were most likely to respond to such requests for money “particularly if they thought the request had come from their parent (46 per cent), partner or spouse (34 per cent), or child (12 per cent). Messages most likely to elicit a response were those claiming that the sender had been robbed (70 per cent), was involved in a car incident (69 per cent), lost their phone or wallet (65 per cent) or needed help while travelling abroad (62 per cent)”.

McAfee CTO Steve Grobman said, “Artificial Intelligence brings incredible opportunities, but with any technology, there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways”.

Topics: Steve Grobman, deepfakes, The Artificial Imposter, Blackbird.AI, Wasim Khaled, Technology, grandparent scam, Cybercriminals, Federal Trade Commission, AI voice scam, McAfee
© Bharat Prakashan (Delhi) Limited.
Tech-enabled by Ananthapuri Technologies
