Friday, Aug 22, 2025

Deepfake videos 'double in nine months'

About 14,700 computer-generated face-swap videos, most of which are pornographic, have been flagged.

New research shows an alarming surge in the creation of so-called deepfake videos, with the number online almost doubling in the last nine months. There is also evidence that production of these videos is becoming a lucrative business.

And while much of the concern about deepfakes has centred on their use for political purposes, the evidence is that pornography accounts for the overwhelming majority of the clips.

The research comes from cyber-security company Deeptrace. Its researchers found 14,698 deepfake videos online, compared with 7,964 in December 2018.

They said 96% were pornographic in nature, often with a computer-generated face of a celebrity replacing that of the original adult actor in a scene of sexual activity.

While many of the subjects featured were American and British actresses, the researchers found that South Korean K-Pop singers were also commonly inserted into fake videos, highlighting that this is a global phenomenon.

The report does highlight the potential for deepfake technology to be used in political campaigns. But in the two cases it examines - in Gabon and Malaysia - the allegations that faked videos had been used turned out to be incorrect.

What seems clear, though, is that the real danger at the moment is the use of the technology in revenge porn and cyber-bullying.

Henry Ajder, head of research analysis at Deeptrace, says too much of the discussion of deepfakes misses the mark.

"The debate is all about the politics or fraud and a near-term threat, but a lot of people are forgetting that deepfake pornography is a very real, very current phenomenon that is harming a lot of women," he explains.


Fake images, real money


Deeptrace's very existence is evidence of how rapidly the deepfake phenomenon has become a concern for corporations and governments.

It describes its mission as protecting "individuals and organisations from the damaging impacts of AI-generated synthetic media".

The term deepfake was coined in a Reddit post in 2017, and the report explains that in just two years a whole industry has emerged to profit from the phenomenon.

Deeptrace found that the four leading deepfake-themed pornography websites, supported by advertising, had attracted 134 million views for their videos since February 2018.

Apps making it possible to create this material are now proliferating.

One that allowed users to synthetically remove the clothes from still images of women charged $50 (£40) for removing a watermark from each finished product.

Visits to the app's website surged after a critical article was written about it, and the owners took it down.

But the software is still out there, repackaged by others seeking to profit from it.

One independent expert noted that other software has also made it far easier than before to create fake videos.

"It's now become possible to create a passable deepfake with only a small amount of input material - the algorithms need smaller and smaller amounts of video or picture footage to train on," explained Katja Bego, principal researcher at innovation foundation Nesta.

"As the technology is advancing so rapidly, it is important for policymakers to think now about possible responses. This means looking at developing detection tools and raising public awareness, but also [to] consider the underlying social and political dynamics that make deepfakes potentially so dangerous."

The authors of the Deeptrace report also describe service portals - online businesses generating and selling deepfake videos.

One such portal required 250 photos of the target subject and two days of processing to generate a video. Deeptrace says the prices charged vary but can be as little as $2.99 per video.

Another report earlier this year by the Witness Media Lab, a collaboration between a human-rights organisation and Google, found that creating deepfake videos still requires some skill - but that is changing quickly.

The report says that, for now, simulating real faces in a completely realistic way still requires a significant team of people with specialised skills and technology.

But the lengthy process is being automated, allowing people without that specialist knowledge to make videos that may be less sophisticated but can be generated much faster.

Videos flagged with the deepfake hashtag on YouTube include some impressive examples of how the technology is being used by professional teams.

One video, in which The Shining suddenly features Jim Carrey in the Jack Nicholson role, was made by an artist called Ctrl Shift Face.

The anonymous creator helpfully warns on his channel: "Do not believe everything that you see on the internet, OK?"

Ctrl Shift Face's aim is to entertain rather than deceive. But there are obviously fears that such fakery could be used to sway an election campaign or whip up hatred against a particular group.

So far, however, there appear to be few, if any, instances of deepfakes succeeding in fooling people for malevolent purposes.

As a business set up to protect organisations from this phenomenon, Deeptrace could, of course, have an interest in hyping the threat. And Ms Bego questioned whether deepfake-detection technology is the right approach.

"A viral video can reach an audience of millions and make headlines within a matter of hours," she explained.

"A technological arbiter telling us the video was doctored after the fact might simply be too little too late."

In any case, it appears that, in the short term, the real victims of those using deepfake videos maliciously will not be governments and corporations but individuals, most of them women.

It is unlikely that they will be able to afford to hire specialists to combat their abusers.
