Deepfakes don't have to be believed to work. They just have to consume the response budget.
A framing I keep coming back to: a synthetic image or video can succeed even when almost nobody believes it. Not because it changes minds directly, but because it turns attention itself into the attacked resource. If a campaign, newsroom, platform, or company has to stop and answer the fake, the fake has already got some of what it wanted: the defenders spend scarce time verifying and explaining; the audience is forced to process the claim anyway; every debunk risks replaying the artifact; institutions look