this post was submitted on 22 Oct 2024

Futurology


top 5 comments
[–] [email protected] 23 points 1 week ago* (last edited 1 week ago) (1 children)

We've been dealing with this a lot in WA, too. The popular one a couple of months ago was the cliché Hollywood "We've kidnapped your grandkid, here's proof (an AI copy of their grandchild, built from Facebook videos as far as we can tell, reading off that day's news article), please send $xxx in bitcoin." It's disgustingly effective. Don't post your kids on social media, folks. Don't do it.

[–] [email protected] 5 points 1 week ago

Is this the 2000s again?

No names, no numbers, no addresses, and stop storing things on someone else's computer. But it's honestly too late for most people. They all got caught up in the "fuckerburg, give him your personal info" days.

More than half the world doesn't care or never thinks of the potential consequences, and then this kind of crap happens.

[–] [email protected] 4 points 1 week ago

This is literally the plot of Thelma.

[–] [email protected] 3 points 1 week ago

Everyone should make their elders watch Kitboga. I’ve learned about so many scams from him.

[–] [email protected] 2 points 1 week ago

This scam has been around since long before AI voice was a thing. Say something scary enough, and people will subconsciously attribute anything off about the voice to a bad connection and the severe stress the person is presumably under, and genuinely believe the voice sounds exactly like their loved one. AI voice makes it easier to fool more people, but I bet most of these scammers are not putting in the time to research every target and build a voice profile; they focus on calling as many people as possible instead. Of course, these days anyone taken in by this scam will assume it must have been an AI voice, because otherwise how did it sound so convincing?