this post was submitted on 27 Oct 2024
172 points (97.8% liked)

Ask Lemmy
 

If AI and deepfakes can listen to a video or audio recording of a person and then successfully reproduce that person, what does this mean for trials?

It used to be that audio or video recordings carried strong evidentiary weight, often more than witness testimony, but soon enough perfect forgeries could enter the courtroom just as they're entering social media (where you're not sworn to tell the truth, though the consequences are real).

I know fake information is a problem everywhere, but I started wondering what will happen when it creeps into testimony.

How will we defend ourselves, while still using real videos or audios as proof? Or are we just doomed?

[–] [email protected] 12 points 1 week ago* (last edited 1 week ago) (1 children)

Eventually, we will just have to accept that photographic proof is no longer proof.

There are ways you could guarantee an image is valid. You would need a hardware security module inside the camera, which signs a hash of the picture with a built-in private key that can't be extracted, along with a serial number the device generates. That can prove an image came from a particular camera, and if you change even one pixel of that image the signature won't match anymore. I don't see this happening anytime soon, not mainstream at least. There are one or two camera manufacturers that offer this as a feature, but it's not on things like surveillance cameras or cell phones, nor will it be anytime soon.
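The idea above can be sketched in a few lines. This is a minimal illustration, not real camera firmware: it assumes an Ed25519 device key (via the third-party `cryptography` package), and the function names and serial format are made up for the example.

```python
# Sketch of hash-and-sign image attestation, as described above.
# Hypothetical names; assumes the 'cryptography' package is installed.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-in for the key inside the camera's HSM: it never leaves the chip.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()  # the manufacturer would publish this

def sign_image(image_bytes: bytes, serial: bytes) -> bytes:
    # Bind the signature to both the pixel data and the camera's serial.
    digest = hashlib.sha256(serial + image_bytes).digest()
    return device_key.sign(digest)

def verify_image(image_bytes: bytes, serial: bytes, signature: bytes) -> bool:
    digest = hashlib.sha256(serial + image_bytes).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

image = b"\x00\x01\x02\x03"      # stand-in for raw sensor data
serial = b"CAM-0001"             # made-up serial number
sig = sign_image(image, serial)

untouched_ok = verify_image(image, serial, sig)       # True
tampered = b"\xff" + image[1:]                        # change "one pixel"
tampered_ok = verify_image(tampered, serial, sig)     # False
```

The point of hashing first is that the signature covers every byte of the image, so even a single changed pixel produces a different digest and the verification fails.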

[–] [email protected] 2 points 1 week ago (1 children)

True, sooner or later there may be ways to ensure a picture or video is digitally signed, and that would probably be very hard to crack, but theoretically a fake video could still pass for real (though it would take a lot of resources to make that happen).

[–] [email protected] 1 points 1 week ago

More likely, most of the sources that produce photos and videos simply wouldn't use digital signatures. Professional cameras for journalists would probably have the signature chip. Cheapo surveillance cameras? Unlikely.