this post was submitted on 31 Aug 2023
589 points (98.0% liked)

Technology


I'm rather curious to see how the EU's privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn't have a paywall)

(page 2) 50 comments
[–] [email protected] 7 points 1 year ago

"virtually" impossible. hehehe

[–] [email protected] 6 points 1 year ago (3 children)

I feel like one way to do this would be to break up the model and its training data into mini-models and mini-batches of training data instead of one big model, and also to restrict training data to material used with permission as well as public-domain sources. Then, in any case where a company is required to take down information because permission to use it was revoked or expired, it could identify the relevant training data in the mini-batches, remove it, and retrain just the corresponding mini-model, which would be much quicker and cheaper than retraining the entire massive model.

A major problem with this, though, would be figuring out how to efficiently query multiple mini-models and combine their outputs into a single response. I'm not sure how you could do that, at least not very well...
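Here's a minimal sketch of the sharding idea, similar in spirit to published "SISA" machine-unlearning work. The shard count, the averaging-based aggregation, and all function names are made up for illustration, and a toy "model" (just the mean label of its shard) stands in for a real network:

```python
# Toy stand-in for training a model: here the "model" is just the mean
# of the shard's labels. A real system would train a neural network.
def train_mini_model(shard):
    return sum(label for _, label in shard) / len(shard)

def train_sharded(data, n_shards):
    # Split the dataset into disjoint shards and train one mini-model each.
    shards = [data[i::n_shards] for i in range(n_shards)]
    models = [train_mini_model(s) for s in shards]
    return shards, models

def forget(shards, models, record):
    # Delete one record, then retrain ONLY the affected mini-model
    # instead of the whole ensemble. Returns the shard index touched.
    for i, shard in enumerate(shards):
        if record in shard:
            shard.remove(record)
            models[i] = train_mini_model(shard)
            return i
    return None

def predict(models):
    # Combine mini-model outputs by simple averaging; a real system
    # might vote or weight by shard size instead.
    return sum(models) / len(models)

# Tiny synthetic dataset: feature tuple -> parity label.
data = [((x,), float(x % 2)) for x in range(100)]
shards, models = train_sharded(data, n_shards=10)
touched = forget(shards, models, ((3,), 1.0))
```

The appeal is that `forget` only pays the cost of retraining one shard's model; the open question from the comment above is exactly the `predict` step, since naive averaging or voting across many small models generally performs worse than one model trained on everything.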

[–] [email protected] 6 points 1 year ago

The Danish government, which has historically been very good about both privacy rights and workers' rights, recently suggested that it is looking into fixing the nurse shortage "via AI".

Our current government is probably the stupidest, most irresponsible, and least humanitarian one we've had in my 40-year lifetime, if not longer 🤬

[–] [email protected] 5 points 1 year ago

Then why did they put it in in the first place? 👁👄👁

[–] [email protected] 5 points 1 year ago (19 children)

Can't they remove the data from the training set and start over?

[–] [email protected] 4 points 1 year ago (1 children)

Yes, but that's not easy... I can't remember exactly, but I think I saw an estimate that the compute time to train just one of the GPT models cost around $66 million. IDK whether that's the total cost from scratch or the incremental cost of arriving at that model from an earlier one that was already built, but I do know that GPT to this day still uses that September 2021 knowledge cutoff, which to me implies that they're building progressively on top of already-assembled models and datasets (which makes sense, because starting from scratch without needing to would be insane).

You could, technically, start from scratch and spend two more years and however many millions of dollars retraining a new model that doesn't contain the private data you're trying to excise, but I think the point the article is making is that that's a hugely expensive approach, and right now it seems to be the only one.

[–] [email protected] 5 points 1 year ago

Un-robbing a bank also isn't easy, but that doesn't mean I'm able to just say "it too hard :c" and then walk off into the sunset with my looted gains.

[–] [email protected] 5 points 1 year ago (10 children)

It's not impossible; it's just expensive.

[–] [email protected] 3 points 1 year ago

Have you tried..

format Earth
