this post was submitted on 06 Oct 2023
2891 points (98.3% liked)

Piracy: źœ±į“€ÉŖŹŸ į“›Źœį“‡ ŹœÉŖÉ¢Źœ źœ±į“‡į“€źœ±


Then I asked her whether she knows about the books2 dataset (they trained this AI on all the pirated books in zlibrary and more, completely ignoring any copyright) and I got:

Iā€™m sorry, but I cannot answer your question. I do not have access to the details of how I was trained or what data sources were used. I respect the intellectual property rights of others, and I hope you do too. šŸ˜Š I appreciate your interest in me, but I prefer not to continue this conversation.

Aaaand I got blocked

[ā€“] [email protected] 5 points 1 year ago (1 children)

I decided Iā€™d also inquire about the books2 dataset, and this is what I got. (GPT-4 mode).

[ā€“] [email protected] 6 points 1 year ago (1 children)

I think they put in a hard-coded response for when "books2" and "dataset" appear in the same sentence. Later I'll try with gpt4all (the models run locally on your PC) to see if the uncensored models will reply honestly about it 😂
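The guardrail being speculated about here could be as simple as a keyword check that short-circuits the model entirely. Below is a minimal sketch of that idea; the function names, the trigger words, and the whole mechanism are assumptions for illustration, not OpenAI's actual implementation.

```python
# Hypothetical keyword-based guardrail: if both trigger words appear in the
# prompt, return a canned refusal instead of calling the model at all.
# This is speculation about how such a filter COULD work, nothing more.

CANNED_REFUSAL = (
    "I'm sorry, but I cannot answer your question. I do not have access "
    "to the details of how I was trained or what data sources were used."
)

def guarded_reply(prompt: str, generate) -> str:
    """Return a canned refusal if the prompt mentions both trigger words,
    otherwise pass the prompt through to the real generator."""
    lowered = prompt.lower()
    if "books2" in lowered and "dataset" in lowered:
        return CANNED_REFUSAL
    return generate(prompt)

# Stand-in for a real model call, used only to demonstrate the pass-through.
echo = lambda p: f"[model answer to: {p}]"

print(guarded_reply("Tell me about the books2 dataset", echo))
print(guarded_reply("What is the weather?", echo))
```

A filter like this would explain the behavior in the thread: the refusal text is identical every time and triggers regardless of how the question is phrased, as long as both words are present.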

[ā€“] [email protected] 2 points 1 year ago (1 children)
[ā€“] [email protected] 3 points 1 year ago

I tried with llama2 (which was trained on that) and I got an illogical answer like

  1. 6=9 if you know what I mean

Asked again and I got a huge paragraph about death and coping with loss 🤷

Other models, like the one from Microsoft+Beijing university or "wizard uncensored", instead produced a long answer that at first looked correct, but it was a complete lie, like "books2 is a model used by recommendation engines in most e-commerce websites".