this post was submitted on 06 Oct 2023
2891 points (98.3% liked)
Piracy: ź±į“ÉŖŹ į“Źį“ ŹÉŖÉ¢Ź ź±į“į“ź±
I decided I'd also inquire about the books2 dataset, and this is what I got. (GPT-4 mode).
I think they put in a hard-coded response for when "books2" and "dataset" appear in the same sentence. Later I'll try with gpt4all (the models run locally on your PC) to see if the uncensored models will answer honestly about it.
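For anyone who wants to repeat the experiment locally, here's a minimal sketch assuming the gpt4all Python bindings; the model filename and the prompt are placeholders, not something taken from the thread.

```python
# Minimal sketch, assuming the gpt4all Python bindings (pip install gpt4all).
from gpt4all import GPT4All

# Example model name only -- swap in whatever locally downloaded
# (uncensored) model you actually have.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")

prompt = "What is the books2 dataset and what does it contain?"

# chat_session() keeps the exchange inside one conversational context.
with model.chat_session():
    reply = model.generate(prompt, max_tokens=512)
    print(reply)
```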
Please let us know
I tried with llama2 (which was trained on that dataset) and got a nonsensical answer.
Asked again and got a huge paragraph about death and coping with loss š¤·
Other models, like the one from Microsoft + Beijing University or "Wizard Uncensored", instead produced a long answer that looked correct at first but was a complete fabrication, like "books2 is a model used by recommendation engines on most e-commerce websites".