this post was submitted on 21 Aug 2023
15 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

all 50 comments
[–] [email protected] 15 points 1 year ago (3 children)

Random blue check spouts disinformation about "seed oils" on the internet. Same random blue check runs a company selling "safe" alternatives to seed oils. Yud spreads this huckster's disinformation further. In the process he reveals his autodidactically-obtained expertise in biology:

Are you eating animals, especially non-cows? Pigs and chickens inherit linoleic acid from their feed. (Cows reprocess it more.)

Yes, Yud, because that's how it works. People directly "inherit" organic molecules totally unmetabolized from the animals they eat.

I don't know why Yud is fat, but armchair sciencing probably isn't going to fix it.

[–] [email protected] 14 points 1 year ago (1 children)
[–] [email protected] 11 points 1 year ago (1 children)

That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?

[–] [email protected] 15 points 1 year ago (13 children)

A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you're lucky. Superintelligence? Means that your robot god might grant you immortality someday. Cryogenics? Means that there's some microscopic chance that even if you pass away you could be revived in the future at some point. Longtermism? Nothing besides maybe someday possibly making me immortal could possibly matter.

I mean, don't get me wrong, I'd give a lot for immortality, but I try to uhh... stay grounded in reality.

[–] [email protected] 7 points 1 year ago (1 children)

@sailor_sega_saturn @TinyTimmyTokyo @nyrath I like to call LessWrong "modern-day Scientology" and that moniker seems more and more appropriate with each passing month.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

@sailor_sega_saturn @TinyTimmyTokyo @nyrath For someone claiming to be rational (meaning putting reality above superstition), he [Yudkowsky] really did create what is essentially a proto-religion. Hence, Scientology.

I am literally a devout (if reformist) Christian and I'm less superstitious than that clown shoe and his zombies.

[–] [email protected] 5 points 1 year ago

@sailor_sega_saturn @TinyTimmyTokyo Been thinking and saying this for a while. These powerful billionaire types are terrified of death because it’s so egalitarian - nobody escapes it; no matter how much money and power they accumulate, they can’t get control over this one thing, and it drives them up the wall.

[–] [email protected] 4 points 1 year ago

@sailor_sega_saturn @TinyTimmyTokyo Kinda hilarious those beliefs are called "rationalism". They're speculation & pseudoscience at best.

[–] [email protected] 3 points 1 year ago (24 children)

@sailor_sega_saturn

Spelt "TESCREAL", pronounced "existential angst"...

[–] [email protected] 2 points 1 year ago

@sailor_sega_saturn @TinyTimmyTokyo Eh, there's no guarantee (or any real reason to believe) a simulation would even be focused on humanity in any way (no anthropocentrism needed).

Similarly for superintelligence, few reasons for it to care.

Cryogenics is a better bet, and as you say it's quite unlikely, unfortunately.

[–] [email protected] 9 points 1 year ago

The whole ‘trad clean’ market on twitter is wild. You’d be amazed at the markups on, like, tortilla chips.

[–] [email protected] 6 points 1 year ago

given how much of his schtick is a sort of semi-theological EvoPsych descendant (i.e. there is a perfect rational way for a human brain and body to work, it developed that way, and the only thing that can surpass it is post-humanism & AI), one wonders how he thinks humanity could have gotten this far if it's dangerous to eat seeds or any non-cow animals that eat seeds. You'd think at some point in 300K years we'd have evolved to safely digest more widely available forms of fat, Yud.

[–] [email protected] 11 points 1 year ago

Carves reality at the seams.

Oops, anyone know how to stitch reality back together again?

[–] [email protected] 7 points 1 year ago (1 children)

"carving reality at the joints" - does this mean anything?

Only finer-grained concepts like "linoleic acid" are useful for carving reality at the joints.

[–] [email protected] 14 points 1 year ago

His metaphors mix worse than seed oil and water.

[–] [email protected] 3 points 9 months ago

This systematic review and meta-analysis doesn't seem to indicate that linoleic acid is unusually bad for all-cause mortality or cardiovascular disease events. And is there another meta-analysis showing the opposite? I kinda just don't trust those anymore, unless somebody I trust vouches for the meta-analysis.

"I only trust meta-analyses if the results agree with me"

[–] [email protected] 2 points 1 year ago

Poor Yud's Bayes must have broken.