this post was submitted on 02 Aug 2023
14 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

Running llama-2-7b-chat at 8 bit quantization, and completions are essentially at GPT-3.5 levels on a single 4090 using 15gb VRAM. I don't think most people realize just how small and efficient these models are going to become.
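For context on the quoted claim, here's a back-of-envelope sketch of why 8-bit quantization of a 7B-parameter model lands in that VRAM range. The formula is just parameters × bits per weight; the real-world figure (~15 GB quoted above) is higher than the raw weight size because of KV cache, activations, and framework overhead, which this sketch does not model.

```python
# Rough VRAM estimate for holding an LLM's weights at a given
# quantization level. Illustrative arithmetic only: actual usage adds
# KV cache, activation, and framework overhead on top of this.

def weight_vram_gb(n_params: float, bits_per_weight: int) -> float:
    """Gigabytes needed just to store the weights."""
    return n_params * bits_per_weight / 8 / 1e9

params_7b = 7e9
print(f"fp16: {weight_vram_gb(params_7b, 16):.1f} GB")  # ~14 GB
print(f"int8: {weight_vram_gb(params_7b, 8):.1f} GB")   # ~7 GB
print(f"int4: {weight_vram_gb(params_7b, 4):.1f} GB")   # ~3.5 GB
```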

[cut out many, many paragraphs of LLM-generated output which prove… something?]

my chatbot is so small and efficient it only fully utilizes one $2000 graphics card per user! that’s only 450W for as long as it takes the thing to generate whatever bullshit it’s outputting, drawn by a graphics card that’s priced so high not even gamers are buying them!
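The 450 W figure above can be turned into a per-response cost with trivial arithmetic. Everything here is an illustrative assumption (generation time, electricity price), not a measurement, but it shows the shape of the math the comment is gesturing at: the electricity is cheap per response, and the $2000 card is the part that doesn't amortize well at one user per GPU.

```python
# Per-response energy cost for one GPU drawing ~450 W, fully occupied
# by a single user's request. All inputs are illustrative assumptions.

def energy_wh(power_watts: float, seconds: float) -> float:
    """Energy in watt-hours for a draw of power_watts over seconds."""
    return power_watts * seconds / 3600

def cost_usd(watt_hours: float, usd_per_kwh: float) -> float:
    """Electricity cost for that energy at a given price per kWh."""
    return watt_hours / 1000 * usd_per_kwh

gen_seconds = 30  # assumed time to generate one long completion
per_response_wh = energy_wh(450, gen_seconds)  # 3.75 Wh
print(f"{per_response_wh:.2f} Wh per response")
# at an assumed $0.15/kWh, well under a tenth of a cent in electricity,
# before amortizing the ~$2000 card itself
print(f"${cost_usd(per_response_wh, 0.15):.4f} per response")
```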

you’d think my industry would have learned anything at all from being tricked into running loud, hot, incredibly power-hungry crypto mining rigs under their desks for no profit at all, but nah

not a single thought spared for how this can’t possibly be any more cost-effective for OpenAI either; just the assumption that their APIs will somehow always be cheaper than the hardware and energy required to run the model
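To make the self-hosting-vs-API comparison concrete, here's a hedged sketch of amortized cost per 1k tokens for the single-user-per-GPU setup described above. Every number (card price, useful lifetime, throughput, power draw, electricity rate) is an assumption chosen for illustration, not a quote or benchmark.

```python
# Amortized self-hosted inference cost per 1k tokens: hardware cost
# spread over the card's assumed lifetime, plus electricity, divided
# by assumed throughput. All inputs are illustrative assumptions.

def self_hosted_usd_per_1k_tokens(
    card_usd: float,          # upfront GPU cost
    lifetime_hours: float,    # hours over which to amortize the card
    watts: float,             # power draw while generating
    usd_per_kwh: float,       # electricity price
    tokens_per_second: float, # assumed generation throughput
) -> float:
    hardware_per_hour = card_usd / lifetime_hours
    energy_per_hour = watts / 1000 * usd_per_kwh
    tokens_per_hour = tokens_per_second * 3600
    return (hardware_per_hour + energy_per_hour) / tokens_per_hour * 1000

cost = self_hosted_usd_per_1k_tokens(
    card_usd=2000,
    lifetime_hours=3 * 365 * 24,  # assume a 3-year useful life
    watts=450,
    usd_per_kwh=0.15,
    tokens_per_second=30,
)
print(f"self-hosted: ~${cost:.4f} per 1k tokens")
```

Even with generous assumptions, one dedicated $2000 card per user is hard to get cheap per token, which is the commenter's point: the arithmetic only works out if the hardware is shared across many concurrent users.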

all 49 comments
[–] [email protected] 14 points 1 year ago (1 children)

And this isn't even the expensive part, which is training; this is just inference.

Can't wait for this fad to be over

[–] [email protected] 5 points 1 year ago

not happening fast enough. Maybe that's just my inner Luddite hankering for some circuit board smashing

[–] [email protected] 11 points 1 year ago (1 children)

dont worry once we get AGI it'll figure out how to run itself on an intel 8080 trust me i thought about it really hard

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago) (1 children)

for real though, i keep seeing important people specifically say "AI will help with climate change" and like... how, dude? by burning a ton of energy to think really hard about it with its magic brain powers? like, what is the actual concrete help here supposed to be, for real. is this just the new "crypto incentivizes switching to green energy"? :/

[–] [email protected] 6 points 1 year ago

I'm just as clueless. I think there are three syllogisms that tech brains orbit around.

  • AI will improve society, a better society would face climate change more effectively, so AI will help us face climate change.
    • Basically an extension of the milder claims with regards to AI improving education, health care, research, the economy, etc.
    • Very vague and feel-good and I think more a reaction to distrust of AI than an assertion of anything.
  • AI will help us better understand complex systems like climate, understanding the complexity of climate change helps us, so AI will help us face climate change.
    • Stemming from skepticism of current climate science methodology that doesn't fit with what they think science should be.
    • Secretly hope that AI will show us climate change isn't actually even real and the fact we think it is is some byproduct of our feeble minds trying to understand something so dynamic and complex.
  • There's some magic bullet technological solution to climate change that is beyond current human ability to invent. AI can potentially eclipse those limits and invent things we can't. So AI can invent this magic solution.
    • Hardcore AI singularity takeoff yadda yadda folks. Goes hand in hand with the ideas AI will invent microorganisms or nanobots that will take over the entire biosphere or thinking it will find a new theory of physics that lets it teleport places or shit like that.
    • In this POV climate is even a non-issue since AI could easily solve it but we can't easily solve how to not make this AI kill us.
[–] [email protected] 9 points 1 year ago (1 children)

Who decided that this point on the climate change graph was a good point at which to spend millions of dollars on AI?

[–] [email protected] 8 points 1 year ago

the guys with all these slightly charred crypto mining graphics cards lying about?

[–] [email protected] 4 points 1 year ago (1 children)

It's a shame that analog inference accelerators are taking so long to hit the market. GPUs are way too expensive and power hungry for inference when you don't need the ability to train a network.

[–] [email protected] 14 points 1 year ago

oh totally, upgrading from GPUs to ASICs will really increase my ~~hash rate~~ ~~mining profits~~ number of concurrent conversations