this post was submitted on 01 Aug 2023
319 points (97.1% liked)
Technology
Heat is a huge barrier to increasing clock speeds, so a room-temperature, ambient-pressure superconductor would translate fairly directly into major performance gains in computing.
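To see why heat caps clock speeds, it helps to know that the dynamic (switching) power of CMOS logic scales roughly as P ≈ α·C·V²·f. A minimal sketch, with made-up illustrative numbers (the activity factor, capacitance, and voltage below are not from any real chip):

```python
def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    """Approximate CMOS switching power in watts: P = a * C * V^2 * f."""
    return activity * capacitance_f * voltage_v**2 * freq_hz

# Same hypothetical chip at two clock speeds, all else held equal.
base = dynamic_power(0.1, 1e-9, 1.2, 4e9)     # ~4 GHz
boosted = dynamic_power(0.1, 1e-9, 1.2, 6e9)  # ~6 GHz
print(boosted / base)  # 1.5 -- power (and heat) grows linearly with frequency
```

In practice it's worse than linear, because hitting higher frequencies usually also requires raising the voltage, which enters squared.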
While true, that'd only apply to a superconducting CPU. I doubt this material can both superconduct and act as a transistor, and even if it can, I highly doubt you could pack in anywhere near the number of transistors we have in standard CPUs. So while we might replace a standard power supply with a superconducting one, and reduce heat that way, I don't see any direct computing boost from this. We could superconduct everything around the CPU and use superconducting wires, but the heat from a CPU is generated in the silicon itself.
It'd still be pretty nice to have near-100% efficient PSUs, though. Definitely some gains there, just not the revolutionary ones you'd see elsewhere.
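A back-of-envelope sketch of what those PSU gains look like, with made-up illustrative numbers. Note that real PSUs also have switching losses that a superconductor wouldn't remove, so this is an upper bound on the savings:

```python
load_w = 500        # hypothetical power delivered to the system
efficiency = 0.90   # typical conventional PSU efficiency

draw_w = load_w / efficiency  # power pulled from the wall
waste_w = draw_w - load_w     # dissipated as heat inside the PSU
print(round(waste_w, 1))      # ~55.6 W of heat a lossless supply would avoid
```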
This is where my mind went. I wondered whether the reduction in heat would allow further overclocking (or higher default clocks) on both CPUs and GPUs. I don't know that much about the actual hardware and how it works, though.
Not really. First, standard equipment is limited by cost, not technology. Nothing stops a power user from cooling a desktop with liquid nitrogen; it's just costly. Superconductor tech, though, would be bleeding edge, and it wouldn't cost any less for a long time. Supercomputing, on the other hand, has long had access to more esoteric cooling systems and can already use them. It has also had access to the extremely cold superconductors that already exist.
The real issue is that the CPU makes the heat, and this tech isn't a transistor. We can't replace the silicon chips with superconducting ones, at least not in a form dense enough to be a CPU. There are lots of small improvements we can make around the CPU, but those aren't at the "wow, this will revolutionize technology" level. They're cool, but it's the other stuff that's going to get the focus.
Managing heat is a large part of circuit design. Superconductors could fundamentally change everything about it, meaning designs that are far smaller, much faster, and more capable in every way. As an example, 95%+ of a modern CPU or GPU assembly by volume is cooling related; the actual chips are tiny compared to the whole component.