Leak claims RTX 5090 has 600W TGP, RTX 5080 hits 400W — up to 21,760 cores, 32GB VRAM, 512-bit bus
(www.tomshardware.com)
Less than 32GB of VRAM would suck; then I'll not buy a 5090. A 5080 with only 16GB of GDDR7 would also suck. Better to get a 3090 with 24GB then.
The only thing keeping 4080 (and 5080) cards "reasonably" priced is the fact that they only have 16GB, so they aren't that good for AI shit. You don't need more than 16GB of VRAM for gaming. If those cards had more VRAM, the AI datacenters would pick them up, keeping their price even higher than it is.
I have a 7900 XT and was using over 17GB in Jedi: Survivor. No ray tracing, no frame gen, just raw raster and max AA.
Granted, that's because that game is so horribly optimized. But still, I used more than 16GB.
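If you want to check what your own card is actually using, here's a minimal sketch that reads the driver's VRAM counters. It assumes Linux with the amdgpu driver (fitting for a 7900 XT), and `card0` is a placeholder that can differ per machine; on Windows you'd use the Adrenalin overlay or Task Manager instead.

```python
# Minimal sketch: read VRAM usage from the amdgpu driver's sysfs
# counters on Linux. The card index ("card0") is an assumption and
# varies per system; the counters report bytes.
from pathlib import Path

def amdgpu_vram_usage(card: str = "card0") -> tuple[float, float]:
    """Return (used_gib, total_gib) for the given DRM card."""
    base = Path(f"/sys/class/drm/{card}/device")
    used = int((base / "mem_info_vram_used").read_text())
    total = int((base / "mem_info_vram_total").read_text())
    gib = 1024 ** 3
    return used / gib, total / gib

if __name__ == "__main__":
    used, total = amdgpu_vram_usage()
    print(f"VRAM: {used:.1f} / {total:.1f} GiB")
```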
You kinda can... Nvidia card users have been having the toughest time with the Hunt: Showdown update because CryEngine happily gobbles up VRAM. It's not a problem on AMD cards, but various Nvidia owners have had bad experiences running at the resolutions they normally do.
Maybe 16GB is the number where things are okay; I haven't heard complaints about cards above 12GB. Point being, Nvidia being stingy with VRAM has bitten some folks and at least one game developer.
Still, 32GB seems EXCESSIVE.
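For what it's worth, 32GB isn't an arbitrary pick: GDDR chips each hang off a 32-bit channel, so capacity is pinned to bus width times per-chip density. A back-of-envelope sketch of that arithmetic, assuming 2GB (16Gb) GDDR7 modules and inferring the 5080's 256-bit bus from its rumored 16GB rather than from the headline:

```python
# Rough arithmetic for why 32GB pairs with a 512-bit bus. The
# 2GB-per-chip density is an assumption about early GDDR7 modules.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
    chips = bus_width_bits // 32   # one GDDR chip per 32-bit channel
    return chips * chip_density_gb

print(vram_capacity_gb(512))  # 16 chips x 2GB = 32GB (rumored 5090)
print(vram_capacity_gb(256))  # 8 chips x 2GB = 16GB (implied 5080)
```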
Nvidia and production yields decide how much RAM these cards are gonna get.
If it made financial sense (i.e., a market existed), Nvidia would stop making desktop cards overnight.
For VR, you already do.