Sony announces the PS5 Pro with a larger GPU, advanced ray tracing, and AI upscaling
(www.theverge.com)
Well, as a PC gamer, there's a bunch of settings you can turn up in "last-gen" games to make them look better. Just because they ran on those machines doesn't mean you were getting the best version; if you're playing on console, you're never getting the best version. This newer one can just turn on more settings at a higher resolution and framerate than the previous ones. I wish they'd let players pick the settings themselves, but sadly I don't think that's happening on console anytime soon.
Chasing the "best version" is a fool's errand, though. Unless you're buying top-of-the-line hardware every cycle, you'll never have the best. And even then, there are games that seem to target future hardware by having settings so high not even top-end PCs can max them out comfortably, and other games that are just so badly optimized they'll randomly decide they hate some feature of your setup and tank the performance, too.
Everyone has their threshold for what looks good enough, and they upgrade when they reach that point. I used my last PC for 10 years before finally upgrading to a newer build, and I'm hoping to use my current one just as long.
But just based on the demonstrated performance difference between the base PS5 and the PS5 Pro, it doesn't seem like a good investment for the benefits you get. It's like paying Apple prices for marginally better hardware, with the overpriced ~~wheels~~ disc drive sold separately.
For sure, trying to max out everything is a bad idea. You can always trade FPS for higher resolution, for example. My point is just that "last-gen" doesn't mean anything. The previous console versions couldn't max the games out if they had graphics options, and a game being older doesn't mean it can't take advantage of more advanced settings on better hardware.
I think chasing high graphics settings in general is a dumb idea. My favorite games are low fidelity indie games that do interesting things (right now Ostranauts, but also Factorio, Dwarf Fortress, and so many others). The games that max out my hardware are generally worse games. If you're selling your game based on graphics then you aren't selling it based on gameplay. I know console players generally seem to care about "realistic" graphics more, but it's a fool's errand.
Man, this is true now, but this conversation makes me very nostalgic for the good old days of the 1080Ti, where PC games were absolutely a "max out and forget" affair.
Sure, that was because monitors were capped out at 1080p60, by and large. These days people are trying to run 20 year old games at 500fps or whatever. But man, the lack of having to think about it was bliss.