Good lord... This is a needlessly antagonistic and sad attempt at dunking on someone. Everyone without a $2000 GPU just needs to get out of their parents' basement? Really?
> If high framerates are so important, why are Hollywood films -- which always vastly exceed videogames in graphical realism -- released at 30fps?

I thought it was 24fps?
> If high framerates are so important, why are Hollywood films -- which always vastly exceed videogames in graphical realism -- released at 30fps?

First of all, most films are actually at 24 fps. So it would seem your argument is even more valid!
Ray tracing. 4K. 8K.
How about a consistent 1080p/60fps first, hmm?
https://www.eurogamer.net/new-xbox-series-xs-consoles-detailed-in-enormous-microsoft-leak

Perhaps you have missed Phil Spencer's latest interviews, in which he clearly said that he is not interested in providing a "pro/upgrade" this time around, but rather in bringing brand-new hardware to market, which ties in with the official announcements that Xbox is working on a new console to be revealed sometime this year.
OK, maybe a dumb question among hardcore techies, but the PS5 GPU is from AMD, so why can't the same improvements be made to give a big RT boost to AMD PC GPUs?
You're arguing a different point. The argument was that higher framerates made more visually appealing images than raytracing, which is patently absurd. The problem is, movies are simply watched, whereas games are interactive. Higher framerates allow for less input lag.
> You're arguing a different point. The argument was that higher framerates made more visually appealing images than raytracing, which is patently absurd.

Theoretically? You don't normally troll so I assume you're actually serious?
But to answer your point, yes a higher frame rate *theoretically* means less input lag ... but it takes a neural impulse about 250 ms to travel from your eye to your brain to your hand. That works out to a grand total of FOUR fps. Some competitive gamers have been measured with a reaction rate slightly more than twice as fast ... but needing 100 fps+ to compete is one of those security-blanket beliefs that videogamers cherish.
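As a quick sanity check, that "reaction time to fps" conversion is just a division. A minimal Python sketch, assuming the commenter's 250 ms figure and taking 120 ms as an illustrative stand-in for a "twice as fast" competitive player:

```python
# Check of the "250 ms works out to FOUR fps" arithmetic above.
# The 250 ms figure is the commenter's claim, and 120 ms is an
# assumed value for a roughly 2x-faster competitive player.

def equivalent_fps(reaction_time_ms: float) -> float:
    """Treat one reaction time as one 'frame' and convert to frames per second."""
    return 1000.0 / reaction_time_ms

print(equivalent_fps(250))  # average player: 4.0 "fps"
print(equivalent_fps(120))  # faster competitive player: ~8.3 "fps"
```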
> If your argument really is "it takes a neural impulse longer blah blah blah", then surely if you see it quicker on screen due to the higher framerate, your "neural impulse" will "start" sooner than the person on the lower framerate machine?

Absolutely correct. But do the math:

Worst-case lag due to frame-rate delay:
Person @ 60 fps: 250 ms + (1/60)(1000) = 266.7 ms
Person @ 150 fps: 250 ms + (1/150)(1000) = 256.7 ms
Net improvement: 3.75%

That's local performance. If you're playing online, there's likely at least 100 ms of round-trip network latency, which drops the improvement to about 2.7%. And that's worst case -- the mean improvement is half that, or roughly 1.4%.

I won't argue if you call that a competitive edge, as my original point had nothing to do with this: it was merely to point out that graphical realism is far more heavily tied to the physics of raytracing and motion than to frame and pixel counts.

> Absolutely correct. But do the math:

Okay, I see what you mean now.
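For anyone who wants to verify it, here is that worst-case arithmetic from the exchange above as a short Python sketch; the 250 ms reaction time and 100 ms round-trip latency are the commenter's assumed figures, and the `worst_case_lag_ms` helper is purely illustrative:

```python
# The worst-case input-lag arithmetic from the comment above.
# REACTION_MS and NETWORK_MS are the commenter's assumptions,
# not measured data.

REACTION_MS = 250.0  # assumed eye-to-brain-to-hand reaction time
NETWORK_MS = 100.0   # assumed online round-trip latency

def worst_case_lag_ms(fps: float, network_ms: float = 0.0) -> float:
    """Reaction time plus one full frame period (the worst case) plus network latency."""
    return REACTION_MS + 1000.0 / fps + network_ms

for net in (0.0, NETWORK_MS):
    slow = worst_case_lag_ms(60, net)    # 60 fps player
    fast = worst_case_lag_ms(150, net)   # 150 fps player
    gain = (slow - fast) / slow * 100    # relative improvement, percent
    print(f"network {net:.0f} ms: 60 fps -> {slow:.1f} ms, "
          f"150 fps -> {fast:.1f} ms, improvement {gain:.2f}%")
# Prints 3.75% offline and 2.73% online, in line with the figures above.
```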
Obviously you missed the part of the article that says the Series X refresh is NOT a "pro/upgrade", in which Phil Spencer has said those were old, outdated plans anyway.
> It will be the same as with Red Dead Redemption 2: it will be console-exclusive for the first few years, and at release it will be marketed as "plays best on PS5 Pro/Xbox One Pro" (whatever that console is called now). Reviewers will be in awe of how incredible GTA6 looks and plays -- on a pro console.

Daydreaming much, right?
> Obviously you missed the part of the article that says the Series X refresh is NOT a "pro/upgrade", in which Phil Spencer has said those were old, outdated plans anyway.

Firstly, the main article that we are all commenting on is about the PS refresh, not the Xbox refresh. The only person to mention Phil Spencer is you. You should read the articles you are linking.