Micron claims GDDR7 memory can bring up to 3.1x performance boost over GDDR6

Alfonso Maruccia

Something to look forward to: Micron started sampling GDDR7 chips earlier this month, stating that the new state-of-the-art memory will significantly boost the next generation of graphics processors. A new slide provides additional details about the anticipated performance increase.

Micron's GDDR7 memory is expected to replace both GDDR6 and non-standard GDDR6X chips currently used in modern GPUs for gaming and AI workloads. The Idaho-based company claims that graphics cards built with GDDR7 will provide over 30 percent more frames per second for both ray tracing and rasterization rendering across the three most common gaming resolutions (1080p, 1440p, and 4K).

According to one of Micron's official slides (seen above), players can expect "seamless visuals" and significantly better performance with next-generation games. The slide indicates that GDDR7 memory chips can deliver up to a 3.1x improvement over GDDR6 applications and a 1.5x increase over "best-in-class" GDDR6X applications.

Micron also provides some "normalized FPS" gaming benchmarks, comparing today's video memory technology to GDDR7 at 1080p Ultra, 1440p Ultra, and 4K Ultra settings with unnamed gaming titles.

The GDDR7 hardware appears to excel in ray tracing rendering scenarios, showing a 2.3x FPS increase at 1080p Ultra compared to GDDR6, and a 3.1x increase at 4K. Normalized FPS performance for raster graphics is also improved, with a 1.2x increase at 1080p Ultra and a 1.7x increase at 4K Ultra, presumably because memory speed becomes a larger factor at higher resolutions and settings.

Nvidia's GeForce RTX 40 series GPUs use GDDR6X memory chips manufactured by Micron, except for the RTX 4060 and RTX 4060 Ti models, which use GDDR6. The Blackwell architecture is expected to debut in consumer/gaming graphics cards by the end of 2024, with the flagship GeForce RTX 5090 model likely incorporating the GDDR7 memory chips that Micron recently started sampling.

Speculation about actual performance achievable by Nvidia's upcoming gaming GPUs is rampant. Based on current information about Blackwell, it is assumed that the GeForce RTX 5090 could be up to 42 percent faster than the GeForce RTX 4090 in rasterization and up to 48 percent faster for ray traced gaming.


 
There is potential for good here, but also for bad. Let's hope this boost in performance does not result in card makers cutting memory bandwidth just because they can.
 
I'm sorry, but Micron is claiming GDDR6X is over 2x faster than GDDR6, which is complete and utter BS. GDDR6X is at most 20% faster, with a maximum throughput of 24 Gbps vs 20 Gbps for GDDR6. How they came up with these performance benefits is bizarre, and why this stuff is being spread around the web by tech sites is beyond me.
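To put rough numbers on that, here's a quick sanity check in Python. The per-pin data rates are the commonly quoted maximums from the comment above, and the "slide ratio" is inferred from Micron's 3.1x and 1.5x claims rather than stated by Micron directly:

```python
# Sanity check: per-pin throughput advantage vs the speedup implied by Micron's slide.
gddr6_gbps = 20.0   # commonly quoted GDDR6 max per-pin data rate
gddr6x_gbps = 24.0  # commonly quoted GDDR6X max per-pin data rate

per_pin_advantage = gddr6x_gbps / gddr6_gbps
print(f"GDDR6X per-pin advantage: {per_pin_advantage:.2f}x")  # 1.20x, i.e. ~20%

# If GDDR7 is 3.1x GDDR6 but only 1.5x GDDR6X, the slide implicitly
# rates GDDR6X at roughly 3.1 / 1.5 = ~2.07x GDDR6.
implied_advantage = 3.1 / 1.5
print(f"Implied GDDR6X advantage from the slide: {implied_advantage:.2f}x")
```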
 
Reminds me of how the theoretical speed of USB can be amazingly fast, yet most devices remain fairly slow.
I give it a 30% improvement.
 
And somehow, Nvidia will come up with a feature that "just isn't possible on the 4xxx series… we call it Ultra DLSS 4.0" - and you're really missing out if you don't have it!
 
I'm sorry, but Micron is claiming GDDR6X is over 2x faster than GDDR6, which is complete and utter BS. GDDR6X is at most 20% faster, with a maximum throughput of 24 Gbps vs 20 Gbps for GDDR6. How they came up with these performance benefits is bizarre, and why this stuff is being spread around the web by tech sites is beyond me.
Yeah seems pretty unlikely unless there's some secret sauce that boosts certain RT calculations or something. Can't wait for actual benchmarks (though we might run into CPU bottlenecks anyway).
 
The amount of increase they claim will have to be benchmarked and proven by outside sources before it can be taken at face value. We have seen way too many of these claims disproven in the past to start believing them now.
 
The effective data rate multiplied by the bus width gives the memory bandwidth, and more memory bandwidth generally means better performance.

But let's wait for independent testing before drawing any conclusions.
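As a rough illustration of that formula, here's a minimal sketch in Python. The per-pin data rates and the 384-bit bus width are illustrative assumptions (the bus width matches an RTX 4090-class card), not specs from Micron's announcement:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Illustrative per-pin rates: 20 Gbps GDDR6, 24 Gbps GDDR6X,
# and the 32 Gbps Micron cites for its first GDDR7 parts.
for name, rate in [("GDDR6", 20.0), ("GDDR6X", 24.0), ("GDDR7", 32.0)]:
    print(f"{name}: {peak_bandwidth_gbs(rate, 384):.0f} GB/s on a 384-bit bus")
```

On those assumptions, GDDR7 lands at roughly 1.6x the raw bandwidth of GDDR6 (1536 vs 960 GB/s), which is why the application-level 3.1x claim is drawing skepticism above.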
 
Maybe it's just me, but I find it somewhat interesting that the biggest improvement claims only show up at 4K ultra settings...

Seems like some collusion there with the monitor manufacturers to push everyone to higher and higher spec'd screens. Kinda similar to nGreediya's "Spend more, save more" campaign :D

the GeForce RTX 5090 could be up to 42 percent faster than the GeForce RTX 4090 in rasterization and up to 48 percent faster for ray traced gaming

Yea yea, we've heard all that nonsense before, and when the hype settles down and the actual product hits the market, we MIGHT see 10-15% speed increases, albeit at 50-75% higher prices too!

Whatever makes the most profit: that, for the most part, is how much improvement we will get.

^^THIS^^
 