Alder Lake DDR5 Performance Benchmarks Leaks As Well

https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

Probably it could reduce latency by a few ns, but not much. Maybe 2-4 ns. Just guessing, since command rate doesn't have that big an impact on memory latency. But there's one thing to consider. Those timings are very loose. CL40 @ 6400 means an absolute latency of 12.5 ns. For comparison, CL16 @ 3800 gives an absolute latency of 8.4 ns. For that 6400 kit to reach that value, it would need a CAS of 27. But this is only one timing. And not even the most important one.
Nobody, or very few people, have CL16 3800 DDR4. CL18 3600 is what I'd consider mainstream. So I actually think this DDR5 is pretty comparable latency-wise to mainstream DDR4. Of course, much better DDR5 will follow in the next year or two.
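The absolute-latency figures quoted above follow from one formula: a DDR module transfers twice per clock, so the I/O period in ns is 2000 / (data rate in MT/s), and CAS latency in ns is CL times that. A quick sketch (back-of-the-envelope only; it ignores every timing besides CAS, which, as the post notes, is not even the most important one):

```python
# Rough CAS-latency arithmetic for the kits discussed above.
# DDR transfers data twice per clock, so the I/O period in ns = 2000 / MT/s.

def cas_latency_ns(cl: int, mts: int) -> float:
    """Absolute CAS latency in nanoseconds for a given CL and data rate (MT/s)."""
    return cl * 2000 / mts

def cl_for_target_ns(target_ns: float, mts: int) -> float:
    """CL a kit at `mts` would need to hit a target absolute latency."""
    return target_ns * mts / 2000

print(cas_latency_ns(40, 6400))  # DDR5-6400 CL40 -> 12.5 ns
print(cas_latency_ns(16, 3800))  # DDR4-3800 CL16 -> ~8.4 ns
# CL the 6400 kit would need to match the DDR4-3800 CL16 figure: ~27
print(cl_for_target_ns(cas_latency_ns(16, 3800), 6400))
```

Running the numbers reproduces the post's 12.5 ns, 8.4 ns, and CAS ~27 figures.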
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
user1:

Intel added a 1/2-rate mode for the memory controller in Rocket Lake, known as Gear 2 (I mistakenly referred to this as geardown mode). So if you are running 2T + Gear 2, you get an effective 4T, combined with increased latency from the slower memory controller. While this is mainly used to reduce power consumption on current chips, it can be presumed to be necessary to run DDR5 at its intended rate on Alder Lake, barring some kind of unexpectedly massive improvement to Intel's memory controller. Here is a graph from PC Gamer showing the effect of Gear 2 on performance in F1 2020. Gear 2 + 2T is enough to reduce the performance of a 3200 MT/s memory kit to that of a 2400 MT/s kit operating in Gear 1 + 1T. It's a pretty severe hit to latency.
That's a big difference. Their latency test shows a 10-12 ns difference just from changing Gear 1 to Gear 2. https://cdn.mos.cms.futurecdn.net/aDdRiGumTHSL6XfUEBh2bK-2560-80.png
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
southamptonfc:

The conclusion I draw from that graph is that memory speeds and latencies have zero effect on gaming unless you have a 250hz 1080p monitor.
Depends on the game, obviously. F1 is a light game; you're probably going to hit 60 fps on a potato CPU. If you were to play something heavy on the CPU, it would probably make some real-world difference in playability. Honestly though, I think we're a little spoiled these days. Most games will run well above 60 fps without issue on mediocre hardware, and the large amounts of cache on modern CPUs mask a lot of the latency penalty of slower memory.
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
southamptonfc:

The conclusion I draw from that graph is that memory speeds and latencies have zero effect on gaming unless you have a 250hz 1080p monitor.
We are only looking at average FPS. But a CPU that has a better memory subsystem is able to maintain higher minimum frame rates. And less stuttering. This can result in much smoother gameplay.
https://forums.guru3d.com/data/avatars/m/255/255510.jpg
I read an article the other day that said Intel had been stuck on 14nm for the past 6 years. New architectures had been released to increase performance, and of course more cores for the core wars. But in real terms, Intel CPUs over the past 6 years have only increased in performance by 10%. 😱
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
vestibule:

I read an article the other day that said Intel had been stuck on 14nm for the past 6 years. New architectures had been released to increase performance, and of course more cores for the core wars. But in real terms, Intel CPUs over the past 6 years have only increased in performance by 10%. 😱
Since Skylake, they didn't release a single new arch. At least not on the desktop side. From the 6000 series onwards it was the same core. Only core count and some clock speed increases. And of course, a ton of microcode to patch all those security holes. Only Rocket Lake was new. And now Alder Lake.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

We are only looking at average FPS. But a CPU that has a better memory subsystem is able to maintain higher minimum frame rates. And less stuttering. This can result in much smoother gameplay.
hmmm, the graph has min. framerates. The slowest RAM still manages min. 198 fps. No stuttering there...
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
southamptonfc:

hmmm, the graph has min. framerates. The slowest RAM still manages min. 198 fps. No stuttering there...
F1 games are not very hard on the memory subsystem. In other games, memory speed can affect both average and 1% low fps. [youtube=VElMNPXJtuA]
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

F1 games are not very hard on the memory subsystem. In other games, memory speed can affect both average and 1% low fps. [youtube=VElMNPXJtuA]
That video confirms what I said. "In short, in most applications and games, memory speed has little impact". At 1080p, yes, you might see a benefit. But as they say, at "realistic" resolutions, 1440p and above, it makes no difference. DDR-3200 is as fast as DDR-4000. At 4K, DDR-2400 is 1 fps slower than DDR-4000 😀
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
southamptonfc:

That video confirms what I said. "In short, in most applications and games, memory speed has little impact". At 1080p, yes, you might see a benefit. But as they say, at "realistic" resolutions, 1440p and above, it makes no difference. DDR-3200 is as fast as DDR-4000. At 4K, DDR-2400 is 1 fps slower than DDR-4000 😀
It depends on how much a game and resolution is CPU bound.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

It depends on how much a game and resolution is CPU bound.
Do you have any examples where at a realistic resolution, that is the case? Because so far, everything that has been posted here shows there is no benefit to gaming with RAM speeds above ~3200
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
southamptonfc:

Do you have any examples where at a realistic resolution, that is the case? Because so far, everything that has been posted here shows there is no benefit to gaming with RAM speeds above ~3200
But I never argued that you must go above 3200 to get huge frame rates. In fact, no one in this thread argued that. You are the one who has now defined an arbitrary speed of 3200, just to defend your position. We were talking about how some settings affect latency, such as Gears, command rate, etc. And what do you mean by "realistic" resolutions? 1080p is the most used resolution among PC gamers. More than 68% of Steam users run at 1080p.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

But I never argued that you must go above 3200 to get huge frame rates. In fact, no one in this thread argued that. You are the one who has now defined an arbitrary speed of 3200, just to defend your position. We were talking about how some settings affect latency, such as Gears, command rate, etc. And what do you mean by "realistic" resolutions? 1080p is the most used resolution among PC gamers. More than 68% of Steam users run at 1080p.
OK.... 🙄 Let's recap
Horus-Anhur:

We are only looking at average FPS. But a CPU that has a better memory subsystem is able to maintain higher minimum frame rates. And less stuttering. This can result in much smoother gameplay.
I pointed out that the graph DOES have min. framerates and contradicts what you said. Then you tried to wriggle out of that by saying:
Horus-Anhur:

F1 games are not very hard on the memory subsystem. On other games, memory speed can lower both average and 1% low fps. [youtube=VElMNPXJtuA]
You included a video which directly contradicts what you said. "Realistic" resolutions is a quote from the video you posted. Did you actually watch it? 3200 again comes from the video you posted. Now I'm certain you didn't watch it.
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
southamptonfc:

OK.... 🙄 Let's recap I pointed out that the graph DOES have min. framerates and contradicts what you said.
And even that graph shows lower minimum frame rates. You are the one who decided that a difference of 100 fps in averages between the fastest and slowest RAM is nothing. And a 70 fps difference in minimums.
southamptonfc:

Then you tried to wriggle out of that by saying: You included a video which directly contradicts what you said.
That video shows games with performance that scale with memory speed. Both in average and in minimum fps.
southamptonfc:

"Realistic" resolutions is quote from the video you posted. Did you actually watch it? 3200 again comes from the video you posted. Now I'm certain you didn't watch it.
That video tests at 3 resolutions: 1080p, 1440p, and 4K.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Horus-Anhur:

And even that graph shows lower minimum frame rates. You are the one that decided that a difference of 100 fps of average, between the fastest and slowest ram, is nothing. And a 70 fps difference in minimum. That video shows games with performance that scale with memory speed. Both in average and in minimum fps. That video tests with 3 resolutions. 1080p, 1440p and 4k.
Whatever dude, you make less sense with each post. You win. 10 internet points to you.
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
southamptonfc:

Do you have any examples where at a realistic resolution, that is the case? Because so far, everything that has been posted here shows there is no benefit to gaming with RAM speeds above ~3200
Here's a good review/article: https://www.overclockersclub.com/guides/memory_speed_vs_performance_intel/4.htm. Basically it's going to be dependent on the CPU, and on the threshold at which there isn't enough memory bandwidth/speed to feed it. Specifically, you can see with Warhammer 40k: Dawn of War 3 that on a DDR3 platform, going from 1333 to 2400 is enough to bring the minimums from 56 fps to 80 fps, and the average climbs substantially as well; it's a particularly CPU-heavy game. However, once you've reached/exceeded the memory bottleneck scenario, there is no further increase, as seen with the DDR4 platform, where going from 2133 to 3200 makes little improvement overall. While you may have a hard time finding a game right now that is memory-bottlenecked at DDR4 speeds, that will not be forever imo. I suspect that once we see some new games that push really high unit counts and fully utilize the available cores, it will start to matter again... or maybe not, since we're looking at CPUs with >100 MB of cache on the horizon lol.
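For a feel of where that bandwidth ceiling sits, here's a rough sketch of theoretical peak bandwidth for the speeds in that article. It assumes a standard 64-bit (8-byte) bus per channel and a dual-channel setup; real sustained throughput is of course lower than peak, so treat these as ballpark figures, not measurements.

```python
# Theoretical peak memory bandwidth: MT/s * 8 bytes per transfer (64-bit bus)
# * number of channels. Dual channel is assumed here.

def peak_bandwidth_gbs(mts: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for a given data rate in MT/s."""
    return mts * bus_bytes * channels / 1000

for label, mts in [("DDR3-1333", 1333), ("DDR3-2400", 2400),
                   ("DDR4-2133", 2133), ("DDR4-3200", 3200)]:
    print(f"{label}: {peak_bandwidth_gbs(mts):.1f} GB/s")
```

The jump from DDR3-1333 to DDR3-2400 nearly doubles peak bandwidth, which lines up with why that platform showed the big minimum-fps gains, while DDR4-2133 already starts above DDR3-2400's ceiling.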