Core i9 13900K DDR5 7200 MHz (+memory scaling) review


Hyperflux:

I've been waiting for a DDR5 clock speed/latency comparison. My system is essentially bottlenecked at 128 GB of DDR5, where I have to make one of the following two choices with my RAM kits (G.Skill Trident Z5 4x32 GB DDR5-6000 CL32 sticks):

- 128 GB DDR5 at 4800 MHz CL32 (or the CL40 default for stability; not sure what my builder has it set to right now)
- 64 GB DDR5 at full speed (6000 MHz CL32)

Which would you choose for performance right now? I do tend to run CFD applications that require over 64 GB of RAM, but I can use another workstation for the time being until a future BIOS update resolves the DDR5 issues. The priority is essentially 4K/120 and 1080p/500 gaming, multitasking (heavy Chrome tab use) and trading charts/applications, high-bandwidth music playback (32-bit/384 kHz), and music production and editing.

For reference, the rest of my build: i9-13900KS, 4090 Suprim X Liquid, Asus ROG Maximus Extreme Z790 motherboard (a future BIOS update is what would fix my dilemma, assuming 64 GB DDR5 sticks don't come out by then), EVGA 1600W SuperNova P2 80+ Platinum, 2x 4 TB WD SN850X NVMe SSDs, Corsair H170i Elite 420 mm CPU cooler w/LCD (I wanted the Ryujin II), Cooler Master HAF 700 Evo case, Windows 11 Pro, and an Asus ROG PG42UQ OLED 4K 138 Hz 16:9 display.

The absolute difference in RAM latency is 10.7 ns (6000 CL32) vs 13.3 ns (4800 CL32) vs 16.7 ns (4800 CL40), if my calculations are right. That's a 25-56% increase in latency with the slower 4800 MHz default four-DIMM speed over the full-speed option with half the RAM. Any input is appreciated, thanks!
This is a 2 vs. 4 sticks of memory problem, right? I don't think it is BIOS-fixable; the memory controller simply has more load on it with 4 sticks than with 2. I would say it depends on the games you play and how you play them. If you are playing at 4K 120 Hz, then I am guessing the performance difference is less than 5 fps. At 1080p 500 fps it would be a different story, but the question then is: does it matter whether you get 400 or 500 fps, and does the monitor even produce a good picture (overshoot) at those speeds? You can test all of this if you have a boring weekend ahead of you, but I suggest you do it as controlled as possible and write down before and after: type of workload, memory remaining, and whether tasks are running out of memory at 64 GB. It is important to have your own data with your settings in front of you and then make a decision.
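As a rough cross-check of the latency figures in the quoted post: DDR first-word latency in nanoseconds is the CAS latency in cycles divided by the I/O clock (half the transfer rate), i.e. 2000 × CL / (MT/s). A minimal sketch of that arithmetic (the helper name is just illustrative, not from the review):

```python
def first_word_latency_ns(transfer_rate_mt_s: float, cas_latency: int) -> float:
    """First-word latency: CL cycles divided by the I/O clock (half the transfer
    rate), converted to nanoseconds -> 2000 * CL / (MT/s)."""
    return 2000.0 * cas_latency / transfer_rate_mt_s

# The three configurations discussed above:
for rate, cl in [(6000, 32), (4800, 32), (4800, 40)]:
    print(f"DDR5-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.1f} ns")
# Prints 10.7 ns, 13.3 ns and 16.7 ns, matching the quoted 25-56% spread.
```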
Undying:

I'm just waiting to see how badly AMD's Zen 4 X3D will beat it. 😀
Add 10% over Intel and I think we are good.
Would be a good review if an RTX 4090 was used. Sadly, the 3090 is not powerful enough.
This gives my CL18 3600 DDR4 the potential to exist on a generation higher than my 5700X, if I were to go for a 13600K or something Intel on DDR4. Kinda weird to say that.
It kinda bugs me that all the games used are old and/or not very CPU-heavy. It would have been a lot more interesting to see the new version of The Witcher 3 tested (albeit horribly made from a technical standpoint), or something like Star Citizen. When you are getting 300+ fps with any CPU in the games tested... time to test different games! Ironically, the one game that isn't hitting that high an fps (Watch Dogs: Legion) is just hardcore GPU-bottlenecked by that 3090. As others have said, it should have been tested with a 4090.
Kool64:

This gives my CL18 3600 DDR4 the potential to exist on a generation higher than my 5700X, if I were to go for a 13600K or something Intel on DDR4. Kinda weird to say that.
Will the gains even justify the expense? I mean, going from a 5700X to a 13600.
Kool64:

This gives my CL18 3600 DDR4 the potential to exist on a generation higher than my 5700X, if I were to go for a 13600K or something Intel on DDR4. Kinda weird to say that.
Venix:

Will the gains even justify the expense? I mean, going from a 5700X to a 13600.
I am GPU-limited in 90% of my games with a 5800X at 4900 MHz boost, 4600 MHz all-core, using a stock-clocked 6900 XT. In the few games where I am not GPU-limited, I am limited by my 2560x1440 165 Hz monitor. I would wait one more generation.
The real-world benefits of getting the absolute fastest RAM (pick a year) have always been pretty slim and a terrible value for the gains. Doesn't seem like that's ever changing. It's RAM for uber builds to wring out a few more percent.
Kool64:

This gives my CL18 3600 DDR4 the potential to exist on a generation higher than my 5700X, if I were to go for a 13600K or something Intel on DDR4. Kinda weird to say that.
Why not just plug in and play a 5800X3D? It's going for less than $350 in some places in the USA. What resolution are you using?
Oh, I don't need a new chip; I was more commenting on the fact that my garbage DDR4 could still be useful on a next-gen system.