Spanish site posts AMD Ryzen 7 2700X review

https://forums.guru3d.com/data/avatars/m/216/216490.jpg
4350MHz! Nice boost! As already said many times, I can't wait for HH's review!! Edit: I took a peek in the article's source and it mentions only 4.2GHz@1.4v?
https://forums.guru3d.com/data/avatars/m/260/260828.jpg
Like you said, there is something really wrong with their memory testing.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
They said they have the latest BIOS and the right driver... which makes them wonder why the memory performs so badly. On the other hand, AMD itself sent a motherboard with the "latest BIOS" that was not compatible with Ryzen G, bundled with their Ryzen G, for a review last month: the CPU worked fine on the blog owner's own rig but had the same issue on the provided mobo. The boost is nice but the OC is not... I hope it is due to the mobo.
https://forums.guru3d.com/data/avatars/m/216/216490.jpg
I took a peek in the article's source and it mentions only 4.2GHz@1.4v? I edited my previous post, as I wouldn't consider 4.2GHz a "nice boost" tbh. Not bashing the 2700X or anything, just talking about the OC numbers.
......but yeah, I'm not seeing any game issues whatsoever. Latency is lower etc. .....
http://images.christianpost.com/ipost/full/2944/ObamaWink.jpg
https://forums.guru3d.com/data/avatars/m/209/209750.jpg
Actually, they don't have the latest BIOS for the new chipsets; they only tried the processor in an X370 motherboard with the latest (non-Ryzen+) drivers, which are not yet compatible with the new CPU. They just swapped the CPU.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
My body is ready... as soon as RAM prices stop being insane.
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Any review not utilizing a 400-series chipset will not show the full effect of the Ryzen+ CPU. The only reason to test on the X370 chipset is to see the difference between the two.
data/avatar/default/avatar14.webp
AMD's advantage, i.e. the higher number of cores, is its Achilles' heel as well, in the sense that 8 cores @ 4+GHz is not all that easy to do. What's AMD's version of SpeedStep? Ah OK, I remember... XFR... They need to improve on this and let the frequency boost upwards if there is enough TDP/temp room, for example for 1, 2, 3-core jobs. And Intel's SpeedStep does it so gracefully. This stupid 8700K of mine wants to go to 5GHz with me barely touching it. I swear to god, all I did was answer positively to a question in the ASUS BIOS (then I spent an hour trying to dial it back, cos I don't want 6 cores @ 5GHz all the time LOL, NH-D15 @ 250 RPM ftw 😀)
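For what it's worth, if anyone wants to watch how many cores are actually boosting and how far, a quick way on Linux is to poll the cpufreq sysfs files; a minimal sketch, assuming the standard scaling_cur_freq layout is exposed (works for Intel and Ryzen alike):

import os
import time

def core_mhz(cpu):
    # sysfs reports the current frequency in kHz
    path = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_cur_freq"
    with open(path) as f:
        return int(f.read()) / 1000.0

# Print per-core clocks once a second; Ctrl+C to stop.
while True:
    freqs = [core_mhz(c) for c in range(os.cpu_count())]
    print(" ".join(f"{m:6.0f}" for m in freqs), f"  max {max(freqs):.0f} MHz")
    time.sleep(1)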
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Noisiv:

AMD's advantage, i.e. the higher number of cores, is its Achilles' heel as well, in the sense that 8 cores @ 4+GHz is not all that easy to do. What's AMD's version of SpeedStep? Ah OK, I remember... XFR... They need to improve on this and let the frequency boost upwards if there is enough TDP/temp room, for example for 1, 2, 3-core jobs. And Intel's SpeedStep does it so gracefully. This stupid 8700K of mine wants to go to 5GHz with me barely touching it. I swear to god, all I did was answer positively to a question in the ASUS BIOS (then I spent an hour trying to dial it back, cos I don't want 6 cores @ 5GHz all the time LOL, NH-D15 @ 250 RPM ftw 😀)
I think that while 5GHz is a good clock, it tells a rather pathetic story, as long as that story starts with Sandy-E 6C/12T having no issues running 4.5GHz, and for some people 4.7GHz, on all cores. Because then the story is about Intel having a chip capable of running at high clocks, and their "big" source of improvement being a change in out-of-the-box clock speeds: slowly moving from the 6C/12T i7-3930K's 3.2~3.8GHz to the i7-8700K's 3.7~4.7GHz by using the clock headroom that was already there.

What really allows you to run all cores at 5GHz is not a magical improvement in the chip design itself; it is a more power-efficient manufacturing process, and with it lower Vdroop, so the cores are stable at a higher clock. It is quite possible that the power consumption of an i7-3930K OCed to its stability limit is almost the same as the power consumption of an i7-8700K OCed to its stability limit.

But AMD's problem here is not the ability to keep all cores at a certain clock: Zen1 simply could not clock a single core above 4.1GHz in most cases, regardless of how few watts and how tiny the Vdroop was. In contrast, even Sandy could do 5GHz on a single core; doing it with 2 cores was the issue, due to Vdroop under load. The question is this: is someone on a similar manufacturing node, in the same factory, able to get a higher clock? Is the limitation due to AMD's design or due to the manufacturing process itself? Going by the 7nm press stuff, it looks like that one should deliver AMD 4.5~5.0GHz. There was no talk about AMD redesigning the chip for higher clocks, but about the manufacturing process allowing it.

Edit: And btw, congrats on a good-clocking chip. Not gonna ask you where it can go, as I think it is already great performance @ 5GHz with all cores.
data/avatar/default/avatar30.webp
@Fox2232 Thanks a lot! I'm still trying to get the hang of it. Good thing this ASUS is idiot-proof. Because basically: http://www.lolhome.com/img_big/cooking-dog.jpg PS: remember that exchange of ours about Arduino, remote switch-on, Wake-on-LAN? This mofo can be WoL'd/booted from shutdown just fine, from S5 to be precise 🙂
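For anyone curious about the Wake-on-LAN half of that: the magic packet is just 6 bytes of 0xFF followed by the NIC's MAC repeated 16 times, sent as a UDP broadcast; a minimal sketch (the MAC below is a placeholder, and the board/NIC still has to have WoL enabled in the BIOS):

import socket

def send_wol(mac, broadcast="255.255.255.255", port=9):
    # Magic packet: 6 bytes of 0xFF followed by the target MAC repeated 16 times.
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

send_wol("aa:bb:cc:dd:ee:ff")  # placeholder MAC, replace with the target NIC's own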
https://forums.guru3d.com/data/avatars/m/270/270718.jpg
So Hilbert, you are in possession of your press kit, eh? Please put these new chips through the wringer and give us the lowdown on memory compatibility and latency improvements. Curious to see which apps/games benefit from that. Looking forward to your review as always...
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Nothing really changed for gamers:
- High latency
- OC... What OC?
- Gaming... can't even review them (bluescreens)
data/avatar/default/avatar25.webp
sammarbella:

Nothing really changed for gamers:
- High latency
- OC... What OC?
- Gaming... can't even review them (bluescreens)
Well, it changes only for... games supporting all of its cores/threads; for those it will be the best CPU out there for the money, by a huge margin. But wait, there is not a single game able to properly utilize more than 8 threads at their full potential. So it is a CPU for dreamers or future-proof stylers.
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
warlord:

Well, it changes only for... games supporting all of its cores/threads; for those it will be the best CPU out there for the money, by a huge margin. But wait, there is not a single game able to properly utilize more than 8 threads at their full potential. So it is a CPU for dreamers or future-proof stylers.
Yes, Ryzen could be the best CPU for gamers IF:
- (insert a giant IF) game devs support the use of ALL cores, balancing the workload between them.
- a miracle magically solves the Infinity Fabric's performance dependency on RAM speed, borked by the subpar (the wicked word) memory controller in Ryzen.
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
sammarbella:

Nothing really changed for gamers:
- High latency
- OC... What OC?
- Gaming... can't even review them (bluescreens)
I completely beg to differ. You people act like if you game on Ryzen, your gaming is gonna be complete trash. That is complete bullshit. My 1500X is better at gaming than my 3930K was: the 1500X at its stock 3.6GHz boost right now vs the 3930K @ 4.1. And guess what, min FPS is better on my 1500X, even in BF1, and that game used all 12 threads of my 3930K. I just have higher CPU usage online in BF1 with the 1500X. I have had my 1500X @ 4.1 but don't have a good cooler. In fact, everyone I have seen with a 1700(X) has said nothing but awesome things about their gaming on them. Yes, most are running 3200MHz RAM, and I have 3000 myself.

It's really funny: everyone would bash AM3+ CPUs for their terrible single-core performance, saying Intel didn't need high clock speeds or K-series CPUs to beat overclocked AM3+ chips. Now Ryzen comes out with fantastic performance, IPC on par with Intel, and even though it can't overclock high it still does very well. "More cores are a problem" is BS. What is BS is the software. It's taken 10 years for games to use more than 4 threads, and it took them a long time to even use more than 2 threads after 2006. AMD has pushed multicore hard, even with lots of backlash and a lack of true support. I'm sorry but not sorry here, but the majority of games use more than 4 threads now. The issue is still DirectX 11 and crappy DX12 ported games. And Vulkan is awesome, period. I wish all games were Vulkan.

I have seen a few of my friends even ditch their Intel fanboy ways and build themselves a Ryzen rig, and some now have AMD GPUs too, and they like them a lot. Ryzen may not be an overclocker's dream, but to crap all over it based on a MHz limit, and then to think it's gonna suck in gaming because not all 16 threads are used, is pretty ignorant. It's not like the IPC is behind a 2010 AM3 CPU the way AM3+ was.
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Agonist:

I completely beg to differ. You people act like if you game on Ryzen, your gaming is gonna be complete trash. That is complete bullshit. My 1500X is better at gaming than my 3930K was.
Your 2017 mid-range AMD CPU is better at gaming than your 2011 mid-range Intel CPU? I would hope so! It only took AMD's mid-range CPU 6 years to beat Intel's 6-year-old CPUs. LOL. Seriously, your opinion is as valid as mine, but try to back it up with cold data. This is the latency benchmark from the 2700X review; the 2700 is even worse than the 1700...: https://elchapuzasinformatico.com/wp-content/uploads/2018/04/AMD-Ryzen-7-2700X-Tests-05.jpg https://elchapuzasinformatico.com/2018/04/amd-ryzen-7-2700x-review
data/avatar/default/avatar30.webp
I beg to differ with these latencies; they are all over the place, both for Intel and for AMD. My Ryzen 2400G with 3200MHz memory has a latency between 69-71ns, so the 2700X should be between 60-65ns; the testing was done badly. Upping the 2400G to 3466MHz lowers it to 59-65ns on my system. I can't go higher with my Corsair LPX because they are only 2400MHz stock modules, but 3600MHz with the 2200G/2400G and the new Pinnacle Ridge should be easy to attain.
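For context on what those ns figures actually are: memory-latency tools basically time a dependent pointer chase through a buffer much bigger than the caches. A crude sketch of the idea (Python's interpreter overhead inflates the absolute number badly, so this only illustrates the method; AIDA64 or a small C loop is what you'd use for real figures):

import random
import time

N = 1 << 22          # ~4M entries, so the index array alone (~32 MB) blows past L3
order = list(range(N))
random.shuffle(order)

# Link the permutation into one big cycle so every load depends on the previous one.
nxt = [0] * N
for i in range(N):
    nxt[order[i]] = order[(i + 1) % N]

hops = 2_000_000
i = 0
t0 = time.perf_counter()
for _ in range(hops):
    i = nxt[i]
t1 = time.perf_counter()
print(f"{(t1 - t0) / hops * 1e9:.1f} ns per dependent access (interpreter overhead included)")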
https://forums.guru3d.com/data/avatars/m/196/196531.jpg
Anyone noticed that he's using the X370 chipset? The CPU might not run too well without its own chipset. They also added a note saying that they could not try any games because they would all crash.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
sammarbella:

Your 2017 mid-range AMD CPU is better at gaming than your 2011 mid-range Intel CPU? I would hope so! It only took AMD's mid-range CPU 6 years to beat Intel's 6-year-old CPUs. LOL. Seriously, your opinion is as valid as mine, but try to back it up with cold data. This is the latency benchmark from the 2700X review; the 2700 is even worse than the 1700...: https://elchapuzasinformatico.com/wp-content/uploads/2018/04/AMD-Ryzen-7-2700X-Tests-05.jpg https://elchapuzasinformatico.com/2018/04/amd-ryzen-7-2700x-review
You take some random incompetent dude's word as a solid basis for an argument? Or do you actually just need any random trash to make a meaningless bash post? Because if you had spent a few milliseconds of your time looking at the other memory performance image, you would notice that they got memory bandwidth matching something like ~2400MHz. So it is just their incompetence in believing they were running the RAM at 3200MHz. Or do you really believe that while a 1700X with 3200MHz memory has ~49GB/s read speed, a Ryzen 2700X with the same 3200MHz memory will have just 36.5GB/s read speed?
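A quick sanity check on those numbers: dual-channel DDR4 peak bandwidth is simply transfer rate x 8 bytes per channel x 2 channels, and 36.5 GB/s sits below even the DDR4-2400 ceiling, while ~49 GB/s is about what you'd expect from DDR4-3200 (a rough sketch; AIDA64-style reads normally land a bit under the theoretical peak):

# Theoretical peak bandwidth for DDR4 with a 64-bit (8-byte) bus per channel.
def ddr4_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000.0  # MT/s * bytes -> GB/s

for speed in (2133, 2400, 2666, 2933, 3200):
    print(f"DDR4-{speed}: {ddr4_peak_gbs(speed):5.1f} GB/s peak")
# DDR4-3200 -> 51.2 GB/s, DDR4-2400 -> 38.4 GB/s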
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
typhon6657:

I have an 8700K but I want to be a popular YouTube gamer and stream for a living. Now I am in a serious pickle because 12 threads is not enough and I believe 16 would be way better than 12.
Do you run all cores at 5.1GHz, or have you set a step-down turbo like 5.1/5.0/4.8/4.7/4.6/4.6? Because if you notice some instability, it may be due to the turbo differing based on the number of loaded cores. In that case just limit the game's FPS to something reasonably good and save CPU cycles, and tune your streaming encoder settings. I do not know which encoder you are using, but you can always go for NVIDIA's encoder to save CPU cycles at the cost of slightly lower image quality (which may not be an issue depending on the resolution you encode at).
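If the streaming encoder is ffmpeg/OBS-style, switching to NVIDIA's encoder really is just a codec swap; a hedged sketch comparing CPU x264 against NVENC on a synthetic 720p60 clip (assumes an ffmpeg build with NVENC support; the output file names are made up for illustration):

import subprocess

# Same synthetic 10-second 720p60 test clip, encoded once on the CPU (libx264)
# and once on the GPU (h264_nvenc); watch CPU usage while each one runs.
src = ["-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=60", "-t", "10"]

subprocess.run(["ffmpeg", "-y", *src, "-c:v", "libx264", "-preset", "veryfast",
                "-b:v", "6M", "cpu_x264.mp4"], check=True)
subprocess.run(["ffmpeg", "-y", *src, "-c:v", "h264_nvenc",
                "-b:v", "6M", "gpu_nvenc.mp4"], check=True)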