Alienware Area 51 Ryzen Threadripper Benchmarks

So basically: for the reasons one might want a multi-core monster, Threadripper is great, especially for the price, and is the one to get compared to Intel. The reviews I'm seeing on places like Newegg for the 7900X seem... interesting. Things like: "Cons: It's too cheap. Other Thoughts: It should be more expensive."
Didn't they get it to work with faster RAM?
I hope the new 1900X also delivers... I want those extra lanes and quad channel. For someone who mostly just plays games, a 1950X would be overkill. My savings are for the 1900X and a 2180Ti to replace the six-year-old Sandy Bridge in my signature 🙂 PS. Great idea, Alienware, making your customers save some money by using 2666MHz memory... GENIUS!
Still waiting to see benchmarks where the Threadripper gets to stretch its legs on a full X399 motherboard and with faster RAM. Also want to see what happens when someone ponies up the money and puts in 1 TB of RAM. Intel is really getting its skirt pulled up around its neck on this one. :funny:
Threadripper "sux" for gaming, ROX for rendering 🙂
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
Looks like AMD is having a hard time beating Intel where it counts.
Sorry, but gaming is a small fragment of what computers are used for.
Threadripper 1950X: £999. i9 7900X: £850. Well, as I'm solely a gamer with no picture/video editing needs, I think I am more than happy to stay with Intel looking at these results. However, time will tell with AMD BIOS/game updates.
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
More like they are close enough that an AMD CPU is a good alternative, and better in other cases.
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
My opinion: what you are seeing in the gaming benchmarks is that the majority of game developers are coding specifically for Intel, so that's why you see that small gap in performance, etc. Things are finally, slowly turning around, since AMD has shown it CAN compete in the mid/high-end gaming segment. Give it a couple of years and things are in fact going to turn around. :)
They tested the CPUs in gaming at 4K???
I was waiting for monsters like this and I'm close to buying a 1920X. Why does no one talk about real multi-tasking, which a rig like this should be built for: leaving 100 tabs open but minimised in Chrome with realtime charts (BTC/ALTs), running some sandboxed VMs (isolated wallets), mining in the background, and playing a game in 4K at the same time.
Threadripper 1950X: £999. i9 7900X: £850. Well, as I'm solely a gamer with no picture/video editing needs, I think I am more than happy to stay with Intel looking at these results. However, time will tell with AMD BIOS/game updates.
Why do these results have anything to do with your decision if you are only a gamer? Who in their right mind even looks at a 16-core CPU and thinks gaming? Seriously, guys, AMD smokes Intel on the workstation stuff for now, and that is all that should matter with these CPUs. AMD has Ryzen for "gamers", or Intel has a 7700K or whatever else that isn't 10+ cores.
They tested the CPUs in gaming at 4K???
Nobody games in 4K, because 60Hz sux :wanker:
I hope the new 1900X also delivers... I want those extra lanes and quad channel. For someone who mostly just plays games, a 1950X would be overkill. My savings are for the 1900X and a 2180Ti to replace the six-year-old Sandy Bridge in my signature 🙂
Keep in mind TR is likely going to have worse latency. You ought to get better minimum frame rates but worse maximum. Unless you're doing 3x Crossfire or SLI, the extra PCIe lanes won't do anything for you vs an AM4 build. I'm guessing you meant 2080Ti?
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
Really, gaming is where it counts? That's all that matters when it comes to buying a CPU? Not only is that objectively wrong, but TR is not by any means a gamer's platform, and I really don't understand why people think otherwise. The fact that you want to be spared the "maybe in the future" excuse is proof enough of that. All that being said, sure, there is nothing wrong with using TR for gaming. It is a great platform if you do both work and leisure, much like Nvidia's Titan series. But anyone who buys TR (or a Titan, while I'm at it) for gaming is, frankly, in my opinion, an idiot, and I don't care who I offend in saying that. Again, people who have significant priorities other than gaming are the exception. To clarify, I don't think TR is a bad gaming platform; it's just a bad choice when gaming is the primary purpose.
I was waiting for monsters like this and I'm close to buying a 1920X. Why does no one talk about real multi-tasking, which a rig like this should be built for: leaving 100 tabs open but minimised in Chrome with realtime charts (BTC/ALTs), running some sandboxed VMs (isolated wallets), mining in the background, and playing a game in 4K at the same time.
In this day and age, I'm a bit surprised there isn't a more well-established multitasking benchmark. Most benches nowadays are strictly for a single process; multi-core CPUs are designed for multi-tasking, not necessarily parallel processing (that's what GPUs are for). But I'd say 100 Chrome tabs wouldn't be much use - Chrome does a pretty good job of idling tabs that aren't being viewed to spare CPU cycles, so 100 tabs simply eats up a lot of RAM.
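As an aside, a minimal sketch of what such a multitasking benchmark could look like (purely illustrative, not an established tool): launch several dissimilar CPU-bound workers at once and report their combined throughput, which captures the "many independent tasks" case better than a single-process bench.

import hashlib
import math
import time
from multiprocessing import Process, Queue

DURATION = 5.0  # seconds each worker runs

def hash_worker(q):
    """Integer/ALU-heavy work: count SHA-256 hashes of a 4 KB buffer."""
    count, data, end = 0, b"x" * 4096, time.time() + DURATION
    while time.time() < end:
        hashlib.sha256(data).digest()
        count += 1
    q.put(("sha256 blocks", count))

def float_worker(q):
    """FPU-heavy work: count sqrt/sin evaluations."""
    count, end = 0, time.time() + DURATION
    while time.time() < end:
        for i in range(10_000):
            math.sqrt(i) * math.sin(i)
        count += 10_000
    q.put(("float ops", count))

def mem_worker(q):
    """Memory-bound work: count copies of a multi-megabyte list."""
    blob = list(range(500_000))
    count, end = 0, time.time() + DURATION
    while time.time() < end:
        _ = blob[:]  # force a full copy
        count += 1
    q.put(("list copies", count))

if __name__ == "__main__":
    q = Queue()
    # A mix of workers running concurrently; scale the multiplier up on a 16C/32T part.
    workers = [Process(target=f, args=(q,))
               for f in (hash_worker, float_worker, mem_worker)
               for _ in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
    results = [q.get() for _ in workers]
    for name, count in sorted(results):
        print(f"{name:>14}: {count}")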
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
Although with Ryzen this has proven to be the case: with every microcode release, memory support and the higher speeds that Ryzen loves keep getting better.
Why do these results have anything to do with your decision if you are only a gamer? Who in their right mind even looks at a 16-core CPU and thinks gaming? Seriously, guys, AMD smokes Intel on the workstation stuff for now, and that is all that should matter with these CPUs. AMD has Ryzen for "gamers", or Intel has a 7700K or whatever else that isn't 10+ cores.
There are still some workloads outside gaming where Intel has the edge, like Photoshop and Office, but that is likely due to a lack of optimization. However, the fact remains that Ryzen is not a slam-dunk better across the board on all workloads outside gaming.
I was waiting for monsters like this and I'm close to buying a 1920X. Why does no one talk about real multi-tasking, which a rig like this should be built for: leaving 100 tabs open but minimised in Chrome with realtime charts (BTC/ALTs), running some sandboxed VMs (isolated wallets), mining in the background, and playing a game in 4K at the same time.
I'm curious why nobody uses grid computing for these chips. An app like BOINC can max out all threads and scales perfectly (it's like Prime95, except you actually compute live data). You don't even need active testers, since you can just start it and let it run, nor do you need any skills to use it (just download, select project(s), and run). Rendering and workstation benches are useful, but some computing results would be good as well.
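For anyone curious how little scripting that takes, here is a rough sketch of a hypothetical helper around the standard boinccmd command-line tool (the project URL, account key, and the exact output text are placeholders/assumptions; flags and output can vary by client version).

import subprocess
import time

PROJECT_URL = "http://setiathome.berkeley.edu/"  # example project URL
ACCOUNT_KEY = "YOUR_ACCOUNT_KEY"                 # placeholder, not a real key

def boinccmd(*args):
    """Run a boinccmd sub-command against the local client and return its stdout."""
    return subprocess.run(["boinccmd", *args],
                          capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    # One-time attach; after this the client fetches and crunches work on its own.
    boinccmd("--project_attach", PROJECT_URL, ACCOUNT_KEY)

    # Poll the client and count tasks the output reports as executing
    # (the "active_task_state: EXECUTING" marker is assumed from typical output).
    for _ in range(10):
        tasks = boinccmd("--get_tasks")
        running = tasks.count("active_task_state: EXECUTING")
        print(f"{running} tasks executing")
        time.sleep(30)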
Looks like AMD is having a hard time beating Intel where it counts (most game benchmarks are proof of that claim), and please, spare me the whole "who knows, maybe in the future" crap; now is all that matters.
Talk about missing the mark. Do you really think people would buy a $999 16-core processor for playing games? HEDT chips are for high-end desktops, not consumer gaming PCs. What matters most for these chips is multi-core performance and productivity.
I'm curious why nobody uses grid computing for these chips. An app like BOINC can max out all threads and scales perfectly (it's like Prime95, except you actually compute live data). You don't even need active testers, since you can just start it and let it run, nor do you need any skills to use it (just download, select project(s), and run). Rendering and workstation benches are useful, but some computing results would be good as well.
Are you saying for testing purposes or in general? I'm sure there are people interested in them for BOINC, but brand-new dedicated BOINC PCs are a very niche market. Otherwise you're right - CPUs like these would be perfect for it, and BOINC would make for a pretty good real-world benchmark. Certain projects like SETI aren't going away any time soon and ought to have pretty consistent results. SETI also has GPU support - not too often do you get to see CPUs mixed in with GPUs on the same graph. On a side note, I really wish BOINC had more GPU-oriented tasks. There are a lot of projects I'd really like to contribute toward, but I have more GPUs at my disposal than CPUs.
Are you saying for testing purposes or in general? I'm sure there are people interested in them for BOINC, but brand-new dedicated BOINC PCs are a very niche market. Otherwise you're right - CPUs like these would be perfect for it, and BOINC would make for a pretty good real-world benchmark. Certain projects like SETI aren't going away any time soon and ought to have pretty consistent results. SETI also has GPU support - not too often do you get to see CPUs mixed in with GPUs on the same graph. On a side note, I really wish BOINC had more GPU-oriented tasks. There are a lot of projects I'd really like to contribute toward, but I have more GPUs at my disposal than CPUs.
It's just one of those tasks that never really gets tested on these chips, and yet is perfectly suited for them. The focus always seems to be on rendering, gaming, and synthetic workloads. I've also seen reviewers trying to do a full-system load test by clumsily running a synthetic CPU test alongside a game benchmark, but it would be much easier to use BOINC for it. They probably just don't know about it, but it would be a much easier way to test these systems, especially since it's a real-world app with real data. FYI, my upcoming Threadripper system will be dedicated to BOINC. I might also use it for some media consumption, but I do not plan on gaming on it at all. If not for this, I would have no interest in TR. I know it's a niche use case, but HEDT itself is a niche market. My issue with BOINC regarding GPUs is the lack of OpenCL support: almost all GPU projects support CUDA, but only a handful support OpenCL, which limits my vendor choices. I really wish this wasn't the case, as I prefer open standards over proprietary ones - it just goes to show the dominance of Nvidia cards, even in computing apps.