The ASUS ROG Dominus Extreme Comes to Life

I suppose that to make full use of this mobo plus the CPU, plus populating the 6-channel DDR4 with 6 of the 12 DIMMs, you start at $5000. You can't call that a consumer platform at all. To finish this system you can reach 9k euro, or even pass the 10k wall with multiple GPUs and other gimmicks.
There's no particular need to give pricing info. Anyone buying a 3000 euro CPU wouldn't really care about the prices of the other components; only the specs would matter.
Everything about this new "lineup" screams desperation from Intel's side. I hope that AMD is able to do on the GPU market what they did on the CPU market. The last couple of years have clearly shown that next to no competition for Intel/nVidia is not a good thing for the end consumer.
"Consumer" in the same way a private jet or multi-story yacht is pretty much, not that it isn't a pretty board mind. πŸ˜›
One of the ugliest high-end mobos I have ever seen. It looks cheap and I don't like the design either.
GIMME.......GIMME.....GIMME......!.!.!.!. Now if AMD can drop Zen 3 with their Threadripper, bringing the same capabilities at half the price, then I'll be all over it like an obese freak on cake with no self-control... "Figured that was more PC than saying it the other way..." Don't want to hurt any feelings here. I'm just sick of this price gouging from Intel. Dangling capabilities in front of their consumers with these kinds of price tags is simply disgusting. I've been waiting five-plus years for a system capable of replacing my Sandy Bridge-E six-core, twelve-thread. BEEN WAITING!.!.!.!.!
Still can't do 4 way x16 GFX as my old X58 board could (and still can, actually) :P
Tuoni:

Everything about this new "lineup" screams desperation from Intel's side. I hope that AMD is able to do on the GPU market what they did on the CPU market. The last couple of years have clearly shown that next to no competition for Intel/nVidia is not a good thing for the end consumer.
Nvidia has kept bringing more to consumers quite consistently. Sure, lots of people disagree with RTX, but it still packs a whole lot of extra into the chip: the ray tracing and tensor cores. Whether they are of much use is a different thing, but they are very concrete nonetheless. If RT/tensor turn out to be no good, it was just a bad bet by Nvidia, not deliberately scamming their own customers. Intel, however, just kept releasing the same old shit year after year. They might have updated something external like the USB version and SATA, or added support for NVMe and lolptane, but that's it. It was always just 2/4 cores max for the mainstream, with exceedingly marginal performance increases, and those were more or less only because they could raise the clocks free of charge for themselves. Consequently, and hilariously, the same security bugs remained for that whole decade, catching Intel with its pants down generations into the past. I wouldn't put Intel and Nvidia in the same sentence.
AMD needs to bring quad-channel memory to the Ryzen lineup. The only issue I have with AMD is memory performance; it's been their weakness for way too long. It's the only thing that makes me miss my 6-core Xeon gaming rig. That thing overclocked to 4.2/4.3 GHz and was running at +/-60 meg on memory reads and writes, on an Asus Rampage IV Formula motherboard, DDR3-2400, X79 board. I still have the system and it still kicks butt. The only problem with this is that at the moment the tech industry seems to be riding the greedy train... let's add 2x M.2 sockets and some coloured lights and charge $200 more than the same product from 8 months ago. I exaggerate, but not by much πŸ˜› 😱 😳 πŸ™„
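For context on the quad- vs dual-channel point above, here is a minimal sketch (editorial, not from the thread) of the standard theoretical peak bandwidth formula, channels × 8 bytes × MT/s. The specific memory configurations below are illustrative assumptions, and real-world read/write benchmarks land below these theoretical peaks.

```python
# Theoretical peak DRAM bandwidth = channels * 8 bytes per transfer * MT/s.
def peak_bandwidth_gbs(channels, mega_transfers_per_sec, bytes_per_transfer=8):
    """Return theoretical peak bandwidth in GB/s (decimal GB)."""
    return channels * bytes_per_transfer * mega_transfers_per_sec / 1000

# Illustrative configurations (assumed for comparison, not taken from the thread):
configs = {
    "X79 quad-channel DDR3-2400": (4, 2400),
    "Ryzen dual-channel DDR4-3200": (2, 3200),
    "Six-channel DDR4-2666 (this class of platform)": (6, 2666),
}

for name, (channels, mts) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(channels, mts):.1f} GB/s theoretical peak")
```

Under these assumptions the quad-channel DDR3-2400 setup already tops 75 GB/s theoretical, roughly 50% above dual-channel DDR4-3200, which is the gap the comment is complaining about.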
BLEH!:

Still can't do 4 way x16 GFX as my old X58 board could (and still can, actually) πŸ˜›
PCIe 3.0 x8 = PCIe 2.0 x16. And if you consider that half of the slots on the new board are capable of PCIe 3.0 x16, it ultimately has 50% more total bandwidth. πŸ˜‰ So it's far stronger than the X58 4-way.
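To put rough numbers behind that 50% claim, here is a minimal sketch (editorial, not from the thread) comparing aggregate slot bandwidth. The x16/x8/x16/x8 lane split across the Dominus slots is an assumption for illustration, and the per-lane figures are the usual post-encoding approximations.

```python
# Approximate per-lane throughput after encoding overhead:
# PCIe 2.0 (8b/10b) ~0.5 GB/s per lane, PCIe 3.0 (128b/130b) ~0.985 GB/s per lane.
PER_LANE_GBS = {"2.0": 0.5, "3.0": 0.985}

def total_gpu_bandwidth(gen, slot_lanes):
    """Sum of slot bandwidths in GB/s for one PCIe generation."""
    return PER_LANE_GBS[gen] * sum(slot_lanes)

x58_4way = total_gpu_bandwidth("2.0", [16, 16, 16, 16])  # old X58 board: 4x x16 at gen 2
dominus  = total_gpu_bandwidth("3.0", [16, 8, 16, 8])    # assumed x16/x8/x16/x8 split of 48 CPU lanes

print(f"X58 4-way PCIe 2.0:  {x58_4way:.1f} GB/s")
print(f"Dominus (assumed):   {dominus:.1f} GB/s")
print(f"Ratio:               {dominus / x58_4way:.2f}x")  # ~1.48x, roughly the quoted +50%
```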
Purple-pinkish "Dominus"? Oh wow. My avatar is "Dominus" too, right? Over 900W power delivery... Makes sense with such a power-hungry CPU. As @warlord wrote, it is some 500W stronger πŸ˜€
warlord:

PCIe 3.0 x8 = PCIe 2.0 x16. And if you consider that half of the slots on the new board are capable of PCIe 3.0 x16, it ultimately has 50% more total bandwidth. πŸ˜‰ So it's far stronger than the X58 4-way.
*TECHNICALLY* yes, but I'm still claiming the moral victory! Not that we're even saturating 3.0 x 8 yet anyway...