
Tech preview: AMD Radeon RX Vega 56 and 64

Posted by: Hilbert Hagedoorn on: 07/31/2017 04:27 AM [ 50 comment(s) ]

In this technology preview we'll take a closer look at the newly announced AMD Radeon RX Vega 56 and 64. AMD will release three models for the consumer market: two air-cooled versions and one liquid-cooled product. Wanna know more? Read all about it here.

Read article


Tagged as: radeon rx vega, amd


waltc3
Senior Member



Posts: 1401
Posted on: 07/31/2017 06:40 PM
I'm sure you knew, but just in case: that's a photo of the character Patrick Bateman from the film American Psycho. It is weird, though!


Yes, the actor is Christian Bale...

waltc3
Senior Member



Posts: 1401
Posted on: 07/31/2017 06:52 PM
You could actually double the bandwidth of HBM by using more stacks, like NVIDIA does on their GV100 GPUs - but of course more stacks cost more money (and GV100 costs a fortune, though it's a "pro" datacenter GPU).

In any case I tend to generally agree. HBM is largely overrated for consumer use. To get an actual bandwidth advantage you would need four stacks, which costs a lot of money, and at two stacks GDDR5X and the upcoming GDDR6 will match or even beat it - at a lower price point at that.
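
To put rough numbers on the stack argument, here is a back-of-the-envelope sketch in Python; the bus widths and per-pin data rates are commonly quoted figures used purely for illustration, not measurements from any specific card:

# Rough peak-bandwidth arithmetic for the stacks-vs-bus-width argument above.
# All figures are commonly quoted per-pin rates, used for illustration only.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# HBM2: 1024-bit interface per stack at ~2.0 Gbps per pin -> ~256 GB/s per stack
hbm2_stack = peak_bandwidth_gbs(1024, 2.0)
print(f"HBM2, 2 stacks (Vega-like):  {2 * hbm2_stack:.0f} GB/s")
print(f"HBM2, 4 stacks (GV100-like): {4 * hbm2_stack:.0f} GB/s")

# GDDR5X on a wide bus: 352-bit at 11 Gbps (GTX 1080 Ti-like configuration)
print(f"GDDR5X, 352-bit @ 11 Gbps:   {peak_bandwidth_gbs(352, 11):.0f} GB/s")

Two stacks land in the same ballpark as a wide GDDR5X bus (roughly 480-510 GB/s), while four stacks roughly double it, which is the crux of the cost argument.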

The big advantage of HBM2 compared to GDDR5 is power consumption. Big nod to HBM2, from what I've read so far. It will also be interesting to see whether the Vega GPU engine can actually utilize all of that bandwidth. I think at this point the power savings alone are reason enough to use it.

From HH's article: "The graphics engineers from AMD claimed that HBM2 will offer you 5x the power efficiency compared to any other graphics memory including GDDR5, and yes that is huge." We shall see...

I don't quite know what to think of all of these bundles right out of the gate, however. AMD is being extremely aggressive this year--I'm wondering if Vega will be powerful enough to do 4K in most cases at a decent frame rate. If so, the bundles ought to lead to massive sales--especially for the $399 non-bundle variant. We shall see... Looking forward to HH's hands-on review!

waltc3
Senior Member



Posts: 1401
Posted on: 07/31/2017 07:16 PM
Sort of - bragging rights over numbers are the top priority. A good gaming experience is distinctly the #2 priority for most people here. I've seen on many forums (here included) people actively and willingly choose higher frame rates over reducing microstutter. People will also willingly pay an extra $150 for an additional 5-15 FPS. Usually nobody even glances at average minimum frame rates.


I agree completely. I sort of hate what 3dfx did in starting the frame-rate craze many years back... but back then we were talking average GLIDE frame rates of 20-30 fps against competitors' cards on less mature APIs that sometimes maxed out as low as ~5 fps, so it made a lot more sense than it does now: it marked the difference between "playable" frame rates and slide shows. Intel's discrete GPUs of the day (the infamous i7xx 3D cards that Intel later dropped because they could not compete with 3dfx/nVidia--I owned two of them) and the ATI Rage Fury 128 (I believe that was the brand name) were on the slide-show end of that divide; I had a very difficult time getting >10 fps out of that card and so returned it. The Matrox Millennium was *the* card to own for 2D, but it sucked at D3D gaming in terms of frame rates and image quality. 3dfx whipped everyone for several years in terms of playable frame rates. Etc.

Today, in a blind test most people couldn't tell the difference between 120 fps and 80 fps (imagine a test where both displays were actually running at 80 fps but the user was asked to point out the one running at 120 fps... ;) plenty of people would find it even though it wasn't there!). Yet we still see benchmark bar charts illustrating differences of ~1 fps between GPUs and CPUs--which is not worth talking about at all. A 1 fps difference simply isn't real and would never be perceived by the end user.
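
A quick way to see why the same fps gap matters far more at the low end is to convert frame rates into per-frame times; a minimal sketch (the rate pairs are arbitrary examples):

# Frame-time arithmetic behind the fps comparisons above: an identical fps gap
# corresponds to very different per-frame time deltas depending on where it sits.

def frame_time_ms(fps):
    return 1000.0 / fps

for hi, lo in [(120, 80), (60, 40), (61, 60)]:
    delta = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} vs {lo} fps: {frame_time_ms(hi):.1f} ms vs "
          f"{frame_time_ms(lo):.1f} ms per frame (delta {delta:.1f} ms)")

At 120 vs 80 fps the delta is about 4 ms per frame, at 60 vs 40 fps it is more than 8 ms, and a ~1 fps gap near 60 fps amounts to only a few tenths of a millisecond.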

Lately, I'm enjoying AMD's new "Enhanced Sync" option for the RX 480 and up, introduced last week in the 17.7.2 WHQL Crimson drivers. The idea is that you run the game with vsync off but set the game's Crimson profile to Enhanced Sync, and the game should then run very close to full vsync-off frame rates with little to no tearing. It's also supposed to help in games whose engines limit the frame rate to 30 fps or less. It's become my new default and seems to work as advertised--smooth gaming with no stutter and almost no tearing--much better than plain vsync off, imo.
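
For what it's worth, here is a minimal Python sketch of the general idea usually described for this kind of sync mode: render uncapped, then show only the newest completed frame at each refresh. It is a conceptual illustration with assumed refresh and render rates, not AMD's actual driver implementation:

# Conceptual sketch only -- NOT AMD's driver code. The game renders uncapped,
# and at each display refresh the newest *completed* frame is scanned out, so
# every displayed frame is whole (no tearing) while rendering never blocks
# waiting for vblank the way classic vsync does.

REFRESH_HZ = 60        # display refresh rate (assumed for illustration)
RENDER_FPS = 200       # uncapped render rate (assumed for illustration)
SIM_SECONDS = 1.0

render_interval = 1.0 / RENDER_FPS
refresh_interval = 1.0 / REFRESH_HZ

latest_complete = -1                 # index of the newest fully rendered frame
shown = []                           # frame index scanned out at each vblank
next_render, next_vblank = render_interval, refresh_interval

while min(next_render, next_vblank) <= SIM_SECONDS:
    if next_render <= next_vblank:   # a frame finishes rendering
        latest_complete += 1
        next_render += render_interval
    else:                            # vblank: present the newest whole frame
        shown.append(latest_complete)
        next_vblank += refresh_interval

print(f"rendered {latest_complete + 1} frames, displayed {len(shown)}")
print("excess frames are dropped rather than torn or queued behind vsync")

When the render rate drops below the refresh rate, this same loop simply shows the last completed frame again at the next vblank.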

Amx85
Senior Member



Posts: 333
Posted on: 07/31/2017 08:21 PM
Only the Vega 56 seems reasonable, and only if it outperforms the GTX 1070. I still don't understand why the Vega 64 draws so much more power for just 10~15% more performance. AMD, please add more ROPs; tessellation isn't everything...

Denial
Senior Member



Posts: 13801
Posted on: 07/31/2017 08:33 PM
I agree completely. I sort of hate what 3dfx did in starting the frame-rate craze many years back... but back then we were talking average GLIDE frame rates of 20-30 fps against competitors' cards on less mature APIs that sometimes maxed out as low as ~5 fps, so it made a lot more sense than it does now: it marked the difference between "playable" frame rates and slide shows. Intel's discrete GPUs of the day (the infamous i7xx 3D cards that Intel later dropped because they could not compete with 3dfx/nVidia--I owned two of them) and the ATI Rage Fury 128 (I believe that was the brand name) were on the slide-show end of that divide; I had a very difficult time getting >10 fps out of that card and so returned it. The Matrox Millennium was *the* card to own for 2D, but it sucked at D3D gaming in terms of frame rates and image quality. 3dfx whipped everyone for several years in terms of playable frame rates. Etc.

Today, in a blind test most people couldn't tell the difference between 120 fps and 80 fps (imagine a test where both displays were actually running at 80 fps but the user was asked to point out the one running at 120 fps... ;) plenty of people would find it even though it wasn't there!). Yet we still see benchmark bar charts illustrating differences of ~1 fps between GPUs and CPUs--which is not worth talking about at all. A 1 fps difference simply isn't real and would never be perceived by the end user.

Lately, I'm enjoying AMD's new "Enhanced Sync" option for the RX 480 and up, introduced last week in the 17.7.2 WHQL Crimson drivers. The idea is that you run the game with vsync off but set the game's Crimson profile to Enhanced Sync, and the game should then run very close to full vsync-off frame rates with little to no tearing. It's also supposed to help in games whose engines limit the frame rate to 30 fps or less. It's become my new default and seems to work as advertised--smooth gaming with no stutter and almost no tearing--much better than plain vsync off, imo.

120 and 80? I could probably tell in a game like CS/Quake, but generally no, not much of a difference - but when a new game comes out and the difference is the same 40% yet between 60 and 40 fps, it becomes immediately apparent.

And honestly, from experience: I went from a 1080 to a 1080 Ti on a QHD monitor and the difference was pretty significant. There are too many games at max settings that sit on the cusp of 60 fps; the 1080 Ti makes it there while the ~30% slower 1080 just doesn't cut it without sacrificing settings.
