
Nvidia Doubling Up Prices on GeForce GTX 2080? Tackling Some Rumors

by Hilbert Hagedoorn on: 03/02/2018 11:29 AM | 54 comment(s)

For the past three or four weeks, there has been a massive amount of chatter about NVIDIA Ampere (GeForce cards) and, what we assume to be, Turing (mining / HPC cards). You have probably read and heard about it a number of times by now. In this post, I want to walk you through some of these rumors, because that is all they are: rumors growing out of proportion.

A few weeks ago, chatter indicated that new NVIDIA GPUs under the name Ampere would be announced at GTC/GDC in March 2018, followed by a launch in April. That rumor has now been debunked and, from the looks of it, does not seem to be true; that, or Nvidia has delayed the announcement. Considering the March announcement was based on nothing but speculation, nobody really knows whether these GPUs even exist.

Three weeks ago, Reuters suddenly mentioned an NVIDIA GPU called Turing and tagged it as a gaming card, which rather contradicts the name: Turing would be better suited to AI and HPC products (think of the Turing test for artificial intelligence; a Turing machine is a hypothetical machine conceived by the mathematician Alan Turing in 1936). And today a new rumor surfaced on the web: Nvidia would be doubling prices on the GeForce GTX 2080 to benefit from miners purchasing the cards anyway. The source? There isn't any ... it's gossip, chatter, and rumors from the usual websites.

What do we know thus far?

Well really, nobody knows anything really - all GPU names are based on speculation and are kept alive by small bits of information explaining or denying things. The cards would be released in March; then all of a sudden they wouldn't. It was Ampere; all of a sudden it's Turing. There was even a specific launch date mentioned, April 12, 2018, again based on nothing substantial. Let me also remind you that neither Ampere nor Turing has ever surfaced on any of Nvidia's roadmaps. To date, Nvidia has not talked about them.

So are these GPUs for real?

Well really, nobody knows anything really - logic dictates that Nvidia will release new products this year, but that is merely an assumption. With the current state of the graphics card / GPU industry, there really isn't a need for it. Performance-wise the stack is filled, and they are in the lead. NVIDIA's GPUs, no matter what they sell, will sell out anyway. So it is far more revenue-effective for them to keep pushing Pascal, as it is a good-yield GPU series to fab, and thus cost-effective. However, this goes against the nature of Nvidia as a company. We've seen it so often in the past: Nvidia lives and thrives on innovation, which has earned them the reputation they have these days. With that in mind, it would make much sense for them to release a GPU dedicated to mining (and here I'll mention Turing) and separate gaming GPUs from cryptocurrency GPUs. Nvidia could potentially optimize their drivers for dedicated mining cards, getting better hash rates with them, and IMHO that could help settle down the price-inflated gaming graphics card market, allowing gamers to purchase gaming graphics cards at more normal prices while miners would get the best results with their own dedicated GPUs. But yeah, this is speculation of the highest form, as nobody really knows if the aforementioned GPUs are real.

Nvidia doubling prices for GeForce GTX 2080?

Well really, nobody knows anything really - today yet another rumor surfaced on the web: Nvidia would be jacking up the price of the new 2000-series flagship card. One website mentioned it would be 50% faster (uh huh? based on what exactly?), and another website then assumed that a 50% increase in performance automatically means a doubled price, i.e. NVIDIA would be doubling the price of the GeForce GTX 2080, also in an effort to benefit from the mining demand craze and boost Nvidia's revenues. If there even is such a card in the works, rest assured that Nvidia isn't ready to talk about prices. They never did so in the past prior to any release, and they never will. Nvidia announces prices to the media a day prior to release, and to its add-in-card and board partners merely a few days before that. So that rumor is based on exactly that: a rumor. Would it make sense for Nvidia to do this? Sure, and since it sounds like common sense, the rumor sounds plausible, but it is just that, a rumor, and nobody can actually know this aside from Nvidia themselves. I'll leave it at that.

Where did the Ampere name originate from?

This name traces back to the German website 3DCenter and is nothing other than speculation based on a few observations: Nvidia would be readying a GPU series called Ampere, not Volta. The name "Ampere" has been mentioned a couple of times in the past already; in fact, we wrote a few items about that GPU name constantly popping up. It is said that Nvidia has halted production of the GP102 (the 1080 Ti / Titan X) and likely the GP104 (the 1070/1080), all originally released back in 2016. So the question arises: what is happening with Volta GPUs? Where are they?

So if it is Ampere, why not Volta?

Currently, Volta is based on HBM2 memory, which is not available in large enough volume and is expensive to purchase and implement. It is also an architecture with Tensor cores, which are costly and unlikely to benefit gaming. So why produce high-end consumer graphics cards, or better yet, GPUs with Tensor cores? Then there are the recent announcements on GDDR6 graphics memory. Earlier on we reported that GDDR6 was closing in fast and has actually already been announced as available; it is already possible to fab graphics cards with the blazingly fast new GDDR6 memory. So my thesis here is that Volta will remain on track for the HPC / enterprise / data-center side of things with its Tensor cores, while the Ampere GPU series would be used for the consumer / gaming parts. And hey, think about it: the specs of the Titan V and the Nvidia Volta architecture just don't make that much sense for gaming, with 640 Tensor cores that games are never going to use.

Ampere: a new architecture or a Pascal refresh?

Well really, nobody knows anything really - in short, the "Ampere" GPU series could be aimed at consumers rather than the "deep learning" market, and it is assumed the cards are based on GDDR6 graphics memory. Ampere as an architecture never made it into Nvidia's long-term roadmaps, so logic dictates that Ampere would be based on a refreshed Pascal architecture, perhaps fabbed on a smaller node with an optimized fabrication process. Basically, Nvidia would release the GA104 first, and from there we can speculate. A speculative chart, based on nothing but assumptions and the rumor (!), would look something like this (courtesy of 3DCenter):

   

GPU   | Market segment | Release                    | Names                      | Performance aim
GA104 | High-end       | April 2018                 | GeForce GTX 2070 & 2080    | 2070 ~ GeForce GTX 1080 Ti
GA106 | Mid-range      | Late summer / fall 2018    | GeForce GTX 2060           | 2060 ~ GeForce GTX 1080
GA102 | Enthusiast     | End of 2018 to spring 2019 | GeForce GTX 2080 Ti        | 2080 Ti ~ GeForce GTX 1080 Ti + 70-80%
GA107 | Mainstream     | Spring / summer 2019       | GeForce GTX 2050 & 2050 Ti | 2050 Ti ~ GeForce GTX 1060
GA108 | Entry-level    | Spring / summer 2019       | GeForce GT 2030            | 2030 ~ GeForce GTX 1050


 

Who's your daddy?

NVIDIA is your daddy; thus far nothing has been confirmed or denied. NVIDIA did, well, nothing: they shared no information, yet these GPU names surfaced with so much speculation continuously being posted on the web that it is getting out of proportion. Nvidia is laughing really hard here, as they receive a ton of media coverage while they have yet to release a single shred of information. The honest truth is that nobody actually knows when or what is being released, at what price, or whether the information that continuously surfaces is even factually correct.

Well really, nobody knows anything really - as far as this editorial goes, nothing can be confirmed, nor denied.












Andrew LB
Senior Member



Posts: 1238
Joined: 2012-05-22

#5525136 Posted on: 03/04/2018 12:41 AM
Doubling prices over the previous generation based on roughly 75% increased performance doesn't make any sense other than for purposes of greed and monopolization.

Let's ignore actual performance gains and various other differences and simply use MHz in the following example.

1993 Intel Pentium @ 66MHz cost around $964 initially.
2006 Intel Core Solo T1400 @ 1830MHz cost around $290 initially.

So (1830/66) * $964 = $26,729. Adjust for inflation (~40%) and we get $37,420.

So if we go by this ridiculous model, Intel gave everyone a $37,130 discount on the 2006 Core Solo T1400 at that time.
It would also follow that today's latest CPUs would be going for around the same price as a brand new Porsche 911 Turbo.
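
(A minimal Python sketch of the naive per-MHz scaling worked above, for illustration only; the figures are the ones quoted in the post, and the variable names are made up.)

    # Naive "price scales linearly with clock speed" model from the post above.
    # Illustrative only; prices and the ~40% inflation figure are as quoted.
    pentium_price_1993 = 964.0    # 66 MHz Pentium launch price (USD)
    pentium_mhz = 66.0
    core_solo_mhz = 1830.0        # 2006 Core Solo T1400
    inflation_1993_to_2006 = 1.40

    naive_price = (core_solo_mhz / pentium_mhz) * pentium_price_1993
    print(f"Naive per-MHz price: ${naive_price:,.0f}")  # ~$26,729
    print(f"Inflation-adjusted:  ${naive_price * inflation_1993_to_2006:,.0f}")  # ~$37,420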

Comparing a high-end desktop CPU to a low-power, budget CPU for netbooks? A much better comparison is Intel's Xeon L3014, a 2.4GHz Wolfdale CPU from 2008. I think it initially cost around $800.

It's crazy how much bang for your buck you get these days with computers. I remember as a kid how much my father spent on our first Macintosh or our Apple IIGS. Even my first PC, an Intel 486 DX2 at 66MHz, cost far more than any system I've built today (not including inflation), and back then they didn't have fancy video cards.

Speaking of video cards, since 2000 their prices have stayed relatively flat when adjusted for inflation. All this crying about how greedy nVidia and Intel are, and how they **** consumers, is just stupid. The only reason the most recent cards deviate from the norm is factors not under nVidia's control, namely retailers hiking prices.

http://i.imgur.com/SE3TNqZ.png

fantaskarsef
Senior Member



Posts: 14278
Joined: 2014-07-21

#5525406 Posted on: 03/05/2018 08:42 AM
Hmm, Hilbert not responding about shipping manifests; I guess he knows something that he's under NDA not to tell us...
Do not respond to let us know ;) :D

TheDeeGee
Senior Member



Posts: 8620
Joined: 2010-08-28

#5525852 Posted on: 03/06/2018 06:01 PM
They really gotta find a way to block mining on gaming cards.

If not then mid-range cards will start at $1000.

alanm
Senior Member



Posts: 11501
Joined: 2004-05-10

#5525870 Posted on: 03/06/2018 06:43 PM
They really gotta find a way to block mining on gaming cards.

It's basically like telling them to find a way to make less money. But if it continues for a long time, they risk shrinking the PC gaming market. Tough poison to pick.

Problem is, the only way to satisfy both gamers and miners is to drastically increase GPU production and inventory. That goes against common business sense, since mining can go bust at any time. No one wants to be stuck with unsaleable inventory.




