Guru3D.com



Nvidia Profits Tripled In Q4 2016

by Hilbert Hagedoorn on: 02/10/2017 02:55 PM | source: | 72 comment(s)

Nvidia is happy alright. The graphics chip (or should we say parallel computing) company increased its net profit to $655 million (614 million euros), roughly triple what it earned in the same quarter a year earlier.

The news comes via the company's figures for the fourth financial quarter of 2016. Revenue grew to $2.17 billion, an increase of 55 percent compared with a year earlier. Revenue from Nvidia's data center division, which makes chips for customers including Amazon, Microsoft, and Alibaba, also tripled, summing to $926 million.
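As a back-of-the-envelope check, the stated growth rates pin down the implied year-ago figures (a sketch; the prior-year numbers below are derived from the percentages in the article, not quoted in it):

```python
# Implied year-ago figures from the stated growth rates (illustrative only).
profit_q4_2016 = 655e6      # USD, stated net profit
revenue_q4_2016 = 2.17e9    # USD, stated revenue

profit_q4_2015 = profit_q4_2016 / 3          # "tripled" => roughly a third
revenue_q4_2015 = revenue_q4_2016 / 1.55     # +55 percent year over year

print(f"Implied year-ago profit:  ${profit_q4_2015 / 1e6:.0f}M")
print(f"Implied year-ago revenue: ${revenue_q4_2015 / 1e9:.2f}B")
```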

Nvidia speaks of record figures. "Our GPU platform is increasingly being used for artificial intelligence, cloud computing, gaming and self-driving vehicles," said Nvidia founder Jen-Hsun Huang.

The company reached all of its goals for the quarter in gaming, visualization, data centers, and self-driving vehicles. Exact figures for some divisions have not been shared.












xIcarus
Senior Member



Posts: 989
Joined: 2010-08-24

#5392650 Posted on: 02/14/2017 11:13 AM
Not randomly. An entry-level gaming card is one that meets the minimum requirements of the most popular game titles. The nVidia x50 range contains the entry cards for modern gaming; the nVidia x40 and x30 ranges are for old games. nVidia entry cards (release date - official recommended price, 2017 market price):

nVidia GTS 450 (2010 - $129, 2014 - $65, 2017 - unavailable)
nVidia GTX 750 Ti (2013 - $149, 2014/2017 - $80)
nVidia GTX 950 ($159, 2017 – $127)
nVidia GTX 1050 ($139, 2017 - $109)

Prices from www.videocardbenchmark.net (Amazon, Newegg)
Important note: the GTX 950 has similar performance to the 750 Ti. Have you noticed that the market price of the GTX 950 is much higher? One possibility is that nVidia is trying to draw down inventory and control production of new cards in order to reduce supply and keep market prices of new cards high. Which is expected, considering that only 2 major players are left on the gaming GPU market.
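The depreciation claim behind the list above can be computed directly from the quoted prices (an illustrative sketch using only the figures cited in this post):

```python
# Price retention for the entry-level cards quoted above (launch -> 2017 market).
cards = {
    "GTX 750 Ti": (149, 80),
    "GTX 950":    (159, 127),
    "GTX 1050":   (139, 109),
}

for name, (launch, market_2017) in cards.items():
    retained = 100 * market_2017 / launch
    print(f"{name}: retains {retained:.0f}% of launch price")
```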



Makes sense if you are using Russian rubles or Zimbabwean dollars. The information above shows that it has little to do with inflation (see the note).



The argument would make more sense if triple-A titles used Intel GPUs for minimum requirements. Intel GPUs run older games, or newer games on the lowest possible settings, fine.



Don't be smug, unless you've made sure the facts are on your side and your opponent can't back his arguments with logic and evidence :).

Okay, maybe you didn't deserve such a brash response - but you didn't think it through before posting. Even this post, which I'm replying to, is based on false premises.

Firstly, for the love of god DO NOT use that website for comparing GPUs. It's very inaccurate. Instead, use this: http://hwbench.com/vgas/geforce-gtx-750-ti-vs-geforce-gtx-950
And as we can see, the 950 is 22% faster than the 750 Ti in gaming. This automatically dismantles your first argument because considering that both cards were being produced at the same time, it makes sense for the 950 to be more expensive.

And I disagree that the x30/x40 is for old games at min details. HD Graphics is significantly slower than a 460, for example (the 460 is more than twice as fast), and you can even play modern games at 720p with those things. Sh!ttily, but you can.
Thus there is still a market for the x30/x40. Just because they don't get the spotlight doesn't mean they completely suck. They're really fine for modern games in 720p. That's the definition of entry level.

Inflation applies to every country out there, not just Russia or Zimbabwe, I don't know what you're getting at. Maybe I didn't catch your point.

And 'minimum requirements' frequently use very old GPUs. If you extrapolate towards iGPUs, you'll see that many modern games will run on them.
Sure, some AAA titles like ROTR will run like cock but others aren't so demanding.

EJocys
Senior Member



Posts: 138
Joined: 2003-07-08

#5392982 Posted on: 02/14/2017 09:53 PM
but you didn't think it through before posting. Even this post, which I'm replying to, is based on false premises.


I was basing my first comment on my past experience buying cards, and then I backed it up with real numbers, so it can't be called "false premises".

Firstly, for the love of god DO NOT use that website for comparing GPUs. It's very inaccurate.


I've used a well-known site that has collected real data from more than 800,000 video cards. Why, for the fictional god's love, should I believe your site, which does not clearly say where it gets its data? :).

And as we can see, the 950 is 22% faster than the 750 Ti in gaming.


Yes, you are right here and I was wrong about performance. PassMark scores are 41% higher on the 950.

This automatically dismantles your first argument because considering that both cards were being produced at the same time, it makes sense for the 950 to be more expensive.


Not enough to dismantle. The GTX 750 was released in 2013. One year later its price had halved. The GTX 950 was released in 2015, but its price has dropped very little. There are a few possible explanations:
a) The manufacturing cost of the GTX 750 is very low, so its price could easily drop by half.
b) nVidia managed to take control of inventory and distribution, so it could restrict supply in order to keep the price/profits of the GTX 950 up. This works because there are not enough GPU manufacturers left to compete properly.
c) Something else...

The first two explanations are normal business practice.

And I disagree with x30/x40 is for old games at min details... Sh!ttily, but you can.


In 2013, 47% of monitors had a resolution of 1920x1080 or higher, and we are talking about PC games on gaming cards. x30/x40 cards are for 2-4 year old games at low settings. "Sh!ttily, but you can" is an accurate slogan for these cards and modern games, especially ones badly ported from consoles.

Thus there is still a market for the x30/x40.


Agree. Sometimes I use an nVidia GT 740 to play Battlefield 4 on lowest settings at 1680x1050 at work :).

They're really fine for modern games in 720p.


The PC master race is not fine with 720p :).

Inflation applies to every country out there, not just Russia or Zimbabwe, I don't know what you're getting at.


Inflation of the US dollar or euro is too low to explain big price differences over a span of 2-4 years, especially when prices vary by ±30%. Unlike the Zimbabwean dollar, which can be used to explain any price hike by the next morning.
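For scale, compounding a typical USD inflation rate over the 2-4 year windows discussed here yields only single-digit drift (a sketch; the ~2% annual rate is an assumption for illustration, not a figure from the thread):

```python
# Cumulative price drift from an assumed 2%/year USD inflation rate.
annual_rate = 0.02  # assumption: typical recent USD inflation

for years in (2, 3, 4):
    cumulative = (1 + annual_rate) ** years - 1
    print(f"{years} years: {cumulative * 100:.1f}% cumulative inflation")
```

Even at 4 years this stays under 10%, well short of the ±30% swings being debated.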

schmidtbag
Senior Member



Posts: 7260
Joined: 2012-11-10

#5392987 Posted on: 02/14/2017 10:05 PM
Agree. Sometimes, I use nVidia GT 740 to ... at work

Isn't that statement alone all you need to pay attention to? It is widely accepted nowadays that the GTX 1050 and the RX 460 are entry level for gaming, both of which are drastically better than a 740. As stated by xIcarus, IGPs are now considered the entry-level GPUs in general (so, not even gaming specific). By today's standards, anything below a 1050 or 460 is not meant for gaming. Their market is things like low-power home and office PCs, media centers, and healthy upgrades to outdated PCs. They're also good for office-based OpenCL tasks (like spreadsheet calculations).

xIcarus
Senior Member



Posts: 989
Joined: 2010-08-24

#5393230 Posted on: 02/15/2017 11:09 AM

What real numbers? You screwed up your argument by using videocardbenchmark which is known to be unreliable, while hwbench is the de-facto standard for comparing CPUs and GPUs.
I'm serious, this is not about 'why should you trust what I post'. Hwbench is pretty much universally accepted as one of the best GPU and CPU comparison web applications today. I warmly invite you to use it.

Since your performance comparison was inaccurate, your price comparison is largely invalid by extension.

The 750 Ti had its price cut because it was 'included' in the 900 lineup, so to speak. The 750 Ti was being produced alongside the 950 as if they were the same generation of cards (they kind of were; the 750 Ti is Maxwell).
The 750 Ti was the exception, not the rule. The 950 was completely replaced by the 1050, and thus the 950:
a) didn't get a price cut
b) its production ceased
If you look back through time, what happened to the 950 happens almost all the time. For example the 770 didn't drop in price just because the 970 appeared. It was simply replaced by the 970 - 770 production ceased at that time.
There are also cases like the 290x which got its price cut because of competition.

I feel what you're saying about resolution, however I wasn't referring to the monitor's resolution but the ingame resolution. Sometimes, with a sh!tty enough card, you are forced to drop it lower.
A good amount of people have 4k monitors or TVs, that doesn't mean they all rock GTX1080s to properly play at that resolution. People with x40 cards are usually casual gamers who emphasize web browsing or watching movies and the likes.
Same goes for 1080p. It's the standard today, it doesn't mean everyone can game on it.

And you pretty much proved my point by saying you sometimes game on a 740 at work.

About the inflation talk, I finally understood what you were referring to. I thought you were talking about it long term, like since the 6600GT days or something like that.
Short-term price increase usually happens either when they get new tech into their cards, tech that has consumed a lot of R&D budget, or when they switch to an expensive or immature fab node.
As we all know, these are the first generation of cards on a 16/14 nm process. I'll admit I'm not sure about this, but I recall companies like Samsung saying that 14nm didn't bring a massive decrease in transistor cost. Something about yield rates? This might explain why Nvidia has jacked up its prices this generation. And this also makes me wonder what margins AMD is getting from their cards.
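The yield-rate effect speculated about above can be illustrated with the standard Poisson die-yield model (a sketch; the defect densities and die area below are made-up numbers for illustration, not actual fab data):

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Fraction of good dies under the simple Poisson yield model:
    yield = exp(-D0 * A), where D0 is defect density and A is die area."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Hypothetical comparison: a mature node vs. an immature one, same die size.
die_area = 3.0  # cm^2, assumed
for label, d0 in (("mature node (D0=0.1/cm^2)", 0.1),
                  ("immature node (D0=0.5/cm^2)", 0.5)):
    print(f"{label}: {poisson_yield(d0, die_area):.0%} good dies")
```

Under this toy model, a worse defect density on a fresh node sharply cuts the share of sellable dies per wafer, which would raise per-chip cost even if per-transistor wafer cost stayed flat.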

Does not explain why the 950 was so expensive though.

EJocys
Senior Member



Posts: 138
Joined: 2003-07-08

#5393407 Posted on: 02/15/2017 06:49 PM
What real numbers? You screwed up your argument by using videocardbenchmark which is known to be unreliable


Officially, Hwbench.com is owned by LUIS AGUSTIN CIALCETA (LUISG2249@GMAIL.COM) in TRINIDAD, Uruguay, was registered in 2015, and receives 7 times fewer users than PassMark's videocardbenchmark.net. PassMark is a very well-known and reputable software developer whose products have been familiar to professionals for 20 years now.

while hwbench is the de-facto standard for comparing CPUs and GPUs


In Trinidad?

Since your performance comparison was inaccurate, your price comparison is largely invalid by extension.


No. It just shows that you did not get what I wrote.

The 750Ti had its price cut because it was 'included' in the 900 lineup so-to-speak.


It is not so simple.

b) its production ceased


Now you are getting somewhere. nVidia licenses production of its cards to third-party manufacturers and probably forces production-end dates on them to limit supply. Don't forget that the main question to answer is: "Why do entry gaming cards no longer drop 50% in price within 2 years, as happened with the GTX 750?"

And you pretty much proved my point by saying you sometimes game on a 740 at work.


Not exactly. It runs a 4-year-old game fine, at a sub-standard resolution on lowest settings. That card is incapable of running any modern game properly, like the latest two CoDs, and ruins the gaming experience.





Guru3D.com © 2023