AMD to position the Radeon R7 370X against GeForce GTX 950

Isn't the GTX 950 going to line up more with the 360 in price, performance and market terms? I mean, they're two completely different types of cards: the 950 is more for compact, lower-power and lower-end builds, whereas the 370X is more of a lower-end midrange card (i.e. more like the 960).
We'll probably see full Tonga GPU and 380X as well.
> We'll probably see full Tonga GPU and 380X as well.
Well, a full Tonga would be the 380X, as the 380 is a Tonga in disguise, so I think you're mentioning the same GPU twice. That'd be an awesome card, though.
> Well full tonga would be the 380x as the 380 is a tonga in disguise, so i think you're mentioning the same gpu twice. That'd be an awesome card though.
That's what I meant: first the 370X, then the Tonga XT 380X as well. 😉
It should be able to beat the 950 in everything but power use, though some people are really picky about power use for some reason.
> Should be able to beat the 950 in everything but power use but some people are really picky on power use for some reason.
It costs money. I don't see any other reason to be.
Ah, the long life of Pitcairn.
> Should be able to beat the 950 in everything but power use but some people are really picky on power use for some reason.
Electricity prices increased by about 60% here last year... that makes you think twice about power use.
> Should be able to beat the 950 in everything but power use but some people are really picky on power use for some reason.
In Portugal, VAT on electricity was raised from 6% to 23%. That makes you think about power use.
At the price range of these GPUs, I think the price of a new power supply, as well as some folks' comfort installing one, is an issue too. These would easily be used as upgrades in pre-built computers.
Since the R7 370X is equivalent to the R9 270X, it will blow the 950 out of the water performance-wise, and if the price is the same or lower, the choice will be a no-brainer. As far as power costs go, they haven't changed here in the last 20 years: it's still 8.5 cents per kWh.
the power consumption comments are idiotic..
> the power consumption comments are idiotic..
To you maybe, but not to people who have to pay for power. It does matter. Even assuming the desktop rig idles 100% of the year, the cost difference between a 980 and a 290X is about $30 USD a year; multiply that by the two or three years an average person keeps such a high-end card and you get nearly $90 of ownership cost for a 290X at IDLE alone. Say you game one hour a day: the 290X then costs $29 more at idle and another $5 for the load, making it $34 more expensive to own per year. That's completely discounting the much better performance per watt you get from the GTX 980 to begin with. So yes, electric costs matter in the final decision. And I'm being EXTREMELY conservative with my numbers; it gets much worse if you game more or use the "uber" mode (add another third to the cost). (All wattage figures were taken from Anandtech and multiplied by the $0.23 per kWh used by Hilbert.)
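The arithmetic behind that comment is easy to sanity-check yourself. A quick sketch (the 15 W delta and the hours are illustrative assumptions; $0.23/kWh is the figure cited above):

```python
def annual_energy_cost(delta_watts, hours_per_day, price_per_kwh):
    """Extra yearly cost of a card drawing `delta_watts` more,
    running `hours_per_day` hours a day at `price_per_kwh` per kWh."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Illustrative: a 15 W difference, left on 24 h/day, at $0.23/kWh
# works out to roughly $30 a year, in line with the figure quoted.
print(round(annual_energy_cost(15, 24, 0.23), 2))
```

Plug in your own wattage delta, usage hours, and local tariff to see whether the difference matters for your build.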
I think his point is that most gaming rigs have a 500W or higher PSU so buying a slower card makes no sense when you have the power to run a faster card of a similar price.
> To you maybe, not to people who have to pay for power. It does matter. [...]
Yeah, people with €2k setups, or me with my €6.7k setup, suddenly became worried about the power bill? Give me a break; power consumption means nothing in the ultra high end.
R7 370X = GCN 1.0, DirectX 11.2a; DX12 in software only. GTX 950 = Maxwell 2, DirectX 12.1.
> Yeah ppl with 2k euro setups or me with my 6,7 k euro setup suddenly became worried about the power bill?? [...]
In addition to that, you have to factor in the extra cost of cooling the rig and the room as well. I live in a tropical climate, and with a 970 and an OCed 5820K I'm already paying $75/month for electricity. If I swapped to the equivalent 290X, I'd probably pay well over $100 for electricity and air conditioning. That's at least $25/month saved, as well as less discomfort and noise while gaming. And we're not talking about the ultra high end anyway; since when is an x50 from NVIDIA ultra high end? :tool:
> To you maybe, not to people who have to pay for power. It does matter. [...]
The system idle difference between a 290X and a 980 is 2-4 watts depending on the source, from what I gather. In games that changes a lot, for sure; there you could see that $30-a-year difference gaming around 4 hours a day, 5 days a week, if the card reached peak load on both the 980 and the 290X, which it rarely does when gaming at 1200p with a 60 fps limit. My card doesn't even have an "uber" mode; that's pretty much a reference-card thing only. Comparing the 980 and the 290X is funny anyway: the 780 Ti would be the proper comparison to the 290X, and there the difference is small enough not to matter. When the Nano comes out we'll have a reference point from AMD vs the 980, I guess.
> R7 370X = GCN1.0, Direct 11.2a; DX12 in software only. GTX 950 = Maxwell 2, DirectX 12.1.
Foo, DX12 is software itself. Plus the entire hype around the API is based on the low-level driver improvements of Mantle, and on Microsoft shills dropping "200% more this" at every chance.
> Yeah ppl with 2k euro setups or me with my 6,7 k euro setup suddenly became worried about the power bill?? [...]
We are talking about budget and mid-range GPUs, not high-end... high-end buyers, of course, care little about power usage.