COLORFUL Announces iGame GeForce RTX 4080 16GB Advanced and Ultra W Graphics Cards

https://forums.guru3d.com/data/avatars/m/198/198862.jpg
TheDeeGee:

A 320 watt xx70, not sure which world you're living in.
3070ti was a 300w gpu.
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Undying:

3070ti was a 300w gpu.
That's a Ti though.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
TheDeeGee:

A 320 watt xx70, not sure which world you're living in.
The world where it was confirmed and nGreedia do whatever they want.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Neo Cyrus:

Daily reminder that the "4080 16GB" is a 4070 and it was confirmed that was what nGreedia were going to name it originally. The galaxy of a difference between the "4080" and the "4090" has a reason.
No, it's not.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Astyanax:

No, it's not.
Yes it was. See, I can Astyanax you right back. Being serious, what would he have to gain by lying about that?
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Neo Cyrus:

Yes it was. See, I can Astyanax you right back. Being serious, what would he have to gain by lying about that?
nvidia has never intended to call their AD103 part a 4070
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Astyanax:

nvidia has never intended to call their AD103 part a 4070
Their last 103 chip was the 3060 Ti... and you're not really giving a reason why you believe that. Again, what would he have to gain from lying?
https://forums.guru3d.com/data/avatars/m/282/282657.jpg
tunejunky:

lipstick on a pig
Agreed, rofl, can't stop rofl, really can't stop it... rofl...
[attached image: liponpig.png]
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Neo Cyrus:

Their last 103 chip was the 3060 Ti... and you're not really giving a reason why you believe that. Again, what would he have to gain from lying?
Only to get rid of stockpiled GPUs, and it's still cut down to the same specs as the 104 variant. GA103 was the chip the desktop 3080 was originally aimed at, but NVIDIA opted for more shaders and a denser memory interface, putting it on the 102 instead. It ended up being used as the notebook 3080 part.
https://forums.guru3d.com/data/avatars/m/282/282657.jpg
Neo Cyrus:

Yes it was. See, I can Astyanax you right back. Being serious, what would he have to gain by lying about that?
Wasn't he killed during the fall of "Nvi..." oops, Troy? I wanted to say... lol.
https://forums.guru3d.com/data/avatars/m/272/272100.jpg
schmidtbag:

The only time a 4080 would make sense (over a 4090) is if you needed something physically smaller. Well, this thing ain't really smaller than a 4090, so it really is a pointless product.
How about the power consumption?
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Skylinestar:

How about the power consumption?
But is that honestly a driving factor when deciding? Is it not better to have a more powerful card you can dial down? As an example, there are several games I play where I can run my 6900 XT at way lower clocks, and therefore lower power consumption, but if I need to I can push them back up for more demanding games.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Skylinestar:

How about the power consumption?
pegasus1:

But is that honestly a driving factor when deciding? Is it not better to have a more powerful card you can dial down? As an example, there are several games I play where I can run my 6900 XT at way lower clocks, and therefore lower power consumption, but if I need to I can push them back up for more demanding games.
It's an important factor because cards use huge amounts of energy, and that has some disadvantages, of course.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
H83:

It's an important factor because cards use huge amounts of energy, and that has some disadvantages, of course.
Agreed, but you don't have to run them at full power with every game all the time. As an example, I'm playing COD WW2 right now and at 144 Hz it's only pulling 180 W, because I've got the core and VRAM dialled down; if it's Metro EE, I run it at full speed, which is over 300 W.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
pegasus1:

Agreed, but you don't have to run them at full power with every game all the time. As an example, I'm playing COD WW2 right now and at 144 Hz it's only pulling 180 W, because I've got the core and VRAM dialled down; if it's Metro EE, I run it at full speed, which is over 300 W.
True.
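
The per-game dialing described above can also be done with a simple power cap, no core or VRAM clock tuning needed. A minimal sketch, assuming an NVIDIA card such as the 4080 this thread is about, nvidia-smi on the PATH, and elevated privileges for the set call; the wattages and GPU index are purely illustrative:

# Cap a GPU's board power per game via nvidia-smi instead of touching clocks.
# Assumes an NVIDIA card, nvidia-smi on PATH, and admin/root rights to set the
# limit; 250 W / 320 W below are only illustrative values.
import subprocess

def query_power(gpu: int = 0) -> str:
    """Return the current draw and limit, e.g. '181.2 W, 320.00 W'."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw,power.limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

def set_power_limit(watts: int, gpu: int = 0) -> None:
    """Set the software power cap (needs elevated privileges)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    set_power_limit(250)   # lighter cap for an older, high-fps title
    print(query_power())   # confirm the new limit before launching the game
    # set_power_limit(320) # restore the stock limit for a heavier game

The cap works by letting the card boost less, so a light game that would otherwise sit above 300 W simply runs at lower clocks on its own.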
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Skylinestar:

How about the power consumption?
Heat output and power consumption are directly correlated; essentially all of the power a GPU draws ends up as heat the cooler has to dump. If you need a 3-slot heatsink, that suggests it's going to consume at least 400 W of power. If you've got a GPU drawing that much power, chances are power consumption isn't that big of a deal to you.
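
To see that correlation on a specific card, the draw and temperature can be polled side by side. A minimal sketch, again assuming an NVIDIA GPU with nvidia-smi available; the fields and one-second interval are just illustrative:

# Poll power draw, temperature and load once a second via nvidia-smi to watch
# how heat tracks power under different games or power caps.
import subprocess, time

FIELDS = "power.draw,temperature.gpu,utilization.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    print(out.stdout.strip())   # e.g. "305.4 W, 71, 98 %"
    time.sleep(1)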