Greetings and salutations earthlings, welcome to yet another new
NVIDIA product review. It's been discussed widely ever since... hmm,
what was it, February? Today NVIDIA is launching its GeForce 8800
Ultra. Now, NVIDIA tried to keep this product as secret as can be. Why?
Two reasons: first, to prevent technical specifications from leaking onto the
web; secondly, obviously, to be able to change specs at the last minute. See, ATI is
releasing its R600 graphics card soon and the Ultra is the product
that NVIDIA prepared to counteract it in the market, an allergic
reaction to the R600 so to speak.
It's fair to say that the leaked R600 info you have seen has some
validity to it (yes, we'll have an article up soon), and yes, obviously
NVIDIA corporate is scratching their heads right now asking what the
heck happened with the R600.
Anyway, welcome to the review. We're not going extremely in-depth
today, as despite the rumors of a GX2-based 8800 (which were false), the
8800 Ultra is a respin product. This means it's technically similar to
the original GeForce 8800 GTX. So no 196 shader cores or whatever
the Inquirer figured it would be. No my friends, we have exactly the
same stuff, yet a respin means the core is clocked faster, the memory is
faster, and the 128 shader processors are clocked faster.
Pricing. Initially NVIDIA set this product at a 999 USD price point,
and honestly, I think my pants dropped when I heard that the
first time. In the latest presentation the Ultra was priced at 829 USD.
And hear me now, good citizens: I'm changing the price myself and will
say it'll be finalized at 699 USD/EUR. That is still a truckload of
money, and way too much money just to play games, but hey; this is the
high-end game. Which means completely insane prices, yet quite a number
of you guys will buy it anyway. And hey, you know what? I can't blame
you for being a hardcore gamer.
So what can we expect from the GeForce 8800 Ultra? I stated it
already: higher core, memory and shader frequencies (I really prefer to
call them shady frequencies), thus an accumulated amount of additional
performance and good thermals. Man, look at that new cooler! And all
that at 175 Watts maximum, as in this silicon revision NVIDIA claims to
have some architectural advantages that got wattage down. So in my
opinion that would mean a slightly lower core voltage (dynamic power
scales roughly with voltage squared times frequency, so even a small
voltage drop counts) or a better cooled product. Yes, my gurus, a better
cooled product equals less power consumption.
Over the next few pages we'll quickly go through the technical
specs; we'll skip the in-depth DX10 part, as honestly you should just read
it in our reference reviews. We'll look at heat and power consumption, give the
card a good run for its money with a plethora of up-to-date games and
then we'll try and torch the bugger in a tweaking session where we'll
overclock the shiznit (Ed: I'm banning you from ever using that word again, Hilbert) out of it...