AMD to Unleash Some Fury?
Isbre
AMD's marketing department fairly recently screwed up by bringing back the FX brand, giving me false hopes that something legendary would return. I hope that's NOT the case this time around.
CronoGraal
ATI Rage!
Deathchild
Noisiv
Volcanic Islands, Krakatoa, Fiji, Fukushima, Hiroshima...
It finally dawned on them that setting themselves up for another round of African village electricity jokes is not the best way to kick-start a new GPU.
http://abload.de/img/23_13_22hvkii.gif
icedman
After the announcement that the entire 300 series would be rebrands, I figured they would go for a fancy naming scheme like Titan. Hopefully it's not a joke like Titan as well.
Cave Waverider
I hope a Titan-like naming scheme won't cause them to use Titan-like overpricing...
Fox2232
http://www.tomshardware.com/reviews/ati-rage-fury-pro-review,133-6.html
You know, some of us actually had those. I had a Voodoo3 2000 16MB, and I was using an All-in-Wonder 128 Pro some time later. My cousin had an Xpert card of that generation.
And then I got a TNT2 M64 32MB. That loony boy from Mars up the page apparently forgot to mention he paired his ATi card with a 386.
Because while the ATi Rage 128 was quite a bit weaker, it was not delivering 10 fps in the games of that time, and it was soon replaced by the Rage 128 Pro.
One thing that was apparent back then was image quality. The Voodoo3 and TNT2 produced textures that would remind you of a whore's face in the rain: makeup melted all over, nothing sharp.
That's where ATi and Matrox were the clear winners. I played mostly at 800x600/1024x768, and at those resolutions ATi delivered better image quality per frame. But hey, I had to learn the hard way with those 3dfx/nVidia GPUs.
There is no question the TNT2 with its 32MB lasted me a long time, since it had a bit higher performance, and it got me used to poor image quality. With it, it hardly mattered whether a 3D scene was rendered at 640x480 or 1024x768; the textures blurred everything the same way.
Here is some comparison:
Fox2232
---TK---
B0GiE-uk-
Why has no one picked up on this yet:
"monitor connectivity is HDMI and DP, no more DVI"
NO MORE DVI ?????
Also doesn't HDMI cause lag?
I cannot understand why they are dropping DVI, as it is still used by 99% of gamers.
How many people have a DP monitor? I know I don't. Which means buying a new monitor if I want to upgrade to the fastest AMD card.
How much money is that going to cost?
Megabiv
Dch48
I like the name. My first graphics card was the Ati Rage Fury 32mb version.
The card looks good, except I don't like the removal of the DVI connection. If I use HDMI, I then have to plug my speakers into the back of my monitor, and they just don't sound as good as when they're plugged directly into the sound card output on the PC. The picture isn't any better over HDMI either.
Humanoid_1
@Dch48
I find I can enable both HDMI and my Sound Blaster X-Fi Gamer card at the same time (in Volume Control Options, check both output devices). No sync issues.
Neo Cyrus
PhazeDelta1
Rich_Guy
Fender178
IcE
PhazeDelta1
Athlonite
I wonder about the pic that's in the story for this. It's a Boris Vallejo painting; did AMD ask permission to use it, or did they just steal it?
http://fantasy.mrugala.net/Boris%20Vallejo%201992-1995/Boris%20Vallejo%201992%20-%2004.jpg