Matrox C900 Graphics Card with Nine DisplayPorts

There was a time when Matrox's video cards were considered among the best 2D video cards.
There was a time when Matrox's video cards were considered among the best 2D video cards.
Not just one of the best but the best. Used to be Matrox > 3dfx > ATi > NVIDIA back in the day. Anyway, as for the C900 itself, any word on if it's still AMD's Cape Verde based like their previous couple cards, or have they upgraded to newer GCN?
Moderator
Not just one of the best but the best. Used to be Matrox > 3dfx > ATi > NVIDIA back in the day. Anyway, as for the C900 itself, any word on if it's still AMD's Cape Verde based like their previous couple cards, or have they upgraded to newer GCN?
I guess it has to be. It has full DX12 support with 4GB of memory. I have the C680 with six LG 31" LED displays for work, and it runs all my reports amazingly! I can see them all with no performance hit!
Didn't Matrox invent bump mapping back in the day?
Not just one of the best but the best. Used to be Matrox > 3dfx > ATi > NVIDIA back in the day.
Yeah, back then, at the beginning of the shift to 3D, people were wondering whether to get one of the new but possibly dodgy 3D cards or a rock-solid Matrox.
Not just one of the best but the best. Used to be Matrox > 3dfx > ATi > NVIDIA back in the day.
https://commons.wikimedia.org/wiki/Graphics_card Not entirely true. As I remember it, before ATI and NVIDIA came 3dfx with their 3D accelerator, which was not a video card and worked in combination with a 2D video card. Back then we used 2D video cards from Matrox, S3, Trident, and Tseng Labs. Later came video cards that combined 2D and 3D, and ATI and NVIDIA produced such cards. Matrox was still the best in terms of 2D picture quality (and supported monitor modes).
Not just one of the best but the best. Used to be Matrox > 3dfx > ATi > NVIDIA back in the day. Anyway, as for the C900 itself, any word on if it's still AMD's Cape Verde based like their previous couple cards, or have they upgraded to newer GCN?
When you see Crysis 3's intro, where Prophet gives a backdrop on the story, and replace every time he says "CELL" with "NVIDIA", it all makes f*cking sense. https://www.youtube.com/watch?v=I6RyUWgZb64
Now Matrox sounds very familiar 🙂 I got into the PC world shortly before the Voodoo 3 was released, and they were amazing. I paired mine with an old PC my sister handed down to me, with a massive 32MB of RAM and an AMD K5 at 100MHz, which was so, so slow. But I loved my V3 2000 PCI, and it worked a lot better in my P3 450MHz. Oh, the good old days when going from 64MB of SDRAM-100 to 128MB made a huge difference. Anyway, as I remember, the Voodoo 3 did a good job as an accelerator for Diablo 2.
After my first Voodoo2 cards I upgraded to a Matrox card. It came bundled with a game called "Evolva". Some time later a patch was made available that added bump mapping. I still have the game and play it from time to time.
I guess it has to be. It has full DX12 support with 4GB of memory. I have the C680 with six LG 31" LED displays for work, and it runs all my reports amazingly! I can see them all with no performance hit!
Cape Verde, like all GCN chips, supports "full DX12"; Matrox doesn't mention the D3D feature level.
Didn't Matrox invent bump mapping back in the day?
It was just EMBM, not bump mapping in general, and it was actually developed by BitBoys, even though Matrox was first to bring it to market.
https://commons.wikimedia.org/wiki/Graphics_card Not entirely true. As I remember it, before ATI and NVIDIA came 3dfx with their 3D accelerator, which was not a video card and worked in combination with a 2D video card. Back then we used 2D video cards from Matrox, S3, Trident, and Tseng Labs. Later came video cards that combined 2D and 3D, and ATI and NVIDIA produced such cards. Matrox was still the best in terms of 2D picture quality (and supported monitor modes).
Should have been more specific; I was referring to the G200/400, V3/4/5, Rage/Radeon, TNT/2/GF era, not the first 3D adapters ever.
I used to work for Matrox.
Says it uses an AMD GPU on the manufacturer's site.
Yes, yes... that was a dedicated 3D processor, right? You had your PCI Matrox Millennium for 2D and then the Matrox m3D for Quake.
Yes, that was it; I forgot about the Millennium part of the setup 🙂 I think I got an NVIDIA Riva TNT2 Ultra after that.
Says it uses an AMD GPU on the manufacturer's site.
Yes, that's never been in question; they've used AMD GCN GPUs for a while now. The bigger question is whether it's still Cape Verde or they've switched to some other GCN GPU.
When you see Crysis 3's intro, where Prophet gives a backdrop on the story, and replace every time he says "CELL" with "NVIDIA", it all makes f*cking sense. https://www.youtube.com/watch?v=I6RyUWgZb64
WOW! I will never see Crysis 3 the same way again! lol!
Matrox Millennium (II), my first circa-£200 graphics card... Cards seemed to stay around the £200 mark for some time after that, though I had to drop back a little from the bleeding edge to maintain it. Sadly those days have ended, and even one step (or two) from the bleeding edge (plus a few months or so) now has an entry ticket price of £309, which is what I paid for my current card. Heh, even my first PC, with that Matrox Millennium and a P75 chip, cost me over £2,000.