Review: MSI GeForce RTX 4090 Suprim Liquid X

data/avatar/default/avatar39.webp
NoLikes:

Too bad it comes in 100 watts over my budget, and there's no default BIOS that falls within my spec.
Several reviews point out that this GPU is actually way more efficient than the previous gen (Bitwit, I think? Not sure, I watched like 20 of them). In some cases the 4090 gives +60% fps for 27% fewer watts, and in most games it uses around 330 W. As seen on der8auer, you can also power-limit it to lose maybe 10 fps while cutting the wattage drastically, kind of like the 7950X, which still does almost 38,000 in Cinebench R23 at 200 W while greatly reducing the temps.

I'd like to remind everyone that in Borderlands 3 I went from 120 fps on a 1080 Ti at 310 W to 220+ fps at 380 W on a 3090 (real in-game fps in open spaces, not the nerfed benchmark). It should have been 600 W if no improvement had been made in two generations. The 4090 I'll see in 1-2 months; personally I'm hoping for 400 fps with the frame rate unlocked. Why? Because I can, just like I download random trash since I got fiber, lol.

By the way, when I complain about "reviews", here's a good example: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/5.html. How did they manage to get only 96 fps at 1080p with a 4090? I had 100+. EDIT: nvm, I had 89 fps, but still, that was with my 9900K and a 1080 Ti on the highest preset: an older CPU and a GPU three generations older. This is what I mean when I say "reviews are nice, but some are made by people who just don't know what they are doing." The game also ran fine on my 5950X in DX12. EDIT: I stopped benchmarking DX11, but with a 3090 I had 159 fps. There's no reason to use DX11: you're willingly hurting your CPU score, and with it your GPU performance, on a PCIe bandwidth test. Not a good idea. He should also have tried this in-game with the fps unlocked, as I did, to see real bandwidth usage; there's one dungeon where I had 350 fps at 1080p maxed settings, and boy did the card get hot, hotter than at 60 fps in 4K for sure. EDIT: 3DMark also has an integrated PCIe bandwidth test; that's how I noticed that using the 5.0 M.2 slot on the Asus Z690 Extreme halved my bandwidth, giving me x8 4.0 instead of x8 5.0 "or" x16 4.0.
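As a quick sanity check of the figures quoted above (all numbers are the poster's own anecdotes, not official measurements), the "should have been 600 W" remark is just linear scaling of the older card's performance per watt, and the "+60% fps for 27% fewer watts" claim folds into a single efficiency factor:

# Rough check of the efficiency figures quoted in the post above.
# All inputs are the poster's numbers, used here purely for illustration.
fps_1080ti, watts_1080ti = 120, 310     # Borderlands 3 on a 1080 Ti
fps_3090,   watts_3090   = 220, 380     # same scene on a 3090

# Power the 3090 would have needed at the 1080 Ti's perf/W:
naive_watts = watts_1080ti * (fps_3090 / fps_1080ti)
print(f"no-improvement power estimate: {naive_watts:.0f} W")    # ~568 W, roughly the quoted 600 W

# Perf/W gain actually implied by the anecdote:
gain_3090 = (fps_3090 / watts_3090) / (fps_1080ti / watts_1080ti)
print(f"implied 3090 perf/W gain: {gain_3090:.2f}x")            # ~1.50x

# The "+60% fps for 27% fewer watts" 4090 claim as one factor:
gain_4090 = 1.60 / (1 - 0.27)
print(f"implied 4090 perf/W gain: {gain_4090:.2f}x")            # ~2.19x

So the two-generation "no improvement" extrapolation lands at roughly 570 W, which the poster rounds up to 600 W.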
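On the PCIe-bandwidth point at the end of the post: besides 3DMark's bandwidth test, the link the card actually negotiated can be read from the driver. Below is a minimal sketch using nvidia-smi's --query-gpu interface (field names as on recent NVIDIA drivers; verify the exact names against nvidia-smi --help-query-gpu), which helps spot the x16-to-x8 drop described above when a Gen5 M.2 slot shares lanes with the GPU. The same tool can also apply the kind of power limit mentioned earlier (nvidia-smi -pl <watts>, administrator rights required, within the range the vBIOS allows).

import subprocess

# Read the GPU's current vs. maximum PCIe link configuration via nvidia-smi.
# Note: the link *generation* often drops at idle for power saving; the link
# *width* is the signal that matters for the lane-sharing issue described above.
fields = "pcie.link.gen.current,pcie.link.width.current,pcie.link.gen.max,pcie.link.width.max"
out = subprocess.run(
    ["nvidia-smi", "-i", "0", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

gen_cur, width_cur, gen_max, width_max = [v.strip() for v in out.split(",")]
print(f"current link: Gen{gen_cur} x{width_cur} (max: Gen{gen_max} x{width_max})")
if width_cur != width_max:
    print("link width below maximum -- check M.2 population / lane sharing in the board manual")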
https://forums.guru3d.com/data/avatars/m/287/287596.jpg
The 100 watt differential puts the nominal power-supply requirement into another class on the energy scale. Also, software power-saving controls and any intermediate measurements taken by the end user don't change the default, as-shipped wattage requirement for the configuration that the manufacturer warrants and certifies.
https://forums.guru3d.com/data/avatars/m/201/201182.jpg
Over time, once enough liquid evaporates to cause a significant loss in cooling ability, is there a way to top it up?