Download GPU-Z 0.6.5


Seems to not show my clocks correctly on my 680 :/
Thank you for posting and hosting. Something is not right with the size of the GUI for me: http://www.freeimagehosting.net/newuploads/2txj8.jpg The one on the right is a beta/modded 0.6.4 version that can dump my 670 BIOS without problems. I also notice that the newer version (the one on the left) reads my GPU's ASIC quality as much lower. All the tabs on the new one display the same as in the image I posted, chopped off.
Where is that ASIC Quality option in GPU-Z please? I've never noticed it before and what does it actually mean?
ASIC quality is a somewhat reliable measure of the chip's electrical leakage (it is calculated by integrated circuitry, so it's not the most precise, but it's built in so that AMD/Nvidia can decide the application of each chip without resorting to a more expensive binning process). Basically, a low ASIC can take much more voltage, granting a higher max OC ceiling on conventional chips, while a higher ASIC permits a higher overclock for a given voltage, making it possible to reach the same clocks at lower temps (as a general rule of thumb). (Taken from another forum.)
Thanks spajdrik. So GPU-Z has always had this ASIC Quality option then and the higher the percentage, the better the GPU is for overclocking? How do you access this option please as I'm curious about my own GTX 680 cards? Thanks.
Clock speeds are showing all wrong for me as well.
My ASIC is 75.3% with boost enabled or disabled, at stock or OCed.
91.9% ASIC quality with 0.6.4, 70.0% with 0.6.5 :(
Thank you for posting and hosting. Something is not right with the size of the GUI for me: http://www.freeimagehosting.net/newuploads/2txj8.jpg The one on the right is a beta/modded 0.6.4 version that can dump my 670 BIOS without problems. I also notice that the newer version (the one on the left) reads my GPU's ASIC quality as much lower. All the tabs on the new one display the same as in the image I posted, chopped off.
If it's reading a lower ASIC, it means it's more accurate. I had the same thing on my GTX 570: GPU-Z 0.58 first showed 100.3%, and later versions showed the correct 87.6%. Edit: clocks are OK here at stock, but if I OC to, say, 855 MHz it shows 940 MHz (http://i.imgur.com/cYq2Y.png), and at 900 MHz it shows 1030 MHz xD (http://i.imgur.com/WUYuL.png)
I found the ASIC quality option and I get 87.6% with GPU-Z v0.6.5. Is that good? *EDIT* I get 100.0% with v0.6.4. :3eyes:
I found the ASIC quality option and I get 87.6% with GPU-Z v0.6.5. Is that good? *EDIT* I get 100.0% with v0.6.4. :3eyes:
With v0.6.4 I get 98.9%, and as I posted above, 75.3% with this version. I believe your ASIC is actually very good.
Hello, I know this is not the best place to ask, but I won't make another thread just for a quick question. Why do GPU-Z and the Nvidia control panel say my GTX 680 has 4 GB of VRAM available? It is an EVGA GTX 680 2 GB, not 4 O.O Could someone clarify this for me? Thanks
It would make sense for GPU-Z to find the ASIC quality lower on a GTX 670 compared to a GTX 680, since manufacturers would use a lower-bin chip, with a cluster disabled, for the 670.
GPU-Z's dev changes the scaling of the "ASIC quality" measure from time to time so it doesn't reach 120% or other weird values. I don't know exactly how it's measured, but I think GPU-Z reads some raw values from the GPU and then does some math to present them as a percentage, so what 100% actually means is defined by W1zzard; sometimes he raises the bounds a little if some GPU series is getting higher-than-normal ASIC quality values (he did it once when the feature first appeared). It's like reading the read/write error count from a SMART report and saying "well, if it's around 200 then it's at 100% health", and once some HDD series comes out with around 50 read/write errors, the program sets 50 as the "new" 100% health instead of reporting 400% health.
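That rescaling would explain the version-to-version drops reported in this thread. As a rough illustration only (the real GPU-Z formula is not public; the function name, raw reading, and bounds below are invented for the sketch), the idea might look like this:

```python
# Hypothetical sketch of the renormalization described above. The real
# GPU-Z math is unknown; all numbers here are made up for illustration.

def asic_quality_percent(raw_leakage, best_known_raw, worst_known_raw):
    """Map a raw leakage reading onto a 0-100% scale.

    Lower leakage -> higher "quality". The bounds are what the tool's
    author would recalibrate when a new GPU series falls outside them,
    so the same raw reading can show a different percentage in a
    different program version.
    """
    span = worst_known_raw - best_known_raw
    pct = 100.0 * (worst_known_raw - raw_leakage) / span
    return max(0.0, min(100.0, pct))  # clamp so it never reads e.g. 120%

# Same chip, two calibrations: widening the bounds lowers the displayed
# value, much like the 0.6.4 -> 0.6.5 drops posted in this thread.
old = asic_quality_percent(raw_leakage=40, best_known_raw=30, worst_known_raw=130)
new = asic_quality_percent(raw_leakage=40, best_known_raw=20, worst_known_raw=140)
print(old, new)  # the same chip scores lower under the wider calibration
```

The clamp also matches the "100.3% in old versions" reports: without it, a chip better than the assumed best bound would read above 100%.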
GPU clock is displayed incorrectly in version 0.6.5 with an OCed GPU: 1052 MHz instead of 911 MHz for my GTX 570.