Logitech G602 Wireless Gaming Mouse
XeoNoX
Where, or on what site, did you guys see the polling rate specs listed for the G602?
tsunami231
I do like that mouse, but I think I'm done with wireless (I have an MX 1100). It's not the latency that bothers me, since I really don't see any. It's the fact that the mouse is no farther than a foot from the receiver and it still randomly gets all jumpy on screen, like it's 10+ feet away.
I just need to find a mouse with similar button locations, the same number of buttons, and a tilt wheel with customizable buttons. I'm going optical too, because I just realized what that negative-acceleration issue is with laser sensors. It annoys me, but nowhere near as much as those random jumpy mouse movements.
Ven0m
http://gaming.logitech.com/en-us/product/g602-wireless-gaming-mouse
also
This mouse looks really interesting if you're not digging high DPI.
EJocys
MasterBash
I mean, low DPI does not cause any video problems or whatever; the only thing that can happen is pixel walking. I wasn't questioning your calculations, but I didn't look at them. Not everyone needs to do a 360 or has a small mousepad. I got a QcK+ btw.
Many games at sensitivity 1 (or default) use 1:1 scaling. It causes no problems with hitting small targets, no matter your DPI. Sure, with low sensitivity you can have a problem with pixel walking, but not pixel skipping (which I guess is what you mean by pixel jumping). The opposite is true: higher-than-native DPI may create a noise problem and cause a ripple effect, which will be a lot less accurate.
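The walking-vs-skipping distinction can be sketched numerically. This is a hypothetical illustration; the integer-truncation model and the function name are my own, not taken from any particular game engine:

```python
# Hypothetical model: cumulative mouse counts are scaled by in-game
# sensitivity and truncated to integer screen pixels.

def counts_to_pixels(counts, sensitivity):
    """Map cumulative raw counts to on-screen pixel positions."""
    return [int(c * sensitivity) for c in counts]

reports = list(range(6))  # six successive cumulative counts: 0..5

# Sensitivity 2: consecutive reports land two pixels apart, so odd
# pixels can never be hit ("pixel skipping").
print(counts_to_pixels(reports, 2.0))   # [0, 2, 4, 6, 8, 10]

# Sensitivity 0.5: pairs of reports land on the same pixel, so the
# cursor "walks" in place before advancing ("pixel walking").
print(counts_to_pixels(reports, 0.5))   # [0, 0, 1, 1, 2, 2]
```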
The G700 suffers from positive acceleration; it uses the same sensor as the defective G500. The CPI stepping does not mean the sensor will be perfectly fine; it's merely there to reduce calculation errors. There is no way to fit an 8200-native-DPI sensor into a mouse.
It's a myth that higher DPI = more accurate. DPI doesn't affect accuracy; in fact it can have bad effects when you use a DPI far from the sensor's native resolution.
Edit: I read your calculations, and no, it would be 180 degrees.
Mouse dots per 360° = 400 DPI * 10.8" = 4320 dots.
That's total mouse counts (I guess you could call it Counts per 10.8", lol), not mouse counts per 360 degrees. If I move 4320 counts, that means I move half of 360 degrees (180). Your mouse counts tell you how many pixels on the screen you will move, depending on your movement on your mousepad. 400 DPI = 400 pixels on the screen for every inch you move your mouse on your mousepad. That's it; hence the 4320 over 10.8". DPI does NOT change accuracy, it's simply mouse sensitivity. Don't worry, you don't move at 1/4 of a pixel.
If DPI were accuracy, everyone would be using max DPI, but that's not the case. Change it and you will see: it makes your mouse faster or slower, and it has nothing to do with accuracy.
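For what it's worth, the arithmetic being traded here can be written out directly. The numbers are the ones from the posts above, and the "one swipe = 180 degrees" reading is the correction stated here, not something I measured:

```python
DPI = 400          # counts per inch
PAD_INCHES = 10.8  # one full swipe across the pad

# Raw counts generated by one full swipe (~4320).
counts_per_swipe = DPI * PAD_INCHES

# Per the correction above, 4320 counts turn you 180 degrees,
# so a full 360 takes two swipes (~8640 counts).
counts_per_360 = 2 * counts_per_swipe

print(counts_per_swipe, counts_per_360)
```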
EJocys
MasterBash
If you put your sensitivity to 2, then yes. Your calculations did not imply that, though. Like I said, you can't call that "Counts per 360 degrees" (as the name implies, that would be the number of counts required for a full 360, which isn't even close to what you calculated...). All you did was calculate the number of counts over your 10.8" mousepad.
3200 DPI with 0.25 sensitivity will simply be a 360. So again, only 8640 counts will be reported. I don't remember having my mouse plugged into my monitor, but I do remember my OS/games deciding what to output. In fact, high DPI can actually cause ripple, which may or may not make things worse. Add to that a sensor that has positive acceleration...
It's exactly the same as using 800 DPI and a sensitivity of 1. In fact, all you are doing is taking the mouse's reports and letting the software scale them down, which is worse than keeping a 1:1 ratio, due to potentially having more errors. Even better would be to use raw input, but that would leave you with 4 times the sensitivity you already have (you'd have to lower DPI to compensate).
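The equivalence being argued here is just a product. A hedged sketch (the function name is mine): effective cursor speed is DPI times in-game sensitivity, so 3200 x 0.25 and 800 x 1 move the cursor identically; the difference is only how many raw reports the software has to scale back down:

```python
def effective_dpi(dpi, in_game_sens):
    """Counts per inch of hand travel after in-game scaling."""
    return dpi * in_game_sens

# Same cursor movement per inch of hand travel:
assert effective_dpi(800, 1.0) == effective_dpi(3200, 0.25) == 800.0

# But the 3200 DPI setup generates 4x the raw reports, all of which
# the software must scale down (carrying any sensor noise with them):
raw_ratio = 3200 / 800
print(raw_ratio)  # 4.0
```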
And you are wrong, DPI does not increase accuracy; it's a myth. Let it die already. It's only sensitivity. Not everything you read on the internet is true. I know people who have no idea how a mouse works will believe a higher DPI number is more accurate. Not the case at ALL.
Don't tell others DPI increases accuracy, please; people who have no clue may believe that misinformation. That will only create even more people who are in the wrong.
DPI = dots per inch. 400 DPI = 400 pixels on the screen for every inch on your mousepad. It has NOTHING to do with accuracy at all. Sensitivity only. Even the name implies that. I have no idea where this myth of DPI = accuracy comes from, but whoever started it doesn't know much.
I guess some people are falling for marketing bull****; they believe a bigger number is better. How DPI works clearly shows that it increases only sensitivity, not accuracy. Like I said, even the name implies it increases sensitivity.
Ven0m
Regarding accuracy, it also matters how you convert mouse counts into degrees.
Counts define granularity: with 4000 DPI vs 800 DPI, every inch of mouse movement is divided into 4000 or 800 segments.
Then it's all down to the developers. If they use proper floating-point calculations, then running 4000 DPI with 1/5 of the in-game sensitivity results in 5x greater position-report density. If the developers opted for some weird calculations, or store the data at too coarse a granularity, going beyond some point in DPI will not gain you anything.
So whether we're talking about counts per 360°, or any other nominal value, assuming one can adjust sensitivity properly, you can just as well measure the number of counts (individual reports) over a given mouse travel distance, since it maps linearly to in-game angle at a given sensitivity.
Let's say the devs have done everything to achieve the greatest precision possible. In that case, maximizing device DPI and decreasing in-game sensitivity will result in better accuracy, as it increases the in-game angular resolution in shooters. However, there will be little to no difference for a gamer who moves their mouse a lot. Let's say you need 12" for a full 360° turn. That gives 30° per inch. With 800 DPI you're getting a granularity of 0.0375° per mouse report (count). With 4000 DPI, it's 0.0075° per count. These are the smallest angles you can rotate by. 4000 gives you better precision, but does it matter at all? Let's apply some math:
arctan(n) ~= n for small values of n if the angle is in radians
Let's calculate the **** out of it:
360°/800ppi/12" = 0.0375°/count ~= 0.000654 rad/count
So, assuming you play a shooter with perfect angle and mouse-movement calculations, when aiming from 1000 meters each count will move the aim point by 0.65 meters. For 4000 DPI it would be 13 cm per count. Effectively, that is the minimum angular distance by which you can move your crosshair, given this mouse DPI and 12" for a full circle.
If you want a rough formula for the minimum target movement in meters, assuming you are aiming from 1 km, it's roughly:
6283 / (Mouse DPI * mouse distance for full circle in inches)
For people outside the US:
15960 / (Mouse DPI * mouse distance for full circle in centimeters)
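These numbers check out; here is the same arithmetic as a small script (the function names are mine, and it uses the small-angle approximation from the post):

```python
import math

def degrees_per_count(dpi, inches_per_360=12.0):
    """Smallest in-game rotation one mouse count can produce."""
    return 360.0 / (dpi * inches_per_360)

def target_shift_m(dpi, inches_per_360=12.0, range_m=1000.0):
    """Crosshair movement at the given range for one count,
    using tan(x) ~= x for small x in radians."""
    return range_m * math.radians(degrees_per_count(dpi, inches_per_360))

print(degrees_per_count(800))    # 0.0375 degrees per count
print(target_shift_m(800))       # ~0.65 m per count at 1 km
print(target_shift_m(4000))      # ~0.13 m per count at 1 km

# The rough formula from the post agrees:
print(6283 / (800 * 12))         # ~0.654
```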
EJocys
MasterBash
0.25 in-game sensitivity is still scaling, which requires you to have a higher DPI for the same sensitivity as 1 in-game. I never mentioned the Windows pointer sensitivity, as I will always use 1:1. You get more reports, yes, but that does not make it more accurate. Look for example at Valve games (CS and so on). I have no idea exactly how mouse-movement reporting is coded there. Set your DPI to 800 with 1 in-game and you will get the same effect as 3200 with 0.25 in-game, except that with the latter you have a higher margin of error.
And also, I am pretty sure I was one of the best CS players in the world back then, playing at the CPL and other competitive tournaments, so I do know about accuracy.
Like I said, there are many problems with high DPI, such as ripple. So you get huge inaccuracy.
1. Mouse polling rate, that's true. USB polling will typically be 1 ms, so yes, you will get more reports from the mouse. But here is the problem: you force your mouse sensor to generate a lot of noise and a much higher margin of error when calculating the end result, so you may actually end up a few pixels away from your target.
2. True. A 120 Hz monitor will help make what you see as you drag the mouse (not the mouse itself) more accurate, though.
3. The Windows mouse slider is another scaling step, which increases the margin of error. This is why it should stay at default.
4. Never played any of those games. I do know Valve games use a ratio, though. If you use 800 DPI with 1 and 3200 DPI with 0.25, it will be exactly the same, but with 3200 DPI having more noise.
Like I said, DPI does not increase accuracy. You are simply looking at it in terms of reports, which is a bad thing to do, because sensors are not designed to run higher or lower than their native DPI without issues. It will always create a problem. A mouse does not have a real 3200-native-DPI sensor in it, so you WILL have errors, no matter how good your mouse is. It just depends on how far above or below the native resolution your DPI setting is. That's the biggest challenge when it comes to making sensors.
Native DPI will always be best, same with a non-scaling sensitivity. (Note: I don't know which games use which kind of reporting, whether it's pixel angles or snapping to the closest pixel; I'm guessing BF is the latter.) More reports =/= more accurate, especially if they're unreliable. Also, with the G700 having positive acceleration, you will be even LESS accurate.
EDIT: I get it from reading your posts: you are thinking precision, not accuracy. You are getting more precision at the cost of accuracy, which is totally not worth it; like I explained, you end up with something unreliable.
EJocys
MasterBash
When I was talking about 3200 DPI at 0.25 vs 800 DPI at 1, I was talking about movement speed. But honestly, you won't notice a difference between 1 pixel and 1/4 of a pixel in fast-paced games, since you don't really have time to count or notice them.
I never played a single game where one pixel matters (CS, CoD), but I have played games where accurate mouse movements matter.
I was using a sniper rifle most of the time. I never had a distance problem at all; targets are quite big across the map. I wouldn't want positive or negative acceleration, though (as an example for accuracy, since it's very apparent on the G700 and many other laser mice). I would overshoot or undershoot.
I stand corrected on the precision: your mouse can get more precise, but my comments are also right when it comes to accuracy. Yes, a pixel can make the difference between missing or hitting a target, but there are a lot more factors than that, such as hitbox size (which is always bigger than the model) and lag/bugs (when CS:S first came out, you had to shoot in front of your targets). If one pixel makes a huge difference between missing and hitting, then you are obviously close to missing anyway. Models are big enough even at a distance to hit them (again, I am talking about CoD/CS; you can clearly see the target even very far away).
But I value accuracy much more. You probably don't notice any weirdness in the aiming, or maybe it's very minimal, but if that one time you got some ripple while aiming, that wouldn't be a good thing. That one pixel could turn out to be something different. Maybe it's the same as your G700 with positive acceleration: you may not notice it, but it is there.
Personally, I want to use 800 DPI (native on the G400, instead of my 400 DPI), because I don't want to face issues with pixel walking... I am now playing League of Legends, where it won't make a big difference whatsoever, but I am not sure how that game is coded for mouse movements, so I don't know if changing the in-game mouse sensitivity will result in less accuracy.