Hi, I don't understand the difference between the Windows pointer speed setting and in-game mouse sensitivity (if there is any).
Note: I'm NOT asking about the well-known high-DPI vs. low-sensitivity discussion; I understand the difference between hardware-dependent DPI and software-dependent sensitivity. What I don't understand is the difference between the speed set in Windows and the one set in the game's settings.
By the way, my mouse uses the PMW3366 sensor.
The questions:
1) I understand that any setting higher than 6 in Windows introduces pixel skipping every x counts of mouse movement. But why is setting it to a 0.5x multiplier (4, I think) bad? You just have to move the mouse twice as far to cover the same distance, right? It shouldn't skip anything. (I've tried to write down my mental model in the toy code after question 3.)
2) If lowering the Windows speed below 6 is bad, why is raising the native DPI and lowering the in-game sensitivity good? I understand that a higher DPI picks up more detail, but if lowering the in-game sensitivity is just as bad as lowering the speed in Windows (in other words, skipping every xth count, depending on the setting), how can it be better?
3) In Windows, 1:1 tracking is at 6/11. What setting gives 1:1 in games? Some games use numbers between 0.001 and 20, but others use 1 to 100. If a game uses settings between 1 and 100, the number can't be the multiplier itself, can it? Otherwise you'd have multipliers between 1x and 100x, which seems ridiculous. Do games have a 1:1 setting, or do they work differently?
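To show where my reasoning in questions 1 and 2 comes from, here is a toy simulation of how I imagine the Windows scaling works. The `apply_multiplier` function and the whole "multiply, then keep only whole pixels" model are my own assumptions (using the commonly quoted multipliers 0.5x for 4/11, 1.0x for 6/11 and 2.0x for 8/11), not actual Windows code:

```python
# Toy model of how I imagine Windows pointer-speed scaling works.
# Everything here is my assumption, not Windows' real implementation.

def apply_multiplier(counts, multiplier):
    """Scale raw mouse counts by the pointer-speed multiplier and
    keep only whole pixels, throwing the fractional part away."""
    return int(counts * multiplier)  # the cursor can only move whole pixels

# 6/11 = 1.0x: one count is one pixel, nothing skipped, nothing lost.
print(apply_multiplier(1, 1.0))    # -> 1 pixel
print(apply_multiplier(10, 1.0))   # -> 10 pixels

# 8/11 = 2.0x: every count jumps two pixels, so odd pixel positions
# can never be hit -- the pixel skipping from question 1.
print(apply_multiplier(1, 2.0))    # -> 2 pixels

# 4/11 = 0.5x: a single count scales to half a pixel and is dropped...
print(apply_multiplier(1, 0.5))    # -> 0 pixels
# ...but moving twice as far seems to cover the same distance anyway,
# which is why I don't see what is "bad" about it.
print(apply_multiplier(20, 0.5))   # -> 10 pixels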
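```

The last line is probably where my model is too simple, because the mouse actually reports its movement as a stream of small packets rather than as one big number, which seems to be what the quoted Reddit explanation further down is about.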
In this Reddit thread (https://www.reddit.com/r/GlobalOffensive/comments/1x2a3l/here_is_how_to_get_the_most_out_of_your_mouse/) the poster claims (under "In-game sensitivity") that there IS a difference between Windows speed and in-game sensitivity, because Windows does some rounding that doesn't happen with the in-game sensitivity multiplier. Is this true? If it is, then:
1) Why doesn't Windows do it the same way games do (if it's just plain better)?
2) It says:
"For example, say you are using 3/11. Your windows multiplier is 0.25. If you move the mouse by one pixel constantly for 3 counts, 3 * 0.25 = 0.75, and the pointer is not moved at all. You then move another 2 counts, and 2 * 0.25 = 0.5 + the 0.75 from last time = 1.25. Now the pointer moves by 1. This happens because when Windows applies the scaling factor, it can only pass through to the game a whole (integer) number of mouse movement and it holds back or delays a remainder which is added into the next movement."
I don't see any other way to do it. If you set Windows to 6/11 and change the DPI accordingly to get the same effective speed, how would it be any different? If you don't move the mouse far enough to move the cursor by one pixel, it won't move, right? It can't move less than one pixel. Am I missing something here? I'd really like to understand this better, so I've tried to write out both versions below.
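To check that I'm reading the quote correctly, here is my attempt to code up its example, next to the behaviour I assumed. Both function names and the whole structure are just my interpretation, not anybody's actual code:

```python
# The quote's example at 3/11 = 0.25x, written out as I understand it.
# Neither function is real Windows or game code, just my reading.

def move_with_remainder(movements, multiplier):
    """Scale each movement, pass on whole pixels and carry the leftover
    fraction into the next movement (what the quote describes)."""
    remainder = 0.0
    total_pixels = 0
    for counts in movements:
        scaled = counts * multiplier + remainder
        pixels = int(scaled)          # only whole pixels move the pointer
        remainder = scaled - pixels   # held back for the next movement
        total_pixels += pixels
    return total_pixels

def move_without_remainder(movements, multiplier):
    """Scale each movement and throw the fraction away every time
    (what I assumed happens)."""
    return sum(int(counts * multiplier) for counts in movements)

# The quote's example: three 1-count movements, then one 2-count movement.
movements = [1, 1, 1, 2]
print(move_with_remainder(movements, 0.25))     # -> 1 (0.75 carried, then 1.25 -> move 1, keep 0.25)
print(move_without_remainder(movements, 0.25))  # -> 0 (every fraction is lost)
```

At least in this toy version the two only differ when the scaled movement has a fractional part, so I can see what the poster might mean by "rounding", but I'd still like someone to confirm whether that is what really happens and how changing the DPI instead avoids it.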
Any help is appreciated, thanks!