Showing content with the highest reputation on 02/21/2018 in all areas
Perceived sensitivity
potato psoas and one other reacted to CaptaPraelium for a topic
I'm sorry I've been sparse with the updates to this thread. This is largely because it's very time-consuming to make the illustrations that are really needed to explain the progress I'm making. I'm old... I've been doing it with pen and paper and such, so I might just get a camera and put pics of some of that here, rather than nothing.

Theoretically, 0% is perfect. For the image, on the screen. But what we see is not what is on the screen. What we see is on the inside of our eye. This is why 0% feels 'slow': it does not account for the last step, the game world being projected from the monitor to our eye. We do not have any formula which does so, and accordingly, 0% is the most correct theory we have as of right now.

As per the science nerdery posted above, we know that we do not measure the distance between two points, in the real world or the game world, directly - as we would with, say, a ruler, or by pixels on screen. We measure it by deriving the distance from the angle between the two points. This is a terrible thing to attempt to explain without pictures, but I'll try, because it offers us two interesting insights. Firstly, it lends some validity to 'monitor matching', and secondly, it offers some hint as to why we seem to prefer to monitor match at the kinds of percentages which we do. If none of this makes any sense, I'll do some cruddy mspaint to explain it.

Firstly, let's picture our monitor from above or from the side (it doesn't really matter, but I do it from the side because the games use VFOV), so we have a straight line. Now we need to measure our monitor and our seating position (assuming that your eyes are directly in line with the centre of the screen, which for the purpose of FPS games, they should be). We can use the following formula to find our actual FOV of the monitor.
I sit 32.5cm from a 1440p 27" monitor (I can hear my mother telling me that's unhealthy), so mine looks like this:

widthpx = 2560
heightpx = 1440
diagcm = 27*2.54 <-- Yep, centimetres because science. Also I'm Aussie. You can use inches, just don't *2.54 here.
viewdistance = 32.5
heightcm = (diagcm/sqrt(widthpx^2+heightpx^2))*heightpx
actualfov = 2*arctan((heightcm/2)/viewdistance) = 54.70173510519102597649

Unsurprisingly, Valve know their stuff (see links above), and I have adjusted my workspace to bring my FOV close to the natural 55-60 degree FOV where our eyes and brain treat the image as important (beyond this is our peripheral vision, where we do not see so much detail but mostly movement - again, see links above).

So, now we can envision that there is a triangle formed between our eye and the edges of the screen (we don't need to worry about stereo for this, so we just use the dominant eye), and the angle at the eye is as calculated above. Cool. But let's imagine that angle is increased to, say, 80 degrees (my hipfire FOV). In order for the triangle to meet the edges of the screen, our eye would have to be much closer... and if it is (i.e., we move our head closer to the monitor), we see NO distortion. The distortion of the image is NOT caused by the projection. It is caused by the fact that our head doesn't move to match the focal point of the projection.

Here we start to uncover the real reason WHY we feel the need to change mouse sensitivity when zooming at all. It's about the amount of angle our eyes need to move to cover the same amount of angle in the game world. This is distinct from the distance our eyes move to cover the distance between two points. Our brain doesn't work that way. It thinks of all distances as angles, which makes sense really, since it's all a matter of feedback from our eyes telling our brain how much they rotated.
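If you want to check the arithmetic without a calculator, the formula above is a few lines of Python (the variable names are mine, the numbers are the ones from my setup):

```python
import math

# Monitor and seating figures from the post: 27" 16:9 1440p, eye 32.5 cm away.
width_px, height_px = 2560, 1440
diag_cm = 27 * 2.54        # drop the *2.54 if you'd rather work in inches
view_distance = 32.5       # same unit as diag_cm

# Physical screen height, from the diagonal and the pixel counts.
height_cm = diag_cm / math.hypot(width_px, height_px) * height_px

# Actual (vertical) FOV the screen subtends at the eye.
actual_fov = math.degrees(2 * math.atan((height_cm / 2) / view_distance))

print(f"screen height: {height_cm:.2f} cm, actual FOV: {actual_fov:.2f} deg")
```

Plug in your own monitor size, resolution, and viewing distance to get your own actual FOV.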
Now, if we take a few FOVs (in my testing I've been using actual, hipfire, 4x and 8x zoom) and measure out the distances to the focal points, we will have one very close to the monitor (hipfire), one where we sit (actual), one some distance behind where we sit (4x), and one very far behind us (8x). Guess what the ratios between those distances are? Zoom ratio. Great. And we already know that zoom ratio/0% gives us perfect movement in the centre of the screen. So, why does it fail?

Let's say that we see a target which is half-way to the edge of our monitor. Let us not make the mistakes of the past and think of this as pixels or cm or inches; it is an angle. Our brains all agree on this. In my case (using the same formula above and dividing the screen by half again), that's:

angle = 2*arctan((heightcm/2/2)/viewdistance) ~= 29.00 degrees, measured symmetrically about the centre of the screen.

So, now let's put this into effect using our hipfire, 4x and 8x zoom. Our eyes move 29 degrees; how far do we need to rotate in game to aim at this target? (Yes, it can be simplified mathematically, but for the purpose of conversation...) We can calculate the focal distance from our screen, for a given FOV, using the following formula:

opticallycorrectdistance = (heightcm/2)/tan(fov/2)

So, I'll do that for my 3 example FOVs:

hipdistance = (heightcm/2)/tan(80/2) = 20.03463865597708287603
fourdistance = (heightcm/2)/tan(14.8/2) = 129.4379759752501060469
eightdistance = (heightcm/2)/tan(7.45/2) = 258.21347922131382533488

And now we can use the same formula above, with these distances, to calculate how far that ~29 degrees of eye movement amounts to in the game world:

hipfire: 2*arctan((heightcm/2/2)/hipdistance) = 45.52095254923326167504
4x: 2*arctan((heightcm/2/2)/fourdistance) = 7.43098865714869079575
8x: 2*arctan((heightcm/2/2)/eightdistance) = 3.72894033006548981691

OK, that's all well and good, but why is it important?
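The whole table above can be reproduced in one go (again, Python and variable names are mine; the setup numbers carry over from the previous step):

```python
import math

height_cm = 33.62      # physical screen height (cm) from the previous step
view_distance = 32.5   # eye-to-screen distance (cm)

def focal_distance(fov_deg, screen_height):
    """Distance from the screen at which fov_deg would project distortion-free."""
    return (screen_height / 2) / math.tan(math.radians(fov_deg) / 2)

def subtended_angle(extent, distance):
    """Full angle (degrees) subtended by a centred region of the given height."""
    return math.degrees(2 * math.atan((extent / 2) / distance))

# Angle our eyes sweep for a target half-way to the screen edge.
eye_angle = subtended_angle(height_cm / 2, view_distance)   # ~29.0 deg

for label, fov in [("hipfire", 80), ("4x", 14.8), ("8x", 7.45)]:
    d = focal_distance(fov, height_cm)
    game = subtended_angle(height_cm / 2, d)
    print(f"{label}: focal point {d:6.2f} cm, in-game rotation {game:5.2f} deg")
```

Swap in your own hipfire and scoped FOVs to see where your focal points sit.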
This quick example, when we compare the results to those of 0%MM/zoom ratio, demonstrates that as our FOV decreases, the effect of the distortion on angular displacement decreases. So what? Well, this tells us that the most important adjustment to our mouse sensitivity is the one made between the widest FOV - which is going to be hipfire - and our actual FOV of the screen from our eyes. As the FOV becomes smaller (higher zoom in game), the distortion becomes lower and lower, and less and less meaningful.

So, since we can NEVER make a perfect adjustment of sensitivity for all parts of the screen, because the distortion is not constant across the screen, but we CAN make an adjustment which is perfect for one part of the screen (this is why there is a percentage in monitor matching, a coefficient in BF, a zoom sensitivity in OW, etc.)... which part of the screen is most important? If we say the centre, then we use zoom ratio. But almost all agree that 0% feels 'slow', and we know that is because of the angles-vs-distance thing. If we use the CS:GO or BF1 defaults, we use 4/3 aka 75% because muh feels. If we're the average OW pro, we use 18%. Why does everyone disagree? Well, if you take the hipfire FOV of a player, and his actual FOV, and work out your ratio from there... suddenly it all begins to line up with what 'muh feels' has been telling us all along.

Sure, ANY variation from the optically correct distance from the screen, for a given FOV, will introduce distortion, and that distortion will ensure that our mouse sensitivity will never be correct for every point on the screen... but the lower our FOV gets - the more zoomed in we are - the less of a difference it makes. The big difference is between our wide hipfire FOV and our actual FOV of the screen.
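To put one number on that comparison (this is my own sketch of what "compare to 0%/zoom ratio" looks like, using the same figures as above): for that 29-degree eye movement, zoom ratio scaling - which is only matched at the exact centre - rotates the game view by eye angle times tan(fov/2)/tan(actualfov/2). The gap between that and the rotation we actually need shrinks fast as the FOV drops:

```python
import math

height_cm, view_distance = 33.62, 32.5   # same setup as the earlier steps
actual_fov = math.degrees(2 * math.atan((height_cm / 2) / view_distance))

def needed_rotation(fov_deg):
    """In-game rotation matching ~29 deg of eye movement at this FOV."""
    focal = (height_cm / 2) / math.tan(math.radians(fov_deg) / 2)
    return math.degrees(2 * math.atan((height_cm / 4) / focal))

eye_angle = math.degrees(2 * math.atan((height_cm / 4) / view_distance))

errors = {}
for label, fov in [("hipfire", 80), ("4x", 14.8), ("8x", 7.45)]:
    # What zoom ratio (0% MM, matched at screen centre) would rotate instead:
    zoom_ratio = math.tan(math.radians(fov) / 2) / math.tan(math.radians(actual_fov) / 2)
    errors[label] = eye_angle * zoom_ratio - needed_rotation(fov)
    print(f"{label}: zoom ratio is off by {errors[label]:+.2f} deg")
```

The mismatch is around a degree and a half at hipfire, but well under a fifth of a degree at 4x and 8x - which is the point: the hipfire-to-actual-FOV step is where the adjustment really matters.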