Being an ex-console peasant, I have the same gift, and I've been very careful to maintain it. I don't have a 'muscle memory' of any significance, since my muscle memory is for sticks, not a mouse, and I'm constantly changing my sensitivity in my search for a formula.

You're right, there can be no perfect method - at least, not one based on monitor distance matching. Because the distortion of the image is introduced by our stationary position relative to the monitor while the focal point of the image shifts back and forth, the image on the monitor is not an accurate point of reference for rotation. This is well evidenced by the failure of 0% MM, which is the perfect representation of the image on the monitor.

Me too. The thing is that nobody has really tried this before me, so I am inventing the wheel here. I have a rare disease and am not able to work on this often, and I'm knowingly biting off more than I can chew, so I stand on the shoulders of giants - in the medical/psychological fields as well as the gamers here - but there is some leftover work which I must do alone (unless others pitch in, as I've invited). One of the really problematic parts I'm finding is that I can come up with concepts and do the working on paper, but sharing my progress takes twice as long as making the progress, since I have to produce fancy images and such. I'm a lot further along than the thread shows, but it's hard to share. I'll try to give some updates on what I've learned today, in my replies to you.

No. This is a common misconception, and one which has led us (and I say "us" as in, me too!) astray. Our brain expects us to move a certain ANGLE from the crosshair, NOT a certain distance. Our brain uses the angle to determine the distance between two objects. The headshrinks refer to this as https://en.wikipedia.org/wiki/Visual_angle . Note in the image on that page that we need V and D to determine S. We cannot expect a movement in distance, because we do not know the distance; we can know the angle thanks to feedback from our eyes.

Using 0% MM accounts for the shift in D (refer to the GeoGebra links above); however, it does not account for the distortion introduced by 'faking' the shorter distance (read: zooming the image, rather than actually rendering what it would look like from a shorter distance - or moving your head to the focal point every time you ADS, haha).

And that's where this comes in: the reason any sensitivity feels faster at the edges of the monitor is the 'stretching' introduced by the distortion of the image. As we diverge from the centre, the image becomes more and more stretched, so the same angular movement causes a greater distance to be travelled on-screen. Everything starts off slow at the crosshair. The reason it feels best at the edge is that the angle of rotation is perfectly matched to that monitor distance, but it will always feel 'off' anywhere else. As discussed above, by accounting for distance we get 0% MM, which is great except for the distortion; and as you're covering here, the distortion can never be fully accounted for. What we're after here is the 'most correct'.
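To make that 'stretching' concrete, here's a minimal numeric sketch (Python; it assumes a flat screen with the usual rectilinear projection, and the 90 hip FOV is just an example value):

```python
import math

def screen_pos(angle_deg, half_fov_deg):
    # Normalised on-screen position of a point sitting at angle_deg
    # from the view axis: 0 = crosshair, 1 = edge of the screen.
    return math.tan(math.radians(angle_deg)) / math.tan(math.radians(half_fov_deg))

# How far one degree of rotation travels on-screen, at increasing
# offsets from the crosshair (hip FOV 90, so half-FOV 45).
for a in (0, 15, 30, 40, 44):
    step = screen_pos(a + 1, 45) - screen_pos(a, 45)
    print(f"{a:2d} deg from centre: 1 deg of turn covers {step:.4f} of the half-screen")
```

One degree of turn covers roughly twice as much screen at 44 degrees out as it does at the crosshair - the tangent stretching in action.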
And now we begin to touch on what I've been working on recently...

Now, before I get into this, I have to say: some of this is untested and unconfirmed. I'll mark those parts with ***. Normally, I like to be REALLY fricken sure before I post something here, but I realise that my lack of communication is detrimental - probably more so than posting a possible mistake. So, I'll post this possible mistake - but I'm sure you'll see where I'm going with this.

As I'm sure is obvious by now (so I won't go in-depth explaining why), the distortion we see on-screen is a tangent function. If we consider the sensitivity "monitor matched" at 180 degrees (directly behind us - hence the quotes on "monitor match", because that's NOT on the monitor, obviously...), we're looking at something which approaches having no divider for mouse sensitivity whatsoever - that is, the same cm/360 for every zoom level (*** pretty sure of this; haven't finished the math, but I'm very confident). If we consider the sensitivity at 0 degrees (the crosshair), we are looking at 0% MM. You can see this in the GeoGebra image in my posts above. I have to re-do that GeoGebra magic to allow for a target that is off-screen, but you can see where it's going.

Using that image, I started to suspect that perhaps the formula was as simple as dividing the two FOVs. You can see that at vertical 100% (slide the 'Target' X in that image to the top of the 'screen'), that's exactly what we have. This seems to make a lot of sense: since we've learned that we're not dealing with distance but with angles, dividing the angles of the FOVs seems sensible. And then it clicked - hey, this won't work in HFOV. Dividing the two HFOVs does NOT give the same result as dividing the two VFOVs. A quick spot of experimentation showed me that if we scale between 0 and 180 FOV, we get the monitor's aspect ratio as the ratio between the divided results. No big surprise there, but as I alluded to in my previous post, this got me thinking: where is the best place to match? 100% VFOV seemed nice, but 100% HFOV breaks that wide open. And then I got to thinking: 100% HFOV is the same thing as 177.77~% VFOV (on 16:9 monitors, at least) - keeping in mind the importance of aspect-ratio-independent sensitivity (as per my previous discussion and the example of being in a plane or spaceship or whatever, and rolling it over...).

This got me to thinking: WHY do we assume that the target is on screen? It's entirely common for me to be in-game, hear footsteps behind me, and have to spin and hit them. Sure, it's not as common as a target roughly in front of me and within FOV, but that has no bearing on what's correct for the formula; it's related to my own movement. A target at the right edge of a 16:9 monitor is the same thing as one entirely off-screen above me. And oh look, there goes a plane flying above me. Think about things like rocket jumping, where you spin 180, fire, then spin back... As soon as I started thinking about this, examples flooded my head. There has long been discussion about which monitor match percentage is correct, and I am now asking myself: who says it has to be on the monitor at all?!

So, we have to find a place that scales correctly, given a minimum sensitivity, as provided by 0% MM, and a maximum, being identical cm/360 (***). Which point has the least amount of error involved? Well, this is easy if we think of it as a tangent function. At the asymptote we have a vertical line, representing maximum sensitivity (same cm/360 ***) - that's 90 degrees (tan(90°)). At 0 we have a horizontal line, representing minimum sensitivity, the 0% MM. Annnnd... hello, 45 degrees.

Keep in mind that I am referring to an angle from the centre of the screen, so that's a 90 FOV. If you consider the average kind of hipfire FOVs people use, this is going to be somewhere just a little greater than 100% VFOV.
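If you want to play with that idea, here's a rough Python sketch of it - my own formalisation of the above rather than anything confirmed, so consider it *** too, and the 103 -> 40 FOV pair is just an example. It treats the match point as an angle from the centre under the hipfire FOV, projects it onto the (possibly off-screen) virtual monitor, and finds the angle that same point subtends at the zoomed FOV:

```python
import math

def match_multiplier(hip_fov_deg, zoom_fov_deg, phi_deg):
    # Sensitivity multiplier for the zoomed FOV, matching the point
    # that sits phi_deg from the centre of the screen at hipfire.
    # phi -> 0 recovers 0% MM; phi -> 90 approaches identical cm/360.
    theta1 = math.radians(phi_deg)
    # Project the match angle onto the hipfire image (may land beyond
    # the screen edge), then find the angle that same on-screen point
    # subtends at the zoomed FOV.
    m = math.tan(theta1) / math.tan(math.radians(hip_fov_deg / 2))
    theta2 = math.atan(m * math.tan(math.radians(zoom_fov_deg / 2)))
    return theta2 / theta1

hip, zoom = 103, 40
for phi in (0.1, 22.5, 45.0, 67.5, 89.9):
    print(f"match angle {phi:4.1f} deg -> multiplier {match_multiplier(hip, zoom, phi):.4f}")
```

At 0.1 degrees the multiplier is already indistinguishable from the 0% MM value, tan(20°)/tan(51.5°) ≈ 0.2895, and at 89.9 degrees it is nearly 1.0 (same cm/360), with 45 degrees sitting in between.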
I'm reminded of the conversation about Battlefield's USA (Uniform Soldier Aiming), where they started with what we call 0% MM and everyone said it was too slow; then they tried 100% VFOV matching and it was almost there; then 100% HFOV, but some people found it too fast; and they came to the conclusion that some people feel it differently, settling on CSGO's 133% as a middle ground.

Believe it, it's true. You can see it for yourself - it's been proven, and not only by myself. One way to see it is with the GeoGebra image I posted above: if you move the 'eye' to the focal point for a given FOV, there is no distortion of the angle from the eye to the target, on-screen and beyond. Another, more tangible way is to use the high-FOV images from the first page of this thread. Use the formula

opticallycorrectdistance = (heightcm / 2) / tan(hipfov / 2)

(with the VERTICAL hip FOV, since we're working from the monitor's height) and move your eye to that correct distance - you'll have your face right up in the monitor! Look at the centre of the image and note the complete lack of distortion toward the edges.

It's funny, because I think I've been too easy on it. At first glance (and second, and nth, lol) it appears to be perfect. I said early on here that I expected this thread to validate a previous formula rather than produce something new, and to be honest, what I silently expected was to prove that 0% MM was correct. The more I look into it, the more I realise why it feels too slow for everything beyond 0% from the crosshair - because it is the MINIMUM sensitivity across a range of sensitivities which contains the most correct one. As our FOV decreases (we zoom in), so does our range, and so does any error in our sensitivity, so it's quite hard to see any error in it. The greatest exposure of any error is going to be in the way we treat our hipfire FOV.
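To put numbers on that optically-correct-distance formula above, here's a tiny sketch (Python; the ~24-inch 16:9 monitor height and the 103 HFOV are just example values):

```python
import math

def vfov_from_hfov(hfov_deg, aspect=16 / 9):
    # Vertical FOV corresponding to a horizontal FOV at a given aspect ratio.
    return 2 * math.degrees(math.atan(math.tan(math.radians(hfov_deg / 2)) / aspect))

def optically_correct_distance(height_cm, vertical_fov_deg):
    # Eye-to-screen distance at which the monitor's height subtends exactly
    # the game's vertical FOV, so the projection shows no distortion.
    return (height_cm / 2) / math.tan(math.radians(vertical_fov_deg / 2))

vfov = vfov_from_hfov(103)                       # ~70.5 deg vertical
print(optically_correct_distance(29.9, vfov))    # ~21 cm - face right up in the monitor
```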