
Leaderboard

Popular Content

Showing content with the highest reputation on 07/25/2018 in all areas

  1. I really like this topic; if this post is polluting it, feel free to delete it. I think it is very hard to find a perfect formula for every game, simply because games have different mechanics. If we only talk about FPS games, I think these things are relevant: - there are hitscan mechanics and more realistic mechanics with bullet speed and drop; - there are FPS games with recoil and without recoil. With hitscan and no recoil, using 0mm makes the most sense to me, since tracking a target is almost always the same in every situation: put the crosshair on the target and track it. Sure, moving from hipfire to a scope will almost never land perfectly dead center on a target, so we have to adjust, but once we put the crosshair on the target the sensitivity should be correct, right? With hitscan and recoil, things get a bit harder. We still have the same issue going from hipfire to scoping, but once we have the crosshair adjusted on target and shoot, the recoil kicks in. Depending on the game this recoil can be small or very large, but it will move the crosshair off target and we have to readjust, making the sensitivity momentarily incorrect again. The real issue is when realistic mechanics are in play: bullet drop and bullet speed. Depending on how far the target is and how fast it moves in a certain direction, we have to put the crosshair off target, and the correction can change with each shot we take. The target might try to dodge bullets by rapidly changing direction and speed, so in these games we have to compensate for bullet drop, bullet speed, recoil, and target movement all at once. The first shot might need to be placed 2 cm left of the target in a 4x scope; after that shot the recoil kicks in, so our crosshair ends up 3 cm higher than before and we have to drag it down, but in the meantime the target went from running at full speed to the left to a full stop, so we have to move the crosshair to the new location, and so on.
In these games, tracking a player almost never means you can keep the crosshair dead center on a stable path, so to speak; there are actually a lot of small off-target flicks while using a scope on a moving target. With 0mm, human errors in compensating for recoil and bullet drop/speed add up quite fast, IMHO: compensating for recoil might be done wrong, and you then have to adjust even more on the next shot. For me, this is why 0mm with an AWP in CSGO feels great, but 0mm in PUBG feels way too slow. It always feels great on static targets, though. It might also be the case that 0mm is still correct but just feels too slow and needs time to get used to.
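The compensation the post describes can be made concrete with a small sketch. This is illustrative only: the function name, the numbers, and the simplified ballistics (no drag, target strafing at a right angle) are all made up for the example and don't come from any particular game.

```python
import math

# Illustrative only: made-up numbers and simplified ballistics (no drag),
# not values from any particular game.
def lead_and_drop(distance_m, bullet_speed_mps, target_speed_mps, g=9.81):
    t = distance_m / bullet_speed_mps     # time of flight
    lead_m = target_speed_mps * t         # horizontal lead for a strafing target
    drop_m = 0.5 * g * t * t              # gravity drop over the flight
    lead_deg = math.degrees(math.atan2(lead_m, distance_m))
    drop_deg = math.degrees(math.atan2(drop_m, distance_m))
    return lead_deg, drop_deg

# 300 m shot, 600 m/s bullet, target strafing at 6 m/s:
lead_deg, drop_deg = lead_and_drop(300, 600, 6)
print(round(lead_deg, 2), round(drop_deg, 2))  # roughly 0.57 and 0.23 degrees
```

Both corrections change whenever the range or the target's speed changes, which is exactly why the crosshair placement has to be re-judged shot by shot.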
    1 point
  2. If you match at 0%, you will judge the necessary mouse distances properly, since the distances scale proportionately with the fov, whereas any other method is going to be counterintuitive for this, despite the whole concept being to match a perceived speed or a screen distance. Instead, these alternative methods give the illusion of better, more consistent aim because they maintain the area on your mouse pad that you use to aim within your field of view. This lets you become very proficient in a single aiming style/method (such as wrist aiming) with specific swiping distances, as you don't have to scale your input with the fov. It also lets you get away with low sensitivity at high fovs (people reduce sensitivity for instant results instead of just improving their mechanical skill) for the same reason: not having to scale your input. Since 0% does not match a distance but instead matches the velocity, and since the other methods are not 0%, they have to produce different speeds in order to accomplish what they were made to do. Matching the velocity has to result in different perceived speeds and different required mouse movements, as every fov is unique. The amount of information and distortion scales with the fov: the very essence of increasing the fov is increasing the number of degrees that you can see. So naturally, if you pan the camera, there is going to be a lot more activity on your screen, and it is going to look faster than a lower, flatter, more zoomed-in fov. It makes sense, then, that the correct conversion is one where the distance and view speed are not matched. If you match the view speed instead, then you are slowing the velocity of the camera down at high fovs and increasing it at low fovs in order to make them look the same. The biggest issue with this is that low fovs will feel too sensitive, as the required mouse distances are far shorter than assumed. Only 0% will have the correct distance scaling.
The reason why you can judge distances properly with 0% is because the distance scales with the zoom. If you zoom in 2x, the target will be 2x further away on your screen, and will require 2x more mouse movement to flick to compared to before the zoom. If you make this a fair comparison and scale the distance between you and the target to counteract the zoom, then the mouse distances will be the same. This will also benefit tracking, since the perceived movement speed, size, and distance of the movement will scale with the zoom, and so will the sensitivity. So if the distance between you and the target scales with the zoom also, then a target will move the same speed across your screen, and require the exact same mouse movement. As for the question about 'match at' percentages and matching a distance in general, the best distance match IMO is the inverse of your aspect ratio, multiplied by 100. E.g. 9/16 * 100 = 56.25. It will match the distance to the radius of the 1:1 aspect ratio. Higher percentages, like 75% are close to matching viewspeed, which suffers from the sensitive low FOV issue. Besides, all distance match methods are arbitrary, and you will get drastically different results depending on what fov measurement you use. 0% is the only method that has the same result regardless of the measurement used. Matching the view speed, screen distance, or 360 distance, is only going to be detrimental to aim performance in the long run. You will have to compromise and develop unique muscle memory for a wide range of fov and hope that your brain can fill in the blanks for fovs in between. These methods will only improve comfort and may give better results, but only in the short term (due to being only proficient in one aiming style, or having a low sensitivity for a high fov, or not having enough mouse pad space in general). 
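The 2x-zoom claim above can be checked numerically. A minimal sketch, assuming a standard rectilinear projection and 0% matching (cm/360 proportional to 1/tan(fov/2)); the 30 cm/360 at 90 fov baseline is just a made-up example value:

```python
import math

# Numeric check of the 2x-zoom claim: assumes a standard rectilinear
# projection and 0% matching, where cm/360 is proportional to 1/tan(fov/2).
def screen_frac(angle_deg, fov_deg):
    # On-screen position of a target angle_deg off-center, as a
    # fraction of the half screen width.
    return math.tan(math.radians(angle_deg)) / math.tan(math.radians(fov_deg) / 2)

def mouse_cm(angle_deg, fov_deg, base_cm360=30.0, base_fov=90.0):
    # Mouse distance to turn by angle_deg; the 30 cm/360 at 90 fov
    # baseline is a made-up example value.
    cm360 = base_cm360 * math.tan(math.radians(base_fov) / 2) / math.tan(math.radians(fov_deg) / 2)
    return angle_deg / 360 * cm360

# A "2x zoom" halves tan(fov/2): 90 fov -> ~53.13 fov.
zoomed = math.degrees(2 * math.atan(math.tan(math.radians(45)) / 2))

# For a target 5 degrees off-center, both the on-screen distance and the
# required mouse distance exactly double, so their ratio is unchanged.
print(round(screen_frac(5, zoomed) / screen_frac(5, 90), 9))  # 2.0
print(round(mouse_cm(5, zoomed) / mouse_cm(5, 90), 9))        # 2.0
```

The ratio of on-screen distance to mouse distance comes out independent of the fov, which is the "distances scale with the zoom" property the post is describing.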
They only seem correct because when you zoom in or out, the distance between you and the reference point remains static; you don't teleport forward or back to counteract the zoom. For the long term, you need to get used to 0%. Ignore the deceptive issues with view speed and the variance in mouse movement. You won't really have to develop muscle memory for every fov, as you will figure out the distances automatically since they scale with the zoom, but the different distances will require you to master your aim with all the styles, such as micro, finger, wrist, and arm (from elbow and shoulder) movement, and you will probably have to use a higher sensitivity in general. And yes, for anyone wondering, I have switched over to 0%.
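The match percentages discussed above can be sketched numerically. This assumes the usual monitor-distance-match formula (percentage measured against the half screen width, fovs as the measurement the match uses), with 0% taken as the analytic limit; the 103/40 fov pair is just an example conversion:

```python
import math

def cm360_ratio(fov_from, fov_to, match_pct):
    # Ratio new_cm360 / old_cm360 for a monitor-distance match at
    # match_pct percent of the half screen width. match_pct == 0 is
    # handled as the analytic limit, i.e. the 0% method.
    a = math.radians(fov_from) / 2
    b = math.radians(fov_to) / 2
    if match_pct == 0:
        return math.tan(a) / math.tan(b)
    m = match_pct / 100
    return math.atan(m * math.tan(a)) / math.atan(m * math.tan(b))

# Example conversion (made-up fovs): 103 hipfire -> 40 scope. 0% gives the
# largest cm/360 (slowest scoped sensitivity), 100% the smallest; 56.25%
# (9/16 * 100, the inverse 16:9 aspect ratio) lands in between.
for pct in (0, 56.25, 100):
    print(pct, round(cm360_ratio(103, 40, pct), 3))
```

This also shows the post's point about high percentages: the closer the match percentage gets to 100%, the faster (smaller cm/360) the low fov becomes relative to the 0% result.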
    1 point
  3. Too much acceleration, so it will be impossible to add unless there's a fix.
    0 points