
Drimzi
Members · Posts: 1,211 · Days Won: 93

Everything posted by Drimzi

  1. Master Arena: an arena game in alpha. Alpha codes are instant, just add an email address.
     Horizontal FOV: 90 - 120
     Explicit sensitivity for zoom; not FOV dependent
     Pitch/yaw: somewhere in the 0.00549243 ballpark
     Mouse sensitivity config: \steamapps\common\MasterArena\UDKGame\Config\UDKInput.ini
     FOV config: \steamapps\common\MasterArena\UDKGame\Config\UDKMAPlayerInput.ini
  2. 0% is the most plausible conversion, but it's not always the most practical. Use 0% unless you prefer something else.
  3. I didn't make that. everythingllbeok from reddit made that.
  4. The calc is a little buggy. 114 reports 113.13, and then 115+ reports 112.81. The sensitivity is working well though. So it was scaling from 65 (4:3) all along; just the FOV was incorrect. Did a 20-revolution test at 120 FOV (113.69) and it was perfect.
  5. What you might be looking for is the Control-Display Gain/Ratio. It is the ratio between the control device (mouse) and the display element (cursor). Since it is a 1 : x ratio, it is expressed as a gain, or amplification. If you google it, there are a lot of papers discussing this topic.

     For 2D, it is found by doing the following:
     = ScreenSize / (PixelCount / (CPI * WPS * DPIScaling))

     For 3D, it is found by doing the following:
     = FocalLength / Radius
     = ((ScreenSize / PixelCount) * ((PixelCount/2) / tan(FOV * π/360))) / (180 / (GameSensitivity * GameYaw * CPI * π))

     This is all using square (1:1) measurements, and assuming the game sensitivity is not FOV dependent.

     Let's say you have a 24.5" monitor, 1920x1080, 450 CPI, playing Overwatch at 11.37 sensitivity.

     Crop it to a square:
     = (DiagonalScreenSize * SquarePixels) / sqrt(VerticalPixels^2 + HorizontalPixels^2)
     = (24.5 * 1080) / sqrt(1080^2 + 1920^2)
     = 12

     CD Gain for 2D:
     = 12 / (1080 / (450 * 1 * 1))
     = 5

     CD Gain for 3D:
     = ((12 / 1080) * ((1080/2) / tan(70.53 * π/360))) / (180 / (11.37 * 0.0066 * 450 * π))
     = 5
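The formulas above can be sketched in Python. This is a minimal sketch, assuming WPS and DPIScaling are 1 unless you know otherwise; the function names are mine, not from any library:

```python
import math

def square_screen_size(diag_in, w_px, h_px):
    # Crop the diagonal screen size to a 1:1 square using the vertical pixel count
    return diag_in * h_px / math.hypot(w_px, h_px)

def cd_gain_2d(screen_in, px, cpi, wps=1.0, dpi_scaling=1.0):
    # ScreenSize / (PixelCount / (CPI * WPS * DPIScaling))
    return screen_in / (px / (cpi * wps * dpi_scaling))

def cd_gain_3d(screen_in, px, fov_deg, sens, yaw, cpi):
    # FocalLength / Radius, all in square (1:1) measurements
    focal = (screen_in / px) * ((px / 2) / math.tan(fov_deg * math.pi / 360))
    radius = 180 / (sens * yaw * cpi * math.pi)
    return focal / radius

size = square_screen_size(24.5, 1920, 1080)                # ≈ 12
print(cd_gain_2d(size, 1080, 450))                         # ≈ 5
print(cd_gain_3d(size, 1080, 70.53, 11.37, 0.0066, 450))   # ≈ 5
```

Plugging in the Overwatch example reproduces the gain of roughly 5 in both 2D and 3D.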
  6. Why not just convert from your Overwatch hipfire sensitivity? Converting from 37.94 is just using a low-precision version of your hipfire sensitivity. The actual number would be this, but the decimals are infinite. The different results will be due to rounding. That change in precision was just enough to round upwards.
  7. http://www.wolframalpha.com/input/?i=c%3D1800;+x%3D-20;+c%2F(1+-+x+1%2F30) If you use this link instead, you just need to put in CPI. Then in the Shoot the Beat! output parameters, you put the CPI result in there. I also changed the scaling, as -30 seems to be a lot closer to double the distance than -20. I did a crude check: I screenshotted 0 and -30, scaled the -30 shot by 200%, and the center background portrait thing was pretty much the same size.
  8. I just tested it myself. 101 is pretty much equal to 90 FOV (4:3). I used a script to send the counts necessary to rotate to the top of the screen. It pretty much landed on the reference point at 101 FOV, with higher values overshooting. edit: this was with the sensitivity value before the calc update, which was too high to begin with.
  9. Yes, 4200, or just use convert to Aim Lab first to see what the sensitivity is with more decimals, and put that in to the wolfram calc. Then you can convert from Aim Lab to Shoot the Beat's registry value. It's not the same scaling as 56.25%. It's scaling proportionately with the camera distance, so it would be more like 0%.
  10. It does seem to scale in the way I assumed. I'm assuming -100 = 4294967196? Tried that and it seemed as good as it could get, but it is a new experience having the camera orbit like that. You can put your CPI, cm/360, sensitivity, or whatever in here, with whatever distance, and it will tell you what to use:
      Calc for distance at 90 FOV
      c = CPI, sens, or whatever
      x = distance
  11. I downloaded the update. At 90 FOV, changing the camera distance seems to have zero effect on sensitivity. Tested with a script for 6 full revolutions with zero residual counts. The cm/360° was completely perfect at 0, +10, -10, -20. edit: When playing it, it does feel weird at different distances though. +10 is too fast, -20 is too slow. So it might be a case like Fortnite, where the sensitivity should scale, even though the angle of view remains constant. The scale factor is the change in camera distance. edit2: Just a quick assumption: if +20 would put you at 0 camera distance, where you can't play, then +10 might be 0.5 distance, 0 might be 1 distance, and -20 might be 2 distance. Tested +10 with half CPI, and -20 with double CPI, and it felt somewhat okay.
  12. I remember them screwing with the sniper scope sensitivity and there being a lot of backlash, this was in the beta or at launch. Probably so controllers can keep up with the boost jumping. The recon sight (not sure how many others) also have an amplified sensitivity on PC for whatever reason.
  13. BO3 is 0%, but some scopes (ACOG?) seem to be sped up. Also the FOV isn't accurate: 110 is more like CS:GO's, and 120 is more like 113-114. So cm/360° tests will be off. If you are converting with 0%, you just need to verify that 80 FOV is correct in terms of cm/360°, as it would be scaling internally at 0% to whatever FOV it really is. The sensitivity doesn't scale by an arbitrary percentage. It is the tangent ratio (trigonometry) between the hipfire FOV and the ADS FOV.
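That tangent ratio can be sketched as follows. The example FOV values are hypothetical, not BO3's actual numbers:

```python
import math

def ads_scale(hip_fov_deg, ads_fov_deg):
    # 0%-style scaling: ratio of the focal lengths = tan(ads/2) / tan(hip/2)
    return math.tan(math.radians(ads_fov_deg) / 2) / math.tan(math.radians(hip_fov_deg) / 2)

# e.g. hipfire 80 deg, ADS 40 deg (hypothetical values):
print(round(ads_scale(80, 40), 4))  # 0.4338
```

Multiply your hipfire sensitivity by this ratio to get the 0%-matched ADS sensitivity.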
  14. Although the angle of view remains identical when targeting, the camera shifts position. My assumption would be to scale the sensitivity proportionately with the change in camera position, but I don't know the values and it is probably different per weapon. (I don't play Fortnite). Try setting Targeting to 0.75, Scope to 0.65, (these values remain constant as they are ratios), and set the base sensitivity value to anything you like.
  15. Just note that this quirk with Planetside 2 has nothing to do with 1000 DPI specifically. It's just that he can't reach his preferred cm/360° with 1000 counts per inch. Just do it by feel and adjust accordingly over time... which may take a very long time to settle if you didn't do it by feel the first time and blindly copied someone else's preference or based it off mouse pad space.
  16. The benefits are there, but I wouldn't prioritise it if you have to sacrifice anything. I personally use 1/4 Windows pointer speed and a 1/4 game sensitivity multiplier along with a 4x CPI multiplier, not only for the benefits outlined above, but also to expand my effective CPI range by 4x. Every 50 CPI increment is equal to 12.5 (50/4) now, giving me a lot more fine-tuning.
  17. He puts a lot of emphasis on pixels though, which is incorrect, as movement in 3D is in angles, with the rasterised frame merely being displayed by pixels. But I'm sure you will get the point.
  18. 3kliksphilip uploaded a video talking about this very recently. And it's not the dpi, you should be saying 0.00565 is better.
  19. You don't pixel skip in games. You angle skip, and you angle skip regardless; the amount just depends on the game sensitivity. It can 'look' like pixel skipping if your turn rate is higher than the angle displayed by the 2 center pixels. The game sensitivity sets the number of degrees skipped per mouse count. CPI/DPI is the number of mouse counts per inch. Ideally you would want infinitely low game sensitivity and infinitely high CPI, but that's not possible. Your 800 CPI scenario has the potential to be 2x smoother (2x more counts), 2x more responsive (1/2 the displacement required to trigger a rotation), and 2x more precise (1/2 the degrees rotated per count).
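The sens/yaw/CPI relationship can be sketched like this. The yaw value 0.022 and the example sensitivities are assumptions (typical of Source-engine games), not values from the post:

```python
def degrees_per_count(sens, yaw):
    # degrees the camera rotates per single mouse count
    return sens * yaw

def cm_per_360(sens, yaw, cpi):
    counts = 360 / degrees_per_count(sens, yaw)  # counts for one full revolution
    return counts / cpi * 2.54                   # inches -> cm

# Same cm/360, but 800 CPI at half the sensitivity halves the degrees per count:
print(degrees_per_count(2.0, 0.022))            # 0.044 deg/count at 400 CPI setup
print(degrees_per_count(1.0, 0.022))            # 0.022 deg/count at 800 CPI setup
print(round(cm_per_360(2.0, 0.022, 400), 2))    # ~51.95 cm
print(round(cm_per_360(1.0, 0.022, 800), 2))    # ~51.95 cm
```

Doubling CPI while halving game sensitivity keeps the cm/360° identical but halves the angle skipped per count, which is the precision gain described above.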
  20. Let's go from a 10" monitor to a 20" monitor. Both have 1920x1080 pixels. The first assumption will be to match the distance it takes to move the cursor from one side to the other side, AKA move 1920 pixels. This is what the calculator does. Let's extrapolate this idea to a 1" screen and a 1000" screen. If the same mouse movement results in the cursor moving 1" in one case, and 1000" in the other, will it feel the same? Personally it wouldn't to me.

      Instead of matching the quantity of pixels displaced, you want to match the distance displaced. Figure out how big a pixel is (rounded for simplicity):
      10" monitor: every pixel is 0.0045"
      20" monitor: every pixel is 0.0090"
      If it takes 1" of mouse movement to move 1920 pixels on the 20" monitor, then 1" of mouse movement correlates to 17.4" of cursor movement. This is a ratio of 1 : 17.4, or a gain of 17.4. It doesn't matter what units you measure in, whether mm, cm, inches, or kilometres: 1 unit of mouse movement correlates to x units of cursor movement. This is what you convert, not the pixels displaced. It will become inconvenient to move the cursor from the start menu to the system tray if the screen is 1000" wide, but that has nothing to do with the sensitivity of the mouse. For convenience, you can change the sensitivity to reduce that distance, but that is just preference.

      Let's take it to 3D. In the above explanation you can see that pixels don't matter. The monitor is acting like a window. You are keeping real-world distances. The same applies to 3D. Imagine you are playing a game like CS:GO. You crop the aspect ratio from 16:9 to 4:3. Does the perceived sensitivity change? Is it like some kind of optical illusion where replacing some of the rendered game world with blackness changes the perception of sensitivity? It personally isn't to me. What if, instead of blackness, you just change to a 4:3 monitor?
      You effectively reduced the horizontal angle of view from 106.26 degrees to 90 degrees, but the perceived sensitivity did not change. If it feels the same, why would you scale the sensitivity by the change in angles, i.e. 90/106.26, which is what 'monitor distance match' is doing? So at the moment, the angle of view (or FOV) is changing, but the cm/360° is not, and it feels good.

      So maybe it is because the vertical angle of view is still 73.74, and that's why the sensitivity didn't change? Wrong. Since CS:GO enforces a specific angle of view, let's just place black paper on the monitor to 'crop' it, reducing the effective vertical angle of view to a low number. Will the sensitivity appear to change? If you answer no, then you can see that the angle of view has no bearing on the sensitivity. The focal length remained constant, the angle of view changed, so the answer is the focal length.

      Here is 90° (4:3) at 20" and at 10", overlayed. You can see that they are both different. They have identical angles of view, but the focal length is completely different. If you convert from the 20" to the 10", then the 10" needs to reduce the angle of view to maintain the same focal length. The 20" has 90° (4:3), the 10" has 53.13° (4:3). Here is 10" converted to 20": the 10" has 90° (4:3), the 20" has 126.87° (4:3).

      In the first scenario, where the game enforces 90° (4:3), the game sensitivity value doesn't change. However, the CPI did change, by a factor of 2, if you keep the same 2D sensitivity. This is because the 20" is twice the size of the 10". The focal length has also changed by the same factor. So the change in CPI results in the change in cm/360°, even though the FOV is the same. If you didn't want your cm/360° to change, you would have to use a lower angle of view on the smaller monitor (the 2nd picture). You can quickly test this for yourself by creating a custom half resolution, like 960x540, and then playing a game in windowed or fullscreen (with no scaling and override enabled).
      Compare no change in sensitivity against double game sensitivity (or whatever sensitivity value is equivalent to half the cm/360°). See which one feels better.

      Scaling the game sensitivity by the change in focal length is exactly what 0% monitor match does. The change in monitor size is also the same factor as the change in focal length. This suggests that 0% is the only way to convert game sensitivity/CPI, as it results in no change in 'sensitivity'. The relationship between the device and the display element remains constant. Any deviation from this will be personal preference.

      Many will have a personal preference that overrides 0%. Like if the screen was incredibly wide, such as 32:9, making the distance from one end to the other double that of a 16:9, then they may prefer to scale their sensitivity by a factor of 2 in order to make the hand distance the same when going from the start menu to the taskbar, or closing browser tabs, etc. They distance matched, which requires a change of device/mouse sensitivity (this also applies to 3D). Many may also prefer to amplify their sensitivity for aim down sights/scopes, so they don't have to scale their input proportionately with the change in zoom and image curvature. 0% usually feels too slow in this case because you are directly comparing two different focal lengths without the distance between you and the reference point changing to cancel out the perceived zoom, whilst simultaneously using the same hand movement before and after the zoom. The target size, the distance between the target and the crosshair, the target movement speed, etc., all scale proportionately with the focal length, which means you also need to scale your input with the focal length to land the flick or track the movement speed (and then there's the difference in eccentricity/curvature that results in that snappy/sluggish feeling and different diagonal trajectories).
      Due to these preferences to scale sensitivity instead of input, other methods like 'monitor distance match' can be useful. I think Viewspeed v2 is also useful; the feeling is kind of comparable to zooming in and having the camera dolly in the opposite direction of the zoom. If you zoom from one end of the spectrum to the other, whilst simultaneously moving the mouse in circles, it looks like the sensitivity is matched perfectly. But when it comes to actually using Viewspeed v2, it results in worse aim performance (personally) at the expense of feeling more consistent in that exact specific moment, due to there being no sudden feeling of slowdown or speedup whilst using identical mouse movement before/after the zoom.

      As for focal length, I believe it is found by doing the following:
      (SquarePixels/2) / tan(SquareDegrees * π/360) = focal length in pixels
      It's the same as this graphical FOV calculator: https://teacher.desmos.com/activitybuilder/custom/5a61dd34fafbd40a25416e02#preview/d123ef39-8694-4760-af7d-c18c936ce79d

      Scale pixels by the physical dimension of a pixel, and you will see that the above cases have identical focal lengths despite very different angles of view. 10" 90° (4:3) has a 720 pixel focal length: 720 * 0.0045 = 3.24". 20" 126.87° (4:3) has a 360 pixel focal length: 360 * 0.0090 = 3.24".
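That focal length formula can be checked against the two monitor examples above. A minimal sketch; both monitors are 1920x1080, so the 4:3 crop is 1440 pixels wide:

```python
import math

def focal_length_px(px, fov_deg):
    # (Pixels/2) / tan(Degrees * pi/360), measured across the same axis as the FOV
    return (px / 2) / math.tan(fov_deg * math.pi / 360)

print(round(focal_length_px(1440, 90)))      # 720 px; * 0.0045"/px on the 10" = 3.24"
print(round(focal_length_px(1440, 126.87)))  # 360 px; * 0.0090"/px on the 20" = 3.24"
```

Multiplying the pixel focal length by the physical pixel size gives the same 3.24" focal length for both monitors, despite the very different angles of view.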
  21. I will explain why in a bit, will probably be a lengthy post.
  22. Nope, just leave the sens alone, although leaving it alone is the same as 0% scaling. Alternatively, change the FOV to achieve a zoom factor of 1.175 instead; then you will have the same cm/360°.
  23. In my experience, it is just the size that matters (lol). The amount of space that the monitor occupies in your vision hardly matters; you can't make a small screen feel like a large screen by moving it closer, as the depth is different.
  24. https://www.mouse-sensitivity.com/forum/topic/4635-csgobattlefield/?do=findComment&comment=17635
  25. That math is still right, 940 is correct.