
Leaderboard

Popular Content

Showing content with the highest reputation on 01/25/2023 in all areas

  1. Lol what are the chances, I have a Z Fold as well.
    1 point
  2. The short story is that there are a lot of theories out there, many possibly incorrect. I bet we'd agree that WAY more research is needed. It is quite possible that the engine creators (e.g. Valve) as well as the mouse manufacturers (e.g. Logitech and whomever) created a situation where some things don't work well together. It's a long pipeline from mouse sensor to mouse firmware to USB poll to USB drivers to the game engine and the math workflow inside that engine. So the actual truth may be completely different from what everybody wrote (myself and Razer included; I can sometimes parrot incorrect information from, say, a mouse engineer who came to the wrong conclusion).

     What really stands is that a lot of systems have aimfeel problems when deviating from 400 or 800 DPI. More research and study is needed. Many of us, people here included, have noticed that 400 and 800 are no longer the final frontier for today's increased resolutions and refresh rates. Several have personally noticed that aimfeel is wonkier at the same calculated DPI settings in CS:GO than in certain newer engines. The reasons I described may very well be wrong, but that doesn't deny the odd behaviors that sometimes happen in an old game engine, at least for some users on some displays. As an example, much more conservative mouse settings were often fine at 1024x768 60Hz with no visible issues, but can produce visible issues at 2560x1440 360Hz.

     Higher resolutions, higher refresh rates and higher frame rates, as well as new sync technologies (other than VSYNC OFF), can amplify issues that weren't seen before. Old games weren't originally tested with those variables. This stuff doesn't matter as much to end users today, but it does for the future of the refresh rate race.
    1 point
  3. Hand-waving in full effect, I see... The reason 8kHz doesn't work a lot of the time has nothing to do with "mouse mathematics" or float vs. double precision of sensitivity variables. It is because of how games acquire input themselves. A game like Valorant only works with 8kHz because it calls GetRawInputBuffer() to hold the inputs until an arbitrary buffer size is filled and the engine is ready for them. If an app just gets WM_INPUT messages "as they come", as per a standard read of Raw Input, then unless its pipeline is so trivial that it is only doing something like updating a counter, it will most likely fall over with inputs spamming in haphazardly every ~125µs. The symptoms are dropped packets, negative accel and/or maxed CPU usage and stuttering. None of this has anything to do with sensitivity calculations being superior or inferior. Windows is also not an RTOS and is objectively bad once timing gets tight; it becomes extremely expensive to do anything accurately at these kinds of timings. This is not going to change, as it's fundamental to the Windows environment. (Rough sketches of both input-handling patterns are below.)

     The only reason low DPI can work with 8kHz in some games where high DPI doesn't is that the DPI is nowhere near saturating the polling rate, so you get a much lower input cadence. Set your mouse to 8kHz and 400 DPI and move at 1 inch per second, and your update rate is 400Hz (400 counts/inch × 1 inch/s = 400 reports/s), which is no longer "8kHz" as far as the game is concerned. This has nothing to do with the DPI setting itself, which the game has no knowledge of or interaction with, as DPI Wizard already said.

     Most simulation loops for in-game physics and enemy positions (things like whether a crosshair is over an enemy) will run at 60Hz; maybe a really competitive FPS would run higher, and maybe poll mouse input at two or three times the frame rate, with the textures/graphics rendering running at the actual frame rate, obviously. Usually, though, you register event callbacks from input devices which are passed to a handler/accumulator that is then consumed once at the start of each frame. In other words, it does not matter if you are sending 8000 updates a second to the OS, because a game will just buffer them all and sum the total mouse distance at the start of each render frame anyway. It makes no practical difference to your crosshair position whether that work is done in the firmware of the mouse at 1kHz or by the game at 8kHz. The only important factor is that the polling rate is greater than or equal to the highest frame rate of the game. If you think using 8kHz gives 8000 discrete rotational updates of your crosshair per second, and that for each of those positions an enemy location is tested for whether the input would be over a target (i.e. something meaningful), then you are mad.

     Once we get into a future where performance improves, it is also not inevitable that this will change - rather the opposite. We have, for example, the precedent of Super Audio CD and DVD-Audio in the audio realm, which were both large increases in resolution vs. CD quality on a factual basis, yet both failed as standards precisely because that level of resolution is not required for the user experience. Instead, users actually gravitated towards lower resolutions (compressed audio formats) and smaller file sizes for easier distribution.
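     To make the buffered pattern concrete, here is a minimal, untested C++ sketch of draining the Raw Input queue with one batched GetRawInputBuffer() call per frame. It assumes the mouse was already registered via RegisterRawInputDevices(); the buffer size and the g_dx/g_dy accumulator names are arbitrary placeholders, not taken from any real engine.

        // Sketch only: batch-read pending raw mouse packets instead of
        // processing each WM_INPUT message as it arrives.
        #include <windows.h>

        long g_dx = 0, g_dy = 0;   // per-frame delta accumulators (illustrative)

        void DrainRawInput()
        {
            alignas(8) BYTE buffer[4096];      // arbitrary batch buffer
            for (;;)
            {
                UINT size  = sizeof(buffer);
                // Returns the number of RAWINPUT blocks written, 0 when the
                // queue is empty, or (UINT)-1 on error.
                UINT count = GetRawInputBuffer((PRAWINPUT)buffer, &size,
                                               sizeof(RAWINPUTHEADER));
                if (count == 0 || count == (UINT)-1)
                    break;

                RAWINPUT* ri = (PRAWINPUT)buffer;
                for (UINT i = 0; i < count; ++i)
                {
                    if (ri->header.dwType == RIM_TYPEMOUSE)
                    {
                        g_dx += ri->data.mouse.lLastX;   // sum relative deltas
                        g_dy += ri->data.mouse.lLastY;
                    }
                    ri = NEXTRAWINPUTBLOCK(ri);   // next variable-size block
                }
            }
        }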
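     And a sketch of the accumulator pattern for the standard per-message path: the WM_INPUT handler does only trivial work (summing deltas), and the frame loop consumes the total once at the start of each frame. GetRawInputData() is the real Win32 call; FrameStart() and the sensitivity handling are hypothetical illustrations.

        // Sketch only: keep the WM_INPUT handler trivial, consume the summed
        // distance once per frame.
        #include <windows.h>

        static long g_accumX = 0, g_accumY = 0;

        LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
        {
            if (msg == WM_INPUT)
            {
                RAWINPUT ri;
                UINT size = sizeof(ri);
                if (GetRawInputData((HRAWINPUT)lp, RID_INPUT, &ri, &size,
                                    sizeof(RAWINPUTHEADER)) != (UINT)-1
                    && ri.header.dwType == RIM_TYPEMOUSE)
                {
                    g_accumX += ri.data.mouse.lLastX;   // trivial work only
                    g_accumY += ri.data.mouse.lLastY;
                }
            }
            return DefWindowProc(hwnd, msg, wp, lp);
        }

        void FrameStart(float sensitivity)
        {
            // Whether 1000 or 8000 packets arrived since the last frame, the
            // crosshair moves by the same summed distance.
            float yaw   = g_accumX * sensitivity;   // hypothetical mapping
            float pitch = g_accumY * sensitivity;
            g_accumX = g_accumY = 0;
            (void)yaw; (void)pitch;  // would feed a camera-rotation update here
        }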
     Point being: if such technological improvements were available to game engine developers to do more complex computations within a smaller timeframe, they would not use those resources to update the crosshair position more frequently. There are many things, such as higher animation fidelity, better online synchronisation systems, more complex physics and rendering improvements, which would all be much higher priorities and more obvious quality gains. Either you or Razer's 8kHz marketing team moving a cursor on a desktop and posting pictures of "micro stutter" is not going to make any user see a problem. This is unlike the actual micro-stutter that occurs in 3D rendering due to frame-cadence issues, which is readily apparent even to the uninitiated user. There is no one using a 1000Hz mouse on a 144Hz display and going "geez, I really hate these micro-stutters on my desktop cursor, I can't wait for that to be improved!".

     In short, you are inventing a problem and then trying to sell a solution (or the idea of one), and your argument effectively boils down to: anyone who disagrees just doesn't have the eyes to see the problem. The truth is that no problem exists in the first place, and your solution would not solve it even if it did. Mouse report rate does not need to be synchronised to monitor refresh rate or game frame rate whatsoever, and insisting it would improve anything fundamentally misunderstands how each quantity is handled and how they interact with one another. Games will always render frames at arbitrary intervals, because each frame has infinitely variable parameters that all require arbitrary amounts of resources, and mouse input polling on Windows will always be haphazard timewise, and always has been, due to the fundamental design of the operating system. Moreover, once the timings get down to a millisecond or so, there is no value to anyone in any case. No one is going to care about turning G-SYNC on if the monitor can run at 1000Hz (this is effectively the "Super Audio CD effect"), and any "improved" mouse API that could presumably send decimal distance values to the OS, instead of the standard (x, y) packet of integers with remainders carried to the next poll, would achieve nothing of value over existing systems that are stable, extremely well understood and established.
    1 point