Leaderboard
Popular Content
Showing content with the highest reputation on 02/12/2023 in all areas
F.E.A.R. Sensitivity Bugged
bone_hero reacted to DPI Wizard for a topic
Can you try again? There was an issue where the calculated sensitivity was too low, resulting in a division by zero. It should be fixed now.
Game request archive
GreekBoy reacted to DPI Wizard for a topic
A ridiculous amount of negative acceleration in this one; it can't be added.
High Dpi issues on old Games / Engines
MacSquirrel_Jedi reacted to Chief Blur Buster for a topic
And now I rewrite my accidentally-lost reply, as promised, although shortened, because the continued discussion already covered some ground and the gist summarized it well. In other words, I will avoid repeating things I already said in the "gist" above, so read that first.

Correct, that's a bigger weak link. Now some angles to ponder: it's a long chain from sensor to photons, and "because of how games acquire input themselves" is exactly it, including how well or badly the engine does its own mathematics. For example, at 500 frames per second on a 500 Hz display you're making 500 rounding errors per second, which can build up into "false acceleration / false negative acceleration" behavior in some engines even with mouse acceleration turned off. The more frames that spew at you, the more math rounding errors per second, if the developer wasn't forward-looking enough to compensate for that possibility. The best approach is to accumulate the raw mouse deltas and derive the new mouselook angle from that accumulated total, rather than modifying the rounded-off mouselook position from the previous frame cycle (see the sketch further down). 500 rounding errors per second builds up very fast -- to more than 1% mouselook off-point error per second in some engines. This isn't audiophile below-the-noise-floor stuff.

Sometimes a shotgun approach is taken to fix all the weak links at once, the human-visible (seen by 90%) and the less likely (seen by 1%) alike. Preserving sensor timestamps all the way to the engine, regardless of how the engine acquires input, increases the likelihood that the engine avoids a lot of the pitfalls befalling us today.

Yup. Exactly. Yup. Exactly. But it's not just about dropped packets; there are millions of causes that can jitter everything in the chain between sensor and photons. Correct, Windows is not an RTOS.

Exactly -- that's why I am a big advocate of exposing higher DPI to games more accurately, with fewer side effects. Although DPI and poll rate are different things, higher DPI leads more quickly to event storms in an engine, which can jitter those timestamps around more often (e.g. moving 1 inch per second produces more mouse events, and more mouse deltas, per second at 1600 DPI than at 400 DPI). Even if the game engine batch-processes an event storm to avoid per-event processing overheads, that doesn't solve everything. Other weak links remain, including the lack of accurate sensor timestamps, i.e. timestamps created by software long after the horses have left the barn door of the sensor. Timestamps added to mouse deltas by software can jitter many poll cycles off the actual sensor timestamps, depending on how things played out (USB jitter, software performance, etc.). This is a problem even if you're reading 10 polls at once from a mouse delta array, when all of them carry timestamps added by some software middleman that has no correlation to what happened at the sensor level.

My stance is: all weak links are valid, so fix all of them at once if possible. Thus, timestamp at the sensor and preserve that all the way to the software, so it can process the deltas accurately relative to their gametime timestamps. Then who cares about some of the academic discussion in between (some visible, some not).
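To make the rounding-error point concrete, here is a minimal Python sketch. The 40-counts-per-degree sensitivity and the 0.01-degree truncation are made-up stand-ins for an engine's internal quantization, purely for illustration:

```python
# Toy model: per-frame quantization vs. raw-delta accumulation.
COUNTS_PER_DEGREE = 40.0  # assumed sensitivity: 40 counts = 1 degree

def yaw_quantized_per_frame(deltas):
    """Engine that stores a rounded-off yaw and modifies it each frame."""
    yaw = 0.0
    for d in deltas:
        yaw += d / COUNTS_PER_DEGREE
        yaw = int(yaw * 100) / 100  # lossy: truncate to 0.01 degree every frame
    return yaw

def yaw_accumulated(deltas):
    """Engine that accumulates raw integer counts and derives yaw from the total."""
    return sum(deltas) / COUNTS_PER_DEGREE  # integer sum: no per-frame error

polls = [3] * 500  # one second of small movements at 500 polls/sec
print(yaw_quantized_per_frame(polls))  # drifts noticeably below the true angle
print(yaw_accumulated(polls))          # exact: 1500 / 40 = 37.5 degrees
```

The quantized version loses a fraction of a count on every poll, 500 times per second, which is exactly how "false negative acceleration" appears in an engine that has mouse acceleration turned off.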
That isn't currently the biggest benefit of an 8 KHz poll rate (in an 8 KHz-supporting game). Many games don't handle 8 KHz well enough to make it shine, but in the best cases you can really notice the smoothness difference in the best software on the most optimized systems. The problem is that those cases are few and far between, due to systemic limitations that push the benefit below the human noise floor. Most of the time I leave my mouse at 2 KHz for most software for that reason, as it delivers roughly 90% of the benefit of the 1 KHz -> 8 KHz journey. Under Windows 11, some games seemed to have difficulty even with 1 KHz (a Microsoft optimization issue) until a Windows update fixed that; even then, things worked fine at 1-2 KHz but not at 4-8 KHz. Current poll rates push system / OS / driver efficiency to the point where 8 KHz does not yet produce widespread, universal human benefit, which is part of why the proposed HD Mouse API is a good idea.

Poll rate should also be slightly oversampled, because of jitter. You see this in the beating of the poll rate against another frequency (a framerate, or a refresh rate), like the weird mousefeel on some mice and systems when testing 249 Hz or 251 Hz, which is 1 Hz off a multiple of common poll rates. The jitter feel also differs by sync technology (VSYNC ON, VSYNC OFF, G-SYNC, etc.), but a modicum of oversampling helps avoid the beating between two frequencies. This is unimportant most of the time, but it's one of many error margins that need to be addressed en masse: while one error margin may sit below the human noise floor, dozens of error margins add up to visibility. Rather than playing whack-a-mole along the chain, punch through all the noise and just preserve sensor timestamps to the game engine.

While it's not my purview to determine whether a person is mad or not, I like refinements that satisfy both the mad and the non-mad people. That's the shotgun journey of blasting multiple error margins out of existence at the same time, human-visible and non-human-visible alike.

This isn't "below the noise floor" stuff. It's all about geometrics: 240 Hz vs 1000 Hz is visible to well over 90% of a random-from-the-public mainstream population if you design your blind test around forced eye-tracking of a forced-moving object (e.g. moving-text readability tests where successfully reading the text is the objective). Perfect 240 fps and perfect 1000 fps are very hard to achieve, though, with all the noise margins (including jitter) that reduce the differential. Instead of a DVD-vs-720p situation, you've got a VHS-vs-8K situation, except in the temporal dimension. A 4x difference in display persistence at unstrobed framerate=Hz is much easier to see than LCD GtG-throttled refresh rate incrementalism.

Scientifically, assuming perfect framepacing, perfect sensor/step pacing, GtG = 0 ms, and zero GPU motion blur, all motion blur on a sample-and-hold display is purely frametime:
- The most perfectly zeroed-out motion blur is motionblur = frametime on ANY strobeless/impulseless/flickerless/BFI-less display.
- Perfect 240 fps at 240 Hz has exactly the same motion blur as a 1/240 sec film camera photograph.
- Perfect 1000 fps at 1000 Hz has exactly the same motion blur as a 1/1000 sec film camera photograph.
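The napkin math behind those bullets is a one-liner: perceived blur width equals eye-tracking speed times persistence (one frametime, on an unstrobed display). A quick Python check under the idealized GtG = 0 assumptions above, with the 1000 pixels/sec tracking speed chosen arbitrarily for illustration:

```python
# Sample-and-hold motion blur: blur width = eye-tracking speed * frametime.
# Assumes perfect framepacing and GtG = 0, per the idealization above.
def blur_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps  # frametime = 1/fps = persistence (MPRT)

for fps in (240, 1000):
    print(f"{fps} fps/Hz: {blur_px(1000, fps):.2f} px of blur at 1000 px/s")
# 240 fps/Hz: 4.17 px of blur  -> same as a 1/240 sec camera shutter
# 1000 fps/Hz: 1.00 px of blur -> same as a 1/1000 sec shutter
```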
In practice it will typically be worse than that (240 Hz vs 360 Hz feels like a 1.1x difference, not 1.5x) due to the supertankerfuls of weak links from sensor to photons, especially software, but also hardware. Common ones accrue: GtG weak link? Ditto. USB jitter weak link? Ditto. Mouse sensor flaws? Ditto. Poll rate processing issue? Ditto. GPU blur weak link? Ditto. Math rounding errors that build up faster than before due to more frames per second? Ditto. And that's just a few of the massive number of weak links, some significant, some insignificant.

It is already well known that major differences in camera shutter speed are easier to tell apart. If you're a sports photographer, it's hard to tell a 1/240 sec photograph from a 1/360 sec one, but far easier to tell 1/240 sec from 1/1000 sec for fast motion.

So, perceptually, 1000 fps at 1000 Hz will yield perfect motion (zero stroboscopics, zero blur) for motion speeds up to 1000 pixels/sec. Faster motion with tinier pixels, e.g. 4000 or 8000 pixels/sec, can require more Hz to retina all of that out; but on a 1080p display those motion speeds go offscreen in a fraction of a second. Not so if you're wearing a 16K 180-degree-FOV VR headset. Loosely speaking, the retina refresh rate occurs at some Hz above the human eye's ability to track a specific motion speed in pixels/sec (as long as the angular resolution is resolvable enough that panning images have different clarity than static images). That avoids both the stroboscopic stepping effect (e.g. during fixed gaze) and the motion blur (e.g. during tracking gaze), whereupon you can't tell the display apart from real life in the temporal sense.

Side commentary and side napkin exercise: VR ideally has to look exactly like real life, and VR is forced to flicker only because we can't yet achieve this strobelessly. The Oculus Quest 2 is 0.3 ms MPRT, so it would require 3333 fps at 3333 Hz to create the same motion quality without its current flicker-based (BFI / strobing) motion blur reduction. That underlines how far we are from eliminating motion blur strobelessly for every possible use of a display. Until then, an A/B test between VR and real life can easily fail on a single weak link: display motion blur above and beyond real life, stroboscopic effect above and beyond real life, jitter forced on you above and beyond real life, and so on.

And we're backporting some VR innovations to FPS displays, given how much better framepaced / jitter-compensated things are in the VR world than in the non-VR world right now. It's shocking, really, if you knew the weak links we just handwave off in the non-VR world in this era of ever-bigger, ever-higher-resolution displays. Given a good system, VR achieves a near-miraculous sensortime:photontime accuracy, two to three orders of magnitude better than an average esports system, thanks to superlative display hardware and software optimization -- assuming you don't underspec for the VR content, like trying to run Half-Life: Alyx on a GTX 780.

When you're head-turning at 8000 pixels/sec, anything that looks off from real life shows up far more, like the 4-pixel microstutter caused by a 0.5 ms framepace error during an 8000 pixels/sec headturn. As a rule of thumb, a stutter error bigger than the MPRT motion blur is generally easily human-visible in a motion test. Even 240 Hz unstrobed is still (4.167 ms MPRT + "X" ms GtG) of display motion blur, which easily hides these tiny jitters.
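Those numbers are easy to reproduce; here is a short Python check using only figures quoted in this post (8000 px/s headturn, 0.5 ms framepace error, 0.3 ms Quest 2 MPRT, 240 Hz unstrobed):

```python
headturn_px_per_s = 8000           # headturn speed from the post
framepace_error_s = 0.0005         # 0.5 ms frame-pacing error

stutter_px = headturn_px_per_s * framepace_error_s
print(stutter_px)                  # 4.0 px microstutter, as quoted

# Rule of thumb from the post: stutter is easily visible once it exceeds
# the display's own motion blur (MPRT) at the same tracking speed.
blur_240hz_px = headturn_px_per_s / 240       # ~33.3 px: blur hides the stutter
blur_quest2_px = headturn_px_per_s * 0.0003   # 2.4 px: stutter now pokes out
print(stutter_px > blur_240hz_px)             # False
print(stutter_px > blur_quest2_px)            # True
```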
Part of the inspiration for HD Mouse APIs is that headtracker data is often microsecond-timestamped and relayed to the system in a preserved way, so that tracktime:gametime:photontime is far more accurate in VR hardware and software engineering than in current non-VR (a sketch of the idea is at the end of this post). And yes, GPU frame rate is a limiting factor for tomorrow's 1000fps+ 1000Hz+ world. GPU horsepower does need to keep up, but there are long-term lagless paths there (reprojection). That's a VR innovation that will be backported to non-VR contexts in the coming years or decade, as some game developers have noticed.

That being said, this isn't only-audiophiles-can-hear-it stuff; it's majority-of-mainstream-can-see-it-if-blind-tested stuff. That goes even for things like browser scrolling (it's why Apple and Samsung came out with 120 Hz). The mainstream notices 4x+ differences, like 60 Hz vs 240 Hz or 240 Hz vs 1000 Hz, more readily than a mere 60 Hz vs 120 Hz; in some games and some blind tests the geometric difference matters more than the absolute number of Hz. Even mere browser scrolling looks noticeably clearer to most of the population at 1000 fps 1000 Hz than at 240 fps 240 Hz; it's as if you've turned on strobing (except you didn't). Apple would do it already if it cost zero battery power.

At the end of the day, killing enough of the weak links to approach refresh rate perfection amplifies the differences between refresh rates. Sometimes one line item is unimportant, but we need to obliterate a hell of a lot of line items, and the mouse is undoubtedly one of the many weak links. Just as 4K was $10,000 in 2001 (IBM T221) and is now a $299 Walmart special, tomorrow's refresh rates are easily pennies extra on top of 60 Hz, yet we keep buying small upgrades (e.g. 240 Hz then 360 Hz, a worthless 1.5x, throttled further to ~1.1x by the weak links). And Linus said it's cheap to laglessly frame-generate extra frames, but game engines and GPU vendors haven't picked it up; they will be forced to by 2030 to support future kilohertz-class displays.

You named some significant issues (that I agree with). I named significant issues too. Whether you and I agree on individual weak-link line items is academic next to the larger point: the weak links are sabotaging the refresh rate race, and we don't get the money's worth out of the Hz that we ideally should. Sometimes weak links get fixed line item by line item. The 240 Hz OLED prototypes sitting here will help solve a major GtG weak link (thanks to that zeroed-out weak link, the 240 Hz OLED is roughly as clear in motion as the best 360 Hz panel, the XL2566K E-TN, at framerate=Hz, although the XL2566K's lag is still slightly more; that's a separate flaw to refine, distinct from display motion blur). Other weak links should be fixed en masse (e.g. sensor-level timestamps). Either way, many weak links remain.

Also, given that lab testing has already confirmed mainstream, random-from-population visibility way above the noise floor of all that silly audiophile testing, this definitely isn't audiophile-only stuff. We've been spoonfed refresh rate incrementalism for far too long, with too little humankind benefit, without en-masse fixing all the systemic weak links at once, including everything you just said (which indirectly agrees with several of my points, even if we disagree on line items).
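For illustration only, here is roughly what "preserve the sensor timestamp end to end" could look like to an engine. This is a hypothetical Python sketch, not the proposed HD Mouse API or any real interface; every name in it is made up:

```python
from dataclasses import dataclass

@dataclass
class HDMouseEvent:         # hypothetical event type, not a real API
    dx: int                 # raw counts, X axis
    dy: int                 # raw counts, Y axis
    sensor_time_us: int     # microseconds, stamped at the sensor itself

def dx_for_frame(events, frame_start_us, frame_end_us):
    """Integrate only the deltas whose *sensor* timestamps fall inside the
    frame window, so USB/driver arrival jitter no longer skews the motion."""
    return sum(e.dx for e in events
               if frame_start_us <= e.sensor_time_us < frame_end_us)

# A burst of 8 kHz polls (one event every 125 us), consumed by a 1 ms frame:
events = [HDMouseEvent(3, 0, t) for t in range(0, 2000, 125)]
print(dx_for_frame(events, 0, 1000))  # 24 counts from the first 8 events
```

Same idea that makes tracktime:photontime accurate in VR: the motion is keyed to when the sensor saw it, not to when the software happened to receive it.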