There wouldn't be any perceived change at all.
The way to illustrate this is to imagine, say, a "million DPI" mouse with a wait() function rather than a polling rate (so, hypothetically, a mouse that reports to the OS as soon as it has data, rather than having to wait for the next poll frame).
Moving even a tiny amount at a regular hand speed would send the first data to the OS within a microsecond. But for the sensitivity not to be unusably high, the rotation produced by that first count of input (i.e. the in-game sensitivity) would have to be so small that the initial movement wouldn't visibly rotate the game world at all. So the fact that some data arrived within a microsecond is completely useless: in a game, the mouse is a turning device used to rotate to a target location, and for that latency to be relevant it has to be defined over a distance.
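To put rough numbers on the hypothetical above, here is a minimal back-of-the-envelope sketch in Python. The hand speed is an assumed, deliberately slow value, and the "million DPI" figure is the hypothetical from above, not a real sensor:

```python
# Back-of-the-envelope: time until the hypothetical "million DPI" mouse
# registers its first count, at an ordinary (here: slow) hand speed.

DPI = 1_000_000            # hypothetical sensor resolution, counts per inch
hand_speed_ips = 1.0       # assumed hand speed, inches per second

count_distance_in = 1 / DPI                        # travel per count, inches
time_to_first_count_s = count_distance_in / hand_speed_ips

print(f"One count = {count_distance_in * 25.4e6:.1f} nanometers of travel")
print(f"First count after {time_to_first_count_s * 1e6:.1f} microseconds")
# -> one count is ~25 nanometers of hand movement, reported after ~1 us.
```

That first report arrives almost instantly, but it corresponds to a rotation far too small to see, which is exactly the point.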
As long as one count's distance is less than one pixel's distance on screen (referred to as "pixel ratio" in the calculator on this site), there would never be any difference in the time to turn to a target occupying a different pixel on screen (i.e. one you could actually see you needed to turn to, rather than one your crosshair was already over), no matter the DPI you use.
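As a sketch of that comparison (the cm/360, DPI, resolution, and FOV values below are assumptions for illustration, and pixels-per-degree is computed as the linear approximation at screen center for a rectilinear projection; the site's calculator may define pixel ratio with a slightly different formula):

```python
import math

# Assumed setup (illustrative values, not recommendations):
cm_per_360 = 30.0          # physical distance to rotate a full 360 degrees
dpi = 1600                 # sensor resolution, counts per inch
h_res = 1920               # horizontal screen resolution, pixels
h_fov_deg = 103.0          # horizontal field of view, degrees

# Rotation produced by a single count.
counts_per_360 = (cm_per_360 / 2.54) * dpi
deg_per_count = 360.0 / counts_per_360

# Pixels per degree at screen center for a rectilinear projection
# (derivative of x = (W/2) * tan(theta) / tan(FOV/2) at theta = 0).
px_per_deg = (h_res / 2) / math.tan(math.radians(h_fov_deg / 2)) * math.pi / 180

pixel_ratio = deg_per_count * px_per_deg   # pixels the view moves per count
print(f"One count rotates {deg_per_count:.5f} deg = {pixel_ratio:.3f} px")
# With these numbers one count moves the view ~0.25 px, so any visible
# target (at least 1 px away) takes multiple counts to reach, and raising
# DPI only subdivides the same physical motion more finely.
```

Once you're below one pixel per count, extra DPI changes how finely the same hand movement is chopped up, not how soon the crosshair reaches anything you can see.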