This doesn't need to be tested, because it's just simple math and the way a mouse works is already well understood. I put it in a spreadsheet to show, for different DPIs, the time until the first data is sent (which is NOT input latency, because no target distance is defined for the input).
Make a copy of the sheet and play around with the hand speed and DPI values and/or the system latency from other factors (such as GPU/CPU/OS latency, which are also not static and vary between runs), and you will get the calculated delay until "screen flash" if you tested with the same methodology as the Battlenonsense or Optimum Tech YouTube channels. The only wiggle room here is if the DPI reported by the mouse is not accurate.
The reason results at higher and higher DPI converge toward a limiting value is not that the effect changes, but that the much larger total system latency from other factors becomes more and more dominant in the calculation.
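For illustration, here is the same arithmetic the sheet does in a few lines of Python. The hand speed and the lump of "other" latency are made-up placeholder values, not measurements, so swap in your own:

```python
# Rough sketch of the spreadsheet math. Hand speed and the "other latency"
# lump below are illustrative assumptions, not measured values.

MM_PER_INCH = 25.4

def first_count_delay_ms(dpi: float, hand_speed_mm_s: float) -> float:
    """Time until the mouse has moved far enough to emit its first count."""
    distance_per_count_mm = MM_PER_INCH / dpi
    return distance_per_count_mm / hand_speed_mm_s * 1000.0

HAND_SPEED_MM_S = 250.0   # assumed hand speed at the start of the movement
OTHER_LATENCY_MS = 25.0   # assumed USB + CPU/GPU + OS + display latency

for dpi in (400, 800, 1600, 3200, 6400):
    first = first_count_delay_ms(dpi, HAND_SPEED_MM_S)
    total = first + OTHER_LATENCY_MS
    print(f"{dpi:>5} DPI: {first:.3f} ms to first count, "
          f"{total:.3f} ms to 'screen flash' "
          f"({first / total * 100:.2f}% of it from DPI)")
```

The first-count delay does shrink as DPI goes up, but every total lands within a fraction of a millisecond of the same value, which is exactly the convergence described above.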
But in either case, none of these differences are meaningful when considering the time to reach a target location, which is the only factor that matters for a pointing device.
Of course, there will be sensors that perform differently at different DPI settings, with more or less drift, and a VERY low DPI mouse would obviously have visibly less accurate angle tracking due to fewer data points with which to define the angle of any direction. This also has to do with the firmware implementation of the mouse/sensor, as some may be even worse at higher DPIs. But this is not really meaningful either when it is tested with a 2D cursor, because cursors cannot sub-pixel increment, so any input is necessarily truncated to the nearest pixel position, which doesn't happen in a 3D game. You will always see more visible "drift" with a cursor than what happens in a game.
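To put a rough number on the "fewer data points" part, here is a toy sketch of an idealised sensor reporting a short, shallow movement as whole counts. The angle and distance are made-up values, and real firmware (smoothing, interpolation, ripple) will behave differently:

```python
# Toy model: an ideal sensor can only report whole counts per axis, so the
# angle it can resolve for a short movement is quantized, and more coarsely
# so at low DPI. Angle and distance are illustrative assumptions.
import math

TRUE_ANGLE_DEG = 3.0   # assumed shallow movement angle
MOVE_INCHES = 0.25     # assumed short correction movement

for dpi in (100, 400, 1600, 6400):
    counts_x = round(MOVE_INCHES * dpi * math.cos(math.radians(TRUE_ANGLE_DEG)))
    counts_y = round(MOVE_INCHES * dpi * math.sin(math.radians(TRUE_ANGLE_DEG)))
    reported = math.degrees(math.atan2(counts_y, counts_x))
    print(f"{dpi:>5} DPI: reported angle {reported:.2f} deg "
          f"(error {abs(reported - TRUE_ANGLE_DEG):.2f} deg)")
```

In a game those counts are scaled into a continuous view angle, so the error stays at whatever the counts allow; a 2D cursor then additionally snaps each position to a whole pixel, which is why the drift looks worse there than it would in a game.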