With mouse input we convert a physical distance to a virtual distance. To do this we normalise both to virtual units.
The "count" or "mickey" is a virtual unit defined by the DPI. At 400 DPI, every 1/400th of an inch of linear movement sends a count of 1 to the PC.
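As a minimal sketch of that relationship (the function name is my own, purely illustrative), converting a physical movement into counts is just a multiplication by the DPI:

```python
def inches_to_counts(inches: float, dpi: int) -> float:
    """Counts the mouse reports for a physical movement of `inches`."""
    return inches * dpi

# At 400 DPI, one inch of movement reports 400 counts;
# half an inch at 1600 DPI reports 800 counts.
```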
On the PC side we convert these virtual count units into output distance units: either pixels (for 2D cursor movement) or angles (for 3D camera rotation).
The angle in a game is defined by a "yaw" value; the Source engine, for example, defaults to 0.022 degrees. This means one count (1/400th of an inch at 400 DPI) of mouse movement turns the camera 0.022 degrees (the base yaw) in the game. These are our normalised input and output virtual distances for your hand motion.
The sensitivity of any function is the ratio between its output and its input, so we can treat it as a multiplier. With our normalised values, a sensitivity of 1 means an input of 1/400th of an inch on the mouse pad produces an output of 0.022 degrees; a sensitivity of 0.5 would produce 0.011 degrees for the same input distance.
This can always be calculated as outputAngle = yaw * sensitivityFunction() * counts.
Therefore, to solve for counts instead, we rearrange: counts = outputAngle / ( yaw * sensitivityFunction() ).
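The two formulas above can be sketched as a pair of helpers (the names are mine, not from any game; `sens` stands in for the return value of sensitivityFunction()):

```python
def output_angle(counts: float, yaw: float, sens: float) -> float:
    """Degrees turned in game: outputAngle = yaw * sens * counts."""
    return yaw * sens * counts

def counts_for_angle(angle: float, yaw: float, sens: float) -> float:
    """Counts needed to reach a target angle (inverse of output_angle):
    counts = outputAngle / (yaw * sens)."""
    return angle / (yaw * sens)
```

Feeding one function's result into the other should round-trip back to where you started, which is a handy sanity check.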
Example: a game has a yaw value of 0.014 degrees, a sensitivity of 1.2, and we're at 1600 DPI and want to turn 30 degrees. How many counts do we need to reach that position?
We have counts = 30 / ( 0.014 * 1.2 ) ≈ 1785.71.
To convert back to physical hand motion: 1785.71 / 1600 ≈ 1.12 inches on the mouse pad turns 30 degrees in the game.
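Putting the whole worked example together as a sketch (values taken from above; the centimetre conversion is just an extra convenience):

```python
yaw = 0.014    # degrees turned per count in this game
sens = 1.2     # linear sensitivity multiplier
dpi = 1600     # counts per inch of physical movement
target = 30.0  # desired turn in degrees

counts = target / (yaw * sens)   # ≈ 1785.71 counts
inches = counts / dpi            # ≈ 1.12 inches of hand movement
cm = inches * 2.54               # ≈ 2.83 cm, if you prefer metric
```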
You must always use the same angle units for both outputAngle and the yaw value, obviously. I use the term "sensitivityFunction()" because the value exposed to the user is not always linear as it is in Source; the multiplier is the return value of whatever formula a game uses to present sensitivity to the player. That formula is highly variable and completely arbitrary, which is one of the main reasons for this site's existence.