I'm having a hard time understanding how you can measure universal input delay with a frame count of 3. What if I'm running a game at 120 fps, 30 fps, or 15 fps? The idea that there is some kind of universal rule that input lag is always a minimum of 3 frames doesn't add up to me, since the same frame count corresponds to very different real-world delays at different frame rates.
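To show what I mean, here's a minimal sketch (my own illustration, not from any measurement tool) that converts a fixed frame count of lag into wall-clock time at a few frame rates; the "3 frames" figure is just the number under discussion:

```python
# Convert a fixed frame count of lag into milliseconds at various frame rates.
FRAME_LAG = 3  # the claimed "universal minimum", in frames

def lag_ms(frames: int, fps: float) -> float:
    """Wall-clock delay for a given number of frames at a given frame rate."""
    frame_time_ms = 1000.0 / fps  # duration of one frame in milliseconds
    return frames * frame_time_ms

for fps in (120, 60, 30, 15):
    print(f"{FRAME_LAG} frames at {fps:>3} fps = {lag_ms(FRAME_LAG, fps):.1f} ms")
```

At 120 fps those 3 frames are only 25 ms, but at 15 fps they become 200 ms, which is exactly why a fixed frame count seems like a strange universal measure of delay to me.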