Never, ever try to display a font on screen at an exact physical size by trusting the DPI value the system reports: it rarely matches the real pixel density of the monitor.
So my solution now is: display a 200-pixel bar and ask the user to measure its length on their monitor. From that measurement we know the size of one pixel. For visual acuity testing, the distance from the screen is also important.
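As a minimal sketch of that calibration step (the variable names, the prompts and the cm/mm units are my assumptions, not taken from the actual program):

    # Hypothetical calibration: the user measures a 200px bar with a ruler.
    BAR_LENGTH_PX = 200
    measured_mm = float(input("Measured length of the bar in mm: "))  # e.g. 53.0
    mm_per_pixel = measured_mm / BAR_LENGTH_PX                        # size of one pixel
    distance_cm = float(input("Distance from the screen in cm: "))    # e.g. 40.0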
The following Python code scales the letters of the font “Courier New Bold” so that they display at the correct size on screen:
    import math

    # Formula found online at ScienceDirect:
    # "Using a high-resolution display for screening visual acuity for near vision",
    # A. Hoffman, 1996
    # Gap size in mm that subtends one arcminute at the viewing distance
    # (self.db['distance'] appears to be stored in cm, hence the factor 10):
    gap = math.tan(math.pi/180.0 * (1.0 / 60.0)) * self.db['distance'] * 10
    # Size of one pixel in mm, from the measured length of the 200px bar:
    pixel_density = self.db['font_line'] / 200.0
    # Number of pixels needed to represent one gap:
    num_pixels = gap / pixel_density
    # Letter height: 5 gap-widths, the font's scale factor, points to pixels at 96dpi:
    height = round(num_pixels * 5.0 * 1.27 * 96.0/72.0)
To explain that a bit: the calculation is done for a visual acuity of 1.0, that is, normal sight. First the gap of a letter is calculated; the “E”, for instance, consists of two gaps and three bars, and at acuity 1.0 one gap subtends an angle of one arcminute at the viewing distance (hence the 1/60 of a degree in the formula).
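For example, at a viewing distance of 40cm (a value I picked for illustration), one gap works out to tan(1/60°) · 400mm ≈ 0.000291 · 400mm ≈ 0.12mm.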
Then the pixel density is calculated from the 200px line and its measured real length.
num_pixels is then the number of pixels necessary to represent one gap. Finally, the height of the letter (or, to be exact, the size to pass to d3d.Font) is calculated: the factor 5.0 because the “E” is two gaps and three bars tall, 1.27 is the scale factor I measured for Courier New Bold, and 96.0/72.0 converts between points (72 per inch) and pixels at 96dpi. This conversion is necessary because I measured the scale factor for the font at 96dpi, which is the resolution of my external screen and Windows' default setting; at that setting a 72pt font should measure 96 pixels, i.e. one inch, on screen. I actually did the measurements digitally using The GIMP; I should've done that at 72dpi to simplify the formula.
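Putting it all together, here is a small standalone sketch of the whole calculation (the function name and the sample values are mine, with the distance in cm and the measured bar length in mm as assumed above):

    import math

    def letter_height(distance_cm, bar_length_mm, scale_factor=1.27):
        # Font size for an optotype at visual acuity 1.0 (sketch, not the original code).
        gap = math.tan(math.pi/180.0 * (1.0 / 60.0)) * distance_cm * 10  # gap in mm
        pixel_density = bar_length_mm / 200.0   # mm per pixel from the 200px bar
        num_pixels = gap / pixel_density        # pixels needed for one gap
        return round(num_pixels * 5.0 * scale_factor * 96.0/72.0)

    # Example: 40cm viewing distance, 200px bar measured as 53mm
    print(letter_height(40.0, 53.0))  # -> 4 (one gap is only ~0.44px here)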
Whatever, it seems to work fine, at least on my machine.