The input voltage rarely swings exactly from 0 to the maximum, so the raw reading rarely maps cleanly onto 0.0 to 1.0. This is often due to mechanical limits on the control or knob travel.
Hard-coding the calibration values is bad practice: they may drift over time and need re-tuning. Calibration should be an inherent part of the analog input's properties.
As a minimum, every analog input should have min/max calibration, i.e. the raw readings that correspond to the achievable extremes. These extremes would map to the normalised 0.0 and 1.0 values returned by hw_adc_input_read(). Note that inverted inputs are not uncommon, so the "minimum" must be allowed to be greater than the "maximum" and still map to 0.0.
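A minimal sketch of such a calibration record and the normalisation it implies. The struct and function names here are illustrative, not the project's actual API; the point is that one formula handles both normal and inverted inputs, because the sign of the span flips along with the min/max ordering:

```c
#include <assert.h>
#include <math.h>

/* Hypothetical per-input calibration: the raw readings that should map
   to the normalised extremes. raw_min may be GREATER than raw_max for
   an inverted input. */
typedef struct {
    int raw_min;   /* raw reading that maps to 0.0 */
    int raw_max;   /* raw reading that maps to 1.0 */
} adc_cal_t;

/* Map a raw ADC reading to 0.0..1.0, clamped. The same expression
   works for inverted inputs because the span is then negative. */
static double adc_normalise(const adc_cal_t *cal, int raw)
{
    double span = (double)(cal->raw_max - cal->raw_min);
    if (span == 0.0)
        return 0.0;                    /* degenerate calibration */
    double v = ((double)raw - (double)cal->raw_min) / span;
    if (v < 0.0) v = 0.0;
    if (v > 1.0) v = 1.0;
    return v;
}
```

With `{100, 900}` a raw 500 normalises to 0.5; with the inverted `{900, 100}` a raw 900 normalises to 0.0 and a raw 100 to 1.0.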
Ideally, there would also be an interpolation table, expressed in these normalised values, because analog inputs often need a non-linear response. But that's a "nice to have".
Currently, I have to code such properties and the translation manually for each hardware input. My proposal is to make them part of the analog input itself.
A big question is how to find the raw min/max. Currently, I have to add debug code that prints the raw values. In a proper implementation, I see two possible ways:
- Make a "test" Hardware function combined with a small instrument that displays the received value on the panel. One would use it temporarily to check the input and transfer the obtained values to the calibration fields of the real input.
- Ideally, have an additional "Calibrate" button (like for joysticks in Windows). It would ask to swing the input to one end and the other, and record the extremes. Ideally, these extremes should just be put in the normal user-editable fields: it may be helpful to fine-tune them, and just to know them.
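The "Calibrate" flow above could be backed by a tiny extreme-tracker: feed it every raw reading while the user swings the control from end to end, then copy the recorded extremes into the ordinary user-editable fields. This is only a sketch with invented names, not the project's API:

```c
#include <assert.h>

/* Hypothetical capture state for a "Calibrate" session. */
typedef struct {
    int raw_lo;   /* lowest raw reading seen so far */
    int raw_hi;   /* highest raw reading seen so far */
    int seen;     /* nonzero once at least one sample was fed */
} adc_cal_capture_t;

/* Start a fresh calibration session. */
static void cal_capture_reset(adc_cal_capture_t *c)
{
    c->seen = 0;
}

/* Feed one raw sample; extremes are updated as the user swings the
   control from one end to the other. */
static void cal_capture_feed(adc_cal_capture_t *c, int raw)
{
    if (!c->seen) {
        c->raw_lo = c->raw_hi = raw;
        c->seen = 1;
        return;
    }
    if (raw < c->raw_lo) c->raw_lo = raw;
    if (raw > c->raw_hi) c->raw_hi = raw;
}
```

Note that the capture alone cannot tell which end is which: for an inverted input the user (or a confirmation step in the dialog) would swap the recorded low/high before storing them as min/max. Keeping the results in plain editable fields, as proposed, makes that swap and any fine-tuning trivial.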