When you measure a voltage, how do you know that your measurement is correct? Because your multimeter says so, of course! But how can you trust your multimeter to give the right reading? Instrument calibration is something we often trust blindly without giving it much thought, but it’s not always a task reserved for a high-end test lab. [Petteri Aimonen] had enough need for a calibrated current source to design his own, and he’s shared the resulting project for all to see.
The cost of a reference source goes up with the degree of accuracy required, and can stretch into the many millions of dollars if you are seeking the standards of a national metrology institute, but fortunately [Petteri]’s requirements were considerably more modest: 0.02% accuracy would suffice. An Analog Devices precision voltage reference drives a low-offset op-amp with a driver transistor, supplying current through a 0.01% precision resistor and resulting in a reference current source fit for his needs. The reference is available in a range of voltages; his chosen 2.048 volts gives a 20.48 mA current sink with a 100 ohm resistor.
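Ohm’s law makes the sums easy to check, and a quick error budget shows why that 0.01% resistor matters. Here’s a minimal Python sketch of the arithmetic; the resistor tolerance and target figures come from the article, while the reference tolerance and op-amp offset numbers are illustrative assumptions rather than values from [Petteri]’s actual design:

```python
# Back-of-the-envelope check of the current sink's set point and error
# budget. The 0.01% resistor tolerance is from the article; the
# reference-accuracy and offset figures below are assumptions for
# illustration only.

V_REF = 2.048    # reference voltage, volts
R_SENSE = 100.0  # precision sense resistor, ohms (0.01% part)

# Nominal output current by Ohm's law: I = V / R
i_nominal = V_REF / R_SENSE
print(f"Nominal current: {i_nominal * 1000:.2f} mA")  # 20.48 mA

# Worst-case relative error, summing the main contributors linearly.
# Tolerances are expressed as fractions (0.01% -> 0.0001).
ref_tol = 0.0001             # assumed initial accuracy of the reference
r_tol = 0.0001               # 0.01% sense resistor, per the article
offset_err = 25e-6 / V_REF   # assumed 25 uV op-amp offset, as a fraction

total_err = ref_tol + r_tol + offset_err
print(f"Worst-case error: {total_err * 100:.3f} %")
print(f"Current range: {i_nominal * (1 - total_err) * 1000:.3f} "
      f"to {i_nominal * (1 + total_err) * 1000:.3f} mA")
```

With those assumed numbers the worst case lands just over 0.02%, which shows how little headroom the design has once the reference and the resistor each eat 0.01% of the budget.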
In a way it is a miracle of technology that even the cheapest digital multimeter on the market can have a surprisingly good level of calibration thanks to its on-chip bandgap voltage reference, but it never hurts to have a means of checking your instruments. Some of us still rather like analogue multimeters, but beware: calibration at the cheaper end of that market can sometimes be lacking.