05-06-2011, 10:38 AM
A battery that reads 1.2V on my digital multimeter reads around 0.75V on the VTVM - and seems to slowly climb to 0.8V. The value is even less consistent if I change to the 15V scale.
I can adjust the zero, but that won't account for a difference that large. Can such a difference be caused by the current draw of the VTVM compared to the DVM? Maybe the battery really does read lower when connected to the VTVM... To be honest, I don't really have many test subjects to verify the VTVM's readings against.
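A quick sanity check on the loading idea (the resistance values below are assumptions - a typical VTVM's DC input resistance of around 11 megohms and a guess at the battery's internal resistance, not anything I've measured):

```python
# Back-of-the-envelope loading estimate: the meter's input resistance
# forms a voltage divider with the battery's internal resistance.
def loaded_reading(emf, r_internal, r_meter):
    """Voltage the meter sees once its input resistance loads the source."""
    return emf * r_meter / (r_meter + r_internal)

emf = 1.2          # battery open-circuit voltage (V), from the DVM
r_internal = 0.5   # assumed battery internal resistance (ohms)
r_vtvm = 11e6      # typical VTVM DC input resistance (ohms), assumed

reading = loaded_reading(emf, r_internal, r_vtvm)
print(f"Expected VTVM reading: {reading:.6f} V")
print(f"Loading error: {(emf - reading) * 1e6:.3f} uV")
```

With numbers anywhere in that ballpark, the loading error comes out in the sub-microvolt range, so meter loading alone can't explain a 0.45 V difference - which is why I suspect the instrument itself.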
Also, can it be caused by one of the tubes? What's the usage of the tubes in those voltmeters?
I would have liked to be able to easily tune it in place... I already have a voltmeter. The VTVM matches my "brand new" Heathkit scope, but otherwise it might not be what I use the most, so I don't really feel like doing maintenance on it - at least until I have the radio fully restored. (Maybe I need a vacation, to keep up with all the "projects" I have?)
-Mars