I have a PCSU1000 PC Scope with PCLab2000 version 3.08
In measuring DC voltages, I’ve found that when using the 10X probe, I can be off by as much as 10% when compared to my digital voltmeter (brand new EXTECH 570) on a “known” voltage source. The 1X probe is better but is still off by a bit (no pun intended).
I’ve calibrated the scope multiple times to no avail. Is there a trim pot or other adjustment inside the scope I can tweak? Is this within spec? 10% is quite a bit.
If you select GND coupling after the calibration, the trace should show a maximum offset of +/-1 to 2 bits on the screen.
The DC value readout shows the corresponding voltage offset (e.g. 0.6V to 1.2V on the 20V/div range).
The offset is still less than 1% of the full scale (160V).
This offset may nevertheless be the reason for the 10% error in the DC reading, if a rather low voltage is measured on a 'big' range.
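To put numbers on that, here is a quick Python sketch, assuming an 8-bit converter spanning 8 divisions (the 20V/div range and the 12V reading are just example values):

```python
# Sketch: how a fixed 1-2 bit offset becomes a large relative error
# on a small reading. Assumes an 8-bit ADC across 8 vertical divisions
# (256 counts full scale); range and reading are examples only.

volts_per_div = 20.0            # example: 20 V/div range
full_scale = 8 * volts_per_div  # 160 V across the screen
lsb = full_scale / 256          # one ADC count = 0.625 V

for bits in (1, 2):
    offset = bits * lsb
    print(f"{bits}-bit offset = {offset:.2f} V "
          f"({offset / full_scale:.2%} of full scale)")

# Measuring a low voltage on this 'big' range:
reading = 12.0                  # example DC reading in volts
print(f"2-bit offset on a {reading:.0f} V reading: "
      f"{2 * lsb / reading:.1%} error")
```

A 2-bit offset at 20V/div is about 1.25V, which is roughly 10% of a 12V reading, so the effect is real on big ranges.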
There is no separate adjustment in the scope for the 10X probe.
What voltage were you recording, and at what input sensitivity? It's important to have at least a 6 division deflection when making voltage measurements on a DSO; less than that will greatly affect the accuracy.
I performed a similar exercise using my PCSU1000. The input was 11.62V from a somewhat discharged sealed lead-acid battery; this voltage was monitored throughout the tests via an Omega HHM2 4000-count meter, and it held steady during all testing.
Using my LeCroy WaveJet 322 (and a compensation-adjusted Velleman 60 MHz probe) I verified that the measured voltage in both x1 and x10 modes was indeed 11.6V (the LeCroy sensibly uses only 3 significant digits). The vertical sensitivity was set to 2V/div to create a nearly 6 division deflection.
Moving to the PCSU1000, I observed a measurement of 11.75V, a +1.1% deviation. Selecting the x10 position on the probe and 0.2V/div on the 'scope, I observed a measurement of 12.00V, a +3.3% error. Not as large as your 10% deviation, but considerably more than with the x1 position.
I feel both of these are within the range of error I would expect from a relatively inexpensive (vs. the LeCroy, a $3600 instrument) PC-based device, and I suspect that the reason for the difference lies in the relative accuracy of the PCSU1000's individual "front-end" voltage attenuators.
Because oscilloscopes are not generally applied, or suited, to use as DC voltmeters, I next performed the same tests using a 7.00V peak-to-peak (7 division, verified using the WaveJet) sine wave at 1kHz. The PCSU1000 reported 7.00V pk-pk with the probe set to x1 and 7.13V pk-pk in the x10 position, using vertical sensitivity settings of 1V/div and 0.1V/div respectively. These values are in error by 0.0% (x1) and 1.9% (x10), again what I would expect in this price range.
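For what it's worth, here is a trivial sketch reproducing the deviation figures quoted above; it is nothing more than (measured - reference) / reference:

```python
# Sketch: percent deviations for the DC and pk-pk tests described above.

tests = {
    "DC, x1 probe  (2 V/div)":   (11.75, 11.62),
    "DC, x10 probe (0.2 V/div)": (12.00, 11.62),
    "AC pk-pk, x1  (1 V/div)":   (7.00, 7.00),
    "AC pk-pk, x10 (0.1 V/div)": (7.13, 7.00),
}

for name, (measured, reference) in tests.items():
    error = (measured - reference) / reference
    print(f"{name}: {error:+.1%}")
```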
There are two trim pots inside the PCSU1000; however, they control the upper reference voltage for the ADC. Changing those settings would affect all input sensitivities and would not change the relationship of the individual attenuators' accuracy.
The voltage was 10.043 VDC (by my DVM) at 2 volts per division, and thanks for the reminder about accuracy and too low a sensitivity.
I appreciate the replies and suggestions. I will investigate this further. Somehow, I’m thinking it’s got to be me and my unfamiliarity with the scope. I’d be happy as could be with 1% error. The 10% is disturbing…
Thanks also for the warning about using the scope as a volt meter. I’ll post my findings later tonight I hope.
Keep in mind that the PCSU1000 uses 8-bit analog-to-digital conversion, like most oscilloscopes, making it no match for an instrument like the Extech MM570 (of which I am envious). The use of 8-bit converters is also why LeCroy does not report silly levels of precision in its measurement system, and limits displayed values to 3 significant digits.
With 8 bits, even if you had a full-scale (8 division) signal and a perfect world, the best accuracy you could achieve would be 1/256, or 0.39%. If you use only 5 divisions, as with your 10.043V source, then in a perfect world it becomes 1/160, or 0.63%.
In the real world you need to toss out the least significant bit, and these numbers become 1/128 (0.78%) at full scale and 1/80 (1.25%) at a 5 division deflection.
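Here is the same arithmetic as a small sketch (assuming 256 counts across 8 divisions, and that discarding the LSB halves the usable counts):

```python
# Sketch: quantization-limited resolution of an 8-bit scope.
# A signal spanning N divisions occupies N/8 of the 256 counts,
# so one count is 1/(256 * N/8) of the signal.

def resolution(divisions, counts=256, screen_divs=8):
    """One-count resolution as a fraction of a signal spanning `divisions`."""
    return 1.0 / (counts * divisions / screen_divs)

for divs in (8, 5):
    ideal = resolution(divs)             # perfect-world figure
    real = resolution(divs, counts=128)  # least significant bit tossed
    print(f"{divs} divisions: ideal {ideal:.3%}, LSB discarded {real:.3%}")
```

This prints 0.391% and 0.781% for 8 divisions, and 0.625% and 1.250% for 5 divisions, matching the figures above.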
These are the hard numbers behind the promised vertical gain accuracy spec of my LeCroy WaveJet 322 'scope: +/-(1.5% + 0.5% of full scale), which is the same spec as their $60k+ WaveMaster series 'scopes. The "+0.5% of full scale" part accommodates tossing out the least significant bit, rounded up to a guaranteed performance number (they also state that typical accuracy is +/-1.0%).
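A quick worked instance of that spec (reading the 1.5% term as percent of reading, which is my assumption; the 2V/div setting and 11.6V signal are just example values):

```python
# Sketch: worst-case error from a spec of +/-(1.5% of reading
# + 0.5% of full scale), at an example setting of 2 V/div
# (8 divisions = 16 V full scale) and an 11.6 V reading.

reading = 11.6
full_scale = 8 * 2.0   # 2 V/div example
worst_case = 0.015 * reading + 0.005 * full_scale
print(f"Worst-case error: +/-{worst_case:.2f} V "
      f"(+/-{worst_case / reading:.1%} of reading)")
```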
The remaining factor that makes the 1% to 3% error you and I have observed with the PCSU1000 reasonable comes down quite simply to cost. It costs quite a bit of money to produce an accurate and closely matched switched attenuator for an oscilloscope's vertical inputs (and then, for a dual-channel scope, to make two of them that are also accurate and closely matched to each other). In the oscilloscope world this is what separates the "men from the boys".
If you look at the specs for a typical lower-end DSO (like Owon's PDS series), you'll see DC gain specs in the +/-3.0% to +/-5.0% range. The difference between instruments like the WaveJet and the PCSU1000 (and Owon, Rigol, Instek, and even Tek's absurdly priced TDS1000 and TDS2000 series) lies entirely in the accuracy of the input attenuators and, maybe to a far lesser extent, ADC quality. Once things get into the digital domain you'd have to really screw up to introduce error.
I apologise for being so verbose; the problem is I like analyzing and talking about this stuff…