PCSU1000 feedback and feature suggestions

I recently acquired your company’s PCSU1000 oscilloscope (through Jameco) and I am currently putting it through its paces. So far my impression is quite favorable. There are, however, a few details in the hardware and the included scope/spectrum analyzer software that I believe could be improved with little effort:


(1) I am missing a way to average over several waveforms in the oscilloscope display (interestingly enough, the spectrum analyzer mode does offer averaging). The scope’s intrinsic noise is remarkably low, but nevertheless there are many situations where one could benefit from removal of random noise.
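To illustrate what I mean by averaging, here is a rough Python sketch (the function name and data layout are my own, of course not part of the scope software). Acquisitions are assumed to be trigger-aligned arrays of equal length:

```python
def average_waveforms(waveforms):
    """Point-by-point average of trigger-aligned acquisitions.

    Uncorrelated random noise drops by roughly sqrt(N) for N averages,
    while the repetitive signal itself is preserved.
    """
    n = len(waveforms)
    length = len(waveforms[0])
    return [sum(w[i] for w in waveforms) / n for i in range(length)]

# Two noisy acquisitions of the same (flat) signal:
print(average_waveforms([[0.0, 2.0], [2.0, 0.0]]))  # -> [1.0, 1.0]
```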

(2) It would be helpful to have a “store to reference” function that stores the current waveform (it should be possible to turn the reference traces on and off like the CH1 and CH2 waveforms). That would simplify the search for small changes, since they would be easily visible as mismatches between the stored reference and the current waveform.
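A sketch of the kind of reference comparison I have in mind (the function names are my own, hypothetical, not part of the scope software):

```python
def store_reference(current):
    """Freeze a copy of the live trace to use as the reference."""
    return list(current)

def deviation_from_reference(current, reference):
    """Point-by-point mismatch between the live trace and the stored one."""
    return [c - r for c, r in zip(current, reference)]

ref = store_reference([1.0, 1.0, 1.0])
print(deviation_from_reference([1.0, 2.0, 3.0], ref))  # -> [0.0, 1.0, 2.0]
```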

(3) In addition to the existing measurements, RMS noise, peak-to-peak noise, RMS jitter, and peak-to-peak jitter would be nice to have.
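The statistics themselves are simple; a sketch (my own helpers, just to show what I mean — the same functions cover noise when fed voltage samples and jitter when fed edge-timing samples):

```python
import math

def rms(samples):
    """RMS deviation from the mean (noise in volts, or jitter in seconds
    when the samples are measured edge times)."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

def peak_to_peak(samples):
    """Full excursion of the samples."""
    return max(samples) - min(samples)

print(rms([1.0, -1.0, 1.0, -1.0]))          # -> 1.0
print(peak_to_peak([1.0, -1.0, 1.0, -1.0]))  # -> 2.0
```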

(4) It would be great to have the option of manually setting the reference levels (high/mid/low) used in the measurements (right now 10%/50%/90% are hard-coded). At the very least, let the user switch between 10%/90% and 20%/80% for the rise time thresholds.
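For what it’s worth, configurable thresholds look like a small change in principle; a sketch of a rise-time measurement with selectable levels (my own hypothetical helper, not the scope’s actual code):

```python
def rise_time(t, v, lo_frac=0.1, hi_frac=0.9):
    """Time for v to rise from lo_frac to hi_frac of its full swing,
    with linear interpolation between samples."""
    vmin, vmax = min(v), max(v)
    lo = vmin + lo_frac * (vmax - vmin)
    hi = vmin + hi_frac * (vmax - vmin)

    def crossing(level):
        # First rising crossing of 'level', linearly interpolated.
        for i in range(1, len(v)):
            if v[i - 1] < level <= v[i]:
                frac = (level - v[i - 1]) / (v[i] - v[i - 1])
                return t[i - 1] + frac * (t[i] - t[i - 1])
        raise ValueError("level never crossed")

    return crossing(hi) - crossing(lo)
```

On the same record, `rise_time(t, v)` gives the 10%/90% value and `rise_time(t, v, 0.2, 0.8)` the 20%/80% one.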

(5) Sometimes it would help to vary the pre-/post-trigger ratio (at the moment the trigger point seems hard-coded to about 1/3 of the waveform record), e.g. by allowing a pre-trigger of 10%, 50%, or 90%.

As for the hardware, two things were less than optimal:

(1) The absolute levels are sometimes significantly off - e.g. when probing ground, the scope showed up to -0.4V in the 1V/div range. This was after scope calibration, and I even recalibrated to be on the safe side (and yes, the probe ground lead WAS attached to a nearby system ground point). Could this be an issue with scope ground drift? (The system under test had no direct connection to the computer ground except through the scope ground lead.)

(2) Probing a fast-rising edge (<2ns) from a microcontroller (standard 5V CMOS output) showed close to 50% overshoot, which does not appear when I look at the same edge on a high-performance scope (Tektronix TDS 692, 3 GHz / 10 GS/s, with 4 GHz probes). The overshoot persists even in oversampling mode and cannot be explained by interpolation artifacts (the Gibbs phenomenon would be limited to <17%). I used the included probes in 10x mode, and they were compensated prior to the measurement. Is this a known issue with the probes?
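For context on the Gibbs figure: the single-sided overshoot of a truncated Fourier reconstruction of an ideal step is about 9% of the step size, which a quick numeric check confirms (my own sketch, unrelated to the scope software):

```python
import math

def square_partial_sum(x, n_terms):
    """Fourier partial sum of a unit square wave (odd harmonics only)."""
    return (4.0 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms))

# Scan the region just after the step transition for the ringing peak.
peak = max(square_partial_sum(i / 10000.0, 100) for i in range(1, 5000))
overshoot_pct = (peak - 1.0) / 2.0 * 100.0  # percent of the full -1 -> +1 step
print(overshoot_pct)  # roughly 9
```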

Is there a way to get automatically informed about new releases of the software? (the About screen says it’s version 1.0 so I assume there will be updates).

Thanks and keep up the good work


I’ve seen this requested in another post and would like to second it - please allow the spectrum analyzer to function in equivalent-time sampling mode. I understand that in this case having the trigger on is mandatory.


Thank you for your comments and suggestions to improve the oscilloscope/spectrum analyzer software. Some of the modifications (1 and 4) are indeed quite easy to implement. The pre-trigger position is hard-coded in the hardware and is difficult to change. The reference traces on the screen should indeed be helpful…
Your suggestions will be considered when the next release is developed.

About the hardware issues: Is there 0.4V DC offset even if you disconnect the probe GND from the external circuitry?

The overshoot you reported seems rather high. It is difficult to say what the reason might be. Was the equipment under test galvanically connected to mains ground, and was the computer connected to a grounded power outlet? This would create a ground loop that can cause such problems.

About the software update:
Sorry, no automatic update (yet).
Please check the Velleman Downloads site for the updates:


first, thanks for your fast response!

I downloaded and installed the latest version of the scope software (the About window now says it’s V1.06).

I installed the new version directly on top of the old one - is this a problem? (the installer did not give any errors or warnings)

I re-calibrated the unit (all inputs open) - it passed without error. I unplugged and re-plugged the scope, restarted the computer, and restarted the scope software. So I believe I started from a very clean slate.


  • Autoset no longer works. My test signal is a nice, clean 100 kHz sine wave, approx. 2V amplitude, centered around zero. Autoset does not find it on either CH1 or CH2; instead it goes to 100ms/div and 5mV/div. (I have no problem setting up the scope manually to show the signal, so it definitely is a problem with the autoset.)

  • On CH1, when I set the sensitivity first to 50mV/div and then to 20mV/div (or from 0.5V/div to 0.2V/div), about one time out of four the signal on the screen oscillates, even though nothing is attached to CH1. Setting a different sensitivity and then returning to 20mV/div fixes the problem. I did not observe this when coming from any other setting (e.g. 10mV/div); it is specific to the 50mV/div -> 20mV/div and 0.5V/div -> 0.2V/div transitions. The oscillation shows a peak at around 18 MHz (although I can’t tell whether it really is a higher frequency that got aliased by the 50 MHz sampling rate). The amplitude is about 0.5 divisions (i.e. 10mV) peak-to-peak. No other vertical setting shows this behavior. CH2 has a similar issue but with much smaller oscillation amplitude. The trigger setting (on or off) has no influence.
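Regarding the possible aliasing: at a 50 MHz sample rate, any real frequency folds into the 0–25 MHz band. A quick sketch of the folding arithmetic (my own helper, just to show which real frequencies would land at an apparent 18 MHz):

```python
def alias_frequency(f_signal, f_sample):
    """Apparent frequency of a tone at f_signal when sampled at f_sample."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# 18 MHz itself, but also e.g. 32 MHz and 68 MHz, all fold to 18 MHz at 50 MS/s:
for f in (18e6, 32e6, 68e6):
    print(alias_frequency(f, 50e6))  # -> 18000000.0 each time
```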

Have you ever observed such behavior?


Thank you for your comments.
The Autoset problem will be fixed soon in the next release of the software.
No oscillation has been noted so far. We’ll do some more testing.