Ok, I realise that the VM110/K8055 is described as a USB ‘Experimenter’ interface, but I have spent an entire day testing one of these cards and have found a problem which essentially renders the ADC useless for anything other than very rough approximations.
I have several of both types of card…K8055 and VM110. I have one rigged up as a datalogger, measuring two temperatures and plotting graphs of temperature variations over time.
Now, I have tried all six of my cards in the system and, without fail, for any given voltage on the two analogue inputs the readings from each of the cards are pretty consistent.
Today I tried out a different system (a Tronisoft DACIO300, with eight channels of 10-bit ADC), rewrote my logging program to work with the new card, and it worked superbly. Except…all my temperatures read lower than I expected.
I checked my maths, and that wasn't the issue. I checked my electronics. No problems. I then decided to check the VM110 card itself. I used the pot on the card as my voltage source, adjusting it so that the readings (in the VM110 test software) went up in steps of 10. I calculated the voltage each setting should represent, then measured the actual voltages on pins 1 and 7 of the op-amps, and that's where I found the issue.
When registering 255 on the ADC, the input voltage to the PIC was only 4.87v instead of the expected 5v; the supply voltage to the PIC actually measures 5.02v. With a value of 127 (half scale) I would expect to see 2.5v on the input with a voltmeter. I measured 2.34v.
I then plotted a graph of values between 0 and 255 against measured input voltage, calculated input voltage and the difference between the two. At first glance the slopes of all three graphs looked linear. Looking closer and calculating the percentage error at each point, I discovered that the 'error', or inaccuracy, isn't linear. That is to say, a simple scale factor will not solve the problem of incorrect values! When the ADC reads 0 (input shorted to Gnd), the error is zero. At 255, with the voltage calculated at 5v but measured at 4.87v, the error is 2.67%. If the error were linear across the whole ADC range, scaling up or down would be no problem. However, at 200 (calculated voltage 3.922v, measured voltage 3.690v) the error is a whopping 6.28%! In my temperature-sensing application, the measured voltage of 3.690v less the 2.73v Kelvin offset of the sensor gives 0.96v, or a true temperature of 96 Centigrade, while calculating the temperature from the VM110's value of 200 gives 119.2C. A degree or so either way is good enough; this is poor to say the least.
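If anyone wants to check my figures, the sums are simple enough. Here's a quick Python sketch (the names are mine, just for illustration) that reproduces the error figures above, give or take rounding:

```python
VDD = 5.02  # measured supply voltage, which the PIC uses as its ADC reference

def expected_volts(code):
    # Input voltage implied by an 8-bit, Vdd-referenced conversion
    return VDD / 256 * code

# (ADC code, voltage actually measured at the op-amp) from my bench tests
measurements = [(255, 4.87), (200, 3.690)]

for code, measured in measurements:
    calc = expected_volts(code)
    error = (calc - measured) / measured * 100
    print(f"code {code}: calculated {calc:.3f}v, "
          f"measured {measured:.3f}v, error {error:.2f}%")
```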
127, or half scale, should be measuring 2.5v. The actual input voltage needed to get that reading is about 2.38v, an error of around 5%.
I could understand the calculated and measured values being a little out (and scaleable mathematically) if Vref for the ADC weren't Vdd but some other voltage or an external reference, but this unit does indeed appear to use Vdd as the analogue reference. If anything, my Vdd is a little over 5v, so I would expect the conversions to come out lower than calculated, rather than higher.
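To put a number on that: inverting the conversion, DataByte = Volts * 256 / Vref, so a 2.5v input against a 5.02v reference should give 2.5 * 256 / 5.02, or about 127.5. A slightly high Vdd pulls the result down by half a count at most; it cannot explain a board that reads 127 with only 2.34v on the pin.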
Anyway, the ADC is non-linear, has a worst-case error of over 6% at a value of 200, and to be honest…is pretty much useless. Certainly useless for measuring the output of something like an LM335Z temperature sensor. In fact, using the VM110/K8055, the temperatures I'm logging all come out between 13C and 20C too high, based on the digital data being read in. Yes, I could change the maths…use a value of 4.87 as Vref instead of 5.02 in my calculations:
Volts = (Vref / 256) * DataByte
C = (Volts - 2.73) * 100
But that just changes the slope of the graph. It doesn’t make up for the non-linearity.
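You can see this from my measured points alone. Here's the same sum done with both reference values (again a rough Python sketch, using the voltages quoted above):

```python
# Rescaling with Vref = 4.87 fixes the top of the range by definition,
# but the mid-range stays several percent out: the error is not a simple slope.
measurements = [(255, 4.87), (200, 3.690)]

for vref in (5.02, 4.87):
    print(f"Vref = {vref}v")
    for code, measured in measurements:
        calc = vref / 256 * code
        error = (calc - measured) / measured * 100
        print(f"  code {code}: calculated {calc:.3f}v, error {error:+.2f}%")
```

The top of the range comes right, by definition, but the 200 point is still over 3% adrift.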
Switching to a different ADC (the DACIO300), I can read the voltage from the LM335 with a DVM and, at 10mV/K, calculate the temperature from that; it compares accurately to my thermometer. The ADC's digital data, when I generate a calculated temperature from it, is within half a degree. Measuring my LM335 voltage tells me it's sat at 63C. The thermometer says 63C. The VM110 thinks it's 83C! The 10-bit ADC on the Tronisoft board gives me a reading of 64C. Now that's close enough in my book.
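For comparison, the conversion on the 10-bit side is the same arithmetic with 1024 steps. A sketch of the sum my rewritten logger does (the 5v reference and the 690-count reading here are illustrative figures, not measurements):

```python
VREF = 5.0  # assumed reference voltage, for illustration only

def lm335_temp_c(adc_value, bits=10):
    # LM335 outputs 10mV/K, so subtract the 2.73v (273K) offset
    volts = VREF / (1 << bits) * adc_value
    return (volts - 2.73) * 100

print(f"{lm335_temp_c(690):.1f}C")  # ~690 counts works out to roughly 64C
```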
Just for gag value, I then paralleled up the VM110 and DACIO inputs…and tracked their digital data against identical analogue voltages going into the ADCs. The DACIO stayed within +/- 1 bit of calculated versus measured voltage. The VM110 was as previously measured: very poor by comparison. If it were just a faulty ADC on one PIC chip on one particular board, or even a dodgy op-amp (I swapped several; no difference), I could accept it, but I have found exactly the same issues with all the Velleman boards I have tested.
Use it for fun, folks. Use it to demonstrate a principle. Use it to give an approximation. Just don’t expect any accuracy, linearity or close relationship between voltage in and digital value out.