I made the following observation and just want to make sure I understand it correctly.
I wired output D1 to input I1 so that I can do some response-time testing, then ran the following program:
struct timeval start, end;
int i, numInputs;

OpenDevice(0);
for (i = 0; i < 20; i++) {
    SetDigitalChannel(1);
    gettimeofday(&start, NULL);
    numInputs = 0;
    while (1) {
        numInputs++;
        if (ReadDigitalChannel(1) == 1)
            break;
    }
    gettimeofday(&end, NULL);
    ClearDigitalChannel(1);
    printf("<ms elapsed and numInputs>");
    Sleep(1000);
}
CloseDevice();
The gettimeofday() function reports the current time with microsecond resolution on some operating systems; on others the resolution is only milliseconds.
The program runs on a Core i5 under Windows 7 Pro 64-bit and produces very consistent, repeatable results.
What I am seeing is that the measured time between start and end is about 12 milliseconds, but the message count (numInputs) is consistently 34. It only decreases if I drop the Sleep() down to a few hundred milliseconds.
The K8055 is detected as an HID, and I read on MSDN that HIDs have a receive ring buffer intended to prevent the loss of reports (messages); its default size is 32 messages. So what I believe is happening (correct me if I'm wrong) is this: during the Sleep(), the K8055 keeps sending reports that fill the ring buffer. When my program then sets the output high and starts reading, the first 33 messages it gets are essentially the full ring buffer drained as fast as possible, and only the last message, the one just sent by the K8055, reflects the incoming signal.
The first time through the loop, the K8055 responds within numInputs = 2, so right after opening the device the ring buffer is empty. The input misses the signal on the first report after setting the output, so the 34 breaks down as the full ring buffer (32), one miss, and the final hit: 32 + 1 + 1 = 34, which makes sense.
If all that is correct, then a program that keeps the device open and only does one or two Read…() calls per second is always seeing data that is roughly 300 ms old: the oldest message in the ring buffer. To actually see fresh, current data, the program has to read from the card continuously.
Anything wrong with that analysis?