BinaryReader Flush Function

Jul 1, 2013 at 9:05 PM
Hi,
I'm using a BinaryReader to read/write the data stream from/to a control system at the other end of a phone modem. The data is binary and the messaging is command/response: I send a command and get a response. Everything works fine until, for one reason or another, the stream gets out of synchronization and I get back either too little or too much data.

I resolve this regularly with serial streams by using a Flush function that lets me start over without losing the current connection. For example:

private static SerialPort _SerialPort;

// Drain anything left in the receive buffer; falls straight through if it's empty.
byte[] buffer = new byte[CommConstants.COMM_MAX_MESSAGE];
while (_SerialPort.BytesToRead > 0)
{
  _SerialPort.Read(buffer, 0, CommConstants.COMM_MAX_MESSAGE);
}

Flush needs to read and throw away any extraneous data so that when the next command is sent, the data in the stream belongs only to that command and not to the previous one. It also needs to do nothing if there's nothing to read.

I've tried the underlying Flush function of the BaseStream, but it doesn't seem to work. I've also tried using PeekChar and the length properties of the BaseStream, but I haven't landed on the right approach yet.
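
Conceptually, I'm after something like the sketch below: drain whatever happens to be sitting in the receive buffer and return immediately if nothing is there. This is only a rough sketch and assumes the BinaryReader wraps a NetworkStream (so DataAvailable exists); _Reader and _NetStream are placeholder names.

private static BinaryReader _Reader;      // assumed to wrap _NetStream
private static NetworkStream _NetStream;  // the underlying stream (assumption)

// Throw away any leftover bytes from the previous command/response;
// does nothing if the stream is already empty.
private static void FlushReader()
{
  byte[] discard = new byte[CommConstants.COMM_MAX_MESSAGE];
  while (_NetStream.DataAvailable)
  {
    _NetStream.Read(discard, 0, discard.Length);
  }
}

If the base stream isn't a NetworkStream, the same idea should still work by setting a short ReadTimeout on the stream and reading until the read times out.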

Any ideas out there?
Thanks,
Paul
Jul 18, 2013 at 6:40 PM
To answer my own question: most of the problems I've been having with the read buffers don't seem to be related to the actual data stream. They appear to be related to issues discussed in a later post, where dropping and then redialing a call too quickly results in strange buffer behavior. I've settled on a Flush function that calls BinaryReader.BaseStream.Flush(), which may or may not actually work when there is "real" data to be flushed. For now, at least it doesn't crash when there is no data to flush...
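
For reference, the "flush" now amounts to the single call below. One caveat (and part of why I say it may or may not work): on some stream types, NetworkStream for example, Flush is documented as having no effect, which would explain why it never crashes but also may not discard anything. _Reader here is just the BinaryReader instance.

// Delegate to the base stream; on unbuffered streams this is a no-op.
_Reader.BaseStream.Flush();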

Paul