The program I'm working on in C# (.NET Framework 2.0) calls for the ability to switch over to a 'remote mode' and send ASCII data to another screen via Bluetooth. I'll start by saying I'm not a very experienced programmer, and I know nothing about networking, but after fooling around with the SerialPort class yesterday I was able to work up a little chat program that worked nicely between two Bluetooth-connected devices.
The chat program, however, only sent data when the user hit a button to "send" it. If the two devices weren't properly connected, I just threw a TimeoutException along with an error message. The program I'm working on now is much larger, and it tries to write data constantly as long as it has the COM port open.
That means if the two devices aren't immediately connected, it has to throw a TimeoutException, and it will continue to throw it, again and again, until they ARE properly connected. That's totally unacceptable. It slows the program down to the point where it isn't usable, and it litters the Debug output with "TimeoutException Thrown Here" error messages.
Is there a better way to do this? Some way that I can write the data out only if I can confirm that the two devices are connected, without constantly checking (and subsequently getting timeout errors while checking)?
No. A serial connection is stateless.
This means you don't know whether anyone is on the other side. All you can do is send something out and see whether something meaningful comes back.
The classic example of this is the good old analog modem: to find out whether it is connected, you send out an AT command and check whether an OK comes back.
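A minimal sketch of that probe idea in C#, using the same SerialPort class the question mentions. The "AT"/"OK" exchange is just the modem-style example from above; the timeout values are assumptions, and your Bluetooth device may expect a different handshake:

```csharp
using System;
using System.IO.Ports;

class PortProbe
{
    // Returns true if the device on the other end answers "AT" with a line
    // containing "OK" before the read times out. A timeout here simply means
    // "not connected yet" rather than an error worth reporting.
    public static bool IsConnected(SerialPort port)
    {
        try
        {
            port.WriteTimeout = 500;   // milliseconds; tune for your hardware
            port.ReadTimeout = 500;
            port.WriteLine("AT");
            string reply = port.ReadLine();
            return reply.Contains("OK");
        }
        catch (TimeoutException)
        {
            return false;              // no answer: treat as not connected
        }
    }
}
```

The key point is that the TimeoutException is caught and turned into a boolean, so the caller never sees an exception during normal "still waiting" operation.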
So your solution is the right one, but maybe not properly implemented. You should put your connection setup sequence into a BackgroundWorker. That way the retries happen on another thread while your GUI stays responsive to the user.
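One way that suggestion could look, sketched under the same assumptions as above (the port name, baud rate, handshake string, and retry interval are all placeholders). The worker loops on the probe until it succeeds, then RunWorkerCompleted fires back on the GUI thread:

```csharp
using System;
using System.ComponentModel;
using System.IO.Ports;
using System.Threading;

class ConnectionManager
{
    readonly BackgroundWorker worker = new BackgroundWorker();
    readonly SerialPort port = new SerialPort("COM5", 9600); // assumed settings

    public void Start()
    {
        worker.DoWork += Connect;
        worker.RunWorkerCompleted += Connected;
        worker.RunWorkerAsync();     // GUI thread returns immediately
    }

    void Connect(object sender, DoWorkEventArgs e)
    {
        port.Open();
        port.WriteTimeout = 500;
        port.ReadTimeout = 500;
        // Keep probing until the far side answers. The timeouts are swallowed
        // here, on the worker thread, so they never slow down the GUI or
        // flood the Debug output.
        while (true)
        {
            try
            {
                port.WriteLine("AT");
                if (port.ReadLine().Contains("OK"))
                    return;          // handshake succeeded; worker finishes
            }
            catch (TimeoutException)
            {
                // not connected yet; fall through and retry
            }
            Thread.Sleep(1000);      // pause between attempts
        }
    }

    void Connected(object sender, RunWorkerCompletedEventArgs e)
    {
        // Runs on the GUI thread: safe to start the continuous writes now.
    }
}
```

You would likely also want a cancellation path (WorkerSupportsCancellation) so the loop can be stopped when the user leaves remote mode, but the shape above is the core of the idea.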