
Comparing String received from Python UDP Stream to a Java String

https://www.devze.com 2023-02-08 06:41 Source: web

I'm currently experimenting with UDP communication between a server written in Python using the SocketServer class and a client written in Java using the DatagramSocket and DatagramPacket classes. The server accepts Python method calls as input and routes the stdout and stderr back to the client, transmitted in a 1024-byte packet.

The communication is working, the client can receive packets from the server and send packets to it, however I'm running into problems when comparing data.

For example, when the client receives a packet containing the string __DONE__\n, it prints fine using System.out.print(new String(packet.getData())). I only run into problems when I try to compare it to a String done = "__DONE__\n" as follows:

while (!new String(packet.getData()).equals(done)) {
    doStuff();
}

Here the loop runs forever, as the evaluated statement always returns false. My guess is that it has something to do with different encodings. I tried to compare the byte arrays of both the string from the packet and the native Java string and got these results:

String done:                5f5f444f4e455f5f0a
String(packet.getData()):   5f5f444f4e455f5fa0000000[...]
// The 0s repeat for the full 1024 bytes of the packet
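A dump like the one above can be produced with a small helper (a sketch; toHex is a hypothetical name, not part of the original code):

```java
import java.nio.charset.StandardCharsets;

public class HexDump {

    // Render the first 'length' bytes of the array as lowercase hex
    static String toHex(byte[] bytes, int length) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < length; i++) {
            sb.append(String.format("%02x", bytes[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] done = "__DONE__\n".getBytes(StandardCharsets.UTF_8);
        System.out.println(toHex(done, done.length)); // 5f5f444f4e455f5f0a
    }
}
```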

It seems that the String from the data packet contains the bytes I'm trying to compare as well as the remaining bytes of the 1024-byte buffer, which is why the String.equals() method always returns false.
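The effect can be reproduced without any networking. A minimal sketch, simulating the 1024-byte receive buffer:

```java
import java.nio.charset.StandardCharsets;

public class PaddingDemo {
    public static void main(String[] args) {
        String done = "__DONE__\n";

        // Simulate a 1024-byte receive buffer: payload first, zero padding after
        byte[] buffer = new byte[1024];
        byte[] payload = done.getBytes(StandardCharsets.UTF_8);
        System.arraycopy(payload, 0, buffer, 0, payload.length);

        // Converting the whole buffer keeps the NUL padding, so equals() fails
        String padded = new String(buffer, StandardCharsets.UTF_8);
        System.out.println(padded.equals(done)); // false
        System.out.println(padded.length());     // 1024

        // Converting only the payload bytes gives the expected string
        String trimmed = new String(buffer, 0, payload.length, StandardCharsets.UTF_8);
        System.out.println(trimmed.equals(done)); // true
    }
}
```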

Is there a way to force Java to omit the trailing zeros when converting from a byte array to a String?


I now managed to resolve the issue by specifying an offset of 0 and the length of the packet when converting my packet to a String:

new String(packet.getData(), 0, packet.getLength(), "UTF-8");

The resulting String is stripped of the trailing 0s.
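Putting that together, the client's receive loop might look like this (a sketch; payloadAsString and the socket setup are assumptions, not the original code):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.nio.charset.StandardCharsets;

public class UdpClientSketch {

    // Convert only the bytes that actually arrived, not the whole 1024-byte buffer
    static String payloadAsString(DatagramPacket packet) {
        return new String(packet.getData(), packet.getOffset(), packet.getLength(),
                StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        String done = "__DONE__\n";
        try (DatagramSocket socket = new DatagramSocket()) {
            byte[] buffer = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            while (true) {
                // receive() shrinks the packet length to the datagram size,
                // so reset it before each call when reusing the packet
                packet.setLength(buffer.length);
                socket.receive(packet);
                String message = payloadAsString(packet);
                System.out.print(message);
                if (message.equals(done)) {
                    break; // server signalled completion
                }
            }
        }
    }
}
```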


It would seem to me that you could use setLength() before calling packet.getData() to specify how many bytes you want to get from the buffer.

http://download.oracle.com/javase/1.4.2/docs/api/java/net/DatagramPacket.html#setLength%28%29


5f5f444f4e455f5fa is an ODD number of hex characters. Looks like it should be 5f5f444f4e455f5fa0 i.e. "__DONE__\xA0" rather than the "__DONE__" that you wrote. If not, why is that 'a0' in the incoming packet?

Isn't sending a 1024-byte packet padded out with NULs a bit wasteful? Perhaps you should be talking to the source of the packets.
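On the sending side, a datagram only needs to be as long as its payload. This sketch sends to itself over loopback to show the received length (assumed setup, not the original server):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class ExactSizeSend {
    public static void main(String[] args) throws Exception {
        byte[] payload = "__DONE__\n".getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            // Size the outgoing packet to the payload -- no 1024-byte padding
            DatagramPacket out = new DatagramPacket(payload, payload.length,
                    InetAddress.getLoopbackAddress(), socket.getLocalPort());
            socket.send(out);

            byte[] buffer = new byte[1024];
            DatagramPacket in = new DatagramPacket(buffer, buffer.length);
            socket.receive(in);
            System.out.println(in.getLength()); // 9, not 1024
        }
    }
}
```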

