
Use InputStreamReader for serial UTF8 decoder #10389

Draft · cmaglie wants to merge 3 commits into master

Conversation

@cmaglie (Member) commented Jun 19, 2020

The implementation is much more straightforward.

It should also solve a JDK incompatibility:

java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
at processing.app.Serial.serialEvent(Serial.java:185)
at jssc.SerialPort$LinuxEventThread.run(SerialPort.java:1299)

Fix #8903
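
A note on the error above: this NoSuchMethodError typically appears when bytecode compiled against JDK 9+ (where ByteBuffer.flip() was given the covariant return type ByteBuffer) is run on JDK 8, where flip() still returns java.nio.Buffer. A minimal sketch of the usual source-level workaround, casting to Buffer so the JDK 8-compatible method descriptor is emitted:

```java
import java.nio.Buffer;
import java.nio.ByteBuffer;

class FlipCompat {
    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(8);
        buf.put((byte) 42);
        // Casting to Buffer makes javac emit Buffer.flip()Ljava/nio/Buffer;
        // which exists on both JDK 8 and JDK 9+, avoiding NoSuchMethodError.
        ((Buffer) buf).flip();
        System.out.println(buf.get());
    }
}
```

(Alternatively, building with `javac --release 8` produces JDK 8-compatible descriptors without source changes.)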

@cmaglie cmaglie self-assigned this Jun 19, 2020
@matthijskooijman (Collaborator) left a comment:

Looks good, indeed a lot simpler. I left some inline remarks.

char chars[] = new char[512];
try {
  while (next < max) {
    int w = Integer.min(max - next, 128);
@matthijskooijman (Collaborator):

Do we even need to limit to 128 chars here? Seems like decoderInRaw has no fixed buffer size, so maybe you can just push all of buf in there? If this has performance issues (e.g. buf is a lot bigger than 128, so memory usage increases or something), then maybe the incoming buf should be limited instead?

It should be fairly easy to just read from the serial port in 128-byte blocks, moving the loop here:

byte[] buf = port.readBytes(serialEvent.getEventValue());
processSerialEvent(buf);

(The argument to readBytes() is the number of bytes to read.)
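
The chunked-read suggestion can be sketched as below; this is a self-contained simulation, with the jssc SerialPort replaced by an in-memory fake, and BUFFER_SIZE and the method names are assumptions:

```java
import java.util.Arrays;

class ChunkedReadSketch {
    static final int BUFFER_SIZE = 128;

    private final byte[] data; // fake serial data, standing in for the port
    private int pos = 0;
    int processed = 0;

    ChunkedReadSketch(byte[] data) { this.data = data; }

    // Stand-in for jssc's SerialPort.readBytes(n)
    byte[] readBytes(int n) {
        byte[] out = Arrays.copyOfRange(data, pos, pos + n);
        pos += n;
        return out;
    }

    // Stand-in for processSerialEvent(buf): just count the bytes
    void processSerialEvent(byte[] buf) { processed += buf.length; }

    // The suggested loop: consume eventValue bytes, BUFFER_SIZE at a time
    void serialEvent(int eventValue) {
        int remaining = eventValue;
        while (remaining > 0) {
            int n = Math.min(remaining, BUFFER_SIZE);
            processSerialEvent(readBytes(n));
            remaining -= n;
        }
    }

    public static void main(String[] args) {
        ChunkedReadSketch s = new ChunkedReadSketch(new byte[300]);
        s.serialEvent(300); // consumed as 128 + 128 + 44
        System.out.println(s.processed);
    }
}
```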

@cmaglie (Member, Author):

The PipedInputStream has a fixed circular buffer size; if it is exceeded, the decoderInRaw.write(...) call will block forever (unless we spawn a thread that reads concurrently from decoderOutputUTF8)... but I would like to avoid complicating it again.

We can increase the buffer size to something bigger, like 16 KB, to reduce the function-call pressure; I'll push a change for this.
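
For reference, PipedInputStream takes an explicit pipe size in its two-argument constructor (the default circular buffer is 1024 bytes in OpenJDK), so the bigger buffer could be requested like this:

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

class PipeSizeSketch {
    public static void main(String[] args) throws IOException {
        PipedOutputStream out = new PipedOutputStream();
        // Request a 16 KB internal buffer instead of the 1024-byte default
        PipedInputStream in = new PipedInputStream(out, 16 * 1024);
        out.write(new byte[8192]); // fits in the pipe without blocking
        System.out.println(in.available());
    }
}
```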

@matthijskooijman (Collaborator):

Ah, I see. I can't find the default size anywhere, though, so hardcoding 128 might not be ideal. Passing an explicit buffer size would be better (probably through a BUFFER_SIZE constant or something). The buffer size should probably be a few bytes bigger than the max write size, in case some bytes are left in the buffer. A unit test for this would be to write a partial UTF-8 character and then a full buffer's worth of bytes.

> but I would like to avoid complicating it again.

Wouldn't moving the loop only slightly simplify things? It could result in more function calls and chars allocations, maybe, but the latter could easily be made an instance variable if that is an issue.

@cmaglie (Member, Author):

> The buffer size should probably be a few bytes bigger than the max write size, in case some bytes are left in the buffer. A unit test for this would be to write a partial UTF-8 character and then a full buffer's worth of bytes.

Yep, see my last commit.

> Wouldn't moving the loop only slightly simplify things? It could result in more function calls and chars allocations maybe, but the latter could easily be an instance variable if this is an issue.

I don't know; adding a thread would already increase the number of LOC, and you'd also need to figure out how to synchronize it. I'm not going down that path... :-) If you want to give it a try, I may wait for it before merging this one.

@cmaglie (Member, Author):

The main problem is that JSSC doesn't implement the InputStream interface, otherwise we could have chained it to the InputStreamReader directly, without the need for this buffering. Unfortunately it provides this weird readBytes() method instead, so we cannot avoid this boilerplate...

@matthijskooijman (Collaborator):

> I don't know, adding a thread would already increase the number of LOC, also you need to figure out how to synchronize it. I'm not going into that path... :-) if you want to give it a try I may wait for it before merging this one.

I wasn't suggesting adding a thread, just letting serialEvent loop and read BUFFER_SIZE bytes at a time (until serialEvent.getEventValue() bytes have been read), calling processSerialEvent for each buffer just as now. This would be very similar, except that it also ensures that the buf variable is limited to BUFFER_SIZE.

> The buffer size should probably be a few bytes bigger than the max write size, in case some bytes are left in the buffer.

I thought about this again, and I think it is not actually needed. The InputStreamReader does the decoding and the buffer is in the PipedInputStream. Looking at the InputStream interface, there is no way to peek at bytes or re-insert them after reading, so the only way for InputStreamReader to decide there are no additional characters is to read all of them; I suppose there is a tiny buffer in InputStreamReader for partial characters?

The docs for InputStreamReader also suggest that there is some internal buffering:

> Each invocation of one of an InputStreamReader's read() methods may cause one or more bytes to be read from the underlying byte-input stream. To enable the efficient conversion of bytes to characters, more bytes may be read ahead from the underlying stream than are necessary to satisfy the current read operation.

Looking at the docs for Reader.read(char[]) (and so also InputStreamReader), I see:

> Reads characters into an array. This method will block until some input is available, an I/O error occurs, or the end of the stream is reached.

I wonder what this means exactly. Two risks:

  1. If no full character is available (e.g. processSerialEvent was called with a partial UTF-8 character, or 16 KB plus a partial UTF-8 character), might this block and deadlock?
  2. Does this guarantee that it reads as much as possible from the InputStream (until chars is full)? If it could read less, leaving some bytes in the pipe, you'd also get a deadlock.

Problem 1 does indeed seem to occur. This program deadlocks for me:

import java.io.*;
import java.nio.charset.*;

class Foo {
        public static void main(String[] args) throws IOException {
                String str = "\u2e2e";
                byte[] bytes = str.getBytes(StandardCharsets.UTF_8);
                System.out.println(bytes.length);

                PipedOutputStream pos = new PipedOutputStream();
                PipedInputStream pis = new PipedInputStream(pos);
                InputStreamReader isr = new InputStreamReader(pis);
                pos.write(bytes, 0, bytes.length - 1);

                System.out.println(isr.ready());
                int c = isr.read();
                System.out.println(String.format("0x%04X", c));
/* Originally I tested read(chars), which also blocks.
                char chars[] = new char[10];
                int n = isr.read(chars);
                System.out.println(n);
                System.out.println(String.format("0x%04X", Character.codePointAt(chars, 0)));
*/
        }
}

The docs suggest that ready() can be used to check this:

> True if the next read() is guaranteed not to block for input, false otherwise. Note that returning false does not guarantee that the next read will block.

But looking at the StreamDecoder.inReady() source, which is used by InputStreamReader.ready(), it actually returns true when the InputStream has any bytes available, rather than pre-reading bytes to see if a full character is actually available. And indeed, ready() returns true, and then reading blocks...

InputStreamReader.ready() does specify what it does correctly (return true when bytes are available), but then still (incorrectly) promises it won't block.

So I wonder if InputStreamReader is useful at all if we cannot guarantee it never blocks (not calling read() when there are < 4 bytes in the buffer would prevent blocking, but it would also prevent the last few bytes before a pause from being read...).

> The main problem is that JSSC doesn't implement the InputStream interface

Yeah, indeed. I wonder: Wouldn't a simple wrapper be possible here to allow removing the pipe stuff and simplifying the chain? I think this could remove one buffer from the chain. With this PR, we have:

  • SerialPort.readBytes allocates a buffer, which we store in buf.
  • This gets written into the pipe, which has an internal buffer
  • This gets read by InputStreamReader, into its internal buffer
  • After decoding, this is written to the chars buffer.

With a wrapper, you could do:

  • SerialPort.readBytes allocates a buffer, which we store in an internal buffer in the wrapper.
  • When the wrapper's read() method is called by InputStreamReader (passing ISR's internal buffer array), data is copied to ISR's buffer.
  • After decoding, this is written to the chars buffer.
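
A rough sketch of such a wrapper, assuming the port exposes a readBytes(count)/bytesAvailable() pair like jssc's SerialPort; SerialSource is an invented stand-in interface, and a real port's blocking behavior is only simulated by an in-memory fake:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

class SerialInputStream extends InputStream {
    private final SerialSource port;
    private byte[] pending = new byte[0];
    private int pos = 0;

    SerialInputStream(SerialSource port) { this.port = port; }

    @Override
    public int read() throws IOException {
        if (pos >= pending.length) { // refill from the port
            int n = Math.max(1, port.bytesAvailable());
            pending = port.readBytes(n); // a real port would block here
            pos = 0;
            if (pending.length == 0) return -1; // treat "no data" as EOF
        }
        return pending[pos++] & 0xFF;
    }

    @Override
    public int available() throws IOException {
        return (pending.length - pos) + port.bytesAvailable();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "h\u00e9llo".getBytes(StandardCharsets.UTF_8);
        SerialSource fake = new SerialSource() { // in-memory fake port
            private int off = 0;
            public byte[] readBytes(int count) {
                int n = Math.min(count, data.length - off);
                byte[] out = Arrays.copyOfRange(data, off, off + n);
                off += n;
                return out;
            }
            public int bytesAvailable() { return data.length - off; }
        };
        InputStreamReader reader = new InputStreamReader(
                new SerialInputStream(fake), StandardCharsets.UTF_8);
        char[] chars = new char[16];
        int n = reader.read(chars); // decodes all five characters
        System.out.println(n + " " + (chars[1] == '\u00e9'));
    }
}

// Invented stand-in for the subset of jssc's SerialPort used above
interface SerialSource {
    byte[] readBytes(int count) throws IOException;
    int bytesAvailable() throws IOException;
}
```

Note that this removes the pipe buffer but does not address the partial-character blocking problem discussed above: if the port delivers only half of a multi-byte character, InputStreamReader.read() still blocks.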

Note that the current code also uses 4 buffers (buf, a ByteBuffer, a CharBuffer, and chars), but the latter could be removed, see below.

I had a bit of a look at how this would look, but given that the deadlock problem above must be solved first, I stopped doing so :-)

Seems we also can't fix this by making a non-blocking InputStream to wrap InputStreamReader, since that triggers an exception: http://hg.openjdk.java.net/jdk8/jdk8/jdk/file/687fd7c7986d/src/share/classes/sun/nio/cs/StreamDecoder.java#l288

Looking at the core implementation of InputStreamReader (which defers to StreamDecoder.implRead()), it seems they essentially do the same trick as our original code (write to a ByteBuffer, feed that into a CharsetDecoder, which puts the result in a CharBuffer).

Looking at this PR now, it is indeed simpler code, but adds one buffer in the chain, adds several layers of extra objects (pipes, InputStreamReader, StreamDecoder) and on top of that, seems to deadlock.

So maybe the original approach should be kept; perhaps it can be simplified a bit by reading smaller chunks from the SerialPort? Also, looking at the code again, I think the chars array buffer could be removed as well, by just using CharBuffer.array() to access that internal buffer directly.
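
The CharBuffer.array() idea might look roughly like this; a sketch of the original ByteBuffer/CharsetDecoder approach (class and method names invented, buffer sizes arbitrary), not the actual Serial.java code:

```java
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CharsetDecoder;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

class DecodeSketch {
    private final CharsetDecoder decoder = StandardCharsets.UTF_8.newDecoder();
    private final ByteBuffer in = ByteBuffer.allocate(4096);
    private final CharBuffer out = CharBuffer.allocate(4096);

    String process(byte[] buf) {
        in.put(buf); // assumes buf fits; real code must handle overflow
        in.flip();   // real code should cast to Buffer first for JDK 8 binary compatibility
        decoder.decode(in, out, false);
        in.compact(); // keep any trailing partial character for the next call
        // Read decoded chars straight out of the CharBuffer's backing array,
        // avoiding a separate chars[] copy
        String s = new String(out.array(), 0, out.position());
        out.clear();
        return s;
    }

    public static void main(String[] args) {
        DecodeSketch d = new DecodeSketch();
        byte[] bytes = "h\u00e9llo".getBytes(StandardCharsets.UTF_8);
        // Feed a split multi-byte character: first 2 bytes, then the rest
        String a = d.process(Arrays.copyOfRange(bytes, 0, 2));
        String b = d.process(Arrays.copyOfRange(bytes, 2, bytes.length));
        System.out.println(a.equals("h") + " " + b.equals("\u00e9llo"));
    }
}
```

Unlike InputStreamReader, CharsetDecoder.decode(..., false) simply reports underflow on a partial character and returns, so nothing blocks.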

@cmaglie (Member, Author):

> 1. If no full character is available (e.g. processSerialEvent was called with a partial UTF-8 character, or 16 KB plus a partial UTF-8 character), this might block and deadlock?

...and this is indeed a problem; if there is no easy solution, I guess we must drop this PR.

> So, maybe the original approach should be kept, maybe it can be a bit more simplified by reading smaller chunks from the SerialPort? Also, looking at the code again, I think the chars array buffer could be removed as well, by just using CharBuffer.array() to access that internal buffer directly.

Yes, but besides that, my concern was the error in #8903, which I also got on my Windows VM after upgrading to JDK 8u242. Using an InputStreamReader seemed like the best approach, but I guess we need to find another way.

@matthijskooijman (Collaborator):

Hm, interesting, especially since InputStreamReader seems to use a ByteBuffer internally too. Any idea what changed about flip() to cause this? I can't find anything relevant in the docs, e.g. these all look the same:

arduino-core/src/processing/app/Serial.java (outdated review thread, resolved)
@arduino arduino deleted a comment from ArduinoBot Jun 19, 2020
@cmaglie cmaglie marked this pull request as draft June 19, 2020 18:32
Development

Successfully merging this pull request may close these issues.

Arduino 1:1.8.9-3(compiled with jdk9?) ByteBuffer breaks libraries
3 participants