It appears that there is a serious design flaw in the output-port buffering mechanism.
Currently, output ports work as follows:
* The port has an index and a size. The index starts at 0, and the size is the size of the bytevector buffer.
* When you do a write-char (assume ASCII), a single byte is written at bv[idx], and the index is advanced by 1.
* After advancing, if the index equals the size, the buffer is full and is flushed immediately.
* After flushing, the index is reset to 0.
Now what happens if flush-output-port is interrupted and a write-char occurs in the interrupt handler?
* The buffer is already full, and its index equals the size (the invariant index < size is broken at this point).
* The byte is written past the end of the bytevector, and the index is advanced (now beyond the size).
* No flush is ever triggered again, since the index is strictly greater than the size and the index == size test never fires.
* Every subsequent write-char lands further out of bounds, and this keeps on happening until the system is fubar.
Bummer!