I’m trying to write a stream of data to a file and then read it back sequentially.
The stream consists of a record (binary serialized) written to disk every frame.
Up to that point it’s fine; the problem is when I try to deserialize the chunks of data, because the byte count is different for every frame.
Any ideas?
Do I need some kind of header to define how many bytes each frame takes?
Is there a better way to do this?
I’ve seen a set of “Chunk” and “Tokenizer” related nodes, but I don’t know how to use them.
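For context on what I mean by a header: the usual approach for variable-sized records is a length prefix, i.e. before each record you write a fixed-size integer holding its byte count, and on read you first read that integer and then exactly that many bytes. A minimal sketch of the idea in Python (the function names are just for illustration, not actual vvvv nodes):

```python
import io
import struct

def write_record(f, payload: bytes) -> None:
    # Prefix each record with its byte count as a 4-byte little-endian header
    f.write(struct.pack("<I", len(payload)))
    f.write(payload)

def read_records(f):
    # Read the 4-byte length header, then exactly that many payload bytes
    while True:
        header = f.read(4)
        if len(header) < 4:
            break  # end of stream
        (size,) = struct.unpack("<I", header)
        yield f.read(size)

# Frames of different sizes round-trip cleanly
buf = io.BytesIO()
for frame in [b"abc", b"hello world", b"x" * 100]:
    write_record(buf, frame)
buf.seek(0)
frames = list(read_records(buf))
```

The same pattern should map onto whatever byte-writing nodes vvvv provides: one fixed-size length field per frame, followed by the frame’s bytes.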
StreamRecorder.vl (44.1 KB)