Enabling debug timing changes the reported timing of one IOBox by a factor of ~100x

I am in the process of optimizing a very large patch that contains many subpatches. When I enabled debug timing, I noticed something strange: one of the IOBoxes showed a timing of over 1000, while all the other IOBoxes, which carry the same number of slices and the same kind of real-valued data, showed values of around 5. Deleting and recreating the IOBox in question made no difference.

Following that IOBox back to where its data originates, I came across something very weird: as soon as I also enabled debug timing in the subpatch where the data originates, the timing in all connected subpatches receiving that data dropped from 1000 to the expected 5. As soon as I disabled debug timing in the source patch, the timing went back up to 1000. This happens for only one of the IOBoxes and data streams, even though there are six data streams, all with the same number of slices and the same kind of data.

Here is the IOBox with debug timing enabled in one patch: [screenshot]

Here is the same IOBox after I enable debug timing in the originating patch as well: [screenshot]

I can't feasibly upload the whole patch, as it has far too many dependencies, but maybe someone has come across something similar or knows a possible source of this bug. Of course, I can't even be sure that the timing values shown are correct!
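If it helps to picture what I suspect: here is a minimal, purely hypothetical sketch (plain Python, not VL) of how a pull-based timing probe could attribute upstream cost to the first node that happens to be measured. All names (`upstream`, `iobox`, `timed`) are made up for illustration; I have no idea whether vvvv's debug timing actually works this way.

```python
import time

# Hypothetical model of a pull-based patch: each node computes its value
# on demand. If only the downstream IOBox is timed, the upstream work is
# attributed to it; timing the upstream node moves that cost there.

def upstream():
    # stands in for the subpatch where the data originates
    time.sleep(0.010)          # ~10 ms of real work
    return [1.0] * 6           # six real-valued slices

def iobox(data):
    return data                # the IOBox itself does almost nothing

def timed(label, fn, *args):
    # crude stand-in for a debug-timing probe around one node
    t0 = time.perf_counter()
    result = fn(*args)
    print(f"{label}: {(time.perf_counter() - t0) * 1e3:.1f} ms")
    return result

# Case 1: debug timing only on the IOBox — the upstream cost lands here.
timed("IOBox only", lambda: iobox(upstream()))

# Case 2: debug timing on both — the upstream node absorbs its own cost
# and the IOBox reading drops to its true (tiny) value.
data = timed("upstream", upstream)
timed("IOBox", iobox, data)
```

That would at least match the symptom (the reading drops as soon as the source patch is timed too), but it is only a guess at a possible mechanism, not a claim about how the feature is implemented.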
