In reference to Elias’ foo/bar/shared example: as I recall, Python package management loads foo/shared.dll for foo and bar/shared.dll for bar, since the two may be different versions. The code in foo brings along the version it was built with.
But imagine a package P1.0 depending on A1.0, and Q2.0 depending on B1.0 and A1.1. Now someone creates a new package R1.0 depending on P1.0, Q2.0 and A1.2 and distributes it, resulting in three different versions of A being used (and needed for compatibility). And then someone creates a package depending on R1.0, and so forth.
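To make the blow-up concrete, here is a small sketch of that scenario (all package names and the graph walk are made up for illustration): each package declares exact versions of its direct dependencies, and collecting the transitive closure of R1.0 already pulls in three versions of A.

```python
# Hypothetical dependency declarations from the example above:
# each package pins exact versions of its direct dependencies.
deps = {
    ("P", "1.0"): [("A", "1.0")],
    ("Q", "2.0"): [("B", "1.0"), ("A", "1.1")],
    ("R", "1.0"): [("P", "1.0"), ("Q", "2.0"), ("A", "1.2")],
    ("A", "1.0"): [], ("A", "1.1"): [], ("A", "1.2"): [],
    ("B", "1.0"): [],
}

def transitive_deps(pkg, seen=None):
    """Walk the dependency graph and return every (name, version) reached."""
    if seen is None:
        seen = set()
    for dep in deps[pkg]:
        if dep not in seen:
            seen.add(dep)
            transitive_deps(dep, seen)
    return seen

versions_of_a = sorted(v for name, v in transitive_deps(("R", "1.0")) if name == "A")
print(versions_of_a)  # -> ['1.0', '1.1', '1.2']: three copies of A, side by side
```

Every further package built on top of R1.0 inherits all three copies, which is exactly the pollution problem described below.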
As it is now, you load an old contribution and many of them simply work: the old nodes are still there, just hidden, and updated versions of nodes are mostly backwards-compatible.
I don’t think this completely self-contained approach is the way to go for vvvv packages. Every module would have to contain its own dependencies, and everything would get polluted pretty fast. (There is also the question of what to show in the NodeList: all nodes, in all versions? Or a context-dependent NodeList, which shows A1.0 when you have to change something in P1.0 and A1.1 when you have to change something in Q2.0?)
Maybe this can be solved by not INCLUDING the dependencies but LINKING to the versions used. When developers update a package, they would somehow have to state in its declaration whether the new version breaks backwards compatibility (which, I think, seldom happens with vvvv; e.g. new input pins don’t break anything). A package could then always download the newest release of each of its dependencies that hasn’t broken backwards compatibility.
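The linking idea could be resolved like this; a minimal sketch with invented data, assuming each release simply carries a developer-declared flag for whether it breaks compatibility with its predecessor:

```python
# Hypothetical release history of a dependency. The boolean is the
# developer's own declaration of a backwards-compatibility break.
releases = [
    # (version, breaks_backwards_compat)
    ("1.0", False),
    ("1.1", False),   # e.g. only new input pins added: still compatible
    ("1.2", False),
    ("2.0", True),    # breaking change declared by the developer
    ("2.1", False),
]

def newest_compatible(linked_version):
    """Starting from the version a package linked against, move forward
    through the release history until a release declares a break; return
    the newest release reached before that break."""
    idx = next(i for i, (v, _) in enumerate(releases) if v == linked_version)
    best = linked_version
    for version, breaks in releases[idx + 1:]:
        if breaks:
            break
        best = version
    return best

print(newest_compatible("1.0"))  # -> '1.2': upgrades stop before the 2.0 break
print(newest_compatible("2.0"))  # -> '2.1': free to move within the 2.x line
```

The nice property is that nothing needs to be bundled: P1.0 linked against A1.0 and Q2.0 linked against A1.1 would both resolve to A1.2, so the three copies from the earlier example collapse into one, as long as the developers’ compatibility declarations are honest.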