In reference to Elias’ foo/bar/shared example: As I recall, Python package management loads foo/shared.dll for foo and bar/shared.dll for bar, as both may be different versions. The code in foo brings along the version it was built with.
But imagine a package P1.0 depending on A1.0 and Q2.0 depending on B1.0 and A1.1. Now, someone creates a new package R1.0 depending on P1.0, Q2.0 and A1.2 and distributes that, resulting in three different versions of A being used (and needed for compatibility). And then someone creates a package depending on R1.0 and so forth.
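The transitive blow-up described above can be sketched in a few lines. This is a toy illustration, not p4v code; the package names match the example and the data structure is invented:

```python
# Toy sketch of the scenario above: each package pins exact dependency
# versions, so walking R's dependency tree pulls in three different
# versions of A. The DEPS table and collect() helper are hypothetical.
DEPS = {
    ("P", "1.0"): [("A", "1.0")],
    ("Q", "2.0"): [("B", "1.0"), ("A", "1.1")],
    ("R", "1.0"): [("P", "1.0"), ("Q", "2.0"), ("A", "1.2")],
    ("A", "1.0"): [], ("A", "1.1"): [], ("A", "1.2"): [], ("B", "1.0"): [],
}

def collect(pkg, seen=None):
    """Recursively collect every (name, version) pair needed by pkg."""
    if seen is None:
        seen = set()
    if pkg in seen:
        return seen
    seen.add(pkg)
    for dep in DEPS[pkg]:
        collect(dep, seen)
    return seen

versions_of_a = sorted(v for n, v in collect(("R", "1.0")) if n == "A")
print(versions_of_a)  # → ['1.0', '1.1', '1.2']
```

Installing R alone already drags in three copies of A, and every further layer of packaging can add more.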
As it is now, you load old contributions and many of them simply work, because the old nodes are still there but hidden, and updated versions of nodes are mostly backwards-compatible.
I don’t think that this completely self-contained way is the way to go for vvvv packages. Every module would have to contain its own dependencies, and everything would get polluted pretty fast (and then there is the question of what to show in the NodeList - all stuff, all versions? Or a context-dependent NodeList, which shows A1.0 if you have to change something in P1.0 and A1.1 if you have to change something in B1.0?)
Maybe this can be solved by not INCLUDING the dependencies, but LINKING to the version of them that is used. Developers would have to declare in their packages, whenever they update them, whether the new version breaks backwards compatibility (which rarely happens with vvvv, I think - e.g. new input pins don’t break it). Then a package can always download the newest release of its dependencies that didn’t break backwards compatibility.
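The linking idea above amounts to a simple resolution rule: starting from the pinned version, walk forward through the release history and stop at the first release flagged as breaking. A minimal sketch, with an invented release list and flag:

```python
# Sketch of the proposed rule (all names invented): each release of a
# dependency declares whether it breaks backwards compatibility, and a
# package pinned to some version resolves to the newest later release
# with no breaking release in between.
RELEASES = [
    # (version, breaks_backwards_compat)
    ("1.0", False),
    ("1.1", False),   # e.g. only added new input pins
    ("2.0", True),    # breaking change
    ("2.1", False),
]

def resolve(pinned):
    """Return the newest release after `pinned` reachable without a break."""
    idx = [v for v, _ in RELEASES].index(pinned)
    best = pinned
    for version, breaks in RELEASES[idx + 1:]:
        if breaks:
            break
        best = version
    return best

print(resolve("1.0"))  # → 1.1 (stops before the breaking 2.0)
print(resolve("2.0"))  # → 2.1
```

This only works if developers flag breaking releases honestly, which is the weak point the post already hints at.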
For now, the first version I’m planning to implement does not handle dependencies. I think it will be easier to avoid dependency hell and implement a minimum available version instead.
I mean, you need contribution Foo, you install it with one click and that’s it. Nowadays contributions have no dependencies and everybody is using them. The p4v I have in mind just installs a package (or call it a contribution) under your project folder, instead of you downloading it, unzipping it, etc.
@Elias, sounds good.
The main topics that I see here:
- We should bind all shared package versions to a specific vvvv release, as is currently done. That way we can partly avoid dependency hell, so the basic vvvv installation should already contain all shared dependencies. If someone invents new stuff, like Leap or Kinect, and it brings a new global dll, they need to make a pull request to vvvv. If they are doing a small project based on an external dll, there is no need to make it global. I also see the scenario where someone needs debug versions of some shared libraries - we can provide those via a vvvv-sdk package.
- I have never seen a library contribution in the community. So dev packages, like in the Linux world, could provide additional debug functionality, but they are not needed because the vvvv-sdk package already provides a debug version of every core dependency.
- Following from the first point, we will remove build dependencies between packages and create only runtime dependencies, which are easier to manage.
- I think exporting a NuGet package from a project is a cool feature, but we can code it later, because usually you can just organize all the dll’s by hand and run the pack command. As a first step we can provide a contribution template that builds to the right directories and runs pack if you select a special configuration.
- Maybe we need source code packages, in case you want to include your package in the vvvv-addonpack. These packages can be automatically downloaded, built and tested by the vvvv build server.
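For the "organize the dll’s by hand and run pack" case mentioned above, a minimal .nuspec could look like this. All ids, versions and paths here are placeholders for illustration, not an agreed layout:

```xml
<?xml version="1.0"?>
<!-- Hypothetical example; id, version and file paths are placeholders. -->
<package>
  <metadata>
    <id>MyContribution</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Example vvvv contribution package.</description>
  </metadata>
  <files>
    <!-- hand-organized dlls, modules, shaders etc. -->
    <file src="bin\*.dll" target="plugins" />
    <file src="modules\*.v4p" target="modules" />
  </files>
</package>
```

Running `nuget pack MyContribution.nuspec` would then produce the .nupkg that the template could generate automatically later.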
The core vvvv download should contain all the dependencies we need for plugins - it is already like this now. If we need more, we can download the vvvv-sdk package, which provides debug versions of the shared libs. If we are developing for alpha, we just download the vvvv-alpha package and update it when needed with the Update command. For easy contributions we provide a template system, so everyone can contribute anything - not only plugins, but also modules, shaders etc.
About Local And Global Packages
I think we should go with local packages only, but with two scopes - VVVV45 and PROJECT. The first scope works like a global one, but we can have many different versions of vvvv installed, each with its own packages. The PROJECT scope is just a project-level scope, so we keep a packages dir inside the project folder, next to the Main.v4p file.
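The two-scope lookup described above boils down to: search the project-local packages dir first, then the packages dir of the specific vvvv installation. A small sketch; the directory names and helper are invented, not p4v code:

```python
# Hypothetical lookup for the two scopes: PROJECT (packages dir next to
# Main.v4p) is preferred over VVVV45 (packages dir of one vvvv install).
import os

def find_package(name, project_dir, vvvv_dir):
    """Return the path to package `name`, preferring the PROJECT scope."""
    for scope in (os.path.join(project_dir, "packages"),   # PROJECT scope
                  os.path.join(vvvv_dir, "packages")):     # VVVV45 scope
        candidate = os.path.join(scope, name)
        if os.path.isdir(candidate):
            return candidate
    return None  # not installed in either scope
```

Because each vvvv installation has its own packages dir, two installed vvvv versions never share packages, which is the point of dropping the global scope.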
Server and Hosting
We should find a hosting solution for our package feed. I found some Java implementations of a NuGet server, but haven’t dug into them much. Usually people use expensive IIS hosting with a .Net-based NuGet server. We can use http://www.myget.org/ for testing, but I think it’s better to go for a self-hosted solution.
If we want to use the current contributions page as is, our NuGet packages can be really small and contain only download scripts for the binary versions of the plugins. But in that case we lose NuGet’s offline capabilities. Which would be better?
Ok, I finished the PowerShell prototype: https://github.com/smakhtin/vvvv-p4v/tree/powershell . To check how it works, just type p4v install patternTouch -g. The -g switch means a global installation. For a global installation, put the p4v folder inside your vvvv directory and create a packages folder on the same level as the addonpack folder. I think in the next versions I will auto-create it, and the cache will also be better supported.
Hi alg, are you in Milan? Or can we meet on skype this evening or later?
I would like to try it, but I think I need more instructions; I would also like to merge.
One question: I pulled the powershell branch; now how can I get p4v into my path?
I started another thread about a first minimal version of p4v that is just a vvvv patch https://discourse.vvvv.org/
finally found this thread again
i made a template for developing packs. https://github.com/velcrome/vvvv-Template
so far I haven’t made much progress on setting up correct nuspec files, which would seem to be the next logical step.