I’m getting round to something I’ve been putting off for ages: using Git to help manage vvvv projects.
The main need I have is efficient archiving and versioning. Ease of collaboration would also be useful, though I understand the issues with diffing given the XML structure of a v4p file.
This solution will be used in a smallish company setting, and there will probably be only 3 regular vvvv users using the system, with occasional integration needed by external contractors. There will be a fairly high turnover of projects, some small and short-lived, others ongoing. There will be a lot of sharing of elements between projects, and a need for different versions of projects for different events etc.
Can anyone suggest a good solution for this? I assume Git is the way to go, but I don’t know whether GitHub or something like http://www.redmine.org/ would be a better idea. I’m still new to the world of Git and versioning in general.
yes, git will do the job. as a free-tier user, i would host on https://bitbucket.org.
and use http://code.google.com/p/gitextensions as version control gui.
what about managing media assets like image textures and video content? How do these integrate into a git versioning system?
that should work in theory. bitbucket claims unlimited disk space, but i would google a bit for ‘git big files’.
Hi Mr Boni,
you can keep everything about vvvv on GitHub and use online (or local) storage for all your media files. You can upload media files to GitHub, but when they are big (like video) it’s not convenient.
Hi ton and clone. How does this work, though, with large files and binaries in general? Doesn’t it take a long time for the files to be checked on every commit? Can I flag a file so it is not checked every time and only copied to the server once?
I think I’d only want to keep these files in a local git repo anyway, pushing just the main vvvv project files to GitHub itself.
git only pushes changed files to the repo. For example, you commit your project, then push to GitHub for the first time; after that, when you commit, only new and changed files will be pushed. It is the same for any file type.
With GitHub for Windows you have the “sync” function. This means that you first commit locally and then sync to the online repo. After that you have the same thing both locally and online.
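The commit-locally-then-push workflow described above can be sketched on the command line (the remote URL at the end is a placeholder, not a real repository):

```shell
# Sketch: commit locally first, push to the remote afterwards.
cd "$(mktemp -d)"
git init -q
echo "patch contents" > main.v4p
git add main.v4p
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial commit"

# Later commits record only what changed since the last one:
echo "a change" >> main.v4p
git add main.v4p
git -c user.name=demo -c user.email=demo@example.com commit -qm "update patch"

# Pushing transfers only the commits the remote does not have yet.
# (Placeholder URL -- substitute your own repository.)
# git remote add origin https://github.com/you/demo-project.git
# git push -u origin master
git log --oneline
```

The `-c user.name/-c user.email` flags are only there so the sketch runs without global git config; normally you set those once with `git config --global`.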
If you want to flag files that you don’t want to push online, you have to create a special file to tell git that, for example, .png, .avi, .xml etc. files have to be ignored.
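That special file is called `.gitignore` and lives in the repo root. A minimal sketch (the patterns are illustrative, adjust them to your project):

```
# .gitignore -- one pattern per line, matched against file paths
*.avi
*.mov
*.png
renders/
```

Be careful with broad patterns like `*.xml` if your project files themselves are XML, as v4p files are.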
Doesn’t git need to check the contents of each file at commit to see if it has changed? Or is a hash made when the file changes, with only the hash compared at commit?
Yes, (big) binary files are bad to have in git. Especially if they change often, they will blow up the size of your repo. I would not commit video files etc to git.
Afaik, with binary files git cannot compare and store just the differences, so every time a binary file changes the whole file is replaced (and the whole old version is stored in the repo history).
Some thoughts/advice on how to deal with this can be found here: http://stackoverflow.com/questions/540535/managing-large-binary-files-with-git
I think git checks modification dates of files before starting to compare, to see if things have changed, but I’m not 100% sure.
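You can see git’s content addressing directly: every file is stored under the SHA-1 hash of its contents, so identical contents are only ever stored once (the filenames here are just for illustration):

```shell
# Demonstrates content addressing: git identifies file contents by SHA-1 hash.
cd "$(mktemp -d)"
git init -q
echo "some texture data" > asset.txt
git hash-object asset.txt        # prints the SHA-1 of the file's contents

# An identical copy hashes to exactly the same id,
# so git would store the contents only once:
cp asset.txt copy.txt
git hash-object copy.txt         # same hash as above
```

This is also why `git status` can use stat data (size, modification time) as a cheap first check before rehashing anything.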
To flag files for git not to look at (if they’re committed already), you could use assume-unchanged: http://gitready.com/intermediate/2009/02/18/temporarily-ignoring-files.html
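The assume-unchanged flag from that link looks like this in practice (the filename is just an example):

```shell
# Flag an already-committed file so git stops checking it for changes.
cd "$(mktemp -d)"
git init -q
echo "frame data" > big.avi
git add big.avi
git -c user.name=demo -c user.email=demo@example.com commit -qm "add asset"

git update-index --assume-unchanged big.avi
echo "more frames" >> big.avi
git status --porcelain        # no output: the change is invisible to git

# Undo the flag to make git track changes to the file again:
git update-index --no-assume-unchanged big.avi
git status --porcelain        # now reports big.avi as modified
```

Note this is a local performance hint, not a safeguard: operations like checkout can still overwrite the file, so it’s not a substitute for `.gitignore`.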
To tell git to ignore files altogether, use .gitignore files (this is what lecloneur alluded to above)