Hi all.
I'm completely new to this whole area, so I have no way of knowing how complicated what I'm asking is, or whether it's even possible.
I want to track the motion of people with a 2D webcam (not a Kinect), to produce a live projection that will look somewhat like this:
the movement should create some kind of dreamy particle motion that gets more intense the faster the movement is, or the closer it is to the camera, or at least has some kind of interactivity.
is it possible (for a newbie like me)? what should i go through to make that? are there requirements for the camera? (can't afford the Kinect)
if 2D is good enough i would start with frame-difference motion tracking and apply some texture FX to it. this is simple, effective, and you have endless possibilities. ask @lecloneur for inspiration…
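To make the frame-difference idea concrete, here is a minimal sketch in plain Python, independent of any vvvv or TouchDesigner node (the function names are mine, not any toolkit's API). It assumes 8-bit grayscale frames stored as 2D lists; in a real patch the frames would come from the webcam each tick.

```python
# Minimal frame-difference motion tracking sketch (pure Python).
# Frames are 2D lists of 0-255 grayscale values.

THRESHOLD = 30  # minimum brightness change to count as motion


def frame_difference(prev, curr, threshold=THRESHOLD):
    """Return a binary motion mask: 1 where the pixel changed enough."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]


def motion_amount(mask):
    """Fraction of pixels that moved; could drive particle intensity."""
    total = sum(len(row) for row in mask)
    moved = sum(sum(row) for row in mask)
    return moved / total if total else 0.0


# toy 2x4 frames: only the two rightmost pixels change noticeably
prev = [[10, 10, 10, 10],
        [10, 10, 10, 10]]
curr = [[10, 12, 200, 200],
        [10, 10, 200, 200]]

mask = frame_difference(prev, curr)
print(mask)                 # [[0, 0, 1, 1], [0, 0, 1, 1]]
print(motion_amount(mask))  # 0.5
```

The single `motion_amount` number is the simplest possible "how intense is the movement" signal; the mask itself gives you the positions where something moved.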
@ayalala the project you outlined is definitely doable. if you're asking whether it's possible with your newbie skills, most likely not - yet. but don't be discouraged. :)
if you want to see your project made as fast as possible, find someone skilled to cooperate with. if you want to make it yourself without too much learning, touchdesigner might be for you.
if you want to experience an adventure, start learning vvvv; there is a vast amount of learning material here. start with it, teach yourself beyond beginner level, and go to NODE festival, it will blow your mind.
having a project in mind is a great starting point and while you learn, you will come up with many more interesting ideas and it will greatly help you to collaborate with other artists or coders in this field. for every kind of artist, it is essential to master the medium he or she is using.
some resources relevant to your project might be these:
check out the works of Frieder Weiss, and Mortal Engine by Chunky Move
Thank you for your answer! really helpful, and inspiring as well. i will keep on checking out all of the references.
the problem is that my computer doesn't have a good enough graphics card for TouchDesigner. any other alternatives? what about EyesWeb?
@ayalala The price of equipment is always a challenge in interactive art and design. If you do not have proper tools, it is difficult to be productive, and it will also influence the result. From the creator's perspective it is often hard to separate the quality of the idea from the constraints introduced by the hardware.
On the other hand, keeping the costs low may lead to interesting innovations. I've always found VVVV to be very efficient when it comes to hardware demands, and I'm sure you can do something interesting even with an old computer. I do not have much experience with other tools, so I can't really compare.
With your project, I think it is important that you ask yourself how important interactivity is for you. Should the piece be interactive? Performing in an interactive system feels very different, but from the audience's perspective it does not matter that much. There is a lot of research around these questions; a good starting point is the book Digital Performance by Steve Dixon https://mitpress.mit.edu/books/digital-performance
Interactivity gives dancers and choreographers more freedom to make changes and improvise, and it also supports the choreography creation process: the creative team can instantly see what works. On the other hand, interactivity can get very stressful when things do not work properly.
i hope it's ok to ask, but i have some silly questions which i can't seem to find a solution to.
so i decided to try it anyway, and see how far i can get. anything is better than nothing at all, or at least i'll learn some things.
so:
i have a simple webcam, and it's on.
i thought maybe the next step would be to figure out whether it's possible to track movement on the screen, but i couldn't figure it out. is it? any ideas?
Hey guys, maybe someone can help me. As mentioned above, I just started on the same project.
So far I have a video input (webcam and HAP video) running. Then I process it through FrameDifference (from the CV.Image pack) to GridView and then to the Renderer (everything in DX11 on x64).
The result is nice, but now I have no idea how to spawn particles from the FrameDifference result.
In Processing or another language I would do:
look at all pixels -> if a pixel's color changes for longer than 500 ms, then spawn a particle at that pixel's XY position.
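That spawn rule can be sketched in plain Python (the names below are mine, not vvvv or Processing API): keep, per pixel, a timestamp of when it started changing, and emit a particle once the change has persisted past 500 ms.

```python
# Sketch of "if a pixel keeps changing for > 500 ms, spawn a particle
# there". Frames are 2D lists of grayscale values; changing_since maps
# (x, y) -> timestamp when that pixel started changing.

THRESHOLD = 30   # minimum change to count as "color changed"
HOLD_MS = 500    # pixel must keep changing this long to spawn


def update(prev, curr, changing_since, now_ms):
    """Compare two frames; return a list of (x, y) spawn positions."""
    spawns = []
    for y, (prow, crow) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(prow, crow)):
            if abs(c - p) > THRESHOLD:
                start = changing_since.setdefault((x, y), now_ms)
                if now_ms - start >= HOLD_MS:
                    spawns.append((x, y))
                    del changing_since[(x, y)]  # restart the timer
            else:
                changing_since.pop((x, y), None)  # pixel settled down
    return spawns


# demo: a 1x2 frame where only the right pixel changes
state = {}
f0 = [[0, 0]]
f1 = [[0, 255]]
print(update(f0, f1, state, now_ms=0))    # []  (just started changing)
print(update(f0, f1, state, now_ms=600))  # [(1, 0)]  (changed > 500 ms)
```

This is the CPU version of the logic; in vvvv the same idea would have to be expressed with nodes (or a shader) rather than a pixel loop, which is exactly why Pipet-style per-pixel reads get slow at full resolution.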
I tried to get the values with Pipet (CV.Image pack), but somehow it drags the fps down to hell (5-10 fps).
Is there a way to do it with vvvv nodes, or do I have to write a shader (which I really don't want to, and can't)?
Maybe someone could give me a hint in the right direction. I'm new to vvvv, but I have some background in TouchDesigner and Processing.
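Independent of the toolkit, the usual fix for slow per-pixel reads is to downsample the frame before inspecting pixels, so you only read a few hundred values instead of hundreds of thousands. A sketch in plain Python (my own helper, not a vvvv node):

```python
# Block-average downsampling: shrink a 2D grayscale frame by `factor`
# so per-pixel logic (Pipet-style reads, spawn rules) touches far
# fewer values, e.g. 640x480 -> 40x30 at factor 16.

def downsample(frame, factor):
    """Average factor x factor blocks of a 2D grayscale frame."""
    h, w = len(frame), len(frame[0])
    out = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [frame[y][x]
                     for y in range(by, min(by + factor, h))
                     for x in range(bx, min(bx + factor, w))]
            row.append(sum(block) // len(block))
        out.append(row)
    return out


# 4x4 frame shrunk by factor 2 -> 2x2
frame = [[0, 0, 100, 100],
         [0, 0, 100, 100],
         [50, 50, 200, 200],
         [50, 50, 200, 200]]
small = downsample(frame, 2)
print(small)  # [[0, 100], [50, 200]]
```

For particle spawning a coarse grid is usually enough anyway, since each spawned particle represents a region of motion rather than a single pixel.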
this is my patch so far. please note that i just started with vvvv one week ago. i'm a total beginner and many things are not perfect in this sketch because it's still in progress.
needs: dx11.particles & cv.image pack.
if you can't see anything, try to zoom out with the camera (i didn't manage to find the best settings for the cam yet)
Nope. Not working. I don't know why. I don't understand what I am missing.
the cv.image and dx11 particles packs are installed. But whenever I open your patch, it's calling out Visual. and from there it doesn't go anywhere…
I tried to write the patch myself, but no success.
that's getting frustrating… :'(
I changed computers and went to another one.
Then the patch opened without breaking.
But some nodes are missing.
As you said here, the nodes exist in the file, but they stay red. So I tried both x86 and x64.
But they are still missing. How did you solve that?