RADICAL BODIES: Dance in VR

What’s happening in your workshop?
We will create a VR-based dance experience with motion tracking, point cloud streaming, and light control.

We will demonstrate some samples at the beginning, then start from the basics of each technique: VR development with/without an HMD, calibrating VR and Kinect coordinates, synchronizing music and visuals with OSC (a rough sketch is below), etc.
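To give a flavour of the OSC part: here is a minimal C# sketch that hand-builds an OSC message and sends it over UDP. In the workshop this would be done with vvvv's OSC nodes; the `/beat` address and port 9000 are made up for illustration.

```cs
using System;
using System.Collections.Generic;
using System.Net.Sockets;
using System.Text;

class OscBeatSender
{
    // OSC requires every chunk to be padded to a multiple of 4 bytes.
    static void Pad(List<byte> b) { while (b.Count % 4 != 0) b.Add(0); }

    static byte[] BuildMessage(string address, float value)
    {
        var b = new List<byte>();
        b.AddRange(Encoding.ASCII.GetBytes(address)); // address pattern
        b.Add(0); Pad(b);
        b.AddRange(Encoding.ASCII.GetBytes(",f"));    // type tag: one float32
        b.Add(0); Pad(b);
        var f = BitConverter.GetBytes(value);         // OSC arguments are big-endian
        if (BitConverter.IsLittleEndian) Array.Reverse(f);
        b.AddRange(f);
        return b.ToArray();
    }

    static void Main()
    {
        using (var udp = new UdpClient())
        {
            var msg = BuildMessage("/beat", 1.5f);    // e.g. beat position in the track
            udp.Send(msg, msg.Length, "127.0.0.1", 9000);
        }
    }
}
```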

After that, let's make your own effects! We will prepare some short music tracks for the performance, plus templates of vvvv modules and effects.

We are going to set up three performance stages in the classroom. Each stage has a VR HMD (Oculus or Vive), a Kinect, a PC, and a DMX light. All stages are also connected over LAN, so you can choose single player or multiplayer.

You don't have to bring an Oculus or Vive, but you are welcome to use your own, of course!

What is “RADICAL BODIES”?
It's our research theme, and also the title of an installation we presented this year in Japan.


Who is the target audience of your workshop?
Anyone interested in VR development, using Kinect and VR devices together, streaming point cloud data, or RAM, a past project of YCAM (http://special.ycam.jp/ram/en/).

What knowledge do you presume your participants have?
A basic understanding of vvvv.

What will attendees of your workshop learn?
How to calibrate between different coordinate systems, how to create a custom plugin for better performance, and how to deform the point cloud and skeleton via a patch or shader (a minimal sketch of the deformation math follows).
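As a taste of the deformation part, here is a minimal CPU-side sketch in C#. In the workshop the same math would live in an HLSL shader or a vvvv patch; the parameter names are illustrative only.

```cs
using System;
using System.Numerics;

static class PointCloudDeform
{
    // Push each point along Y by a sine wave driven by time: a typical
    // starting point for an audio-reactive point cloud effect.
    // amplitude/frequency would be exposed as pins in vvvv.
    public static Vector3 Deform(Vector3 p, float time,
                                 float amplitude = 0.1f, float frequency = 4f)
    {
        float wave = (float)Math.Sin(p.X * frequency + time);
        return p + new Vector3(0f, amplitude * wave, 0f);
    }

    static void Main()
    {
        Console.WriteLine(Deform(new Vector3(0.5f, 1.0f, 2.0f), time: 0.25f));
    }
}
```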


(Note: from here on, this is not needed for the NODE17 page.)

What technical requirements does your workshop have (apart from internet and a projector)?
e.g. special hardware, software, graphics cards, materials, dancers/performers
Kinect v2, Oculus CV1, a fast machine with a GeForce GPU; dancers are definitely cool ;)

What 3 tags describe your workshop?
VR, dance, body tracking

In which of the three categories does your workshop fall: beginner, intermediate, advanced
Intermediate

How long will your workshop be?
(Standard formats are 3h or 2x 3h, but other formats may be possible)
3h should be enough. With 2x 3h, we could start from how to set up the environment, including Kinect & Oculus calibration (a rough sketch of the calibration math is below).
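For the calibration step, a rough C# sketch of the underlying math, assuming both sensors are level so the two coordinate systems differ only by a yaw rotation and a translation; the point names and measurements are illustrative.

```cs
using System;
using System.Numerics;

static class KinectVrCalibration
{
    // Build a matrix that maps Kinect-space points into VR space from two
    // corresponding point pairs (the same two physical spots measured by
    // both systems), assuming the systems differ only by yaw + translation.
    public static Matrix4x4 Calibrate(Vector3 kinectA, Vector3 kinectB,
                                      Vector3 vrA, Vector3 vrB)
    {
        // Yaw of the A->B direction in each system, projected onto XZ.
        float YawOf(Vector3 v) => (float)Math.Atan2(v.X, v.Z);
        float yaw = YawOf(vrB - vrA) - YawOf(kinectB - kinectA);

        var rot = Matrix4x4.CreateRotationY(yaw);
        // Translation that carries the rotated Kinect point A onto VR point A.
        var t = vrA - Vector3.Transform(kinectA, rot);
        return rot * Matrix4x4.CreateTranslation(t); // rotate, then translate
    }

    static void Main()
    {
        // Hypothetical measurements of two floor markers in both systems.
        var m = Calibrate(new Vector3(0, 0, 2), new Vector3(1, 0, 2),
                          new Vector3(3, 0, 1), new Vector3(3, 0, 0));
        Console.WriteLine(Vector3.Transform(new Vector3(0, 0, 2), m)); // ~ (3, 0, 1)
    }
}
```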

Who would be the two hosts?
Minoru Ito (mino)
Shunichi Kasahara (shunK)
Junji Nakaue (djjj)

Minoru Ito: vvvvook writer; contributor of the metaio, ShaderToy renderer, and AntTweakBar plugins



Sounds super interesting, I'm in

+1

+1, would be awesome!

Sounds interesting. I'll probably bring a Vive to NODE. Will it be Oculus only?

@everyoneishappy
We'd like to support the Vive too, but we haven't tried it yet. We will figure out how to support it if this workshop is selected.