RoomAlive Toolkit

Hi, I had a look today at the Microsoft RoomAlive Toolkit. It seems to be great for small-scale projection mapping. Even without using the Kinect for the mapping, I think it might be easier to align a mesh to the point cloud than to tweak the projector node.

The calibration process was simple and quick.
I managed to run the calibration and generate a calibration file with the toolkit, but I have no idea how to use it in v4, and there is not much documentation included with the toolkit apart from the mapping sample demo.

I have attached the generated XML.
There is also a zip of a 3-Kinect x 3-projector config that you can download from the GitHub page.

Any hint on porting this to v4 would be welcome.

calibrate.7z (1.2 kB)

Note that you can export a mesh as OBJ from the calibration tool, and it loads in v4 without any problem.
In CalibrateEnsemble.exe you can see the perspective view and switch to projector view.
I would be interested to find out how to get that projector view in v4.
There is some matrix data in the generated XML, but I have had no luck playing with it yet.

You don’t need a Kinect and a projector to do some tests: you can grab the sample data from GitHub and load it into CalibrateEnsemble.exe.

Just found some info about the XML formatting:

The camera matrix listed there is the camera matrix for the projector.
The camera pose is the pose of the depth camera (during the optimization, the projector is the center of the coordinate system, and we solve for the depth camera poses).
I believe if you invert the pose of the depth camera you will have the pose of the projector.
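
If I read that right, getting the projector pose is just a 4x4 matrix inversion. A minimal sketch in SharpDX, where `pose` stands for the depth camera pose array read from the XML (my own naming, not code from the toolkit):

// Sketch: invert the depth camera pose to get the projector pose.
// 'pose' is the double[,] of the depth camera read from the calibration XML.
var cameraPose = new SharpDX.Matrix();
for (int i = 0; i < 4; i++)
    for (int j = 0; j < 4; j++)
        cameraPose[i, j] = (float)pose[i, j];

var projectorPose = cameraPose;  // SharpDX.Matrix is a struct, so this is a copy
projectorPose.Invert();          // inverts the copy in place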

Thx guest
I managed to get it working by porting that code from the CalibrateEnsemble source.
You can include the RoomAlive DLLs in a plugin (the one from ProCamEnsembleCalibrationLib) to get the file parsed with the ProjectorCameraEnsemble.FromFile() function, which makes things easier.
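
For reference, parsing the file looks roughly like this (just a sketch; I'm assuming the reference to ProCamEnsembleCalibrationLib and the usual lowercase `projectors` list on the parsed ensemble):

// Sketch: parse the calibration XML produced by CalibrateEnsemble.exe
// (needs a reference to ProCamEnsembleCalibrationLib and "using RoomAliveToolkit;").
var ensemble = ProjectorCameraEnsemble.FromFile("calibration.xml");

// Each projector in the ensemble carries its resolution, pose and camera matrix.
foreach (var projector in ensemble.projectors)
    Console.WriteLine("projector: {0} x {1}", projector.width, projector.height);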

Sorry, I can't share the plugin as it is for my work and not finished anyway, but here is the piece of code that needs to be ported:

public static SharpDX.Matrix ProjectionMatrixFromCameraMatrix(float fx, float fy, float cx, float cy, float w, float h, float near, float far)
        {
            // fx, fy, cx, cy are in pixels
            // input coordinate system is x left, y up, z forward (right handed)
            // project to view volume where x, y in [-1, 1], z in [0, 1], x right, y up, z forward
            // pre-multiply matrix

            // -(2 * fx / w),           0,   -(2 * cx / w - 1),                           0,
            //             0,  2 * fy / h,      2 * cy / h - 1,                           0,
            //             0,           0,  far / (far - near),  -near * far / (far - near),
            //             0,           0,                   1,                           0

            return new SharpDX.Matrix(
                -(2 * fx / w), 0, -(2 * cx / w - 1), 0,
                0, 2 * fy / h, 2 * cy / h - 1, 0,
                0, 0, far / (far - near), -near * far / (far - near),
                0, 0, 1, 0
                );
        }


  // could be method on Projector:
        void SetViewProjectionFromProjector(ProjectorCameraEnsemble.Projector projector)
        {
            if ((projector.pose == null) || (projector.cameraMatrix == null))
                Console.WriteLine("Projector pose/camera matrix not set. Please perform a calibration.");
            else
            {
                // pick up view and projection for a given projector
                view = new SharpDX.Matrix();
                for (int i = 0; i < 4; i++)
                    for (int j = 0; j < 4; j++)
                        view[i, j] = (float)projector.pose[i, j];
                view.Invert();
                view.Transpose();

                var cameraMatrix = projector.cameraMatrix;
                float fx = (float)cameraMatrix[0, 0];
                float fy = (float)cameraMatrix[1, 1];
                float cx = (float)cameraMatrix[0, 2];
                float cy = (float)cameraMatrix[1, 2];

                float near = 0.1f;
                float far = 100.0f;

                float w = projector.width;
                float h = projector.height;

                projection = GraphicsTransforms.ProjectionMatrixFromCameraMatrix(fx, fy, cx, cy, w, h, near, far);
                projection.Transpose();
            }
        }
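
With the ensemble loaded via FromFile() as above, the usage is then roughly this (a sketch, assuming the first projector is the one you want and that `view`/`projection` get passed on to the renderer):

var ensemble = ProjectorCameraEnsemble.FromFile("calibration.xml");
SetViewProjectionFromProjector(ensemble.projectors[0]);
// 'view' and 'projection' now hold the matrices that reproduce the
// projector view shown in CalibrateEnsemble.exe.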

Also, if you load the OBJ with Assimp DX11 you need to invert the Z scaling; I struggled with that for quite a while.
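
In case it saves someone else the time: what worked for me was a negative Z scale on the mesh transform, something like this in SharpDX terms (in v4 a Scale with Z = -1 on the geometry should do the same, as far as I can tell):

// Flip Z to match the exported OBJ to the DX11 scene (a sketch, not the only way to do it).
var flipZ = SharpDX.Matrix.Scaling(1.0f, 1.0f, -1.0f);
// multiply flipZ into the mesh's world transform before rendering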