# VVVV.OpenVR | HMD Matrix to position and orientation?

Hi, I’m struggling a bit over here trying to figure out how to obtain the position and orientation of the HMD with the OpenVR Pack. I can see that I can get the matrix from the HMD and then use it to place objects in my environment at the position and orientation of the HMD. But I want to be able to cast a ray from the position of the HMD in the direction it’s looking, so I can do an intersection test with objects in my scene, for example to show additional information about an object I’m looking at… Any thoughts on how I can get the position and orientation of the OpenVR HMD from its matrix?

I guess you can simply multiply the HMD pose transform with * (3D Vector) to get an xyz position. If you also need the rotation, use a Decompose (Transform Vector) node.

The intersect nodes in vvvv also expect a transformation for the intersection ray, so basically there is no need to decompose, but you might need to rotate by 90°.

Have a look into the Button 3D modules that ship with vvvv.

If you really need the position and direction vector, do it as sebl suggests: with * (3D Vector), input (0, 0, 0) to get the position and (0, 0, 1) to get the direction.
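For reference, here is a minimal sketch in plain Python of what those two multiplications give you, assuming a row-major 4×4 pose matrix (the helper name `transform_point` is made up; in vvvv the * (3D Vector) node does this for you). One subtlety: transforming (0, 0, 1) as a point gives you a point one unit in front of the HMD, so the direction vector is that point minus the position.

```python
def transform_point(m, p):
    """Apply a row-major 4x4 matrix (list of rows) to a 3D point (w = 1)."""
    x, y, z = p
    return tuple(m[i][0]*x + m[i][1]*y + m[i][2]*z + m[i][3] for i in range(3))

# An example pose sitting at (1, 2, 3) with no rotation:
pose = [[1, 0, 0, 1],
        [0, 1, 0, 2],
        [0, 0, 1, 3],
        [0, 0, 0, 1]]

position = transform_point(pose, (0, 0, 0))   # -> (1, 2, 3)
ahead    = transform_point(pose, (0, 0, 1))   # point 1 unit in front
direction = tuple(a - p for a, p in zip(ahead, position))  # -> (0, 0, 1)
```

For a pure rotation-plus-translation pose (no scale), that difference is exactly the rotated forward axis, which is what you want for the ray.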

I’ll give those a try. I’m using Microdee and evvvvil’s Intersection pack, and that wants a position and a direction. Is it possible to interchangeably use EX9 (intersection) and DX11 nodes (scenefile etc)? Because I was under the impression that it’s preferable not to combine those?

Have a look here, I think that’s exactly what you want: https://vvvv.org/contribution/airdrawspline

You can directly take the controller or HMD pose and send it into the Transform Line input pin on things like Intersect (EX9.Geometry Quad). Do a Scale (Transform) on it first, though, to extend the line however long you want. Works for me.
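A rough sketch of what that Scale (Transform) step amounts to, again assuming row-major 4×4 matrices and a unit-length line along +Z (helper names are hypothetical; vvvv's transform nodes do this composition internally):

```python
def mat_mul(a, b):
    """Multiply two row-major 4x4 matrices (lists of rows)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def scale(s):
    """Uniform scale matrix."""
    return [[s, 0, 0, 0], [0, s, 0, 0], [0, 0, s, 0], [0, 0, 0, 1]]

# Example controller pose at (1, 2, 3), no rotation:
pose = [[1, 0, 0, 1],
        [0, 1, 0, 2],
        [0, 0, 1, 3],
        [0, 0, 0, 1]]

# Scaling in local space before the pose: the unit-length ray becomes
# 100 units long but still starts at the controller position.
ray = mat_mul(pose, scale(100))
```

The ray's origin stays at the controller, only its length changes, because the scale is applied before the pose's translation.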

My understanding of the EX9 stuff is that it’s not great to intermix DX9 and DX11 rendering; things that just do math on geometry should be OK. At least I have not seen performance hits with it.

Oh, and if you use the View input of the OpenVR Renderer to change your virtual position (like teleporting), remember you will have to invert it and multiply it with the controller and HMD poses so they end up in the proper virtual positions for display and for intersection tests.
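To make the invert-and-multiply step concrete, here is a hedged sketch in plain Python. It assumes row-major 4×4 matrices and, for simplicity, a translation-only View transform (a real view with rotation needs a full rigid-body inverse); `mat_mul` and the other helpers are made-up names:

```python
def mat_mul(a, b):
    """Multiply two row-major 4x4 matrices (lists of rows)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def invert_translation(m):
    """Invert a translation-only 4x4 matrix by negating its offset column."""
    inv = [row[:] for row in m]
    for i in range(3):
        inv[i][3] = -inv[i][3]
    return inv

view = translation(0, 0, -10)       # e.g. teleported 10 units back
hmd_pose = translation(0, 1.7, 0)   # head 1.7 m above the tracking origin

# virtual pose = inverse(view) * hmd pose
virtual = mat_mul(invert_translation(view), hmd_pose)
```

The resulting `virtual` matrix places the HMD in the teleported space, so rays cast from it line up with what the user actually sees.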

Ok, I ended up patching things again somewhat from scratch this afternoon (not using the HMD pose, but the Left Controller), and this seems to work well enough. It’s not 100% reliable (pointing at the top faces of the cubes doesn’t seem to work), but it’s good enough for the prototype I’m working on. If anyone is able to comment on and improve what I’m doing here, that would be very much appreciated! Ideally I’d have this stuff rock-solid, as I imagine I’ll be using it in future projects as well…

DX11-Intersection-Test-VR.v4p (52.9 KB)

Requires Microdevil’s Intersect Pack + OpenVR
