
I am trying to calculate the ray direction for each pixel in a post-processing shader on the Vision Pro, for raymarching. I convert each pixel's screen coordinate to NDC and multiply it by the inverse of the projection matrix built from the tangents provided by the visionOS API. This works perfectly in the simulator and matches the raster scene rendered with that same projection matrix; on the device, however, the projected rays are completely off from the raster scene (they appear warped at the edges and don't match the projection).
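For context, here is a simplified sketch of what I'm doing, in Metal Shading Language. The uniform names (`inverseViewProjection`, `cameraPosition`, `viewportSize`) are placeholders rather than my exact code, and the NDC z value used for the unprojected point depends on the depth convention (with reverse-Z the far plane sits at clip z = 0 instead of 1):

```metal
#include <metal_stdlib>
using namespace metal;

// Placeholder uniform layout -- names are illustrative, not my actual buffer.
struct RayUniforms {
    float4x4 inverseViewProjection; // (projection * view)^-1 for the current eye
    float3   cameraPosition;        // eye position in world space
    float2   viewportSize;          // render target size in pixels
};

// Reconstruct a world-space ray direction for one pixel by converting the
// pixel centre to NDC and pushing it through the inverse view-projection.
float3 rayDirectionForPixel(float2 pixelCoord, constant RayUniforms &u)
{
    // Pixel centre -> [0,1] uv -> NDC in [-1,1], flipping y for Metal's
    // top-left texture origin.
    float2 uv  = (pixelCoord + 0.5f) / u.viewportSize;
    float2 ndc = float2(uv.x * 2.0f - 1.0f, 1.0f - uv.y * 2.0f);

    // Unproject a point on the far side of the frustum (z = 1 here; with
    // reverse-Z the far plane is at z = 0) and divide by w.
    float4 world = u.inverseViewProjection * float4(ndc, 1.0f, 1.0f);
    world /= world.w;

    return normalize(world.xyz - u.cameraPosition);
}
```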

The same thing happens if I calculate the rays directly from the FOV and aspect ratio, without using the inverse of the projection matrix.

From what I understand, the projection frustum tangents are skewed in the VR projection (the frustum is asymmetric/off-axis per eye), but I thought the inverse of the projection matrix should still unproject the coordinates correctly.
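For what it's worth, my understanding of the tangent-based projection (with $t_L, t_R, t_T, t_B$ the positive half-angle tangents for each side of the frustum, and the depth row left symbolic since it depends on the clip-space convention) is roughly:

$$
P =
\begin{pmatrix}
\dfrac{2}{t_L + t_R} & 0 & \dfrac{t_R - t_L}{t_L + t_R} & 0 \\
0 & \dfrac{2}{t_B + t_T} & \dfrac{t_T - t_B}{t_B + t_T} & 0 \\
0 & 0 & A & B \\
0 & 0 & -1 & 0
\end{pmatrix}
$$

The nonzero third-column entries are what make the frustum off-axis, but the matrix is still invertible, so multiplying an NDC coordinate by $P^{-1}$ and dividing by $w$ should still recover the correct point, and hence the correct ray direction.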

Does anyone have experience with raycasting/raymarching in VR who could shed some light on this?

Thank you.


1 Answer


Better late than never. I made a small demo project with ray marching on the AVP: https://github.com/murinson/MetalRaymarch. In order for my ray marching shader to be executed for every pixel, I place a large sphere at the origin with front-face culling. This sphere's UV coordinates are then used to calculate the ray direction. If you found a different approach, I'd be very interested to learn about it!
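Roughly, the idea looks like this in Metal (paraphrased here rather than copied from the repo; the equirectangular UV mapping is just one assumption about how the sphere is parameterised):

```metal
#include <metal_stdlib>
using namespace metal;

// The sphere is drawn with front-face culling, so the camera always sees its
// inside and every screen pixel gets exactly one fragment of the sphere.
// Option 1: map the fragment's uv back to a direction, assuming an
// equirectangular-style parameterisation (u wraps the equator, v runs
// pole to pole).
float3 rayDirectionFromSphereUV(float2 uv)
{
    float phi   = uv.x * 2.0f * M_PI_F;  // longitude
    float theta = uv.y * M_PI_F;         // colatitude

    return float3(sin(theta) * cos(phi),
                  cos(theta),
                  sin(theta) * sin(phi));
}

// Option 2: have the vertex stage pass the world-space position of the sphere
// surface and take the direction from the camera to that point.
float3 rayDirectionFromWorldPos(float3 worldPosition, float3 cameraPosition)
{
    return normalize(worldPosition - cameraPosition);
}
```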

• I ended up raymarching on top of geometry as well, though for me, I replicated the base geometry of the scene and raymarched on top of that. The real solution remains open! Commented May 3 at 2:30
