Hey, and thanks in advance for any help/pointers.
The solution is probably straightforward, but I am still a little inexperienced (while I've created quite a few different VR Unity projects, mostly study related, I never really built anything, so the editor allowed for some shortcuts; see below).
I am currently developing for Windows Mixed Reality with Unity. I am struggling to figure out how to make a (possibly secondary) application window run on the desktop during facilitated VR experiences, so a facilitator can change settings and generally provide input to the application while the participant only perceives the VR world.
Edit: I have a similar project developed with the SteamVR library / OpenVR that works exactly as expected and as portrayed by the Unity editor; i.e. the desktop shows the in-game footage plus the screen-space overlay UI (which is interactable). Nevertheless, quite a few features were implemented with the MixedRealityToolkit, so I'd prefer to stick with that, and I was wondering if there is a way to achieve the same/similar behavior with it.
Now, with MRTK, everything works as expected when running from the Unity editor: simply making a Canvas screen-space prevents it from showing in VR while it can still be interacted with on the desktop. The problem is that the facilitators themselves are a third party, and having a build would make their handling of the solution much easier. Not to mention the performance gains from building and cleaning the project(?).
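For reference, the in-editor setup I'm relying on is nothing more than the facilitator UI on a Canvas in Screen Space - Overlay mode; a minimal sketch (the component name `FacilitatorOverlay` is just illustrative, the `Canvas`/`RenderMode` API is standard Unity UI):

```csharp
using UnityEngine;

// Attached to the facilitator UI Canvas. In Screen Space - Overlay mode the
// canvas is composited onto the desktop/game view only, so (in the editor,
// at least) the HMD wearer never sees it.
[RequireComponent(typeof(Canvas))]
public class FacilitatorOverlay : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.ScreenSpaceOverlay;
    }
}
```

This is the behavior I'd like a standalone build to reproduce, rather than only getting it while running inside the editor.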
It will probably come down to creating a secondary window that runs on the desktop, since my understanding is that a UWP build with MRTK just runs in the WMR portal rather than as a separate app, and that the WMR portal simply mirrors the headset render to the desktop. But I am completely blank on how to achieve this and was hoping that someone might have struggled with this before; I couldn't really find anything. I'd prefer not to have to resort to networking, and will attempt to convert to SteamVR/OpenVR in the meantime.
