
I have a simple (and maybe quite naive) question about using two GPUs for Virtual Reality (VR); it has nagged me for years, and I can't figure out why this wouldn't work, at least in principle:

I realize that with alternating-frame rendering techniques (as SLI and CrossFire used to do it) the efficiency is quite meager because of synchronization overhead and related issues, and I am aware of Nvidia's VR SLI project (which is probably dead by now).

But shouldn't using two GPUs, one for the left eye and one for the right eye, working completely independently of each other, be a trivial way to double the frame rate, i.e. achieve an "SLI efficiency" of 100%? I've seen the flow diagrams for VR SLI and noticed that there is synchronization again; but why is that necessary for VR? The GPUs wouldn't need to cooperate on alternating frames (which is what requires synchronization); in fact, they wouldn't need to know anything about each other at all.

Couldn't we just do the following? Normally, a game engine sends the GPU a job to render the scene from a certain camera position, right? So couldn't the engine simply send an additional job for the camera position "plus 60 mm to the right" (or whatever interpupillary distance one desires) to a second GPU that doesn't need to know anything about what the first GPU does? Whenever one of the GPUs finishes rendering a frame, it would send it to "its eye", fetch the next camera position, render its next frame, and so on.
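To make the proposed scheme concrete, here is a minimal sketch of the per-eye camera derivation and the independent per-GPU loop. All names here (`latest_head_pose`, `render`, `present_to_eye`) are hypothetical placeholders, not any real engine or OpenXR API; the point is only that each eye's camera follows from the head pose alone, so neither loop needs to wait on the other.

```python
# Hypothetical sketch of the scheme described above: one head pose yields two
# independent camera positions, one per GPU. Nothing here is a real engine API.

IPD = 0.060  # interpupillary distance in metres (the 60 mm from the question)

def eye_camera_positions(head_pos, right_axis, ipd=IPD):
    """Offset the head-centre camera by +/- ipd/2 along the head's 'right' axis.

    (Offsetting symmetrically by half the IPD per eye is equivalent to the
    question's "one camera, plus 60 mm to the right for the second GPU".)
    """
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_axis))
    right = tuple(h + half * r for h, r in zip(head_pos, right_axis))
    return left, right

# Each GPU would then run this loop on its own, with no cross-GPU handshake:
#
#   while running:
#       pose = latest_head_pose()                   # freshest tracking sample
#       cam = eye_camera_positions(pose.position,   # pick this GPU's eye
#                                  pose.right)[eye_index]
#       frame = render(cam)                         # on this GPU only
#       present_to_eye(frame, eye_index)            # each eye as its own monitor
```

For a head at the origin looking down -Z with the right axis along +X, `eye_camera_positions((0, 0, 0), (1, 0, 0))` gives the left camera at x = -0.03 m and the right camera at x = +0.03 m, each derived from the shared tracking data rather than from the other GPU.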

The images would not arrive synchronized in the VR device, but that wouldn't be necessary. The left and the right eye pieces could just be treated as two completely independent monitors, couldn't they?

I'd be very interested to learn where I went wrong with this idea. I assume I did, because otherwise the problem would have been solved long ago.

  • I think the reason they need to be synchronized is that the head is constantly moving, and any discrepancy between the left and right eyes could break the stereoscopic effect. It could cause visual discomfort from flickering, blurriness, and left/right eye mismatch. Commented May 24, 2024 at 20:05
  • I could be wrong about this, but I don't think the video signal VR headsets take is two separate video streams for two separate monitors. They take a single video stream. So there would still be synchronization in composing the left and right halves into the final frame sent to the display. Commented Sep 30, 2024 at 11:33
  • Thank you for your inputs. I thought about the constantly changing head position too, but I think that at >90 Hz there wouldn't be a problem. Actually, I'm not even sure there would be a problem even at exceedingly low frequencies, e.g. 5 Hz. Could be, though. About the single video stream: good point, but couldn't this be changed? Of course that would require changes in several places (engine, ...), but my question was about whether this would at least be possible in principle. Commented Oct 1, 2024 at 12:50
  • Possible in principle, if you don't care about supporting any of the HMD hardware currently on the market and want to wait for hardware specifically designed to accept two uncoordinated video streams. Commented Oct 6, 2024 at 18:39
