Hi,
I am building VR tours for a company. These experiences are based on stereoscopic 180° footage combined with 3D elements. I have looked at the OpenVideoPlayer method, but not only did I fail to get any video playing, I also believe it launches an intent to an external video player, which would take over from my app, so the 3D content would disappear.
It used to be that VR SDKs exposed two cameras, which made everything straightforward to implement (although not always performance-optimized): you would create two spheres, one per camera, each with its own half of the stereo content.
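To illustrate what I mean, here is a minimal sketch of that old two-camera setup. It assumes two layers named "LeftEye" and "RightEye" already exist and that each inverted-normal video sphere is assigned to one of them; the layer names and camera references are my own placeholders, not part of any particular SDK:

```csharp
using UnityEngine;

// Legacy two-camera stereo rig: each eye camera culls the other
// eye's sphere, so each eye only ever sees its own hemisphere.
public class TwoCameraStereoRig : MonoBehaviour
{
    public Camera leftEyeCamera;   // supplied by the old SDK rig
    public Camera rightEyeCamera;

    void Start()
    {
        // Left camera renders everything except the right-eye sphere.
        leftEyeCamera.cullingMask = ~LayerMask.GetMask("RightEye");
        // Right camera renders everything except the left-eye sphere.
        rightEyeCamera.cullingMask = ~LayerMask.GetMask("LeftEye");
    }
}
```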
Now SDKs provide only one camera. I had no problems getting it to work with Google Cardboard/Daydream because there is a lot of documentation. However, it's really hard to find documentation for NibiruVR. The Unity SDK guide is very superficial.
I have managed to play video with the Unity VideoPlayer on a mobile phone with NibiruLauncher, but the same build does not play anything on the MagicSee M1 AIO headset.
Furthermore, since only one camera is exposed, I cannot find a way to show different content to the left eye and the right eye.
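In case it helps to show what I am after: one workaround I have read about for single-camera rigs (untested on Nibiru, and again the "LeftEye"/"RightEye" layer names are my own placeholders) is to add two child cameras to the SDK camera and point each one at a single eye via `Camera.stereoTargetEye`:

```csharp
using UnityEngine;

// Hypothetical helper: attach to the single SDK camera; it spawns two
// child cameras, one per eye, each culling the other eye's layer.
public class PerEyeLayers : MonoBehaviour
{
    void Start()
    {
        CreateEyeCamera("LeftEyeCam", StereoTargetEyeMask.Left, "RightEye");
        CreateEyeCamera("RightEyeCam", StereoTargetEyeMask.Right, "LeftEye");
    }

    void CreateEyeCamera(string name, StereoTargetEyeMask eye, string hiddenLayer)
    {
        var go = new GameObject(name);
        go.transform.SetParent(transform, false);
        var cam = go.AddComponent<Camera>();
        cam.CopyFrom(GetComponent<Camera>());
        cam.stereoTargetEye = eye;                        // render to one eye only
        cam.cullingMask = ~LayerMask.GetMask(hiddenLayer); // hide the other eye's sphere
    }
}
```

I do not know whether the Nibiru SDK respects `stereoTargetEye` on extra cameras, which is part of what I am asking about.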
Can someone please give me a hand with these two problems? I have tried everything I could think of, but with no results so far.
Thanks in advance.