I am currently prototyping the character setup for my game, a first-person game that lets you look down and see the body of the main character you play. This main character also wears human/alien hybrid tech composed of a tablet/visor system. The tablet is the main control device for the visor, which acts as an augmented reality system and provides some other hardware capabilities. The visor has both a flat HUD that stays in the field of view and follows head movement, and a full 3D interface that appears around the character and augments what is in the world. The idea is to provide an in-game system that replaces any need for complicated keyboard/mouse/controller interaction or a traditional user interface, and that is integrated with the story and the gameplay.
So far, I have set up the Rift so that it is mounted on the main character, and I am using NGUI and the IK features in Mecanim to simulate interaction with the in-game technology and interface elements. As the player selects interface elements on the tablet or interacts with the 3D interface elements around him, his hand moves to simulate that interaction. I am still working out these concepts and will do a post with more details once I am further along.
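To give a rough idea of the hand-interaction part, here is a minimal sketch of how Mecanim's IK can pull a hand toward a selected interface element. This is not my final setup (details to come in a later post); the class and field names are illustrative, and it assumes "IK Pass" is enabled on the Animator layer.

```csharp
using UnityEngine;

// Illustrative sketch: drives the character's right hand toward whatever
// interface element is currently selected. Attach to the character's
// Animator GameObject; requires "IK Pass" enabled on the layer.
public class HandIKTarget : MonoBehaviour
{
    public Transform target;              // the tablet button or 3D widget being touched (assigned elsewhere)
    [Range(0f, 1f)] public float weight = 1f;

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Mecanim calls this during the IK pass each frame.
    void OnAnimatorIK(int layerIndex)
    {
        if (target != null)
        {
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, weight);
            animator.SetIKPosition(AvatarIKGoal.RightHand, target.position);
        }
        else
        {
            // No selection: release the hand back to the animation.
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 0f);
        }
    }
}
```

In practice you would blend `weight` up and down over time so the hand eases in and out of the reach rather than snapping.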
After spending about two months with the Rift, I wanted to share the approach I took to setting it up. I am hoping it may help some folks who are just getting their kits and trying to get set up and going with Unity. In my opinion, a good development setup and a smooth process are very important for productivity. This is Windows specific, so if you are on a Mac, this likely won't be of much help.
The most difficult aspect of getting started with the Rift has been the monitor setup. The documentation from Oculus indicates that the Rift must be the primary display, and Unity standalone applications run fullscreen on the primary display by default. In Windows this means the desktop, the Start menu, and the taskbar all live there, and Windows also likes to extend the desktop at the resolution of the primary display. After about 20 minutes of messing with the screen configuration in Windows 7, I realized that I can't force Windows to treat a non-primary display as the primary desktop, and that this was just not going to work for me. Having the Start menu and programs always open on a small screen was very unappealing.
I have a single 30-inch monitor running at 2560×1600 (connected via Mini DisplayPort). What I ended up doing for development was to keep the large monitor as my primary display and add the Rift (connected via DVI) as a second display, mirrored with another monitor (also connected via DVI). The second monitor ends up just being another input on my big Dell monitor that I can flip over to, or view alongside the first in PBP (shown below), although I may get another monitor at some point to make things a little easier. Most of the time I never need to see the second display except through the Rift, so I haven't bothered with another monitor yet. I run this second display at either 1280×800 or 1920×1200. The higher resolution has the same 16:10 aspect ratio, and rendering at it and scaling down makes the output look better on the Rift. Rendering at the higher resolution also makes sense given the latest news from Oculus that 1080p (1920×1080) will be the minimum resolution of the commercial version. It is important to always test at the higher resolution to see what performance is like.
With the second display set up, I just moved the Game tab in Unity over to it as a free-floating window, maximized it, and then saved this as a layout in Unity. Having the window frame on the screen is not an issue because in the Rift you don't see much of the edge of the screen and you never see the frame. The only issue is that the view is shifted down by about half the thickness of the window's title bar. Overall this setup has worked very well for development.
When doing builds, I have a shortcut set up that runs the .exe with '-adapter 1' appended. This causes Unity to run the build fullscreen on the second display (when not windowed), so that vsync and CrossFire work. I am currently working with a Radeon 5970, which is a dual-GPU card, so running in CrossFire gives a pretty big boost.
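For reference, the shortcut target is just the path to the build's .exe followed by the flag; a small batch file works the same way. The path below is illustrative, not my actual build location.

```
@echo off
rem Launch the Unity standalone fullscreen on the second display.
rem -adapter 1 selects display adapter index 1 (the Rift's mirrored output).
"C:\Builds\MyGame\MyGame.exe" -adapter 1
```

Adapter indices are zero-based, so '-adapter 0' would put the build back on the primary display.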
I hope these tips help others get up and running, and please leave any feedback related to your own development setup.
Check out the previous installment at Rift Ramblings (Part 1).