Rift Ramblings (Part 3)

It has been a while since my last Rift Ramblings.  I was mildly burned out after pulling together Lost Loot for the VR Jam, but I have been busy developing the concepts for the successor to it.

During this process I have been trying to grapple with what makes a true VR game.  VR games extend beyond simply adapting current games to use a stereoscopic HMD.  They also need to bring interaction closer to the person experiencing the world.  My interpretation of this could be as simple as touching a fish that darts away out of view, causing you to turn your head to see where it goes, or chasing a swarm of butterflies and having them fly about you, avoiding your flailing limbs but landing on your outstretched hand if you hold it still.  It’s really anything that makes an experience magical by enveloping you and reacting to your presence.
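To make the butterfly example a little more concrete, here is a rough sketch of that kind of presence-reactive behaviour.  The names and thresholds are entirely my own illustration, not code from Lost Loot: the idea is simply that the creature flees a fast-moving hand but settles on a hand that has been held still for a while.

```python
import math
import random

# Hypothetical sketch of a presence-reactive "butterfly": it darts away
# from a fast-moving hand but lands on one that stays still long enough.
# All constants and names are illustrative assumptions.

HAND_STILL_SPEED = 0.05   # metres/second below which the hand counts as "still"
STILL_TIME_NEEDED = 1.5   # seconds the hand must stay still before landing
FLEE_RADIUS = 0.4         # metres within which a moving hand scares the butterfly


class Butterfly:
    def __init__(self, position):
        self.position = list(position)
        self.landed = False

    def update(self, hand_pos, hand_speed, still_time, dt):
        to_hand = [h - p for h, p in zip(hand_pos, self.position)]
        dist = math.sqrt(sum(d * d for d in to_hand)) or 1e-6

        if self.landed:
            # A sudden hand movement shakes the butterfly off again.
            if hand_speed > HAND_STILL_SPEED:
                self.landed = False
            else:
                self.position = list(hand_pos)
            return

        if hand_speed > HAND_STILL_SPEED and dist < FLEE_RADIUS:
            # Dart directly away from the moving hand.
            self.position = [p - (d / dist) * 1.0 * dt
                             for p, d in zip(self.position, to_hand)]
        elif still_time > STILL_TIME_NEEDED:
            # Drift toward the outstretched, still hand and land on it.
            step = min(0.5 * dt, dist)
            self.position = [p + (d / dist) * step
                             for p, d in zip(self.position, to_hand)]
            if dist < 0.02:
                self.landed = True
        else:
            # Idle flutter.
            self.position = [p + random.uniform(-0.01, 0.01)
                             for p in self.position]
```

The point of the sketch is only that the creature’s state depends on what you are doing with your own body, which is what makes the world feel like it is reacting to your presence.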

Having a body is also important in the VR experience.  I have received feedback from several places now that people like the full body in Lost Loot and that it helps give them a sense of presence in the game.  I have also begun to feel that a proper body reaction to movement matters.  Your brain is wired to expect to see yourself and to see believable movement of your extremities.  Either not having a body, or having one that is static and never moves, seems to put people off.

Since the VR Jam, the Sixense STEM Kickstarter has launched.

This video shows how positional movement of the main body, in addition to hand interaction, is going to be important for the future of immersive VR.  It is another indicator of how body and presence in the world are important in making your mind go along with it all.

My hope is that if Oculus implements positional tracking in the headset, they could build something similar to the STEM system and provide a complete, integrated package.  For VR game developers, having a complete system for both immersion and interaction will go a long way toward making it possible to build true VR games.

Additionally, if the plan is to have an integrated Android system with a reasonable GPU, perhaps this could all become a mobile, wireless solution allowing complete autonomy for VR walkabouts, limited of course to the capabilities of the internal Android system unless tethered to a PC or other device.  This is sort of my dream device.

So, here is one example of how this works and doesn’t work.  While playing with the Hydra I realized that tracked hand motion is important when idle, but when swimming or even walking the hands need to revert to animated motion, or something else that goes with the activity.  The hands reaching out in front and swinging back while swimming give a very comfortable feel and reinforce the motion your brain is seeing.
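A minimal sketch of that idea, assuming a simple linear blend between the live tracked pose and an authored swim/walk animation.  The state names, blend speed, and lerp are my own illustration rather than anything from Lost Loot or the Hydra SDK.

```python
# Blend between a tracked hand pose (used while idle) and an authored
# animation pose (used while swimming or walking).  Poses are treated
# here as simple lists of floats, e.g. [x, y, z].

def lerp(a, b, t):
    """Linearly interpolate between two poses."""
    return [x + (y - x) * t for x, y in zip(a, b)]


class HandPoseBlender:
    def __init__(self, blend_speed=4.0):
        self.blend_speed = blend_speed  # how quickly we fade between sources
        self.blend = 0.0                # 0 = fully tracked, 1 = fully animated

    def update(self, tracked_pose, animated_pose, activity, dt):
        # Fade toward the animation whenever the player is swimming or
        # walking, and back toward live tracking when they stop.
        target = 1.0 if activity in ("swim", "walk") else 0.0
        step = self.blend_speed * dt
        if self.blend < target:
            self.blend = min(target, self.blend + step)
        elif self.blend > target:
            self.blend = max(target, self.blend - step)
        return lerp(tracked_pose, animated_pose, self.blend)
```

Fading over a fraction of a second, rather than snapping, avoids the jarring pop when the hands switch from what you are doing to what the animation is doing.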

Another aspect to this is a proper body.  The Tuscany demo I tried with the Hydra uses disembodied, floating hands that you can use to grasp objects.  This is very disconcerting in VR compared to having a full body with IK-controlled arms connecting the hands to the body.  This is just another indication of your brain’s need to see properly operating limbs when you do have a body.
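For reference, the usual way to connect a tracked hand back to the body is two-bone IK on the arm.  The sketch below is a simplified 2D version using the law of cosines; a real arm solver works in 3D and adds an elbow pole/hint, but the core step is the same.  The function name and the example lengths are illustrative.

```python
import math

# Simplified 2D two-bone (shoulder-elbow-hand) IK: given a shoulder
# position and a tracked hand target, compute where the elbow should go
# so the arm stays visually attached to the body.

def two_bone_ik(shoulder, hand_target, upper_len, forearm_len):
    """Return the elbow position connecting shoulder to hand_target."""
    sx, sy = shoulder
    tx, ty = hand_target
    dx, dy = tx - sx, ty - sy
    dist = math.hypot(dx, dy)

    # Clamp the reach so the chain neither overextends nor collapses.
    dist = max(abs(upper_len - forearm_len) + 1e-6,
               min(upper_len + forearm_len - 1e-6, dist))

    # Law of cosines: angle at the shoulder between the shoulder->target
    # line and the upper arm.
    cos_a = (upper_len ** 2 + dist ** 2 - forearm_len ** 2) / (2 * upper_len * dist)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))

    base = math.atan2(dy, dx)
    return (sx + upper_len * math.cos(base + angle),
            sy + upper_len * math.sin(base + angle))


if __name__ == "__main__":
    # Shoulder at the origin, hand tracked 0.5 m out and 0.2 m up,
    # with a 0.3 m upper arm and 0.3 m forearm.
    print(two_bone_ik((0.0, 0.0), (0.5, 0.2), 0.3, 0.3))
```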

This can also be tied back to the understanding that cockpits help provide a frame of reference that reduces feelings of motion sickness.  The same can be said of a body being a frame of reference.  I have even seen feedback on a demo with a cockpit but no body where the end user complains about the lack of a body.  This goes one step further with complaints about a static body versus one that reaches out and manipulates controls or a flight stick in response to user input.  Being able to see the body, and to see it reacting, as a frame of reference seems important.

Changing the topic: Oculus was in Boston yesterday, which was very exciting.  It was a special developer meetup and hiring event.  Oculus gave presentations about VR and the company, and separately there was a whole series of talks and exhibits by local researchers and developers, split into two tracks: one on games and one on education and other non-game subjects.  I was able to meet Peter Giokaris, one of the engineers at Oculus who helped develop the Unity integration.  His Unite talk, which covers the Unity integration, is really good and worth watching.

I was also able to try Lost Loot on an HD Prototype Rift and see how it compared to the DK1.  The resolution improvement removes any concern I had about outdoor scenes and viewing detail at a far distance.  There is also none of the screen-door effect that people often mention with the current development kit.  The other big improvement was color, which I found to be much better, and the screen was also much brighter overall.  I can’t wait to see the additional improvements in whatever finally goes into DK2 and the commercial version.

One last thought.  I guess I want to somehow relate my feelings on this technology and the meetup yesterday.  VR, to me and to many others, is a collective dream: something we have wished for, dreamt of, and grown up with in science fiction.  To be a part of creating it is something special, even if this is take two.  We all hope for and want to work toward it being the next great paradigm shift in human and computer interaction.  Games are the obvious starting point, but there are so many other areas where it will make a huge impact on society, the most profound probably being education: changing the way we learn and hopefully bringing it closer to the natural, instinctive mechanisms of exploration and play.  Another extremely important aspect of the Rift is pure economics.  It represents inexpensive VR for the masses.  This time around I think it will happen and real history will be made.

Check out the previous installment at Rift Ramblings (Part 2).

 