Virtual reality (VR) is one of the most interesting fields of computer graphics. Along with augmented reality (AR), it has sparked new ideas and services across many industries. Recently I needed an impressive demo to showcase at our booth at an important event. That’s when I thought, “Why not build a VR demo showcasing Watson services?”

I started planning the demo. I knew I wanted it to be a game. The premise was a space exploration simulator: the player was a commander in the space federation, searching a planetary system to rescue alien refugees from their pursuers. The player interacted with the ship entirely through voice commands. I used a mix of the Speech to Text, Conversation, and Text to Speech services to simulate an interactive ship; the Conversation service’s intents were especially useful for issuing commands to the ship.

Here’s how I created the Watson Commander demo:

  • To lay the groundwork, I set up the Unity game engine with the SteamVR Plugin to support the HTC Vive.
  • I created the environment. I searched for a good space skybox, added a sphere with a yellow halo to serve as the sun, and set it to illuminate the scene by attaching a light source to it (see the first sketch after this list).
  • I added three planets I found in an asset pack and gave each one a name to use in navigation.
  • I added the ship. Unfortunately, the only ship I found was a model without an interior, which is fine in most games but not for a VR demo; it would be a waste not to give the player the feeling of being inside a spaceship. After looking at a few concept images for spaceships, I decided to model a ship myself, roughly based on the Cobra MKII from “Elite Dangerous”. I designed the interior to fit the voice-activated ship concept: I removed the steering wheel from the cockpit and added a big center console.
  • I added an enemy ship that I got from an asset pack.
  • After the setup was done, I proceeded with the scripting. I thought about the architecture of the game and how the components would interact with each other. After reading for a while, I decided to use top-down messaging for the ship’s guns and GameObject referencing for the other components. The shooting system broadcasts a “Shoot” message to the guns whenever a shoot command is detected. Combined with a bullet prefab that detects collisions, this gave me a simple damage model: the bullet’s damage is subtracted from the target ship’s HP, and when the HP reaches 0 the ship explodes (sketched after this list).
  • After that I moved on to the navigation scripts. There were two types of navigation I could have implemented: an organic, physics-based flight model without gravity, or the simple “magical” approach of orienting toward the target and moving forward. For the sake of simplicity and time, I chose the latter. After trying different approaches, I settled on a mix of Quaternion.LookRotation and Transform manipulation (see the navigation sketch after this list). The navigation script takes the target as either a planet name or the player’s name (in the case of the enemy ship).
  • Finally, the Watson integration. Thankfully, IBM published a Watson SDK for Unity to make this easier, with plenty of prefabs and examples to get started. I used tag 1.3.1 from the GitHub repo because it has the most examples. The Text to Speech and Conversation services are as easy as an API call. The catch is the Speech to Text service: it isn’t a plain API call but a socket connection, which took a lot of searching and trial and error (the dispatcher sketch after this list shows the game-side pattern). Unfortunately, I ran out of time, so I decided to use IBM’s Speech Sandbox for this specific event and continue working on the demo later.
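
For concreteness, here are a few C# sketches of how the pieces above fit together; the class and field names are my own illustrations, not pulled from the actual demo. First, the sun: the light can be attached in the editor, but a minimal script version looks like this (range and intensity are placeholder values):

```csharp
using UnityEngine;

// Attached to the sun sphere so it lights the surrounding space.
public class SunLight : MonoBehaviour
{
    void Start()
    {
        // A point light radiating from the sun's position.
        Light sunLight = gameObject.AddComponent<Light>();
        sunLight.type = LightType.Point;
        sunLight.color = Color.yellow;
        sunLight.range = 10000f;   // placeholder: far enough to reach the planets
        sunLight.intensity = 1.5f; // placeholder
    }
}
```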
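
Next, the shooting system: a controller broadcasts “Shoot” down the hierarchy, each gun spawns a bullet prefab, and the bullet applies damage on collision. A minimal sketch, assuming each component sits on the appropriate GameObject (in a real project each MonoBehaviour goes in its own file):

```csharp
using UnityEngine;

// On the ship root: BroadcastMessage walks down the hierarchy, so every
// child gun with a Shoot() method fires. This is the top-down messaging.
public class FireController : MonoBehaviour
{
    // In the demo this would be triggered by a recognized voice command.
    public void Fire()
    {
        BroadcastMessage("Shoot", SendMessageOptions.DontRequireReceiver);
    }
}

// On each gun: spawns a bullet when the "Shoot" message arrives.
public class Gun : MonoBehaviour
{
    public GameObject bulletPrefab; // a prefab with Bullet and Rigidbody components
    public float bulletSpeed = 100f;

    void Shoot()
    {
        GameObject bullet = Instantiate(bulletPrefab, transform.position, transform.rotation);
        bullet.GetComponent<Rigidbody>().velocity = transform.forward * bulletSpeed;
    }
}

// On the bullet prefab: applies damage on impact, then destroys itself.
public class Bullet : MonoBehaviour
{
    public float damage = 10f; // placeholder damage value

    void OnCollisionEnter(Collision collision)
    {
        ShipHealth health = collision.gameObject.GetComponent<ShipHealth>();
        if (health != null)
            health.TakeDamage(damage);
        Destroy(gameObject);
    }
}

// On each ship: tracks HP and explodes when it runs out.
public class ShipHealth : MonoBehaviour
{
    public float hp = 100f;
    public GameObject explosionPrefab; // assumed explosion effect

    public void TakeDamage(float amount)
    {
        hp -= amount;
        if (hp <= 0f)
        {
            Instantiate(explosionPrefab, transform.position, Quaternion.identity);
            Destroy(gameObject);
        }
    }
}
```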
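
The navigation follows the “orient and move forward” approach: find the target by name, rotate toward it with Quaternion.LookRotation, and translate along the ship’s forward vector. The speeds and the GameObject.Find lookup are simplifications:

```csharp
using UnityEngine;

// The simple "magical" navigation: orient toward the target, move forward.
public class ShipNavigation : MonoBehaviour
{
    public float moveSpeed = 50f; // placeholder units per second
    public float turnSpeed = 2f;  // placeholder turn rate
    private Transform target;

    // Called with a planet name, or the player's name for the enemy ship.
    public void SetTarget(string targetName)
    {
        GameObject found = GameObject.Find(targetName);
        if (found != null)
            target = found.transform;
    }

    void Update()
    {
        if (target == null)
            return;

        // Rotate smoothly to face the target, then fly straight at it.
        Vector3 direction = (target.position - transform.position).normalized;
        Quaternion look = Quaternion.LookRotation(direction);
        transform.rotation = Quaternion.Slerp(transform.rotation, look, turnSpeed * Time.deltaTime);
        transform.position += transform.forward * moveSpeed * Time.deltaTime;
    }
}
```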
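
Finally, the game side of the Watson integration. I’ll leave the SDK calls themselves to the examples in the 1.3.1 tag; once Speech to Text has produced a transcript and Conversation has classified it, a dispatcher maps the intent to a ship action. The intent names (“shoot”, “navigate”) and the entity handling here are hypothetical:

```csharp
using UnityEngine;

// Maps intents classified by the Conversation service to ship actions.
// The intent names and entity handling are hypothetical; the actual
// Speech to Text and Conversation wiring follows the SDK examples.
public class CommandDispatcher : MonoBehaviour
{
    public FireController fireController; // the shooting controller above
    public ShipNavigation navigation;     // the navigation script above

    // Called once Conversation has classified the player's utterance.
    public void OnIntent(string intent, string entity)
    {
        switch (intent)
        {
            case "shoot":
                fireController.Fire();
                break;
            case "navigate":
                // The entity would be a planet name extracted by Conversation.
                navigation.SetTarget(entity);
                break;
            default:
                Debug.Log("Unrecognized command: " + intent);
                break;
        }
    }
}
```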

In the end, it was a lot of fun working on a VR project. After experiencing it first-hand, I can see why people are so excited about it. There’s something thrilling about stepping inside your own creation and experiencing it as if you were really there. I recommend exploring the possibilities of VR for yourself. Whether it’s a demo for an event or a business application, VR is out to change the world.
