Finally. A 3D gig. And doing Virtual Reality. This is going to be awesome! I have spent many hours in front of one particular game engine, Unity3D. Most of those hours have been spent tinkering away, building endless prototypes of games; 50+ by last count, with one actually released. Built for iOS, it’s lost to the ages now because I did not renew my developer licence (nor did I want to). We do it for the love, right?

Most of my early days with computers were spent sitting in front of a Commodore 64. I even used to try to program the Dick Smith Wizzard. The days of watching the tape drive slowly load in a game or application are long gone, but the memory remains.

After a number of years having too much fun in bands, I came back to a world of PCs. From there I played lots of games on many platforms, and watched the emphasis in games shift over the years from gameplay, to graphics, to narrative. But nothing changed the gaming experience quite like the release of the Oculus Rift Virtual Reality headset.

I experimented with VR early on, but in those days adopters were mostly enthusiasts. Hundreds of demos were released over the course of a few years, ranging from meditation to horror experiences and everything in between. The technology was not quite ready, the killer app was not there, and mainstream adoption would take time.

When I got my first consulting gig at DiUS, I was placed at REA Group. While there, I had the chance to join a team and participate in their hack day. Wouldn’t you know it, one of the teams wanted to build a VR project. I joined and helped create an experience where you walked in the shoes of a homeless person. It was a great experience, and the first time outside of my personal time that I was able to use my Unity3D skills.

Fast forward a few years and the industry has changed: VR is now mainstream on smartphones, the technology is a buzzword in the commercial world, and Unity3D is an accepted part of the VR development landscape.

In August 2016, my work wishes came true. I landed a 3D role, sent back out to help build a VR application for REA Group (realestate.com.au). Paired alongside Andras Bubics, onward we went. The following paragraphs detail DiUS’s first journey into the uncharted waters of VR development.


Photo: Andras and I immersed in VR at REA Group.

First months

The team at REA Group was pretty excited. The brief was clear: build a VR application for Google to launch on day one of their new Daydream VR platform. A team was already assembled: a grad, two devs, a product owner, a BA, and two designers. With us onboard, we had four developers.

From the start our goals were clear: get a CD environment set up that builds the Unity3D APK for Android. Sprinkle it with some TDD and the programming gods would be happy. Once that was working, we would contribute to the project.

There was a push from the team to get us onto the good Unity3D stuff sooner rather than later, but we knew the CD environment was important, so we kept at it for the entire first month. It gave us better insight into how a Unity3D development environment fits with modern industry practices. It had its challenges, but we did succeed in setting up code testing and a deployment environment.
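For context, a headless Unity build on a CI box typically boils down to invoking the editor in batch mode. This is a minimal sketch; the paths and the `BuildScript.PerformAndroidBuild` method name are hypothetical, not the project's actual setup:

```shell
# Invoke the Unity editor headlessly on the CI agent.
# -batchmode / -quit: run without the GUI and exit when done.
# -executeMethod: call a static C# method that wraps BuildPipeline.BuildPlayer.
/opt/Unity/Editor/Unity \
  -batchmode -quit \
  -projectPath /var/jenkins/workspace/vr-app \
  -buildTarget Android \
  -executeMethod BuildScript.PerformAndroidBuild \
  -logFile build.log
```

Jenkins runs a command along these lines on every push, with the APK archived as a build artifact.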

During the initial UX and design work, there was a mix of VR experience at play, from none to plenty; at times it felt like we were all junior game devs. I also spent some time early on wireframing UX in VR. There was a lot of discovery for me here and some great learning to take forward.

The differences in UX for VR are physiological and vestibular. Mapping journeys and navigation exploration feel no different in 2D or 3D; the biggest difference is in usability testing, which should be done in VR as soon as possible. We approached this with a ‘test’ scene, where we created a small user panel floating in 3D space, then swapped between image mockups in realtime while the test users wore the VR headset. We found that this is actually how the emerging VR industry recommends you test your VR applications.

Middle of the project

During the project, something clicked for me. I learnt a lot from my fellow DiUS developer, Andras, whose experience showed constantly. Combining that with my usual go-to Unity3D dev patterns taught me a great deal.

I also un-learnt a lot. It took me a while to break out of a Unity3D-focused world. As a 3D editor, it is one of the best introductions to 3D for non-programmers. For programmers, it removes so much low-level work that you can get on with creating cool 3D stuff. You use C# to talk to the Unity libraries, so it is easy to script without doing anything .NET-heavy. You create simple C# scripts that make Unity components talk to each other. Then a unique paradox comes to the fore.

Non-programmers start programming, and it works. Programmers can get lazy, as you can just grab a component to do stuff and, with a little sauce, it works. You program with the editor in mind, not with programming in mind. You can get by with a little knowledge, and programmers can take shortcuts very easily. You usually end up with code that looks like pre-framework JavaScript.

In Unity3D, the usual approach is that each ‘Game Object’ in the scene is programmed as both the model and the viewmodel of itself. You attach components such as physics and mesh renderers to it, then program additional behaviour for these components in separate classes, attached to that game object or even to other game objects. This scripting freedom is what gives Unity3D its appeal and ease of use, and why you can build fast in its environment. It also means you can end up with a messy domain model.
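As a rough sketch of that pattern (the `Door` class and its fields are my own illustration, not code from the project), a typical component script mixes state, behaviour, and rendering concerns on one game object:

```csharp
using UnityEngine;

// A typical Unity3D component: attached to a GameObject in the editor,
// it acts as both the model and the viewmodel of that object.
public class Door : MonoBehaviour
{
    // Wired up by hand in the Unity inspector.
    public AudioSource creakSound;
    public float openAngle = 90f;

    private bool isOpen;

    // Called by Unity when another collider enters this object's trigger.
    void OnTriggerEnter(Collider other)
    {
        if (!isOpen && other.CompareTag("Player"))
        {
            // Grab sibling components and "just make it work" --
            // convenient, but the domain logic lives in the scene graph.
            transform.Rotate(0f, openAngle, 0f);
            creakSound.Play();
            isOpen = true;
        }
    }
}
```

Each script like this is quick to write, but the domain model ends up smeared across dozens of scene objects.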

To clean up the domain model, we pulled the business logic out of Unity3D-specific code. Doing this, I could see parallels with every web framework I have tried. It suddenly looked like a classic web app, with one-way data flowing down to Unity components that simply rendered whatever they were given.
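As an illustrative sketch of that separation (the class names here are hypothetical, not the app's actual domain model), the business logic becomes a plain C# class and the Unity side shrinks to a thin view:

```csharp
using UnityEngine;

// Plain C# domain model: no Unity dependencies, so it can be
// unit-tested outside the Unity runtime.
public class PropertyListing
{
    public string Address { get; }
    public int RoomCount { get; }

    public PropertyListing(string address, int roomCount)
    {
        Address = address;
        RoomCount = roomCount;
    }
}

// Thin Unity "view" component: data flows one way, down into it,
// and it simply renders whatever it is given.
public class ListingView : MonoBehaviour
{
    public TextMesh addressLabel;

    public void Render(PropertyListing listing)
    {
        addressLabel.text = listing.Address;
    }
}
```

The MonoBehaviour no longer owns any business rules; it is just the last stop for the data on its way to the screen.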

As expected, this separation helped our focus on TDD. We used NUnit and tested the behaviour of the app. Unity3D lifecycle events make testing challenging, which further encourages (forces?) you to break your business logic out.
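A minimal sketch of what those tests looked like, assuming a plain C# class with no Unity dependency (the `TourProgress` class here is a made-up example, not from the app):

```csharp
using NUnit.Framework;

// Hypothetical plain C# class: because it never touches UnityEngine,
// it runs under plain NUnit on the build box, no editor required.
public class TourProgress
{
    public int RoomsSeen { get; private set; }

    public void VisitRoom() => RoomsSeen++;
}

[TestFixture]
public class TourProgressTests
{
    [Test]
    public void Visiting_a_room_increments_the_count()
    {
        var tour = new TourProgress();
        tour.VisitRoom();
        Assert.AreEqual(1, tour.RoomsSeen);
    }
}
```

Anything tangled up in `Awake`, `Start`, or `Update` is much harder to exercise this way, which is exactly the pressure that pushes logic out of the MonoBehaviours.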

So now we had Unity3D with models, views, and tests. I no longer need to remember everything about every part of the app; my tests can catch anything bad, and no one can break the app accidentally. Jenkins runs the tests and builds an APK after every push to git :) With a little more work, this Jenkins box will be a complete CD pipeline from code to the Google Play store.

This was great. So with one month to go, we were just waiting on Google to send that VR kit they promised. Saaaweet!

The final straight

The BA went on a five-week holiday close to product launch. We could do the BA work okay, but no one took complete ownership, and the board became more of a sticky-note area.

A little panic crept in, and no one was ever 100% sure what was most important to get done. We had to build a product that conformed to the Google VR guidelines but also met REA Group’s expected value proposition.

Andras and I ended up supporting two REA devs on the ground in San Francisco for this final phase. The app had one major SDK for 3D home content, from a company called Matterport. When we finally got a pre-release Daydream VR SDK into the Unity3D project, it broke the Matterport VR viewer. When Google changed something, Unity3D was usually fine, but Matterport would have to play catch-up, and then we had to make all three sing together. We worked across the continents to address every concern that came up, and it went better than expected.

The product owner took over the BA role, which ensured the business needs were met. I think we devs on our own would have had different intentions: the business had clear goals for the app, but the devs always had our own list of what we considered important.

Despite the confusion at this time, I felt we were in a good place, and the main user experience was pretty sound going into launch.

It was intense in the last couple of weeks, but we got there. The app launched on November 11. Since its release, interest in the product from REA Group customers has increased; they see an opportunity to leverage the VR platform with consumers.


Photo: Inside the realestate app.

Would I do it differently?

Yes. I would.

Test the value prop early, then usability. We started with a clear idea of the value proposition at project kick-off. For a number of reasons outside the team’s control, we kept testing the value prop rather than the usability. We constantly debated it, got confused, and it steered UX decisions. We were going to run with the original value prop in any case, as REA already knew the desired market outcome for the product.

Use the Unity3D profiler. We would make changes in the code and not notice a loss in performance; you are developing on a machine with far more hardware power than a smartphone. Post-release, we could not apply some performance improvements without a major refactor. We should be using the profiler all the time to measure every change and addition to the code, and also running it against the product on the smartphone itself.

Increase my knowledge of Unity3D and testing with NUnit. The code was written to be highly testable, but we ended up with big, bloated classes that were hard to follow at times, and test coverage was low. At DiUS, we need to emphasise to our clients why we aim for TDD in the first place. In the final sprint to release, changes were made that broke functionality, costing time on new and unexpected bugs. Without tests, they made it into production. They were not show-stoppers, but we lost the ability to know when and why bugs crept in.

Finalise the automation. In our retrospective, the benefits of an automated build pipeline were not recognised as much as I would have hoped. We never got to completely finish it due to time constraints, and as mentioned above, changes made in the final sprint broke functionality. This project highlighted to me why an automated solution can be so important and why we need to ensure our clients understand the benefits.

Stability in a team matters. This also came up in the retrospective. The team was mostly seniors in their disciplines, and that made a difference when we had to make decisions about the product. We could organise ourselves, but we also had core roles to fulfil, and when we lacked personnel, it created confusion during the project.

Photo: Joe comes to see what all the fuss is about.

Lastly…

If you want to learn about these new exciting platforms, grab yourself Unity3D for starters. Get a mobile phone and do some simple VR development.

Search for ‘Virtual Reality’ or ‘Augmented Reality’ on Gizmodo and VentureBeat to keep abreast of news in this area. Most news sites will have similar articles, but these are great sources of news in the space.

Overall, the project was an awesome experience for me personally, and all my Unity3D work outside of my normal day job finally has some relevance. The product is a success. Virtual and Augmented Reality are going to be massive, so I am here for the ride.

Want to know more about realestate VR? Read the case study: Australia’s first virtual property app