Posts by brook

AR Coming of Age – and some AR you might not have seen

Posted by Brook on Feb 22, 2016 in Blog Post


A few years ago I was quite dubious about the future of Augmented Reality. We had made our first Augmented Reality application for webcam in 2009. At the time, Augmented Reality (AR) involved a dedicated design that acted as an anchor in the webcam image. The anchor is something the camera can recognise at any angle and use to position and orient a projected image.

We had released a video game called ‘Dragon Master Spell Caster’, and at a conference we offered a range of cards that, when viewed with our app through a webcam, displayed each of our four dragon characters. The anchor was a geometric design in the centre of our richly decorated cards; the four animated dragons were the projected images.

Dragon Master Spell Caster Card

Both sides of a Dragon Master Spell Caster Card with the anchor, an almost closed black square on the right hand side

We made plans for a range of these cards on which players could pit their dragons against each other in AR. While people were fascinated when they saw the dragons at conferences, the uptake and engagement afterwards was limited. It was clear then that Augmented Reality was largely a novelty, and I dismissed it. How wrong I was.

A few years later, with the rise of enabling technologies such as the smartphone and tablet I have seen some amazing applications of AR. The anchor is no longer an artificial addition to the scene, but can be some actual item in the environment. It might even be geolocation from the smartphone’s Assisted GPS and accelerometer. Far from being novelties, the new generation of AR spans entertainment, education, data visualisation, and many other fields. Here are a few applications that I have been around or involved with in the last two years that are worth a look.

QuiverVision

We shared some office space with one of the creators of ColAR, now called Quiver (quivervision.com). With this app, children colour in a picture of a scene, and when the scene is viewed with the app on a smartphone or tablet, it comes to full 3D life, coloured with the child’s own pencil strokes.

The breakthrough in this app was not just the ability to use any line drawing as the anchor for the projected reality scene, but the ability to take the coloured pencil strokes and map them onto the 3D objects in the scene. Content is king, but interacting with the content rules.

Snappadoodle

The Snappadoodle (www.snappadoodle.com) team are two Christchurch entrepreneurs who came up with the idea for their app during a startup weekend at the now famous EPIC centre in Christchurch. This app was one of my favourites. In this case the anchor is a person’s facial characteristics, and the projected overlay is an animated head, replacing the subject’s head.


The anchor in this app is facial tracking. The phone, thanks to technology from Visage Technologies (visagetechnologies.com), can recognise the position, orientation and scale of the face of the subject in view. The projected image in this case is one of several animations created by Stickmen Media (www.stickmenmedia.com). The app allows the user to capture video and post it to Facebook or YouTube.

Augview

Locating underground assets is a serious task that benefits from Augmented Reality. Augview (www.augview.net) uses AR to show users the locations of underground assets such as pipes and drains. In this case the anchor is geolocation, and the projected images are the pipes themselves.

Otorohanga Bird Sanctuary

Otorohanga Bird Sanctuary (http://www.newzealand.com/in/plan/business/otorohanga-kiwi-house-and-native-bird-park/) used AR to bring back the giant moa, an extinct bird from New Zealand’s past. Augview’s (www.augview.net) geolocation technology was used to position the moa within the sanctuary, where it could be viewed with a smartphone or tablet. The moa itself was developed by Stickmen Media (www.stickmenmedia.com) with the assistance of renowned moa expert Trevor Worthy.

What’s Next?

These apps, while interesting or entertaining, haven’t yet ignited the kind of public interest that we have seen in Virtual Reality. Analysts have been skeptical about the prospects of AR, especially since the failure of Google Glass. Gartner’s famous Hype Cycle for 2015 shows Augmented Reality buried in the ‘Trough of Disillusionment’ (www.gartner.com/newsroom/id/3114217). For the moment at least, technology analysts are waiting. There are two technologies they may be waiting for: Microsoft’s HoloLens (www.microsoft.com/microsoft-hololens) and Magic Leap (www.magicleap.com).

HoloLens is a wearable device from Microsoft that allows you to experience holograms in the real world. It is capable of both anchored and free-floating projections, and with an array of sensors it does a very good job of locating its projections in space. Microsoft has also been ahead of the game in creating content, developing a workflow using the Unity game engine and Visual Studio (see this review on Forbes), and is now recruiting developers to purchase its development version.

Magic Leap is something of a mystery. Again, it can place projected images into the real world with what appears to be rock-solid anchoring, using a technology called Dynamic Digitized Lightfield Signal™. The system maps out the 3D space around the user, allowing projected images and animations to pass behind objects in the environment. The company recently received $793.5 million from Alibaba Group and has ongoing support from Google and Qualcomm. Magic Leap may also be courting content developers of the highest order. Sir Richard Taylor of the New Zealand Weta group of companies, who brought us the Lord of the Rings and The Hobbit movies, is a member of the board of directors. He also has a role as the company’s arts ambassador to China (see the press release here).


Beyond these, in the not-too-distant future, are wearables such as AR contact lenses (CNET article on AR contacts).

Time will tell if these technologies have the accelerating effect on AR that smartphones and tablets did. I’m certainly not skeptical now. Having seen what can be done I, for one, would be deeply disappointed if AR didn’t yield its promised bounty of superimposed entertainment and data visualisation.




Physics and Frame Rate: Beating motion sickness in VR

Posted by Brook on Dec 9, 2015 in Blog Post


Many will be aware that one of the big technical risks in Virtual Reality projects is motion sickness, or a subset of motion sickness called simulator sickness. Simulator sickness is experienced by people in virtual reality and is well known from pilot training in flight simulators. It is believed that discrepancies between the motion the simulator shows and the motion the subject feels cause the brain to rebel and can even induce nausea.

Agent Smith in The Matrix

“Whole crops were lost!”

In our wheelchair trainer this form of motion sickness can compromise training by making the subject unwilling or unable to continue.

We developed the virtual reality wheelchair trainer in May this year. It was a joint project between Callaghan Innovation, the Burwood Academy of Independent Living at Burwood Spinal Hospital, and Stickmen Media, our video game development company. MTech Games was formed to commercialise the product. The idea was to produce a virtual reality environment for training and assessing people with traumatic spinal injury in the use of powered wheelchairs.

The result was a success: in a pilot study, current wheelchair users and clinicians all agreed that the trainer was just like being in a powered wheelchair. Unfortunately, many people reported feelings of nausea after using the trainer. In fact, one collaborator joked that we should establish a bucket list, a list of those subjects who required a bucket during training. As part of the commercialisation effort we had to eliminate simulator sickness.

We began with a list of the effects reported by users in the initial pilot study that contributed to their feelings of discomfort:

  1. Flicker of the environment, a flickering of the whole scene that happened in particular circumstances;
  2. Flicker in the models, small areas of the model that flickered persistently;
  3. A feeling of sliding on corners; and
  4. A feeling of lightness in the chair; it started and stopped too fast.

With some investigation we determined the following reasons for these effects.

Frame Rate Judder

The flicker of the entire environment is a common problem in VR headsets such as the Oculus Rift. It has been termed judder, and it occurs when the frame rate drops below 75 fps. This was a challenge: our 3D models are a very accurate representation of the Burwood TransitioNZ spinal unit. The first thing we did was to ensure that all non-moving parts of the environment model were being treated as static. In other words, we made sure that the engine (Unity 5) wasn't wasting cycles checking whether those models had moved.
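
For anyone attacking the same problem in Unity, the change itself is small. The usual route is simply to tick the Static flag on the scenery objects in the editor; the sketch below is a minimal runtime equivalent (the class and field names are our own, purely illustrative) that batches everything under a single non-moving environment root so the engine stops treating it as potentially moving geometry.

    using UnityEngine;

    // Minimal sketch: combine everything under a non-moving scenery root into
    // static batches at startup. Equivalent to ticking 'Static' in the editor,
    // but done in one place. 'environmentRoot' is an illustrative name.
    public class StaticEnvironment : MonoBehaviour
    {
        public GameObject environmentRoot;   // parent of all non-moving scenery

        void Start()
        {
            // The children are drawn as large static batches from here on,
            // and the engine no longer checks them for movement each frame.
            StaticBatchingUtility.Combine(environmentRoot);
        }
    }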

This made quite a difference, but we couldn't quite tell how much, so we implemented a frames-per-second counter that could always be seen from the chair. We found that the frame rate was at 75 fps while the wheelchair was stationary, but was mostly below 60 fps while travelling. In fact, we found that each time we accelerated, the frame rate dropped.
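
Our counter was built into the scene so it sat in front of the chair, but the core of it is tiny: smooth the unscaled frame time and display its reciprocal. This is just a hedged sketch with an illustrative smoothing factor; in a headset you would render the value on a world-space panel attached to the chair rather than using OnGUI.

    using UnityEngine;

    // Sketch of a simple frames-per-second counter. The smoothing factor is
    // illustrative; for a headset, put the value on a world-space canvas
    // attached to the chair instead of using OnGUI.
    public class FpsCounter : MonoBehaviour
    {
        float smoothedDelta = 1f / 75f;   // start near the target frame time

        void Update()
        {
            // Exponential moving average of the frame time, ignoring time scale.
            smoothedDelta += (Time.unscaledDeltaTime - smoothedDelta) * 0.1f;
        }

        void OnGUI()
        {
            int fps = Mathf.RoundToInt(1f / smoothedDelta);
            GUI.Label(new Rect(10, 10, 120, 25), fps + " fps");
        }
    }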

The next thing we did was to change the way we read our controller information, using threaded lookups and thereby decoupling the controller from the update loop. This caused the frame rate to sit at 75 fps consistently, even when driving. There are still some places where it drops to 60 fps, but we suspect that some occlusion culling in our environment may fix these niggling problems. In-house testing showed a dramatic reduction in reported simulator sickness with the improved frame rate.
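
The details depend on the controller hardware, so treat the following as the shape of the fix rather than our exact code: a background thread polls the device (ReadController is a hypothetical stand-in for the real, possibly blocking, hardware call) and Update only ever copies the most recent sample, so a slow read can never stall a frame.

    using System.Threading;
    using UnityEngine;

    // Sketch of decoupling a possibly slow controller read from the render loop.
    // ReadController is a hypothetical stand-in for the real device call.
    public class ThreadedController : MonoBehaviour
    {
        Thread pollThread;
        volatile bool running;
        readonly object latestLock = new object();
        Vector2 latestInput;              // most recent joystick sample

        void OnEnable()
        {
            running = true;
            pollThread = new Thread(PollLoop) { IsBackground = true };
            pollThread.Start();
        }

        void OnDisable()
        {
            running = false;
            pollThread.Join();
        }

        void PollLoop()
        {
            while (running)
            {
                Vector2 sample = ReadController();          // may block; that's fine here
                lock (latestLock) { latestInput = sample; }
            }
        }

        void Update()
        {
            Vector2 input;
            lock (latestLock) { input = latestInput; }
            // ...feed 'input' into the chair physics from here...
        }

        // Placeholder for the real (blocking) hardware read.
        Vector2 ReadController() { Thread.Sleep(5); return Vector2.zero; }
    }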

3D Model Overdrawing

There were several places in our environment which flickered annoyingly. Many of them were on the ceiling or above doors, typically just above our normal line of sight. Such flickering, barely in the field of view, can produce nausea and headaches; at the very least it is an uncomfortable visual distraction. As game developers we recognised what caused this flickering: it occurs whenever there are coincident planes in the 3D world (often called z-fighting). The game engine, having ascertained that both planes are visible, attempts to draw first one and then the other in rapid succession.

We eliminated all of these areas with some millimetre tweaks to the geometry of the environment. Testing again showed an incremental improvement in motion sickness.

Physics

The next two items seemed to be related. We have seen previously that the brain is very good at interpreting the physics of the real world, producing a set of expectations based on perceived motion. The physics of our chair was a complicated algorithm accounting for its jockey wheels, contra-rotating drive wheels, and drift reduction on corners. Since we were directly calculating velocity and assuming that the base was frictionless, we had two slight but perceptible problems with our motion:

  • The momentum felt wrong, which our users reported as the chair feeling light; and
  • There was still a slight feeling of drift on the corners.

Our calculations had been lovingly crafted from observations of an actual power wheelchair on a floor marked out in units using duct tape. Nonetheless it seemed they were not adequate for the harsh reality of VR. We scrapped our calculations and began a meticulous rebuild using a physics engine. We modelled wheels, suspension, motor torque and speed limiting.
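
We won't reproduce the full chair here, but the skeleton of the rebuild looks something like the sketch below: a Rigidbody body, Unity WheelColliders for the two drive wheels, differential torque from the joystick, and a crude speed limit. All of the numbers and names are illustrative rather than our tuned values.

    using UnityEngine;

    // Skeleton of a physics-based powered chair: two driven wheels on
    // WheelColliders, differential steering, and a simple speed limit.
    // All values are illustrative, not the tuned ones from the trainer.
    [RequireComponent(typeof(Rigidbody))]
    public class PoweredChair : MonoBehaviour
    {
        public WheelCollider leftWheel;
        public WheelCollider rightWheel;
        public float maxTorque = 60f;     // per drive wheel, illustrative
        public float maxSpeed = 1.8f;     // m/s, roughly walking pace

        Rigidbody body;

        void Start() { body = GetComponent<Rigidbody>(); }

        void FixedUpdate()
        {
            // Joystick: vertical is forward/back, horizontal is turn.
            float forward = Input.GetAxis("Vertical");
            float turn = Input.GetAxis("Horizontal");

            // Taper the torque off as the chair approaches its top speed.
            float speedFactor = Mathf.Clamp01(1f - body.velocity.magnitude / maxSpeed);

            // Differential drive: turning adds torque to one wheel and removes it from the other.
            leftWheel.motorTorque = (forward + turn) * maxTorque * speedFactor;
            rightWheel.motorTorque = (forward - turn) * maxTorque * speedFactor;

            // Brake gently when the stick is centred; the engine's suspension and
            // friction model provides the momentum, the drift and the small rock
            // of the chair as it stops.
            float brake = (Mathf.Approximately(forward, 0f) && Mathf.Approximately(turn, 0f)) ? maxTorque : 0f;
            leftWheel.brakeTorque = brake;
            rightWheel.brakeTorque = brake;
        }
    }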

Instantly our rebuilt chair ‘felt’ more like the real thing. Its acceleration and deceleration created a more natural sense of momentum, and the subtle rock of the chair as it comes to a stop enhances the feeling of braking.

Importantly, it caused a huge reduction in reports of motion sickness. In trials of 10 to 15 minutes, fewer than one in six users report any motion sickness at all, and those who do rate it at less than 3/10, where zero is no sickness and 10 is wanting to vomit.

Summary

We have made vast improvements to our wheelchair simulator, particularly in reducing motion sickness. If you find yourself struggling with motion sickness in your game or simulation then try these 3 steps:

  1. Get the frame rate up to 75 fps consistently across the game;
  2. Adjust your models to remove flicker from coincident planes; in fact, eliminate all flicker; and
  3. Use a good physics simulation for moving the Oculus camera.

Keep iterating until you’ve nailed it. We have seen now that there are roller coaster simulations that do not cause motion sickness in VR. This is an encouraging fact.

In the meantime, if you're getting reports of motion sickness, keep some mints beside your VR rig. Offer one to anyone who reports motion sickness. They should feel better in minutes.

A PDF file of this article can be found here.

Image from “The Matrix”, Copyright © 1999, Warner Bros.

Brook Waters, CEO MTech Games


Week 12

Posted by Brook on Nov 10, 2015 in Blog Post

Monday

Here it is, the final week of Lightning Lab. It has arrived too quickly and we are just not ready.

We were here yesterday, getting pitch feedback and changing our opening slides. Geoff and Charlie were concerned that we weren't making enough of the fact that our opportunity has social impact. Others were concerned we weren't clear enough about the business opportunity. We found that it is really difficult to find good stock photos that exactly meet your needs. In future I will source photos early, taking our own if necessary. We spent the best part of 3 hours trying to find what we wanted.

Today we practised, pitched and modified the deck two more times. We then submitted it, finally. Now, since it won't change again, I can start memorising it for real. The brochure is underway but not quite ready for printing.

Tuesday

Side 1 of the MTech drop sheet

David, one of the lab techs, has finished the brochure. It looks great. I've read it through a couple of times and it seems error free. It's on its way to the printers.

We have two pitch practices today. I’m still trying to memorise it. The deck is finalised now but the wording seems lumpy. I’ll refine it before we take it on the road.

Pitch one went well. Geoff and Charlie are reasonably happy. It was on time and pretty clear. I’ve found a balance between commercial and social impact. I read it. I am still trying to memorise.

Pacing backward and forward, talking to myself like a madman, has paid off. I did the second pitch practice from memory. It ran over by a minute but I remembered it.

Tomorrow is dress rehearsal. Time is running out. Margaret and Vinni have compiled a Question and Answer list so we all know how to answer the critical questions when we’re on the stands on Demo Day. Margaret has designed and printed sign up sheets.

The t-shirts have arrived. I'm actually not sure about mine. The design is good but I should have hit the gym in preparation.

It’s starting to come together.

Wednesday (D-Day minus 1)

Selfie at the Jack Man Auditorium

It's dress rehearsal day. We're heading over to the Jack Man Auditorium at the UC Dovedale campus. We're not taking banners or machines today; we'll set those up tomorrow morning. We do stop off to grab our brochures/drop sheets. I hadn't really appreciated that they were done on card, so they're not for folding up and putting in your pocket.

We did a decent dress rehearsal. Charlie has the technicals working backstage. The screen is huge, so I see why you can’t really interact with the presentation.

Getting mic’ed up was straightforward and we seem to be pretty confident. It seems one of the teams has withdrawn from presenting.

Thursday (D-Day)

D-Day is here. We start at the local supermarket, grabbing clipboards, pens, chocolates and bowls for our table. We then head to the lab to pack up the computer and Oculus Rift. We also print out some sign-up sheets for anyone interested in our product.

Setting up went smoothly and a quick test of the product shows that everything is plugged in and functioning correctly.

Ready or not, we're here and we're as prepared as we can be. Even so, I grab the stage for a few minutes to do one more run-through of my slides. The strange thing about this event is that everyone presenting is going to miss it. We'll spend most of our time in the green room.

We had Subway for lunch. Everyone seems pretty relaxed. I have a chat with Dave Moskovitz about other opportunities to present our start-ups. New Zealand is developing an active angel investment scene. Canterbury now has its own group of angels; hopefully many of them will be here today.

In the green room we’re all starting to show nerves. The first 5 of us have been mic’ed up. There is a live stream happening but I don’t want to watch it until I have done my presentation.


Me presenting. Photo courtesy of Erica Austin from Ministry of Awesome.

I waited backstage as Alex from Pocket Physio presented. Jo Nunnerley introduced me. She sounded as nervous as I was; I learned later she had been in a car accident on the way to the venue. The time on stage seemed to be over quickly. I stumbled a couple of times but overall I think I pulled it off.

As the last two talks were completing we were allowed into the investor room to start manning our tables. We had a great deal of interest and we did quite a few demos.

It's all over. If you want to know how it went, I can't tell you. I hardly saw any of it, like most participants. I will watch the saved livestream later.

We’ll be back in the lab tomorrow, following up with investors and working out what to do next. Tonight though, we’re off to Bentley’s.

Friday

Well, my feet are sore from pacing yesterday but we're back at work. We've been calling interested investors and getting our investor deck ready to send them. It's hard to believe that Lightning Lab itself is all over. It's been 12 weeks of high-intensity business modelling and development. There won't be a Good, Bad and Ugly session today. Some of the lab techs are saying their goodbyes: Kate Blincoe, who did so much research for us; and Regina Speers, who helped design our logo.

We would like to thank all of them, and the Lightning Lab executive team: Geoff Brash, Charlie Thomlinson and Michelle Panzer and the mentors. Thanks also to the other teams.

We'll keep on here, at least till the end of November. I will still blog, though more sporadically perhaps. I would have liked to finish this blog with the story of us being fully funded and operating, but although we have a lot of interest, we're not quite there yet. The really interesting part of this story will be what we do next.


The Lightning Lab Team, Christchurch 2015


Week 11

Posted by Brook on Oct 27, 2015 in Blog Post


Team MTech

Monday

It's supposed to be a public holiday today, but we're in the office. We have appointments to get some team photos taken and, of course, there is the ongoing work on the pitch, drop sheet, brochure and our due diligence folder. These activities have taken over our lives. At least we're not here alone; you wouldn't know it was a holiday.

Tuesday


The imagery on the projector is calming

Today we got the photos from Erica, the Ministry of Awesome’s photo ninja. The new headshots look much better than the ones we previously had in our documents.

We have to choose the music for our intro on pitch day. Each presentation begins with a 15-second musical intro. I guess we should have decided on our song earlier, but we ended up racking our brains on the way into work this morning. We're trying to avoid ‘Chariots of Fire’ or ‘Eye of the Tiger’ clichés. Eventually Margaret came up with “Ordinary World” by Duran Duran, just the chorus. It seems pretty damn good considering the small amount of effort we put into the decision.

We learned that one of our mentors, Trevor, will be down on Thursday. It will be handy to get some feedback on our drop sheet and pitch.

Wednesday

Again, a day busy with pitching. I have come up with the idea of recording the pitch and listening to it on my iPhone. Unfortunately, as soon as anyone else hears the pitch it changes, so I am not getting any traction on memorising it. Reading it detracts from expression and emphasis, so memorising it is all-important.

Dave Moskovitz is back in the lab today. We pitched to him and he was very encouraging about how far we had come in Lightning Lab. I’m getting a little nervous as we have no commitments for investment yet. I have learned that the best groups from the Auckland Lightning Lab had at least soft commitments before demo day. Dave’s feedback was a bit of a lift. We still have time.

Thursday

As well as the pitch, we have to prepare some media material. We need a 200-word summary and a 50-word summary of the business for various uses. By now we have enough materials that this is a quick cut-and-paste job. We pitched to Trevor and got his feedback on the deck. He suggested a few changes and we're going to make them. He has seen many pitch decks before, so it wouldn't be wise to ignore the advice.

One of the lessons we have learned in Lightning Lab is that you can't please everyone. While it's a great idea to take advice from those more experienced, you also have to realise that it's your business, and nobody knows it as well as you do. Trevor's advice was right on the money though. It even ironed out a few things that I was already uncomfortable with.

After the work day finishes, Margaret, Vinni and I have dinner with Trevor and Charlie from Lightning Lab at Orlean's in Strange's Lane. After that, it's off to rehearsal for Margaret's band. It's a long day.

Friday

It’s an early morning. We’re pitching to Canterbury Angels at 7:30. We arrive at 7am and the four pitching teams, MTech, Bamtino, Debtor Daddy and Mighty Gem are shuffled into a side room until the angels all arrive. We’re called in to pitch. I still haven’t memorised the pitch so I read it. It went very well until they asked me the market size. I was trying to point out that NZ was too small as a market, but managed to give the impression that the market was only 100 people a year. God help me, I don’t want to do that at demo day!

Margaret has a hair appointment today. I already had a trim earlier in the week so we’re looking our special best for demo day 🙂

More work on the pitch; the t-shirts are off being printed. We need t-shirts with logos to distinguish the teams. Vinni is writing the business plan, Dave has been giving us advice on the strategy, and the time flies by until Good, Bad and Ugly time.

One of the funny things I have noticed: people really resent the bell that signals the start of the Good, Bad and Ugly session and other group presentations. I think it's because everyone is busy, and the bell, and whatever it is signalling, are a distraction, an unwanted intrusion on the work.

Good: We got our first investment commitment this week, hard rather than soft.

Bad: Vinni had an accident with a tree while she was bike riding.

Ugly: Giving the impression to the local angels that the market size for MTech was 100 🙁


Week 10

Posted by Brook on Oct 23, 2015 in Blog Post

Monday

It's been a tense day. We've had some arguments and feelings are running high. Our Trello board is crammed with tasks and there is a feeling of being swamped. I think we may have to create a deferred list and move the least urgent tasks there. Trello is great, but the sheer number of tasks is overwhelming.

The pitch has been worked on by Margaret and Vinni all day but it is still incomplete. It rolls off my tongue like a brick, and I stumble through it still using my prompt notes.

We were meant to do a demo today, but somehow the visitor got sidetracked.

I booked tickets to go up to Auckland on Wednesday. The wheelchair trainer is a finalist in the NZ Innovators Awards. It's not great timing, as I will be missing half a day here.
