Boris Smus

interaction engineering

Sensor fusion and motion prediction

A major technical challenge for VR is to make head tracking as good as possible. The metric that matters is motion-to-photon latency: for mobile VR, the time it takes for a user's head rotation to be fully reflected in the rendered content.

Motion-to-photon pipeline

The simplest way to get up and running with head tracking on the web today is to use deviceorientation events, which are well supported across most browsers. However, this approach suffers from several drawbacks, which can be remedied by implementing our own sensor fusion. We can do even better by predicting head orientation from the gyroscope.
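
For reference, here is roughly what the deviceorientation approach looks like in the browser. This is a simplified sketch rather than the polyfill's actual code, and updateHeadOrientation is a hypothetical placeholder for whatever drives the camera:

```typescript
// Minimal sketch: read the device orientation angles the browser provides.
// A real head tracker converts these Euler angles into a quaternion in the
// correct coordinate frame before they can drive a VR camera.
window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
  const alpha = event.alpha ?? 0; // rotation about the device z axis, in degrees
  const beta = event.beta ?? 0;   // rotation about the device x axis, in degrees
  const gamma = event.gamma ?? 0; // rotation about the device y axis, in degrees
  updateHeadOrientation(alpha, beta, gamma);
});

// Hypothetical consumer of the orientation, e.g. a camera update.
function updateHeadOrientation(alpha: number, beta: number, gamma: number): void {
  console.log(`alpha=${alpha.toFixed(1)} beta=${beta.toFixed(1)} gamma=${gamma.toFixed(1)}`);
}
```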

I'll dig into these techniques and their open web implementations. Everything discussed in this post is implemented and available open source as part of the WebVR Polyfill project. If you want to skip ahead, check out the latest head tracker in action, and play around with this motion sensor visualizer.

Continued →

Hot bread: delicious or deadly?

Despite free access to information via the Internet and an increasingly global world, people still seem to have all sorts of divergent ideas about how the world works. For example, did you know that eating hot bread and pastries is incredibly unhealthy? Indeed, it can often even lead to complete bowel obstruction! I learned this fact as a kid, while growing up in the Soviet Union. Understandably, I have been very careful to avoid eating hot baked goods. That is, until recently, when my American girlfriend questioned the validity of my belief and I began to harbor some doubts. I decided to check if it was actually true, and asked Google. The results were very clear: I had fallen prey to an old wives' tale. My worldview, shattered.

Incredulous, I searched for the same thing in Russian and arrived at the opposite conclusion. "What's up with that?" I thought, and wrote this post.

Continued →

UbiComp and ISWC 2015

I recently returned from ISWC 2015, where I presented the Cardboard Magnet paper. In addition to seeing old friends, meeting new ones, and being inspired by some interesting research, it was an excellent excuse to visit Osaka, Japan! This year, ISWC was co-located with UbiComp, and the combined conference had four tracks. This post is by no means exhaustive, just some of the more interesting work I got a chance to see.

Continued →

Magnetic Input for Mobile VR

It's easy to do, just follow these steps:

  1. Cut two holes in a box
  2. Put your phone in that box
  3. Look inside the box

And that's the way you do it.

Your smartphone is now in a box, so how do you do input? Now that we have a paper accepted to ISWC 2015, I can tell you!

Continued →

Site redesign, version five

It's been over three years since this site's design was last updated. Time to change that!

This is the fifth revision of this site's design. Looking over previous designs, I've been happier with minimal designs, especially this one from 2012. I was inspired by many excellent designs such as Butterick's Practical Typography, Teehan+Lax, Erik Johansson, Medium and Frank Chimero.

The new design is visually cleaner. I use flexbox in many places, which makes the CSS far more intuitive. The responsive parts are very simple, consisting of just ten CSS declarations.

Continued →

Spatial audio and web VR

Last summer I visited Austria, the capital of classical music. I had the pleasure of hearing the Vespers of 1610 in the great Salzburger Dom (photosphere). The most memorable part of the piece was that the soloists moved between movements, so their voices and instruments emanated from surprising parts of the great hall. Inspired, I returned to the west coast and eventually came around to building spatial audio prototypes like this one:

Screenshot of a demo

Spatial audio is an important part of any good VR experience, since the more senses we simulate, the more compelling it feels to our sense-fusing mind. WebVR, WebGL, and WebAudio act as complementary specs that together make this possible on the web. As you would expect, because it uses the WebVR boilerplate, this demo can be viewed on mobile, on desktop, in Cardboard, or in an Oculus Rift. In all cases, you will need headphones :)
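
To give a flavour of the Web Audio side, here is a simplified sketch (not the demo's actual code) of positioning a sound source relative to the listener with a PannerNode; the asset URL is a stand-in:

```typescript
// Minimal sketch: place a sound source in 3D space with the Web Audio API.
const audioCtx = new AudioContext();
const panner = new PannerNode(audioCtx, {
  panningModel: 'HRTF',     // head-related transfer function, best over headphones
  distanceModel: 'inverse',
  positionX: 2,             // two meters to the listener's right
  positionY: 0,
  positionZ: -1,            // slightly in front of the listener
});

// Fetch, decode, and play a sound through the panner.
async function playSpatialized(url: string): Promise<void> {
  const response = await fetch(url);
  const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
  const source = new AudioBufferSourceNode(audioCtx, { buffer });
  source.connect(panner).connect(audioCtx.destination);
  source.start();
}

// Browsers require a user gesture before audio can start.
document.addEventListener('click', () => {
  audioCtx.resume();
  playSpatialized('footsteps.mp3'); // hypothetical asset
});
```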

Continued →

Responsive WebVR, headset optional

VR on the web threatens to cleave the web platform in twain, like mobile did before it. The solution then and the solution now is Responsive Web Design, which allows websites to scale well across all form factors. Similarly, for VR to succeed on the web, we need to figure out how to make VR experiences that work in any VR headset, and also without a headset at all.

Various head mounted displays.

WebVR boilerplate is a new starting point for building responsive web VR experiences that work on popular VR headsets and degrade gracefully on other platforms. Check out a couple of demos: a simple one and one ported from MozVR.
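
The boilerplate predates WebXR, but the core idea, feature-detect a headset and degrade gracefully otherwise, can be sketched with today's API (assuming WebXR type definitions are available; the mode names here are purely illustrative):

```typescript
// Minimal sketch: pick a rendering mode based on what the device supports.
async function chooseRenderMode(): Promise<'headset' | 'magic-window' | 'desktop'> {
  if (navigator.xr && await navigator.xr.isSessionSupported('immersive-vr')) {
    return 'headset';      // real headset available: render a stereo view
  }
  if ('ontouchstart' in window) {
    return 'magic-window'; // phone without a headset: orientation-driven monoscopic view
  }
  return 'desktop';        // fall back to mouse/keyboard camera controls
}

chooseRenderMode().then((mode) => console.log(`Rendering mode: ${mode}`));
```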

Continued →

Web Sensor API: raw and uncut

Sensors found in smartphones define the mobile experience. GPS and the magnetometer enable the fluid experience of maps; motion sensing enables activity recognition and games; and the camera and microphone allow whole categories of rich media applications. Beyond these now obvious examples, sensors can also enable clever inventions, such as Cycloramic, which used the vibration motor in the iPhone 4 and 5 to rotate the phone and take a panorama; pushup counters, which use the proximity sensor to count repetitions; and Send Me To Heaven, which uses the accelerometer to determine the flight time of a phone thrown vertically as high as possible. I've had some experience using and abusing sensors too, most recently for the Cardboard magnet button.

However, over the last couple of years, I've had to step away from the web as a development platform, in part because of the poor state of sensor APIs. In this post, I will describe some of the problems, take a look at sensor APIs on iOS and Android, and suggest a solution in the spirit of the extensible web manifesto.
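
For context, this is roughly the raw motion data a page can get from the web platform today via devicemotion; a simplified sketch, with the shortcomings only hinted at in the comments:

```typescript
// Minimal sketch: the raw(ish) motion data the web platform exposes today.
window.addEventListener('devicemotion', (event: DeviceMotionEvent) => {
  const accel = event.accelerationIncludingGravity; // accelerometer, m/s^2
  const gyro = event.rotationRate;                  // gyroscope, deg/s
  if (accel && gyro) {
    console.log(
      `accel: ${accel.x?.toFixed(2)}, ${accel.y?.toFixed(2)}, ${accel.z?.toFixed(2)}`,
      `gyro: ${gyro.alpha?.toFixed(2)}, ${gyro.beta?.toFixed(2)}, ${gyro.gamma?.toFixed(2)}`
    );
  }
  // Note: there is no way for the page to request a particular sampling rate here.
});
```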

Continued →

UIST 2014 highlights

This year's UIST was held in Waikiki, Honolulu, the undisputed tourist capital of Hawaii. I've stuck to my now three-year-old habit of taking notes and posting my favorite work. Since last year, the conference has grown an extra track. The split was generally OK for me, with my track mostly dedicated to user interface innovation (sensors, etc.) and another more concerned with crowdsourcing, visualization, and more traditional UIs.

My overall feeling was that the research was mostly interesting from a tech perspective, but focused on solving the wrong problem. For example, at least 5 papers/posters/demos were focused on typing on smartwatches. The keynotes were very thought-provoking, especially when juxtaposed with one another.

Continued →

Spectrogram and oscillator

A live-input spectrogram built with Polymer and the Web Audio API.

Screenshot of spectrogram

If you're running Chrome or Firefox, see it in action. Once the spectrogram is running, see if you can make a pattern with your speech or by whistling. You can also click anywhere on the page to turn on the oscillator. For a mind-blowing effect, load this in a parallel tab.
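
Under the hood, such a spectrogram boils down to the Web Audio AnalyserNode. A stripped-down sketch, not the Polymer element itself and with the canvas drawing left out, looks roughly like this:

```typescript
// Minimal sketch: pull live-input frequency data suitable for drawing a spectrogram.
const ctx = new AudioContext();
const analyser = new AnalyserNode(ctx, { fftSize: 2048 });
const bins = new Uint8Array(analyser.frequencyBinCount);

// Route the microphone into the analyser.
navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  ctx.createMediaStreamSource(stream).connect(analyser);
  draw();
});

function draw(): void {
  analyser.getByteFrequencyData(bins); // one column of the spectrogram, 0-255 per bin
  // ...paint `bins` as a vertical strip on a <canvas> here...
  requestAnimationFrame(draw);
}

// The click-to-play oscillator is just an OscillatorNode wired to the output.
document.addEventListener('click', () => {
  ctx.resume(); // browsers require a user gesture before audio can start
  const osc = new OscillatorNode(ctx, { frequency: 440 }); // A4; the demo's pitch could instead track the pointer
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.5); // stop after half a second
});
```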

Continued →