Sprint 1 - Discovery

I’ve just finished the first two-week discovery “sprint” as part of my XR Research Project. I’ll be sharing my insights from each sprint on the blog as it’s completed.

Once I started investigating XR properly I realised it’s an absolutely vast field, even within a single discipline. I’d provisionally secured support from a number of mentors as part of my application, so I was very glad to lean on their expertise to help distill some of the information out there during my initial sprint.

Hardware & learning

My first mentor meeting was with David Johnston from BBC R&D, a rare and valuable mix of technologist and producer. We had a lively discussion and I instantly felt more confident with his support.

After we met, David sent over a load of resource links covering hardware and software. He recommended the online learning from both Unity and Oculus which seem great and also included a very interesting creative coding tutorial in Unity which aligned with my experience in Processing and Open Frameworks.

David also helped me zone in on purchasing an Oculus Quest, which seemed to offer the right balance of ease of use and flexibility for my project. I’m not a hardcore gamer and my aging Macbook Pro is not compatible with many of the tethered headsets on the market, so the all-in-one device appealed to me, even at the expense of some rendering power. I was also tempted by some Bose Frames, especially as some BBC R&D friends were working with them for audio AR prototypes, but the Quest offers more bang for my limited buck.

The Quest runs on Android, so as long as my machine can manage Unity then I’m good. I did briefly consider other game engines but already had a bias of interest toward Unity before this started, and couldn’t find a compelling reason not to follow that.

I enjoyed exploring the Quest - the ease of use for creating a guardian zone (a safe play space for standing 6DOF VR experiences) was impressive. I don’t have much to compare it to but I’m very happy with the product overall.

Perception Hacking

My second mentor meeting was with Dr. John Greenwood, a researcher in Experimental Psychology at UCL. John’s research is fascinating, focusing on visual perception and its disorders via psychophysics - “the scientific study of the relation between stimulus and sensation”. There’s a great overview of his projects over at Eccentric Vision.

He started by demonstrating the power and trickery of peripheral vision and the problems with visual crowding before going on to explain the relationship between conditions such as Amblyopia (often called lazy eye) and diseases such as dementia and Alzheimer’s. A big takeaway for me was to understand failing peripheral vision as a possible predictor for dementia.

We talked about various perception hacks/flaws, including the Wagon Wheel Effect, and how the visual system can override the auditory one, as demonstrated by the McGurk Effect. I had a written note about a “Bunny Hop Illusion”, which could refer to one of two things: a tactile illusion called the Cutaneous Rabbit Illusion, or The Rabbit Illusion from Caltech.

John followed up with some slides on Spatial Vision: “the perception of the distribution of light across the visual field”, described as “the building blocks of object perception in the early stages of visual processing”.

Whilst studying the slides I also learned that Fourier transforms can be used to analyse images. I understood that sound can be created or distilled from simple sine waves, but didn’t know that this generalises to any signal. Whilst looking for a good link to explain the theory I found An Interactive Introduction to Fourier Transforms by Jez Swanson - one of the best things I’ve seen on the web for ages! It breaks down the concepts in a really fun way, making them much simpler to understand. The slides referenced Basic Vision: An Introduction to Visual Perception, which I ordered instantly.
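As a toy illustration of the idea (my own sketch, not from John’s slides): here’s a minimal numpy example that builds a signal from two known sine waves and then recovers their frequencies with a discrete Fourier transform. Images work the same way, just in two dimensions.

```python
import numpy as np

# Build a 1-second signal from two sine waves: 5 Hz at full
# amplitude and 12 Hz at half amplitude (arbitrary choices).
sample_rate = 100  # samples per second
t = np.arange(0, 1, 1 / sample_rate)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# The FFT distills the signal back into its frequency components.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# The two strongest components sit exactly at 5 Hz and 12 Hz.
peaks = sorted(freqs[np.argsort(np.abs(spectrum))[-2:]])
print(peaks)  # [5.0, 12.0]
```

The same trick with `np.fft.fft2` on a 2D pixel grid is what lets you talk about the “spatial frequencies” of an image - fine detail lives in the high frequencies, broad shapes in the low ones.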


If you’re wondering what all this has to do with my XR research, take a look at the alpha mind map I created before I met John. Gestalt Psychology has been of interest for a while, so I’m keen to explore other areas of perception and the ways the brain tricks us into understanding the world.

Online learning

I worked through part of the Introduction to XR: VR, AR, and MR Foundations course on Coursera. The videos and general structure were very welcome, but I’m not so keen on the peer assessments and a bit suspicious of the “brainstorming XR apps” module. The biggest criticism by far, though, is the lack of engagement on the forums. They are pretty much a ghost town, with zero input from the tutor, so I struggle to see the value. It might make more sense to follow the free Unity learning resources and take your chances in their forums.

Blogging software

Last but not least, I also spent some time extending this very website to give it blogging capabilities. It’s a custom site built with Elixir/Phoenix. I considered adding a static blog generator to the domain instead, but I want to limit the amount of software I’m maintaining, so I was happy to take on the development overhead to stay streamlined. It also encouraged some general maintenance of the site, which was well overdue.

section: Blog

category: XR Research

filed as: xr, vr, ar, oculus quest, bose frames, dycp, psychophysics, spatial vision, visual perception

published on 17 Aug 2019

This project is gratefully supported by Arts Council England as part of their DYCP fund.