Helen and Barney join Brendan at Alton Towers to see just how good Smiler is at putting a smile on their faces.
Thrill Laboratory uses some cutting-edge, real-time facial muscle recognition software developed by its latest computer science conscript, Dr Michel Valstar. The software is based on Paul Ekman's Facial Action Coding System. There are 43 muscles in the human face. Specific combinations of these, at various degrees of activation, let humans (in this case, roller coaster passengers) communicate their emotional state. When a rider is in a state of flow it's hard for them to control their facial expressions – the camera doesn't lie!
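To give a flavour of the idea, here is a minimal, purely illustrative sketch of how detected Action Units (AUs) from the Facial Action Coding System might be mapped to an emotion label. The AU prototype sets are simplified versions of commonly cited Ekman pairings (e.g. AU6 + AU12 for a genuine smile); the actual software's classifier is, of course, far more sophisticated than this toy overlap score.

```python
# Illustrative sketch only: map detected FACS Action Units (AUs) to a
# best-guess emotion by overlap with simplified prototype AU sets.
# These prototypes are commonly cited simplifications, not the real system.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid/lip tighteners
}

def classify(detected_aus: set) -> str:
    """Return the emotion whose AU prototype best matches the detected AUs."""
    def score(emotion: str) -> float:
        proto = EMOTION_PROTOTYPES[emotion]
        return len(proto & detected_aus) / len(proto)
    best = max(EMOTION_PROTOTYPES, key=score)
    return best if score(best) > 0 else "neutral"

print(classify({6, 12}))  # a smile (AU6 + AU12) -> happiness
```

A real pipeline would also weigh each AU's intensity rather than treating activation as binary, which is why degrees of activation matter in the paragraph above.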
This particular experiment was technically that little bit more complicated (Thrill Laboratory doesn't do 'easy'). Vibration, G-forces and changing light conditions – from sun in the lens to complete pitch black – made the continuous, accurate capture of facial expressions all the more challenging. It required some open-heart GoPro surgery to change lenses and light filters, and the design of some selfie night-vision lighting. With a stable video feed, Dr Valstar was able to monitor those 43 different muscles on Helen and Andy's faces, and we were able to use that data to determine their different emotional states in real time. We then worked with the BBC to integrate our graphics and video into their post-production editing. Et voilà!
This experiment reminded me of a mechatronics workshop I ran for Design Products and Interaction Design students at the Royal College of Art in 2007. They created a prototype machine that forced the wearer to smile when in close proximity to other humans. Or maybe they made it just for me…