I had a great time at the 17th International Symposium on Wearable Computers (ISWC), held this year at ETH Zurich, Switzerland, alongside UbiComp. This year there was a record number of submissions for all calls: papers, posters, the Gadget Show and the Design Exhibition. The full programme and abstracts can be found here.
Me with my EEG Visualising Pendant
This year I submitted my EEG Visualising Pendant for selection in the Design Exhibition. The pendant uses EEG (electroencephalography) signals from a NeuroSky MindWave Mobile, a standalone headset that detects electrical signals from the brain via a single electrode on an arm protruding from the headband. On an LED matrix, the pendant displays attention / concentration data as red LEDs (light emitting diodes) alongside meditation / relaxation data as green LEDs. The pendant has live, record and playback functions: the wearer can display live EEG visualisations, or record up to four minutes of brainwave data visualisations and play them back on a loop, handy if they’re feeling mischievous, want to appear to be concentrating or relaxed, or just want to use the pendant as an aesthetic piece of jewellery without the EEG headset. More information on the EEG Visualising Pendant can be found here.
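The pendant’s display idea can be sketched in a few lines. This is my illustration only, not the pendant’s actual firmware: I’m assuming the headset delivers attention and meditation values on a 0–100 scale (as the MindWave’s eSense metrics do), an 8-row LED matrix, and a simple loop buffer for the four-minute record/playback mode; the names `leds_lit` and `Recorder` are mine.

```python
# Minimal sketch of the pendant's logic: map 0-100 attention (red) and
# meditation (green) values to columns of lit LEDs, with a fixed-length
# loop buffer standing in for the record/playback function.
from collections import deque

MATRIX_HEIGHT = 8  # assumed 8-row LED matrix


def leds_lit(value, height=MATRIX_HEIGHT):
    """Map a 0-100 eSense-style value to a number of lit LEDs in one column."""
    value = max(0, min(100, value))  # clamp out-of-range readings
    return round(value * height / 100)


class Recorder:
    """Fixed-length loop buffer for the record/playback mode."""

    def __init__(self, seconds=240, rate_hz=1):  # ~4 minutes at 1 Hz
        self.buffer = deque(maxlen=seconds * rate_hz)

    def record(self, attention, meditation):
        self.buffer.append((attention, meditation))

    def playback(self):
        """Yield recorded frames; a real device would loop this forever."""
        yield from self.buffer
```

The `deque` with `maxlen` means the recording simply overwrites its oldest frames once the four minutes are full, which matches the looping behaviour described above.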
During the Design Exhibition, I was interviewed by BBC Technology News, the coverage can be found here. I was also filmed by Swiss TV.
Here’s my short video tour around the Design Exhibition
Including my work, there were fourteen exhibits in the Design Exhibition; here’s a brief listing of them:
Fiber Optic Corset Dress (above), by Rachael Reichert, James Knight, Lisa Ciafaldi and Keith Connelly of Cornell University, USA, which glowed wonderfully in the darkened exhibition space. The dress also features in Rachael’s short film CyBelle Horizon.
Lüme (above), by Elizabeth E. Bigger and Luis E. Fraguada (Jorge & Esther), built by Associative Data, is a series of garments incorporating embedded electronics which illuminate based on the wearer’s selection of colour and other choices, controlled from a smartphone. The garments shone and changed colour beautifully. Lüme won the Design Exhibition prize in the aesthetic garment category.
E-Shoe: A High Heeled Shoe Guitar, by Alex Murray-Leslie, Melissa Logan and Max Kibardin of the University of Technology, Sydney, Australia, is an intriguing and startlingly captivating shoe guitar that was created to explore acoustics in wearable technology and the practicalities of instruments for live multi-modal performances.
Brace Yourself – The World’s Sexiest Knee “Brace” by Crystal Compton and Guido Gioberto of the University of Minnesota, USA, is an interesting and playful look at how a stocking incorporating a bend sensor can be used to track movement in the leg in a new and more aesthetically pleasing way.
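A bend sensor of this kind typically needs calibrating against known leg positions before its readings mean anything. The paper’s own method isn’t described here, so this is just a sketch of the general idea under my own assumptions: a two-point linear calibration between raw sensor readings and knee flexion angle, with the reading values and function names invented for illustration.

```python
# Sketch of a two-point linear calibration for a bend sensor:
# take a raw reading with the leg straight (0 degrees) and one with
# the knee bent to a known angle, then interpolate between them.
def calibrate(reading_straight, reading_bent, angle_bent=90.0):
    """Return a function mapping raw bend-sensor readings to knee angle."""
    span = reading_bent - reading_straight

    def to_angle(reading):
        return (reading - reading_straight) / span * angle_bent

    return to_angle


# Hypothetical calibration readings for illustration only.
to_angle = calibrate(reading_straight=200, reading_bent=600)
```

Real stitched sensors drift and behave non-linearly, so a production garment would need something more robust, but the two-pose calibration above is the simplest starting point.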
Play the Visual Music by Helen Koo of Auburn University, USA, is a garment for musicians and performers which responds to sound and is intended to provide visual, multi-sensory stimulation to the audience.
Garment with Stitched Stretch Sensors that Detects Breathing & AVAnav: Helmet-Mounted Display for Avalanche Rescue
AVAnav: Helmet-Mounted Display for Avalanche Rescue, by Jason O. Germany of the University of Oregon, USA, is a series of prototypes developed to help rescue teams locate buried avalanche victims.
Haptic Mirror Therapy Glove by James Hallam of Georgia Institute of Technology, USA, is a glove that allows the stimulation of a paretic hand’s fingers following a stroke by tapping the fingers of the unaffected hand. James’ glove won the functional category prize in the Design Exhibition.
Garment for rapid prototyping of pose-based applications, by Jacob Dennis, Robert Lewis, Tom Martin, Mark Jones, Kara Baumann, John New and Taylor Pearman of Virginia Tech, USA, is, as the title suggests, a loose-fitting body suit for rapid prototyping of pose-based applications.
Garment with Stitched Stretch Sensors that Detects Breathing, by Mary Ellen Berglund, Guido Gioberto and Crystal Compton of the University of Minnesota, USA, is intended to be “a comfortable, everyday athletic garment incorporating a breathing sensor to monitor the activities of crewmembers on NASA missions”.
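A stretch sensor across the chest rises and falls with each breath, so the simplest way to get a breathing count out of it is to count rising crossings of a threshold, one per inhalation. The garment’s actual signal processing isn’t described here; this is a toy sketch under that assumption, with clean synthetic data and an invented function name.

```python
# Toy breath counter for a chest stretch-sensor signal: count each
# rising crossing of a threshold as one inhalation. A real garment
# would smooth/filter the signal first; this assumes clean data.
def count_breaths(samples, threshold):
    """Count rising threshold crossings in a sequence of sensor samples."""
    breaths = 0
    above = samples[0] > threshold
    for s in samples[1:]:
        if s > threshold and not above:
            breaths += 1
        above = s > threshold
    return breaths
```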
A Wearable Sensing Garment to Detect and Prevent Suit Injuries for Astronauts, by Crystal Compton, Reagan Rockers and Thanh Nguyen of the University of Minnesota, USA, was developed using pressure sensors to help detect and resolve areas of injury in spacesuits.
Garment Body Position Monitoring and Gesture Recognition by Sahithya Baskaran, Norma Easter, Cameron Hord, Emily Keen and Mauricio Uruena of Georgia Institute of Technology, USA, was designed to recognise arm movements that might lead to repetitive strain injuries and capture data on reaction time.
The Photonic Bike Clothing IV for Cute Cyclist, by Jiyoung Kim and Sunhee Lee of Dong-A University, South Korea, uses solar panels to power heat pads to aid the comfort of the rider.
Strokes & Dots by Valérie Lamontagne is a collection of garments which are part of a research project looking at fostering advancement of creative innovation and aesthetics in wearable technology.
During the ISWC main conference there were so many interesting papers presented; my favourites included:
Blitz the dog preparing for the FIDO presentation!
FIDO – Facilitating Interactions for Dogs with Occupations: Wearable Dog-Activated Interfaces by Melody Jackson, Thad Starner and Clint Zeagler of Georgia Institute of Technology, USA. This research looks at how assistance dogs can communicate more directly with their human companions using a wearable system of sensors embedded in a dog jacket, activated by pulling, biting and nose touching. Examples shown included human companions who needed precise alerts, such as a dog who could distinguish between a doorbell and a tornado alert and raise the appropriate alarm, and other canine companions who could get help from others in the case of a medical emergency. What fascinated me about this research is how intelligent and individual it showed the dogs to be; for example, in the Q&A it emerged that some dogs can remember over 1,000 commands or words, and respond differently depending on breed and temperament. Another point that came out of the Q&A was how, with the dogs’ help, this technology could be really valuable to people with severe disabilities such as ‘locked-in’ syndrome.
Halley Profita and Lucy Dunne during the Q&A
Don’t Mind Me Touching My Wrist: A Case Study of Interacting with On-Body Technology in Public by Halley Profita, James Clawson, Scott Gilliland, Clint Zeagler, Thad Starner, Jim Budd and Ellen Yi-Luen Do of the University of Colorado at Boulder, USA. This piqued my interest as it examined the social acceptability of wearables: how people felt about an e-textile ‘jogwheel’ (a circular controller) being placed on specific parts of the body, their attitudes to where it was placed, and why. The insights were both fascinating and amusing. The study used both male and female testers, with a lift as the public setting, and was run in both the US and Korea to find out how differing cultural attitudes affected the results. Korea was an interesting choice because, unlike in the US, couples do not hold hands or show affection in public, and interacting with a wearable on the body did highlight different cultural attitudes to the body and personal space. The paper discusses a whole load of insights from the research but, to be brief, the study showed the torso to be the most awkward place to wear the e-textile jogwheel and the wrist and forearm the least awkward. A majority of wearers found the e-textile jogwheel a potentially ‘useful’ device.
Sensor-Embedded Teeth for Oral Activity Recognition by Cheng-Yuan Li, Yen-Chang Chen, Wei-Ju Chen, Polly Huang and Hao-hua Chu of National Taiwan University, Taipei, Taiwan. This presentation discussed how a tri-axial accelerometer system placed in the mouth could recognise oral activities such as talking, chewing, drinking and laughing. The results showed “93.8% oral activity recognition accuracy when using a person-dependent classifier and 59.8% accuracy when using a person-independent classifier.” They discussed uses for this such as dietary tracking. I found this research quite intriguing, as I’m always looking for new and interesting ways to self-quantify, and will look out for news of their future work in this area.
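To make the person-dependent vs person-independent distinction concrete: the gap comes from whether the classifier is trained on the same person whose activities it later labels, or on other people. The paper’s actual classifier isn’t described here, so this is only a toy nearest-centroid sketch on simple accelerometer features, with all names and data invented for illustration.

```python
# Toy activity classifier: summarise an accelerometer window by its mean
# and standard deviation, average those features per activity label
# during training (centroids), then label new windows by the nearest
# centroid. Not the paper's method; an illustration of the idea only.
import math


def features(window):
    """Mean and standard deviation of one axis of accelerometer samples."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return (mean, math.sqrt(var))


def train(labelled_windows):
    """Compute one centroid (average feature vector) per activity label."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}


def classify(window, centroids):
    """Label a window with the activity whose centroid is nearest."""
    f = features(window)
    return min(centroids,
               key=lambda lab: (f[0] - centroids[lab][0]) ** 2 +
                               (f[1] - centroids[lab][1]) ** 2)
```

Training `train()` on windows from the wearer themselves corresponds to the person-dependent case; training it only on other people’s windows corresponds to the person-independent one, which is why the reported accuracy drops so sharply in the latter.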
Thad Starner giving his keynote.
Wearable Computing: Through the Looking Glass by Thad Starner of Georgia Institute of Technology, USA. Although I’ve read so many articles about Google Glass and possibly talked the hind leg off a donkey on the topic of Glass / lifelogging / privacy / surveillance / sousveillance in the last 18 months, I was still really looking forward to hearing Thad, who is also Technical Lead/Manager on Google’s Project Glass, talk about the device and discuss its tech specs. As Thad was previously part of the MIT Media Lab ‘Borg’ collective alongside Steve Mann, I was especially looking forward to hearing him present his thoughts on the history of wearable computing. I really enjoyed his talk and insights, and best of all he brought along a box of some of his old head-mounted display projects, one of which I cheekily tried on, see photo below.
ISWC 2013 was fantastic and I loved Zurich. Next year it moves on to Seattle; as that will (paws crossed) be the last year of my PhD, I hope I’ll have the time (thesis beckons) and money (am running out of cash) to get there! Many thanks to Lucy Dunne and Troy Nachtigall for all their hard work organising the Design Exhibition, and to Kristof Van Laerhoven, the programme committee, volunteers, speakers, exhibitors and attendees who made the conference such an excellent and thought-provoking experience. Not forgetting to say thanks too for all the great vegan food that was organised for me!