Tag Archives: emotive wearables


Firstly, Happy New Year and best wishes for an amazing 2016! I’ve had a super-busy couple of months since my last post in September about my exciting trip to exhibit my ThinkerBelle EEG Amplifying Dress in Japan. I’ve moved house, which has flung me over to the opposite side of London and is going to require tons of work to create a comfy home and a nice studio to work in. I have been plotting new emotive wearables pieces, plus investigating how I can develop this work and how it might evolve in the future wearables arena as a business. I’ve also given a couple of talks on my emotive wearable work at dorkbot London and the BBC, which was fun.

EPSRC UK ICT Pioneers finals, 2015
On stage at the UK ICT Pioneers final.

A fab experience for me was to get through the initial stages and be selected as a finalist for the Engineering and Physical Sciences Research Council’s (EPSRC) UK ICT Pioneers Competition final. As the EPSRC describes it, “UK ICT Pioneers is a unique partnership between EPSRC and key stakeholders, which recognises the most exceptional UK doctoral students in ICT-related topics, who can demonstrate the commercial potential and impact of their research to business”. The competition included judges from EPSRC, Dstl, Hewlett Packard Enterprise, Facebook, BCS (The Chartered Institute for IT), Samsung and BT. At the finals, held at the QEII Centre in Westminster, London, I was really excited to take examples of my doctoral practice and present my PhD research to four sets of judges, plus invited academics and industry representatives. Although I didn’t win (the competition was exceptionally tough!), I had a great day meeting judges and hanging out with the other fabulous finalists, many of whom I’d already met at the media training day in October at EPSRC’s HQ in Swindon. During the finals, I was bowled over to hear more details about their amazing research projects, which I’m sure I’ll hear more about in the future as they evolve and grow.

EPSRC UK ICT Pioneers finals, 2015
My stand at UK ICT Pioneers final.

EPSRC UK ICT Pioneers finals, 2015
All finalists on stage.

ThinkerBelle Fibre Optic EEG Amplifying Dress at ISWC Design Exhibition, Osaka, Japan

I’m just back from an amazing trip to Japan where I exhibited my ThinkerBelle EEG Amplifying Dress at the Design Exhibition of the 19th International Symposium on Wearable Computers (ISWC). This event was part of the 2015 ACM joint international conference of ISWC and Ubicomp, which took place this year at Grand Front Osaka, Japan.

ThinkerBelle EEG Amplifying Dress
ThinkerBelle EEG Amplifying Dress

I exhibited the dress alongside garments, accessories, textiles and devices, in the wearable tech categories of functional, aesthetic and fibre arts.

ThinkerBelle EEG Amplifying Dress
ThinkerBelle EEG Amplifying Dress

ThinkerBelle EEG Amplifying Dress in Tokyo!

Many thanks to this year’s Design Exhibition chairs Margarita Benitez and Halley Profita and jury panel: Maggie Orth, Sonny Vu, Tricia Flanagan and Frances Joseph.

Wear & Tear workshop with Thad Starner  at #ISWC15
Thad Starner’s keynote at Wear and Tear workshop.

At ISWC / Ubicomp I participated in two workshops, firstly Wear and Tear: Constructing Wearable Technology for The Real World. This was organised by colleagues at Georgia Tech Wearable Computing Centre and was a really useful and enjoyable day of reportage on building devices and systems. Thad Starner gave the keynote and was followed by various speakers who discussed what went right and what went wrong during the process of building their devices. Everyone shared useful approaches, tips and tricks to fixing issues and developing hardware and devices. A big thank you to the organisers: Peter Presti, Scott Gilliland, Abdelkareem Bedri, Clint Zeagler and Thad Starner, and the speakers, for a brilliant day.

Andy Quitmeyer's portable soldering shorts at ISWC Wear & Tear workshop
Andy Quitmeyer’s soldering station shorts at Wear and Tear workshop.

The second workshop I participated in was Broadening Participation. The event was created to increase the involvement of women, students from developing countries and underrepresented minorities, including persons with disabilities, in the field of ubiquitous and wearable computing. The day comprised interesting and motivational talks and panels from those already working in the field. There were also two poster sessions where participants discussed their research. I presented a poster on my doctoral research on Responsive and Emotive Wearables. I really enjoyed meeting and sharing my research with participants, as well as hearing about their really interesting research; there were some great crossover projects that I’m going to follow up. Thanks very much to organisers: A. J. Brush, Miwako Doi, Gillian Hayes, Polly Huang, Judy Kay, Hitomi Tsujita, I.E. Yairi, Naomi Yamashita and Helen Ai He, and the speakers, for a great day.

Broadening Participation Workshop

Attendees of the Broadening Participation Workshop.

ThinkerBelle Fibre Optic EEG Amplifying Dress

ThinkerBelle EEG Amplifying Dress

I’m writing up my PhD thesis at the moment and analysing a huge amount of data from over 70 surveys and 8 hours of focus group audio transcripts. Anyway, without giving away too much about the data, as I’m saving it for my thesis, here’s a little preview of my ThinkerBelle EEG Amplifying Dress. I created this dress in response to a subsection of feedback data from my field trials and focus groups, which investigated the functionality, aesthetics and user experience of wearables, in particular wearer and observer feedback on experiences with my EEG Visualising Pendant. The motivation for creating the dress was engagement in social situations where the wearer might find themselves in a noisy or crowded area, where it is not possible to hear others and communicate easily and where forms of non-verbal communication may be useful. The dress broadcasts the meditation and attention data of the wearer for observers to make their own interpretations. It is up to the wearer if they want to divulge information regarding the physiological source of the data being visualised.

A short video of the dress.

A longer video of the dress shot in Tokyo, Japan.

ThinkerBelle fibre optic EEG dress

The dress was constructed from satin fabric and fibre optic filament woven into organza. A NeuroSky MindWave Mobile EEG headset sends signals in the form of two separate streams, ‘attention’ and ‘meditation’, via Bluetooth to the dress, which amplifies and visualises the data through the fibre optic filament. Attention data is shown as red light and meditation data as green light. The dress is constructed so the two streams of data light overlap and interweave. The fibre optic filament is repositionable, allowing the wearer to make their own lighting arrangements and dress design. The red and green light fades in and out as the wearer’s levels of attention and meditation data heighten or decline.
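
If you’re curious how the light mapping works at heart, here’s a tiny C sketch of the general idea, assuming the eSense values arrive as integers from 0 to 100 (the function names are illustrative, not from my actual firmware): the value is scaled to an 8-bit PWM duty cycle, and a simple low-pass step gives the fading effect.

```c
#include <assert.h>
#include <stdint.h>

/* NeuroSky eSense values are integers in the 0-100 range.
   Map a value onto an 8-bit PWM duty cycle for the LEDs feeding the
   fibre optics: attention drives red, meditation drives green. */
uint8_t esense_to_pwm(int esense)
{
    if (esense < 0)   esense = 0;     /* clamp out-of-range readings */
    if (esense > 100) esense = 100;
    return (uint8_t)((esense * 255) / 100);
}

/* Ease the current brightness towards the target so the light fades
   in and out rather than stepping abruptly (a simple low-pass). */
uint8_t fade_towards(uint8_t current, uint8_t target)
{
    if (current < target) return current + (target - current + 3) / 4;
    if (current > target) return current - (current - target + 3) / 4;
    return current;
}
```

Calling `fade_towards` once per display refresh moves the brightness a quarter of the remaining distance each time, which gives a gentle exponential-style fade.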

The dress’ hardware has a choice of modes, so it is possible to record and play back the data. This makes it possible for the wearer to appear to be concentrating or relaxed if they wish to influence a social situation, what I call ’emotive engineering’, or to replay their EEG data to create a certain mix of colour and light on the dress. It is also possible to set the playback mode and take off the EEG headset entirely if the wearer wants to be headset free.
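
For the technically minded, record / playback can be sketched as a small state machine around a sample log. This is a hedged illustration in C, not my actual firmware: the names, the one-sample-per-second assumption and the 256-sample buffer (a little over four minutes at that rate) are all mine for the example.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define LOG_LEN 256  /* ~4 minutes at roughly 1 eSense sample/second */

typedef enum { MODE_LIVE, MODE_RECORD, MODE_PLAYBACK } Mode;

typedef struct {
    Mode    mode;
    uint8_t log[LOG_LEN];  /* recorded samples */
    int     head;          /* next write position while recording */
    int     len;           /* number of valid samples */
    int     play;          /* playback cursor */
} EegLogger;

void logger_init(EegLogger *l)
{
    memset(l, 0, sizeof *l);
    l->mode = MODE_LIVE;
}

/* Feed one incoming sample; return the value to display this tick. */
uint8_t logger_step(EegLogger *l, uint8_t live_sample)
{
    switch (l->mode) {
    case MODE_RECORD:
        l->log[l->head] = live_sample;
        l->head = (l->head + 1) % LOG_LEN;
        if (l->len < LOG_LEN) l->len++;
        return live_sample;                /* show live while recording */
    case MODE_PLAYBACK: {
        if (l->len == 0) return 0;         /* nothing recorded yet */
        uint8_t v = l->log[l->play];
        l->play = (l->play + 1) % l->len;  /* loop the recording */
        return v;                          /* headset can be taken off */
    }
    default:
        return live_sample;                /* MODE_LIVE */
    }
}
```

In playback mode the displayed value comes entirely from the log, which is what lets the headset come off while the dress keeps visualising.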

ThinkerBelle fibre optic EEG dress
Red = attention / green = meditation

As you can see, I’ve included a few initial photos of the dress in action showing the EEG data as it is received from the headset. I have not made a successful video of the dress yet, as it’s difficult to light the dress for photos and filming. I will add a video when I’ve worked around this!

I have also been experimenting with changing the form factor of the headset for aesthetic and comfort, using various materials.

ThinkerBelle fibre optic EEG dress
Feeling relaxed = very green dress!

A bit of extra info, in case you were wondering… During my PhD research, I’ve been investigating the possibility that wearable technology can be used with physiological data to create new forms of non-verbal communication. Since 2008 I’ve been experimenting with wearables, sensors and social situations, which led me to focus on wearables that amplify, visualise and broadcast data from the body. As mentioned in previous blog posts, the field of wearable technology has blossomed and grown rapidly in recent years into a huge and mainly undefined set of devices, platforms, uses and practices. It was therefore necessary for me (a couple of years ago now) to create my own nomenclature to define the area I was creating and researching in. The first subset area was ‘responsive wearables’, which deals with wearables that respond to various physiological, environmental and other user-related data and give an output. This worked for a short while but still wasn’t definitive. I went on to drill down and make a new subset of this area to better define the emerging field I was working in, which I named ‘emotive wearables’. This area focuses on wearable technology that gleans physiological data from the body, then processes and broadcasts it in some way from the wearer. The output could be sound, movement, light, etc.

ThinkerBelle fibre optic EEG dress

My research with sensors, social situations, ambient and physiological data has led me to work with sound signal input (decibels), temperature (Celsius), pressure (Pascals), altitude (metres), ECG (Electrocardiography), GSR (Galvanic Skin Response), EMG (Electromyography) and EEG (Electroencephalography), but my main focus for my PhD has been on the development and research of emotive wearables with EEG data.

AnemoneStarHeart EEG / ECG visualising device at Transmission Symposium

AnemoneStarHeart handheld EEG/ECG Visualising Device

At the end of April I spent a very enjoyable day at Bournemouth University attending Transmission Symposium: Strategies for Brainwave Interpretation in the Arts. There were some very interesting presentations, exchanges of ideas and discussion on the intersection between art, cognition and technology. Links to the event, artists and scientists taking part can be found here. Thank you to Oliver Gingrich for inviting me to participate and to all the attendees, especially those who visited my emotive wearable exhibits, asked questions and/or tried a device and filled in a feedback survey.

At Transmission Symposium I debuted my AnemoneStarHeart, a pendant that can also be used as a handheld or standalone device (a smaller version is being tweaked!), which I have developed for broadcasting, amplifying and visualising EEG and ECG data. I have been developing this device as part of the iteration process of the EEG Visualising Pendant; it brings together technology and elements from my aforementioned EEG Visualising Pendant and Flutter ECG pendant hack.

Watching 'Canal Trip' on BBC4 with AnemoneStarHeart broadcasting / visualising EEG
AnemoneStarHeart being used as an ambient device to observe relaxation whilst watching ‘Canal Trip’ slow TV programme, BBC4, May 15.

It can be used, for example, as an aid for meditation, relaxation and concentration, as well as for personal viewing or sharing physiological data in social situations with others. Data is sent to the AnemoneStarHeart via Bluetooth and it is a battery-operated, standalone device. It can either be worn as a pendant, viewed in the palm of the hand or placed in a convenient area of a room, illuminating the space with coloured light. Whilst sensors are transmitting data to the device, it constantly visualises it, changing colour and brightness based on the data it receives. The smaller, wearable version hangs from a chain as a necklace or in the style of a pocket watch, so it can be brought out, looked at, then put away again. As I am interested in the commercial possibilities of bespoke couture wearables and small editions of emotive devices, at some point I aspire to crowdfund this project.

AnemoneStarHeart lit up with live EEG data

As part of my PhD research, I have spent the best part of a year organising and running focus groups with potential users of emotive wearables and the EEG Visualising Pendant in London and Amsterdam. I have also conducted field trials in various social and work situations across London and Brighton, plus collected feedback from observers of the pendant. Since the beginning of 2015 I have been analysing the resulting data to discover the preferences and feedback of potential wearers of emotive wearables, as well as of the EEG Visualising Pendant. Out of that data, so far, has evolved the AnemoneStarHeart device, for which I devised a new configuration of electronic components and code. I created a new enclosure for the electronics in the 3D modelling package Rhino, with help from skills learned at Francis Bitonti’s computational design workshop. It was selective laser sintered (SLS) in nylon, in one of D2W’s EOS machines in London.

Rain & AnemoneStarHeart lit up with live EEG data

At the moment I am mostly out of general circulation as I’m collecting and analysing data which is feeding into the new emotive wearable devices I am building, whilst simultaneously endeavouring to write up / finish my PhD thesis to deadline.

International Symposium on Wearable Computers 2013 (ISWC), ETH Zurich, Switzerland

At the International Symposium on Wearable Technology, Zurich 2013

I had a great time at the 17th International Symposium on Wearable Computers (ISWC), held this year at ETH Zurich, Switzerland, alongside UbiComp. This year there was a record number of submissions for all calls: papers, posters, the Gadget Show and the Design Exhibition. The full programme and abstracts can be found here.

Showing my Bluetooth EEG Visualising Pendant at the Design Exhibition at ISWC

Me with my EEG Visualising Pendant

This year I submitted my EEG Visualising Pendant for selection in the Design Exhibition. The pendant uses EEG (Electroencephalography) signals, which are gleaned from a NeuroSky MindWave Mobile, a standalone headset device that detects electrical signals from the brain via a single electrode on an arm protruding from the headband. The pendant displays attention / concentration data as red LEDs (light emitting diodes) beside meditation / relaxation data in green LEDs on an LED matrix. The pendant has live, record and playback functions, which give the user the choice of displaying live EEG visualisations or recording and playing up to four minutes of previous brainwave data visualisations on a loop. The wearer can use this functionality if they’re feeling mischievous, i.e. want to manipulate a situation by appearing to be concentrating / paying attention or relaxed, what I term ’emotive engineering’, or if they just want to use the pendant as an aesthetic piece of jewellery without the EEG headset. More information on the EEG Visualising Pendant can be found here.

During the Design Exhibition, I was interviewed by BBC Technology News, the coverage can be found here. I was also filmed by Swiss TV.

Here’s my short video tour around the Design Exhibition

Rachael's fab fibre optic dress
Fiber Optic Corset Dress

Including my work, there were fourteen exhibits in the Design Exhibition, here’s a brief listing of them:

Fiber Optic Corset Dress (above), by Rachael Reichert, James Knight, Lisa Ciafaldi and Keith Connelly of Cornell University, USA, which glowed wonderfully in the darkened exhibition space. The dress also features in Rachael’s short film CyBelle Horizon.

Gorgeous Lüme

Lüme (above) by Elizabeth E. Bigger, Luis E. Fraguada, Jorge & Esther and built by Associative Data, is a series of garments that incorporate embedded electronics which illuminate based on the wearer’s selection of colour and other choices, controlled from a smartphone. The garments shone and changed colour beautifully. Lüme won the Design Exhibition prize in the aesthetic garment category.

E-Shoe: A High Heeled Shoe Guitar

E-Shoe: A High Heeled Shoe Guitar, by Alex Murray-Leslie, Melissa Logan and Max Kibardin of the University of Technology, Sydney, Australia, is an intriguing and startlingly captivating shoe guitar that was created to explore acoustics in wearable technology and the practicalities of instruments for live multi-modal performances.

Brace Yourself – The World’s Sexiest Knee “Brace”

Brace Yourself – The World’s Sexiest Knee “Brace” by Crystal Compton and Guido Gioberto of the University of Minnesota, USA, is an interesting and playful look at how a stocking incorporating a bend sensor can be used to track movement in the leg in a new and more aesthetically pleasing way.

Play the Visual Music

Play the Visual Music by Helen Koo of Auburn University, USA, is a garment for musicians and performers which responds to sound and is intended to provide visual multi-sensory stimulation to the audience.

Garment with Stitched Stretch Sensors that Detects Breathing +  AVAnav: Helmet-Mounted Display for Avalanche Rescue Jason O. Germany

Garment with Stitched Stretch Sensors that Detects Breathing & AVAnav: Helmet-Mounted Display for Avalanche Rescue

AVAnav: Helmet-Mounted Display for Avalanche Rescue, by Jason O. Germany of the University of Oregon, USA, is a series of prototypes developed to assist rescue teams in locating buried avalanche victims.

Haptic Mirror Therapy Glove by James Hallam of Georgia Institute of Technology, USA, is a glove that allows the stimulation of a paretic hand’s fingers following a stroke by tapping the fingers of the unaffected hand. James’ glove won the functional category prize in the Design Exhibition.

At the International Symposium on Wearable Technology, Zurich 2013

Garment for rapid prototyping of pose-based applications, by Jacob Dennis, Robert Lewis, Tom Martin, Mark Jones, Kara Baumann, John New and Taylor Pearman of Virginia Tech, USA, is, as the title suggests, a loose-fitting body-suit for rapid prototyping of pose-based applications.

Garment with Stitched Stretch Sensors that Detects Breathing, by Mary Ellen Berglund, Guido Gioberto and Crystal Compton of the University of Minnesota, USA, is intended to be “a comfortable, everyday athletic garment incorporating a breathing sensor to monitor the activities of crewmembers on NASA missions”.


A Wearable Sensing Garment to Detect and Prevent Suit Injuries for Astronauts, by Crystal Compton, Reagan Rockers and Thanh Nguyen of the University of Minnesota, USA, was developed using pressure sensors to help detect and resolve areas of injury in spacesuits.

Garment Body Position Monitoring and Gesture Recognition by Sahithya Baskaran, Norma Easter, Cameron Hord, Emily Keen and Mauricio Uruena of Georgia Institute of Technology, USA, was designed to recognise arm movements that might lead to repetitive strain injuries and capture data on reaction time.

The Photonic Bike Clothing IV for Cute Cyclist

The Photonic Bike Clothing IV for Cute Cyclist by Jiyoung Kim and Sunhee Lee of Dong-A University, South Korea, uses solar panels to power heat pads to aid the comfort of the rider.

Strokes & Dots by Valérie Lamontagne is a collection of garments which are part of a research project looking at fostering advancement of creative innovation and aesthetics in wearable technology.

During the ISWC main conference, there were so many interesting papers presented, my favourites included:

Eagerly waiting for FIDO: Facilitating Interactions for Dogs with Occupations

Blitz the dog preparing for the FIDO presentation!

FIDO – Facilitating Interactions for Dogs with Occupations: Wearable Dog-Activated Interfaces by Melody Jackson, Thad Starner and Clint Zeagler of Georgia Institute of Technology, USA. This research looks at how assistance dogs can communicate more directly with their human companions by using a wearable system of sensors embedded in a dog jacket, activated by pulling, biting and nose touching. Examples shown included human companions who needed precise alerts to be given to them, such as a dog who could distinguish between a doorbell and a tornado alert and raise an alarm, and other canine companions who could get help from others in the case of a medical emergency. What fascinated me about this research is how intelligent and individual it showed the dogs to be; for example, in the Q&A it emerged that some dogs can remember over 1000 commands or words and respond differently depending on breed and temperament. Another point that came out of the Q&A was how, with the dogs’ help, this technology could be really valuable to people with severe disabilities such as ‘locked-in’ syndrome.

Lucy Dunne conducts Q&A with Halley Profita on Don't Mind Me Touching My Wrist: A Case Study of Interacting with On-Body Technology in Public

Halley Profita and Lucy Dunne during the Q&A

Don’t Mind Me Touching My Wrist: A Case Study of Interacting with On-Body Technology in Public by Halley Profita, James Clawson, Scott Gilliland, Clint Zeagler, Thad Starner, Jim Budd and Ellen Yi-Luen Do of the University of Colorado at Boulder, USA. This piqued my interest as it examined the social acceptability of wearables via how people felt about the placing of an e-textile ‘jogwheel’ (a circular controller) on specific parts of the body, and their attitudes to where it was placed and why. The insights were both fascinating and amusing. The study used both male and female testers and the setting of a lift as a public place. The testing was done in the US and Korea to find out how differing cultural attitudes affected the study. Korea was an interesting choice as, unlike in the US, couples do not hold hands or show affection in public, and interacting with a wearable on the body did highlight different cultural attitudes to the body and personal space. The paper discusses a whole load of insights from the research, but to be brief, the study showed the torso to be the most awkward place to wear the e-textile jogwheel and the wrist and forearm to be the least awkward. A majority of wearers found the e-textile jogwheel a potentially ‘useful’ device.

Sensor-Embedded Teeth for Oral Activity Recognition

Sensor-Embedded Teeth for Oral Activity Recognition by Cheng-Yuan Li, Yen-Chang Chen, Wei-Ju Chen, Polly Huang and Hao-hua Chu of National Taiwan University, Taipei, Taiwan. This presentation discussed how a tri-axial accelerometer system could recognise oral activities such as talking, chewing, drinking and laughing. The system results showed “93.8% oral activity recognition accuracy when using a person-dependent classifier and 59.8% accuracy when using a person-independent classifier.” They discussed uses for this such as dietary tracking. I found this research quite intriguing as I’m always looking for new and interesting ways to self-quantify and will look out for news of their future work in this area.

Thad Starner Keynote 'Through the looking glass'  at ISWC / Ubicomp

Thad Starner giving his keynote.

Wearable Computing: Through the Looking Glass by Thad Starner of Georgia Institute of Technology, USA. Although I’ve read so many articles about Google Glass and possibly talked the hind leg off a donkey on the topic of Glass / lifelogging / privacy / surveillance / sousveillance in the last 18 months, I was still really looking forward to hearing Thad, who is also Technical Lead/Manager on Google’s Project Glass, talk about the device and discuss its tech specs. As Thad was previously part of the MIT Media Lab ‘Borg’ collective alongside Steve Mann, I was especially looking forward to hearing him present his thoughts on the history of wearable computing. I really enjoyed his talk and insights, and best of all he brought along a box of some of his old head-mounted display projects, one of which I cheekily tried on, see photo below.

Cheekily trying on Thad Starner's computer / Twiddler glasses at ISWC / Ubicomp - I hope he didn't mind ;-)

ISWC 2013 was fantastic and I loved Zurich. Next year it moves on to Seattle; as it’s (paws crossed) the last year of my PhD, I hope I’ll have the time (thesis beckons) and money (am running out of cash) to get there! Many thanks to Lucy Dunne and Troy Nachtigall for all their hard work organising the Design Exhibition, and to Kristof Van Laerhoven, the programme committee, volunteers, speakers, exhibitors and attendees who made the conference such an excellent and thought-provoking experience. Not forgetting to say thanks too for all the great vegan food that was organised for me!

EEG Data Visualising Pendant – wearable technology for use in social situations

Moi & EEG Visualising Pendant worn with 3D printed frame

EEG Visualising Pendant shown with 3D printed frames

I developed my EEG Visualising Pendant for use in social situations. The pendant uses EEG (Electroencephalography) signals, which are gleaned from a NeuroSky MindWave Mobile headset. The MindWave is a standalone headset device that detects electrical signals from the brain via a single electrode on an arm protruding from the headband. The electrode makes contact with the wearer’s forehead at the pre-frontal cortex area, where higher thinking states are dominant. The pendant displays attention / concentration data as red LEDs (light emitting diodes) beside meditation / relaxation data in green LEDs on an LED matrix. The pendant has live, record and playback functions, which give the user the choice of displaying live EEG visualisations, or recording and playing up to four minutes of previous brainwave data visualisations on a loop if they’re feeling mischievous or want to appear to be concentrating / paying attention or relaxed, what I call ’emotive engineering’, or just want to use the pendant as an aesthetic piece of jewellery without the EEG headset.

EEG Visualising Pendant - now with live, record & playback modes!

Image shows the pendant in action, plus selection options for pendant modes: live, record or playback.

I created this video to show the EEG Pendant working with the MindWave Mobile headset; I’ve added some crowd atmos to simulate being in a networking situation. You can see my attention (red) and meditation (green) levels changing on the pendant.

My motivation for developing this piece of wearable technology is that in certain spaces and situations we feel more awkward and vulnerable than in others. These include conferences and networking events, which put us in social situations where we might be alone or not know other people very well, and also social areas such as bars and parties. All are situations where people often assume it’s okay to come into someone’s space and talk to them, which, depending on how someone is feeling, might make them uncomfortable. As well as asking personal questions, some conversations can go on for too long, and it’s not usually socially acceptable to interrupt a person speaking mid-flow, then walk away – so how can we best let people know when we feel uncomfortable? As not everyone is adept at recognising or correctly interpreting the emotional signals of the person they are interacting with via body language alone, I developed the EEG Visualising Pendant as a means to go some way to bridge that gap: a piece of wearable technology that visualises the wearer’s concentration / meditation levels to signal when the wearer is attentive and interested or drifting away from the conversation. The pendant can also display when the wearer is more relaxed or unfocused (possibly when tired too) – in this state the matrix displays more green LEDs.

I am also interested in how we can manipulate social situations and how others see us by controlling our physiological data, either by using the record and playback functions, or by practicing how to control one’s own physiological data, in the case of EEG by, for example, reading, counting backwards, doing times-tables (attention) or defocusing / zoning out (meditation).

Showing my Bluetooth EEG Visualising Pendant at the Design Exhibition at ISWC

Here I am showing my EEG Visualising Pendant at the International Symposium on Wearable Computers (ISWC) in Zurich, September 2013.

Development of hardware and software of the EEG Pendant

The LED (Light Emitting Diode) matrix form factor I chose for the pendant makes it small and versatile. Its 3 x 3 centimetre size allows it to be transferred to various outfits and worn in different ways, for example as a pendant, brooch or badge clipped to a jacket, shirt or tie. The EEG data is visualised in three distinct styles, each being a proportional representation of the signal in real time.

My first action on purchasing a MindWave Mobile back in autumn 2012 was to ascertain how one could use it outside its intended usage, which is to communicate with iOS and Android devices. I’d already found some information on the developer area of the NeuroSky website suggesting there were various other devices and applications that could work with the MindWave Mobile, such as Arduino microcontrollers, but at the time it didn’t have enough information, so I hunted around online for clues and began to piece together an idea of how to go about communicating with the MindWave.

The pendant’s first circuit prototype consisted of an Arduino Uno microcontroller connected via breadboard to a Bluetooth dongle and an LED bar-graph. At this stage the prototype was only visualising one aspect of the EEG data at a time, i.e. attention or meditation data.
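
The logic at this stage was very simple. As a rough C sketch (the function name is hypothetical; the real prototype drove the LEDs from the Arduino Uno’s pins), a single 0–100 eSense value just maps to a number of lit segments:

```c
#include <assert.h>

/* A 10-segment LED bar graph showing one eSense stream at a time.
   Return how many segments to light for a 0-100 value, rounding to
   the nearest segment. (Illustrative name; the prototype set each
   segment's Arduino pin HIGH or LOW from this count.) */
int bargraph_segments(int esense)
{
    if (esense < 0)   esense = 0;    /* clamp bad readings */
    if (esense > 100) esense = 100;
    return (esense + 5) / 10;        /* 0..100 -> 0..10 segments */
}
```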

Behold - my brainwaves visualised on an LED bar graph

I decided that for using the pendant, both the attention and meditation data really needed to be shown next to each other, so I swapped the LED bar-graph for a square, single-colour LED matrix. This gave a better display of how the EEG levels compared, but I felt these levels needed to be distinct from each other, so the green LED matrix was exchanged for a bi-colour LED matrix and the C code updated to display the attention data levels as red rectangles and meditation levels as green rectangles. The rectangles were split over two halves of the square matrix and enlarged and contracted in accordance with the data from the MindWave Mobile headset.

EEG visualisations matrix on a Shrimp circuit with Mindwave Mobile

Development of the pendant’s data visualisation could have concluded at this point, but it is important to consider the design and aesthetic nature of a piece of wearable technology from both the wearer’s and the viewer’s points of view. It is also important to consider how to make the most of the data in terms of creating an innovative and unique piece of wearable technology. Exploring how the EEG data can be creatively portrayed is a crucial part of the software and hardware evolution of the pendant. So, bearing this in mind, I updated the code to add circular and diagonal data visualisations of red / attention and green / meditation. These were originally reflected as lines on the LED matrix, but later as filled shapes with overlaps shown as yellow, which in my opinion is overall more pleasing to the viewer’s eye.
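
To give a flavour of how proportional rectangles on a bi-colour matrix can work, here’s a hedged C sketch. The 8 x 8 layout, pixel encoding and function name are assumptions for illustration rather than my production code, and it only shows the rectangle style, not the circular or diagonal ones or the yellow overlaps:

```c
#include <assert.h>
#include <stdint.h>

/* Bi-colour 8x8 matrix sketch: attention fills the left four columns
   as a red rectangle growing up from the bottom, meditation fills the
   right four columns in green. Pixel codes: 0 = off, 1 = red, 2 = green. */
#define MATRIX_W 8
#define MATRIX_H 8

void draw_rectangles(uint8_t px[MATRIX_H][MATRIX_W],
                     int attention, int meditation)
{
    /* Convert each 0-100 eSense value into a rectangle height in rows. */
    int red_rows   = (attention  * MATRIX_H + 50) / 100;  /* 0..8 */
    int green_rows = (meditation * MATRIX_H + 50) / 100;

    for (int y = 0; y < MATRIX_H; y++)
        for (int x = 0; x < MATRIX_W; x++) {
            if (x < MATRIX_W / 2)
                px[y][x] = (y >= MATRIX_H - red_rows)   ? 1 : 0;
            else
                px[y][x] = (y >= MATRIX_H - green_rows) ? 2 : 0;
        }
}
```

The two halves enlarge and contract independently as new headset data arrives, which is the effect described above.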

EEG Visualising Pendant data shape cycles

For transferring the prototype to stripboard, my first attempt used an ATtiny85 microcontroller, which looked like a good fit for the circuit; as the name suggests, it’s a very small, high-performance, low-power Atmel 8-bit microcontroller. Unfortunately, it wasn’t possible to use the ATtiny85 for this project, as the LED matrix graphics libraries and the code for the cycle of three data visualising styles added up to too much code for the ATtiny85’s 8 KB of Flash memory. Instead, I used a low-cost Shrimp microcontroller kit, which was designed specifically for breadboard / stripboard prototyping and comes as a bag of loose components, making it fairly flexible to put together. The Shrimp is based on the Arduino Uno and includes the same ATmega328-PU microcontroller chip at its heart, so there was no problem uploading the code and libraries from the breadboard and Arduino Uno circuit. The next step was to test the circuit with appropriate batteries to ensure it could be powered as a stand-alone piece of wearable technology; three AAA batteries sufficed to run the circuit and all its components. I considered using two coin cell batteries in parallel, but decided I preferred a rechargeable AAA option.

Mood lighting on my EEG Visualising Pendant at Design Exhibition teardown
The EEG Visualising Pendant on show at the Design Exhibition of the International Symposium on Wearable Computers (ISWC) 2013, Zurich.

Having tested the circuit, the schematic was then drawn out to ensure the circuit and its components could be neatly fitted onto stripboard. An appropriate size of stripboard was cut, the tracks that needed breaking to prevent short circuits were dug out, and the components were laid out and then soldered. This is quite a time-consuming business, but I enjoy building circuits.

The pendant was then ready to be attached to a necklace via small metal hoop links, spaced well away from any circuitry so as not to cause short circuits. The LED matrix / pendant was attached to the main circuit board via detachable male / female jumper wires, both for ease of putting on and so it could be detached from the necklace and worn as a brooch. During use, the stripboard circuit can be housed in a small bespoke box to protect and insulate it, so it can be tucked into a pocket.

The key technical challenge lay in communicating with the headset, as it is a proprietary device designed primarily for use with downloaded apps and games. The MindWave Mobile headset communicates with Bluetooth-enabled devices that have the MindWave Mobile software installed, but unlike the NeuroSky research package, it does not come with a Bluetooth dongle for communicating with other hardware for development purposes. So once I was able to get a Bluetooth dongle to pair with the MindWave Mobile, my next task was to write code to check the incoming packets for integrity and signal quality.
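To give a flavour of what that packet checking involves, here is a simplified sketch based on NeuroSky’s published ThinkGear serial stream format, where each packet carries a checksummed payload of code/value rows (0x02 for poor-signal quality, 0x04 for attention, 0x05 for meditation). The struct and function names are mine, and this deliberately handles only the single-byte rows the pendant cares about.

```cpp
#include <cstddef>
#include <cstdint>

// Parsed eSense values from one ThinkGear payload; -1 means "not present".
struct EegReading {
    int poorSignal = -1;  // 0 = good electrode contact, 200 = no contact
    int attention = -1;   // 0-100
    int meditation = -1;  // 0-100
};

// The checksum byte is the bitwise inverse of the low byte of the payload sum.
bool checksumOk(const uint8_t* payload, size_t len, uint8_t checksum) {
    unsigned sum = 0;
    for (size_t i = 0; i < len; ++i) sum += payload[i];
    return static_cast<uint8_t>(~sum) == checksum;
}

// Walk the payload's data rows, keeping the single-byte codes we care about.
// Extended rows (code >= 0x80, e.g. raw EEG) are length-prefixed and skipped.
EegReading parsePayload(const uint8_t* payload, size_t len) {
    EegReading r;
    size_t i = 0;
    while (i < len) {
        uint8_t code = payload[i++];
        if (code >= 0x80) {
            if (i >= len) break;
            uint8_t rowLen = payload[i++];
            i += rowLen;
        } else if (i < len) {
            uint8_t value = payload[i++];
            if (code == 0x02) r.poorSignal = value;
            else if (code == 0x04) r.attention = value;
            else if (code == 0x05) r.meditation = value;
        }
    }
    return r;
}
```

In practice, the sketch reading from the Bluetooth serial stream would first hunt for the two 0xAA sync bytes and the payload length, then hand the payload to something like the above, discarding any packet whose checksum fails or whose poor-signal value is high.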

In terms of aesthetic design, the LED matrix was chosen because of its small, pendant-like size and shape. It is also very light, so it will not weigh heavily on the neck or body, whether worn as a pendant or a brooch. Aesthetics for wearable technology need to be carefully considered if we expect people to wear these devices, so we should ensure we design them to look elegant and be enjoyable to wear. In the past, wearable technology has been clunky, heavy and often not very pleasing to the eye, nor designed with the wearer’s individual needs in mind. We should also take into consideration the different groups of people who might wear our devices and the individual preferences of each group. Demographics such as age, gender and lifestyle should be accounted for, and user testing carried out on designs.

The NeuroSky MindWave Mobile is a useful low-cost EEG headset; however, having only one electrode can make finding a signal a little cumbersome. In return, we get a headset that is comparatively discreet and does not require a gel to establish conductivity between the head and the electrode.

Ribbonacci frame for EEG Visualising Pendant

Using a Shrimp kit for the microcontroller circuit made for a smaller, less bulky circuit and brought the price of the project down, although a little extra time had to be spent putting the circuit together, soldering, and testing for short circuits and mistakes in the component layout. As mentioned in the project development, the ATtiny85 was an ambitious approach to making the circuit smaller and easier to wear, but it was not appropriate as there was not enough memory available for the code and libraries needed to drive the circuit and LED matrix.

Future Work
The EEG Visualising Pendant will progress as a project by testing and developing new ways of visualising EEG data that appeal to the user. The presentation of the pendant will be developed in terms of user profiles: for example, how could the matrix be housed and embellished to suit different demographics of users, and what styles would suit male and female wearers?

In terms of the hardware, there are possible improvements that can be made to the configuration of the circuit to make the circuit board smaller and more compact. Smaller and lighter batteries would considerably lessen the weight and bulkiness of the circuit board. As EEG technology progresses, it may not be long before the headset form factor is done away with altogether, as smaller and less obvious ways of wearing the EEG electrode and transmitting the data are developed and favoured.

Wearing the SolarStar frame for EEG Visualising Pendant

Above polymer clay textured frame, below 3D printed frames in sparkly alumide (printed by Shapeways)