Category Archives: EEG

ThinkerBelle Fibre Optic EEG Amplifying Dress

ThinkerBelle EEG Amplifying Dress

I’m writing up my PhD thesis at the moment and analysing a huge amount of data from over 70 surveys and 8 hours of focus group audio transcripts. Anyway, without giving away too much about the data, as I’m saving it for my thesis, here’s a little preview of my ThinkerBelle EEG Amplifying Dress. I created this dress in response to a subsection of feedback data from my field trials and focus groups, which investigated the functionality, aesthetics and user experience of wearables, and in particular wearer and observer feedback on my EEG Visualising Pendant. The motivation for creating the dress was engagement in social situations in which the wearer might find themselves in a noisy or crowded area, where it is not possible to hear others and communicate easily, and where forms of non-verbal communication may be useful. The dress broadcasts the meditation and attention data of the wearer for observers to make their own interpretations. It is up to the wearer whether they want to divulge information regarding the physiological source of the data being visualised.


A short video of the dress.


A longer video of the dress shot in Tokyo, Japan.

ThinkerBelle fibre optic EEG dress

The dress was constructed from satin fabric with fibre optic filament woven into organza. Signals from a NeuroSky MindWave Mobile EEG headset, in the form of two separate streams, ‘attention’ and ‘meditation’, are sent via Bluetooth to the dress, which amplifies and visualises the data via the fibre optic filament. Attention data is shown as red light and meditation data as green light. The dress is constructed so the two streams of data light overlap and interweave. The fibre optic filament is repositionable, allowing the wearer to make their own lighting arrangements and dress design. The red and green light fades in and out as the wearer’s levels of attention and meditation heighten or decline.
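For readers curious about how data like this can drive the lighting, here is a minimal Arduino-style sketch of the general idea, not the dress’s actual firmware: it assumes the attention and meditation values (0–100, as NeuroSky’s eSense scale provides) have already been parsed from the Bluetooth stream, and that the red and green LEDs feeding the fibre optic filament sit on PWM pins. The pin numbers, fade step and readEsense() helper are hypothetical.

```cpp
// A minimal sketch (assumed pins and helper, not the dress's firmware):
// fade the red and green LEDs feeding the fibre optic filament in
// proportion to the latest attention and meditation values (0-100).
const int RED_PIN = 9;     // attention channel (assumed PWM pin)
const int GREEN_PIN = 10;  // meditation channel (assumed PWM pin)

int redLevel = 0, greenLevel = 0;   // current PWM duty cycles (0-255)
int attention = 0, meditation = 0;  // latest eSense values (0-100)

// Placeholder: in the real dress this would parse the Bluetooth/ThinkGear
// stream from the MindWave Mobile and update the two values.
void readEsense(int *att, int *med) {}

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  readEsense(&attention, &meditation);

  // Map the 0-100 values onto 0-255 PWM targets, then step the current
  // levels towards the targets so the light fades rather than jumping.
  int redTarget = map(attention, 0, 100, 0, 255);
  int greenTarget = map(meditation, 0, 100, 0, 255);
  redLevel += constrain(redTarget - redLevel, -5, 5);
  greenLevel += constrain(greenTarget - greenLevel, -5, 5);

  analogWrite(RED_PIN, redLevel);
  analogWrite(GREEN_PIN, greenLevel);
  delay(20);  // ~50 updates per second gives a smooth fade
}
```

Stepping the PWM levels towards their targets, rather than jumping straight to them, is what gives the gentle fading in and out described above.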

The dress’s hardware has a choice of modes, so it is possible to record and play back the data. This makes it possible for the wearer to appear to be concentrating or relaxed if they wish to influence a social situation, what I call ’emotive engineering’, or if they would like to use their EEG data to create a particular mix of colour and light on the dress. It is also possible to set the playback mode and take off the EEG headset if the wearer wants to be headset-free. If you’d like to read my ISWC 15 paper on the dress, it’s available from the ACM, or ask me for a copy.

ThinkerBelle fibre optic EEG dress
Red = attention / green = meditation

As you can see I’ve included a few initial photos of the dress in action showing the EEG data as it is received from the headset. I have not made a successful video of the dress yet, as it’s difficult to light the dress for photos and filming. I will add a video when I’ve worked around this!

I have also been experimenting with changing the form factor of the headset for aesthetics and comfort, using various materials.

ThinkerBelle fibre optic EEG dress
Feeling relaxed = very green dress!

A bit of extra info, in case you were wondering… During my PhD research, I’ve been investigating the possibility that wearable technology can be used with physiological data to create new forms of non-verbal communication. Since 2008 I’ve been experimenting with wearables, sensors and social situations, which led me to focus on wearables that amplify, visualise and broadcast data from the body. As mentioned in previous blog posts, the field of wearable technology has blossomed and grown rapidly in recent years into a huge and mainly undefined set of devices, platforms, uses and practices. It was therefore necessary for me (a couple of years ago now) to create my own nomenclature to define the area I was creating and researching in. The first subset area was ‘responsive wearables’, which covers wearables that respond to various physiological, environmental and other user-related data and give an output. This worked for a short while but still wasn’t definitive. I went on to drill down and make a new subset of this area to find a better definition for the emerging field I was working in, which I named ‘emotive wearables’. This subset focuses on wearable technology that gleans physiological data from the body, then processes and broadcasts it in some way from the wearer. The output could be sound, movement, light, etc.

ThinkerBelle fibre optic EEG dress

My research with sensors, social situations, ambient and physiological data has led me to work with sound signal input (decibels), temperature (Celsius), pressure (Pascal), altitude (metres), ECG (Electrocardiography), GSR (Galvanic Skin Response), EMG (Electromyography) and EEG (Electroencephalography), but my main focus for my PhD has been on the development and research of emotive wearables with EEG data.

AnemoneStarHeart EEG / ECG visualising device at Transmission Symposium

AnemoneStarHeart handheld EEG/ECG Visualising Device

At the end of April I spent a very enjoyable day at Bournemouth University attending Transmission Symposium: Strategies for Brainwave Interpretation in the Arts. There were some very interesting presentations, exchanges of ideas and discussion on the intersection between art, cognition and technology. Links to the event, artists and scientists taking part can be found here. Thank you to Oliver Gingrich for inviting me to participate and to all the attendees, especially those who visited my emotive wearable exhibits, asked questions and/or tried a device and filled in a feedback survey.

At Transmission Symposium I debuted my AnemoneStarHeart, a pendant that can also be used as a handheld or standalone device (a smaller version is being tweaked!), which I have developed for broadcasting, amplifying and visualising EEG and ECG data. I have been developing this device as part of the iteration process that began with the EEG Visualising Pendant. It brings together technology and elements from my aforementioned EEG Visualising Pendant and my Flutter ECG pendant hack.

Watching 'Canal Trip' on BBC4 with AnemoneStarHeart broadcasting / visualising EEG
AnemoneStarHeart being used as an ambient device to observe relaxation whilst watching ‘Canal Trip’ slow TV programme, BBC4, May 15.

It can be used, for example, as an aid for meditation, relaxation and concentration, as well as for personal viewing or for sharing physiological data with others in social situations. Data is sent to the AnemoneStarHeart via Bluetooth, and it is a battery-operated, standalone device. It can either be worn as a pendant, viewed in the palm of the hand or placed in a convenient area of a room, illuminating the space with coloured light. Whilst sensors are transmitting data to the device, it constantly visualises it, changing colour and brightness based on the data it receives. The smaller, wearable version hangs from a chain as a necklace or in the style of a pocket watch, so it can be brought out, looked at, then put away again. As I am interested in the commercial possibilities of bespoke couture wearables and small editions of emotive devices, at some point I aspire to crowdfund this project.
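As a rough illustration of how a stream of physiological values can drive an ambient light like this, the sketch below smooths an incoming 0–100 level (for example NeuroSky’s meditation value) and blends it between blue and green on a common-cathode RGB LED. The pins, the blue-to-green mapping and the smoothing factor are invented for the example; this is not the AnemoneStarHeart’s actual firmware.

```cpp
// Illustration only (invented pins and mapping, not the device's firmware):
// smooth an incoming 0-100 level and blend it between blue and green on a
// common-cathode RGB LED, so the light changes colour and brightness gently.
const int R_PIN = 9, G_PIN = 10, B_PIN = 11;
float smoothed = 50.0;  // exponentially smoothed level

void setup() {
  pinMode(R_PIN, OUTPUT);
  pinMode(G_PIN, OUTPUT);
  pinMode(B_PIN, OUTPUT);
}

void showLevel(int level) {              // level: 0-100 from the sensor
  smoothed += 0.1 * (level - smoothed);  // drift gently towards the new value
  int t = map((int)smoothed, 0, 100, 0, 255);
  analogWrite(B_PIN, 255 - t);           // low values glow blue...
  analogWrite(G_PIN, t);                 // ...high values glow green
  analogWrite(R_PIN, 0);
}

void loop() {
  // Stand-in value; in the real device this would come from the EEG/ECG
  // sensor data arriving over Bluetooth.
  showLevel(random(0, 101));
  delay(50);
}
```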

AnemoneStarHeart lit up with live EEG data

As part of my PhD research, I have spent the best part of a year organising and running focus groups with potential users of emotive wearables and the EEG Visualising Pendant in London and Amsterdam. I have also conducted field trials in various social and work situations across London and Brighton, plus collected feedback from observers of the pendant. Since the beginning of 2015 I have been analysing the resulting data, to discover the preferences and feedback of potential wearers of emotive wearables in general and of the EEG Visualising Pendant in particular. Out of the resulting data, so far, has evolved the AnemoneStarHeart device, for which I devised a new configuration of electronic components and code. I created a new enclosure for the electronics in the 3D modelling package Rhino, with help from skills learned at Francis Bitonti’s computational design workshop. It was selective laser sintered (SLS) in nylon, in one of D2W’s EOS machines in London.

Rain & AnemoneStarHeart lit up with live EEG data

At the moment I am mostly out of general circulation as I’m collecting and analysing data which is feeding into the new emotive wearable devices I am building, whilst simultaneously endeavouring to write up / finish my PhD thesis to deadline.

Quantified Self Europe 2014: Emotive Wearables Breakout Session

Quantified Self Europe pre-party

It was great to visit Amsterdam again and see friends at the 3rd Quantified Self Europe Conference; I have previously spoken at the conference on Sensing Wearables, in 2011, and Visualising Physiological Data, in 2013.

There were two very prominent topics being discussed at Quantified Self Europe 2014: firstly the quantifying of grief, and secondly privacy and surveillance. These are two contrasting and provocative areas for attendees to contemplate, but also important to all, as they’re very personal areas we can’t avoid having a viewpoint on. Rather than try to summarise a few of the talks myself, if you’d like to find out more about the excellent presentations and discussions at the conference, search for ‘QSEU14’ or ‘europe’ on the Quantified Self website, where many of the sessions have write-ups, photos and video documentation.

My contribution to the conference was to lead a Breakout Session on Emotive Wearables and demonstrate my EEG Visualising Pendant. Breakout Sessions are intended for audience participation, and I wanted to use this one-hour session to get feedback on my pendant for its next iteration and also to find out what people’s opinions were on emotive wearables generally.

I’ve been making wearable technology for six years and have been a PhD student investigating wearables for three years; during this time I’ve found wearable technology is such a massive field that I have needed to find my own terms to describe the areas I work in and focus on in my research. Two subsets that I have defined terms for are responsive wearables, which includes garments, jewellery and accessories that respond to the wearer’s environment, interactivity with technology or physiological signals taken from sensor data worn on or around the body, and emotive wearables, which describes garments, jewellery and accessories that amplify, broadcast and visualise physiological data associated with non-verbal communication, for example the emotions and moods of the wearer. In my PhD research I am looking at whether such wearable devices can be used to express non-verbal communication, and I wanted to find out what Quantified Self Europe attendees’ opinions of and attitudes towards such technology would be, as many attendees are super-users of personal tracking technology and are also developing it.

Demo-ing EEG Visualising Pendant

My EEG Visualising Pendant is an example of my practice that I would describe as an emotive wearable, because it amplifies and broadcasts physiological data of the wearer and may provoke a response from those around the wearer. The pendant visualises the brainwave attention and meditation data of the wearer simultaneously (using data from a Bluetooth NeuroSky MindWave headset), via an LED (Light Emitting Diode) matrix, allowing others to make assumptions and interpretations from the visualisations. For example, whether the person wearing the pendant is paying attention or concentrating on what is going on around them, or is relaxed and not concentrating.

After I demonstrated the EEG Visualising Pendant, I invited attendees of my Breakout Session to participate in a discussion and paper survey about attitudes to emotive wearables, and in particular feedback on the pendant. We had a mixed-gender session of various ages and a great discussion, which covered areas such as who would wear this device and other devices that amplify one’s physiological data. We discussed the appropriateness of such personal technology and also thought in depth about privacy and the ramifications of devices that upload such data to cloud websites for processing, plus the positive and possible negative aspects of data collection. Other issues we discussed included the design and aesthetics of prominent devices on the body and where we would be comfortable wearing them.

I am still transcribing the audio from the session and analysing the paper surveys that were completed; overall the feedback was very positive. The data I have gathered will feed into the next iteration of the EEG Visualising Pendant prototype and future devices. It will also feed into my PhD research. Since the Quantified Self Europe Conference, I have run the same focus group three more times, with women interested in wearable technology, in London. I will update my blog with my findings from the focus groups and surveys in due course, plus of course information on the EEG Visualising Pendant’s next iteration as it progresses.

International Symposium on Wearable Computers 2013 (ISWC), ETH Zurich, Switzerland

At the International Symposium on Wearable Technology, Zurich 2013

I had a great time at the 17th International Symposium on Wearable Computers (ISWC), held this year at ETH Zurich, Switzerland, alongside UbiComp. This year there was a record number of submissions for all calls: papers, posters, the Gadget Show and the Design Exhibition. The full programme and abstracts can be found here.

Showing my Bluetooth EEG Visualising Pendant at the Design Exhibition at ISWC

Me with my EEG Visualising Pendant

This year I submitted my EEG Visualising Pendant for selection in the Design Exhibition. The pendant uses EEG (Electroencephalography) signals, which are gleaned from a NeuroSky MindWave Mobile, a standalone headset device that detects electrical signals from the brain via a single electrode on a protruding arm from the headband. The pendant displays attention / concentration data as red LEDs (light emitting diodes) beside meditation / relaxation data in green LEDs on an LED matrix. The pendant has live, record and playback functions, which give the user the choice of displaying live EEG visualisations or recording and playing up to four minutes of previous brainwave data visualisations on a loop. The wearer can use this functionality if they’re feeling mischievous, i.e. want to manipulate a situation, what I term ’emotive engineering’, and appear to be concentrating / paying attention or relaxed, or if they just want to use the pendant as an aesthetic piece of jewellery without the EEG headset. More information on the EEG Visualising Pendant can be found here. Link to my paper from the ISWC Adjunct Proceedings: EEG Visualising Pendant for use in Social Situations.

During the Design Exhibition, I was interviewed by BBC Technology News, the coverage can be found here. I was also filmed by Swiss TV.

Here’s my short video tour around the Design Exhibition

Rachael's fab fibre optic dress
Fiber Optic Corset Dress

Including my work, there were fourteen exhibits in the Design Exhibition, here’s a brief listing of them:

Fiber Optic Corset Dress (above), by Rachael Reichert, James Knight, Lisa Ciafaldi and Keith Connelly of Cornell University, USA, which glowed wonderfully in the darkened exhibition space. The dress also features in Rachael’s short film CyBelle Horizon.

Gorgeous Lüme

Lüme (above) by Elizabeth E. Bigger, Luis E. Fraguada, Jorge & Esther and built by Associative Data, is a series of garments that incorporate embedded electronics which illuminate based on the wearer’s selection of colour and other choices, controlled from a smartphone. The garments shone and changed colour beautifully. Lüme won the Design Exhibition prize in the aesthetic garment category.

E-Shoe: A High Heeled Shoe Guitar

E-Shoe: A High Heeled Shoe Guitar, by Alex Murray-Leslie, Melissa Logan and Max Kibardin of the University of Technology, Sydney, Australia, is an intriguing and startlingly captivating shoe guitar that was created to explore acoustics in wearable technology and the practicalities of instruments for live multi-modal performances.

Brace Yourself – The World’s Sexiest Knee “Brace”

Brace Yourself – The World’s Sexiest Knee “Brace” by Crystal Compton and Guido Gioberto of the University of Minnesota, USA, is an interesting and playful look at how a stocking incorporating a bend sensor can be used to track movement in the leg in a new and more aesthetically pleasing way.

Play the Visual Music

Play the Visual Music by Helen Koo of Auburn University, USA, is a garment for musicians and performers which responds to sound and is intended to provide visual, multi-sensory stimulation to the audience.

Garment with Stitched Stretch Sensors that Detects Breathing & AVAnav: Helmet-Mounted Display for Avalanche Rescue

AVAnav: Helmet-Mounted Display for Avalanche Rescue, by Jason O. Germany of the University of Oregon, USA, is a series of prototypes developed to assist rescue teams in locating buried avalanche victims.

Haptic Mirror Therapy Glove by James Hallam of Georgia Institute of Technology, USA, is a glove that allows the stimulation of a paretic hand’s fingers following a stroke by tapping the fingers of the unaffected hand. James’ glove won the functional category prize in the Design Exhibition.

At the International Symposium on Wearable Technology, Zurich 2013

Garment for rapid prototyping of pose-based applications, by Jacob Dennis, Robert Lewis, Tom Martin, Mark Jones, Kara Baumann, John New and Taylor Pearman of Virginia Tech, USA, is, as the title suggests, a loose-fitting body suit for rapid prototyping of pose-based applications.

Garment with Stitched Stretch Sensors that Detects Breathing, by Mary Ellen Berglund, Guido Gioberto and Crystal Compton of the University of Minnesota, USA, is intended to be “a comfortable, everyday athletic garment incorporating a breathing sensor to monitor the activities of crewmembers on NASA missions”.


A Wearable Sensing Garment to Detect and Prevent Suit Injuries for Astronauts, by Crystal Compton, Reagan Rockers and Thanh Nguyen of the University of Minnesota, USA, was developed using pressure sensors to help detect and resolve areas of injury in spacesuits.

Garment Body Position Monitoring and Gesture Recognition by Sahithya Baskaran, Norma Easter, Cameron Hord, Emily Keen and Mauricio Uruena of Georgia Institute of Technology, USA, was designed to recognise arm movements that might lead to repetitive strain injuries and capture data on reaction time.

The Photonic Bike Clothing IV for Cute Cyclist

The Photonic Bike Clothing IV for Cute Cyclist, by Jiyoung Kim and Sunhee Lee of Dong-A University, South Korea, uses solar panels to power heat pads to aid the comfort of the rider.

Strokes & Dots by Valérie Lamontagne is a collection of garments which are part of a research project looking at fostering advancement of creative innovation and aesthetics in wearable technology.

During the ISWC main conference, there were so many interesting papers presented; my favourites included:

Eagerly waiting for FIDO: Facilitating Interactions for Dogs with Occupations

Blitz the dog preparing for the FIDO presentation!

FIDO – Facilitating Interactions for Dogs with Occupations: Wearable Dog-Activated Interfaces by Melody Jackson, Thad Starner and Clint Zeagler of Georgia Institute of Technology, USA. This research looks at how assistance dogs can communicate more directly with their human companions by using a wearable system of sensors embedded in a dog jacket, activated by pulling, biting and nose touching. Examples shown included human companions who needed precise alerts to be given to them, such as a dog who could distinguish between a doorbell and a tornado alert and raise an alarm, and other canine companions who could get help from others in the case of a medical emergency. What fascinated me about this research is how intelligent and individual it showed the dogs to be; for example, in the Q&A it emerged that some dogs can remember over 1000 commands or words and respond differently depending on breed and temperament. Another point that came out of the Q&A was how, with the dogs’ help, this technology could be really valuable to people with severe disabilities such as ‘locked-in’ syndrome.

Lucy Dunne conducts Q&A with Halley Profita on Don't Mind Me Touching My Wrist: A Case Study of Interacting with On-Body Technology in Public

Halley Profita and Lucy Dunne during the Q&A

Don’t Mind Me Touching My Wrist: A Case Study of Interacting with On-Body Technology in Public by Halley Profita, James Clawson, Scott Gilliland, Clint Zeagler, Thad Starner, Jim Budd and Ellen Yi-Luen Do of the University of Colorado at Boulder, USA. This piqued my interest as it examined the social acceptability of wearables via how people felt about the placing of an e-textile ‘jogwheel’ (a circular controller) on specific parts of the body, their attitudes to where it was placed and why. The insights were both fascinating and amusing. The study included both male and female testers and used the setting of a lift as a public place. The testing was done in the US and Korea to find out how differing cultural attitudes affected the results. Korea was an interesting choice as, contrary to the US, couples do not hold hands or show affection in public, and interacting with a wearable on the body did highlight different cultural attitudes to the body and personal space. The paper discusses a whole load of insights from the research but, to be brief, the study showed the torso to be the most awkward place to wear the e-textile jogwheel and the wrist and forearm to be the least awkward. A majority of wearers found the e-textile jogwheel a potentially ‘useful’ device.

Sensor-Embedded Teeth for Oral Activity Recognition

Sensor-Embedded Teeth for Oral Activity Recognition by Cheng-Yuan Li, Yen-Chang Chen, Wei-Ju Chen, Polly Huang and Hao-hua Chu of the National Taiwan University, Taipei, Taiwan. This presentation discussed how a tri-axial accelerometer system could recognise oral activities such as talking, chewing, drinking and laughing. The system results showed “93.8% oral activity recognition accuracy when using a person-dependent classifier and 59.8% accuracy when using a person-independent classifier.” They discussed uses for this such as dietary tracking. I found this research quite intriguing as I’m always looking for new and interesting ways to self-quantify and will look out for news of their future work in this area.

Thad Starner Keynote 'Through the looking glass'  at ISWC / Ubicomp

Thad Starner giving his keynote.

Wearable Computing: Through the Looking Glass by Thad Starner of Georgia Institute of Technology, USA. Although I’ve read so many articles about Google Glass and have possibly talked the hind leg off a donkey on the topic of Glass / lifelogging / privacy / surveillance / sousveillance in the last 18 months, I was still really looking forward to hearing Thad, who is also Technical Lead/Manager on Google’s Project Glass, talk about the device and discuss its tech specs. As Thad was previously part of the MIT Media Lab ‘Borg’ collective alongside Steve Mann, I was especially looking forward to hearing him present his thoughts on the history of wearable computing. I really enjoyed his talk and insights, and best of all he brought along a box of some of his old head-mounted display projects, one of which I cheekily tried on, see photo below.

Cheekily trying on Thad Starner's computer / Twiddler glasses at ISWC / Ubicomp - I hope he didn't mind ;-)

ISWC 2013 was fantastic and I loved Zurich. Next year it moves on to Seattle; as that will be the last year (paws crossed) of my PhD, I hope I’ll have the time (thesis beckons) and money (I’m running out of cash) to get there! Many thanks to Lucy Dunne and Troy Nachtigall for all their hard work organising the Design Exhibition, and to Kristof Van Laerhoven, the programme committee, volunteers, speakers, exhibitors and attendees who made the conference such an excellent and thought-provoking experience. Not forgetting to say thanks too for all the great vegan food that was organised for me!

EEG Data Visualising Pendant – wearable technology for use in social situations

Moi & EEG Visualising Pendant worn with 3D printed frame

EEG Visualising Pendant shown with 3D printed frames

Introduction
I developed my EEG visualising pendant for use in social situations. The pendant uses EEG (Electroencephalography) signals, which are gleaned from a NeuroSky MindWave Mobile headset. The MindWave is a standalone headset device that detects electrical signals from the brain, accessed via a single electrode on a protruding arm from the headband. The electrode makes contact with the wearer’s forehead at the pre-frontal cortex area, where higher thinking states are dominant. The pendant displays attention / concentration data as red LEDs (light emitting diodes) beside meditation / relaxation data in green LEDs on an LED matrix. The pendant has live, record and playback functions, which give the user the choice of displaying live EEG visualisations or recording and playing up to four minutes of previous brainwave data visualisations on a loop, if they’re feeling mischievous, want to appear to be concentrating / paying attention or relaxed, what I call ’emotive engineering’, or just want to use the pendant as an aesthetic piece of jewellery without the EEG headset.
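For anyone wondering how a record-and-playback mode like this can fit on a small microcontroller, here is a simplified sketch of the idea rather than the pendant’s actual code: if the eSense values arrive roughly once per second, four minutes of attention and meditation data is only about 240 two-byte entries, comfortably within the 2 KB of SRAM on an ATmega328. The buffer size, mode handling and drawMatrix() stub are illustrative assumptions.

```cpp
// Illustrative record/playback buffer (not the pendant's actual code).
// At roughly one sample per second, four minutes of attention + meditation
// data is 240 two-byte entries, well inside an ATmega328's 2 KB of SRAM.
const int MAX_SAMPLES = 240;
byte recAttention[MAX_SAMPLES];
byte recMeditation[MAX_SAMPLES];
int recorded = 0;  // number of samples stored
int playPos = 0;   // current playback index

enum Mode { LIVE, RECORD, PLAYBACK };
Mode mode = LIVE;

// Placeholder for whichever visualisation style is currently selected.
void drawMatrix(byte attention, byte meditation) {}

// Called whenever a new sample arrives in LIVE or RECORD mode.
void handleSample(byte attention, byte meditation) {
  if (mode == RECORD && recorded < MAX_SAMPLES) {
    recAttention[recorded] = attention;
    recMeditation[recorded] = meditation;
    recorded++;
  }
  drawMatrix(attention, meditation);
}

// Called about once per second in PLAYBACK mode to loop the stored data.
void playbackStep() {
  if (mode == PLAYBACK && recorded > 0) {
    drawMatrix(recAttention[playPos], recMeditation[playPos]);
    playPos = (playPos + 1) % recorded;  // wrap around for a continuous loop
  }
}

void setup() {}

void loop() {
  // In the real pendant these calls are driven by incoming headset data and
  // a mode-select switch; they are shown here only for completeness.
  handleSample(60, 40);
  playbackStep();
  delay(1000);
}
```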

EEG Visualising Pendant - now with live, record & playback modes!

Image shows the pendant in action, plus selection options for pendant modes: live, record or playback.

I created this video to show the EEG Pendant working with the MindWave Mobile headset; I’ve added some crowd atmos to simulate being in a networking situation. You can see on the pendant my attention (red) and meditation (green) levels changing.

My motivation for developing this piece of wearable technology is that in certain spaces and situations we feel more awkward and vulnerable than in others. These include conferences and networking events, which put us in social situations where we might be alone or not know other people very well, and also social areas such as bars and parties. All are situations where people often assume it’s okay to come into someone’s space and talk to them, which, depending on how someone is feeling, might make them uncomfortable. As well as asking personal questions, some conversations can go on for too long, and it’s not usually socially acceptable to interrupt a person speaking mid-flow and then walk away – so how can we best let people know when we feel uncomfortable? As not everyone is adept at recognising or correctly interpreting the emotional signals of the person they are interacting with via body language alone, I developed the EEG visualising pendant as a means to go some way towards bridging that gap: a piece of wearable technology that visualises the wearer’s concentration / meditation levels to signal when the wearer is attentive and interested or drifting away from the conversation. The pendant can also show when the wearer is more relaxed or unfocused (possibly when tired too) – in this state the matrix displays more green LEDs.

I am also interested in how we can manipulate social situations and how others see us by controlling our physiological data, either by using the record and playback functions, or by practicing how to control one’s own physiological data, in the case of EEG by, for example, reading, counting backwards, doing times-tables (attention) or defocusing / zoning out (meditation).

Showing my Bluetooth EEG Visualising Pendant at the Design Exhibition at ISWC

Here I am showing my EEG Visualising Pendant at the International Symposium on Wearable Computers (ISWC) in Zurich, September 2013.


Development of hardware and software of the EEG Pendant

The LED (Light Emitting Diode) matrix form factor I chose for the pendant makes it small and versatile. Its 3 x 3 centimetre size allows it to be transferred to various outfits and worn in different ways, for example as a pendant, brooch or badge clipped to a jacket, shirt or tie. The EEG data is visualised in three distinct styles, each being a proportional representation of the signal in real time.

My first action on purchasing a MindWave Mobile back in autumn 2012 was to ascertain how one could use the MindWave Mobile outside its intended usage, which is to communicate with iOS and Android devices. I’d already found some information on the developer area of the NeuroSky website suggesting there were various other devices and applications that could work with the MindWave Mobile, such as Arduino microcontrollers, but at the time it didn’t have enough information, so I hunted around online for clues and began to piece together an idea of how to go about communicating with the MindWave.

The pendant’s first circuit prototype consisted of an Arduino Uno microcontroller connected via breadboard to a Bluetooth dongle and an LED bar-graph. At this stage the prototype was only visualising one aspect of the EEG data at a time, i.e. attention or meditation data.
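To make that first stage concrete, a sketch along the following lines would light a proportion of a 10-segment bar graph according to a single 0–100 eSense value. The pin assignments and directly driven segments are assumptions for illustration, not the original breadboard circuit.

```cpp
// Hypothetical single-channel prototype: light a proportion of a 10-segment
// LED bar graph according to one eSense value (attention OR meditation).
const int NUM_SEGMENTS = 10;
const int segmentPins[NUM_SEGMENTS] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11};

void setup() {
  for (int i = 0; i < NUM_SEGMENTS; i++) pinMode(segmentPins[i], OUTPUT);
}

void showLevel(int value) {  // value: 0-100
  int lit = map(value, 0, 100, 0, NUM_SEGMENTS);
  for (int i = 0; i < NUM_SEGMENTS; i++) {
    digitalWrite(segmentPins[i], i < lit ? HIGH : LOW);
  }
}

void loop() {
  // In the real circuit the value came from the MindWave over Bluetooth;
  // a slow sweep here just demonstrates the display logic.
  for (int v = 0; v <= 100; v += 10) {
    showLevel(v);
    delay(300);
  }
}
```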

Behold - my brainwaves visualised on an LED bar graph

I decided that, for the pendant, both the attention and meditation data really needed to be shown next to each other, so I swapped the LED bar-graph for a square, single-colour LED matrix. This gave a better display of how the EEG levels compared, but I felt these levels needed to be distinct from each other, so the green LED matrix was exchanged for a bi-colour LED matrix and the C code updated to display the attention data levels as red rectangles and the meditation levels as green rectangles. The rectangles were split over the two halves of the square matrix and enlarged and contracted in accordance with the data from the MindWave Mobile headset.
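The sketch below illustrates that kind of split-matrix mapping, assuming an 8x8 bi-colour matrix driven with the Adafruit_LEDBackpack / Adafruit_GFX libraries; the pendant’s actual matrix, driver and code may well differ. Each half of the display is filled to a height proportional to its 0–100 value.

```cpp
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_LEDBackpack.h>

// Assumed hardware: an 8x8 bi-colour LED matrix backpack at the default I2C
// address. The left half shows attention in red, the right half meditation
// in green, each filled to a height proportional to its 0-100 value.
Adafruit_BicolorMatrix matrix = Adafruit_BicolorMatrix();

void setup() {
  matrix.begin(0x70);
}

void drawLevels(int attention, int meditation) {
  matrix.clear();
  int attRows = map(attention, 0, 100, 0, 8);
  int medRows = map(meditation, 0, 100, 0, 8);
  // fillRect(x, y, w, h, colour): columns 0-3 for attention, 4-7 for
  // meditation, both rectangles growing upwards from the bottom row.
  matrix.fillRect(0, 8 - attRows, 4, attRows, LED_RED);
  matrix.fillRect(4, 8 - medRows, 4, medRows, LED_GREEN);
  matrix.writeDisplay();
}

void loop() {
  drawLevels(70, 30);  // stand-in values; the real data comes from the headset
  delay(1000);
}
```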

EEG visualisations matrix on a Shrimp circuit with Mindwave Mobile

Development of the pendant’s data visualisation could have concluded at this point, but it is important to consider the design and aesthetic nature of a piece of wearable technology from both the wearer’s and the viewer’s points of view. It is also important to consider how to make the most of the data in terms of creating an innovative and unique piece of wearable technology. Exploring how the EEG data can be creatively portrayed is a crucial part of the software and hardware evolution of the pendant. So, bearing this in mind, I updated the code to add circular and diagonal data visualisations of red / attention and green / meditation. This was originally reflected as lines on the LED matrix, but later as filled shapes with overlaps shown as yellow, which, in my opinion, is overall more pleasing to the eye of the viewer.
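Continuing the hypothetical sketch above (same assumed matrix object and libraries, not the pendant’s code), the overlap idea can be illustrated by drawing a filled circle per channel and then redrawing a rough overlap region in yellow:

```cpp
// Hypothetical circular style, continuing the sketch above: one filled
// circle per channel, with a rough overlap region redrawn in yellow.
void drawCircles(int attention, int meditation) {
  int attRadius = map(attention, 0, 100, 0, 5);
  int medRadius = map(meditation, 0, 100, 0, 5);
  matrix.clear();
  matrix.fillCircle(3, 3, attRadius, LED_RED);    // attention
  matrix.fillCircle(4, 4, medRadius, LED_GREEN);  // meditation, offset so the
                                                  // two circles interleave
  int overlap = min(attRadius, medRadius) - 1;    // crude overlap estimate
  if (overlap > 0) matrix.fillCircle(3, 4, overlap, LED_YELLOW);
  matrix.writeDisplay();
}
```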

EEG Visualising Pendant data shape cycles

For transferring the prototype to stripboard, my first attempt used an ATtiny85 microcontroller, which looked like a good fit for the circuit and, as the name suggests, is a very small, high-performance, low-power Atmel 8-bit microcontroller. Unfortunately, it wasn’t possible to use the ATtiny85 for this project, as the LED matrix graphics libraries and the code for the cycle of three data visualising styles added up to too much code for the ATtiny85’s 8 KB of flash memory. Instead, I used a low-cost Shrimp microcontroller kit that was designed specifically for breadboard / stripboard prototyping and comes as a bag of loose components, which makes it fairly flexible to put together. The Shrimp is based on the Arduino Uno and includes the same ATmega328 microcontroller chip at its heart, so there was no problem uploading the code and libraries from the breadboard and Arduino Uno circuit. The next step was to test the circuit with appropriate batteries to ensure it could be powered as a stand-alone piece of wearable technology; three AAA batteries sufficed to run the circuit and all its components. I considered using two coin cell batteries in parallel, but decided I preferred a rechargeable AAA option.

Mood lighting on my EEG Visualising Pendant at Design Exhibition teardown
The EEG Visualising Pendant on show at the Design Exhibition of the International Symposium on Wearable Computers (ISWC) 2013, Zurich.

Having tested the circuit, the schematic was then drawn out to ensure the circuit and its components could be neatly fitted onto stripboard. An appropriate size of stripboard was cut, tracks that needed to be cut to prevent short circuits were dug out, and the components were laid out for the circuit and then soldered. This is quite a time-consuming business, but I enjoy building circuits.

The pendant was then ready to be attached to a necklace via small metal hoop links, spaced well enough away from any circuitry not to cause any short circuits. The LED matrix / pendant was attached to the main circuit board via detachable male / female jumper wires, for ease of putting on and also so it could be detached from the necklace and worn as a brooch. During usage, the stripboard circuit can be housed in a small bespoke box to protect it and keep it insulated, so it can then be tucked into a pocket.

Challenges
The key technical challenge lay in communicating with the headset, as it is a proprietary device designed primarily for use with downloaded apps and games. The MindWave Mobile headset communicates with Bluetooth-enabled devices that have the MindWave Mobile software installed, but it does not come with a Bluetooth dongle for communicating with other hardware for development purposes, as the NeuroSky research package does. So once I was able to get the Bluetooth dongle to pair with the MindWave Mobile, my next task was to write code that checks packet and signal quality.
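For background, the MindWave Mobile streams NeuroSky’s documented ThinkGear packet format over the Bluetooth serial link: two 0xAA sync bytes, a payload length, the payload and a checksum, with payload codes including 0x02 (poor-signal quality), 0x04 (attention) and 0x05 (meditation). The fragment below is a simplified sketch of that checking logic, assuming the paired dongle appears as a serial stream on the Arduino’s Serial port; it is not the pendant’s production code, and the baud rate is an assumption that depends on the Bluetooth module used.

```cpp
// Simplified ThinkGear packet reader, assuming the paired Bluetooth dongle
// presents the MindWave Mobile as a serial stream on 'Serial'. A sketch of
// the checking logic only, not the pendant's production code.

byte readByte() {                // block until a byte arrives
  while (!Serial.available()) {}
  return Serial.read();
}

bool readPacket(byte &poorSignal, byte &attention, byte &meditation) {
  if (readByte() != 0xAA) return false;  // first sync byte
  if (readByte() != 0xAA) return false;  // second sync byte

  byte len = readByte();
  if (len > 169) return false;           // invalid payload length

  byte payload[169];
  int sum = 0;
  for (byte i = 0; i < len; i++) {
    payload[i] = readByte();
    sum += payload[i];
  }
  if ((byte)(~sum) != readByte()) return false;  // checksum failed: drop packet

  // Walk the payload. Codes below 0x80 carry one value byte; codes of 0x80
  // and above (raw samples, EEG band power) carry a length byte then data.
  for (byte i = 0; i < len; ) {
    byte code = payload[i++];
    if (code >= 0x80) {
      i += payload[i] + 1;                   // skip multi-byte values here
    } else {
      byte value = payload[i++];
      if (code == 0x02) poorSignal = value;  // 0 = good contact, 200 = none
      if (code == 0x04) attention  = value;  // eSense attention, 0-100
      if (code == 0x05) meditation = value;  // eSense meditation, 0-100
    }
  }
  return true;
}

void setup() {
  Serial.begin(57600);  // assumed baud rate; depends on the Bluetooth module
}

void loop() {
  byte poorSignal = 200, attention = 0, meditation = 0;
  if (readPacket(poorSignal, attention, meditation) && poorSignal == 0) {
    // Only update the visualisation when the packet parsed cleanly and the
    // electrode has good skin contact.
  }
}
```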

In terms of aesthetic design, the LED matrix was chosen because of its small pendant-like size and shape. It is also very light, so will not weigh heavily on the neck or on the body if worn as a pendant or a brooch. Aesthetics for wearable technology need to be carefully considered if we expect people to wear these devices, so we should ensure that we design them to look elegant and enjoyable to wear. In the past wearable technology has been clunky, heavy and often not very pleasing to the eye or designed with the wearer’s individual needs in mind. Plus we should take into consideration the different groups of people who might wear our devices and the individual preferences of each group. Demographics such as age, gender and lifestyle should be accounted for and user testing on designs carried out.

The NeuroSky MindWave Mobile is a useful low-cost EEG headset, however only having the one electrode on the product can make finding a signal a little cumbersome, though in return we get a comparatively discreet headset compared to others and do not need to use a gel to establish conductivity from the head to the electrode.

Ribbonacci frame for EEG Visualising Pendant

Using a Shrimp kit for the microcontroller circuit made for a smaller and less bulky circuit, plus brought the price of the project down, although it meant a little extra time needed to be spent putting the circuit together, soldering and testing for short circuits and any mistakes in the layout of components. As mentioned in the project development, the ATtiny85 was an ambitious approach to making the circuit smaller and easier to wear, but it was not appropriate due to there not being enough memory available for the code and libraries needed to drive the circuit and LED matrix.

Future Work
The EEG visualising pendant will progress as a project by testing and developing new ways of visualising EEG data that appeal to the user. The presentation of the pendant will be developed in terms of user profiles, for example how the matrix could be housed and embellished to suit different demographics of users, including styles for male and female users.

In terms of the hardware, there are possible improvements that can be made to the configuration of the circuit to make the circuit board smaller and more compact. Smaller and lighter batteries would considerably lessen the weight and bulkiness of the circuit board. As EEG technology progresses, it may not be long before the headset form factor is done away with altogether, as smaller and less obvious ways of wearing the EEG electrode and transmitting the data are developed and favoured.

Link to my paper from the 17th International Symposium on Wearable Computers Design Exhibition (ISWC), 2013, Adjunct Proceedings, EEG Visualising Pendant for use in Social Situations.

Wearing the SolarStar frame for EEG Visualising Pendant

Above: polymer clay textured frame; below: 3D printed frames in sparkly alumide (printed by Shapeways).