InterFACES: the human face of assistive technologies
User Group Leaders – Creative:
Mr James Brosnan, Mr Bobby Byrne, Ms Katie Gilligan, Ms Sapna Ramnani, Ms Amy Kelly, Mr Daniel O’Haire
Academic Research Team:
Co-PIs Dr Mick Donegan (Technology Director/Eyegaze expert/communications grid developer) and Professor Lizbeth Goodman (Director/Educational and Performance/Communications Technology expert);
Colm O’Snodaigh (Musical Director)
With Aejaz Zahid, Dr Brian Duffy (Robotics and HCI expert), Kathryn Brosnan and the Brosnan brothers, Sandy Ana Fuentes, Tina Gilligan, The Ramnani family, Liz and Dave O’Haire, Turlif Vilbrandt, Dr Brian Dillon, Jeremi Sudol, Eoin O’Brien, Aisling Kinsella and Kíla.
Work underway includes the next iteration of the InterFACES project, co-directed by Professor Goodman with Dr Mick Donegan of Special Effect. (Mick is currently the Research Group Leader for SMARTlab’s Multimodal and Assistive Technologies Research Group. He was formerly the full-time Deputy Director of SMARTlab, having joined the team from the Oxford ACE Centre, where he pushed the boundaries of eye-tracking technologies for assistive tech and user empowerment.)
The InterFACES team as a whole now includes professors, associate researchers and research students, working in close collaboration with a group of User Group Leaders. Together they engage in the multi-disciplinary domain of Assistive Technology interfaces for gaming on a universal-design platform, along with new research into bio-affective feedback triggers for movement and ‘control’ in virtual worlds and game environments (including learning environments).
We are testing the effectiveness of available tools for using eye movement as a control mechanism for communications by people with little or no other voluntary muscle movement.
Our first explorations began in 2003, and the InterFACES Project emerged in its first iteration in 2005. From the beginning, one of our main User Group Leaders has been collaborator James Brosnan, the ‘alpha user’ of the system. James has been contributing to a book including an expert analysis of the health and wellbeing advantages of using an ‘unplugged’ interface system (forthcoming from MIT Press), as well as an article on the use of the Mytobii system that has been tested and iterated by the team in Dublin and London since 2005.
We are testing the efficacy and also the comfort and ease of use of eyegaze systems in the home and in rehabilitative centres, to see whether the use of this tool, as opposed to manual switches, can enable users to sit more comfortably for longer periods of time, and to write faster. The team has customised the eye-click mechanism, including links to a bespoke musical and ‘emotional register’ keyboard for quick and easy personalisation of communications enabling a ‘new poetics’ of assistive technologies.
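The team’s customised eye-click mechanism is not documented in detail here, but the standard eyegaze ‘click’ is dwell-based selection: when the gaze stays within a small region for long enough, a click is registered at that point. A minimal sketch follows; the dwell time and radius values are illustrative assumptions, not the project’s actual settings.

```python
import math

# Hypothetical tuning values, not the project's real settings.
DWELL_TIME_S = 0.8   # how long the gaze must hold still to "click"
RADIUS_PX = 40.0     # how far the gaze may wander and still count as a dwell

class DwellClicker:
    """Dwell-based selection: a sketch of the common eyegaze 'click'."""

    def __init__(self, dwell_time_s=DWELL_TIME_S, radius_px=RADIUS_PX):
        self.dwell_time_s = dwell_time_s
        self.radius_px = radius_px
        self.anchor = None        # (x, y) where the current dwell started
        self.anchor_time = None   # when it started, in seconds

    def update(self, x, y, t):
        """Feed one gaze sample (pixels, seconds).
        Returns the (x, y) of a click when a dwell completes, else None."""
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Gaze moved away: start a new dwell at this point.
            self.anchor, self.anchor_time = (x, y), t
            return None
        if t - self.anchor_time >= self.dwell_time_s:
            click_at = self.anchor
            self.anchor, self.anchor_time = None, None  # reset after clicking
            return click_at
        return None
```

In use, gaze samples from the tracker are fed to `update()` at the device’s sampling rate; small jitter within the radius is tolerated, while any larger movement restarts the dwell timer.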
The InterFACES team is testing the levels of increased physical comfort and overall wellbeing of our User Group Leaders as they engage with a wide range of creative technology tools linked to the eyegaze interface: from the playing of musical instruments in real time, to the choreography of dance and the playing of collaborative games online. New versions of the eye-control system also support ambient assisted living agendas, by providing the capability for a person with little or no controllable voluntary physical movement to turn lights and heating controls on and off, and to control television and radio/audio channels and volume as well.
We are also testing a range of related assistive technology systems, with a view to creating an integrated accessible tech Playbox or toolkit, including GPS sensor, Bluetooth, gyro-control and click-ogo systems integrated with Mytobii, litewriter and our own customised content boards.
Our new proposed SMARTcare project for the European Commission will, if funded, provide a portable and affordable toolkit, integrating gestural and eye-gaze technologies, along with computer-vision tools for the blind and partially sighted: a full suite of interfaces to enable learning and communication for ALL.
The final link in the communications chain is the aspect of human interaction. In the early experiment below, James and Lizbeth are engaged in testing the limits and appropriate levels of ‘human connected predictive text’ when Lizbeth uses a keyboard linked to the Mytobii to help type in words and punctuation marks that James begins, but only when they are physically present together and when James can indicate with his eyes which of Lizbeth’s ‘predictions’ is accurate to his exact meaning at any point in time. James has provided an ‘emotional lexicon’ for the music and poetics keyboards, and also a list of key words and phrases that Lizbeth can type in without checking, whereas any more creative or less common word will be checked with James in this duet of the eyes and keys.
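The duet logic described above — pre-cleared words may be typed without checking, while any rarer or more creative candidate must be confirmed by the user — can be sketched as follows. The word lists and the `confirm` callback are hypothetical placeholders; in the actual sessions the confirmation is James indicating with his eyes which prediction matches his meaning.

```python
# Hypothetical word lists standing in for the pre-approved phrases and
# broader vocabulary described in the text.
APPROVED = {"hello", "music", "thanks", "today"}
VOCABULARY = sorted(APPROVED | {"melancholy", "melody", "metaphor"})

def complete(prefix, confirm):
    """Return the completion to type for `prefix`.

    `confirm(word) -> bool` stands in for the user indicating whether a
    prediction matches their intended meaning. Approved words are typed
    without checking; all others require explicit confirmation."""
    candidates = [w for w in VOCABULARY if w.startswith(prefix)]
    for word in candidates:
        if word in APPROVED:
            return word        # pre-cleared: safe to type without checking
        if confirm(word):
            return word        # user accepted this prediction
    return prefix              # no accepted completion: keep the prefix as typed
```

The design point is that the human typist never overrides the user: every word outside the agreed list passes through the confirmation step, however slow that makes the exchange.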
James first performed with SMARTlab in the pre-show to the Dublin Special Olympics in 2003 (see ‘Féileachán’ on this site). The group, with the band Kíla, then performed at two World Summits – in Geneva in 2003 and in Tunis in 2005. We led workshops on interactive eye-controlled integrated performance at Ohio University in 2006.
We then presented our first interactive InterFACES showcase, with a musical score contributed by James, live in London at the SMARTlab Studio in July 2007, and then with a full band at the Leonardo Art-Science Anniversary summit in Prague, in November 2007.
InterFACES was nominated for the Times Higher Award for Services to People with Disabilities in 2007 and 2008, winning commendations.
We performed live showcase events with Katie Gilligan and Kíla, and with Bobby Byrne, at the Science Gallery in Dublin in 2008 and 2009.
We are currently nominated for a WISE Award from the World Innovation Summit for Education.
Our aim is to empower ALL learners by making eye-control and other assistive-technology tools ubiquitous, by integrating them into commercial products such as the Xbox Kinect. This is so that people with disabilities will no longer need to buy ‘special tools’, but can rather engage freely and easily with mainstream tools and technologies, all with the ‘click’ of an eye.
Interview with Dr Mick Donegan
Eyejamming show, Dublin, April 2008
- Future-making serious games: SMARTlab: Serious games for social inclusion
- (Beyond) The 7 Movements of James
- Ms Katie Gilligan’s Website for InterFACES (to be updated)
- Music from the eyes – world premiere from SMARTlab at UEL – UEL News, Tuesday 10 July 2007
- THE Awards