Smell output in human-computer interaction

Do you have problems indexing and searching through your computer-based photo collection? Have you thought about using aromas as a guide? Professor Stephen A. Brewster (at the Multimodal Interaction Group, Department of Computing Science, University of Glasgow, UK) has. And, along with colleagues, he has developed ‘Olfoto’ (details of which are published in Proceedings of ACM CHI 2006, Montreal, Canada, ACM Press Addison-Wesley, pp 653-662).
“Smell output is an under-explored presentation technique in human-computer interaction,” explain the team. Twelve experimental participants were paid £10 each to organise (or not) their sets of photos with the aid of ‘smell-cubes’ (from Dale Air, UK). “The smells chosen from the Dale Air website to match our categories were: Brewery, Alpine, Bread, Ozone, Sea Shore, Smoke, Farm-yard, Dusty, Grass, Floral, Sea Breeze, Sweaty Feet, River-bank, Unisex Perfume, Machine Oil, Dark Chocolate.”

Results were positive(ish): “In general, participants were able to use the smells to identify pictures. Participants were performing at greater than chance levels with the smells. They were less effective than text tags, but this is perhaps not surprising.”

* Note: Not to be confused with Aromacones®