Show simple item record

dc.contributor.author: Akram, Wajeeh (en_US)
dc.contributor.author: Tiberii, Laura (en_US)
dc.contributor.author: Betke, Margrit (en_US)
dc.date.accessioned: 2011-10-20T05:23:43Z
dc.date.available: 2011-10-20T05:23:43Z
dc.date.issued: 2006-05-11
dc.identifier.uri: https://hdl.handle.net/2144/1867
dc.description.abstract: Many people suffer from conditions that lead to deterioration of motor control and make access to the computer using traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware to track and translate a user's movement to pointer movement. These approaches, for example wearable devices, may be perceived as intrusive. Camera-based assistive systems that visually track features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer-vision-based strategy in which features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated to pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement, which allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human-computer interaction (HCI). This framework enables effective use of the vision-based interface system for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its usage within the application framework as a practical tool for desktop users with disabilities. (en_US)
dc.description.sponsorship: National Science Foundation (IIS-0093367, IIS-0329009, 0202067) (en_US)
dc.language.iso: en_US
dc.publisher: Boston University Computer Science Department (en_US)
dc.relation.ispartofseries: BUCS Technical Reports; BUCS-TR-2006-006
dc.subject: Computer vision (en_US)
dc.subject: Assistive technology (en_US)
dc.subject: Alternative input devices (en_US)
dc.subject: Video-based human-computer interfaces (en_US)
dc.subject: Autonomous navigation (en_US)
dc.title: A Customizable Camera-based Human Computer Interaction System Allowing People With Disabilities Autonomous Hands-Free Navigation of Multiple Computing Tasks (en_US)
dc.type: Technical Report (en_US)

