CS547 Human-Computer Interaction Seminar  (Seminar on People, Computers, and Design)

Fridays 12:30-1:50 · Gates B01 · Open to the public
Ann Copestake, Greg Edwards, Elizabeth Macken, and Neil Scott
Stanford Center for the Study of Language and Information
Using Computers to Aid Communication: A Case Study
April 5, 1996

Project Archimedes aims to alleviate communication problems by using modern computer technology.

Today we will focus on the work we have been doing for Jerry Lieberman, Professor Emeritus of Operations Research and Statistics and former Provost of Stanford. Jerry has amyotrophic lateral sclerosis (ALS or Lou Gehrig's disease).

The system Jerry uses is a personal accessor based on Neil Scott's Total Access Platform. The platform allows mix-and-match combinations of a variety of input devices (keyboard and mouse, voice, head tracking, and eye pointing) with a variety of host computers (PCs, Macs, SGI machines, and Suns). Jerry's system is one of a family of personal accessors that facilitate rapid and effective communication.
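The mix-and-match idea can be sketched in code: every input device is normalized into a common event stream, and a host adapter translates that stream for the target computer. This is a minimal illustration only; the class and event names here are hypothetical and do not reflect the actual Total Access Platform interfaces.

```python
# Hypothetical sketch of the accessor architecture: any device can be
# paired with any host because both sides speak a shared event format.
# All names are illustrative, not the real TAP design.

class InputDevice:
    """Base class for any input method (keyboard, voice, eye tracker)."""
    def read(self):
        raise NotImplementedError

class EyeTracker(InputDevice):
    """Turns dwell fixations on an on-screen keyboard into keystrokes."""
    def __init__(self, fixations):
        self.fixations = list(fixations)
    def read(self):
        return [("keystroke", ch) for ch in self.fixations]

class HostAdapter:
    """Translates normalized events into what one host computer expects."""
    def deliver(self, event):
        kind, payload = event
        return f"{kind}:{payload}"

class Accessor:
    """Routes any device's events to any host -- the mix-and-match idea."""
    def __init__(self, device, host):
        self.device, self.host = device, host
    def run(self):
        return [self.host.deliver(e) for e in self.device.read()]

accessor = Accessor(EyeTracker("hi"), HostAdapter())
print(accessor.run())  # ['keystroke:h', 'keystroke:i']
```

Because the device and host sides only meet at the shared event format, swapping the eye tracker for a speech recognizer, or a PC adapter for a Sun adapter, requires no change to the other half.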

Lou Gehrig's disease is progressive; as Jerry's condition has advanced, we have developed and deployed a number of different input technologies. His case thus allows us not only to explain our general approach but also to demonstrate many of the particular technologies we are working on: speech access, speech synthesis, input augmentation through AI, and input via an eye tracker.
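One form of input augmentation mentioned above is word prediction: given what the user has typed so far, the system suggests likely completions so fewer keystrokes or fixations are needed. The sketch below shows only the core frequency-ranking idea under assumed names; the actual Archimedes predictor is more sophisticated.

```python
# Illustrative frequency-based word predictor. Hypothetical sketch,
# not the Archimedes implementation.
from collections import Counter

class WordPredictor:
    def __init__(self, corpus):
        # Count word frequencies from a sample of the user's language.
        self.freq = Counter(corpus.lower().split())

    def predict(self, prefix, n=3):
        # Rank words beginning with the prefix by how often they occur.
        candidates = [(w, c) for w, c in self.freq.items()
                      if w.startswith(prefix.lower())]
        candidates.sort(key=lambda wc: (-wc[1], wc[0]))
        return [w for w, _ in candidates[:n]]

corpus = "the computer helps the user communicate and the computer learns"
p = WordPredictor(corpus)
print(p.predict("co"))  # ['computer', 'communicate']
```

Even this crude ranking saves effort: two fixations ("c", "o") plus one selection replace typing the rest of "communicate" letter by letter.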



Ann Copestake has been engaged in research in computational linguistics since 1985 and has worked on a range of projects involving natural language interfaces, lexical acquisition and representation, and machine translation. Recently she has been collaborating with the English Resource Grammar group at CSLI on building a broad-coverage computational grammar for English, and with the Archimedes group on implementing a word predictor and other language generation aids for speech prostheses.

Greg Edwards is leading the design and development effort for the eye-tracking communication tool. A graduate of Stanford's Symbolic Systems Program with a concentration in HCI, Greg has worked on various projects, including the Total Access Port (TAP) micro-controllers that enable transparent control of multiple platforms, a speech-recognition application that works with the TAP, a communication tool based on radial/pie menus, and the "Jerry Communicator" projects.

Elizabeth Macken is the Associate Director of the Center for the Study of Language and Information (CSLI) and Leader of the Archimedes Project. She is developing a model of American Sign Language as a heterogeneous system of communication, and is applying the model to communication aids in other modalities. She has also worked extensively in the field of computer-aided instruction, focusing on evaluation methods for instructional strategies whose aim is to individualize instruction, thereby deliberately spreading levels of accomplishment.

Neil Scott, Chief Engineer for the Archimedes Project, has been designing and implementing special computer access systems for individuals with disabilities since 1977. During the past five years, he has focused much of his attention on making speech recognition a truly practical alternative to the keyboard and mouse. He is the inventor of the "Total Access System," which enables individuals with disabilities to easily access any computer.