The Archimedes Project, started at Stanford University in 1992 and now based at the University of Hawaii, is a multi-disciplinary research group working to ensure that everyone can access information regardless of individual needs, abilities, preferences, and culture. The project is unique in that it is organized around eliminating the communication barriers that limit access rather than around the disabilities that cause them. The goal is to understand how to provide universal access.
Archimedes has a small core group, based in Hawaii, with contractors and collaborating researchers in California, North Carolina, New York, the Netherlands, Japan, and New Zealand.
The work of Archimedes ranges from studying user needs and usability issues to researching and developing specialized hardware and software. The Archimedes approach is based on the Intelligent Total Access System (iTASK), which provides each individual with a personal information appliance, called an accessor, offering alternative ways to perform all of the keyboard, mouse, and/or monitor functions of any target information technology (IT) device or appliance.
The iTASK incorporates an Integration Manager and Natural Interaction Processor (IMNIP), recently invented by Archimedes researchers, that enables people to interact with devices using their own natural language and gestures.
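To make the accessor idea concrete, here is a minimal sketch, assuming a hypothetical target-device interface: the accessor translates whatever input the user can produce (here, spoken commands) into the ordinary keyboard and pointer events the target IT device already understands. None of the class or method names below come from the Archimedes software; they are illustrative only.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class KeyEvent:
    key: str          # e.g. "a", "Enter", "PageDown"


@dataclass
class PointerEvent:
    dx: int           # horizontal cursor displacement in pixels
    dy: int           # vertical cursor displacement in pixels


class TargetDevice(Protocol):
    """Any IT device or appliance that accepts standard keyboard/mouse input."""
    def send_key(self, event: KeyEvent) -> None: ...
    def move_pointer(self, event: PointerEvent) -> None: ...


class ConsoleTarget:
    """Stand-in target that simply logs the events it receives."""
    def send_key(self, event: KeyEvent) -> None:
        print(f"key: {event.key}")

    def move_pointer(self, event: PointerEvent) -> None:
        print(f"pointer: ({event.dx}, {event.dy})")


class SpeechAccessor:
    """Maps a user's spoken commands onto the target's standard input events."""
    def __init__(self, target: TargetDevice) -> None:
        self.target = target

    def on_utterance(self, text: str) -> None:
        if text == "page down":
            self.target.send_key(KeyEvent("PageDown"))
        elif text.startswith("type "):
            for ch in text[len("type "):]:
                self.target.send_key(KeyEvent(ch))
        # A full accessor would cover the complete keyboard/mouse repertoire
        # and, in the iTASK design, route utterances through the IMNIP for
        # natural-language interpretation.


accessor = SpeechAccessor(ConsoleTarget())
accessor.on_utterance("page down")   # prints: key: PageDown
accessor.on_utterance("type hi")     # prints: key: h, then key: i
```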
Core Archimedes technologies include:
- Total Access System (TAS)
- Integration Manager and Natural Interaction Processor (IMNIP)
- Intelligent Total Access System (iTASK) Modules
Applications being developed around these core technologies include:
- Collaborative Networking: A group of interconnected iTASK modules allows intelligence to be distributed throughout a smart environment.
- Speech Accessors: Speech recognition is the most widely used alternative input strategy. The goal is to develop smaller, more portable speech accessors based on the latest generation of hand-held or embedded processors. The Archimedes Project has pioneered techniques for combining speech recognition with alternative pointing devices such as head trackers and eye trackers.
- Head Tracking Accessors: A head tracker enables a person to control the mouse cursor with small movements of the head (a minimal mapping sketch appears after this list).
- Nose Tracking Accessor: A video camera and neural network track the direction in which the nose is pointing.
- Eye Tracking Accessors: An eye tracker determines where a person is looking by computing eye gaze direction. Eye-aware applications have been developed to give a person complete control of a computer using only eye movements. Word prediction techniques are also included in the keyboard software to assist the user in creating text (a small word-prediction example follows the list).
- GUI Accessors: Archimedes researchers are developing the TAS components required to make visually presented information accessible to blind and visually impaired computer users. A GUI Accessor separates textual and graphical components and presents them to the user, using optical character recognition (OCR) to recover text, which it then renders as synthesized speech or Braille (a sketch of this text pipeline also follows the list).
- ASL Accessors: Deaf and hard-of-hearing individuals are becoming increasingly disadvantaged by the growing use of speech-only interfaces in computer, Web, and telephone-based applications. The ASL Accessor project is developing a system for automatically generating high-quality sign language on computer and television displays. The ultimate goal is to provide real-time two-way communication between deaf and hearing people.
- Smart Environments: The Archimedes Project is studying smart environments as part of a larger topic of "Human Centered Interfaces to Ubiquitous Computers." Building access into the environment itself automatically addresses many of the access issues facing disabled and aging people, and at the same time improves access for everyone. Human Centered Interfaces accept inputs from and provide feedback to users in a variety of forms that closely match individual needs, abilities, and preferences.
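As referenced in the Head Tracking Accessors item, here is a minimal sketch of the underlying mapping, assuming a tracker that reports head yaw and pitch in degrees: small changes in head rotation are scaled into relative cursor movement, with a dead zone to suppress jitter. The names and gain values are illustrative, not taken from the Archimedes accessors.

```python
from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw_deg: float    # left/right rotation of the head
    pitch_deg: float  # up/down rotation of the head


def head_to_cursor(pose: HeadPose, prev: HeadPose,
                   gain: float = 25.0, dead_zone_deg: float = 0.5) -> tuple[int, int]:
    """Convert the change in head pose since the last frame into a
    relative cursor displacement (dx, dy) in pixels."""
    d_yaw = pose.yaw_deg - prev.yaw_deg
    d_pitch = pose.pitch_deg - prev.pitch_deg
    # Ignore jitter smaller than the dead zone.
    if abs(d_yaw) < dead_zone_deg:
        d_yaw = 0.0
    if abs(d_pitch) < dead_zone_deg:
        d_pitch = 0.0
    # Scale degrees of rotation to pixels: turning right moves the cursor
    # right, tilting the head down moves it down.
    return int(d_yaw * gain), int(d_pitch * gain)


# Example: a 2-degree turn to the right with a steady pitch moves the cursor ~50 px right.
prev = HeadPose(yaw_deg=0.0, pitch_deg=0.0)
now = HeadPose(yaw_deg=2.0, pitch_deg=0.1)
print(head_to_cursor(now, prev))  # (50, 0)
```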
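The word prediction mentioned under Eye Tracking Accessors can be as simple as ranking vocabulary words that match the prefix typed so far. Here is a minimal sketch with an invented toy vocabulary; a real system would rank candidates from the user's own language history.

```python
from collections import Counter

# A frequency-ranked vocabulary (illustrative counts only).
vocabulary = Counter({
    "the": 500, "they": 120, "there": 110, "then": 90,
    "access": 40, "accessor": 15, "accessible": 12,
})


def predict(prefix: str, k: int = 3) -> list[str]:
    """Return up to k vocabulary words that start with the typed prefix,
    most frequent first."""
    matches = [w for w in vocabulary if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -vocabulary[w])[:k]


print(predict("acc"))  # ['access', 'accessor', 'accessible']
print(predict("the"))  # ['the', 'they', 'there']
```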
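Finally, for the GUI Accessor text pipeline, a rough stand-in for the recover-text-then-speak path using off-the-shelf libraries (pytesseract for OCR and pyttsx3 for speech synthesis). This only illustrates the idea, not the TAS components Archimedes is building, and it omits the separation of graphical components and the Braille output.

```python
# Requires: pip install pillow pytesseract pyttsx3, plus the Tesseract OCR
# engine installed on the system.
from PIL import Image
import pytesseract
import pyttsx3


def speak_screen_text(image_path: str) -> str:
    """Recover text from a screen image and present it as synthesized speech."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()
    return text


if __name__ == "__main__":
    print(speak_screen_text("screenshot.png"))  # path is illustrative
```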