Fish-Bird: Realisation

Previous: Conceptual Foundation

Under the seat: Batteries, electronics and control computers. About 2/3 of the volume is filled with batteries.

 

Many aspects of the system design were strongly influenced by the desire to conceal the underlying technological apparatus. It should not be obvious to a spectator/participant how a wheelchair moves; this promotes rapid engagement with the work and focusses attention on the form of the interactive movement. As a consequence of this conceptual and ideological consideration, standard electric wheelchairs could not form the basis of the autokinetic objects in the artwork. The wheelchairs, together with all associated electronics and software, were custom-designed for the project.

Apart from the front wheels and rear rims, the entire wheelchair is custom-built. The tubing that forms the chassis was carefully shaped to give the impression of a hospital wheelchair. Dimensions of the structural elements were freely adapted to suit the requirements of other components whilst maintaining a strong visual impression of a stereotypical wheelchair.

The steel tubing and fabricated parts were satin-chromed to unify them visually. Seat cushions were upholstered in a synthetic fabric that has a discreet geometric self-pattern and a pronounced metallic sheen. These finishes were chosen to distinguish the chairs as designed objects that exist in a space outside of the hospital or nursing-home environment where one might expect to encounter them.

All power storage, electronics and computing are concealed within the seat of the wheelchair. Cables for motor current and encoder phase signals were routed inside the frame tubes. The motors and gearboxes are concealed by a trim tube that runs the full width of the wheelchair.

Motion Control Subsystem

 

PIC18F452 microcontroller board.

 

Closeup of the under-seat electronics.

Power is provided by two nickel-metal hydride battery packs with a total capacity of 600 A·h, which occupy about two-thirds of the under-seat volume. Two standalone motion controllers drive the rear wheels via DC motors and reduction gearboxes. Each motion controller is fitted with computer-controlled relays that allow the motors to be disconnected from the drives.

The small front wheels are free to roll and caster – neither rotation is measured. This choice has proven to be adequate, as the front wheels are quite lightly loaded and do not substantially complicate the wheelchair dynamics.

Power and size considerations dominate the design of onboard electronics and computing. Onboard computing is therefore restricted to two custom-designed PIC18F452 microcontroller-based motion control boards.

All data communication with the wheelchairs is through Class 2 Bluetooth 1.1 transceiver modules, which give line-of-sight data rates of up to 723 kbit/s at distances of up to 100 m. Bluetooth was selected for its low power consumption. Each motion control board has a dedicated transceiver, allowing computationally intensive tasks such as wheelchair trajectory generation to be placed off-board.

Code that executes on the motion control boards is written in C and uses the Salvo cooperative real-time kernel for task scheduling. Four main tasks run in parallel: a parser/dispatcher for messages on the Bluetooth radio link, the motion controller itself, a task that monitors the infrared proximity sensors for imminent collisions, and a software ‘heartbeat’ that notifies the off-board installation controller of the board’s health.

 

System Architecture

The diagram to the right gives an overview of the system architecture. The motion of the two robots is coordinated by an installation controller that monitors the estimated positions of the robots and the people within the space. It initiates transitions between various behavioural states in response to a variety of events, such as people entering or leaving the space, approaching the robots, or simply standing and observing. Motion commands, as a series of waypoints, are sent to the pilot module, which issues commands to the robots via the Bluetooth interface.

System architecture diagram.

Scanning laser sensor.

Firewire camera.

Most of the sensing for the system is mounted off-board. This minimizes the requirement for power storage on the wheelchairs, and allows a much wider variety of sensors to be used for tracking human and robot participants in the installation space.

In the third-stage implementation, two scanning laser sensors are concealed on the perimeter of the space and provide range and bearing observations of targets moving within the space. Cameras mounted in the ceiling also report observations of moving objects within their fields of view. Laser and camera observations are sent to the installation controller, where a series of Kalman filters is used to estimate the current state of the system. Communication between the various modules in the system is based on the Active Sensor Networks architecture. Once the wheelchair tracks have been acquired, tracking is maintained using both the wheel encoder and sensor observations.

 

 

Behavioural Scripting

Many robotic systems are commanded and controlled using a combination of scripting and reasoning systems. The behaviour of each robot in the Fish-Bird system is controlled through a finite state machine (FSM) containing a number of discrete states. Each state corresponds to a behavioural primitive, or action, such as ‘sleep’, ‘talk’, ‘gaze’, ‘follow’, and so on. Transitions between the states are handled by the behavioural engine; both the conditions that cause state transitions and the transition target states are specified by a scripting language.

The state-based, non-blocking scripting language devised for the project facilitates composition of system behaviours from behavioural primitives. That is, it provides a high-level compositional interface to the robots. This procedural language allows complex interaction with audience participants to be encoded, and behaviours to be implemented without changing or rebuilding the code base of the system. By specifying the conditions that trigger state transitions of the robot’s FSM, ‘stage directions’ can be given to the robots, readily creating complex behaviour patterns.

Next: Interdisciplinary Collaboration
