Robocrane® is a six-degree-of-freedom work platform suspended by cables driven by computer-controlled winches. A PC-based Enhanced Machine Controller (EMC), built on the Real-Time Control System (RCS) reference model open-system architecture, is being designed to control this system. The controller will have standard interfaces between functional modules for servo control, trajectory generation, constrained-motion modes, discrete-event logic, task sequencing, and an operator interface. The operator interface incorporates joystick control, a graphics programming environment, and a telepresence vision system with virtual-reality displays that facilitate remote control by human operators.
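The RCS reference model organizes the controller as a hierarchy of modules, each accepting commands from above, executing cyclically, and reporting status upward. The following is a minimal sketch of that module pattern in Python; all names are hypothetical (the actual EMC interfaces are defined in C/C++), and the control logic is a placeholder.

```python
# Sketch of an RCS-style control module: a cyclic node that accepts a
# command, executes it over control cycles, and exposes status upward.
# Hypothetical names and structure, for illustration only.

from dataclasses import dataclass, field


@dataclass
class Command:
    name: str
    params: dict = field(default_factory=dict)


class RcsModule:
    """One node in the RCS hierarchy (e.g. servo, trajectory, task)."""

    def __init__(self, name: str):
        self.name = name
        self.command = None
        self.status = "IDLE"        # reported to the level above

    def accept(self, cmd: Command):
        """Receive a command from the superior module."""
        self.command = cmd
        self.status = "EXECUTING"

    def cycle(self):
        """One control cycle: process the current command, update status."""
        if self.command is not None:
            # ... module-specific control law would run here ...
            self.status = "DONE"


servo = RcsModule("servo")
servo.accept(Command("MOVE_JOINT", {"axis": 0, "target": 1.57}))
servo.cycle()
print(servo.status)  # DONE
```

In a full hierarchy, a trajectory-generation module would issue such commands to the servo module each cycle, and itself report status to the task-sequencing level above it.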
Construction of this system will lead to more efficient systems for
users, greater market opportunities for equipment manufacturers, and lower
costs due to greater competition between suppliers. Interface standards
will mean easier, faster, and more robust integration of subsystems, less
expensive customization of intelligent machine systems used in multiple
applications, and lower costs for spare parts and system upgrades.
For more information on the ROBOCRANE® project see: ROBOCRANE® Project
The goal of the Tetrahedral Robotic Apparatus (TETRA) project is to develop a new class of large-scale cargo manipulators, based on Stewart platform principles, that augment existing cranes or structures to improve the safety, efficiency, and versatility of crane operations. The system also serves as a hardware testbed for continued development of an open-architecture Real-time Control System (RCS).
The principles of the Stewart platform parallel link manipulator will
be applied, using winches and cables as the links, to statically position
crane cargo in all six degrees of freedom. Moreover, it will be used to
study dynamic compensation of crane cargo. The concepts of active, real-time
control based on cable tensions and platform position to compensate for
dynamic perturbations of the cargo will be implemented. Adaptive control
techniques will incorporate a variety of sensors, such as tactile, proximity,
and vision systems, into the controller. These enhancements will lead
to stable crane operations, autonomous cargo
handling capabilities and further development of the RCS controller.
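Statically positioning cargo in six degrees of freedom with a Stewart-platform cable arrangement reduces, at the kinematic level, to computing each cable's length as the distance from its fixed upper anchor to the corresponding attachment point on the posed platform. The sketch below illustrates that inverse-kinematic step; the anchor geometry and the yaw-only rotation are illustrative assumptions, not the TETRA hardware layout.

```python
# Illustrative inverse kinematics for a cable-suspended Stewart platform:
# given a desired platform pose, each cable length is the distance from
# its fixed base anchor to the platform attachment point in world frame.
# Geometry below is hypothetical.

import math


def rot_z(yaw):
    """Rotation matrix for a yaw-only platform orientation (simplified)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]


def apply(R, v):
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]


def cable_lengths(base_pts, plat_pts, position, yaw):
    """Cable lengths that place the platform at `position` with heading `yaw`."""
    R = rot_z(yaw)
    lengths = []
    for b, p in zip(base_pts, plat_pts):
        # Platform attachment point expressed in world coordinates.
        world = [position[k] + apply(R, p)[k] for k in range(3)]
        lengths.append(math.dist(b, world))
    return lengths


# Six hypothetical base anchors (overhead) and platform attachment points.
base = [(0, 0, 10), (1, 0, 10), (9, 0, 10), (10, 0, 10), (5, 8, 10), (4, 8, 10)]
plat = [(-0.5, -0.5, 0), (0.5, -0.5, 0), (0.4, -0.5, 0),
        (0.6, -0.4, 0), (0.1, 0.6, 0), (-0.1, 0.6, 0)]

print([round(L, 2) for L in cable_lengths(base, plat, (4.0, 3.0, 2.0), 0.1)])
```

Commanding the winches to these six lengths fixes the pose; the dynamic-compensation work described above would then modulate cable tensions about this static solution.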
It is expected that enhancement of the open architecture RCS and its application to parallel link manipulators will advance controller development and possibly lead to standards in PC based machine controllers.
For more information on the TETRA project see: ROBOCRANE® Project
The NGIS testbed consists of a coordinate measuring machine (CMM), advanced sensors, and the NIST Real Time Control System (RCS) open architecture controller. The advanced sensors include analog touch probes, a video camera, an analog capacitance probe, and a laser triangulation probe. The RCS controller permits real-time processing of sensor data for feedback control of the inspection probe. The controller also permits integration of a video camera for part feature recognition and for vision-guided motion control of the inspection probe. The controller will provide interfaces to CAD models and to the Dimensional Measuring Interface Standard to allow inspection that is driven by model data.
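Real-time feedback control of the probe can be illustrated with the simplest case: servoing a measured standoff gap to a setpoint with a proportional law. This is a toy sketch with a simulated axis, not NIST code; the function names and gains are assumptions.

```python
# Toy sketch of sensor-feedback probe control: a proportional controller
# drives the measured standoff gap toward a target using an analog probe
# reading. The probe and axis are simulated; names are hypothetical.

def servo_standoff(read_gap, move_by, target=0.5, gain=0.8,
                   cycles=50, tol=1e-3):
    """Drive the measured gap toward `target`; return the final gap."""
    for _ in range(cycles):
        error = target - read_gap()
        if abs(error) < tol:
            break
        move_by(gain * error)       # command a corrective probe motion
    return read_gap()


# Simulated axis: moving the probe by dz changes the gap by dz.
state = {"gap": 2.0}
final = servo_standoff(lambda: state["gap"],
                       lambda dz: state.__setitem__("gap", state["gap"] + dz))
print(round(final, 3))  # converges near the 0.5 target
```

Each cycle shrinks the error by a constant factor (here 1 - gain), which is why the loop settles within a few iterations; the real controller closes this loop at a fixed servo rate on live probe data.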
For more information on the NGIS project see: NGIS API.
In recent years, the concept of an Autonomous Highway System (AHS) has gained considerable attention because it promises numerous benefits to society, including safer highways and higher highway throughput. Such a project requires the design and implementation of intelligent vehicles. In this project, NIST is developing an intelligent perception and vehicle navigation control system using the ISD Real-time Control System (RCS), an open-system reference architecture.
RCS provides a systematic analysis, design, and implementation methodology for developing real-time, sensor-based control systems. Functional task execution is viewed hierarchically, with motor-skill functions, such as steering and speed control, performed at lower levels and coordinated actions between vehicles performed at higher levels. The control system uses sensory information to guide the intelligent vehicle in the execution of complex tasks. Planning for task execution and for adaptation to changes in the environment is also part of the total hierarchy. Active and passive vision are the primary sensors for dynamic image perception analysis during navigation. Other sensors, such as accelerometers, Inertial Navigation Systems (INS), and Differential Global Positioning System (DGPS) receivers, measure vehicle motion through the environment and provide precise localization of vehicles, targets, obstacles, and terrain features on a map database.
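The hierarchical decomposition described above can be sketched as a two-level loop: a higher level manages the task (which waypoint to pursue), while a lower level performs the steering/speed "motor skill" that servos toward it. This is a deliberately simplified illustration, not the ISD implementation; all names and the unit-step vehicle model are assumptions.

```python
# Two-level RCS-style hierarchy sketch: task level selects the current
# waypoint goal; motor-skill level steers toward it one cycle at a time.
# Hypothetical example with a unit-step vehicle model.

import math


def lower_level(pose, goal, speed=1.0):
    """Motor-skill level: one steering/speed step toward the goal."""
    heading = math.atan2(goal[1] - pose[1], goal[0] - pose[0])
    return (pose[0] + speed * math.cos(heading),
            pose[1] + speed * math.sin(heading))


def higher_level(pose, waypoints, reach=0.5):
    """Task level: pick the current waypoint, dropping reached ones."""
    while waypoints and math.dist(pose, waypoints[0]) < reach:
        waypoints.pop(0)
    return waypoints[0] if waypoints else None


pose, plan = (0.0, 0.0), [(3.0, 0.0), (3.0, 4.0)]
for _ in range(20):                 # fixed-rate control cycles
    goal = higher_level(pose, plan)
    if goal is None:                # task complete
        break
    pose = lower_level(pose, goal)
print(tuple(round(c, 1) for c in pose))  # → (3.0, 4.0)
```

In the real system each level would also fold in its own sensory processing (vision at the perception levels, INS/DGPS at the localization level) rather than reading ground-truth poses as this toy loop does.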
For more information on the Intelligent Autonomous Vehicles project see: Intelligent Control of Mobility Systems.
The projects briefly described above are only a partial list of the current RCS applications and projects in the NIST Intelligent Systems Division. For more information about these projects, or any other NIST ISD projects, please visit the ISD Project Homepages web site.