Project Profile


Goals of the project

The power of robotics lies in creating autonomous robots whose behaviors respond to their environment. This course project is a study in creating such an autonomous behavior. The Acrobot is an underactuated robot that senses its own position and responds by swinging its arm back and forth to excite its body dynamics into expressive movements. The Acrobot performs its behavior autonomously but also responds to the touch of human interaction. It even exhibits a human-like trait: it gets “tired” and requires occasional breaks.

As the project evolved, we also developed a companion robot which coordinates with the lead robot via wireless communication. In a choreographed dance, the leader Acrobot guides the follower Acrobot in its movement, creating a mesmerizing visual display. The two machines have similar but not identical structure, so the performance is not perfectly synced and the follower tends to step on the leader's toes. When they do sync up, the results are fascinating to watch.
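The leader-to-follower coordination can be sketched as a small framed message sent over the wireless serial link. The frame layout, field names, and checksum below are illustrative assumptions, not the actual protocol from the project:

```cpp
#include <cstdint>

// Hypothetical 4-byte frame the leader might broadcast each control tick:
// a start marker, the commanded arm angle, and a simple XOR checksum.
struct LeaderMsg {
    uint8_t start;    // fixed 0xA5 frame marker
    int16_t angle_cd; // arm angle in hundredths of a degree
    uint8_t checksum; // XOR of the two angle bytes
};

uint8_t frame_checksum(int16_t angle_cd) {
    return static_cast<uint8_t>((angle_cd & 0xFF) ^ ((angle_cd >> 8) & 0xFF));
}

LeaderMsg encode(int16_t angle_cd) {
    return LeaderMsg{0xA5, angle_cd, frame_checksum(angle_cd)};
}

// Returns true and writes the angle only if the frame arrived intact;
// a corrupted frame is simply dropped, so the follower coasts on its
// last good command rather than twitching on noise.
bool decode(const LeaderMsg& m, int16_t& angle_out) {
    if (m.start != 0xA5 || m.checksum != frame_checksum(m.angle_cd)) return false;
    angle_out = m.angle_cd;
    return true;
}
```

Dropping bad frames instead of correcting them keeps the protocol simple, which matters when the two bodies are already expected to drift out of sync.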

Nature of the Collaboration

This project was iteratively developed as a series of assignments for the new “Introduction to Physical Computing” IDeATe course in the Fall 2014 semester. The nature of the course is that everybody is tasked to learn new skills outside their expertise, to teach their own knowledge, and to become effective collaborators by learning the language and approach of other areas.

The concept was developed mutually, then each stage of hardware and software development proceeded in tandem. Brian has more fabrication experience, and Luke more software experience, but all problems were solved together. The central problem of formulating the control equations was entirely new to both of us and was approached as a collaborative empirical problem.

Skills

Making the robots required a combination of skills in mechanical design, embedded programming, wireless communication, Arduino programming, and Kinect SDK programming. Building the structures required designing low-friction joints, mounting actuators, and fabricating custom laser-cut parts. The software skills for the robots included real-time sensor processing, dynamic control programming, state-machine design, and developing an ad hoc communication protocol. The final iteration also included writing a Windows program to send Kinect data wirelessly to the robots.
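The state-machine design mentioned above can be illustrated with the "tiredness" behavior: swing for a while, then rest. The mode names, timing thresholds, and structure below are a minimal sketch, not the course code:

```cpp
#include <cstdint>

// Illustrative two-state behavior machine: the robot swings until it is
// "tired", rests for a while, then resumes. Thresholds are assumptions.
enum class Mode { SWINGING, RESTING };

struct Behavior {
    Mode mode = Mode::SWINGING;
    uint32_t active_ms = 0;   // time spent in the current swinging bout
    uint32_t rest_ms = 0;     // time spent in the current rest

    static constexpr uint32_t TIRED_AFTER_MS = 30000; // swing for 30 s...
    static constexpr uint32_t REST_FOR_MS    = 10000; // ...then rest for 10 s

    // Called once per control tick with the elapsed milliseconds;
    // returns the mode the motor loop should obey this tick.
    Mode update(uint32_t dt_ms) {
        if (mode == Mode::SWINGING) {
            active_ms += dt_ms;
            if (active_ms >= TIRED_AFTER_MS) { mode = Mode::RESTING; rest_ms = 0; }
        } else {
            rest_ms += dt_ms;
            if (rest_ms >= REST_FOR_MS) { mode = Mode::SWINGING; active_ms = 0; }
        }
        return mode;
    }
};
```

Keeping the behavior logic separate from the motor control loop like this makes each state easy to test on its own.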

Tools

The structures were built from a combination of OpenBeam extrusions and custom laser-cut parts made using the course facilities. The primary controller was an Arduino, with wireless communication implemented using Wixel modules. The motion sensing used an accelerometer for inertial sensing and a Hall effect sensor for rotation counting. The final version also used a Kinect to sense a human interacting via body gesture.
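As one example of how the inertial sensing can feed the control loop, a two-axis accelerometer mounted in the plane of rotation yields a tilt angle directly from gravity's projection onto its axes. The axis convention here is an assumption for illustration:

```cpp
#include <cmath>

// Illustrative only: recovering a body tilt angle from raw two-axis
// accelerometer readings. With the sensor in the plane of rotation,
// gravity's projection onto the two axes encodes the tilt.
double tilt_from_accel(double ax, double ay) {
    return std::atan2(ay, ax); // radians; 0 when ax aligns with gravity
}
```

In practice such a reading is noisy while the body accelerates, so it would typically be filtered before being used for feedback.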

Process

The initial prompt was to design a mechanism with a single controlled axis, and we used Mark Spong's Acrobot research as an inspirational starting point for our first prototype. Our approach was to build prototypes rapidly using aluminum extrusion and standard components in order to focus on co-developing the control software with the hardware. We discovered the potential for inertial sensor feedback to produce unstable control with expressive oscillations, and used that as the basis for the behavior architecture. All in all, we developed the idea through four main iterations, culminating in a human-scale device controlled via body gestures sensed using a Kinect 3D camera.
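The "unstable control with expressive oscillations" can be demonstrated in miniature: feeding the sensed angular velocity back into the drive torque in phase pumps energy into the swing, so the amplitude grows rather than settles. This toy pendulum simulation is a sketch of the principle, not the project's controller:

```cpp
#include <cmath>

// Simulate a simple pendulum driven by a torque proportional to its own
// angular velocity (in-phase, destabilizing feedback) and report the peak
// swing amplitude. gain = 0 gives the undriven baseline.
double max_amplitude(double gain, int steps) {
    const double dt = 0.001;       // 1 ms steps (semi-implicit Euler)
    const double g_over_l = 9.81;  // 1 m pendulum
    double theta = 0.05, omega = 0.0, peak = 0.0;
    for (int i = 0; i < steps; ++i) {
        double torque = gain * omega;  // velocity feedback pumps energy in
        omega += (-g_over_l * std::sin(theta) + torque) * dt;
        theta += omega * dt;
        if (std::fabs(theta) > peak) peak = std::fabs(theta);
    }
    return peak;
}
```

With zero gain the swing stays near its starting amplitude; with positive velocity feedback it grows each cycle, which is exactly the kind of self-exciting motion we shaped into the robot's behavior.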

Milestones

The first iteration included large passive wheels below a body with the dynamic arm. However, once the interesting unstable behaviors were discovered, the second iteration replaced the wheels with a pivot on a stable structure to focus attention on the dynamics of the arm and body. The success of this version prompted construction of a companion robot in the third iteration to explore the effects of mapping the dynamics of one body onto a similar but non-identical robot. The fourth version involved constructing an entirely new floor-standing machine to explore how the motion changed from tabletop scale to human scale.

Challenges encountered

There were many mechanical design challenges even in building a simple machine, and the project offered a taste of the complexity of designing control strategies for dynamic machines. We lacked the formal mathematical theory for developing a controller, so we relied instead on intuition and observation. A more theoretically grounded control strategy could have produced specific commanded movements, but the key lesson is that by taking advantage of the observed dynamics we were able to pivot the project direction and structure interesting behaviors around an empirical solution.

Major outcomes


Innovations, impact and successes

~ ~ ~