Robot Magic and Music: Humanoid Application Challenge

Robots and humans jamming together

In 2019, we signed up to participate in the Humanoid Application Challenge at the international conference IROS. That year's competition was about magic and music. The sky was the limit regarding what could be achieved, but it had to be highly technical, fun, and a wow experience for judges and spectators. Team Snobots had always competed with magic acts and done very well, even winning a robot in a previous year. This time, with music added to the challenge and most of the team members being musical in one way or another, we decided to take that route instead of magic.

In a previous competition, we had managed to have our two robots share the same code, so we were excited to have them play together with us as a human-and-robot band. Mario, one of our lab members and a drummer, made the motions for our big robot to play the drums and… oh my goodness, the result was the real magic for us. As soon as we saw it, we knew we were ready; we didn't yet know how to do the rest, but we were so energized and inspired. We posted our qualification video and people were impressed.

The real challenge began. Music is hard to program because it demands precision. Robots can be reasonably accurate, but each motion carries a margin of error of a few milliseconds, and those errors accumulate into noticeable drift after playing for a while. Adding a second robot to the scene made it even harder. In the end, we figured it out: UDP messages between the two robots to keep each other updated on which part of the song they are in, an algorithm that measures how long each motion actually took and speeds up or slows down the next ones to compensate, and a timer/calculator that converts the song's BPM into how long each motion should take. A rough sketch of the idea follows.
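Here is a minimal Python sketch of that idea. To be clear, this is not our competition code: the BPM, the peer address, the `play_motion` placeholder and the exact correction rule are all made-up stand-ins, just to show how a BPM budget, a per-motion timing correction and a UDP "which section are we in" message can fit together.

```python
import socket
import time

# --- Timing: convert the song's BPM into a per-motion time budget (made-up values) ---
BPM = 96                        # assumed tempo of the song
BEAT_SEC = 60.0 / BPM           # duration of one beat in seconds
BEATS_PER_MOTION = 2            # e.g. a drum motion spanning two beats
MOTION_BUDGET = BEATS_PER_MOTION * BEAT_SEC

# --- Sync: tell the other robot which part of the song we are in, over UDP ---
PEER_ADDR = ("192.168.1.42", 5005)   # placeholder address/port for the second robot
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def announce_section(section_index: int) -> None:
    """Send the current song section so the other robot can resynchronize."""
    sock.sendto(str(section_index).encode(), PEER_ADDR)

def play_motion(speed_scale: float) -> None:
    """Placeholder for the motion engine; speed_scale > 1 means play faster."""
    time.sleep(MOTION_BUDGET / speed_scale)

def play_section(section_index: int, n_motions: int) -> None:
    speed_scale = 1.0
    announce_section(section_index)
    for _ in range(n_motions):
        start = time.perf_counter()
        play_motion(speed_scale)
        elapsed = time.perf_counter() - start
        # If the motion ran long, speed the next one up (and vice versa),
        # so small per-motion errors do not accumulate into drift.
        speed_scale *= elapsed / MOTION_BUDGET

if __name__ == "__main__":
    for section in range(4):          # four sections in this toy song
        play_section(section, n_motions=8)
```

In this sketch the other robot would simply listen on the same port with `recvfrom` and jump to the announced section whenever it has fallen behind.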

Along the way, I learned more about OpenCV, visual servoing, inverse kinematics, UDP connections, human-robot interaction and music theory, not to mention how to be a good showman and interact with the robot through some text-to-speech. For more information about my version of visual servoing, you can take a look at my GitHub repo; a generic sketch of the idea is below.
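For readers who have never seen visual servoing, here is a generic illustration (not the version in my repo): segment a coloured target with OpenCV, find its centroid, and feed the pixel offset from the image centre into a proportional controller for the robot's head. The HSV range, the gain and the camera index are invented for the example.

```python
import cv2
import numpy as np

K_P = 0.002                                  # proportional gain (pixels -> joint command), illustrative
LOWER_HSV = np.array([100, 120, 70])         # rough "blue target" range, for illustration only
UPPER_HSV = np.array([130, 255, 255])

def servo_error(frame):
    """Return the (dx, dy) pixel offset of the target from the image centre, or None if not seen."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return None                          # target not visible
    cx = moments["m10"] / moments["m00"]
    cy = moments["m01"] / moments["m00"]
    h, w = frame.shape[:2]
    return cx - w / 2, cy - h / 2

cap = cv2.VideoCapture(0)                    # camera index is a placeholder
while True:
    ok, frame = cap.read()
    if not ok:
        break
    err = servo_error(frame)
    if err is not None:
        pan_cmd, tilt_cmd = -K_P * err[0], -K_P * err[1]
        # here the commands would be sent to the robot's head joints
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```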

The trip to Macau was long and exciting. When we arrived, we ran into some hardware issues, which made me nervous come performance time. We decided to play two simple songs (not counting the “sound checks” that would show off our work with visual servoing and inverse kinematics). One of our songs was in Cantonese, and as soon as we played it, a lot of people came into the room, since it is a very popular song in Macau and Hong Kong. It was really touching.

In the end, we got 2nd place among six teams! The prize was a robotic hand sponsored by Seed Robotics. Here is a video of the show (I sometimes cringe at how nervous I was, but I would still do it all over again).

Here are some solo videos of each part of the performance:
Dr. Meng Cheng Lau (Postdoctoral Fellow), Chris Melendez (me)

An interesting video highlight where my professor talked for a bit… and I got a cameo!