mc_rtc is an open-source application framework for robotics developed jointly by the CNRS-AIST Joint Robotics Laboratory (JRL) in Japan and the CNRS-UM Interactive Digital Humans (IDH) group in France.

It has been used extensively over the past few years and has enabled the efficient realization of many complex demonstrators:

  • Industrial applications of humanoid robots (e.g. COMANOID)
  • Humanoid car driving (e.g. DARPA Robotics Challenge)
  • Human-robot interaction on the Pepper platform (SoftBank Robotics)
  • Embodiment for service robotics (ANA Avatar XPRIZE)
  • and many others; in this tutorial, the same capabilities will be showcased using simpler robotic arms

It is now used by many teams across the world and has been deployed on many humanoid platforms (HRP robots, Nao, Pepper, REEM-C and Talos) as well as non-humanoid robots such as the Panda, KUKA and UR arms.

With the release of version 2.0, the framework is more extensible than ever, providing the tools to conduct your own robotics research on top of it.

In this tutorial we will introduce the capabilities of the framework through the implementation of an example scenario combining visual servoing, force sensing and walking. This scenario is representative of the complexity of programming humanoid robots and highlights how mc_rtc eases this process.
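
To give a sense of what programming with mc_rtc looks like, here is a minimal controller sketch following the structure of the framework's "new controller" tutorials. The class and file names are placeholders, and the exact constructor and contact API differ slightly between the 1.x and 2.0 releases, so treat this as an illustration rather than a reference implementation:

    // MyFirstController.cpp -- placeholder name, for illustration only
    #include <mc_control/mc_controller.h>
    #include <mc_rtc/logging.h>

    struct MyFirstController : public mc_control::MCController
    {
      MyFirstController(mc_rbdyn::RobotModulePtr rm, double dt)
      : mc_control::MCController(rm, dt)
      {
        // Constraints provided by the base class: joint position/velocity
        // limits and contact handling
        solver().addConstraintSet(kinematicsConstraint);
        solver().addConstraintSet(contactConstraint);
        // A posture task keeps the robot at its reference posture unless
        // other objectives pull it away
        solver().addTask(postureTask);
        mc_rtc::log::success("MyFirstController initialized");
      }

      // Called at every control step: the base implementation builds and
      // solves the task-space QP and outputs the joint commands
      bool run() override
      {
        return mc_control::MCController::run();
      }

      // Called when the controller (re)starts, with the current robot state
      void reset(const mc_control::ControllerResetData & reset_data) override
      {
        mc_control::MCController::reset(reset_data);
      }
    };

    // Makes the controller loadable by the mc_rtc interfaces
    CONTROLLER_CONSTRUCTOR("MyFirstController", MyFirstController)

The same controller can then run unchanged in simulation or on a real robot through the available mc_rtc interfaces.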

Topics

  • Optimization-based task-space QP control
  • Robotic applications programming
  • Visual servoing
  • Human-robot interaction