Live ContactNets Demo with Franka Arm

May 2022   Bianchini

At the 2022 International Conference on Robotics and Automation (ICRA) in Philadelphia, we took advantage of the conference being in our neighborhood and performed live demonstrations of our ContactNets project. The demo featured a Franka Panda robotic arm tossing a test object onto a table and learning the object's geometry just by observing its contact-rich trajectory. My collaborators are fellow Penn PhD student Mathew Halm, Penn master's student Kausik Sivakumar, and Penn faculty member Michael Posa.

Video 1:  A short video of the demonstration.


The demonstration consists of the Franka robot repeatedly tossing an unknown object while an instance of ContactNets uses the toss trajectories to train a geometry mesh and a friction parameter.  The learned object mesh is viewable in a browser window, either spinning continuously or explored interactively by panning around on the webpage.
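To give a flavor of what "learning a friction parameter from toss trajectories" means, here is a toy illustration, not the ContactNets method itself: for an object sliding on a flat table, Coulomb friction predicts a constant deceleration of μg, so μ can be recovered by fitting the slope of observed speed-versus-time data across several simulated tosses. All names and numbers below are made up for the sketch.

```python
import random
import statistics

G = 9.81        # gravitational acceleration, m/s^2
TRUE_MU = 0.30  # ground-truth friction coefficient for the toy simulation
random.seed(0)

def simulate_slide(v0, dt=0.01, noise=0.005):
    """Simulate noisy sliding-speed observations until the object stops."""
    speeds = []
    v = v0
    while v > 0:
        speeds.append(v + random.gauss(0.0, noise))
        v -= TRUE_MU * G * dt  # Coulomb friction: constant deceleration mu*g
    return speeds, dt

def slope(xs, ys):
    """Ordinary least-squares slope of ys vs. xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Collect ten "tosses" with varying initial sliding speeds.
decels = []
for i in range(10):
    speeds, dt = simulate_slide(v0=0.5 + 0.15 * i)
    ts = [dt * k for k in range(len(speeds))]
    decels.append(-slope(ts, speeds))  # deceleration estimate for this toss

# Average deceleration over tosses, divided by g, estimates mu.
mu_hat = statistics.mean(decels) / G
print(f"estimated mu = {mu_hat:.3f}")
```

The real system estimates far more than a scalar (a full contact geometry mesh alongside friction), but the same idea applies: each toss contributes noisy evidence, and pooling a handful of tosses is enough to pin down the physical parameters.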

Figure 1:  The Franka goes to pick up the object.

While more data means a higher-quality model, we are particularly interested in low-data regimes.  With our novel training setup (see our L4DC 2022 paper and my RSS 2022 presentation for details), we see good convergence of the object mesh and friction parameter after around 10-20 tosses.


Physical setup

Figure 2:  The physical setup of the demo.

Many of the necessary components for running the demo are pictured above.  Note that while our example object is a cube, any convex shape would work as long as we configure TagSLAM to track the new object.  Unpictured components include:

  • the computers (noted in the network connections diagram),
  • the Franka control box (noted in the network connections diagram), and
  • the camera calibration board.