Overview:

Many industrial tasks today require machines to be flexible, i.e., they should be able to
  • 1) understand changing environments and tasks;
  • 2) generate corresponding actions in real time.
For example, in machine tending, the manipulator must generate actions based on the real-time configuration of the materials, which therefore needs to be perceived online. Hence, it is important to enable real-time perception-action loops for these intelligent manipulators in flexible tasks. However, it remains challenging to configure and adapt these loops optimally and efficiently under changing environments and tasks: the optimal configuration may vary significantly from one task or environment to another, e.g., in the mounting location of the camera, the camera focus, and the update frequency of the perception-action loop. Moreover, hardware platforms also vary, so a configuration that is optimal on one platform may not be optimal on another. To ensure optimality and consistency across platforms, we will develop a task-agnostic few-shot learning method (see the sketch after this list) that can
  • 1) automatically calibrate the perception-action loop to optimize user-specified objectives (e.g., minimizing cycle time, maximizing the task success rate);
  • 2) monitor the system in real time and adapt it when the environment changes, so as to maintain optimality.
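One way to read points 1) and 2) is as an outer calibration loop plus an online monitor wrapped around the perception-action loop. The Python sketch below illustrates that structure only; everything in it is an illustrative assumption rather than the project's actual method or interfaces: the configuration space (camera focus and loop update rate; the camera mounting pose is omitted for brevity), the synthetic cost standing in for measured cycle time and success rate, and the names LoopConfig, sample_config, run_trial, calibrate, and monitor_and_adapt. Plain random search stands in for the few-shot learner.

  import random
  from dataclasses import dataclass

  @dataclass
  class LoopConfig:
      focus_mm: float    # camera focus distance (hypothetical parameter)
      update_hz: float   # loop update rate (hypothetical parameter)

  def sample_config(rng: random.Random) -> LoopConfig:
      # Draw a candidate configuration from assumed bounds.
      return LoopConfig(focus_mm=rng.uniform(100.0, 600.0),
                        update_hz=rng.uniform(5.0, 60.0))

  def run_trial(cfg: LoopConfig, rng: random.Random,
                focus_opt: float = 350.0) -> float:
      # Stand-in for one task episode on real hardware: returns a scalar
      # cost that would normally combine measured cycle time and failure
      # rate. Here it is a noisy synthetic function; focus_opt models the
      # (unknown) environment, and shifting it simulates an environment
      # change.
      cost = ((cfg.focus_mm - focus_opt) / 250.0) ** 2 \
           + ((cfg.update_hz - 30.0) / 25.0) ** 2
      return max(0.0, cost + rng.gauss(0.0, 0.05))

  def calibrate(budget: int, rng: random.Random,
                focus_opt: float = 350.0) -> LoopConfig:
      # Few-shot calibration under a small trial budget. Random search is
      # only a placeholder for the learned few-shot method.
      best_cfg, best_cost = None, float("inf")
      for _ in range(budget):
          cfg = sample_config(rng)
          cost = run_trial(cfg, rng, focus_opt)
          if cost < best_cost:
              best_cfg, best_cost = cfg, cost
      return best_cfg

  def monitor_and_adapt(cfg: LoopConfig, rng: random.Random,
                        window: int = 10,
                        threshold: float = 0.3) -> LoopConfig:
      # Online monitoring: when the running average of the cost degrades
      # past a threshold, re-trigger few-shot calibration.
      recent = []
      for episode in range(100):
          focus_opt = 350.0 if episode < 50 else 550.0  # simulated change
          recent = (recent + [run_trial(cfg, rng, focus_opt)])[-window:]
          if len(recent) == window and sum(recent) / window > threshold:
              print(f"episode {episode}: degradation detected, recalibrating")
              cfg = calibrate(budget=20, rng=rng, focus_opt=focus_opt)
              recent = []
      return cfg

  if __name__ == "__main__":
      rng = random.Random(0)
      cfg = calibrate(budget=20, rng=rng)
      print("calibrated config:", cfg)
      monitor_and_adapt(cfg, rng)

On real hardware, run_trial would execute a tending episode and return the measured objectives, and the random search inside calibrate would be replaced by the few-shot learner; the calibrate-then-monitor structure stays the same.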

Research Topics:

    Real-time industrial robot teleoperation

    Safe human-robot interaction

    Multimodal AR environment

    Humanoid robot teleoperation

    Sponsor: Siemens

    Period of Performance: 2021 – present

    Point of Contact: Yifan Sun