
IEEE Robotics & Automation Magazine


Digital Robot Judge: Building a Task-Centric Performance Database of Real-World Manipulation With Electronic Task Boards

December 12, 2024 by Peter So, Andriy Sarabakha, Fan Wu, Utku Culha, Fares J. Abu-Dakka, Sami Haddadin

Robotics aims to develop manipulation skills approaching human performance. However, skill complexity is often over- or underestimated based on individual experience, and the real-world performance gap is difficult or expensive to measure through in-person competitions. To bridge this gap, we propose a compact, internet-connected electronic task board to measure manipulation performance remotely; we call it the digital robot judge, or “DR.J.” By detecting key events on the board through performance circuitry, DR.J provides an alternative to transporting equipment to in-person competitions and serves as a portable test and data-generation system that captures and grades performances, making comparisons less expensive. Collected data are automatically published on a web dashboard that provides a living performance benchmark, visualizing improvements in the real-world manipulation skills of robot platforms over time and across the globe. In this paper, we share the results of a proof-of-concept electronic task board with industry-inspired tasks, used in an international competition in 2021 and 2022 to benchmark localization, insertion, and disassembly tasks. We present data from 10 DR.J task boards and a method to derive Relative Task Complexity (RTC) from timing data, and we compare robot solutions with a human performer. In the best case, robots performed 9× faster than humans on specialized tasks but achieved only 16% of human speed across the full set of tasks. Finally, we present the modular design, instructions, and software needed to replicate the electronic task board or adapt it to new use cases, promoting task-centric benchmarking.
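The summary above mentions deriving Relative Task Complexity (RTC) from the timing data logged by the task boards and comparing robot solutions against a human performer, but it does not spell out the computation. Below is a minimal, hypothetical sketch of how per-task completion times could be compared in this spirit; the task names, the timings, and the ratio-based complexity measure are illustrative assumptions chosen only to echo the headline figures, not the authors' published RTC method.

# Hypothetical sketch only: comparing robot and human completion times per task.
# Task names, timings, and the ratio-based measure below are illustrative
# assumptions, not the RTC definition from the paper.

# Per-task completion times in seconds, as a task board's event circuitry
# might report them (start/stop timestamps reduced to durations).
human_times = {"localize": 9.0, "insert": 8.0, "disassemble": 30.0}
robot_times = {"localize": 1.0, "insert": 70.0, "disassemble": 223.0}

def relative_task_complexity(robot: dict, reference: dict) -> dict:
    """Ratio of robot time to reference time per task; values above 1.0
    indicate tasks that are relatively harder for the robot."""
    return {task: robot[task] / reference[task] for task in reference}

rtc = relative_task_complexity(robot_times, human_times)
overall_speed = sum(human_times.values()) / sum(robot_times.values())

print(rtc)                                            # per-task time ratios
print(f"robot speed vs. human: {overall_speed:.0%}")  # fraction of human speed

With these made-up numbers the robot is 9× faster on the localization task but reaches only about 16% of human speed over the full task set, which is the shape of comparison the article reports.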

For more about this article, see: https://ieeexplore.ieee.org/document/10378967


Filed Under: Past Features Tagged With: Automation, Benchmark testing, Protocols, Robot sensing systems, Robots, Service robots, Task analysis

