Context

Neuromorphic technology in robotics, such as event cameras and neuromorphic processors, has advanced significantly over the last three decades. Despite breakthroughs in hardware and algorithms, the adoption of neuromorphic technology in real-world robotic systems remains limited. Key challenges include real-time processing, managing high data rates, learning representations, and integration with control and planning frameworks. This workshop aims to bridge the gap between the neuromorphic and robotics communities, encouraging cross-disciplinary dialogue and fostering the adoption of event-based sensing and control in field robotics and automation. Key topics include event representations, event cameras in field robotics, opportunities compared to conventional image sensors, hardware and software architectures for neuromorphic processing, and event-based simultaneous localization and mapping (SLAM). We will emphasize both academic progress and industrial applications, with a focus on challenges, opportunities, and lessons from experimentation and field deployment. Attendees will gain insight into the current capabilities and limitations of the technology, as well as open challenges in the field. We aim to make the workshop a forum where roboticists can gain exposure to the technology and apply it to their own research domains.

Call for Papers

Important dates

Submission deadline: March 23, 2026 (AoE).
Notification to Authors: April 15, 2026.
Workshop: June 5, 2026.

The workshop "Challenges and Opportunities of Neuromorphic Field Robotics and Automation" seeks contributions at the intersection of neuromorphic hardware and event-based perception targeting real-world robotics applications. Late-breaking and preliminary results are encouraged in areas including, but not limited to:

  • Event camera motion and object segmentation, UAV detection
  • Event-based tracking and optical flow estimation
  • Depth estimation
  • Event-based navigation
  • Generative modeling of neuromorphic sensors
  • Foundational representations for multi-task perception and autonomy
  • Robust odometry and event-based SLAM in challenging conditions
  • Computational imaging for event cameras
  • Hardware and real-time implementations of neuromorphic software on FPGAs and GPUs
  • Sensor fusion
  • High-fidelity event simulation

Submissions should be two to eight pages long including references, with a total file size under 10 MB, and follow the IEEE Manuscript Template format. Reviewing will be single-blind: each paper will be reviewed by the organizing committee and at least two specialized reviewers.

Please use CMT to submit your contribution.

Invited Speakers

Contact

For any questions, please contact us at fclad[at]seas.upenn.edu.

Credits

The Microsoft CMT service was used for managing the peer-reviewing process for this workshop. This service was provided for free by Microsoft and they bore all expenses, including costs for Azure cloud services as well as for software development and support.

This site was built with Hugo, using the template Hugo Story, ported from Story.