Project: Generative Object Detection Models for Mobile Robotics Applications

Description

In the dynamic landscape of mobile robotics, object detection remains a foundational challenge, critical for enabling machines to interact intelligently with their surroundings. At Avular, a pioneering mobile robotics company in Eindhoven, we are excited to explore novel approaches in this domain. This project aims to develop generative object detection models that go beyond the limitations of traditional discriminative methods, offering fresh perspectives and enhanced capabilities for robotic perception.

Throughout the course of this thesis, your investigation will focus on the development and evaluation of generative object detection models. These models hold the potential to redefine how robots perceive and understand their environment. Generative models could revolutionize object detection by:

  • Handling Unfamiliar Scenarios: One of the challenges in robotics is navigating unfamiliar environments. Generative models can equip robots with the ability to identify unfamiliar objects and scenarios and respond appropriately, enhancing safety and adaptability.
  • Adapting to Diverse Environments: Traditional discriminative methods often struggle to generalize across diverse environmental conditions. Generative models, with their inherent ability to synthesize data, might offer improved adaptability across varying lighting, weather, and context.
  • Active Learning Strategies: Generative models can facilitate active learning strategies, enabling robots to intelligently select samples that will contribute the most to improving their detection capabilities. This dynamic interaction between the model and the environment can lead to more efficient learning and adaptation.

To thoroughly assess the efficacy and practicality of the generative object detection model, you will conduct comprehensive experiments. This will involve evaluation on both standard object detection benchmarks and real-world data collected from Avular's robots. By comparing its performance across diverse datasets, including Avular's proprietary data, you will gain insights into the model's adaptability and its potential to excel in robotic applications.
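As a concrete starting point for the evaluation described above, detection quality on any benchmark or on Avular's own data ultimately reduces to matching predicted boxes against ground truth by Intersection-over-Union (IoU). The sketch below is a minimal, illustrative version of that matching step (the function names and the greedy matching strategy are our own simplification, not a prescribed part of the project; in practice one would use an established toolkit such as the COCO evaluation API):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two boxes in (x1, y1, x2, y2) format."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def match_detections(preds, gts, iou_thresh=0.5):
    """Greedily match predictions to ground-truth boxes.

    Returns (true positives, false positives, false negatives),
    the raw counts behind precision/recall and mAP-style metrics.
    """
    matched = set()
    tp = 0
    for p in preds:
        best, best_iou = None, iou_thresh
        for i, g in enumerate(gts):
            if i in matched:
                continue
            v = iou(p, g)
            if v >= best_iou:
                best, best_iou = i, v
        if best is not None:
            matched.add(best)
            tp += 1
    return tp, len(preds) - tp, len(gts) - tp
```

Running the same matching over standard benchmarks and Avular's proprietary recordings gives directly comparable per-dataset precision/recall numbers, which is the basis for the cross-dataset comparison mentioned above.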


About Avular

We are a fast-growing high-tech scale-up from Eindhoven. At Avular, we believe that mobile robotics will help make the world a better place, so with our products and services we accelerate the creation of new mobile robotic applications that shape a brighter future for all. Our goal is to make robots accessible on a global scale so they can add value across industries. Our robots, both on the ground and in the air, can thus be found everywhere from agriculture to cleaning and from construction to hospitality, always with our purpose in mind. We set ambitious goals for ourselves, and with our team of highly motivated and bright colleagues, we achieve them together. We value our people and pay a lot of attention to their wellbeing, which is reflected in the informal culture at our office. Besides that, we find it important to celebrate our successes, as a team and as a company. From day one, we will make sure you feel part of our tribe!


Literature (examples):

  • Chen, Shoufa, et al. "DiffusionDet: Diffusion Model for Object Detection", https://arxiv.org/abs/2211.09788
  • Deja, Kamil, Tomasz Trzcinski, and Jakub M. Tomczak. "Learning Data Representations with Joint Diffusion Models", https://arxiv.org/abs/2301.13622
  • Tomczak, Jakub M. "Deep Generative Modeling", https://link.springer.com/book/10.1007/978-3-030-93158-2


Prerequisites:

  • reading and understanding scientific literature
  • very good coding skills in Python using PyTorch and other ML libraries
  • good knowledge of Deep Learning and the basics of Generative AI
  • curious attitude, independence, out-of-the-box thinking


Details
Supervisor
Jakub Tomczak
Secondary supervisor
Bart Keulen
External location
Avular