Denoising Diffusion Planner: Learning Complex Paths from Low-Quality Demonstrations

Michiel Nikken, Nicolò Botteghi, Wesley Roozing, Federico Califano

Research output: Working paper


Abstract

Denoising Diffusion Probabilistic Models (DDPMs) are powerful generative deep learning models that have been very successful at image generation and, more recently, in path planning and control. In this paper, we investigate how to leverage the generalization and conditional sampling capabilities of DDPMs to generate complex paths for a robotic end effector. We show that training a DDPM with synthetic, low-quality demonstrations is sufficient for generating nontrivial paths that reach arbitrary targets and avoid obstacles. Additionally, we investigate different strategies for conditional sampling that combine classifier-free and classifier-guided approaches. Finally, we deploy the DDPM in a receding-horizon control scheme to enhance its planning capabilities. The Denoising Diffusion Planner is experimentally validated through various experiments on a Franka Emika Panda robot.
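To illustrate the kind of conditional sampling the abstract refers to, below is a minimal sketch (not the authors' code) of one reverse-diffusion step that combines classifier-free guidance (conditioning on a target) with classifier-style guidance (the gradient of an obstacle cost). The names `eps_model`, `obstacle_cost`, and the guidance weights `w_cfg` and `w_cg` are illustrative assumptions, not quantities taken from the paper.

```python
import torch

def guided_ddpm_step(eps_model, obstacle_cost, x_t, t, cond,
                     alpha_t, alpha_bar_t, sigma_t, w_cfg=2.0, w_cg=1.0):
    """One guided reverse step x_t -> x_{t-1} for a path-generating DDPM (sketch)."""
    # Classifier-free guidance: blend conditional and unconditional noise predictions.
    eps_cond = eps_model(x_t, t, cond)
    eps_uncond = eps_model(x_t, t, None)
    eps = (1.0 + w_cfg) * eps_cond - w_cfg * eps_uncond

    # Classifier-style guidance: gradient of an obstacle-avoidance cost
    # evaluated on the current noisy path.
    with torch.enable_grad():
        x_req = x_t.detach().requires_grad_(True)
        grad = torch.autograd.grad(obstacle_cost(x_req).sum(), x_req)[0]

    # Standard DDPM posterior mean, shifted against the cost gradient.
    mean = (x_t - (1 - alpha_t) / torch.sqrt(1 - alpha_bar_t) * eps) / torch.sqrt(alpha_t)
    mean = mean - w_cg * sigma_t**2 * grad
    return mean + sigma_t * torch.randn_like(x_t)
```

In a receding-horizon setting, such a step would be iterated over the full noise schedule to sample a path, the first portion of the path executed, and sampling repeated from the new end-effector state.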
Original language: English
Publisher: ArXiv.org
DOIs
Publication status: Published - 28 Oct 2024

Keywords

  • cs.RO

