A GPU-accelerated model-based tracker for untethered submillimeter grippers

Stefano Scheggi (Corresponding Author), ChangKyu Yoon, Arijit Ghosh, David H. Gracias, Sarthak Misra

    Research output: Contribution to journal › Article › Academic › peer-review

    1 Citation (Scopus)
    33 Downloads (Pure)

    Abstract

    Miniaturized grippers with an untethered structure are suitable for a wide range of tasks, ranging from micromanipulation and microassembly to minimally invasive surgical interventions. To perform such tasks robustly, it is critical to properly estimate their overall configuration. Previous studies on the tracking and control of miniaturized agents estimated mainly their 2D pixel position, mostly using cameras and optical images as the feedback modality. This paper presents a novel solution to the problem of estimating and tracking the 3D position, orientation, and tip configuration of submillimeter grippers from marker-less visual observations. We formulate this as an optimization problem, which is solved using a variant of the Particle Swarm Optimization (PSO) algorithm. The proposed approach has been implemented on a Graphics Processing Unit (GPU), which allows a user to track the submillimeter agents online. It has been evaluated on several image sequences obtained from a camera and on B-mode ultrasound images obtained from an ultrasound probe. The sequences show the grippers moving, rotating, opening/closing, and grasping biological material. Qualitative results obtained using both hydrogel (soft) and metallic (hard) grippers of different shapes, with sizes ranging from 750 μm to 4 mm (tip to tip), demonstrate the capability of the proposed method to track the agents in all the video sequences. Quantitative results obtained by processing synthetic data reveal a tracking position error of 25 ± 7 μm and an orientation error of 1.7 ± 1.3 degrees. We believe that the proposed technique can be applied to different stimuli-responsive miniaturized agents, allowing the user to estimate the full configuration of complex agents from marker-less visual observations.
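
    As an illustration of the formulation described above (and not the authors' implementation), the sketch below shows how a plain particle swarm optimizer can refine a hypothesized gripper state against an observed frame by minimizing a model-to-image dissimilarity. The state layout, the cost function, and the rendering stub are assumptions made for the example; the paper uses a PSO variant together with a model of the gripper and evaluates the particles on the GPU.

        # Illustrative sketch only: a generic PSO refining a gripper state
        # (position, orientation, tip opening) against one observed frame.
        # The state layout, rendering stub, and cost are assumptions, not
        # the authors' GPU implementation.
        import numpy as np

        STATE_DIM = 7  # assumed state: x, y, z, roll, pitch, yaw, tip-opening angle


        def render_silhouette(state, shape=(128, 128)):
            """Stand-in for projecting the 3D gripper model into the image.

            A real tracker would rasterize the articulated gripper model under
            the camera or ultrasound imaging model; this stub only draws a
            square whose location and size depend on the state, so the example
            runs end to end.
            """
            img = np.zeros(shape, dtype=np.float32)
            cx = int(np.clip(state[0], 20, shape[1] - 21))
            cy = int(np.clip(state[1], 20, shape[0] - 21))
            half = int(np.clip(5 + abs(state[6]) * 10, 3, 20))
            img[cy - half:cy + half, cx - half:cx + half] = 1.0
            return img


        def cost(state, observed):
            """Dissimilarity between the rendered hypothesis and the observation."""
            return float(np.sum((render_silhouette(state) - observed) ** 2))


        def pso_track(observed, prev_state, n_particles=64, iters=30,
                      w=0.7, c1=1.5, c2=1.5, search_radius=5.0, rng=None):
            """Minimal PSO step: refine the previous estimate against a new frame."""
            rng = np.random.default_rng() if rng is None else rng
            # Seed the swarm around the previous estimate (temporal prior).
            x = prev_state + rng.normal(0.0, search_radius, (n_particles, STATE_DIM))
            v = np.zeros_like(x)
            p_best = x.copy()
            p_cost = np.array([cost(p, observed) for p in x])
            g_best = p_best[np.argmin(p_cost)].copy()
            for _ in range(iters):
                r1 = rng.random((n_particles, STATE_DIM))
                r2 = rng.random((n_particles, STATE_DIM))
                # Standard PSO update: inertia + cognitive + social terms.
                v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
                x = x + v
                c = np.array([cost(p, observed) for p in x])
                improved = c < p_cost
                p_best[improved] = x[improved]
                p_cost[improved] = c[improved]
                g_best = p_best[np.argmin(p_cost)].copy()
            return g_best


        if __name__ == "__main__":
            true_state = np.array([64.0, 64.0, 0.0, 0.0, 0.0, 0.0, 0.4])
            frame = render_silhouette(true_state)          # simulated observation
            guess = true_state + np.array([6.0, -4.0, 0.0, 0.0, 0.0, 0.0, 0.2])
            print("estimate:", pso_track(frame, guess)[[0, 1, 6]])

    The per-particle cost evaluations are independent, which is what makes a GPU implementation natural; the sequential Python loop above is kept only for readability.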

    Original language: English
    Pages (from-to): 111-121
    Number of pages: 11
    Journal: Robotics and Autonomous Systems
    Volume: 103
    DOI: 10.1016/j.robot.2017.11.003
    Publication status: Published - 1 May 2018

    Fingerprint

    Grippers
    Graphics Processing Unit (GPU)
    Model-based tracking
    Configuration estimation
    Cameras
    Ultrasound (B-mode imaging)
    Micromanipulation
    Microassembly
    Hydrogels
    Grasping
    Image sequences
    Synthetic data
    Particle swarm optimization (PSO)
    Biological materials

    Cite this

    Scheggi, Stefano; Yoon, ChangKyu; Ghosh, Arijit; Gracias, David H.; Misra, Sarthak. A GPU-accelerated model-based tracker for untethered submillimeter grippers. In: Robotics and Autonomous Systems, Vol. 103, 2018, pp. 111-121.
    @article{cf9eafb26dbf403fa65dd0e841aa32f6,
    title = "A GPU-accelerated model-based tracker for untethered submillimeter grippers",
    author = "Stefano Scheggi and ChangKyu Yoon and Arijit Ghosh and Gracias, {David H.} and Sarthak Misra",
    year = "2018",
    month = "5",
    day = "1",
    doi = "10.1016/j.robot.2017.11.003",
    language = "English",
    volume = "103",
    pages = "111--121",
    journal = "Robotics and autonomous systems",
    issn = "0921-8890",
    publisher = "Elsevier",

    }
