Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision

Abstract

One major research issue associated with 3D perception by robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor: an active trinocular vision and a passive stereo vision. Unlike conventional active vision systems, which require a large number of images with varied projected patterns to acquire a dense range map, or conventional passive vision systems, which work well only in environments with sufficient feature information, the cooperative bidirectional sensor fusion method developed for this sensor system acquires a reliable dense range map by using active and passive information simultaneously. The fusion algorithms are composed of two parts: one in which the passive stereo vision helps the active vision, and another in which the active trinocular vision helps the passive one. The first part matches the laser patterns in the stereo laser images with the help of the intensity images; the second part applies an information fusion technique based on dynamic programming, in which the image regions between laser patterns are matched pixel by pixel with the help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms work in real applications, the sensor system is implemented on a robotic system and the proposed algorithms are applied. A series of experimental tests is performed for a variety of robot configurations and environments, and the performance of the sensor system is discussed in detail.

© 2008 Optical Society of America
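The full text is not accessible from this page, so the following is only an illustrative sketch of the dynamic-programming, pixel-by-pixel matching step the abstract alludes to: correspondences are sought along an epipolar scanline, and laser-stripe matches already established in the first fusion stage are injected as anchor constraints. The function name, the squared-intensity dissimilarity, the occlusion cost, and the anchor handling are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def dp_scanline_match(left_row, right_row, occlusion_cost=5.0, anchors=()):
    """Match one pair of epipolar scanlines pixel by pixel with dynamic programming.

    left_row, right_row : 1-D arrays of image intensities along the same epipolar line.
    anchors             : optional (left_index, right_index) pairs assumed already
                          matched, e.g. laser-stripe correspondences from the
                          first (active-assisted) fusion stage.
    Returns a list of (left_index, right_index) pixel correspondences.
    """
    n, m = len(left_row), len(right_row)
    anchors = dict(anchors)

    # cost[i, j]: best cost of aligning the first i left pixels with the first j
    # right pixels; back[i, j] records which move achieved it.
    cost = np.full((n + 1, m + 1), np.inf)
    back = np.zeros((n + 1, m + 1), dtype=np.uint8)  # 1 = match, 2 = skip left, 3 = skip right
    cost[0, :] = occlusion_cost * np.arange(m + 1)
    cost[:, 0] = occlusion_cost * np.arange(n + 1)

    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dissim = (float(left_row[i - 1]) - float(right_row[j - 1])) ** 2
            # An anchored left pixel may only be matched to its anchored partner;
            # any other pairing is made infinitely costly.
            if (i - 1) in anchors and anchors[i - 1] != j - 1:
                dissim = np.inf
            moves = (cost[i - 1, j - 1] + dissim,        # match pixel i-1 with j-1
                     cost[i - 1, j] + occlusion_cost,    # left pixel occluded
                     cost[i, j - 1] + occlusion_cost)    # right pixel occluded
            best = int(np.argmin(moves))
            cost[i, j] = moves[best]
            back[i, j] = best + 1

    # Backtrack from the end of both scanlines to recover the matched pairs.
    matches, i, j = [], n, m
    while i > 0 and j > 0:
        if back[i, j] == 1:
            matches.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif back[i, j] == 2:
            i -= 1
        else:
            j -= 1
    return matches[::-1]


# Hypothetical usage on one scanline, with two laser-stripe anchors already known.
left = np.array([10, 12, 80, 81, 15, 14, 90, 11], dtype=float)
right = np.array([11, 80, 82, 14, 16, 91, 12, 10], dtype=float)
print(dp_scanline_match(left, right, anchors=[(2, 1), (6, 5)]))
```

In this sketch the monotonic ordering of matches along the scanline and the handling of occlusions fall out of the dynamic-programming formulation; the anchors merely forbid an anchored left pixel from matching any other right pixel, which is a simplified stand-in for how the stripe correspondences from the first fusion stage could constrain the second stage.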

More Like This
UAV system with trinocular vision for external obstacle detection of transmission lines

Yunpeng Ma, Zhihong Yu, Yaqin Zhou, Qingwu Li, and Yi Wu
Appl. Opt. 61(12) 3297-3311 (2022)

Reducing the minimum range of a RGB-depth sensor to aid navigation in visually impaired individuals

Kailun Yang, Kaiwei Wang, Hao Chen, and Jian Bai
Appl. Opt. 57(11) 2809-2819 (2018)

Defect inspection for underwater structures based on line-structured light and binocular vision

Yi Wu, Yaqin Zhou, Shangjing Chen, Yunpeng Ma, and Qingwu Li
Appl. Opt. 60(25) 7754-7764 (2021)

Figures (13)

Tables (1)

Equations (7)
