Optica Publishing Group
Chinese Optics Letters, Vol. 6, Issue 9, pp. 654-656 (2008)

New color correction method for multi-view video using disparity vector information


Abstract

Color inconsistency is an urgent problem in free-viewpoint television. In this letter, a new color correction method based on disparity vector information is proposed. First, the foreground and background of the scene are separated by mean-removed disparity estimation. Then, correction parameters are estimated by linear fitting for the foreground and background regions, respectively. Next, the expectation-maximization algorithm is used to integrate the foreground and background correction parameters and obtain the final corrected image. Finally, a video tracking technique is applied to correct the multi-view video. Experimental results show that the proposed method is quite effective.
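The linear-fitting step described in the abstract can be sketched as follows. This is a minimal illustration, not the letter's actual implementation: it assumes corresponding pixel pairs between the view to correct and a reference view (in the paper these come from disparity vectors) are already available, and it uses an ordinary least-squares fit of a per-channel gain and offset.

```python
import numpy as np

def fit_linear_correction(src, ref):
    """Fit a per-channel linear mapping gain*src + offset ~ ref.

    src, ref: float arrays of shape (N, 3) holding corresponding pixel
    values from the view to be corrected and the reference view.
    Returns a list of (gain, offset) pairs, one per color channel.
    Plain least squares is an assumption here; the letter's exact
    estimator is not given in the abstract.
    """
    params = []
    for c in range(src.shape[1]):
        gain, offset = np.polyfit(src[:, c], ref[:, c], 1)
        params.append((gain, offset))
    return params

def apply_correction(image, params):
    """Apply the fitted per-channel linear mapping to an image array."""
    out = image.astype(float).copy()
    for c, (gain, offset) in enumerate(params):
        out[..., c] = gain * out[..., c] + offset
    return np.clip(out, 0.0, 255.0)
```

Per the abstract, such a fit would be run separately on the foreground and background regions, and the two parameter sets then merged via the expectation-maximization algorithm to produce the final corrected image.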

© 2008 Chinese Optics Letters

More Like This
Objective visual comfort evaluation method based on disparity information and motion for stereoscopic video

Bochao Kan, Yan Zhao, and Shigang Wang
Opt. Express 26(9) 11418-11437 (2018)

Separation of foreground and background from light field using gradient information

Jae Young Lee and Rae-Hong Park
Appl. Opt. 56(4) 1069-1078 (2017)

Virtual view synthesis for 3D light-field display based on scene tower blending

Duo Chen, Xinzhu Sang, Peng Wang, Xunbo Yu, Xin Gao, Binbin Yan, Huachun Wang, Shuai Qi, and Xiaoqian Ye
Opt. Express 29(5) 7866-7884 (2021)

