Abstract

This paper presents a low-power and small-size motion gesture sensor (MGS) based on an active infrared (IR) proximity sensor. Whereas conventional proximity-based MGSs require two LEDs (the most power-hungry components) and one photodiode, the proposed MGS eliminates one of the LEDs at the expense of an additional photodiode serving as a separate channel. In conjunction with an optical block that limits the field of view (FOV) of each photodiode, the power consumption and area of the proposed MGS can be reduced by up to 52% and 69%, respectively, compared with conventional proximity-based MGSs. Optical simulation and test results validate the theoretical analysis presented.

© 2013 OSA

Figures (9)

Fig. 1

Flowchart of a conventional MGS for detecting four different gestures: left to right, right to left, push, and pull.

Fig. 2

Conceptual diagram of a conventional active IR proximity-based MGS using a phase-based sensing algorithm for detecting the motion of a hand. (a) Hardware formation. (b) Output timing diagram for detecting a hand moving from left to right.
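
For intuition, here is a minimal Python sketch of the phase-based idea: with two time-multiplexed LEDs and one photodiode, the swipe direction follows from which LED channel's reflection envelope peaks first. The function, its inputs, and the lag threshold are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's exact algorithm) of phase-based sensing:
# two time-multiplexed LEDs, one photodiode. env_left / env_right are
# hypothetical demultiplexed reflection envelopes, one sample per LED pulse.
def detect_swipe(env_left, env_right, f_s, min_lag_s=0.01):
    """Return 'left-to-right', 'right-to-left', or None (no clear lag)."""
    peak_left = max(range(len(env_left)), key=lambda i: env_left[i])
    peak_right = max(range(len(env_right)), key=lambda i: env_right[i])
    lag = (peak_right - peak_left) / f_s   # > 0: left channel peaked first
    if abs(lag) < min_lag_s:               # lag too small to call a direction
        return None
    return 'left-to-right' if lag > 0 else 'right-to-left'

# Example: the right channel peaks two samples after the left one.
print(detect_swipe([0, 3, 9, 4, 1, 0], [0, 1, 2, 5, 8, 3], f_s=100))
```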

Fig. 3

Various zones according to the emission and acceptance angles of each component in a conventional active IR proximity-based MGS shown in Fig. 2. (a) Conical emission of each LED. (b) FOV of each component, where α ≤ β.

Fig. 4

The first-stage development of the proposed MGS using a channel-based sensing algorithm. (a) FOV of each component, where α ≥ β. (b) Timing diagram when a hand moves from left to right.
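
In the channel-based scheme each photodiode is its own channel, so direction can be read from the order in which the channels respond. The sketch below illustrates that idea under assumptions of mine (a fixed threshold and normalized samples); it is not the paper's implementation.

```python
# Hedged sketch of channel-based sensing (one LED, two photodiode channels):
# the channel whose output rises above threshold first marks the side the
# hand entered from. Threshold and sample format are assumed.
def first_crossing(samples, threshold):
    """Index of the first sample above threshold, or None if never crossed."""
    for i, value in enumerate(samples):
        if value > threshold:
            return i
    return None

def classify_gesture(ch_left, ch_right, threshold=0.5):
    t_left = first_crossing(ch_left, threshold)
    t_right = first_crossing(ch_right, threshold)
    if t_left is None or t_right is None:
        return None                    # object missed one channel's FOV
    if t_left < t_right:
        return 'left-to-right'
    if t_right < t_left:
        return 'right-to-left'
    return 'push-or-pull'              # simultaneous rise: vertical motion

print(classify_gesture([0.1, 0.7, 0.9, 0.6], [0.1, 0.2, 0.8, 0.9]))
```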

Fig. 5

The second stage development of the proposed MGS using a channel base sensing algorithm with an optical block. (a) An example of an optical block used for a sun tracking sensor with a pair of photodiodes. (b) Component formation of the proposed scheme. (c) 2D cross-sectional view. (d) Approximated FOVs for calculating the length of detectable zone.
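
The geometry approximated in (d) reduces to the zone-length relation l = h(tan(β/2) − tan γ) given in the Equations section: tilting each photodiode's FOV outward by γ shortens the detectable zone. A small sketch of that calculation follows; the function and parameter names are mine, not the paper's.

```python
import math

# Detectable-zone length suggested by Fig. 5(d): beta is a photodiode's FOV,
# gamma the outward tilt added by the optical block, h the object height
# above the sensor (all for a single channel).
def zone_length(h_mm, beta_deg, gamma_deg=0.0):
    beta = math.radians(beta_deg)
    gamma = math.radians(gamma_deg)
    return h_mm * (math.tan(beta / 2) - math.tan(gamma))

# At h = 100 mm with beta = 60 deg: ~57.7 mm without the block,
# ~44.6 mm with a gamma = 7.5 deg tilt.
print(zone_length(100, 60), zone_length(100, 60, 7.5))
```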

Fig. 6

Simulation results for the length of the detectable zones for three different values of β. (a) Conventional MGSs with a hand (l0 = 80 mm) and a finger (l0 = 20 mm). (b) The proposed scheme with an optical block at γ = 7.5°. Dashed lines correspond to the conventional schemes from Fig. 6(a).

Fig. 7

Optical simulation using LightTools. (a) Simulation setup. (b) Luminance distributions with β = 60° and γ = 7.5°.

Fig. 8

Test results for the proposed MGS. (a) Simplified block diagram of the proposed MGS (only a single channel is shown). (b) Optical component formation. (c) Analog outputs for swiping from left to right without an optical block. (d) Analog outputs for swiping from left to right with an optical block.

Fig. 9

Outputs of the proposed MGS as a function of the object's position for three different values of γ. (a) γ = β/2 = 30°. (b) γ = 18.2°. (c) γ = 7.5°.

Tables (1)


Table 1 Minimum Sampling Frequency Required to Achieve the Recognition Rate of 99.5%
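
The minimums in Table 1 are empirical, but a rough plausibility check is possible: each channel needs at least a few samples while the object dwells in its detectable zone, which bounds the sampling frequency from below. The relation and the values below are my assumptions, not the paper's derivation.

```python
# Back-of-envelope bound (an assumption, not the paper's derivation):
# a swipe registers only if each channel collects at least n_min samples
# while the object crosses its detectable zone, so
#   f_s >= n_min / T_D = n_min * v / l,   with T_D = l / v.
def min_sampling_frequency(l_mm, v_mm_per_s, n_min=2):
    t_dwell = l_mm / v_mm_per_s      # dwell time T_D in the zone (s)
    return n_min / t_dwell           # minimum sampling frequency (Hz)

# Example: a 44.6 mm zone swept at 500 mm/s needs roughly 22 Hz or more.
print(min_sampling_frequency(44.6, 500))
```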

Equations (6)


$$ l = l_0, \quad \text{where } h \le h_0 = \frac{0.5\,l_0}{\tan(\alpha/2)}. $$
$$ T_D = l/v = l_0/v, $$
$$ l = l_0, \quad \text{where } h \le h_0 = \frac{0.5\,l_0}{\tan(\beta/2)} \text{ and } \alpha \ge \beta. $$
$$ l = h\left(\tan(\beta/2) - \tan\gamma\right), $$
$$ P_{\mathrm{prop}} = P_{\mathrm{LED}} + P_{\mathrm{BOS}} = \left(D\,I_{\mathrm{LED}} + \bar{I}\,\right)V_{DD}, $$
$$ PR = \frac{P_{\mathrm{conv}} - P_{\mathrm{prop}}}{P_{\mathrm{conv}}} = \frac{D\,I_{\mathrm{LED}}}{2\,D\,I_{\mathrm{LED}} + \bar{I}} = 0.52, $$
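
For concreteness, the power comparison can be evaluated numerically. The sketch below follows the structure of the last two equations (the proposed MGS drops one of two LED drive terms) with illustrative values of my own; it captures only the LED-count saving, whereas the paper's 52% figure also relies on the optical block.

```python
# Numeric sketch of the two power equations above. Values are illustrative
# assumptions, not the paper's measurements; only the saving from removing
# one LED is modeled here.
V_DD = 3.3           # supply voltage (V), assumed
D_I_LED = 0.525e-3   # duty-cycled average current of one LED (A)
I_BAR = 1.0e-3       # average current of the remaining circuitry (A)

P_conv = (2 * D_I_LED + I_BAR) * V_DD   # two LEDs plus circuitry
P_prop = (D_I_LED + I_BAR) * V_DD       # one LED plus circuitry
PR = (P_conv - P_prop) / P_conv         # power-reduction ratio
print(f"P_conv = {P_conv*1e3:.2f} mW, P_prop = {P_prop*1e3:.2f} mW, PR = {PR:.2%}")
```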
