Abstract

Fluorescent foils with attached silicon photodiodes, combined with lasers forming a light curtain, are used for large-area detection of objects. An object entering the detection area penetrates the light curtain and casts shadows onto the fluorescent foils. Using a simple mathematical algorithm, the position of the object is detected with high speed. The device is suitable for security applications and can be used as a touch input device for computers, gaming and presentations.

© 2013 OSA

1. Introduction

Indoor position sensors allow for the detection and tracking of objects in applications such as the localization and control of humans and robots, in the entertainment industry and in safety monitoring systems. Today most of these sensors are based on the detection of different transmitters and subsequent triangulation or trilateration of the objects. This allows multiple objects to be localized and identified with high precision in the measuring zone. GPS (global positioning system) is the most widely used and reliable technique for position detection. While GPS object detection works very well on large scales, it becomes much less effective indoors due to strong signal attenuation. This fact led to the development of several other techniques based on the detection of acoustical [1], optical [2,3] or RF [4] signals. Reviews of such techniques are given in [4,5].

Though all of these approaches provide excellent position detection, the objects must in most instances be equipped with transmitters or detectors, and the systems have to be extensively calibrated before first use. Instantaneous detection of unknown objects entering the measurement area is therefore impossible, although this is an important requirement in security and machine supervision applications. For this purpose, other detection systems have to be found.

In this work we present a simple and affordable device consisting of fluorescent waveguides with attached silicon photodiodes and linear lasers forming a light curtain. When an object enters the active area, several shadows are formed; these are detected by the waveguides, and the guided light is collected at the respective photoreceivers (Fig. 1). Using a simple calculation algorithm, the position of the object is then obtained by triangulation. The simple setup allows fast detection of unknown objects, so the device can find use in security and monitoring applications, in the control of robots, as well as an interface device for computers and machines.

 

Fig. 1 Scheme of the 2-dimensional large-area object detection system. An object entering the light curtain casts shadows, which are detected with stripe detectors on the edges of the device.


2. Position sensitive devices

Position tracing without any transmitters or sensors attached to the object is already common in medicine and biomechanics research, and may emerge as an interactive input technique in future markets. One possibility is the use of one or more cameras for tracing objects in their field of vision [6,7]. Unfortunately, complex backgrounds or objects with different and alternating colours make recognition and position determination difficult. Excellent functionality can be achieved only by using light markers and/or a homogeneous background with uniform colours.

On the other hand, position detection with reflected radio-frequency waves or airborne ultrasound is well known. Even though radar techniques are well proven and widely used, they are more adequate for detecting objects on large scales. In contrast, the slower propagation of sound waves makes ultrasonic techniques much more suitable for robot supervision and safety monitoring in enclosed areas. Recently, an airborne ultrasonic setup was reported in [8], where objects as well as their shapes are detected with high accuracy. Although this technology often provides better results than optical systems (for example in dusty environments), drawbacks like unwanted reflections and the temperature dependence of sound wave propagation considerably affect its functionality.

Optical systems based on laser light are widely used in security systems as well. Nevertheless, common security systems equipped with single-point lasers are limited by the fact that only one photodetector is correlated to a single laser source. If gapless supervision of large areas is required, this results in a very complex setup and a much slower readout speed of the detection system. A better alternative is the use of a laser as a light source in combination with optical lenses to create a continuous light barrier (a so-called light curtain). This allows for the detection and speed measurement of vehicles and for monitoring traffic data [9]. Such systems were also used for measurements of gun bullet velocities [10].

In [11,12] position detection with linear lasers illuminating an array of silicon photodiodes, placed in a rectangular frame perpendicular to the lasers, is described. When an object enters the device, the continuous light curtain is disturbed and a shadow is cast on the array. Position retrieval is achieved from the intensity decrease on the respective photodiodes. Although this method provides very good results, it has to deal with two important restrictions. First, the exact diameter of the object has to be known for the calculation, which limits its use in different applications. Secondly, the minimum diameter dmin of an object that can be measured across the whole detection area is given by [11]:

dmin = 2b + m

where b is the length of a single photodiode and m is the distance between the photodiodes. For a complete supervision of the detection area - without any blind spots - the photodiodes must therefore be stacked without gaps. This results in a complex electronic design and significantly limits the detection speed of the device. In the following we show that, in contrast to a photodiode array, the combination of an optical fluorescent waveguide and standard silicon photodiodes yields a fast and simple solution for position detection across large areas. Since shadows are detected along the whole continuous fluorescent waveguide, the setup does not require a minimum object size: even the smallest objects entering the active area create a noticeable intensity decrease on the foil. In [13] we introduced a first approach to such a large-area position detector.
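The minimum-size constraint of the photodiode-array approach can be illustrated with a short sketch; the 5 mm photodiode length and 2 mm gap below are hypothetical example values, not figures from [11]:

```python
def min_object_diameter(b_mm, m_mm):
    """Smallest object detectable everywhere by a photodiode array:
    dmin = 2*b + m, with photodiode length b and gap m (after [11])."""
    return 2 * b_mm + m_mm

# Hypothetical example: 5 mm photodiodes separated by 2 mm gaps
# require objects of at least 12 mm diameter.
print(min_object_diameter(5.0, 2.0))
```

With the gap m set to zero (gapless stacking), the detectable diameter is still bounded from below by twice the photodiode length, which is the limitation the fluorescent-waveguide approach removes.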

3. Fabrication and setup of the position sensitive device

The core of our system consists of a commercially available polycarbonate foil (Bayer Makrofol LISA green) with embedded fluorescent dyes, showing strong light absorption in the blue and near-UV spectrum. A similar type of foil was already successfully used as a light point detector for large areas [14]. Laser light impinging on the foil is absorbed by the dyes and reemitted as undirected fluorescent light at longer wavelengths due to the Stokes shift. The high refractive index of the polycarbonate foil forces most of this fluorescence light to be coupled into the planar waveguide mode of the foil. The light is then guided along the foil and is coupled out at the edges or at various defects on the surface. In our case, the light is coupled into silicon SMD photodiodes, which are affixed on the polycarbonate foil using transparent glue (Fig. 2).

 

Fig. 2 Schematic drawing of the absorption of an incident light beam within the luminescent waveguide. The luminescent light propagates within the planar waveguide to the attached silicon photodiode.


The fluorescent foil was cut into two short pieces with dimensions of 604x22 mm and two long pieces of 844x22 mm. Black paint on the foil edges prevents back reflection of light. Afterwards, silicon photodiodes (Everlight PD15-21C, with an active area of ~0.36 mm2) were affixed to the foil stripes with a transparent epoxy resin. The spacing between the photodiodes was 60 mm, resulting in 11 photodiodes on the 604 mm long and 15 photodiodes on the 844 mm long foil stripe, all positioned at a distance of 2 mm from the upper long edge of the fluorescent stripe (Fig. 3(a)). The foil stripes were then affixed at a 10° angle in two U-shaped aluminum bars (Fig. 3(b)). This slight tilt helps to avoid reflections from one foil stripe to another, which predominantly appear for small irradiation angles of the lasers.

 

Fig. 3 a The silicon photodiodes are affixed at the edges of the luminescent foil. Each photodiode is then connected separately to a transimpedance amplifier circuit. b Cross-section of the attached luminescent foil within aluminium bars forming the frame of the light curtain detector. The routed opening for the linear laser was set to 5 mm.


The aluminum bars served as electrical and optical stray-light shields. Starting from square profiles, we mechanically routed 12 mm off the first and 13 mm off the second aluminum bar along their long sides. After assembling these two bars we obtained an approximately 5 mm wide opening through which the laser light could enter and impinge on the foil stripes inside the aluminum bars. The distance between the impinging laser line and the photodiodes was set to a minimum of 8 mm.

To assure a mechanically stable construction, the aluminum bars were fixed onto a 5 mm thick laser-cut aluminum frame, and in each corner a blue laser diode (405 nm, ca. 40 mW each, measured directly in continuous mode at the optical aperture of the laser) was mounted. Flat PMMA Fresnel-type line optics (with a 90° opening angle) were affixed to the optical aperture of each laser. These lenses convert the collimated beam into an approximately 2 mm wide laser line.

The full laser power is strongly attenuated and thus not harmful during operation. The laser intensity is weakened by the broadening effect of the Fresnel lens: for the diverging beam, we take 100 mm as the minimum distance between the source and the human eye (see the European Laser Safety Standards [15]), which yields a 157 mm long quarter-circular arc at that distance. Taking 7 mm as the maximum pupil diameter, only about 1/22 of the intensity can enter the eye.

During operation each laser is switched on for only a fraction of a second, since the four lasers are alternated in the measurement routine. Each laser additionally follows an even temporal on/off switching pattern, further halving the average intensity of the laser beam.

For the final intensity we obtain:

40 mW · (7 mm / 157 mm) · 1/4 · 1/2 ≈ 0.22 mW

which is considerably below the maximum admissible value of 1 mW for a class 2M laser.
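The eye-safety estimate above can be sketched as a short calculation; the parameter names are ours, the values are those stated in the text:

```python
def eye_exposure_mw(power_mw=40.0, pupil_mm=7.0, arc_mm=157.0,
                    n_lasers=4, duty=0.5):
    """Worst-case average optical power entering a 7 mm pupil:
    the line lens spreads the 40 mW beam over a 157 mm quarter-circle
    arc (the pupil intercepts only a 7/157 fraction), the four lasers
    alternate (factor 1/4), and each laser runs at a 50% on/off duty
    cycle (factor 1/2)."""
    return power_mw * (pupil_mm / arc_mm) / n_lasers * duty

print(round(eye_exposure_mw(), 2))  # ~0.22 mW, below the 1 mW class 2M limit
```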

Each laser was then adjusted to irradiate exactly the two opposite dye-doped foil stripes through the routed openings. In Fig. 4 a photograph of the setup during operation is presented.

 

Fig. 4 Photograph showing the principle of object detection with a light curtain and luminescent concentrator linear PSDs.


The photodiode signals of each array were fed via transimpedance amplifier circuits (using a 1 MΩ resistor in parallel with a 3.9 pF capacitor) to two National Instruments CompactRIO input modules (NI9205). The on-off switching of the lasers was performed via an NI9269 output module of the CompactRIO system.

We deliberately omitted costly lasers of different wavelengths and additional optical filters in order to obtain a simple and low-cost device. The sensor foils and the corresponding electronics are placed in an aluminum frame, which reduces possible electrical interference and optical stray light from the surroundings. A high-pass filtering of the signals is performed by the measurement routine described in sections 4 and 5 of this work, which allows steady ambient light (e.g. from a light bulb or daylight) as well as impinging light signals of low frequency to be neglected. Stray light of high frequency (e.g. from modern fluorescent lamps, operating at approximately 40 kHz) is averaged out in our system and neglected in the position calculation. A very important factor is the speed of the laser control: a fast sampling rate of the whole system allows the use of shorter and stronger laser pulses. This yields a large amplitude of the laser light without requiring excessively high and long, and therefore harmful, laser power.

4. Simulation of the object detection system

The signal amplitude obtained from the photodiodes follows the expression:

I = A·e^(−αx)

where A is the amplitude, α the extinction coefficient and I the measured signal of the photodiode at the distance x between the incident light and the detector.
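The exponential attenuation of the guided light can be evaluated directly; the function below is a minimal sketch of this expression, using the extinction coefficient reported in [14]:

```python
import math

def guided_intensity(A, alpha, x):
    """Photodiode signal I = A * exp(-alpha * x) after the fluorescent
    light has travelled a distance x (in mm) along the foil waveguide."""
    return A * math.exp(-alpha * x)

# With alpha = 0.008 mm^-1 [14], light launched 100 mm from the
# photodiode retains roughly 45% of its amplitude.
print(guided_intensity(1.0, 0.008, 100.0))
```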

Before showing the operation of the device we discuss the behaviour of an ideal detector. The simulated detector has the same dimensions as the experimental system, but differs in the following points:

  • 1) No gaps exist between the fluorescent stripes - in the experimental object detection system these stripes must be disconnected in the corners to make space for the lasers and for their alignment mechanism.
  • 2) We assumed an ideal light distribution in the simulation - when using Fresnel lenses, the intensity distribution of the laser light varies across the angular range.
  • 3) We supposed ideal coupling of light into each silicon photodiode. Since the photodiodes are affixed to the fluorescent foil by hand, small differences in the amount of glue prevent such an equal out-coupling of light for all photodiodes in the experiment.

In the calculation we used a uniform light distribution with 100 points for each foil stripe, set the distance between the photodiodes and the line lasers impinging on the foil to 8 mm, and set the extinction coefficient to 0.008 mm−1 [14]. This means we simulated the impinging light intensity according to Eq. (3) for one point every 8.44 mm along the 844 mm long foil stripe and one point every 6.04 mm along the 604 mm long foil stripe. In this fashion, the closer a calculated point lies to a photodiode, the larger its effect when a change in the light intensity (e.g. by the insertion of an object) is calculated.
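This discretization can be sketched as follows. The sketch assumes a simplified geometry (photodiodes on a 60 mm pitch along the long stripe, per-point attenuation by the exponential law of Eq. (3) over the in-foil distance); it is illustrative, not the authors' simulation code:

```python
import math

ALPHA = 0.008        # extinction coefficient in mm^-1 [14]
STRIPE = 844.0       # length of the long foil stripe in mm
N_POINTS = 100       # illuminated points per stripe, as in the simulation
DIODES = [j * 60.0 for j in range(15)]   # simplified diode positions (mm)

def diode_signals(shadowed=frozenset()):
    """Simulated signal of each photodiode: the attenuated contributions
    of all illuminated points are summed; point indices in `shadowed`
    are blocked by the object and contribute nothing."""
    points = [(k + 0.5) * STRIPE / N_POINTS for k in range(N_POINTS)]
    return [sum(math.exp(-ALPHA * abs(xp - xd))
                for k, xp in enumerate(points) if k not in shadowed)
            for xd in DIODES]

background = diode_signals()
with_object = diode_signals(shadowed={40, 41, 42})  # ~25 mm wide shadow
# Diodes near the shadow see less light; none see more than the background.
```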

Afterwards, we simulated an object with a size of 20x20 mm being inserted into the detection area, starting 10x10 mm from the first edge and moving in 20x20 mm steps on a regular lattice across the active area. A total of four shadows is formed for each object position, falling onto three or four fluorescent stripes depending on the position of the object. The simulated object causes a shadowing of the affected points (whose intensity contribution to each photodiode drops from the positive value given by Eq. (3) to zero) and hence induces a decrease in the calculated light intensity for each related photodiode. In Fig. 5 an example for a random object position is shown.

 

Fig. 5 A schematic drawing showing the casting of shadows (the active laser is marked in blue). This yields a decrease of light intensity, which is detected by the respective fluorescent photoreceivers (P1 to P4) placed on the edges of the device. The numbering of the photodiodes needed for the calculation of the shadow centre is done in clockwise order.


By inserting an object into the detection area we obtain four shadows falling onto the fluorescent detector stripes. For each laser the shadow of the object falls onto one or two fluorescent stripes, which are placed opposite the respective laser.

By measuring the amplitude decrease on each photodiode of the two stripes and knowing the exact distance between the photodiodes, the position of the simulated shadow centre SCs is obtained using a simple computational algorithm:

SCs = [Σ (d · P# · SIi)] / [Σ SIi]

where d is the distance between the photodiodes (in mm) and P# the number of the photodiode. SIi is obtained by simulating the light intensity for each photodiode i (with i = 0 to 25, using Eq. (3) in the manuscript), which are placed on the two detector stripes:

SIi = 1 − (Sw / Swo)

where Sw is the simulated amplitude with the inserted object casting the shadow on the detector stripes and Swo is the simulated amplitude without the shadow (simulated background measurement).

In the case of laser 1, the counting of the photodiodes i starts in the upper left corner of the device (foil stripe P1, see Fig. 5) and continues in clockwise order to the last photodiode (placed in the lower right corner on stripe P2). The counting of the photodiodes for laser 2 starts in the upper right corner (foil stripe P2) and stops in the lower left corner (foil stripe P3), and so on. Due to the geometrical form of the device, the distance d between photodiodes P14 and P15 (for lasers 1 and 3) and between P10 and P11 (for lasers 2 and 4) is set to 0 mm.
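The weighted-average shadow-centre computation can be sketched as follows; the 60 mm spacing matches the setup, while the sample amplitudes are made-up illustration values:

```python
def shadow_centre(with_obj, without_obj, d=60.0):
    """Shadow centre along the stripes opposite a laser:
    SIi = 1 - Sw_i/Swo_i per photodiode, then
    SC = sum(d * i * SIi) / sum(SIi) over all diodes i,
    numbered consecutively along the light path."""
    si = [1.0 - w / wo for w, wo in zip(with_obj, without_obj)]
    total = sum(si)
    if total <= 0:
        return None          # no shadow detected
    return sum(d * i * s for i, s in enumerate(si)) / total

# Made-up amplitudes: only diode 2 is shadowed (50% signal loss),
# so the shadow centre falls exactly at its position, 120 mm.
print(shadow_centre([1.0, 1.0, 0.5, 1.0, 1.0], [1.0] * 5))
```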

We then constructed a straight line between each laser and the respective shadow centre; intersecting these four lines yielded a maximum of six crossover points. Calculating their centroid gave the final x-y position of the object. The calculated x-y plot and the deviation between the real and the calculated positions from the simulation are presented in Fig. 6.
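The intersection-and-centroid step can be sketched as below. Each sight line is given by a laser position and the corresponding shadow-centre position; the coordinates in the example are hypothetical:

```python
def intersect(p1, d1, p2, d2):
    """Intersection of two lines given as point + direction vector;
    returns None for (near-)parallel lines."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-9:
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def object_position(lines):
    """lines: list of (laser_xy, shadow_centre_xy) pairs. Intersect
    every pair of sight lines (up to six crossover points for four
    lasers) and return their centroid as the final x-y position."""
    pts = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            (pa, sa), (pb, sb) = lines[i], lines[j]
            da = (sa[0] - pa[0], sa[1] - pa[1])
            db = (sb[0] - pb[0], sb[1] - pb[1])
            q = intersect(pa, da, pb, db)
            if q:
                pts.append(q)
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

# Hypothetical example: two corner lasers whose sight lines cross at (50, 50).
print(object_position([((0.0, 0.0), (100.0, 100.0)),
                       ((100.0, 0.0), (0.0, 100.0))]))
```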

 

Fig. 6 Final x-y-positions obtained during the simulation (left). The grey dots represent the real and the purple dots the simulated position. On the right side a 3D-plot showing the deviation between the real and the calculated positions in the simulation is presented.


A deviation of the calculated shadow centre on the fluorescent stripe shifts the intersection point of the lines constructed between the laser and the shadow centre. The smaller the cutting angle between these lines, the greater the influence of a slight shift of the shadow centre on the calculated object position. As can be seen in Fig. 6, this effect appears primarily along the diagonals of the detector area and especially in its centre, where all four laser lines cross under small cutting angles. In addition, position detection of a shadow near the ends of the foil stripes is in general less precise, because fewer photodiodes are strongly affected by the insertion of an object, which leads to a smaller change in the light intensity. Nevertheless, the mean deviation of the simulated positions was only 4.2 mm.

5. Position measurement of real objects

First, the incoming background light during laser operation was measured without any objects placed inside the frame. For this purpose all lasers were switched off and the ambient light on all photodiodes placed opposite laser 1 (arrays P1 and P2) was recorded. Laser 1 was then switched on and the signals of the same photodiodes (arrays P1 and P2) were collected as before. All lasers were switched off again and the same procedure was performed for all other lasers consecutively (arrays P2 + P3 for laser 2, P3 + P4 for laser 3, and P4 + P1 for laser 4). Afterwards we subtracted the ambient light signals from the signals obtained with the lasers switched on. This procedure compensates for variations in ambient lighting that could occur during the measurements. In Fig. 7 a flowchart of one measurement cycle is shown. After the 100% light amplitude reaching each photodiode (background amplitude) had been obtained, the object could be placed inside the frame.
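One measurement cycle with ambient-light subtraction can be sketched as below. The hardware-access callbacks `read_diodes` and `set_laser` are hypothetical placeholders for the CompactRIO I/O, not an actual API:

```python
def measure_cycle(read_diodes, set_laser, stripe_pairs):
    """One measurement cycle: for each laser, read the opposite stripe
    pair with the laser off (ambient) and then on, and return the
    ambient-corrected amplitudes per laser.
    `read_diodes(stripes)` and `set_laser(i, on)` are hypothetical
    callbacks standing in for the data-acquisition hardware."""
    corrected = {}
    for laser, stripes in enumerate(stripe_pairs, start=1):
        set_laser(laser, False)
        ambient = read_diodes(stripes)       # background measurement
        set_laser(laser, True)
        lit = read_diodes(stripes)           # laser-on measurement
        set_laser(laser, False)
        corrected[laser] = [a - b for a, b in zip(lit, ambient)]
    return corrected

# The four stripe pairs opposite lasers 1-4, as described in the text.
PAIRS = [("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P4", "P1")]
```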

 

Fig. 7 One measurement cycle showing the sequential switching of the lasers and measuring of the incoming light signals.


As an object we chose a round wooden stick of 20 mm diameter, similar in size to a human finger. The starting position was set to x = 10 mm and y = 10 mm from the bottom edge, and the object was shifted on a regular 20x20 mm grid. For each position we collected the photodiode signals as described before. The position-finding algorithm is the same as the one used in the simulation of the detector configuration, with the measured background amplitude Rwo replacing the simulated background amplitude Swo. The shadow centre SCr of the real object on each detector stripe is then obtained from:

SCr = [Σ (d · P# · RIi)] / [Σ RIi]

RIi contains the background measurement and is calculated as follows:

RIi = 1 − (Rw / Rwo)

where Rw is the measured amplitude of the photodiode with the inserted object and Rwo the amplitude for the background measurement (100% attained light without the inserted object).

In Fig. 8 the corresponding x-y plot is shown. During the measurement we observed the most noticeable deviations along the edges of the device rather than along the diagonals, as found in the simulations. The main reason is most likely the relatively low light intensity of the lasers in combination with the small shadow width when the object is placed near the fluorescent stripe at the edge of the device. In addition, a stronger deviation of the shadow position appears near the corners, where the laser mountings introduce a large gap between the fluorescent stripes. In these cases a single measurement of an object's position near an edge is less precise than the detection of an object located further away from the edges. These drawbacks will be addressed in the next generation of position detectors based on this technology, in particular by decreasing the distance between the photodiodes and the fluorescent stripe detectors in the corners of the device, which will significantly increase the detector sensitivity. Additionally, an improved light-distribution characteristic of the line lens can provide a more homogeneous signal level along the sides, together with a better calculation algorithm and an optimized synchronization of the lasers and the readout electronics. The individual positions can also be averaged, which will lead to a much higher accuracy of the final position of the object.

 

Fig. 8 Final retrieval of the positions (left). The grey dots represent the real and the blue dots the recovered positions. The deviation between the real and the recovered positions is shown on the right side.


The deviation of the device averages 18.6 mm in total and is thus within the 20 mm diameter of the object. Relative to the dimensions of the device this corresponds to a mean deviation of about 3%. The main advantage of the detector is its high detection speed of approximately 100 Hz, allowing simultaneous input and output actions at video speed. A short video showing the detection system in use is presented in Fig. 9.

 

Fig. 9 A single-frame excerpt from a short video file presenting the object detection system in use (Media 1). The device shown in this video operates already with an enhanced calculation algorithm, capable of detecting two (or more) objects simultaneously. After the localization the positions of the objects are projected onto the white screen via a video projector.


6. Conclusion

A large-area device was fabricated, providing a fast and simple solution for object detection over large areas, with a mean accuracy on the order of the object size. Using simple and low-cost materials, we showed that object detection can be realized without any transmitters placed on the object. Together with advanced calculation algorithms we plan to achieve detection of multiple objects, even in 3-dimensional space. This will allow versatile applications such as security supervision and the fast-growing field of input devices for computers, presentations and gaming. With more refined readout electronics, a stable readout of considerably smaller signals can be achieved as well, allowing much larger frame sizes.

Acknowledgment

The authors would like to thank the Austrian Science Foundation (FWF), the Austrian Forschungsförderungsgesellschaft (FFG) within the FIT-IT project 825835, the ERC Advanced Grant SoftMap and the Austrian Centre of Competence in Mechatronics (ACCM) for supporting this work.

References and links

1. A. Klapproth and S. Knauth, "Indoor Localisation - Technologies and Applications," in Proceedings of the embedded world Conference, Nürnberg (2007). http://www.ihomelab.ch/ihomelab2/index.php?option=com_content&view=category&layout=blog&id=48&Itemid=71&lang=de

2. J. Pugh, X. Raemy, C. Favre, R. Falconi, and A. Martinoli, "A Fast On-Board Relative Positioning Module for Multi-Robot Systems," IEEE Trans. Mechatronics 14(2), 151–162 (2009).

3. S. Bergbreiter, A. Mehta, and K. S. J. Pister, "PhotoBeacon: Design of an Optical System for Localization and Communication in Multi-Robot Systems," RoboComm 2007, Athens, October 15–17 (2007).

4. H. M. Khoury and V. R. Kamat, "Evaluation of position tracking technologies for user localization in indoor construction environments," Autom. Construct. 18(4), 444–457 (2009).

5. H. Koyuncu and S. H. Yang, "A Survey of Indoor Positioning and Object Locating Systems," Int. J. Comp. Sci. Netw. Security 10(5), May (2010).

6. J. Chrásková, Y. Kaminsky, and I. Krekule, "An automatic 3D tracking system with a PC and a single TV camera," J. Neurosci. Methods 88(2), 195–200 (1999).

7. Y. Pang, Q. Huang, X. Quan, J. Zheng, and X. Wu, "Fast Object Location and Tracing Using Two CCD Cameras and Laser Range Finder," in Proceedings of the 2005 IEEE International Conference on Information Acquisition, Hong Kong and Macau, China, June 27 – July 3 (2005).

8. G. Kaniak and H. Schweinzer, "A 3D Airborne Ultrasound Sensor for High-Precision Location Data Estimation and Conjunction," I2MTC 2008 – IEEE Instrumentation and Measurement Technology Conference, Victoria, Canada, May 12–15 (2008).

9. H. H. Cheng, B. D. Shaw, J. Palen, J. E. Larson, X. Hu, and K. Van Katwyk, "A Real-Time Laser-Based Detection System for Measurement of Delineations of Moving Vehicles," IEEE/ASME Trans. Mechatron. 6(2), 170–187 (2001).

10. E. Musayev, "Laser-based large detection area speed measurement methods and systems," Opt. Lasers Eng. 45(11), 1049–1054 (2007).

11. E. Musa, "Laser-based light barrier," Appl. Opt. 47(19), 3415–3422 (2008).

12. E. Musa and M. Demirer, "Laser-based light barrier having a rectangular detection area," Opt. Lasers Eng. 48(4), 435–440 (2010).

13. P. Bartu, R. Koeppe, A. Neulinger, S. Isikatanlar, and S. Bauer, "Flexible large area photodetectors for human machine interfaces," Procedia Engineering 5, 295–298 (2010).

14. R. Koeppe, A. Neulinger, P. Bartu, and S. Bauer, "Video-speed detection of the absolute position of a light point on a large-area photodetector based on luminescent waveguides," Opt. Express 18(3), 2209–2218 (2010).

15. http://www.certified-laser-eyewear.com/safety-eu-aus/

References

  • View by:
  • |
  • |
  • |

  1. A. Klapproth and S. Knauth, “Indoor Localisation - Technologies and Applications,” in Technologies and Applications Proceedings of the embedded world (2007) Conference, Nürnberg. http://www.ihomelab.ch/ihomelab2/index.php?option=com_content&view=category&layout=blog&id=48&Itemid=71&lang=de
  2. J. Pugh, X. Raemy, C. Favre, R. Falconi, and A. Martinoli, “A Fast On-Board Relative Positioning Module for Multi-Robot Systems,” IEEE Trans. Mechatronics14(2), 151–162 (2009).
    [CrossRef]
  3. S. Bergbreiter, A. Mehta, and K. S. J. Pister, “PhotoBeacon: Design of an Optical System for Localization and Communication in Multi-Robot Systems,” RoboComm 2007, Athens, October 15–17 (2007).
  4. H. M. Khoury and V. R. Kamat, “Evaluation of position tracking technologies for user localization in indoor construction environments,” Autom. Construct.18(4), 444–457 (2009).
    [CrossRef]
  5. H. Koyuncu and S. H. Yang, “A Survey of Indoor Positioning and Object Locating Systems,” Int. J. Comp. Sci. Netw. Security10, 5, May (2010).
  6. J. Chrásková, Y. Kaminsky, and I. Krekule, “An automatic 3D tracking system with a PC and a single TV camera,” J. Neurosci. Methods88(2), 195–200 (1999).
    [CrossRef] [PubMed]
  7. Y. Pang, Q. Huang, X. Quan, J. Zheng, and X. Wu, “Fast Object Location and Tracing Using Two CCD Cameras and Laser Range Finder,” in Proceedings of the 2005 IEEE International Conference on Information Acquisition, Hong Kong and Macau, China, June 27 – July 3 (2005).
  8. G. Kaniak and H. Schweinzer, “A 3D Airborne Ultrasound Sensor for High-Precision Location Data Estimation and Conjunction,” 2MTC 2008 – IEEE Instrumentation and Measurement Technology Conference, Victoria, Canada, May 12–15 (2008).
  9. H. H. Cheng, B. D. Shaw, J. Palen, J. E. Larson, X. Hu, and K. Van katwyk, “A Real-Time Laser-Based Detection System for Measurement of Delineations of Moving Vehicles,” IEEE/ASME Trans. Mechatron.6(2), 170–187 (2001).
    [CrossRef]
  10. E. Musayev, “Laser-based large detection area speed measurement methods and systems,” Opt. Lasers Eng.45(11), 1049–1054 (2007).
    [CrossRef]
  11. E. Musa, “Laser-based light barrier,” Appl. Opt.47(19), 3415–3422 (2008).
    [CrossRef] [PubMed]
  12. E. Musa and M. Demirer, “Laser-based light barrier having a rectangular detection area,” Opt. Lasers Eng.48(4), 435–440 (2010).
    [CrossRef]
  13. P. Bartu, R. Koeppe, A. Neulinger, S. Isikatanlar, and S. Bauer, “Flexible large area photodetectors for human machine interfaces,” Procedia Engineering5, 295–298 (2010).
    [CrossRef]
  14. R. Koeppe, A. Neulinger, P. Bartu, and S. Bauer, “Video-speed detection of the absolute position of a light point on a large-area photodetector based on luminescent waveguides,” Opt. Express18(3), 2209–2218 (2010).
    [CrossRef] [PubMed]
  15. http://www.certified-laser-eyewear.com/safety-eu-aus/


Supplementary Material (1)

Media 1: MOV (3908 KB)



Figures (9)

Fig. 1

Scheme of the two-dimensional large-area object detection system. An object entering the light curtain casts shadows, which are detected with stripe detectors at the edges of the device.

Fig. 2

Schematic drawing of the absorption of an incident light beam within the luminescent waveguide. The luminescent light propagates within the planar waveguide to the attached silicon photodiode.

Fig. 3

(a) The silicon photodiodes are affixed at the edges of the luminescent foil; each photodiode is connected separately to a transimpedance amplifier circuit. (b) Cross-section of the luminescent foil mounted within the aluminium bars forming the frame of the light curtain detector. The routed opening for the linear laser was set to 5 mm.

Fig. 4

Photograph showing the principle of object detection with a light curtain and luminescent-concentrator linear PSDs.

Fig. 5

Schematic drawing of the casting of shadows (the active laser is marked in blue). The shadows yield a decrease of light intensity, which is detected on the respective fluorescent photo receivers (P1 to P4) placed at the edges of the device. The photodiodes needed for the calculation of the shadow centre are numbered in clockwise order.

Fig. 6

Final x-y positions obtained during the simulation (left). The grey dots represent the real and the purple dots the simulated positions. On the right, a 3D plot shows the deviation between the real and the calculated positions in the simulation.

Fig. 7

One measurement cycle, showing the sequential switching of the lasers and the measurement of the incoming light signals.
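The time-multiplexed cycle of Fig. 7 can be sketched as follows; this is our own illustrative pseudocode-style sketch, not the authors' implementation, and `set_laser` and `read_photodiodes` are hypothetical driver calls standing in for the actual laser-switching and ADC hardware:

```python
def measurement_cycle(set_laser, read_photodiodes, n_lasers=2):
    """One measurement cycle: switch each laser on in turn and
    sample all fluorescent photo receivers while it is active.

    set_laser(i, on)   -- assumed driver call switching laser i on/off
    read_photodiodes() -- assumed ADC call returning one value per diode
    """
    frames = []
    for i in range(n_lasers):
        set_laser(i, True)             # activate exactly one laser
        frames.append(read_photodiodes())
        set_laser(i, False)            # switch it off before the next one
    return frames                      # one photodiode frame per laser
```

Activating only one laser per sample means each recorded frame can be unambiguously attributed to a single shadow direction, which is what makes the later triangulation possible.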

Fig. 8

Final retrieval of the positions (left). The grey dots represent the real and the blue dots the recovered positions. The deviations between the real and the recovered positions are shown on the right.

Fig. 9

A single-frame excerpt from a short video presenting the object detection system in use (Media 1). The device shown in the video already operates with an enhanced calculation algorithm capable of detecting two (or more) objects simultaneously. After localization, the positions of the objects are projected onto the white screen via a video projector.

Equations (7)

\( d_{\min} = ( 2b + m ) \)
\( \frac{40\,\mathrm{mW} \cdot 7\,\mathrm{mm}}{157\,\mathrm{mm} \cdot 4 \cdot 2} \approx 0.22\,\mathrm{mW} \)
\( I(x) = A\, e^{ -\alpha x } \)
\( SC_s = \frac{ \sum \left( d_{P\#} \cdot SI_i \right) }{ \sum SI_i } \)
\( SI_i = 1 - \frac{ S_w }{ S_{wo} } \)
\( SC_r = \frac{ \sum \left( d_{P\#} \cdot RI_i \right) }{ \sum RI_i } \)
\( RI_i = 1 - \frac{ R_w }{ R_{wo} } \)
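The shadow-centre equations above amount to an intensity-weighted average: each photodiode's relative signal drop \( SI_i = 1 - S_w/S_{wo} \) weights its position \( d_{P\#} \). A minimal sketch of that computation (our own illustration, not the authors' code; the function and variable names are assumptions):

```python
def shadow_centre(d_p, with_shadow, without_shadow):
    """Intensity-weighted centre of a shadow on a strip of photodiodes.

    d_p            -- positions d_P# of the photodiodes along the edge
    with_shadow    -- signals S_w measured while the shadow is cast
    without_shadow -- reference signals S_wo without any object
    """
    # Relative intensity drop SI_i = 1 - S_w / S_wo on each diode.
    si = [1.0 - w / wo for w, wo in zip(with_shadow, without_shadow)]
    total = sum(si)
    if total == 0.0:
        return None  # no shadow detected on this edge
    # SC = sum(d_P# * SI_i) / sum(SI_i): the SI-weighted mean position.
    return sum(d * s for d, s in zip(d_p, si)) / total

# Example: the diode at position 2 is dimmed the most, so the
# recovered shadow centre lies between positions 1 and 2 (≈ 1.75).
centre = shadow_centre([0.0, 1.0, 2.0, 3.0],
                       [1.0, 0.8, 0.4, 1.0],
                       [1.0, 1.0, 1.0, 1.0])
```

The same formula applies on the receiver side with \( RI_i \) in place of \( SI_i \), giving the second coordinate needed for triangulating the object position.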
