Abstract

Measuring the shape (coordinates x, y, z ) and spectral characteristics (wavelength-dependent reflectance R (λi)) of macroscopic objects as a function of time (t) is of great interest in areas such as medical imaging, precision agriculture, or optical sorting. Here, we present an approach that allows to determine all these quantities with high resolution and accuracy, enabling measurement in five dimensions. We call this approach 5D hyperspectral imaging. We describe the design and implementation of a 5D sensor operating in the visible to near-infrared spectral range, which provides excellent spatial and spectral resolution, great depth accuracy, and high frame rates. The results of various experiments strongly indicate the great benefit of the new technology.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Hyperspectral imaging is a powerful technology for capturing our physical environment by detecting an object or scene in n (typically a dozen to several hundred) narrow wavelength ranges over a continuous spectrum [1]. Hyperspectral cameras generate a set of data points

pk=[xk,ykIk(λ1,,λn)]
which provide the recorded wavelength-dependent radiation intensity I(λi) depending on the two-dimensional coordinates x and y. Originally developed for remote sensing and astronomy [2], hyperspectral imaging systems are constantly opening up new application areas. The analysis of artistic and cultural objects [3], quality control of foodstuffs [4], determination of the state of health of plants [5], and a variety of medical studies [4, 6, 7] are just a few examples.

As the surface topography of the measurement objects has a significant influence on the spectral information obtained, it is very beneficial to combine the spectral information with three-dimensional (3D) surface models [8–10], resulting in data points

pk=[xk,yk,zk,Ik(λ1,,λn)].
Such hyperspectral 3D data can be acquired by using a single sensor [11–14] or two different sensors which generate the 3D surface model and the hypercube separately [9, 15, 16]. Yet, state-of-the-art systems rely on textured objects for 3D reconstruction and/or do not combine high accuracy, spatial and spectral resolution, and measurement speed. The approach introduced here, for the first time, exhibits all these features, enabling 5D hyperspectral imaging:
pk5D=[xk,yk,zk,t,Ik(λ1,,λn)].
It is suitable for the reconstruction of close-range objects and is based on two hyperspectral snapshot cameras [10, 17, 18] and a special broadband pattern projector [19].

As hyperspectral snapshot cameras generate data points pk (see Eq. (1)) within a single acquisition, they allow the detection of dynamic scenes [20–22]. In order to use them to measure the 3D shape of even unstructured or untextured surfaces with high accuracy and robustness, we employ the method of active close-range stereophotogrammetry [23–27]. The depth impression of an observed scene is deduced from the so-called disparity between the images of two cameras, i.e., the spatial displacement of image points of the same object point. Such corresponding points are reliably and accurately detected by illuminating the object with a series of patterns. For measurements at high frame rates, this necessitates a broadband high-speed pattern projector.

The method of GOBO projection of aperiodic sinusoidal patterns introduced by us [19] meets exactly these requirements. As a GOBO projector basically consists of a radiation source, a rotating slide (the so-called GOBO = GOes Before Optics), and an imaging lens, pattern generation is almost wavelength independent. Depending on the more or less arbitrarily interchangeable source of radiation, varying aperiodic sinusoidal patterns can be projected in a wide spectral range at frame rates of up to several kilohertz. The combination of GOBO projection and hyperspectral snapshot cameras is therefore the ideal approach for 5D hyperspectral imaging. It allows for the fast and accurate shape reconstruction of even texture-less object surfaces with comparatively low computational effort and feasible equipment. Reflectance spectra are acquired directly, making detailed knowledge of the scene and time-consuming computations unnecessary. In addition, our approach offers the possibility to perform the 3D reconstruction separately in different wavelength ranges.

2. Hyperspectral snapshot cameras

Figure 1(a) illustrates the basic sensor design of a hyperspectral snapshot camera. Based on the approach of a Bayer filter mosaic, pixel-by-pixel Fabry-Pérot interference (FPI) filters are monolithically integrated onto a conventional CMOS sensor. Depending on the number of spectral channels, they result in a per-pixel filter mosaic of different size which extends periodically over the entire detector [20, 28]. Each filter consists of two semi-transparent mirrors which form an optical resonator (see Fig. 1(b)). The reflectance of the mirrors determines the full width at half maximum and the distance between the mirrors determines the central transmission wavelengths [29, 30].

 figure: Fig. 1

Fig. 1 Operating principle of a hyperspectral snapshot camera. (a) Schematic representation of a hyperspectral, 5 × 5 tessellated snapshot sensor. (b) Optical path in the Fabry-Pérot interferometer.

Download Full Size | PPT Slide | PDF

The transmittance of an FPI surrounded by air (without taking into account any absorption) is

T=(1R1)(1R2)(1R1R2)2+4R1R2sin2Δφ2
withΔφ=4πndcosϑλandnsinϑ=sinϑ0
with the reflectivity R1 and R2 of the two semi-transparent mirrors, the refractive index n of the intermediate material, the distance d between the mirrors, and the incidence angle ϑ0 of the electromagnetic radiation (see Fig. 1(b)) [29,30]. Thus, the transmittance maxima of such an FPI are at
λk=2ndcosϑkwithk=1,2,3,.
Due to the fact that an FPI filter transmits light of harmonic wavelengths that disturb the spectral observation, band-pass or edge filters have to be placed in front of the camera chip or in the lens to select a single wavelength per pixel. Each channel i is characterized by its quantum efficiency ηi and the transmittance TFPI,i of the corresponding FPI. If their product Qi = ηiTFPI,i and the transmittance Tfilter of the band-pass/edge filter in front of the sensor are known, the detected gray value gi in channel i is given by (without taking noise or rounding to the next integer into account)
gi=Ktexp=:K0Robj(λ)Ee(λ)Tlens(λ)Tfilter(λ)Qi(λ)=:Fi(λ)dλwithi=1,,n
with the overall system gain K, the exposure time texp, the wavelength-dependent reflectance Robj of the measurement object, the spectral irradiance Ee of the light source, and the transmittance Tlens of the objective lens.

Due to crosstalk, channel i does not only cover the object reflectance between λia and λib (with λia=0, λib=λi+1a, and λib). Instead, the combined quantum efficiency is Qi ≠ 0 even outside this wavelength range. In order to determine the object’s reflectance spectrum nonetheless, the integral in Eq. (7) is split into the n wavelength ranges covered by the n channels:

gi=Kj=1nλjaλjbRobj(λ)Fi(λ)dλ.
Within each wavelength range, Robj is assumed to be constant, i.e.,
gi=Kj=1nRobj,jλjaλjbFi(λ)dλ.
Equation (9) defines a linear system of equations that allows to determine the n unknowns Robj; j up to a factor K′.

In order to achieve an external consistency of the hypercube with data sets from other hyperspectral imaging systems, the values Robj=KRobj determined by the sensor need to be transformed into a suitable physical quantity Robj. For our application of the hyperspectral surface detection of non-emitting objects in the near field, a two-point calibration and reflection transformation by means of a so-called Spectralon panel is suitable. The plane surface of the calibration standard, diffusely reflecting with a reflectance of more than 98 % over a wavelength range from 400 to 1 000 nm, must be recorded under constant ambient and irradiation conditions. In order to compute the reflectance Robj of a measurement object, the values Robj of the captured scene, the values Rdark corresponding to a dark scene, and a recording of the Spectralon panel Rspec are required [31]:

Robj=RobjRdarkRspecRdark.

3. Structured light 3D sensors

The 3D coordinates of a point on the object surface are determined by triangulating corresponding camera pixels in the stereo image pair, i.e., image points of the same object point. For the correct matching of corresponding pixels, the machine processing unit of the measurement system requires pixel-wise unique features. In order to provide these features over a broad spectral range, a GOBO projector can be applied to encode the object surface with aperiodic sinusoidal patterns [19].

The epipolar geometry is used to restrict the search range of the correspondence analysis in the right camera image (see Fig. 2(a)) [25, 27]. The epipolar lines l1 and l2 represent the intersection lines between the image planes of the cameras and the epipolar plane. The epipolar plane is defined by the observed object point P and the projection centers C1 and C2 of both cameras. Thus, the corresponding pixel p2 of a certain image point p1 is always on the epipolar line l2. Based on this approach, the rectification of the stereo images makes it possible to further simplify the correspondence search. For this purpose, the image planes of both cameras are transformed into a common plane parallel to the baseline so that corresponding epipolar lines are on the same image row. Thus, the corresponding pixel p2 needs to be searched for solely on the image row on which point p1 is located.

 figure: Fig. 2

Fig. 2 (a) Illustration of a stereo camera system and its epipolar geometry. (b) Camera images of an aperiodic sinusoidal pattern projected onto a measurement object, making the detection of corresponding image points p1 and p2 possible.

Download Full Size | PPT Slide | PDF

The GOBO-projected patterns allow the corresponding pixel on the rectified epipolar line to be identified unambiguously even for texture-less objects (see Fig. 2(b)). The rotating GOBO slide is used to project N (typically N = 10) varying aperiodic sinusoidal patterns, which are as perpendicular as possible to the epipolar lines, onto the measurement object surface:

Ikproj(x,y)=ak(x)+bk(x)sin[ck(x)x+dk(x)]withk=1,,N
with the parameters a (offset), b (amplitude), c (directly related to the period length), and d (phase shift), which are spatially and temporally variable [32]. In this way, unique gray value sequences can be determined for each pixel of an image line.

Corresponding pixels are detected using the normalized cross-correlation. The correlation coefficient ρ between a pixel in camera 1 with successively detected gray values g1(1), …, gN(1) and a pixel in camera 2 with gray values g1(2), …, gN(2) is given by

ρ=k=1N[gk(1)g(1)¯][gk(2)g(2)¯]k=1N[gk(1)g(1)¯]2k=1N[gk(2)g(2)¯]2
with the temporal mean values
g(1)¯=1Nk=1Ngk(1)andg(2)¯=1Nk=1Ngk(2).
Corresponding points have a maximum correlation coefficient. By interpolating the intensity values of adjacent pixels on the epipolar in each of the N images, corresponding points can be determined with a subpixel accuracy of up to 1/30 px. Using the corresponding points and having knowledge of the parameters of the stereo camera system, the 3D coordinates of the points in the world coordinate system can be calculated [26,27].

The stereo calibration of the described 3D sensor is a necessary condition for the correct generation of 3D surface models. The aim of the calibration is to determine the intrinsic and extrinsic parameters of the cameras using the pinhole camera model. For this purpose, stereoscopic images of a calibration standard at several different positions in the measurement volume are acquired, e.g., of a flat board with a chessboard pattern of known size. The detection of corresponding chessboard intersection points in both cameras allows the determination of the intrinsic and extrinsic camera parameters [26, 33].

The extrinsic parameters describe the relative orientation of both camera coordinate systems in terms of a translation between the two projection centers C1 and C2 and a rotation. The intrinsic parameters include the camera constant (i.e., the distance between camera center and image plane), the coordinates of the principal point p0 (i.e., the intersection of the optical axis of the lens with the image plane), and the image distortion (e.g., radial and tangential distortion).

The calibration determined by means of the full resolution of the hyperspectral cameras must be converted to the images of the individual spectral channels, as these have both a lower resolution and a specific displacement. A suitable approach is to adapt the rectification maps of the two hyperspectral cameras, which have been determined by the stereo calibration. The rectification map indicates the position in the original image from which the information of a pixel in the rectified image originates. For a hyperspectral camera with a 5 × 5 filter mosaic, the rectification map R(i, j) of the channel (i, j) (with i, j = 0, …, 4) can be calculated from the full-resolution rectification map Rfull according to Eq. (14) (see Fig. 3 for a better understanding):

R(i,j)=[Rfull(i,j)]/5.
Using the modified maps, rectified images in full resolution can be generated from the lower-resolution spectral band images in a common coordinate system.

 figure: Fig. 3

Fig. 3 Location of the principal point p0 in the coordinate system of a spectral channel: for channel (0, 1) with its virtual pixels marked with orange lines, the principal point has the coordinates p0(0,1)=[(x0,y0)(0,1)]/5. In general, it is p0(i,j)=[(x0,y0)(0,1)]/5.

Download Full Size | PPT Slide | PDF

For each channel, the same camera constant κ of the rectified system (in pixel units), coordinates cx1 and cy1 of the (rectified) principal point in camera 1, and x -coordinate cx2 of the (rectified) principal point in camera 2 can be used for reconstruction. An object point P, defined by corresponding points p1=(x1,y) and p2=(x2,y)=(x1disp,y), is then given in homogeneous coordinates by

Q(x1ydisp1)withQ=[100cx1010cy1000κ001/l(cx1cx2)/l].
By carrying out this calculation for all detected corresponding points, the entire 3D point cloud is reconstructed.

4. Experimental setup

Our first 5D sensor (see Fig. 4(a)) consists of a halogen lamp-based GOBO projector and two XIMEA “MQ022HG-IM-SM5X5-NIR” hyperspectral snapshot cameras providing 5 × 5 different spectral channels in the visible to near-infrared range (VIS-NIR). As the cameras have an (effective) resolution of 2045 × 1080 px, each of the 25 images of the individual spectral channels that can be extracted has a resolution of 409 × 216 px. Due to the pixel size of 5.5 µm × 5.5 µm, the pixel pitch of the spectral channels is 27.5 µm. The full width at half maximum of the spectral bands varies between 10 and 15 nm.

 figure: Fig. 4

Fig. 4 Measurement setup. (a) Photograph of the developed 5D sensor comprising two hyperspectral snapshot cameras, a GOBO projector illuminating the measurement object with varying aperiodic sinusoidal patterns, and a projector homogeneously illuminating the scene. (b) GOBO wheel consisting of a borosilicate glass plate with a chromium coating from which aperiodic strips have been etched.

Download Full Size | PPT Slide | PDF

As an FPI transmits light of harmonic wavelengths that disturb the spectral observation, the active range of the hyperspectral cameras needs to be constrained by optical filters. A band-pass filter in front of the sensor limits its response to wavelengths between 600 and 1 000 nm. Remaining unwanted harmonic wavelengths of the FPIs are eliminated by either an additional short-pass or long-pass filter in the camera lens. The combination of a band-pass and an edge filter limits the detectable spectral range, resulting in two operation modes:

  • (OM1) 600 to 875 nm and
  • (OM2) 675 to 975 nm.
Figure 5 shows the combined transmittance Tfilter(λ) of the band-pass/edge filter in front of the sensor in the operation mode (OM1) and the combined quantum efficiency Qi(λ) of one of the cameras.

 figure: Fig. 5

Fig. 5 Spectral properties of one of the applied hyperspectral snapshot cameras: combined quantum efficiency Qi of the 5 × 5 different channels and combined transmittance Tfilter of the band-pass/edge filter in front of the sensor in the operation mode (OM1), i.e., with a spectral response between 600 and 875 nm.

Download Full Size | PPT Slide | PDF

In the experimental setup, both hyperspectral cameras are mounted on a bar at a fixed distance of 350 mm from each other. At a working distance of 550 mm, they capture a measurement field of approximately 170 × 85 mm2. Between the two cameras, a GOBO projector [19] containing a halogen lamp as a source of radiation is integrated into the system. Over the entire spectral range between 600 and 975 nm, the halogen lamp produces a continuous spectrum and provides an irradiance in the measurement field that is sufficiently high for the cameras. A light funnel made of mirrors with a silver coating homogeneously guides the emitted electromagnetic radiation to a square cut-out of the rotating slide. This slide consists of a circular heat-resistant borosilicate glass, which has a chromium coating comprising radial strips of varying width (see Fig. 4(b)).

The GOBO projector is used to illuminate the scene with a series of aperiodic sinusoidal patterns [32] which allow for a robust and accurate 3D reconstruction as described in Sec. 3. The average strip width must be adjusted to match the magnification of the projection lens and the resolution of the hyperspectral cameras. As 3D reconstruction can either be performed using the full resolution images of the hyperspectral snapshot cameras (2045 × 1080 px) or by extracting low-resolution images from each of the 25 spectral channels (409 × 216 px), two GOBO wheels with different average strip widths have been manufactured. Depending on the desired application, either one of the two is installed in the sensor. It is uniformly rotated at a speed that depends on the frame rate and exposure time of both cameras.

Additionally, a second projector of the same design, but without GOBO slide, is installed in the sensor (see Fig. 4(a)). By using suitable filters in the projection lenses (e.g., a 650-nm short-pass filter and a 650-nm long-pass filter, respectively), it is possible to identify the aperiodic sinusoidal patterns with some spectral channels of the cameras, while the remaining channels detect the homogeneously illuminated measurement object. In this way, both the 3D reconstruction (in one or very few channels) and the recording of the spectral reflection data (in many channels) is possible within a single measurement, similar to the approach of Ozawa et al. [34].

Altogether, the 5D sensor can be operated in four different modes so that we obtain at any point in time either

  • (OM1a) 25 different 3D point clouds (600 875 nm),
  • (OM1b) 6 different 3D point clouds (600 650 nm) + 19 mapped textures (650 875 nm),
  • (OM2a) 25 different 3D point clouds (675 975 nm), or
  • (OM2b) 3 different 3D point clouds (675 700 nm) + 22 mapped textures (700 975 nm).
Figure 6 illustrates the four operation modes.

 figure: Fig. 6

Fig. 6 Operation modes of the 5D sensor. Depending on the filters mounted in front of the objective lenses of the hyperspectral cameras (HSC1 and HSC2), the projector for homogeneous illumination (HI), and the GOBO projector for aperiodic sinusoidal patterns (ASP), the sensor can be operated in four different modes.

Download Full Size | PPT Slide | PDF

5. Results

In order to demonstrate the benefits of our new approach to 5D hyperspectral imaging, we conducted various experiments using the developed prototype.

5.1. Characterization of spectral response and 3D performance

We examined the 3D performance of the sensor as well as the spectral behavior of the cameras. As can be seen from Fig. 5, the signal of a single channel can be affected by both the crosstalk between channels and the harmonics of the FPI. Therefore, in order to determine a consistent object reflectance Robj(λ), the impact of the other channels must be eliminated and the transformation into the physical quantity Robj must be calibrated as described in Sec. 2.

In general, the accumulated energy differs significantly from channel to channel (see Fig. 5). Furthermore, channels with λi < 675 nm detect large amounts of higher-wavelength radiation. For those channels, the ratio of accumulated energy in the wavelength range of interest to the total detected energy is less than 40 %. This means that 60 % or more of the information in these channels originates from a spectral range that is actually of no interest. These channels are of limited suitability for experiments where 3D reconstruction is performed in single channels. For instance, in the operation mode (OM1), the reliable range is between 675 and 850 nm instead of the full range (600 875 nm).

The 3D performance of the system has been evaluated according to the guidelines of the VDI/VDE 2634 Part 2 [35]. The cuboid measurement volume was defined as 170 × 85 × 85 mm3 (width × height × depth). In order to determine the expected measurement inaccuracies, we generated and evaluated 3D surface models of four different test specimens in the spectral range between 600 and 875 nm (operation mode (OM1a)). For each single view of a measurement object, one 3D surface model can be generated from the N = 10 full-resolution images of all spectral bands of the sensor, or up to 25 3D surface models from the low-resolution images of each spectral channel. Depending on the object reflectance and thus the camera exposure time, images can be recorded at the cameras’ maximum frame rate of 170 Hz at 8-bit resolution. Therefore, up to 17 hyperspectral 3D models per second can be generated with the current setup. The average distance of the 3D points is about 75 µm (full resolution) or 375 µm (channel resolution) in x and y direction.

One of the parameters used to characterize the 3D performance is the flatness measurement error F = max dk − min dk. When measuring a flat object, it describes the range of the measured points’ deviation dk from a fitted plane. In the entire measurement volume and for each spectral channel, the flatness measurement error was below 0.7 mm. In the reliable range, it was smaller than 0.5 mm. The 3D points’ standard deviation from the fitted plane was between 25 and 100 µm for each spectral channel.

Another important quantity we have determined is the 3D point error when measuring a sphere. We used a spherical specimen with a diameter of 38 mm, which we placed at nine different positions evenly distributed in the measurement volume. For each spectral channel, the standard deviation of the measured points from a fitted sphere ranged from 20 to 80 µm. Again, channels in the reliable range (675 to 850 nm) allowed for the highest accuracy with standard deviations between 20 and 40 µm.

5.2. Measurement of a historical globe

5D hyperspectral imaging offers a promising non-invasive option for the analysis and classification of historical measurement objects [3, 36, 37]. In order to demonstrate the suitability of our sensor for the digital documentation of art and cultural objects, we measured a historical globe using the two operation modes (OM1b) and (OM2b). The examined globe is a relief globe from 1885, which was created by the geographic-artistic institution of Ludwig Julius Heymann in Berlin (see Fig. 7(a)).

 figure: Fig. 7

Fig. 7 Hyperspectral 3D measurement of a historical globe. (a) Photograph of the globe with marked measurement area. (b) 3D surface model with artificial blue shading. (c) 3D surface model with mapped intensities from selected spectral channels, colored according to their wavelength.

Download Full Size | PPT Slide | PDF

As can be seen in Fig. 7(b), the relief structure of the globe can be very well documented using our 5D sensor. In addition, the transfer of the texture recorded in each spectral channel to the reconstructed models enables a detailed visualization of the surface properties and spectral signatures of the measurement object. Figure 7(c) exemplarily shows four such models in false-color representation, colored according to their wavelength. In general, this allows, e.g., to classify objects made of different materials.

5.3. Measurement of a human hand

By creating 5D models of a human hand, it is possible to demonstrate the simplified detection of veins in the NIR compared to the visible spectral range. In particular, images in the spectral range between 800 and 850 nm are well suited, regardless of skin tone, degree of dehydration, fat content, or body hair. They provide, e.g., a reliable way to determine which veins are suitable for an infusion [38, 39].

Using our sensor, we measured the left hand of a 26-year-old male subject and created hyperspectral 3D models in the 600-875-nm and 675-975-nm configuration (operation modes (OM1b) and (OM2b)). Due to the necessary conversion of the sensor between the two acquisitions by exchanging the optical filters in the lenses of the hyperspectral cameras, the position and condition of the hand do not exactly match. Figure 8(a) shows the 3D model recorded and reconstructed in the range between 600 and 875 nm, Fig. 8(b) shows the one in the range between 675 and 975 nm. On the left, the 3D surface models (with artificial blue shading) are shown. On the right, assigned to the respective wavelengths, exemplary 3D models with mapped texture of a single spectral channel are shown.

 figure: Fig. 8

Fig. 8 Hyperspectral 3D measurement of a human hand. 3D surface model without (left) and with mapped intensities from selected channels in the (a) 600-875-nm and (b) 675-975-nm camera configuration.

Download Full Size | PPT Slide | PDF

In all the images, the veins of the test person are very clearly visible, in Fig. 8(a) even more pronounced than in Fig. 8(b). The reason for this is that the measurement was carried out in the 600-875-nm configuration first, immediately after stimulation of blood flow in the hand by light physical activity. The subsequent measurement in the 675-975-nm configuration was performed after the sensor had been converted for a few minutes, resulting in a slight decrease in blood flow. This can also be seen from the 3D surface models on the left.

5.4. Determination of leaf water content

Another interesting field of application for the 5D sensor is plant phenotyping. For instance, VIS/NIR hyperspectral images can be used for the evaluation of fruit and vegetable quality [40] or for the determination of leaf water content [41]. For instance, the water content of leaves of Mediterranean plants can be deduced from the reflection spectrum in the NIR region (between 930 and 980 nm, depending on the specific plant) [42]. In general, a dry plant leaf reflects electromagnetic radiation in the NIR much stronger than a healthy plant leaf containing a sufficient amount of water.

In our experiment, we used our sensor in the spectral range from 675 to 975 nm (operation mode (OM2b)) to measure a citrus plant at regular time intervals during water absorption. In the initial state (t = 0 min), the plant was very dry after two weeks without water supply. Immediately after adding water, a measurement was taken every ten minutes until 210 minutes had passed.

Figure 9(a) shows both the condition of the plant at time t = 0 min, 90 min, and 210 min documented by a conventional digital camera and the area observed by the 5D sensor. The corresponding 3D surface models are shown in Fig. 9(b). They illustrate the process of water absorption by the plant whose leaves unfold. In the orange marked regions, we have also determined the wavelength-dependent reflectance. In order to obtain a temporally and spatially consistent reflectance spectrum, we used the 3D point cloud to estimate the surface normals and perform shading correction, assuming Lambertian reflection [43, 44]. The resulting reflectance spectrum is plotted in Fig. 9(c). It shows similar characteristics as in comparable investigations [45–47]. In particular, the drop in the leaf’s reflectance in the range from 950 to 975 nm due to water absorption is confirmed [42].

 figure: Fig. 9

Fig. 9 Water absorption by a citrus plant. (a) Color images of the plant after supplying water. (b) 3D surface models. (c) Reflectance spectrum.

Download Full Size | PPT Slide | PDF

5.5. Investigation of translucent objects

In addition to the determination of the reflectance spectrum, our 5D sensor is also suitable for the investigation of other wavelength-dependent parameters, e.g., the penetration depth of light into translucent objects. For this purpose, we used our sensor to measure both a translucent and an opaque sphere with a radius of approximately 30 mm each (see Fig. 10(a)) in different spectral channels between 600 and 875 nm (operation mode (OM1a)).

 figure: Fig. 10

Fig. 10 Measurement of an opaque and a translucent sphere. (a) Exemplary camera images during the measurement in the 750-nm channel. (b) Illustration of the measured points’ deviation from the real surface when measuring a translucent sphere. (c) Results obtained using our hyperspectral 3D sensor. Error bars denote the standard deviation of the results of the measurement which was repeated ten times.

Download Full Size | PPT Slide | PDF

Lutzke et al. [48–50] have shown that the measurement of translucent objects results in a deviation between real and measured surface. Figure 10(b) shows this effect (highly exaggerated) when measuring a sphere. Instead of the actual sphere (green line), the orange colored points are detected. The greater the light’s penetration depth, the more the position of a best-fit sphere (orange line) differs from the real sphere.

We repeatedly measured a diffusely reflecting white opaque sphere and a translucent synthetic resin sphere with a polished surface. To each of the point clouds generated in the different spectral ranges, a sphere with a (known) fixed radius was fitted. Figure 10(c) shows the resulting displacement of the sphere center (X0, Y0, Z0) in the z direction with respect to an (unknown) reference value Zref in the reliable wavelength range between 675 and 850 nm. When measuring the opaque sphere, the position of the sphere center does not change significantly (blue line). But when measuring the translucent sphere, an increasing shift ΔZ with larger wavelengths (orange line) is clearly noticeable. Thus, with the help of our 5D sensor, we have been able to confirm previous results [12, 51] that the light’s penetration depth into the synthetic resin sphere increases with increasing wavelength and that our system is suitable for the investigation of such wavelength-dependent properties.

6. Conclusion

Our new approach of 5D hyperspectral imaging enables us to accurately measure the shape and reflection characteristics of the surface of macroscopic objects at video rate. By using a specially developed broadband high-speed pattern projector and two hyperspectral snapshot cameras, excellent spatial and spectral resolution, great depth accuracy, and high frame rates can be realized by a very compact, cost-effective, and robust sensor. A first system based on the proposed technology comprises two synchronized cameras, the CMOS sensors of which are tessellated with 5 × 5 different Fabry-Pérot interference filters, and a GOBO projector, which projects temporally varying aperiodic sinusoidal patterns into the measurement volume.

When projecting appropriate patterns (in terms of average fringe width, fringe width variation, or GOBO wheel rotational speed [52]), the 5D sensor’s specifications are determined only by the applied hyperspectral cameras. The more spectral channels the cameras have, the higher the spectral resolution and the lower the spatial resolution and depth accuracy of the 5D sensor, and vice versa. Our prototype features 25 different channels in the visible to near-infrared range, each of them providing a 3D point standard deviation of less than 100 µm (measurement of a test plane) and a radius standard deviation of less than 80 µm (measurement of a test sphere). Measurements can be performed at a frame rate of up to 17 Hz, which is significantly faster than state-of-the-art systems.

A number of different experiments demonstrated potential applications of the developed 5D sensor, such as investigation of art and cultural objects, plant phenotyping, or medicine. The reconstructed 3D point clouds can be used for shading correction to obtain temporally and spatially consistent spectra. In addition, the developed system makes it possible to investigate the penetration of light into different materials which was demonstrated by the measurement of an opaque and a translucent sphere.

In the future, the 5D sensor should be further optimized. This particularly involves the applied camera technology. By a more homogeneous signal-to-noise ratio and by minimizing the crosstalk between the individual channels, we expect a more reliable determination of the wavelength dependence of the reflectance. By further increasing the camera frame rate, even dynamically changing object properties can be monitored. Furthermore, the additional use of hyperspectral cameras working in the visible spectral range should be considered. In this way, the 5D sensor could cover an even wider wavelength range and draw more precise conclusions about the properties of the measured objects.

Funding

German Federal Ministry of Education and Research (BMBF) (03ZZ0436, 03ZZ0462).

Acknowledgments

The authors thank A. Christoph from cultur3D, an innovation project funded by the European Regional Development Fund (ERDF), which deals with the digital recording and visualization of cultural assets. He made the experimental investigation of the historical globe possible.

References and links

1. R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014). [CrossRef]  

2. A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985). [CrossRef]   [PubMed]  

3. C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003). [CrossRef]  

4. A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007). [CrossRef]  

5. A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012). [CrossRef]   [PubMed]  

6. G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014). [CrossRef]  

7. G. ElMasry and D.-W. Sun, “Principles of hyperspectral imaging technology,” in “Hyperspectral Imaging for Food Quality Analysis and Control,” D.-W. Sun, ed. (Academic, San Diego, 2010), pp. 3–43. [CrossRef]  

8. H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014). [CrossRef]  

9. J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016). [CrossRef]  

10. R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016). [CrossRef]  

11. J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016). [CrossRef]   [PubMed]  

12. C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

13. A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

14. S. Ullman, “The interpretation of structure from motion,” Proc. R. Soc. Lond., B, Biol. Sci. 203, 405–426 (1979). [CrossRef]   [PubMed]  

15. S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016). [CrossRef]  

16. M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012). [CrossRef]  

17. E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016). [CrossRef]  

18. H. Aasen, “The acquisition of hyperspectral digital surface models of crops from UAV snapshot cameras,” Ph.D. thesis, University of Cologne (2016).

19. S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016). [CrossRef]  

20. B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014). [CrossRef]  

21. N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013). [CrossRef]  

22. P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

23. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photon. 3, 128–160 (2011). [CrossRef]  

24. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010). [CrossRef]  

25. T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

26. S. Zhang, Handbook of 3D Machine Vision: Optical Metrology and Imaging (CRC, 2013). [CrossRef]  

27. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University, 2004). [CrossRef]  

28. N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012). [CrossRef]  

29. M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999). [CrossRef]  

30. E. Hecht, Optics (Addison-Wesley, 2002).

31. J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005). [CrossRef]  

32. S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014). [CrossRef]  

33. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis Mach. Intell. 22, 1330–1334 (2000). [CrossRef]  

34. K. Ozawa, I. Sato, and M. Yamaguchi, “Hyperspectral photometric stereo for a single capture,” J. Opt. Soc. Am. A 34, 384–394 (2017). [CrossRef]  

35. “VDI/VDE 2634 part 2: Optical 3-D measuring systems – optical systems based on area scanning,” (2012).

36. C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006). [CrossRef]  

37. H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A 106, 309–323 (2012). [CrossRef]  

38. A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014). [CrossRef]   [PubMed]  

39. G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014). [CrossRef]  

40. B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007). [CrossRef]  

41. Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012). [CrossRef]  

42. J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997). [CrossRef]  

43. N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).

44. S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017). [CrossRef]   [PubMed]  

45. M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005). [CrossRef]  

46. D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003). [CrossRef]  

47. D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005). [CrossRef]  

48. P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011). [CrossRef]  

49. P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012). [CrossRef]  

50. P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015). [CrossRef]  

51. C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017). [CrossRef]  

52. S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

References

  • View by:
  • |
  • |
  • |

  1. R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
    [Crossref]
  2. A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
    [Crossref] [PubMed]
  3. C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
    [Crossref]
  4. A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
    [Crossref]
  5. A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
    [Crossref] [PubMed]
  6. G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014).
    [Crossref]
  7. G. ElMasry and D.-W. Sun, “Principles of hyperspectral imaging technology,” in “Hyperspectral Imaging for Food Quality Analysis and Control,” D.-W. Sun, ed. (Academic, San Diego, 2010), pp. 3–43.
    [Crossref]
  8. H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
    [Crossref]
  9. J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
    [Crossref]
  10. R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
    [Crossref]
  11. J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
    [Crossref] [PubMed]
  12. C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).
  13. A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).
  14. S. Ullman, “The interpretation of structure from motion,” Proc. R. Soc. Lond., B, Biol. Sci. 203, 405–426 (1979).
    [Crossref] [PubMed]
  15. S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
    [Crossref]
  16. M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
    [Crossref]
  17. E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
    [Crossref]
  18. H. Aasen, “The acquisition of hyperspectral digital surface models of crops from UAV snapshot cameras,” Ph.D. thesis, University of Cologne (2016).
  19. S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
    [Crossref]
  20. B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
    [Crossref]
  21. N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013).
    [Crossref]
  22. P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).
  23. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photon. 3, 128–160 (2011).
    [Crossref]
  24. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
    [Crossref]
  25. T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).
  26. S. Zhang, Handbook of 3D Machine Vision: Optical Metrology and Imaging (CRC, 2013).
    [Crossref]
  27. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University, 2004).
    [Crossref]
  28. N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
    [Crossref]
  29. M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
    [Crossref]
  30. E. Hecht, Optics (Addison-Wesley, 2002).
  31. J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005).
    [Crossref]
  32. S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
    [Crossref]
  33. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis Mach. Intell. 22, 1330–1334 (2000).
    [Crossref]
  34. K. Ozawa, I. Sato, and M. Yamaguchi, “Hyperspectral photometric stereo for a single capture,” J. Opt. Soc. Am. A 34, 384–394 (2017).
    [Crossref]
  35. “VDI/VDE 2634 part 2: Optical 3-D measuring systems – optical systems based on area scanning,” (2012).
  36. C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006).
    [Crossref]
  37. H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A 106, 309–323 (2012).
    [Crossref]
  38. A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
    [Crossref] [PubMed]
  39. G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014).
    [Crossref]
  40. B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
    [Crossref]
  41. Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
    [Crossref]
  42. J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
    [Crossref]
  43. N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).
  44. S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017).
    [Crossref] [PubMed]
  45. M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005).
    [Crossref]
  46. D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003).
    [Crossref]
  47. D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
    [Crossref]
  48. P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
    [Crossref]
  49. P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
    [Crossref]
  50. P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
    [Crossref]
  51. C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
    [Crossref]
  52. S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

2018 (1)

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

2017 (3)

2016 (8)

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

2015 (1)

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

2014 (7)

B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
[Crossref]

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014).
[Crossref]

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014).
[Crossref]

2013 (1)

N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013).
[Crossref]

2012 (6)

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A 106, 309–323 (2012).
[Crossref]

Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
[Crossref]

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
[Crossref]

2011 (2)

P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
[Crossref]

J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photon. 3, 128–160 (2011).
[Crossref]

2010 (1)

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

2007 (2)

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

2006 (1)

C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006).
[Crossref]

2005 (3)

J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005).
[Crossref]

M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005).
[Crossref]

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

2003 (3)

N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).

D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003).
[Crossref]

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

2000 (1)

Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis Mach. Intell. 22, 1330–1334 (2000).
[Crossref]

1997 (1)

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

1985 (1)

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

1979 (1)

S. Ullman, “The interpretation of structure from motion,” Proc. R. Soc. Lond., B, Biol. Sci. 203, 405–426 (1979).
[Crossref] [PubMed]

Aasen, H.

H. Aasen, “The acquisition of hyperspectral digital surface models of crops from UAV snapshot cameras,” Ph.D. thesis, University of Cologne (2016).

Agrawal, P.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Atwood, D.

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

Balas, C.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Baraniuk, R. G.

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

Behmann, J.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

Beullens, K.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Bhatia, A. B.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Bobelyn, E.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Böhm, J.

T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

Born, M.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Brady, D. J.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Breitbarth, A.

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
[Crossref]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

Bridgelall, R.

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

Burger, J.

J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005).
[Crossref]

Cheung, C. S.

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

Clemmow, P. C.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Cullen, P. J.

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Dai, Q.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Davenport, M. A.

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

Dehne, H.-W.

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Dietrich, P.

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

Dorsey, J.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Downey, G.

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Duarte, M. F.

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

Dupuis, J.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

ElMasry, G.

G. ElMasry and D.-W. Sun, “Principles of hyperspectral imaging technology,” in “Hyperspectral Imaging for Food Quality Analysis and Control,” D.-W. Sun, ed. (Academic, San Diego, 2010), pp. 3–43.
[Crossref]

Eskelinen, M. A.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Fei, B.

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014).
[Crossref]

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014).
[Crossref]

Fernandez, S.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

Filella, I.

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

Fischer, C.

C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006).
[Crossref]

Frias, J. M.

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Gabor, D.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Gamon, J. A.

D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003).
[Crossref]

Gao, Y.

A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

Geelen, B.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
[Crossref]

Geladi, P.

J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005).
[Crossref]

Geng, J.

Goetz, A. F. H.

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

Gowen, A. A.

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Hagen, N.

N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013).
[Crossref]

Hakala, T.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Hartley, R.

R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University, 2004).
[Crossref]

Harvey, T. A.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Haspeslagh, L.

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

He, J.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Hecht, E.

E. Hecht, Optics (Addison-Wesley, 2002).

Heist, S.

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017).
[Crossref] [PubMed]

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

Hillnhütter, C.

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Holmlund, C.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Honkavaara, E.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Huang, J.

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

Jayapala, M.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Kakani, V. G.

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

Kakoulli, I.

C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006).
[Crossref]

Kim, M. H.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Kittle, D. S.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Kowarschik, R.

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

Kudenov, M. W.

N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013).
[Crossref]

Kuhlmann, H.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

Kühmstedt, P.

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017).
[Crossref] [PubMed]

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
[Crossref]

Kyle, S.

T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

Lambrechts, A.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
[Crossref]

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

Lammertyn, J.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Landmann, M.

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

Lange, R.

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

Lee, W. S.

M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005).
[Crossref]

Li, Q.

Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
[Crossref]

Liang, H.

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A 106, 309–323 (2012).
[Crossref]

Liang, J.

A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

Lin, X.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Litkey, P.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Liu, P.

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

Llado, X.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

Lu, G.

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014).
[Crossref]

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014).
[Crossref]

Lucian, A.

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

Luhmann, T.

T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

Lutzke, P.

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
[Crossref]

Mahlein, A.-K.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Malik, A. S.

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Mann, A.

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

Mannila, R.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Masschelein, B.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Meriaudeau, F.

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Min, M.

M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005).
[Crossref]

Mitra, N. J.

N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).

Moran, P. M. A.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Nguyen, A.

N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).

Nicolaï, B. M.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Notni, G.

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
[Crossref]

S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017).
[Crossref] [PubMed]

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
[Crossref]

O’Donnell, C. P.

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Oerke, E.-C.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Ogaya, R.

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

Ojanen, H.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Ozawa, K.

Papadakis, A.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Papadakis, N.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Papadakis, V.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Paulus, S.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

Peirs, A.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Peñuelas, J.

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

Piñol, J.

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

Plümer, L.

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

Pölönen, I.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Pribanic, T.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

Prum, R. O.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Pulkkanen, M.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Rafert, J. B.

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

Reddy, K. R.

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

Reddy, V.

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

Robson, S.

T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

Rock, B. N.

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

Rosenberger, M.

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
[Crossref]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

Rosnell, T.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Rushmeier, H.

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Saad, M. N.

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Saari, H.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Saeys, W.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Salvi, J.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

Sato, I.

Schmidt, I.

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

Schreiber, P.

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

Shahzad, A.

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Sims, D. A.

D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003).
[Crossref]

Solomon, J. E.

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

Soussan, P.

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

Steiner, U.

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Stokes, A. R.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Su, B.

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

Sun, D.-W.

G. ElMasry and D.-W. Sun, “Principles of hyperspectral imaging technology,” in “Hyperspectral Imaging for Food Quality Analysis and Control,” D.-W. Sun, ed. (Academic, San Diego, 2010), pp. 3–43.
[Crossref]

Suo, J.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Tack, K.

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Tack, N.

B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
[Crossref]

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

Taylor, A. M.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Themelis, G.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Theron, K. I.

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Tolliver, D. D.

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

Tünnermann, A.

S. Heist, P. Kühmstedt, A. Tünnermann, and G. Notni, “BRDF-dependent accuracy of array projection-based 3D sensors,” Appl. Opt. 56, 2162–2170 (2017).
[Crossref] [PubMed]

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

Ullman, S.

S. Ullman, “The interpretation of structure from motion,” Proc. R. Soc. Lond., B, Biol. Sci. 203, 405–426 (1979).
[Crossref] [PubMed]

Vane, G.

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

Vazgiouraki, E.

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

Viljanen, N.

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

Walter, N.

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Wayman, P. A.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Wilcock, W. L.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Willett, R. M.

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

Wolf, E.

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

Wu, J.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Xiong, B.

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Xu, R.

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

Yamaguchi, M.

Zhang, C.

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
[Crossref]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

Zhang, G.

Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
[Crossref]

Zhang, Q.

Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
[Crossref]

Zhang, S

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

Zhang, S.

S. Zhang, Handbook of 3D Machine Vision: Optical Metrology and Imaging (CRC, 2013).
[Crossref]

Zhang, Z.

Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis Mach. Intell. 22, 1330–1334 (2000).
[Crossref]

Zhao, D.

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

Zhou, J.

A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

Zia, A.

A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

Zisserman, A.

R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University, 2004).
[Crossref]

ACM Trans. Graph. (1)

M. H. Kim, T. A. Harvey, D. S. Kittle, H. Rushmeier, J. Dorsey, R. O. Prum, and D. J. Brady, “3D imaging spectroscopy for measuring hyperspectral patterns on solid objects,” ACM Trans. Graph. 31, 38 (2012).
[Crossref]

Adv. Opt. Photon. (1)

Appl. Opt. (1)

Appl. Phys. A (1)

H. Liang, “Advances in multispectral and hyperspectral imaging for archaeology and art conservation,” Appl. Phys. A 106, 309–323 (2012).
[Crossref]

Biomed. Eng. Online (1)

A. Shahzad, M. N. Saad, N. Walter, A. S. Malik, and F. Meriaudeau, “Hyperspectral venous image quality assessment for optimum illumination range selection based on skin tone characteristics,” Biomed. Eng. Online 13, 109 (2014).
[Crossref] [PubMed]

Eur. J. Agron. (1)

D. Zhao, K. R. Reddy, V. G. Kakani, and V. Reddy, “Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum,” Eur. J. Agron. 22, 391–403 (2005).
[Crossref]

IEEE Signal Process. Mag. (1)

R. M. Willett, M. F. Duarte, M. A. Davenport, and R. G. Baraniuk, “Sparsity and structure in hyperspectral imaging : Sensing, reconstruction, and target detection,” IEEE Signal Process. Mag. 31, 116–126 (2014).
[Crossref]

IEEE Transactions on Geosci. Remote. Sens. (1)

E. Honkavaara, M. A. Eskelinen, I. Pölönen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote sensing of 3-D geometry and surface moisture of a peat production area using hyperspectral frame cameras in visible to short-wave infrared spectral ranges onboard a small unmanned airborne vehicle (UAV),” IEEE Transactions on Geosci. Remote. Sens. 54, 5440–5454 (2016).
[Crossref]

IEEE Transactions on Imaging Syst. Tech. (1)

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “A novel 3D multispectral vision system based on filter wheel cameras,” IEEE Transactions on Imaging Syst. Tech. 4, 267–272 (2016).

IEEE Transactions on Pattern Analysis Mach. Intell. (1)

Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis Mach. Intell. 22, 1330–1334 (2000).
[Crossref]

Int. J. Remote. Sens. (1)

J. Peñuelas, J. Piñol, R. Ogaya, and I. Filella, “Estimation of plant water concentration by the reflectance water index wi (R900/R970),” Int. J. Remote. Sens. 18, 2869–2875 (1997).
[Crossref]

ISPRS J. Photogramm. Remote. Sens. (1)

H. Liang, A. Lucian, R. Lange, C. S. Cheung, and B. Su, “Remote spectral imaging with simultaneous extraction of 3D topography for historical wall paintings,” ISPRS J. Photogramm. Remote. Sens. 95, 13–22 (2014).
[Crossref]

J. Biomed. Opt. (3)

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 010901 (2014).
[Crossref]

S Zhang, P. Liu, J. Huang, and R. Xu, “Multiview hyperspectral topography of tissue structural and functional characteristics,” J. Biomed. Opt. 21, 016012 (2016).
[Crossref]

G. Lu and B. Fei, “Medical hyperspectral imaging: a review,” J. Biomed. Opt. 19, 19–24 (2014).
[Crossref]

J. Chemom. (1)

J. Burger and P. Geladi, “Hyperspectral NIR image regression part I: calibration and correction,” J. Chemom. 19, 355–363 (2005).
[Crossref]

J. Cult. Herit. (1)

C. Balas, V. Papadakis, N. Papadakis, A. Papadakis, E. Vazgiouraki, and G. Themelis, “A novel hyper-spectral imaging apparatus for the non-destructive analysis of objects of artistic and historic value,” J. Cult. Herit. 4(1), 330–337 (2003).
[Crossref]

J. Opt. Soc. Am. A (1)

Mach. Vis. Appl. (1)

J. Behmann, A.-K. Mahlein, S. Paulus, J. Dupuis, H. Kuhlmann, E.-C. Oerke, and L. Plümer, “Generation and application of hyperspectral 3D plant models: methods and challenges,” Mach. Vis. Appl. 27, 611–624 (2016).
[Crossref]

Opt. Eng. (4)

S. Heist, A. Mann, P. Kühmstedt, P. Schreiber, and G. Notni, “Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement,” Opt. Eng. 53, 112208 (2014).
[Crossref]

N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52, 090901 (2013).
[Crossref]

P. Lutzke, P. Kühmstedt, and G. Notni, “Measuring error compensation on three-dimensional scans of translucent objects,” Opt. Eng. 50, 063601 (2011).
[Crossref]

P. Lutzke, S. Heist, P. Kühmstedt, R. Kowarschik, and G. Notni, “Monte carlo simulation of three-dimensional measurements of translucent objects,” Opt. Eng. 54, 084111 (2015).
[Crossref]

Opt. Lasers Eng. (1)

S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
[Crossref]

Pattern Recogn. (1)

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010).
[Crossref]

Plant Methods (1)

A.-K. Mahlein, U. Steiner, C. Hillnhütter, H.-W. Dehne, and E.-C. Oerke, “Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases,” Plant Methods 8, 3 (2012).
[Crossref] [PubMed]

Postharvest Biol. Technol. (1)

B. M. Nicolaï, K. Beullens, E. Bobelyn, A. Peirs, W. Saeys, K. I. Theron, and J. Lammertyn, “Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: A review,” Postharvest Biol. Technol. 46, 99–118 (2007).
[Crossref]

Proc. ASCG (1)

N. J. Mitra and A. Nguyen, “Estimating surface normals in noisy point cloud data,” Proc. ASCG 19, 322–328 (2003).

Proc. IS&T (1)

P. Agrawal, K. Tack, B. Geelen, B. Masschelein, P. M. A. Moran, A. Lambrechts, and M. Jayapala, “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” Proc. IS&T 2016, 1–7 (2016).

Proc. R. Soc. Lond., B, Biol. Sci. (1)

S. Ullman, “The interpretation of structure from motion,” Proc. R. Soc. Lond., B, Biol. Sci. 203, 405–426 (1979).
[Crossref] [PubMed]

Proc. SPIE (6)

R. Bridgelall, J. B. Rafert, D. Atwood, and D. D. Tolliver, “Hyperspectral range imaging for transportation systems,” Proc. SPIE 9803, 98032Y (2016).
[Crossref]

B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014).
[Crossref]

N. Tack, A. Lambrechts, P. Soussan, and L. Haspeslagh, “A compact, high-speed, and low-cost hyperspectral imager,” Proc. SPIE 8266, 82660Q (2012).
[Crossref]

C. Zhang, M. Rosenberger, A. Breitbarth, and G. Notni, “Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection,” Proc. SPIE 10220, 1022007 (2017).
[Crossref]

S. Heist, P. Dietrich, M. Landmann, P. Kühmstedt, and G. Notni, “High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis,” Proc. SPIE 10667, 106670A (2018).

P. Lutzke, P. Kühmstedt, and G. Notni, “Fast error simulation of optical 3D measurements at translucent objects,” Proc. SPIE 8493, 84930U (2012).
[Crossref]

Remote. Sens. Environ. (1)

D. A. Sims and J. A. Gamon, “Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: a comparison of indices based on liquid water and chlorophyll absorption features,” Remote. Sens. Environ. 84, 526–537 (2003).
[Crossref]

Sci. Rep. (1)

J. Wu, B. Xiong, X. Lin, J. He, J. Suo, and Q. Dai, “Snapshot hyperspectral volumetric microscopy,” Sci. Rep. 6, 24624 (2016).
[Crossref] [PubMed]

Science (1)

A. F. H. Goetz, G. Vane, J. E. Solomon, and B. N. Rock, “Imaging spectrometry for earth remote sensing,” Science 228, 1147–1153 (1985).
[Crossref] [PubMed]

Spectrosc. Int. J. (1)

Q. Zhang, Q. Li, and G. Zhang, “Rapid determination of leaf water content using VIS/NIR spectroscopy analysis with wavelength selection,” Spectrosc. Int. J. 27, 93–105 (2012).
[Crossref]

Stud. Conserv. (1)

C. Fischer and I. Kakoulli, “Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications,” Stud. Conserv. 51, 3–16 (2006).
[Crossref]

Trans. ASAE (1)

M. Min and W. S. Lee, “Determination of significant wavelengths and prediction of nitrogen content for citrus,” Trans. ASAE 48, 455–461 (2005).
[Crossref]

Trends Food Sci. Technol. (1)

A. A. Gowen, C. P. O’Donnell, P. J. Cullen, G. Downey, and J. M. Frias, “Hyperspectral imaging – an emerging process analytical tool for food quality and safety control,” Trends Food Sci. Technol. 18, 590–598 (2007).
[Crossref]

Other (9)

H. Aasen, “The acquisition of hyperspectral digital surface models of crops from UAV snapshot cameras,” Ph.D. thesis, University of Cologne (2016).

G. ElMasry and D.-W. Sun, “Principles of hyperspectral imaging technology,” in “Hyperspectral Imaging for Food Quality Analysis and Control,” D.-W. Sun, ed. (Academic, San Diego, 2010), pp. 3–43.
[Crossref]

A. Zia, J. Liang, J. Zhou, and Y. Gao, “3D reconstruction from hyperspectral images,” in Proceedings of the Winter Conference on Applications of Computer Vision (WACV) pp. 318–325 (2015).

“VDI/VDE 2634 part 2: Optical 3-D measuring systems – optical systems based on area scanning,” (2012).

M. Born, E. Wolf, A. B. Bhatia, P. C. Clemmow, D. Gabor, A. R. Stokes, A. M. Taylor, P. A. Wayman, and W. L. Wilcock, Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (Cambridge University, 1999).
[Crossref]

E. Hecht, Optics (Addison-Wesley, 2002).

T. Luhmann, S. Robson, S. Kyle, and J. Böhm, Close-Range Photogrammetry and 3D Imaging (Walter de Gruyter, 2014).

S. Zhang, Handbook of 3D Machine Vision: Optical Metrology and Imaging (CRC, 2013).
[Crossref]

R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge University, 2004).
[Crossref]

Cited By

OSA participates in Crossref's Cited-By Linking service. Citing articles from OSA journals and other participating publishers are listed here.

Alert me when this article is cited.


Figures (10)

Fig. 1
Fig. 1 Operating principle of a hyperspectral snapshot camera. (a) Schematic representation of a hyperspectral, 5 × 5 tessellated snapshot sensor. (b) Optical path in the Fabry-Pérot interferometer.
Fig. 2
Fig. 2 (a) Illustration of a stereo camera system and its epipolar geometry. (b) Camera images of an aperiodic sinusoidal pattern projected onto a measurement object, making the detection of corresponding image points p 1 and p 2 possible.
Fig. 3
Fig. 3 Location of the principal point p 0 in the coordinate system of a spectral channel: for channel (0, 1) with its virtual pixels marked with orange lines, the principal point has the coordinates p 0 ( 0 , 1 ) = [ ( x 0 , y 0 ) ( 0 , 1 ) ] / 5. In general, it is p 0 ( i , j ) = [ ( x 0 , y 0 ) ( 0 , 1 ) ] / 5.
Fig. 4
Fig. 4 Measurement setup. (a) Photograph of the developed 5D sensor comprising two hyperspectral snapshot cameras, a GOBO projector illuminating the measurement object with varying aperiodic sinusoidal patterns, and a projector homogeneously illuminating the scene. (b) GOBO wheel consisting of a borosilicate glass plate with a chromium coating from which aperiodic strips have been etched.
Fig. 5
Fig. 5 Spectral properties of one of the applied hyperspectral snapshot cameras: combined quantum efficiency Qi of the 5 × 5 different channels and combined transmittance Tfilter of the band-pass/edge filter in front of the sensor in the operation mode (OM1), i.e., with a spectral response between 600 and 875 nm.
Fig. 6
Fig. 6 Operation modes of the 5D sensor. Depending on the filters mounted in front of the objective lenses of the hyperspectral cameras (HSC1 and HSC2), the projector for homogeneous illumination (HI), and the GOBO projector for aperiodic sinusoidal patterns (ASP), the sensor can be operated in four different modes.
Fig. 7
Fig. 7 Hyperspectral 3D measurement of a historical globe. (a) Photograph of the globe with marked measurement area. (b) 3D surface model with artificial blue shading. (c) 3D surface model with mapped intensities from selected spectral channels, colored according to their wavelength.
Fig. 8
Fig. 8 Hyperspectral 3D measurement of a human hand. 3D surface model without (left) and with mapped intensities from selected channels in the (a) 600-875-nm and (b) 675-975-nm camera configuration.
Fig. 9
Fig. 9 Water absorption by a citrus plant. (a) Color images of the plant after supplying water. (b) 3D surface models. (c) Reflectance spectrum.
Fig. 10
Fig. 10 Measurement of an opaque and a translucent sphere. (a) Exemplary camera images during the measurement in the 750-nm channel. (b) Illustration of the measured points’ deviation from the real surface when measuring a translucent sphere. (c) Results obtained using our hyperspectral 3D sensor. Error bars denote the standard deviation of the results of the measurement which was repeated ten times.

Equations (15)

Equations on this page are rendered with MathJax. Learn more.

p k = [ x k , y k I k ( λ 1 , , λ n ) ]
p k = [ x k , y k , z k , I k ( λ 1 , , λ n ) ] .
p k 5 D = [ x k , y k , z k , t , I k ( λ 1 , , λ n ) ] .
T = ( 1 R 1 ) ( 1 R 2 ) ( 1 R 1 R 2 ) 2 + 4 R 1 R 2 sin 2 Δ φ 2
with Δ φ = 4 π n d cos ϑ λ and n sin ϑ = sin ϑ 0
λ k = 2 n d cos ϑ k with k = 1 , 2 , 3 , .
g i = K t exp = : K 0 R obj ( λ ) E e ( λ ) T lens ( λ ) T filter ( λ ) Q i ( λ ) = : F i ( λ ) d λ with i = 1 , , n
g i = K j = 1 n λ j a λ j b R obj ( λ ) F i ( λ ) d λ .
g i = K j = 1 n R obj , j λ j a λ j b F i ( λ ) d λ .
R obj = R obj R dark R spec R dark .
I k proj ( x , y ) = a k ( x ) + b k ( x ) sin [ c k ( x ) x + d k ( x ) ] with k = 1 , , N
ρ = k = 1 N [ g k ( 1 ) g ( 1 ) ¯ ] [ g k ( 2 ) g ( 2 ) ¯ ] k = 1 N [ g k ( 1 ) g ( 1 ) ¯ ] 2 k = 1 N [ g k ( 2 ) g ( 2 ) ¯ ] 2
g ( 1 ) ¯ = 1 N k = 1 N g k ( 1 ) and g ( 2 ) ¯ = 1 N k = 1 N g k ( 2 ) .
R ( i , j ) = [ R full ( i , j ) ] / 5 .
Q ( x 1 y disp 1 ) with Q = [ 1 0 0 c x 1 0 1 0 c y 1 0 0 0 κ 0 0 1 / l ( c x 1 c x 2 ) / l ] .

Metrics