Abstract

In single-pixel imaging (SPI), a large number of illuminations is usually required to capture one single image. Consequently, SPI may only achieve a very low frame rate for a fast-moving object, and the reconstructed images are contaminated with blur and noise. In previous works, some attempts have been made to perform motion estimation between neighboring frames in a SPI video to enhance the image quality. However, motion estimation and quality enhancement from one single image frame in dynamic SPI have seldom been investigated. In this work, it is assumed that some prior knowledge about the type of motion the object undergoes is known. A motion model of the target object is constructed and the motion parameters can be optimized within a search space. Our proposed scheme differs from common motion deblurring techniques for photographs since the motion blur mechanism in SPI is significantly different from that of a conventional camera. Experimental results demonstrate that the images reconstructed with our proposed scheme in dynamic SPI have significantly better quality.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

In a conventional camera, a pixelated sensor array is usually utilized to capture a two-dimensional image of the target object. As a novel computational imaging technique, single-pixel imaging (SPI) [1–5] requires only a sensor with one single pixel to record a photograph with two-dimensional spatial resolution. The object is sequentially illuminated with varying patterns, and the total light intensity of the entire object scene is recorded as a single-pixel value by the detector at each illumination. Finally, the object image is computationally reconstructed from both the recorded single-pixel intensity sequence and the illumination patterns. A typical optical setup for SPI is shown in Fig. 1. SPI exhibits substantial advantages in some invisible wavebands where conventional pixelated sensor arrays are expensive or even unavailable [4]. In addition, SPI can realize indirect-line-of-sight imaging and imaging under weak light conditions. The potential applications of SPI include, but are not limited to, remote sensing [6], three-dimensional measurement [7,8], microscopy [9,10], image encryption [11–13], lidar detection [14,15], gas leak monitoring [16] and tomography [17].

 

Fig. 1 Optical setup for a single-pixel imaging system.


Despite this success, SPI has one major drawback of low imaging efficiency. Since common illumination devices, such as a spatial light modulator (SLM) or a digital micromirror device (DMD), can only project a limited number of patterns within a certain time period, it is difficult for a SPI system to capture many images within a short time. This problem can be addressed to a certain extent by designing optimized illumination patterns (e.g. Fourier transform patterns [18], Hadamard transform patterns [19], principal component patterns [20], wavelet patterns [21], sub-pixel shifted patterns [22]), utilizing a multi-element detector [23] and improving the reconstruction algorithms [24]. However, a considerable number of illuminations and recordings are often still required for capturing one single image in SPI, so the frame rate for capturing a SPI video of a dynamic object scene can be rather low. In this work, the capture of a single image or a video clip with SPI of a non-static object moving sufficiently fast, compared with the frame rate of the SPI system, is referred to as dynamic SPI. If the target object moves sufficiently fast within the imaging period of one image frame in the video, the reconstructed image from SPI can suffer from severe quality degradation such as blur and noise. In some previous works [25,26], motion estimation is performed between consecutive video frames to recover high-quality reconstructed images for low frame rate SPI. However, motion estimation and quality enhancement from only one single image frame in dynamic SPI have seldom been investigated in the past. In addition, the common single-image deblurring techniques for photographs captured by conventional cameras, such as the Wiener filter, the regularized filter and the Lucy-Richardson method [27,28], are not directly applicable to the reconstructed images in SPI, because the motion blur mechanism in a SPI system is significantly different from that of a conventional camera. In this work, we propose, for the first time, a tailor-made scheme that models the motion blur in SPI and reduces the motion blur effect using only one single reconstructed image.

In our proposed scheme, it is considered that if the object moves during the pattern illuminations in SPI (with the type of motion known), this is equivalent to a static object illuminated by patterns that are geometrically transformed in a reverse manner at different time points. The object image can then be reconstructed from the recorded single-pixel data sequence using the transformed illumination patterns instead of the original ones. The motion parameters used to transform the patterns, such as the moving or rotating speed, can be optimized by evaluating the quality of the reconstructed image with certain metric functions. Finally, the object image can be reconstructed with high fidelity using the optimized parameters.

2. Principles of our proposed scheme

2.1. Model of single-pixel imaging

In single-pixel imaging, the target object image is denoted by O(n) and the illumination patterns are assumed to be Hadamard patterns H(m,n). The total number of pixels in the object image and in each illumination pattern is N. The object image O(n) can be represented as a column vector of length N, where n denotes the pixel index (1 ≤ n ≤ N). The number of illumination patterns is M, and M equals N when the sampling ratio is 1. Each two-dimensional illumination pattern can be represented as a row vector of length N, and H(m,n) denotes the nth pixel (or element) in the mth illumination pattern (1 ≤ m ≤ M, 1 ≤ n ≤ N). All M illumination patterns constitute a matrix H consisting of M rows and N columns. The recorded single-pixel intensity sequence I(m) can be represented as a column vector of length M (1 ≤ m ≤ M), which equals the product of H and O, as given by Eqs. (1) and (2).

$$I = HO \tag{1}$$

$$\begin{bmatrix} I(1) \\ I(2) \\ \vdots \\ I(M) \end{bmatrix} = \begin{bmatrix} H(1,1) & \cdots & H(1,N) \\ \vdots & \ddots & \vdots \\ H(M,1) & \cdots & H(M,N) \end{bmatrix} \begin{bmatrix} O(1) \\ O(2) \\ \vdots \\ O(N) \end{bmatrix} \tag{2}$$
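
For clarity, Eqs. (1) and (2) amount to a single matrix-vector product. The short sketch below (Python with NumPy; the names and the stand-in random data are purely illustrative) simulates the recorded single-pixel sequence under these assumptions.

```python
import numpy as np

def spi_forward(H, O):
    """Eq. (1): the recorded single-pixel intensity sequence is I = H O."""
    return H @ O

# Illustration for a 32 x 32 scene (N = 1024) at a sampling ratio of 1 (M = N),
# using stand-in random data for the patterns and the object.
N = 32 * 32
H = np.random.rand(N, N)   # M x N illumination matrix, one flattened pattern per row
O = np.random.rand(N)      # flattened object image of length N
I = spi_forward(H, O)      # length-M single-pixel intensity sequence
```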

The object image can be computationally recovered from I and H with various algorithms [24]. The alternating projection algorithm [24,29] is employed in this work to reconstruct the object image, as it yields more accurate results than the direct inverse Hadamard transform.
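
The alternating projection algorithm of Refs. [24,29] is the method actually used in this work. As a minimal, purely illustrative baseline, the object can also be recovered from I and H by ordinary least squares, as sketched below; this is not the authors' algorithm.

```python
import numpy as np

def reconstruct_lstsq(H, I, shape=(32, 32)):
    """Minimal baseline reconstruction: solve H O ≈ I in the least-squares sense.

    Illustrative only; the paper employs the alternating projection
    algorithm [24,29], which is more accurate in practice.
    """
    O, *_ = np.linalg.lstsq(H, I, rcond=None)
    return O.reshape(shape)
```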

However, if the object is non-static while the 1st, 2nd, 3rd, …, Mth patterns are illuminated sequentially, the recorded single-pixel intensity sequence contains mixed information of the object image in different motion states. The original object image at a certain time point cannot then be reconstructed with high fidelity. To address this problem, we propose a scheme of motion estimation and quality enhancement for a dynamic scene in SPI.

2.2. Proposed scheme of motion estimation and quality enhancement

2.2.1. Model of motion estimation

In this work, we consider that if the object is translationally shifting, rotating around a certain center point or geometrically transformed in some other way, this is equivalent to the object being static while the illumination patterns are geometrically transformed in an inverse manner. For example, in Fig. 2 the object is translationally shifting to the right during the illumination of varying Hadamard patterns.

 

Fig. 2 Translationally shifting object in SPI: (a) the object is moving; (b) the illumination patterns are static; (c) the object is static; (d) the illumination patterns are shifted in a reverse manner.


This is equivalent to the object being static while the Hadamard illumination patterns are translationally shifted to the left. If the object is rotating clockwise, the illumination patterns are equivalently rotating counter-clockwise, as shown in Fig. 3.

 

Fig. 3 Rotating object in SPI: (a) the object is moving; (b) the illumination patterns are static; (c) the object is static; (d) the illumination patterns are rotating in a reverse manner.


In practice, we may have some prior knowledge about the type of movement the object undergoes (e.g. whether it is shifting, rotating or moving in some other way), but the exact motion parameters (such as the moving or rotation speed) are not known in advance. It can be assumed that, within the imaging time of one image frame, the motion parameters of the object remain constant and a rough range of parameter values is known in advance.

Suppose the time interval between two subsequent illumination patterns within one frame is Δt, and the object is translationally shifting with moving speeds vx and vy in the x and y directions, respectively. The illumination patterns are then geometrically transformed in the following way: the first illumination pattern (the first row in H) remains unchanged, the second illumination pattern (the second row in H) is shifted by vxΔt and vyΔt in the two directions respectively, …, and the Kth illumination pattern is shifted by (K-1)vxΔt and (K-1)vyΔt in the two directions. The new illumination matrix after transformation, Hnew, can be expressed as Eq. (3).

$$H_{\mathrm{new}}(v_x, v_y) = T(H, v_x, v_y) \tag{3}$$
where Hnew denotes the illumination pattern matrix after the transform and T denotes the transform operation. The object image Onew(vx,vy) will be reconstructed from Hnew (instead of original H) and I. If the estimated vx and vy are identical or close to the actual moving speed of the target object, the object image can be recovered with good fidelity. Otherwise, the reconstructed image may still suffer from noise and degradation. A rough range of possible values of vx and vy can be pre-defined and a search space can be constructed.
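
As a minimal illustration of the transform T in Eq. (3), the sketch below shifts the kth pattern by (k-1)vΔt, rounded to whole pixels. The function name, the wrap-around boundary handling (np.roll) and the sign convention (patterns shifted opposite to the object motion, following the equivalence of Fig. 2) are assumptions made for illustration, not the authors' exact implementation.

```python
import numpy as np

def transform_shift(H, vx, vy, dt, shape=(32, 32)):
    """Sketch of T in Eq. (3): shift the k-th pattern by (k - 1) * v * dt pixels."""
    M = H.shape[0]
    H_new = np.empty_like(H)
    for k in range(M):
        # patterns are shifted in the reverse direction of the object motion
        dx = int(round(-k * vx * dt))
        dy = int(round(-k * vy * dt))
        pattern = H[k].reshape(shape)
        shifted = np.roll(np.roll(pattern, dy, axis=0), dx, axis=1)
        H_new[k] = shifted.ravel()
    return H_new
```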

For types of motion other than translational shift, the parameters can be optimized in a similar way. For example, if the object is rotating, then three parameters, the x-coordinate of the rotation center Px, the y-coordinate of the rotation center Py and the rotation speed w, need to be optimized simultaneously, as given by Eq. (4).

$$H_{\mathrm{new}}(P_x, P_y, w) = T(H, P_x, P_y, w) \tag{4}$$
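
A corresponding sketch of the rotational transform in Eq. (4) is given below: each pattern is counter-rotated about the candidate center (Px, Py) by an angle proportional to its index. The nearest-neighbour inverse mapping, the handling of pixels that fall outside the pattern, and the sign convention are again assumptions made only for illustration.

```python
import numpy as np

def rotate_about(pattern2d, px, py, angle_deg):
    """Rotate a 2-D pattern about (px, py) by angle_deg (nearest-neighbour sampling)."""
    n_rows, n_cols = pattern2d.shape
    theta = np.deg2rad(angle_deg)
    ys, xs = np.mgrid[0:n_rows, 0:n_cols]
    xr, yr = xs - px, ys - py
    # inverse mapping: find the source location of every output pixel
    xs_src = np.cos(theta) * xr + np.sin(theta) * yr + px
    ys_src = -np.sin(theta) * xr + np.cos(theta) * yr + py
    xs_src = np.clip(np.rint(xs_src).astype(int), 0, n_cols - 1)
    ys_src = np.clip(np.rint(ys_src).astype(int), 0, n_rows - 1)
    return pattern2d[ys_src, xs_src]

def transform_rotate(H, px, py, w, dt, shape=(32, 32)):
    """Sketch of T in Eq. (4): counter-rotate the k-th pattern by (k - 1) * w * dt degrees."""
    H_new = np.empty_like(H)
    for k in range(H.shape[0]):
        pattern = H[k].reshape(shape)
        H_new[k] = rotate_about(pattern, px, py, -k * w * dt).ravel()
    return H_new
```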

2.2.2. Optimization of motion parameters

The optimal motion parameters can be determined in the following way in our scheme, shown in Fig. 4.

 

Fig. 4 Flowchart of our proposed motion estimation scheme.


The optimized Hnew(vx,vy) and Onew(vx,vy) can be obtained by finding the optimal vx and vy in the search space. An image quality metric function E[Onew(vx,vy)] is used to evaluate the quality of each reconstructed image with different motion parameters in the optimization. In this work, the metric function is defined as the image variance, given by Eq. (5).

$$E[O_{\mathrm{new}}(v_x, v_y)] = \mathrm{Var}[O_{\mathrm{new}}(v_x, v_y)] \tag{5}$$
where Var[·] denotes the variance of all pixel intensity values of one image. This metric can be used to measure the level of noise and blur in an image: in our examples, E[Onew(vx,vy)] becomes lower when Onew(vx,vy) contains more noise and blur. In previous works, similar metrics have been employed to distinguish sharp in-focus images from blurred defocused images [30,31]. A reconstructed image with maximum E[Onew(vx,vy)] (minimum amount of noise) is obtained when vx and vy reach the optimal values.
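
As a concrete illustration, the metric of Eq. (5) and the resulting fitness of a candidate parameter pair can be written compactly. The sketch below reuses the illustrative transform_shift and reconstruct_lstsq helpers from the earlier sketches; the exact reconstruction and implementation details are assumptions, not the authors' code.

```python
import numpy as np

def quality_metric(O_new):
    """Eq. (5): image variance; larger variance indicates a sharper, less noisy image."""
    return np.var(O_new)

def fitness(vx, vy, H, I, dt):
    """Fitness of a candidate (vx, vy): transform the patterns as in Eq. (3),
    reconstruct the image, and score the result with the variance metric of Eq. (5)."""
    H_new = transform_shift(H, vx, vy, dt)   # illustrative helper from the earlier sketch
    O_new = reconstruct_lstsq(H_new, I)      # baseline reconstruction sketch
    return quality_metric(O_new)
```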

In this work, the simple genetic algorithm (SGA) [32] is adopted for the optimization of the motion parameters. Other heuristic global optimization algorithms, such as simulated annealing [33] and the artificial bee colony algorithm (ABC) [34], can be attempted for this problem as well. As a classical optimization method, the working principles of the genetic algorithm have been widely reported in the literature. Here we only briefly describe how SGA is used to solve our problem and omit the details.

First, the value of each motion parameter (such as vx and vy for translational shifting, or Px, Py and w for rotation) is encoded into a binary bit sequence. For example, 8 bits are used to represent 255 possible values of vx within the search range [-31 31]. The binary bit sequences representing all the parameters are concatenated to constitute a single binary sequence, referred to as an individual in the algorithm. The population consists of a certain number of individuals (e.g. 30), referred to as the population size. Initially, each individual in the population is a random binary bit sequence, representing random motion parameters. Then the fitness of each individual, which in our problem is the quality of the reconstructed image measured by the metric function, is evaluated. According to the fitness values, selection, crossover and mutation operations are applied to the individuals in the population. The individuals are updated and the fitness values are evaluated again in the next iteration. The population keeps evolving and the individuals keep being updated after each iteration. Finally, after a certain number of iterations, one individual corresponding to the optimal motion parameters, which yields a reconstructed image with the best fidelity, is obtained and the search is finished.
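
A minimal SGA sketch for this search is given below. The binary encoding width, population size and iteration count follow the values quoted in the text, while the specific operators (tournament selection, single-point crossover, a 1% bit-flip mutation rate, elitism) and the linear bit-to-value mapping are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

BITS = 8                  # bits per parameter, as stated in the text
POP_SIZE = 30             # population size used for the translational example
N_GEN = 10                # maximum number of iterations
LOW, HIGH = -31.0, 31.0   # search range for vx and vy (pixels/s)

def decode(chunk):
    """Map one 8-bit chunk to a value in [LOW, HIGH] (linear mapping; an assumption)."""
    value = int("".join(str(b) for b in chunk), 2)
    return LOW + (HIGH - LOW) * value / (2 ** BITS - 1)

def decode_individual(ind):
    """An individual is the concatenation of one bit chunk per motion parameter."""
    return [decode(ind[i * BITS:(i + 1) * BITS]) for i in range(len(ind) // BITS)]

def sga(fitness_fn, n_params=2):
    """Run a simple genetic algorithm and return the best decoded parameters."""
    pop = rng.integers(0, 2, size=(POP_SIZE, n_params * BITS))
    for _ in range(N_GEN):
        fit = np.array([fitness_fn(*decode_individual(ind)) for ind in pop])
        # tournament selection of parents
        parents = np.array([pop[max(rng.choice(POP_SIZE, 2), key=lambda i: fit[i])]
                            for _ in range(POP_SIZE)])
        children = parents.copy()
        # single-point crossover between consecutive pairs
        for i in range(0, POP_SIZE - 1, 2):
            cut = rng.integers(1, n_params * BITS)
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # bit-flip mutation
        mask = rng.random(children.shape) < 0.01
        children[mask] ^= 1
        # elitism: carry the best individual into the next generation
        children[0] = pop[np.argmax(fit)]
        pop = children
    fit = np.array([fitness_fn(*decode_individual(ind)) for ind in pop])
    return decode_individual(pop[np.argmax(fit)])

# Example usage with the fitness sketch from Section 2.2.2:
# v_est = sga(lambda vx, vy: fitness(vx, vy, H, I, dt), n_params=2)
```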

After the SGA optimization based on the motion model and metric functions, a reconstructed image with enhanced quality can be obtained with the optimized motion parameters.

3. Results and discussion

3.1. Simulation results

In the simulation, the object images and illumination patterns contain 32 × 32 pixels (N = 1024). The total number of illuminations is assumed to be M = 1024 and the sampling ratio is 1. The illumination patterns are commonly used Hadamard patterns.
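
For reference, the Hadamard pattern matrix for this simulation setting can be generated as in the sketch below. Mapping the {-1, +1} Hadamard entries to {0, 1} binary illumination values is an assumption made here for simplicity; practical SPI systems often realize the ±1 patterns with complementary or differential measurements.

```python
import numpy as np
from scipy.linalg import hadamard

N = 32 * 32                          # pixels per pattern (and per image)
M = N                                # number of patterns at a sampling ratio of 1
H_pm1 = hadamard(N)                  # M x N Hadamard matrix with entries in {-1, +1}
H = (H_pm1 + 1) // 2                 # assumed binary illumination patterns in {0, 1}
patterns_2d = H.reshape(M, 32, 32)   # each row viewed as a 32 x 32 pattern
```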

In the first example, the frame rate is one frame per second. The object (a car) is moving from left to right at four different speeds (0 pixels per second, 1 pixel per second, 8 pixels per second, 15 pixels per second). The motion status of the object before the first illumination and after the last illumination (the 1024th illumination pattern) in one image frame is shown in Fig. 5(a) and Fig. 5(b). The reconstructed images obtained with conventional methods from the recorded single-pixel intensity data and the standard Hadamard patterns are shown in Fig. 5(c). It can be observed that the reconstructed images have good quality for static or slowly moving objects (Examples 1.1 and 1.2), but the reconstructed images are noisy and blurred for fast moving objects (Examples 1.3 and 1.4). The quality of the reconstructed results becomes increasingly degraded as the speed increases. In the conventional reconstruction method, vx and vy are both taken to be 0, which is identical or very close to the actual moving speed for a static or very slowly moving object (Examples 1.1 and 1.2). Consequently, there is no need to estimate the actual motion parameters with our proposed scheme in these cases. However, the assumption that vx and vy are both 0 is obviously invalid for fast moving objects (Examples 1.3, 1.4 and 1.5) and motion estimation with our proposed scheme is necessary. The reconstruction results with our proposed scheme for fast moving objects are shown in Fig. 5(d). The search ranges for vx and vy in SGA are set to [-31 31] and [-31 31] (unit: pixels/s). Each parameter is represented by 8 bits and the population size is 30 in the SGA optimization. The maximum number of iterations in SGA is set to 10. The results illustrate that the reconstructed images in our scheme have significantly better image quality than those obtained by the conventional method. The reconstructed images in our scheme have similar quality for varying speeds, since it takes an equal amount of computation to estimate a high-speed value and a low-speed value. In the examples above, the object moves in the y direction with varying speeds; our proposed scheme achieves similar performance when the object moves in the x direction.

 

Fig. 5 Simulated translationally shifting object: (a) Object image at the start of one video frame; (b) Object image at the end of one video frame; (c) Reconstructed results with conventional methods; (d) Reconstructed results with our proposed scheme.


If the object is moving in both the x direction and the y direction (vx = 5, vy = 10), the reconstructed results are shown in Example 1.5 of Fig. 5, and our proposed scheme can still perform accurate motion estimation and significantly enhance the image quality. The estimated motion parameters and the true motion parameters are compared in Table 1 and their values are quite close. The causes of error in the estimated values are analyzed at the end of Section 3.1.


Table 1. Comparison of estimated and true motion parameters for simulated translational shifting object

In the second example, the assumed frame rate is also one frame per second. The object (an airplane) is rotating clockwise around a center point (8,24) at four different speeds (0 degrees per second, 5 degrees per second, 15 degrees per second, 30 degrees per second). The motion status of the object at the start and the end of a video frame is shown in Fig. 6(a) and Fig. 6(b). The reconstructed images obtained with conventional methods from the recorded single-pixel intensity data and the standard Hadamard patterns are shown in Fig. 6(c). Similar to Fig. 5, it can be observed that when the object is static or rotating very slowly, the conventionally reconstructed images have good quality. However, when the object is rotating fast, the conventionally reconstructed images are heavily degraded. The object images are then reconstructed with our proposed scheme for the fast moving situations (Examples 1.3, 1.4 and 1.5). In the optimization, the search ranges of Px, Py and w are [1,32], [1,32] and [-100 100] (unit: degrees/s), respectively. Each parameter is represented by 8 bits and the population size is 50 in the SGA optimization. The maximum number of iterations in SGA is set to 10. The results in Fig. 6(d) exhibit significantly better quality than those in Fig. 6(c). The estimated motion parameters are close to the actual values, with minor discrepancies, as illustrated in Table 2.

 

Fig. 6 Simulated rotating airplane object: (a) Object image at the start of one video frame; (b) Object image at the end of one video frame; (c) Reconstructed results with conventional methods; (d) Reconstructed results with our proposed scheme.



Table 2. Comparison of estimated and true motion parameters for simulated rotating airplane object

Compared with the first example, three unknown motion parameters instead of two need to be optimized in this example. The population size in SGA is increased from 30 to 50 due to the enlargement of the search space. For both the translational shifting movement and the rotating movement, the reconstructed images with the conventional method in Fig. 5 and Fig. 6 contain a blurred object and some surrounding noise points. Both low frequency and high frequency noise is introduced since the recorded single-pixel intensity sequence (i.e. the Hadamard spectrum) is distorted for a fast-moving object. The performance of our proposed method in restoring the reconstructed images is similar for translational shifting movement and rotating movement with varying speeds. The images of the original target objects can be recovered with good fidelity, with the blur and noise maximally reduced.

In addition, a simulation of a disk printed with white digits rotating around its center at specified speeds is performed, which models the optical experiment in Section 3.2. Each object image (32 × 32 pixels) contains one digit, and the digit rotates around a center point outside the image, as shown in Fig. 7 and Fig. 8(a). The position of the rotation center is not known, and the search ranges of Px and Py are [0 32] and [-100 −20] in our scheme, as shown in Fig. 7.

 

Fig. 7 Search of optimal motion parameters for the rotating disk object.


 

Fig. 8 Simulated rotating disk with digit numbers: (a) Original object images; (b) Reconstructed results with conventional methods when the object is static; (c) Reconstructed results with conventional methods when the disk is rotating at 2 rounds per second; (d) Reconstructed results with conventional methods when the disk is rotating at 4 rounds per second; (e) Reconstructed results with conventional methods when the disk is rotating at 8 rounds per second; (f) Reconstructed results with our proposed scheme when the disk is rotating at 4 rounds per second; (g) Reconstructed results with our proposed scheme when the disk is rotating at 8 rounds per second.


In our proposed scheme, the search range for the rotating speed w is [0 80] (unit: rounds/s). The actual rotating speed of the object is set to 0, 2, 4 and 8 rounds/s, respectively. The number of illumination patterns projected within one second is 500000. The reconstructed images with the conventional method are shown in Fig. 8(b), Fig. 8(c), Fig. 8(d) and Fig. 8(e). Similar to the examples above, the reconstructed image quality becomes more degraded as the disk rotates faster. The reconstructed images with our proposed scheme when the disk is rotating at 4 and 8 rounds/s are shown in Fig. 8(f) and Fig. 8(g). It can be observed that the image fidelity is significantly improved compared with Fig. 8(d) and Fig. 8(e). The estimated parameters with our proposed scheme are compared with the actual motion parameters in Table 3.
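
As a rough estimate based on these numbers, with 500000 patterns projected per second and M = 1024 patterns per frame, one frame spans about 1024/500000 ≈ 2 ms, so a disk rotating at 8 rounds/s turns by roughly 8 × 360° × 1024/500000 ≈ 5.9° within a single frame.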


Table 3. Comparison of estimated and true motion parameters for simulated digits on a rotating disk

In Table 1, Table 2 and Table 3, the estimated motion parameters in our proposed scheme are close to the actual ones, but some minor errors and discrepancies can be found. The estimated values do not fully agree with the actual values for the following three reasons. First, SGA is not an exhaustive search algorithm and does not attempt every possible value in the search space. The absolute best solution cannot be ensured, but a sub-optimal solution close to the optimal one can usually be obtained. Second, the function E[Onew(vx,vy)] (i.e. the variance of image intensities in this work) is a very effective image quality metric but may not always fully reflect the true image quality. The metric value may not be exactly at its peak when the estimated parameters are identical to the actual ones. Third, a set of motion parameters with not fully correct values may sometimes yield very similar results to the set of correct motion parameters. For example, in Fig. 9(a) and Fig. 9(b) the same object rotates around different centers with different speeds. The rotation radius in Fig. 9(a) is longer but the rotation speed in Fig. 9(b) is higher. The two sets of motion parameters are compared in Fig. 9(c). It can be observed that the object moves from Point A to Point B along very similar motion paths in both cases, even though the motion parameters are different. In this situation, if the motion parameters in Fig. 9(b) are used as the estimated parameters in our proposed scheme for the moving object in Fig. 9(a), the reconstructed images will still have very satisfactory quality.

 

Fig. 9 Similar moving path with different sets of motion parameters: (a) First set of motion parameters with longer rotation radius but lower rotation speed; (b) Second set of motion parameters with shorter rotation radius but higher rotation speed; (c) Comparison of Fig. 9(a) and Fig. 9(b).


3.2. Experimental results

In the optical experiment, a black disk printed with white digits rotates around its center at specified speeds. The optical setup is the same as in the previous work [35]. Each digit passes through a fixed imaging window as the disk rotates. An LED array with 32 × 32 pixel resolution is employed to project Hadamard illumination patterns at a rate of 500000 patterns per second. The rotating speed of the disk is specified as 2 rounds per second, 4 rounds per second and 8 rounds per second. The single-pixel intensity data of three images at different time points are recorded by the SPI system for each rotating speed. The object images are first reconstructed with conventional methods, and the reconstructed results when the disk is rotating at 2 rounds per second [Fig. 10(a)] have good quality. However, as the rotating speed increases, the reconstructed images [Fig. 10(b) and Fig. 10(c)] become increasingly degraded. The object images are then reconstructed with our proposed scheme, based on the rotating motion model shown in Fig. 7, for rotating speeds of 4 rounds per second and 8 rounds per second, as shown in Fig. 10(d) and Fig. 10(e). In the optimization, the parameters are the same as those in the simulation. The search ranges of Px, Py and w are [0 32], [-100 −20] and [0 80] (unit: rounds/s), respectively. Each parameter is represented by 8 bits and the population size is 30 in the SGA optimization. The maximum number of iterations in SGA is set to 10. As a comparison, the average image variances of Fig. 10(b) and Fig. 10(d) are 0.0260 and 0.0354, respectively, and the average image variances of Fig. 10(c) and Fig. 10(e) are 0.0252 and 0.0359, respectively. The visual quality of the images in Fig. 10(d) and 10(e) evidently outperforms that of Fig. 10(b) and 10(c).

 

Fig. 10 Optical experimental results: reconstructed images with conventional methods when the disk is rotating at (a) 2 rounds per second; (b) 4 rounds per second; (c) 8 rounds per second; reconstructed images with our proposed scheme when the disk is rotating at (d) 4 rounds per second; (e) 8 rounds per second.


The results of the optical experiments generally agree with the simulation results in Section 3.1, and the effectiveness of our proposed scheme is verified experimentally. Compared with the simulation results in Fig. 8, the reconstructed images in Fig. 10 are contaminated with more noise, regardless of whether the conventional reconstruction method or our proposed method is used. In practical optical experiments, many factors, such as the light illumination, the precision of alignment and the accuracy of the detector, may deviate from the ideal assumptions of the simulation to some extent. As a result, more noise is introduced into the reconstructed images in addition to the blur and noise due to object motion. The reconstruction results indicate that our proposed scheme has a certain robustness under noisy conditions.

4. Conclusion

In single-pixel imaging (SPI), a two-dimensional image can be captured by a single-pixel detector instead of a sensor array device. As a trade-off, a large number of illuminations is required for capturing one single image. Since the projection device can only project a limited number of illumination patterns within a short time, the video frame rate in SPI can be quite low for a highly dynamic object scene. When the target object is moving fast, the reconstructed image in SPI will suffer from severe quality degradation due to motion blur. Motion estimation and image deblurring can be employed to enhance the reconstructed image quality. Different from the motion estimation schemes based on multiple neighboring video frames in previous works, we propose, for the first time, a motion estimation and image quality enhancement scheme for one single reconstructed image in SPI. Our proposed scheme is based on the unique mechanism of SPI and differs from conventional single-photograph deblurring methods for sensor array cameras. It is assumed that some prior knowledge about the type of object motion is known, and a motion model is constructed for the target object. The object is treated as static while the illumination patterns are equivalently moving in a reverse manner. The motion parameters in the model are optimized, and an image with high fidelity can be reconstructed from the transformed illumination patterns with the optimal parameters. Experimental results demonstrate that the results reconstructed with our proposed scheme have substantially better image quality than those of conventional methods.

In future work, blind motion estimation and quality enhancement for one single reconstructed image in SPI, without any prior knowledge of the motion type, will be attempted. The type of motion the object undergoes could possibly be identified with artificial intelligence methods from the recorded single-pixel intensity sequence or from a low-quality reconstructed image.

Funding

National Natural Science Foundation of China (61805145, 11774240, 61675016); Natural Science Foundation of Beijing Municipality (4172039); Leading Talents of Guangdong Province Program (00201505); Natural Science Foundation of Guangdong Province (2016A030312010).

References

1. A. Gatti, E. Brambilla, M. Bache, and L. A. Lugiato, “Ghost imaging with thermal light: comparing entanglement and classical correlation,” Phys. Rev. Lett. 93(9), 093602 (2004).

2. J. H. Shapiro, “Computational ghost imaging,” Phys. Rev. A 78(6), 061802 (2008).

3. M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25(2), 83–91 (2008).

4. M. P. Edgar, G. M. Gibson, and M. J. Padgett, “Principles and prospects for single-pixel imaging,” Nat. Photonics 13(1), 13–20 (2019).

5. M. J. Sun and J. M. Zhang, “Single-pixel imaging and its application in three-dimensional reconstruction: a brief review,” Sensors (Basel) 19(3), 732 (2019).

6. B. I. Erkmen, “Computational ghost imaging for remote sensing,” J. Opt. Soc. Am. A 29(5), 782–789 (2012).

7. M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7(1), 12010 (2016).

8. E. Salvador-Balaguer, P. Latorre-Carmona, C. Chabert, F. Pla, J. Lancis, and E. Tajahuerce, “Low-cost single-pixel 3D imaging by using an LED array,” Opt. Express 26(12), 15623–15631 (2018).

9. N. Radwell, K. J. Mitchell, G. M. Gibson, M. P. Edgar, R. Bowman, and M. J. Padgett, “Single-pixel infrared and visible microscope,” Optica 1(5), 285–289 (2014).

10. R. S. Aspden, N. R. Gemmell, P. A. Morris, D. S. Tasca, L. Mertens, M. G. Tanner, R. A. Kirkwood, A. Ruggeri, A. Tosi, R. W. Boyd, G. S. Buller, R. H. Hadfield, and M. J. Padgett, “Photon-sparse microscopy: visible light imaging using infrared illumination,” Optica 2(12), 1049 (2015).

11. Z. Zhang, S. Jiao, M. Yao, X. Li, and J. Zhong, “Secured single-pixel broadcast imaging,” Opt. Express 26(11), 14578–14591 (2018).

12. S. Jiao, C. Zhou, Y. Shi, W. Zou, and X. Li, “Review on optical image hiding and watermarking techniques,” Opt. Laser Technol. 109, 370–380 (2019).

13. S. Liansheng, W. Jiahao, T. Ailing, and A. Asundi, “Optical image hiding under framework of computational ghost imaging based on an expansion strategy,” Opt. Express 27(5), 7213–7225 (2019).

14. C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101(14), 141123 (2012).

15. W. Gong, C. Zhao, H. Yu, M. Chen, W. Xu, and S. Han, “Three-dimensional ghost imaging lidar via sparsity constraint,” Sci. Rep. 6(1), 26133 (2016).

16. G. M. Gibson, B. Sun, M. P. Edgar, D. B. Phillips, N. Hempler, G. T. Maker, G. P. Malcolm, and M. J. Padgett, “Real-time imaging of methane gas leaks using a single-pixel camera,” Opt. Express 25(4), 2998–3005 (2017).

17. J. Peng, M. Yao, J. Cheng, Z. Zhang, S. Li, G. Zheng, and J. Zhong, “Micro-tomography via single-pixel imaging,” Opt. Express 26(24), 31094–31105 (2018).

18. Z. Zhang, X. Ma, and J. Zhong, “Single-pixel imaging by means of Fourier spectrum acquisition,” Nat. Commun. 6(1), 6225 (2015).

19. L. Wang and S. Zhao, “Fast reconstructed and high-quality ghost imaging with fast Walsh–Hadamard transform,” Photon. Res. 4(6), 240–244 (2016).

20. S. Jiao, “Design of optimal illumination patterns in single-pixel imaging using image dictionaries,” arXiv preprint arXiv:1806.01340 (2018).

21. W. K. Yu, M. F. Li, X. R. Yao, X. F. Liu, L. A. Wu, and G. J. Zhai, “Adaptive compressive ghost imaging based on wavelet trees and sparse representation,” Opt. Express 22(6), 7133–7144 (2014).

22. M.-J. Sun, M. P. Edgar, D. B. Phillips, G. M. Gibson, and M. J. Padgett, “Improving the signal-to-noise ratio of single-pixel imaging using digital microscanning,” Opt. Express 24(10), 10476–10485 (2016).

23. M.-J. Sun, W. Chen, T.-F. Liu, and L.-J. Li, “Image retrieval in spatial and temporal domains with a quadrant detector,” IEEE Photonics J. 9(5), 1 (2017).

24. L. Bian, J. Suo, Q. Dai, and F. Chen, “Experimental comparison of single-pixel imaging algorithms,” J. Opt. Soc. Am. A 35(1), 78–87 (2018).

25. D. B. Phillips, M. J. Sun, J. M. Taylor, M. P. Edgar, S. M. Barnett, G. M. Gibson, and M. J. Padgett, “Adaptive foveated single-pixel imaging with dynamic supersampling,” Sci. Adv. 3(4), e1601782 (2017).

26. V. Kravets and A. Stern, “Video compressive sensing using Russian dolls ordering of Hadamard basis for multi-scale sampling of a scene in motion using a single pixel camera,” in Computational Imaging III, Proc. SPIE 10669, 106690 (2018).

27. R. C. Gonzalez and R. E. Woods, Digital Image Processing (Addison-Wesley, 1992).

28. D. S. Biggs and M. Andrews, “Acceleration of iterative image restoration algorithms,” Appl. Opt. 36(8), 1766–1775 (1997).

29. K. Guo, S. Jiang, and G. Zheng, “Multilayer fluorescence imaging on a single-pixel detector,” Biomed. Opt. Express 7(7), 2425–2431 (2016).

30. S. Jiao and P. W. M. Tsang, “Enhanced autofocusing scheme in digital holography based on hologram decomposition,” IEEE International Conference on Industrial Informatics (INDIN) 14, 541–545 (2016).

31. S. Jiao, P. W. M. Tsang, T. C. Poon, J. P. Liu, W. Zou, and X. Li, “Enhanced autofocusing in optical scanning holography based on hologram decomposition,” IEEE Trans. Industr. Inform. 13(5), 2455–2463 (2017).

32. N. Sannomiya, H. Iima, and N. Akatsuka, “Genetic algorithm approach to a production ordering problem in an assembly process with constant use of parts,” Int. J. Syst. Sci. 25(9), 1461–1472 (1994).

33. S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, “Optimization by simulated annealing,” Science 220(4598), 671–680 (1983).

34. A. S. M. Jiao, P. W. M. Tsang, and T. C. Poon, “Restoration of digital off-axis Fresnel hologram by exemplar and search based image inpainting with enhanced computing speed,” Comput. Phys. Commun. 193, 30–37 (2015).

35. Z. H. Xu, W. Chen, J. Penuelas, M. Padgett, and M. J. Sun, “1000 fps computational ghost imaging using LED-based structured illumination,” Opt. Express 26(3), 2427–2434 (2018).

[Crossref] [PubMed]

Sun, M.-J.

M.-J. Sun, W. Chen, T.-F. Liu, and L.-J. Li, “Image retrieval in spatial and temporal domains with a quadrant detector,” IEEE Photonics J. 9(5), 1 (2017).
[Crossref]

M.-J. Sun, M. P. Edgar, D. B. Phillips, G. M. Gibson, and M. J. Padgett, “Improving the signal-to-noise ratio of single-pixel imaging using digital microscanning,” Opt. Express 24(10), 10476–10485 (2016).
[Crossref] [PubMed]

M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7(1), 12010 (2016).
[Crossref] [PubMed]

Sun, T.

M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25(2), 83–91 (2008).
[Crossref]

Suo, J.

Tajahuerce, E.

Takhar, D.

M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25(2), 83–91 (2008).
[Crossref]

Tanner, M. G.

Tasca, D. S.

Taylor, J. M.

D. B. Phillips, M. J. Sun, J. M. Taylor, M. P. Edgar, S. M. Barnett, G. M. Gibson, and M. J. Padgett, “Adaptive foveated single-pixel imaging with dynamic supersampling,” Sci. Adv. 3(4), e1601782 (2017).
[Crossref] [PubMed]

Tosi, A.

Tsang, P. W. M.

S. Jiao, P. W. M. Tsang, T. C. Poon, J. P. Liu, W. Zou, and X. Li, “Enhanced autofocusing in optical scanning holography based on hologram decomposition,” IEEE Trans. Industr. Inform. 13(5), 2455–2463 (2017).
[Crossref]

A. S. M. Jiao, P. W. M. Tsang, and T. C. Poon, “Restoration of digital off-axis Fresnel hologram by exemplar and search based image inpainting with enhanced computing speed,” Comput. Phys. Commun. 193, 30–37 (2015).
[Crossref]

S. Jiao and P. W. M. Tsang, “Enhanced autofocusing scheme in digital holography based on hologram decomposition,” IEEE International Conference on Industrial Informatics (INDIN)14,541–545 (2016).

Vecchi, M. P.

S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science 220(4598), 671–680 (1983).
[Crossref] [PubMed]

Wang, H.

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101(14), 141123 (2012).
[Crossref]

Wang, L.

Wu, L. A.

Xu, W.

W. Gong, C. Zhao, H. Yu, M. Chen, W. Xu, and S. Han, “Three-dimensional ghost imaging lidar via sparsity constraint,” Sci. Rep. 6(1), 26133 (2016).
[Crossref] [PubMed]

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101(14), 141123 (2012).
[Crossref]

Xu, Z. H.

Yao, M.

Yao, X. R.

Yu, H.

W. Gong, C. Zhao, H. Yu, M. Chen, W. Xu, and S. Han, “Three-dimensional ghost imaging lidar via sparsity constraint,” Sci. Rep. 6(1), 26133 (2016).
[Crossref] [PubMed]

Yu, W. K.

Zhai, G. J.

Zhang, J. M.

M. J. Sun and J. M. Zhang, “Single-pixel imaging and its application in three-dimensional reconstruction: a brief review,” Sensors (Basel) 19(3), 732 (2019).
[Crossref] [PubMed]

Zhang, Z.

Zhao, C.

W. Gong, C. Zhao, H. Yu, M. Chen, W. Xu, and S. Han, “Three-dimensional ghost imaging lidar via sparsity constraint,” Sci. Rep. 6(1), 26133 (2016).
[Crossref] [PubMed]

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101(14), 141123 (2012).
[Crossref]

Zhao, S.

Zheng, G.

Zhong, J.

Zhou, C.

S. Jiao, C. Zhou, Y. Shi, W. Zou, and X. Li, “Review on optical image hiding and watermarking techniques,” Opt. Laser Technol. 109, 370–380 (2019).
[Crossref]

Zou, W.

S. Jiao, C. Zhou, Y. Shi, W. Zou, and X. Li, “Review on optical image hiding and watermarking techniques,” Opt. Laser Technol. 109, 370–380 (2019).
[Crossref]

S. Jiao, P. W. M. Tsang, T. C. Poon, J. P. Liu, W. Zou, and X. Li, “Enhanced autofocusing in optical scanning holography based on hologram decomposition,” IEEE Trans. Industr. Inform. 13(5), 2455–2463 (2017).
[Crossref]

Appl. Opt. (1)

Appl. Phys. Lett. (1)

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101(14), 141123 (2012).
[Crossref]

Biomed. Opt. Express (1)

Comput. Phys. Commun. (1)

A. S. M. Jiao, P. W. M. Tsang, and T. C. Poon, “Restoration of digital off-axis Fresnel hologram by exemplar and search based image inpainting with enhanced computing speed,” Comput. Phys. Commun. 193, 30–37 (2015).
[Crossref]

IEEE Photonics J. (1)

M.-J. Sun, W. Chen, T.-F. Liu, and L.-J. Li, “Image retrieval in spatial and temporal domains with a quadrant detector,” IEEE Photonics J. 9(5), 1 (2017).
[Crossref]

IEEE Signal Process. Mag. (1)

M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25(2), 83–91 (2008).
[Crossref]

IEEE Trans. Industr. Inform. (1)

S. Jiao, P. W. M. Tsang, T. C. Poon, J. P. Liu, W. Zou, and X. Li, “Enhanced autofocusing in optical scanning holography based on hologram decomposition,” IEEE Trans. Industr. Inform. 13(5), 2455–2463 (2017).
[Crossref]

In Computational Imaging III. International Society for Optics and Photonics (1)

V. Kravets and A. Stern, “Video compressive sensing using Russian dolls ordering of Hadamard basis for multi-scale sampling of a scene in motion using a single pixel camera,” In Computational Imaging III. International Society for Optics and Photonics 10669, 106690 (2018).

Int. J. Syst. Sci. (1)

N. Sannomiya, H. Iima, and N. Akatsuka, “Genetic algorithm approach to a production ordering problem in an assembly process with constant use of parts,” Int. J. Syst. Sci. 25(9), 1461–1472 (1994).
[Crossref]

J. Opt. Soc. Am. A (2)

Nat. Commun. (2)

M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7(1), 12010 (2016).
[Crossref] [PubMed]

Z. Zhang, X. Ma, and J. Zhong, “Single-pixel imaging by means of Fourier spectrum acquisition,” Nat. Commun. 6(1), 6225 (2015).
[Crossref] [PubMed]

Nat. Photonics (1)

M. P. Edgar, G. M. Gibson, and M. J. Padgett, “Principles and prospects for single-pixel imaging,” Nat. Photonics 13(1), 13–20 (2019).
[Crossref]

Opt. Express (8)

E. Salvador-Balaguer, P. Latorre-Carmona, C. Chabert, F. Pla, J. Lancis, and E. Tajahuerce, “Low-cost single-pixel 3D imaging by using an LED array,” Opt. Express 26(12), 15623–15631 (2018).
[Crossref] [PubMed]

G. M. Gibson, B. Sun, M. P. Edgar, D. B. Phillips, N. Hempler, G. T. Maker, G. P. Malcolm, and M. J. Padgett, “Real-time imaging of methane gas leaks using a single-pixel camera,” Opt. Express 25(4), 2998–3005 (2017).
[Crossref] [PubMed]

J. Peng, M. Yao, J. Cheng, Z. Zhang, S. Li, G. Zheng, and J. Zhong, “Micro-tomography via single-pixel imaging,” Opt. Express 26(24), 31094–31105 (2018).
[Crossref] [PubMed]

Z. Zhang, S. Jiao, M. Yao, X. Li, and J. Zhong, “Secured single-pixel broadcast imaging,” Opt. Express 26(11), 14578–14591 (2018).
[Crossref] [PubMed]

W. K. Yu, M. F. Li, X. R. Yao, X. F. Liu, L. A. Wu, and G. J. Zhai, “Adaptive compressive ghost imaging based on wavelet trees and sparse representation,” Opt. Express 22(6), 7133–7144 (2014).
[Crossref] [PubMed]

M.-J. Sun, M. P. Edgar, D. B. Phillips, G. M. Gibson, and M. J. Padgett, “Improving the signal-to-noise ratio of single-pixel imaging using digital microscanning,” Opt. Express 24(10), 10476–10485 (2016).
[Crossref] [PubMed]

Z. H. Xu, W. Chen, J. Penuelas, M. Padgett, and M. J. Sun, “1000 fps computational ghost imaging using LED-based structured illumination,” Opt. Express 26(3), 2427–2434 (2018).
[Crossref] [PubMed]

S. Liansheng, W. Jiahao, T. Ailing, and A. Asundi, “Optical image hiding under framework of computational ghost imaging based on an expansion strategy,” Opt. Express 27(5), 7213–7225 (2019).
[Crossref] [PubMed]

Opt. Laser Technol. (1)

S. Jiao, C. Zhou, Y. Shi, W. Zou, and X. Li, “Review on optical image hiding and watermarking techniques,” Opt. Laser Technol. 109, 370–380 (2019).
[Crossref]

Optica (2)

Photon. Res. (1)

Phys. Rev. A (1)

J. H. Shapiro, “Computational ghost imaging,” Phys. Rev. A 78(6), 061802 (2008).
[Crossref]

Phys. Rev. Lett. (1)

A. Gatti, E. Brambilla, M. Bache, and L. A. Lugiato, “Ghost imaging with thermal light: comparing entanglement and classical correlation,” Phys. Rev. Lett. 93(9), 093602 (2004).
[Crossref] [PubMed]

Sci. Adv. (1)

D. B. Phillips, M. J. Sun, J. M. Taylor, M. P. Edgar, S. M. Barnett, G. M. Gibson, and M. J. Padgett, “Adaptive foveated single-pixel imaging with dynamic supersampling,” Sci. Adv. 3(4), e1601782 (2017).
[Crossref] [PubMed]

Sci. Rep. (1)

W. Gong, C. Zhao, H. Yu, M. Chen, W. Xu, and S. Han, “Three-dimensional ghost imaging lidar via sparsity constraint,” Sci. Rep. 6(1), 26133 (2016).
[Crossref] [PubMed]

Science (1)

S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science 220(4598), 671–680 (1983).
[Crossref] [PubMed]

Sensors (Basel) (1)

M. J. Sun and J. M. Zhang, “Single-pixel imaging and its application in three-dimensional reconstruction: a brief review,” Sensors (Basel) 19(3), 732 (2019).
[Crossref] [PubMed]

Other (3)

S. Jiao, “Design of optimal illumination patterns in single-pixel imaging using image dictionaries,” arXiv preprint arXiv:1806.01340 (2018).

“Digital Image Processing”, R. C. Gonzalez & R. E. Woods, Addison-Wesley Publishing Company Inc. (1992).

S. Jiao and P. W. M. Tsang, “Enhanced autofocusing scheme in digital holography based on hologram decomposition,” IEEE International Conference on Industrial Informatics (INDIN)14,541–545 (2016).



Figures (10)

Fig. 1 Optical setup for a single-pixel imaging system.
Fig. 2 Translationally shifting object in SPI: (a) the object is moving; (b) the illumination patterns are static; (c) the object is static; (d) the illumination patterns are shifted in a reverse manner.
Fig. 3 Rotating object in SPI: (a) the object is moving; (b) the illumination patterns are static; (c) the object is static; (d) the illumination patterns are rotated in a reverse manner.
Fig. 4 Flowchart of our proposed motion estimation scheme.
Fig. 5 Simulated translationally shifting object: (a) object image at the start of one video frame; (b) object image at the end of one video frame; (c) reconstructed results with conventional methods; (d) reconstructed results with our proposed scheme.
Fig. 6 Simulated rotating airplane object: (a) object image at the start of one video frame; (b) object image at the end of one video frame; (c) reconstructed results with conventional methods; (d) reconstructed results with our proposed scheme.
Fig. 7 Search for the optimal motion parameters of the rotating disk object.
Fig. 8 Simulated rotating disk with digits: (a) original object images; (b) reconstructed results with conventional methods when the object is static; (c)–(e) reconstructed results with conventional methods when the disk rotates at 2, 4 and 8 revolutions per second, respectively; (f)–(g) reconstructed results with our proposed scheme when the disk rotates at 4 and 8 revolutions per second, respectively.
Fig. 9 Similar moving paths produced by different sets of motion parameters: (a) first set, with a longer rotation radius but lower rotation speed; (b) second set, with a shorter rotation radius but higher rotation speed; (c) comparison of (a) and (b).
Fig. 10 Optical experimental results: reconstructed images with conventional methods when the disk rotates at (a) 2, (b) 4 and (c) 8 revolutions per second; reconstructed images with our proposed scheme when the disk rotates at (d) 4 and (e) 8 revolutions per second.

Tables (3)

Table 1 Comparison of estimated and true motion parameters for the simulated translationally shifting object
Table 2 Comparison of estimated and true motion parameters for the simulated rotating airplane object
Table 3 Comparison of estimated and true motion parameters for the simulated digits on a rotating disk

Equations (5)


$$I = HO$$

$$\begin{bmatrix} I(1) \\ I(2) \\ \vdots \\ I(M) \end{bmatrix} =
\begin{bmatrix} H(1,1) & \cdots & H(1,N) \\ \vdots & \ddots & \vdots \\ H(M,1) & \cdots & H(M,N) \end{bmatrix}
\begin{bmatrix} O(1) \\ O(2) \\ \vdots \\ O(N) \end{bmatrix}$$
$$H_{new}(v_x, v_y) = T(H, v_x, v_y)$$

$$H_{new}(P_x, P_y, w) = T(H, P_x, P_y, w)$$

$$E\left[O_{new}(v_x, v_y)\right] = \mathrm{Var}\left[O_{new}(v_x, v_y)\right]$$
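
Taken together, these five equations describe a complete pipeline: record the single-pixel sequence I = HO, transform the illumination patterns in the reverse direction of an assumed motion, reconstruct with the transformed patterns, and keep the motion parameters whose reconstruction has the largest variance. The NumPy sketch below illustrates that pipeline for the translational case of Eq. (3) only; the toy object, the correlation-type reconstruction, the cyclic np.roll shift and the brute-force grid search are assumptions introduced here for illustration (a metaheuristic such as simulated annealing or a genetic algorithm could drive the search instead), so it should be read as a schematic rather than as the authors' implementation.

```python
import numpy as np

# Minimal, self-contained sketch of the measurement model and the
# motion-parameter search for a translationally shifting object. Every
# name (O_true, vx_true, reconstruct, ...) is illustrative only.
rng = np.random.default_rng(0)
side = 16                                    # image is side x side pixels
N, M = side * side, 2 * side * side          # N pixels, M illumination patterns
dt = 1.0 / M                                 # one image frame normalized to 1 s

patterns = [rng.integers(0, 2, (side, side)).astype(float) for _ in range(M)]

def reconstruct(I, H_rows):
    """Correlation-type SPI reconstruction from I = HO (Eqs. (1)-(2))."""
    return ((I - I.mean()) @ H_rows / len(I)).reshape(side, side)

def shifted_patterns(vx, vy):
    """Shift the k-th pattern by (-vx, -vy)*k*dt pixels, i.e. move the
    patterns in reverse to the assumed object motion (Eq. (3)).
    np.roll gives a cyclic shift, used here only for simplicity."""
    rows = []
    for k, P in enumerate(patterns):
        dx, dy = int(round(-vx * k * dt)), int(round(-vy * k * dt))
        rows.append(np.roll(np.roll(P, dy, axis=0), dx, axis=1).ravel())
    return np.stack(rows)

# Simulate one frame of measurements of an object drifting at (vx_true, vy_true).
O_true = np.zeros((side, side))
O_true[4:10, 4:10] = 1.0                      # toy object
vx_true, vy_true = 5.0, -3.0                  # pixels per frame
I = np.array([
    (np.roll(np.roll(O_true, int(round(vy_true * k * dt)), axis=0),
             int(round(vx_true * k * dt)), axis=1) * P).sum()
    for k, P in enumerate(patterns)
])

# Search the motion-parameter space; the variance of the compensated
# reconstruction serves as the quality metric E of Eq. (5).
candidates = range(-6, 7)
best = max(((vx, vy) for vx in candidates for vy in candidates),
           key=lambda v: reconstruct(I, shifted_patterns(*v)).var())
O_new = reconstruct(I, shifted_patterns(*best))
print("estimated (vx, vy):", best)            # ideally close to (5, -3)
```

For the rotational model of Eq. (4), shifted_patterns would be replaced by a routine that rotates each pattern by -w*k*dt about the candidate center (P_x, P_y) before reconstruction.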
