Abstract

Event-based cameras (EBCs) are of interest for potential application to space domain awareness (SDA). EBC attributes, including asynchronous response and low latency, provide data reduction, break the trade-off between latency and power, and, because each event carries its own timestamp, enable consideration of additional algorithms and processing architectures. Potential data reduction by a factor of 10 or greater is particularly attractive for SDA from satellite platforms with constraints on system power, processing, and communication bandwidth. Here we report our initial evaluation of Prophesee third-generation commercial-off-the-shelf (COTS) EBCs, including development of, and comparison with, a limiting magnitude model. The analytic model is a function of sky background radiance; EBC parameters including contrast threshold, dark current, pixel pitch, and spectral quantum efficiency; and the optic aperture diameter and focal length. Using an 85 mm f/1.4 lens, the measured detection limits for the half-size video graphics array (HVGA)- and video graphics array (VGA)-format EBCs are 6.9 and 9.8 visual magnitudes (${m_V}$), respectively, at a sky background level of about ${20.3}\;{m_V}$ per square arcsecond. The empirical sensitivity limit for the VGA differs by ${0.1}\;{m_V}$ from our analytic prediction of 9.7 (less than a 10% difference in flux). The limiting magnitude model assumes slow motion of point objects across the EBC focal plane array. Additional experiments exploring temporal behavior show that no stars are detected while scanning the night sky faster than 0.5 deg per second with the VGA EBC mounted to a 200 mm f/2.0 lens. The limited sensitivity of the evaluated COTS EBCs prevents their use as a replacement for typical CCD/CMOS framing sensors, but EBCs show clear promise for small-aperture, large-field persistent SDA owing to their efficient capture of temporal information.
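The stated equivalence between the ${0.1}\;{m_V}$ discrepancy and a sub-10% flux difference follows from the standard Pogson magnitude scale, $m_1 - m_2 = -2.5\,\log_{10}(F_1/F_2)$. A minimal Python sketch (the function name is illustrative, not from the paper) checks the figure:

```python
import math

def mag_diff_to_flux_ratio(delta_m):
    """Flux ratio corresponding to a magnitude difference delta_m,
    via the Pogson relation m1 - m2 = -2.5 log10(F1/F2)."""
    return 10.0 ** (0.4 * delta_m)

# Measured VGA limit (9.8 m_V) vs. analytic prediction (9.7 m_V):
ratio = mag_diff_to_flux_ratio(9.8 - 9.7)
print(f"flux ratio for 0.1 mag: {ratio:.4f}")  # about 1.096, i.e. < 10%
```

A 0.1-magnitude difference thus corresponds to roughly a 9.6% change in flux, consistent with the "less than 10%" statement in the abstract.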



2020 (1)

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

2019 (1)

2018 (2)

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

2014 (1)

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).
[Crossref]

2011 (1)

C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS,” IEEE J. Solid-State Circuits 46, 259–275 (2011).
[Crossref]

2010 (1)

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Afshar, S.

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

G. Cohen, S. Afshar, and A. van Schaik, “Approaches for astrometry using event-based sensors,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2018.

Bamford, S. A.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Bartolozzi, C.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Bello, D. S. S.

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

Berry, S.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Bessell, T.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

Blanton, M.

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Brady, F.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Cavaco, C.

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

Censi, A.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Chotard, L.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Cohen, G.

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

G. Cohen, S. Afshar, and A. van Schaik, “Approaches for astrometry using event-based sensors,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2018.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

Conradt, J.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Corradi, F.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Daniilidis, K.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Davison, A. J.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Delbrück, T.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).
[Crossref]

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January2019.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

T. Delbrück, Y. Hu, and Z. He, “V2E: from video frames to realistic DVS event camera streams,” arXiv:2006.07722 (2020).

Finateu, T.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Gallego, G.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

He, Z.

T. Delbrück, Y. Hu, and Z. He, “V2E: from video frames to realistic DVS event camera streams,” arXiv:2006.07722 (2020).

Hébert, M.

Helmchen, F.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Hogg, D. W.

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Howell, S. B.

S. B. Howell, Handbook of CCD Astronomy, 2nd ed. (Cambridge University, 2006).

Hu, Y.

T. Delbrück, Y. Hu, and Z. He, “V2E: from video frames to realistic DVS event camera streams,” arXiv:2006.07722 (2020).

Joubert, D.

Kaminski, K.

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January2019.

Konik, H.

Lang, D.

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Lavergne, C.

LeGoff, F.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Leutenegger, S.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Li, C.

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Lichtsteiner, P.

P. Lichtsteiner, “An AER temporal contrast vision sensor,” Ph.D. dissertation (ETH Zurich, 2006).

Linares-Barranco, B.

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).
[Crossref]

Longinotti, L.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Mahowald, M.

M. Mahowald, “VLSI analogs of neuronal visual processing: a synthesis of form and function,” Ph.D. dissertation (California Institute of Technology, 1992).

Mascheroni, A.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Matolin, D.

C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS,” IEEE J. Solid-State Circuits 46, 259–275 (2011).
[Crossref]

C. Posch and D. Matolin, “Sensitivity and uniformity of a 0.18 µm CMOS temporal contrast pixel array,” IEEE International Symposium of Circuits and Systems (ISCAS), Rio de Janeiro, Brazil, May2011, pp. 1572–1575.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Mierle, K.

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Moeys, D. P.

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January2019.

Morreale, B.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

Mostafalu, P.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Motsnyi, V.

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

Nicholson, A. P.

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

Niwa, A.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Oike, Y.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Orchard, G.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Posch, C.

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).
[Crossref]

C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS,” IEEE J. Solid-State Circuits 46, 259–275 (2011).
[Crossref]

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

C. Posch and D. Matolin, “Sensitivity and uniformity of a 0.18 µm CMOS temporal contrast pixel array,” IEEE International Symposium of Circuits and Systems (ISCAS), Rio de Janeiro, Brazil, May2011, pp. 1572–1575.

Reszelewski, R.

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January2019.

Reynaud, E.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Roweis, S.

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

Rutten, M.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

Scaramuzza, D.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Serrano-Gotarredona, T.

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).
[Crossref]

Taba, B.

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

Takahashi, H.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Taverni, G.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).
[Crossref]

Tsuchimoto, K.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

van Schaik, A.

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

G. Cohen, S. Afshar, and A. van Schaik, “Approaches for astrometry using event-based sensors,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2018.

Voigt, F. F.

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).
[Crossref]

Wabnitz, A.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September2017.

Wakabayashi, H.

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

Wohlgenannt, R.

C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS,” IEEE J. Solid-State Circuits 46, 259–275 (2011).
[Crossref]

Zolnowski, M.

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January2019.

Appl. Opt. (1)

Astron. J. (1)

D. Lang, D. W. Hogg, K. Mierle, M. Blanton, and S. Roweis, “Astrometry.net: blind astrometric calibration of arbitrary astronomical images,” Astron. J. 139, 1782–1800 (2010).
[Crossref]

IEEE J. Solid-State Circuits (1)

C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS,” IEEE J. Solid-State Circuits 46, 259–275 (2011).
[Crossref]

IEEE Sens. J. (1)

S. Afshar, A. P. Nicholson, A. van Schaik, and G. Cohen, “Event-based object detection and tracking for space situational awareness,” IEEE Sens. J. 20, 15117–15132 (2020).
[Crossref]

IEEE Trans. Biomed. Circuits Syst. (1)

D. P. Moeys, F. Corradi, C. Li, S. A. Bamford, L. Longinotti, F. F. Voigt, S. Berry, G. Taverni, F. Helmchen, and T. Delbrück, “A sensitive dynamic and active pixel vision sensor for color or neural imaging applications,” IEEE Trans. Biomed. Circuits Syst. 12, 123–136 (2018).

G. Taverni, D. P. Moeys, C. Li, C. Cavaco, V. Motsnyi, D. S. S. Bello, and T. Delbrück, “Front and back illuminated dynamic and active pixel vision sensors comparison,” IEEE Trans. Circuits Syst. II 65, 677–681 (2018).

C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbrück, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014).

C. Posch and D. Matolin, “Sensitivity and uniformity of a 0.18 µm CMOS temporal contrast pixel array,” IEEE International Symposium of Circuits and Systems (ISCAS), Rio de Janeiro, Brazil, May 2011, pp. 1572–1575.

S.-C. Liu, T. Delbrück, G. Indiveri, A. Whatley, and R. Douglas, eds., Event-Based Neuromorphic Systems (Wiley, 2015).

T. Delbrück, Y. Hu, and Z. He, “V2E: from video frames to realistic DVS event camera streams,” arXiv:2006.07722 (2020).

S. B. Howell, Handbook of CCD Astronomy, 2nd ed. (Cambridge University, 2006).

P. Lichtsteiner, “An AER temporal contrast vision sensor,” Ph.D. dissertation (ETH Zurich, 2006).

J. M. Carrasco, “5.3.7 Photometric relationships with other photometric systems,” Gaia Data Release 2 Documentation, release 1.2, European Space Agency, 2021, https://gea.esac.esa.int/archive/documentation/GDR2/Data_processing/chap_cu5pho/sec_cu5pho_calibr/ssec_cu5pho_PhotTransf.html#Ch5.T8.

iniVation.com, “Specifications—current models,” 2021, https://inivation.com/wp-content/uploads/2021/02/2021-02-05-DVS-Specifications.pdf.

ESA, “Gaia DR2 Space Catalog,” European Space Agency, 2021, https://www.cosmos.esa.int/gaia.

G. Cohen, S. Afshar, and A. van Schaik, “Approaches for astrometry using event-based sensors,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September 2018.

M. Żołnowski, R. Reszelewski, D. P. Moeys, T. Delbrück, and K. Kamiński, “Observational evaluation of event cameras performance in optical space surveillance,” NEO and Debris Detection Conference, Darmstadt, Germany, January 2019.

M. Mahowald, “VLSI analogs of neuronal visual processing: a synthesis of form and function,” Ph.D. dissertation (California Institute of Technology, 1992).

G. Gallego, T. Delbrück, G. Orchard, C. Bartolozzi, B. Taba, A. Censi, S. Leutenegger, A. J. Davison, J. Conradt, K. Daniilidis, and D. Scaramuzza, “Event-based vision: a survey,” IEEE Trans. Pattern Anal. Mach. Intell. (to be published).

T. Finateu, A. Niwa, D. Matolin, K. Tsuchimoto, A. Mascheroni, E. Reynaud, P. Mostafalu, F. Brady, L. Chotard, F. LeGoff, H. Takahashi, H. Wakabayashi, Y. Oike, and C. Posch, “A 1280 × 720 back-illuminated stacked temporal contrast event-based vision sensor with 4.86 µm pixels, 1.066 GEPS readout, programmable event-rate controller and compressive data-formatting pipeline,” IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, California, USA, 2020, pp. 112–114.

G. Cohen, S. Afshar, A. van Schaik, A. Wabnitz, T. Bessell, M. Rutten, and B. Morreale, “Event-based sensing for space situational awareness,” Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, Hawaii, USA, September 2017.

Data Availability

Data underlying the results presented in this paper are available in Refs. [19, 21]. Event-based camera observational data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

19. J. M. Carrasco, “5.3.7 Photometric relationships with other photometric systems,” Gaia Data Release 2 Documentation, release 1.2, European Space Agency, 2021, https://gea.esac.esa.int/archive/documentation/GDR2/Data_processing/chap_cu5pho/sec_cu5pho_calibr/ssec_cu5pho_PhotTransf.html#Ch5.T8.

21. ESA, “Gaia DR2 Space Catalog,” European Space Agency, 2021, https://www.cosmos.esa.int/gaia.

Figures (11)

Fig. 1. Histograms of the number of Source-Extractor detections per VGA pseudoframe successfully correlated to the Gaia DR2 star catalog, for bias files B3 and B4. The on-sky data were recorded using the VGA EBC mounted to an 85 mm f/1.4 lens.
Fig. 2. Composite integrand of Eq. (12), which defines the spectrally resolved photon flux density for a zero-magnitude star in the VGA EBC limiting magnitude model.
Fig. 3. (a) Contrast threshold and (b) minimum photocurrent constraints of Eqs. (7) and (8), plotted separately. Regions where a constraint is violated are cross-hatched.
Fig. 4. Example on-sky data captured in fixed-pointing observing mode with the VGA EBC mounted to an 85 mm f/1.4 lens, presented as (a) a 3D point plot, where red and green points correspond to negative-polarity and positive-polarity events, respectively, and (b) a 2D event-count histogram (polarity agnostic).
Fig. 5. eCDFs of matched star magnitudes for the HVGA and VGA EBCs, based on one full night of data collection with each camera using an 85 mm f/1.4 lens.
Fig. 6. Limiting magnitude model for the VGA EBC with 85 mm f/1.4 lens (cross-hatching defines the undetectable region). The empirical dark-sky limiting magnitude (9.8 m_V), calculated from the 90% threshold of the eCDF of matched stars (averaged over 30 nights), is plotted as a green star.
Fig. 7. Empirical limiting magnitude (sensitivity) versus scan rate for detecting stars against a dark sky using the VGA EBC with 200 mm f/2.0 lens. Sidereal rate is indicated by a vertical dashed red line.
Fig. 8. Measured photometric response for the HVGA and VGA EBCs using an 85 mm f/1.4 lens.
Fig. 9. Histogram of mean motions of all space-object TLEs correlated to at least one VGA detection, given a radial error tolerance of 0.06 deg (3.6 arcmin).
Fig. 10. Histograms of mean motions of detected and correlated space objects (TLEs) with mean motions less than or equal to 1.4 revs/day, plotted separately for payloads versus other objects (rocket bodies or debris).
Fig. 11. (a) Example difference of two pseudoframes registered to align star backgrounds. Note the detected point-like periodic signature at bottom center, which correlates to object #22245 in GEO disposal orbit with a mean motion of 0.998 revs/day. The blinking duration is about 0.3 s, with a period of approximately 18.1 s. (b) Close-up of the subregion containing object #22245.

Tables (3)

Table 1. Galactic Coordinates at which Gaia DR2 Star Catalog Was Interrogated to Estimate Local Star Densities

Table 2. Summary of Important Camera System and Observing Parameters

Table 3. Summary of VGA Bias Files Considered

Equations (13)


$$TC(t_1) = \ln[I_{pho}(t_1)] - \ln[I_{pho}(t_0)] = \ln\!\left[\frac{I_{pho}(t_1)}{I_{pho}(t_0)}\right],$$

$$|TC(t_1)| = \left|\ln\!\left[\frac{I_{pho}(t_1)}{I_{pho}(t_0)}\right]\right| \ge \theta,$$

$$TC(t_1) \ge \theta_{ON} \;\Longleftrightarrow\; \frac{I_{pho}(t_1)}{I_{pho}(t_0)} \ge e^{\theta_{ON}},$$

$$TC(t_1) \approx \frac{I_{pho}(t_1) - I_{pho}(t_0)}{I_{pho}(t_0)} \ge \theta_{ON},$$

$$e_i = \{(x, y)_i,\, p_i,\, t_i\}_{i=1}^{N},$$
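The event-tuple representation of Eq. (5) and the polarity-agnostic 2D event-count histogram used for pseudoframes (as in Fig. 4(b)) can be sketched as follows; this is an illustrative reconstruction, not the authors' code, and the sensor dimensions and synthetic events below are assumptions:

```python
import numpy as np

# Each event is a tuple (x, y, p, t): pixel address, polarity, timestamp.
# VGA-format focal plane array dimensions (assumed for this sketch).
W, H = 640, 480

# Synthetic toy event stream standing in for real EBC output.
events = np.array(
    [(10, 20, 1, 0.001),
     (10, 20, -1, 0.002),
     (11, 20, 1, 0.003),
     (300, 240, -1, 0.004)],
    dtype=[("x", np.int32), ("y", np.int32), ("p", np.int8), ("t", np.float64)],
)

# Polarity-agnostic event count per pixel over the full record
# (a "pseudoframe"); np.add.at accumulates repeated pixel hits.
hist = np.zeros((H, W), dtype=np.int64)
np.add.at(hist, (events["y"], events["x"]), 1)
```

Collapsing over polarity and time this way discards the per-event timestamps, which is exactly the trade made when converting an event stream into a framing-sensor-like image for conventional detection pipelines.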
$$TC_{on\text{-}sky} = \frac{(I_{pho}^{obj} + I_{pho}^{sky} + I_{drk}) - (I_{pho}^{sky} + I_{drk})}{I_{pho}^{sky} + I_{drk}} = \frac{I_{pho}^{obj}}{I_{pho}^{sky} + I_{drk}},$$

$$\frac{I_{pho}^{obj}}{I_{pho}^{sky} + I_{drk}} \ge 0.12,$$

$$(I_{pho}^{obj} + I_{pho}^{sky}) \ge 13.3\ \mathrm{k\,p.e./s},$$

$$E(m_V) = E_0 \times 10^{-0.4\, m_V},$$

$$E_\lambda(\lambda, m_V, T_{eff}) = C_1\, \lambda^{-4}\, 10^{-0.4\, m_V}\, \frac{\exp\!\left[C_2/(0.55\, T_{eff})\right] - 1}{\exp\!\left[C_2/(10^6\, \lambda\, T_{eff})\right] - 1},$$

$$I_{pho}^{obj}(m_V;\, A,\, \eta(\lambda),\, \Lambda_0) = 10^{-0.4\, m_V}\, \pi \left(\frac{D}{2}\right)^2 \Lambda_0\, \max_\lambda[\eta(\lambda)],$$

$$\Lambda_0 = \int_{0.3\times10^{-6}}^{1.0\times10^{-6}} \eta(\lambda)\, \tau(\lambda)\, E_\lambda(\lambda;\, m_V = 0,\, T_{eff} = 5920\ \mathrm{K})\, d\lambda,$$

$$I_{pho}^{sky}(M_{sky};\, A,\, \phi,\, \Lambda_0) = 10^{-0.4\, M_{sky}}\, \pi \left(\frac{D}{2}\right)^2 \phi^2\, \Lambda_0.$$
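The two detectability constraints, Eqs. (7) and (8), combined with the photocurrent scalings of Eqs. (12) and (13), can be evaluated numerically as in the minimal sketch below. All parameter values (LAMBDA0, ETA_MAX, I_DRK, aperture, pixel IFOV) are placeholders for illustration, not the paper's calibrated numbers:

```python
import math

LAMBDA0 = 1.0e14   # hypothetical zero-magnitude scale, Eq. (12) [p.e./(s*m^2)]
ETA_MAX = 0.6      # hypothetical peak quantum efficiency max[eta(lambda)]
I_DRK = 1.0e3      # hypothetical dark-current rate [p.e./s]
THETA = 0.12       # contrast-threshold constraint, Eq. (7)
I_MIN = 13.3e3     # minimum photocurrent constraint, Eq. (8) [p.e./s]

def photocurrents(m_v, m_sky, d_m, ifov_arcsec):
    """Per-pixel object and sky photocurrents, Eqs. (12) and (13)."""
    area = math.pi * (d_m / 2.0) ** 2
    i_obj = 10.0 ** (-0.4 * m_v) * area * LAMBDA0 * ETA_MAX
    i_sky = 10.0 ** (-0.4 * m_sky) * area * ifov_arcsec ** 2 * LAMBDA0
    return i_obj, i_sky

def detectable(m_v, m_sky, d_m, ifov_arcsec):
    """True when both the contrast and minimum-photocurrent constraints hold."""
    i_obj, i_sky = photocurrents(m_v, m_sky, d_m, ifov_arcsec)
    contrast_ok = i_obj / (i_sky + I_DRK) >= THETA
    current_ok = (i_obj + i_sky) >= I_MIN
    return contrast_ok and current_ok

def limiting_magnitude(m_sky, d_m, ifov_arcsec, m_lo=0.0, m_hi=20.0):
    """Bisect for the faintest detectable m_V (assumes m_lo is detectable)."""
    for _ in range(60):
        mid = 0.5 * (m_lo + m_hi)
        if detectable(mid, m_sky, d_m, ifov_arcsec):
            m_lo = mid
        else:
            m_hi = mid
    return m_lo
```

With these placeholder values the contrast-threshold constraint, not the minimum-photocurrent constraint, sets the limit at the assumed sky background, which mirrors how the two constraints partition the parameter space in Fig. 3.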
