Abstract
Auroras are space weather light phenomena caused by interactions between the solar wind, Earth’s magnetic field, and Earth’s atmosphere. Scientific aurora images are often captured with camera exposure times of up to 1–2 seconds to gather sufficient light. However, long exposure times also intensify other light emissions, e.g. urban light and moonlight; thus, dark night conditions are preferred. Studies have carried out high-speed imaging at up to 160 Hz, but higher rates have remained elusive. Here, we propose the emerging Dynamic Vision Sensor (DVS) technology as an alternative or complementary imaging approach for auroras, offering high dynamic range (110–120 dB) and sampling rates (5 kHz–1 MHz). We present the first observations of auroras at 5 kHz using DVS, highlighting the technology’s potential for this application. Approaches to reconstructing the brightness intensity image are introduced, giving the photon flux for each pixel and for the whole sensor, mimicking a photometer. We show that DVS can observe auroras at high temporal resolution in challenging urban-light and moonlight conditions, enabling a paradigm shift within the scientific field. Finally, we discuss the use of DVS more broadly within geoscience.
Introduction
Auroras are a space weather phenomenon caused by the interaction between the solar wind, Earth’s magnetic field and the ionosphere. These interactions can increase the amount of charged particles in the magnetosphere, enabling them to enter the ionosphere and, depending on the energy spectrum, cause diffuse or discrete auroras1.
Diffuse auroras are somewhat faint, whereas discrete auroras are clearer and vary greatly in intensity1,2. During certain conditions, the embedded magnetic field in the solar wind and Earth’s magnetic field can reconnect, which leads to dynamic processes in the magnetosphere. These processes can result in a high particle precipitation rate in the ionosphere, creating strong discrete auroras3. These processes have implications for society, such as disturbances in the ionosphere known as scintillations that, in severe cases, can cause GNSS receivers to lose track of satellites, resulting in reduced positioning reliability in the polar regions4. Auroras are visible to the human eye on a clear, dark sky, but modern cameras are more sensitive and can capture more details5.
Imaging of auroras uses Complementary Metal Oxide Semiconductor (CMOS) or Charge Coupled Device (CCD) cameras equipped with all-sky or narrow-field lenses pointed towards zenith. CMOS/CCD sensors collect photons in a pixelated area to acquire an image. They require long exposure times, typically 1 second for the most common equipment5, which also intensifies background brightness, constraining observations during moonlight and in urban areas. The requirement for long exposure times has limited observations of 2D dynamic changes to 160 Hz6.
Instead, we propose using the emerging imaging technology called Dynamic Vision Sensors (DVS) to circumvent these instrumental challenges. DVS utilises alternative processing of the electrons measured by photoreceptors, enabling asynchronous and independent pixel measurements and a paradigm shift in how visual information is acquired. DVS offer high temporal resolution and dynamic range, low power consumption and data-sparse output7. Here, we present and discuss the first DVS observations of auroras, demonstrating examples of how the event information can be used for this application. Further, we suggest its use within geoscience more broadly.
Dynamic vision sensors
A new paradigm for observing auroras
DVS, or an event camera, is a biology-inspired silicon retina whose pixels each comprise a photoreceptor, which converts photons to a current, I(t), and an electrical circuit that compares it with the previous current, \(I(t-1)\). If I(t) exceeds \(I(t-1)\) by a certain percentage threshold, \(\theta\), in either the negative or positive direction, an event is reported at the pixel. Depending on the direction of change, the event is categorised as an on or off event for positive and negative change, respectively. This relationship is given by8:

\[ I(t) \ge (1 + \theta)\, I(t-1) \;\rightarrow\; \text{on event}, \qquad I(t) \le (1 - \theta)\, I(t-1) \;\rightarrow\; \text{off event} \]
As triggering events depend solely on the photons reaching the photoreceptor of the pixel, the pixel can independently adjust to lighting conditions. No events are registered when no light changes, providing a variable data rate and low power consumption. In contrast, CMOS/CCD cameras measure the brightness level for all pixels at constant intervals, making the measurement dependent on a constant frame rate.
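The per-pixel triggering rule described above can be sketched in a few lines of Python. This is an illustrative model of the behaviour, not the sensor's analogue circuit: the reference level steps by the contrast threshold each time an event fires, and the threshold \(\theta = 0.05\) matches the DVXplorer's high-sensitivity setting quoted later in the paper.

```python
import numpy as np

def simulate_events(intensity, theta=0.05):
    """Simulate DVS event generation for one pixel's intensity trace.

    intensity : 1-D array of photocurrents I(t); theta is the contrast
    threshold (5% for the DVXplorer in high-sensitivity mode).
    Returns a list of (t, polarity) with polarity +1 (on) / -1 (off).
    """
    events = []
    ref = intensity[0]                  # reference level of the last event
    for t, i in enumerate(intensity[1:], start=1):
        while i >= ref * (1 + theta):   # brightness rose past threshold
            ref *= (1 + theta)
            events.append((t, +1))
        while i <= ref * (1 - theta):   # brightness fell past threshold
            ref *= (1 - theta)
            events.append((t, -1))
    return events
```

A 20% brightness step crosses the 5% threshold several times, so the pixel emits a burst of same-polarity events rather than a single one, which is the "threshold leaping" behaviour discussed in Subsection 3.2.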
The dynamic range is the ratio between the highest and lowest illumination. A typical CCD/CMOS camera has a dynamic range of 60-80 dB. For DVS, the brightness level defines the range, such that an 80% contrast change is detected in at least 50% of the pixels9. This definition provides DVS with a typical dynamic range of 110-120 dB7 with conditions ranging from 100k lux (the sun) to 0.1 lux (moonlit sky)9,10.
DVS operates without a shutter or exposure time, and events are transmitted as they are detected. When the sensor is not saturated, the timestamp precision is 200 \(\mu s\), and the readout latency is less than 1 ms10. In periods of sensor saturation, the readout latency error can be on the order of 1 ms, whereas the 557.7 nm emission half-life is around 800 ms11. The electric circuit detecting events is analogue and can timestamp events rapidly. Event capture rates are reported at 5 kHz to 1 MHz7,10. High-speed cameras, while expensive, can achieve similar rates. However, as these rates increase, the spatial resolution decreases12. These cameras also possess the same absolute brightness constraints as typical CMOS/CCD cameras. Furthermore, their observation time is limited due to the rapid filling of the internal memory buffer.
Current DVS are constrained by sensor size and the number of pixels, from \(346 \times 260\) to \(1280 \times 960\) pixels7,10, and the number of simultaneously acquired events that can be registered, termed maximum throughput, in Million Events Per Second (MEPS), ranging from 12 to 450 MEPS. Finally, event noise is present and exaggerated in low or high illumination, motivating postprocessing.
The first observation of auroras with a DVS
An 826-second DVS recording of auroras was captured on March 3rd 2023, 19:17 UTC from the roof of the Auroral Observatory in the centre of Tromsø, Norway, with an unobstructed sky view in temperatures of around \(-5\) to \(-8\,^{\circ}\)C. The iniVation DVXplorer camera10 and the associated Dynamic Vision Viewer (DV)13 software were used, with the high sensitivity setting enabled, corresponding to a 5% change in brightness14.
The DVXplorer specifications are a temporal resolution of 65–200 \(\mu s\) (5 kHz–15 kHz), maximum throughput of 165 MEPS, a dynamic range of 110 dB (0.3 lux–100k lux) and \(640 \times 480\) pixels. The field of view was \(37.44^\circ \times 28.52^\circ\) with a diagonal sensor length of 7.2 mm and an 8.5 mm focal length lens, pointed at \(266^\circ\) azimuth and \(25^\circ\) inclination. The recording conditions are shown in Fig. 1a, captured with a 1-second exposure time, highlighting the night sky’s brightness from the surrounding city and the moon. In Fig. 1b, a green aurora is present, showcasing the dominant type of structures present during the recording. According to15, the background brightness was 16.82 magnitude arcsecond−2 (3.52 \(\times 10^{-5}\) \(Wm^{-2}sr^{-1}\)) at 19:48 UTC, measured by a TESS-W photometer at the observatory16. The geomagnetic index, Kp, was \(\sim 4\) at observation time17, and the observation took place during the expansion phase of a magnetic substorm. The Geophysical Observatory in Tromsø recorded a horizontal deflection of the magnetic field that reached 400 nT at Tromsø, while the strongest activity was located between Svalbard and mainland Norway18. Due to the abundant moonlight, most other nearby scientific cameras were turned off. A few cameras were running at Ramfjordmoen and Skibotn, but those areas were covered by clouds at the measurement time and the images were unusable, making a direct comparison with traditional cameras infeasible.
Supplementary visual materials are available19, including the event data in the native “aedat4” event file format20 and numerous integrated, smoothed, and decay-accumulated reconstructed brightness frames (Subsection 3.1), sampled at 60 Hz and 600 Hz and played at 60 Frames Per Second (FPS). In addition, integrated and decayed event videos (Subsection 3.2) are available at 60 Hz, 600 Hz and 5000 Hz.
(a) The DVS setup pointed West and \(25^\circ\) inclination angle, highlighting the conditions of the observations at the Auroral Observatory in Tromsø, Norway. Image taken with 1-second exposure time. (b) Green aurora in the observation direction taken 19:24:54 UTC with 2-second exposure time. The red square illustrates the approximate DVS viewing area at the time corresponding to Fig. 3b. Both images were taken by N. Gulbrandsen with a Canon EOS 6D at ISO 800 using a Samyang 14 mm f/2.8 at aperture f/4.
Reconstructed brightness observations
(a) Integrated reconstructed photon flux reaching the sensor (green) in equivalent kilo-Rayleigh [kR] for 557.7 nm over time [s], sampled at 5 kHz (950,000 points). Magenta and blue lines indicate the periods shown in Fig. 3. (b) Zoomed-in section corresponding to the yellow square.
As introduced in Section 2, the event camera reports events when the brightness of the observed scene changes at a given pixel. Knowing the threshold at which those event triggers happen allows us to recalculate the brightness from the events using a so-called event accumulation approach (Subsection 5.2). Several noise sources are present; the most prominent are front-end noise, false events that can appear in any pixel, and “hot” pixels that produce excessive same-polarity events. Noise filtering (Subsection 5.1) removed 90.29% of total events, leaving 67% more on than off events. Hot pixels are replaced by an average of their spatial neighbours at each frame. In addition, a bilateral smoothing filter was applied, which convolves the 2D image with a Gaussian kernel, reducing noise while preserving edges. A minor brightness decay is added at each accumulated frame, with a 90% event value reduction over 15 seconds, to stabilise the reconstruction.
The video in21 is displayed in real time at 60 FPS and integrated at 60 Hz, showing an auroral arc aligned almost vertically in the centre of the frame. The arc oscillates macroscopically from left to right at various intensities. We also observe micro-scale changes that are common in auroras, which appear as wave and blanket movements along the transverse direction of the arc. A slowed version of a portion of the video is available at22, sampled at 600 Hz and displayed at 60 FPS, i.e. at one-tenth of real speed.
To further demonstrate the DVS’s capabilities for observing auroras, we simulate a photometer by integrating the photon flux of each pixel across the sensor for all reconstructed frames (Subsection 5.2). In the reconstruction, we assume that the brightness changes captured in the DVS’s field of view were dominated by the 557.7 nm aurora emission during the observation period. The assumption is justified by the fact that other auroral oxygen emission lines may appear quickly but fade too slowly to be detected by the DVS and are thereby filtered out. An example of such an emission is the 630 nm line, with a lifetime of hundreds of seconds23 and lower intensities. The measured equivalent 557.7 nm photon flux is given in Rayleigh and highlighted in green in Fig. 2a. The obtained signal indicates an aurora photon flux in the range of 2–11 kR, consistent with category 1–2 green aurora in24. Note that the photometer’s and DVS’s exact spectral responses are unknown. They are therefore assumed to be identical for the background light pollution, so these are approximate values. A thorough calibration of the DVS could improve the accuracy8. Fig. 2b exhibits a subsample of the high-frequency signal, illustrating the high sampling rate. In addition, two reconstructed brightness image series in 2-second time steps, covering the two time periods highlighted in magenta and blue in Fig. 2, are included in Fig. 3. Fig. 3a shows a croissant-shaped aurora arc intensifying and fading. Fig. 3b illustrates an intense aurora arc rapidly moving from left to right before fading.
Event observations
Another approach to interpreting the information from the event camera is to illustrate the brightness changes by counting the number of events per pixel within a given period and exponentially decaying each event. Another 12-second video of the observations is displayed in25 at 60 FPS, accumulating events in frames at 60 Hz and applying a decay factor of 0.975 at each frame, showing on and off events as green and blue, respectively (Subsection 5.3). The video shows a narrow aurora appearing in the left portion of the image and racing towards the centre before moving left again, after which an off region appears. Afterwards, a wave-like dynamic signal appears with alternating and braiding on and off event regions. A slowed-down version sampled at 600 Hz and displayed at 60 FPS in26 shows a shorter period at one-tenth of real speed. The aurora appears brighter in the upper and lower portions of the image. These upper and lower parts pulsate while the centre has fewer events. Finally, a video at the DVS’s rated sampling rate, 5 kHz, is shown at 60 FPS in27. Here, individual areas of molecular deexcitation are visible, illustrating the rate of event capture. Two histograms in Fig. 4 highlight the number of events for each frame in the videos26 and27 using the same colour scheme. Fig. 4a, sampled at 600 Hz, emphasises the aurora entering the scene, with the number of events several times higher than in the preceding frames. The period with the highest event count appears to have a high rate of fluctuations from high counts to low. This is further inspected in Fig. 4b, sampled at 5 kHz, exaggerating the event fluctuation, with several sampling periods of zero registered events and very few off events. In video25, the missing off events are also apparent as a visible “hole” in the off-event region at the frame centre. The on-event oscillations are also apparent in the 5 kHz event video27, as the arc entering from the left appears to be blinking.
During these rapid brightness changes, the quality of the event signal appears to deteriorate, illustrated by the unexpected fluctuations from many events to none. The sensor likely reached saturation, where the event count exceeded the maximum throughput capacity, causing events to be timestamped in bursts and resulting in a loss of temporal granularity. However, this may not fully explain the occurrence, as the maximum number of total unfiltered events registered within an integration time at the upper specified limit of \(65\,\mu s\) is 493, corresponding to \(\sim 7.6\) MEPS, while the specifications in10 suggest that the upper limit should be 10,750 events per \(65\,\mu s\). Furthermore, if the supposedly registered events were merely shifted and assigned identical timestamps, integrating at lower frequencies should remove the fluctuations; they should therefore not be visible in Fig. 4a, leaving the notion that some events went unregistered. The DVS has a temporary memory buffer that can hold some events, but events could be lost in overflow, which could also explain the lack of off events. If a random percentage of events were discarded, it could cause the reduction of off events, making their absence more noticeable. On events may have registration priority over off events.
The 165 MEPS reported in10 may either not reflect the capability of the DVXplorer or could result from operating the camera at −5 to −8 degrees Celsius rather than in a room-temperature laboratory environment. Possible temperature dependence in above-zero environments has been documented in28,29. Of these,28 is most relevant, using another iniVation DVS, the DAVIS240C, albeit still a different sensor. The authors document the dark current leakage - the passive current always present in an imaging system - for temperatures from 0 to 83 degrees Celsius, showing a decrease in dark current, and a subsequent fall in leakage events, as temperatures decrease towards 0. Extrapolating this tendency, the DVS should operate more optimally in our sub-zero environment. However, the variation relative to the mean dark current has been shown to increase30 at lower temperatures, which could lead to an increased number of front-end noise events but not necessarily a degradation of the MEPS. The true number of events could have been captured if fewer noise events had been registered, considering that the filtering process removed 90.29% of the recorded events. Another possibility is that the light changed too quickly for the DVS to capture every brightness change, effectively leaping through several thresholds. The issue may therefore be a combination of timestamping several events simultaneously, threshold leaping and memory overflow.
Discussion
The attached videos highlight the DVS’s capability to acquire dynamic observations of auroras in a challenging scenario that traditional cameras otherwise struggle to capture due to the low brightness of the phenomena relative to the background. DVS have accompanying challenges, such as technological immaturity and small sensor size, but also in interpreting events, reconstructing the brightness and the maximum throughput ceiling.
Reconstructing the integrated brightness from events (Subsection 3.1) requires a delicate balance of filtering choices, particularly for hot pixel noise events, which otherwise dominate the integrated reconstructed signal due to the exponential term in Eq. 2. In addition, assumptions regarding spectral responses and event capturing were made to demonstrate the concept and should be explored further for scientific auroral observations. The background activity filter (Subsection 5.1) removes low brightness noise and disproportionately affects off events, as illustrated by the increased difference between on and off events. Off events are less abundant and thus more spatiotemporally isolated, and therefore more likely to be removed by the background activity filter, necessitating a careful choice of the temporal spacing threshold. A minor decay is applied to remove the remaining front-end event noise, with the downside being the loss of the static signal over time. Finally, the bilateral smoothing filter provides a more realistic representation of the reconstructed brightness.
The aedat4 file is \(\sim 5\) GB in size. However, given the subsequent 90% reduction following the removal of hot pixels and front-end noise, the data volume becomes attractive and manageable for a high-speed 2D imaging sensor. Another obstacle is the data loss during rapidly changing brightness (Subsection 3.2), which needs further investigation but may be mitigated by selecting a DVS with a higher maximum throughput, such as the DVXplorer Micro10 with 450 MEPS, or by utilising an alternative DVS, such as the DAVIS346, which is rated as functional down to 0.1 lux compared to 0.3 lux for the DVXplorer10. This could, in theory, produce less front-end noise, but the DAVIS346 has a significantly lower maximum throughput of 12 MEPS.
The sensitivity of DVS is promising, enabling observations of 2–11 kR green auroras in challenging conditions, such as urban light and moonlight, while retaining high temporal resolution. However, using a photometer with known system constants, or a camera, to measure the background brightness could enable a more accurate reconstruction of the brightness. The technology could provide further insights into the rapid modulation of flickering and pulsating auroras. Furthermore, the high dynamic range could capture dynamic spatiotemporal changes of discrete auroras during a bright auroral breakup, which can saturate traditional cameras6,31,32. Observations of long-lifetime emissions, such as red auroras1, were also attempted but were unsuccessful due to their slow dynamics. The next step is measuring with traditional imaging techniques in conjunction with DVS. As the internal clock in the DVS is set at the factory, it may drift over time; a cross-comparison of measurements with another instrument therefore requires time synchronisation, which could be achieved by, e.g., connecting a GNSS signal.
However, high temporal precision is not paramount here, as we do not compare against other measurements. Future studies that require accurate intensities may benefit from utilising interference filters and an absolute calibration of the DVS, which was not done for the present paper.
Acquiring the geometrical positioning of the sensor’s field of view using stars could also be possible. Still, a technique has to be developed involving either slightly shaking the camera or applying a shutter in front of the DVS to trigger events from mostly stationary stars. It is also possible that DVS could observe auroras in daylight. However, optical filters may be required to remove wavelengths that auroras do not emit to increase the aurora signal sufficiently compared to the background.
DVS appears to be an interesting technology for future studies of auroras with observations in harsh brightness conditions and high temporal resolution. However, significant data processing is required to interpret the measurements.
Geoscience applications for DVS
The DVS technology is still emerging and undergoing maturation. While the development is focused on consumer products such as smartphones33 and unmanned and self-driving vehicles34, many geoscience and space applications could benefit from the technology. Applications taking advantage of observing rapid dynamics or for objects and phenomena constrained by traditional cameras’ dynamic range should be investigated. Successful utilisation of DVS is exemplified by the THOR-DAVIS project in collaboration with the Danish Astronaut Andreas Mogensen onboard the International Space Station (ISS) for monitoring lightning and high atmospheric discharges35. Here, the high temporal resolution is utilised and could provide new insights into atmospheric dynamics not yet fully understood36,37,38,39. Earth observation applications are explored with the Falcon Neuro instrument installed onboard the ISS40,41. Ground-based applications involve tracking objects in the sky for space situational awareness, such as satellites42,43,44, stars45 or comets. In42, the authors have shown the capability of DVS to track satellites in daylight, leading to an increase in monitoring duration otherwise limited by light conditions compared with traditional cameras. In summary, there are many exciting applications for DVS within geoscience, with much more to explore.
Methods
DVS noise filtering
Two types of noise are apparent: hot pixels and front-end noise. Hot pixels generate excessive amounts of same-polarity events, several magnitudes above the regular scene event rate. The front-end noise has several contributions, including dark current noise, which is particularly apparent in low brightness conditions and in high sensitivity measurements, where circuit leakage current can trigger events7,9,28,46. Removing these noise elements requires postprocessing, as described in the following two subsections.
Hot pixel filtering
The distribution of total events per pixel is approximately Poisson distributed, as illustrated in Fig. 5. The figure shows the distribution of the total events for each pixel summed throughout the full footage of 750 seconds in 500 bins. Pixels with event counts exceeding the standard deviation of 30,371 total events are placed in the same histogram bin. The distribution is visibly skewed, with a long tail of pixels with excess event counts. A pixel is labelled hot if it has more events than a specific threshold. A quantile threshold of 97% is selected, corresponding to a total event count of 2,381 in a single pixel, highlighted as a red vertical line, excluding 9,214 pixels. For comparison, the mean number of total events is 2,346, the median is 576, and the maximum number of events for a pixel is \(\sim 160{,}000\). The hot pixels are removed afterwards by masking out the identified pixels. The resulting data reduction is \(\sim 71.88\%\).
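The quantile-based hot-pixel labelling can be sketched as follows. The function name and array layout are illustrative, not taken from the processing pipeline used in the paper; the sensor shape and 0.97 quantile match the text.

```python
import numpy as np

def hot_pixel_mask(events_xy, shape=(480, 640), quantile=0.97):
    """Label pixels as hot if their total event count exceeds the
    given quantile of the per-pixel count distribution.

    events_xy : (N, 2) array of (row, col) pixel coordinates, one row
    per event. Returns a boolean mask (True = hot) and the threshold.
    """
    counts = np.zeros(shape, dtype=np.int64)
    # Unbuffered accumulation: repeated coordinates add up correctly.
    np.add.at(counts, (events_xy[:, 0], events_xy[:, 1]), 1)
    threshold = np.quantile(counts, quantile)
    return counts > threshold, threshold
```

Masking the flagged pixels out of the event stream then yields the \(\sim 71.88\%\) data reduction reported above.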
Histogram of the total number of events for each pixel during the entire footage duration in 500 bins. The red line indicates a 0.97 quantile threshold of 2,975 events used to exclude pixels with high activity. Pixels with events higher than the standard deviation are stacked in the same bin at 30,371 events.
Front-end noise filtering
Several filters have been designed to cope with front-end noise, such as the ynoise47 and knoise48 filters. iniVation has developed a BackgroundActivityNoiseFilter, available through their dv-processing Python package49, which works based on temporal and spatial event support, similar to47,48. This BackgroundActivityNoiseFilter is used here. In short, an event is discarded if no other event is present in a nearby pixel within a temporal spacing. The temporal spacing threshold is set to 50 ms, which allows for a good signal-to-noise ratio. This reduces the number of events by an additional 18.41%, leading to a total event reduction of 90.29%.
Filtering outcome
Fig. 6a illustrates the number of on, off and total events, in green, blue and grey, respectively, with an integration time of 1 second across the footage length. The discrepancy between the number of on and off events is 51%. The event-reducing effect of the filtering is visualised in Fig. 6b, showing the equivalent number of on, off and total events. Here, the discrepancy between on and off events is 67%, indicating a growing percentage gap between on and off events. Fig. 6b illustrates the effectiveness of the filters, increasing the signal-to-noise ratio by making the periods of consecutive high event rates and peaks clearer.
Brightness reconstruction
Image reconstruction
The irradiance at a pixel can be calculated from the registered events following the equation:

\[ E_{e,t}(x, y) = E_{e,0}(x, y) \times \exp \left[ \text{CS}_{\text{on}, t}(x, y) \log \left(1 + \text{on}_{\text{thold}}\right) + \text{CS}_{\text{off}, t}(x, y) \log \left(1 - \text{off}_{\text{thold}}\right) \right] \quad (2) \]
where \(E_{e,t}(x, y)\) is the (x, y) pixel irradiance at a given time t, \(E_{e,0}(x, y)\) is the initial (x, y) pixel irradiance corresponding to the background brightness, the exponential term is the amplification of light measured by the pixel, \(\text {CS}_{\text {on}/\text {off}, t}(x, y)\) are the respective Cumulated Sum of on and off events for pixel (x, y) at a given time t and \(\text {on}/\text {off}_{\text {thold}}\) is the on and off event threshold of the sensor, which for the DVXplorer, with high sensitivity enabled, is 5%. It is assumed that \(E_{e,0}(x, y)\) is identical for all (x, y) pixels, simplifying the term to \(E_{e,0}\).
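Eq. 2 can be evaluated per pixel as a short sketch, assuming the exponential accumulation form described above and a uniform background \(E_{e,0}\); with \(E_{e,0}=1\) the result is the relative amplification of the background.

```python
import numpy as np

def reconstruct_irradiance(cs_on, cs_off, E0=1.0,
                           on_thold=0.05, off_thold=0.05):
    """Per-pixel irradiance from cumulated on/off event sums (Eq. 2).

    cs_on, cs_off : arrays of cumulated on/off event counts per pixel.
    Each on event amplifies brightness by (1 + on_thold); each off
    event attenuates it by (1 - off_thold). E0 is the background.
    """
    log_gain = (cs_on * np.log(1 + on_thold)
                + cs_off * np.log(1 - off_thold))
    return E0 * np.exp(log_gain)
```

With the 5% thresholds of the high-sensitivity DVXplorer setting, a single on event amplifies the pixel by a factor of 1.05, and a single off event attenuates it to 0.95 of the background.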
A decay factor is added when calculating the cumulated on and off sums to simulate the natural decay towards \(E_{e,0}\) using the formula:

\[ \text{CS}_{\text{on}/\text{off}, t}(x, y) = \text{decay} \times \text{CS}_{\text{on}/\text{off}, t-1}(x, y) + \text{frame}_{\text{on}/\text{off}, t}(x, y) \quad (3) \]
where \(\text {CS}_{\text {on}/\text {off}, t}(x, y)\) is the cumulated on or off event sum for pixel (x, y) at a given time t, decay is the scalar value by which the previous summed events at time \([t - 1]\) are decayed, and \(\text {frame}_{\text {on}/\text {off}, t}(x, y)\) is the number of new on or off events for pixel (x, y) at a given time t. The decay factor is selected such that the impact of a single event decays to 10% after roughly 15 seconds, giving factors of 0.997 and 0.9997 for integration rates of 60 Hz and 600 Hz, respectively. The reconstructed \(E_{e,\text {frame}, t}\) is displayed in greyscale, as the reconstructed video does not contain colour information, in the range \([E_{e,0}, E_{e,0} \times \exp (8 \times \log (1 + \text {on}_{\text {thold}}))]\), where the upper bound corresponds to the irradiance from eight consecutive on events, a compromise to illustrate both low and high relative irradiance. In addition, the removed hot pixels are replaced by the reconstructed irradiance average of the nearby neighbours in a \(5 \times 5\) window, excluding the hot pixel. Furthermore, a bilateral filter, a non-linear, edge-preserving and noise-reducing smoothing filter, is applied at each timestep during the reconstruction. The filter replaces individual pixel brightnesses with a Gaussian-weighted combination of nearby pixel brightnesses that depends on the Euclidean distance and brightness differences. The filter is implemented using the Python package ”kornia” with the bilateral_blur function, using a kernel size of \((7 \times 7)\), sigma_color of 5 and sigma_space of \((100 \times 100)\).
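The decayed accumulation of Eq. 3 can be sketched as a simple recurrence over per-frame event counts; the function name is illustrative.

```python
import numpy as np

def decayed_cumsum(frames, decay=0.997):
    """Cumulate per-frame event counts with exponential decay (Eq. 3):
    CS_t = decay * CS_{t-1} + frame_t.

    frames : (T, H, W) array of on (or off) event counts per frame.
    Returns the (H, W) cumulated sum after the last frame.
    """
    cs = np.zeros(frames.shape[1:])
    for frame in frames:
        cs = decay * cs + frame
    return cs
```

At 60 Hz, \(0.997^{60 \times 15} \approx 0.07\), so a single event's contribution falls below 10% within roughly 15 seconds, consistent with the stated choice of decay factors.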
Signal reconstruction
Simulating a photometer necessitates reconstructing the total irradiance generated by the incoming photon flux on the sensor. This can be achieved by integrating over every pixel and dividing by the \(m^{2}\) area of the sensor, A:

\[ E_{e,\text{total}, t} = \frac{1}{A} \times \int _{x, y} E_{e,t}(x, y)\, dx\, dy \]
where \(E_{e,t}(x, y)\) is the individual irradiance for pixel (x, y) at a given time t, calculated with Eq. 2 and decayed using Eq. 3. Removing hot pixels as described in Subsection 5.1 is paramount, since these would otherwise dominate the reconstructed signal due to the nature of the exponential term in Eq. 2. Practically, the computation of \(E_{e,\text{total}, t}\) in Fig. 2 required the iterative computation of a matrix of size \(950{,}000 \times 640 \times 480\), with each of the 950,000 steps computed using a Graphical Processing Unit (GPU).
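The sensor integral above can be sketched as follows; for a uniform pixel grid it reduces to the mean per-pixel irradiance, since dividing the pixel-area-weighted sum by the total area A cancels the area terms. The function and parameter names are illustrative.

```python
import numpy as np

def total_irradiance(E_pixels, pixel_area):
    """Simulated photometer reading: integrate the per-pixel
    irradiance over the sensor and divide by the sensor area A.

    E_pixels : (H, W) irradiance map; pixel_area : area of one pixel
    in m^2, so A = H * W * pixel_area. Equivalent to E_pixels.mean()
    for equally sized pixels.
    """
    A = E_pixels.size * pixel_area
    return E_pixels.sum() * pixel_area / A
```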
It can be noted that when no events occur, \(\text{CS}_{\text{on}, t}(x, y) = \text{CS}_{\text{off}, t}(x, y) = 0\), so the following relation in Eq. 2 will occur:

\[ E_{e,t}(x, y) = E_{e,0}(x, y) \times \exp (0) = E_{e,0}(x, y) \]
To get the aurora contribution in terms of the irradiance, \(E_{e,\text{aurora}, t}\), at time t, we remove the background contribution from the total irradiance to obtain:

\[ E_{e,\text{aurora}, t} = E_{e,\text{total}, t} - E_{e,0} \]
where \(E_{e,0} = \frac{1}{A} \times \int _{x, y} E_{e,0}(x, y)\, dxdy\). The aurora radiance at time t, \(L_{\text {aurora}, t}\), can be calculated using the reference measurement distributed across the pixels:

\[ L_{\text{aurora}, t} = L_{0} \times \frac{E_{e,\text{aurora}, t}}{E_{e,0}} \]
where \(L_{0}\) is the reference background radiance in \(Wm^{-2}sr^{-1}\). \(\frac{E_{e,\text {aurora}, t}}{E_{e,0}}\) represents the signal amplification. This effectively provides the reference background radiance amplification.
The background radiance, \(L_{0}\), is measured by a photometer located at the Auroral Observatory in Tromsø, with measurements from the same evening. It is a TESS-W system, part of the STARS4ALL project. The photometer measurement is given in magnitude arcsecond−2, measured as a function of photometer frequency, f, dark frequency, \(f_{d}\), and a system Zero Point (ZP) of 20.5, with the following relationship50:

\[ m_{T} = ZP - 2.5 \times \log _{10}\left(f - f_{d}\right) \]
\(m_{T}\) was measured at 16.82 magnitude arcsecond−2 at \(\sim\)19:48 UTC15 without auroras. \(L_{0}\) can then be determined using50:

\[ L_{0} = G \times \left(f - f_{d}\right) = G \times 10^{(ZP - m_{T})/2.5} \]
where the system constant G is in \(Wm^{-2}sr^{-1}Hz^{-1}\). As the system values of this photometer are unavailable, we assume a representative G value of \(1.19\times 10^{-6}\,Wm^{-2}sr^{-1}Hz^{-1}\), an average of the G values of the calibrated TESS-W systems reported in50. Another assumption is that \(f_{d}\) is zero, in accordance with50, given the local sub-zero temperature conditions. This gives a radiance of \(L_{0} = 3.52\times 10^{-5}\) \(Wm^{-2}sr^{-1}\). We also assume the photon sensitivity is identical across the DVXplorer and the photometer, as iniVation does not report the quantum efficiency of the DVXplorer in their specifications.
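The background radiance can be reproduced numerically from the quantities above, assuming the magnitude relation \(m_{T} = ZP - 2.5\log_{10}(f - f_{d})\), the representative G value and \(f_{d} = 0\) stated in the text:

```python
# Reproduce the background radiance L_0 from the TESS-W magnitude.
ZP = 20.5                      # system zero point
m_T = 16.82                    # measured magnitude [mag arcsec^-2]
G = 1.19e-6                    # [W m^-2 sr^-1 Hz^-1], mean of calibrated systems
f = 10 ** ((ZP - m_T) / 2.5)   # photometer frequency [Hz], f_d assumed 0
L0 = G * f                     # background radiance [W m^-2 sr^-1]
print(f"L0 = {L0:.2e} W m^-2 sr^-1")   # ~3.53e-5, matching the stated 3.52e-5
```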
Finally, the photon radiance contribution from the aurora at time t, \(L_{q,\text{aurora}, t}\), can be estimated by converting the radiance:

\[ L_{q,\text{aurora}, t} = \frac{L_{\text{aurora}, t}}{E} \]
here, \(L_{q}\) can be expressed in Rayleigh by dividing by \(10^{10}/ 4 \pi \, \text {photons} \, s^{-1} \, m^{-2} \, sr^{-1}\)24,51. E is the photon energy at a specific wavelength in Joules, given as \(E=\frac{h \times c}{\lambda }\), where h is Planck’s constant, c is the speed of light and \(\lambda\) is the wavelength of 557.7 nm. Dividing by E converts the radiance to an equivalent photon radiance for 557.7 nm.
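The full conversion from radiance to the kilo-Rayleigh units used in Fig. 2 can be sketched as below; the function name is illustrative, and the constants are the standard physical values.

```python
import numpy as np

h = 6.62607015e-34   # Planck constant [J s]
c = 2.99792458e8     # speed of light [m/s]
lam = 557.7e-9       # auroral green-line wavelength [m]

def radiance_to_kilorayleigh(L_aurora):
    """Convert a 557.7 nm radiance [W m^-2 sr^-1] to kilo-Rayleigh.

    Divide by the photon energy E = h*c/lambda to get photon radiance
    [photons s^-1 m^-2 sr^-1], then by 1e10/(4*pi) for Rayleigh, and
    by 1e3 for kR.
    """
    E = h * c / lam                        # photon energy [J]
    Lq = L_aurora / E                      # photon radiance
    return Lq / (1e10 / (4 * np.pi)) / 1e3
```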
Event video displaying and scaling
Integrated and decayed event frame videos are composed, presenting on events as green and off events as blue. The number of events per pixel is far below the maximum RGB channel value of 255, necessitating scaling. For the 60 Hz video, 8 on and 5 off events are selected as the maximum display values and scaled to 255, a compromise between making low activity levels visible and not saturating during the most active periods. The maximum values are 4 and 2 for 600 Hz and 2 and 1 for 5KHz for on and off events, respectively. A lower decay value of 0.975 was selected to limit overlap between on and off events at all integration frequencies. Note that hot pixels are not replaced, and the bilateral filter is not applied.
Supplementary information
Associated videos are available at19.
Data availability
The original DVS observations are available via the link in reference20 and from the corresponding author on reasonable request.
References
Prölss, G. Physics of the Earth’s Space Environment: An Introduction (Springer, 2012).
Davidson, G. Pitch-angle diffusion and the origin of temporal and spatial structures in morningside aurorae. Space Sci. Rev. 53(1–2), 45–82 (1990).
Frey, H.U., Mende, S.B., Angelopoulos, V., Donovan, E.F. Substorm onset observations by IMAGE-FUV. J. Geophys. Res. Space Phys. 109(A10) (2004).
van der Meeren, C., Oksavik, K., Lorentzen, D. A., Rietveld, M. T. & Clausen, L. B. N. Severe and localized GNSS scintillation at the poleward edge of the nightside auroral oval during intense substorm aurora. J. Geophys. Res. Space Phys. 120(12), 10–107 (2015).
Nishimura, Y., Dyer, A., Donovan, E. F. & Angelopoulos, V. Finescale Structures of STEVE Revealed by 4k Imaging. J. Geophys. Res. Space Phys. 128(12), e2023JA032008 (2023).
Fukuda, Y. et al. First evidence of patchy flickering aurora modulated by multi-ion electromagnetic ion cyclotron waves. Geophys. Res. Lett. 44(9), 3963–3970 (2017).
Gallego, G. et al. Event-based vision: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 44(1), 154–180 (2022).
McReynolds, B., Graca, R. & Delbruck, T. Experimental methods to predict dynamic vision sensor event camera performance. Opt. Eng. 61(07), 074103 (2022).
iniVation: Understanding the Performance of Neuromorphic Event-based Vision Sensors. Technical report.
Aerne, C., Eng, K. iniVation Specifications – Current models. Technical report. Accessed 15 August 2023.
Itikawa, Y. & Ichimura, A. Cross Sections for Collisions of Electrons and Photons with Atomic Oxygen. J. Phys. Chem. Ref. Data 19(3), 637–651 (1990).
Phantom: T4040. https://www.phantomhighspeed.com/products/cameras/tseries/t4040. Accessed 12 December 2023.
iniVation: DV · Dynamic Vision System. https://inivation.gitlab.io/dv/dv-docs/. Accessed 25 October 2023.
iniVation: modules/cameras/dvxplorer.cpp · master · iniVation AG / dv-core / dv-runtime, line 377-380 · GitLab. https://gitlab.com/inivation/dv/dv-runtime/-/blob/master/modules/cameras/dvxplorer.cpp. Accessed 28 November 2023.
stars603 - Tromsø, Norway. https://tess.dashboards.stars4all.eu/d/tess_raw/s4a-photometer-network-raw?orgId=1&var-Tess=stars603&from=1677862800000&to=1677884399000. Accessed 14 February 2024.
Zamorano, J. et al. Stars4all Night Sky Brightness Photometer. Int. J. Sustain. Light. 18, 49–54 (2017).
Matzka, J., Bronkalla, O., Tornow, K., Elger, K., Stolle, C.: Geomagnetic Kp index. V. 1.0. GFZ Data Services.
Tromsø Geophysical Observatory: Magnetic field H component intensity. https://flux.phys.uit.no/cgi-bin/mkstackplot.cgi?&comp=H&cust=&site=tro2a&Sync=1677884400&.
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. The first observations of auroras with dynamic vision sensors. https://doi.org/10.11583/DTU.c.6963765.
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M.: Aurora DVS observations, March 3 2023 20:17:05-20:31:32. https://doi.org/10.11583/DTU.24720198
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. Reconstructed brightness video, 20:21:56-20:25:06, 60Hz. https://doi.org/10.11583/DTU.24721509.
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. Reconstructed brightness video, 20:23:06-20:23:56, 600Hz. https://doi.org/10.11583/DTU.24745377.
Chamberlain, J.W. Physics of the Aurora and Airglow. (1961).
Hunten, D. M., Roach, F. E. & Chamberlain, J. W. A photometric unit for the airglow and aurora. J. Atmos. Terr. Phys. 8(6), 345–346 (1956).
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. Aurora event observations, 20:28:09-20:28:31, 60Hz. https://doi.org/10.11583/DTU.24721200.
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. Aurora event observations, 20:28:11-20:28:14, 600Hz. https://doi.org/10.11583/DTU.24721317.
Stokholm, A., Gulbrandsen, N., Pedersen, N., Kucik, A., Olesen, D., Willer, A.N., Chanrion, O., Hvidegaard, S.M. Aurora event observations, 20:28:11-20:28:12, 5000Hz. https://doi.org/10.11583/DTU.24721326.
Nozaki, Y. & Delbruck, T. Temperature and parasitic photocurrent effects in dynamic vision sensors. IEEE Trans. Electron Devices 64(8), 3239–3245 (2017).
Berthelon, X., Chenegros, G., Finateu, T., Ieng, S.-H. & Benosman, R. Effects of cooling on the SNR and contrast detection of a low-light event-based camera. IEEE Trans. Biomed. Circuits Syst. 12(6), 1467–1474 (2018).
Widenhorn, R., Blouke, M.M., Weber, A., Rest, A., Bodegom, E. Temperature dependence of dark current in a CCD. SPIE Proc. (2002).
McHarg, M. G., Hampton, D. L. & Stenbaek-Nielsen, H. C. Fast photometry of flickering in discrete auroral arcs. Geophys. Res. Lett. 25(14), 2637–2640 (1998).
Gustavsson, B., Lunde, J., Blixt, E.M. Optical observations of flickering aurora and its spatiotemporal characteristics. J. Geophys. Res. Space Phys. 113(A12), (2008).
Prophesee: Prophesee Announces Collaboration with Qualcomm. https://www.prophesee.ai/2023/02/27/prophesee-qualcomm-collaboration-snapdragon/. Accessed 01 December 2023. (2023).
Khan, N. & Martini, M. G. Data rate estimation based on scene complexity for dynamic vision sensors on unmanned vehicles. In 2018 IEEE 29th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC) (IEEE, 2018).
Chanrion, O., Pedersen, N., Stokholm, A., Hauptmann, B., Neubert, T. Thor-DAVIS: A neuromorphic camera to observe thunderstorms from inboard ISS. Technical report, Copernicus GmbH (2023).
Chanrion, O. et al. Profuse activity of blue electrical discharges at the tops of thunderstorms. Geophys. Res. Lett. 44(1), 496–503 (2017).
Neubert, T., Østgaard, N., Reglero, V., Blanc, E., Chanrion, O., Oxborrow, C.A., Orr, A., Tacconi, M., Hartnack, O., Bhanderi, D.D.V. The ASIM Mission on the International Space Station. Space Sci. Rev. 215(2) (2019).
Surkov, V. V. & Hayakawa, M. Progress in the study of transient luminous and atmospheric events: A review. Surv. Geophys. 41(5), 1101–1142 (2020).
Soler, S., Gordillo-Vázquez, F.J., Pérez-Invernón, F.J., Luque, A., Li, D., Neubert, T., Chanrion, O., Reglero, V., Navarro-González, J., Østgaard, N. Global Distribution of Key Features of Streamer Corona Discharges in Thunderclouds. J. Geophys. Res. Atmos. 127(24), (2022).
McHarg, M.G., Balthazor, R.L., McReynolds, B.J., Howe, D.H., Maloney, C.J., O’Keefe, D., Bam, R., Wilson, G., Karki, P., Marcireau, A., Cohen, G. Falcon Neuro: an event-based sensor on the International Space Station. Opt. Eng. 61(08), (2022).
Arja, S., Marcireau, A., Balthazor, R.L., McHarg, M.G., Afshar, S., Cohen, G.: Density Invariant Contrast Maximization for Neuromorphic Earth Observations. https://arxiv.org/abs/2304.14125. Accessed 24 October 2023. (2023).
Cohen, G. et al. Event-based Sensing for Space Situational Awareness. J. Astronaut. Sci. 66(2), 125–141 (2019).
Kamiński, K., Cohen, G., Delbruck, T., Żołnowski, M., Gędek, M. Observational evaluation of event cameras performance in optical space surveillance. https://conference.sdo.esoc.esa.int/proceedings/neosst1/paper/475 (2023).
Afshar, S., Nicholson, A. P., van Schaik, A. & Cohen, G. Event-based object detection and tracking for space situational awareness. IEEE Sens. J. 20(24), 15117–15132 (2020).
Chin, T.-J., Bagchi, S., Eriksson, A., van Schaik, A. Star Tracking Using an Event Camera. In: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (IEEE, 2019).
Graça, R., McReynolds, B., Delbruck, T.: Shining light on the DVS pixel: A tutorial and discussion about biasing and optimization. In: 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (IEEE, 2023). Accessed 23 October 2023.
Feng, Y. et al. Event density based denoising method for dynamic vision sensor. Appl. Sci. 10(6), 2024 (2020).
Khodamoradi, A., Kastner, R.: O(N)-Space Spatiotemporal Filter for Reducing Noise in Neuromorphic Vision Sensors. IEEE Transactions on Emerging Topics in Computing, 1–1 (2018).
iniVation: Filtering events — dv-processing rel_1.7 documentation. URL: https://dv-processing.inivation.com/rel_1_7/event_filtering.html. Accessed 25 October 2023.
Bará, S., Tapia, C. & Zamorano, J. Absolute radiometric calibration of TESS-W and SQM night sky brightness sensors. Sensors 19(6), 1336 (2019).
Baker, D. J. & Romick, G. J. The rayleigh: Interpretation of the unit in terms of column emission rate or apparent radiance expressed in SI units. Appl. Opt. 15(8), 1966–1968. https://doi.org/10.1364/AO.15.001966 (1976).
Acknowledgements
The authors would like to thank Susanne Vennerstrøm (DTU Space), Nicolas Longépé (ESA \(\Phi\)-lab) and Björn Gustavsson and Juha Vierinen (Department of Physics and Technology, UiT the Arctic University of Norway) for their support of this research. In addition, we express our gratitude to Ulla og Mogens Almennyttige Fond for funding the purchase of the utilised DVXplorer, and to the Niels Bohr Foundation and the Thomas B. Thriges Foundation, which provided financial assistance during Andreas Stokholm's external research stay at UiT.
Author information
Authors and Affiliations
Contributions
AS co-formulated the idea, co-formulated the methodology, carried out experiments, performed data analysis and wrote the manuscript. NJ assisted in carrying out the experiments with AS and contributed valuable domain knowledge to the experimentation and manuscript. NP assisted in the data analysis and in formulating the methodology. AK assisted in formulating the methodology. DHO co-formulated the idea and contributed knowledge about the potential impact of the application. ANW contributed valuable domain knowledge for the manuscript. OC co-supervised AS, co-formulated the methodology, assisted in the data analysis and provided valuable input to the manuscript leading up to the initial draft. SMH co-supervised AS, co-formulated the idea, attempted initial experiments and provided valuable input to the manuscript leading up to the initial draft. All authors have reviewed the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Stokholm, A., Gulbrandsen, N., Pedersen, N. et al. The first observations of auroras with dynamic vision sensors. Sci Rep 15, 26539 (2025). https://doi.org/10.1038/s41598-025-09821-2
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41598-025-09821-2