MADISON – Long relegated to scientific niches like astronomy and microscopy, sensors that catch just the minimum amount of light – a single photon at a time – could be better than typical digital cameras at capturing everyday memories in challenging environments.
In a dark room or a motion-heavy scene, conventional cameras face a choice: a quick look that freezes movement nicely but turns out dark, or a longer exposure that captures more light but blurs moving parts.
“That’s always been a fundamental trade-off in any kind of photography,” says Mohit Gupta, a University of Wisconsin-Madison computer sciences professor. “But we are working on overcoming that trade-off with a different kind of sensor.”
Gupta’s lab in the School of Computing, Data and Information Sciences is collaborating with Professor Edoardo Charbon at the École Polytechnique Fédérale de Lausanne in Switzerland, an expert in packing a type of single-photon sensor called the single-photon avalanche diode, or SPAD, into ever-larger arrays. The researchers are using SPADs for what they call quanta burst photography – capturing many short-exposure images in rapid bursts, then computationally merging them to squeeze one good picture from a poorly lit or fast-moving subject. They will present their work during the SIGGRAPH 2020 conference, tentatively scheduled for August.
“You can think of each pixel of a camera as a light bucket which collects photons. Typical camera pixels need a lot of photons to make a reasonable image,” Gupta says. “But with a camera made from single-photon sensors, the bucket is more a collection of teaspoons that fill up as soon as they detect a minimum quantity of light.”
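The teaspoon analogy can be made concrete with a small simulation. This is a minimal sketch, not the researchers' actual pipeline: each SPAD pixel is modeled as a binary detector that either fires or doesn't during a frame, with the probability of firing following Poisson photon-arrival statistics. Averaging many binary frames and inverting that probability recovers an estimate of the scene's brightness. The scene values, frame count, and the maximum-likelihood inversion used here are illustrative assumptions.

```python
import numpy as np

# Hypothetical scene: mean photon arrivals per pixel per frame exposure.
scene_flux = np.array([[0.05, 0.5],
                       [1.0,  2.0]])

rng = np.random.default_rng(0)
n_frames = 10_000

# A SPAD pixel fires if at least one photon arrives during the frame.
# Under Poisson arrivals, P(fire) = 1 - exp(-flux).
p_fire = 1.0 - np.exp(-scene_flux)
binary_frames = rng.random((n_frames, *scene_flux.shape)) < p_fire

# Merge the burst: estimate the firing probability per pixel,
# then invert it to recover the flux (maximum-likelihood estimate).
p_hat = binary_frames.mean(axis=0)
p_hat = np.clip(p_hat, 0.0, 1.0 - 1e-9)  # guard against log(0)
flux_hat = -np.log1p(-p_hat)

print(flux_hat)  # close to scene_flux for large n_frames
```

Even though every individual frame is just ones and zeros, the merged estimate converges on the true brightness, which is why stacking enough single-photon frames can yield a clean image of a dim scene.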
SPADs lend a camera two benefits: great sensitivity and remarkable speed. In theory, the single-photon teaspoons can catch light more than a million times per second. The SwissSPAD array used in the burst photography work is fast enough to record 100,000 single-photon frames per second.