Imint – Image Intelligence AB (Imint) was spun out from next-generation research conducted at the Center for Image Analysis at Uppsala University. The company has been providing its intelligent video processing software, Vidhance®, to the technologically demanding industrial and defense markets, where it is used to improve time-critical decision making from remote video feeds, such as military surveillance from unmanned aerial vehicles (UAVs) and remotely operated submarines. The Vidhance software algorithms have been tuned over the years to meet strict requirements from these traditional markets, where harsh filming conditions, real-time constraints and resource-limited portable computer systems are common.
The smartphone market has seen a technology arms race in camera performance over the years, and this battle has only intensified. Camera capabilities for both still images and video capture are typically highlighted as the main, differentiating features with the release of new models. Tremendous investments have been made in camera technology: in optics, sensors and image processing hardware. To address image stabilization, device motion data from built-in sensors is used to compensate for the higher-frequency vibrations and hand shake that previously gave blurry images and other artifacts such as rolling shutter effects. The increased quality of still image photography from smartphones has helped make social image sharing ubiquitous.
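As a concrete illustration of the sensor-based approach described above, the following minimal sketch shows how gyroscope readings might be integrated into a per-frame pixel offset that counteracts high-frequency shake. It is a generic example with illustrative constants (focal length, sample rate), not any particular vendor's implementation or values.

```python
# Hypothetical sketch of sensor-based electronic stabilization: gyroscope
# angular-velocity samples over one frame are integrated into an angle,
# which is converted to a compensating pixel shift. Constants are
# illustrative assumptions, not values from any specific device.
import numpy as np

FOCAL_PX = 1500.0          # assumed focal length in pixels
GYRO_HZ = 200.0            # assumed gyroscope sample rate

def shake_offset(gyro_rates_rad_s):
    """Integrate gyro rates (rad/s) over one frame into a pixel offset."""
    dt = 1.0 / GYRO_HZ
    angle = np.sum(gyro_rates_rad_s) * dt      # small-angle integration
    return FOCAL_PX * np.tan(angle)            # angular shake -> pixel shift

# Example: ~10 Hz hand tremor sampled over one 30 fps frame interval
t = np.arange(0, 1 / 30, 1 / GYRO_HZ)
rates = 0.05 * np.sin(2 * np.pi * 10 * t)      # rad/s, illustrative
print("compensating shift (px):", round(float(shake_offset(rates)), 2))
```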
However, Imint’s hypothesis was that the specific quality degradation caused by recording video while moving had not yet been successfully addressed, so a number of flagship devices were taken out for testing in real-life situations, such as filming while slowly walking forward.
Vidhance performs a real-time analysis of the low-frequency camera movements, separates the intended motion from the unintended motion, and cancels out the latter. This gives an instant “camera on virtual rails” effect as you move around and film.
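A minimal sketch of the underlying idea, not Imint’s actual implementation, might look as follows: the measured camera path is split causally into a slow, intended component and a fast, unintended residual, and only the residual is compensated. The filter constant and the synthetic walking motion below are illustrative assumptions.

```python
# Sketch of "camera on virtual rails": split the camera path into an
# intended (low-frequency) component and an unintended (high-frequency)
# residual with a causal low-pass filter, then cancel only the residual.
import numpy as np

def split_camera_path(path, alpha=0.05):
    """Causally low-pass filter a 1-D camera trajectory.

    path  : per-frame cumulative camera translation (e.g. pixels)
    alpha : smoothing factor; smaller = smoother "intended" path
    Returns (intended, unintended) so that path = intended + unintended.
    """
    intended = np.empty_like(path, dtype=float)
    intended[0] = path[0]
    for i in range(1, len(path)):
        # Exponential moving average tracks slow, deliberate motion only
        intended[i] = alpha * path[i] + (1.0 - alpha) * intended[i - 1]
    unintended = path - intended           # residual shake to cancel out
    return intended, unintended

# Synthetic example: a slow walk (ramp) plus hand shake (sinusoid + noise)
frames = np.arange(300)
walk = 0.5 * frames                        # intended forward drift
shake = 8 * np.sin(frames / 3) + np.random.normal(0, 2, frames.size)
measured = walk + shake

intended, unintended = split_camera_path(measured)
# Per-frame correction: shift frame i by -unintended[i] before display
corrections = -unintended
print("max correction applied (px):", round(float(np.abs(corrections).max()), 1))
```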
The results confirmed the assumption: whereas all phones did a good job (with varying quality) of dampening high-frequency motion, the longer and slower motions were largely left uncorrected in the resulting video. The graph illustrates how different video stabilization techniques address different frequency bands and time scales.
Simon Mika, Imint’s CTO, commented on the test results:
“The results from our tests were well aligned with our expectations. To properly address the unintended walking motion, a method using the video data itself is needed. Our variation on the theme of optical flow analysis used by Vidhance is efficient enough to be used as a real-time software addition at the driver level. It can also keep latency low, as no look-ahead buffer is needed. Vidhance can thus fill the gap and complement existing techniques for in-frame quality issues handled in camera hardware.”
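As a rough, generic stand-in for the optical-flow-based, no-look-ahead approach Mika describes (and not the Vidhance algorithm itself), the sketch below estimates frame-to-frame camera motion from sparse feature tracks using OpenCV. The per-frame motion it produces can feed a causal smoothing filter like the one sketched earlier, so no future frames are ever required.

```python
# Generic per-frame global motion estimation via sparse optical flow.
# Only the previous and current frame are used, so latency stays at one
# frame; this is an illustrative stand-in, not Imint's driver-level code.
import cv2
import numpy as np

def estimate_frame_motion(prev_gray, curr_gray):
    """Estimate global (dx, dy, rotation) between two consecutive frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=20)
    if pts is None:
        return 0.0, 0.0, 0.0
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_prev = pts[status.flatten() == 1]
    good_next = nxt[status.flatten() == 1]
    if len(good_prev) < 4:
        return 0.0, 0.0, 0.0
    # Robust similarity fit (translation + rotation + scale) to the tracks
    m, _inliers = cv2.estimateAffinePartial2D(good_prev, good_next)
    if m is None:
        return 0.0, 0.0, 0.0
    dx, dy = float(m[0, 2]), float(m[1, 2])
    dtheta = float(np.arctan2(m[1, 0], m[0, 0]))
    return dx, dy, dtheta

# Usage sketch: accumulate the per-frame motion into a camera path, low-pass
# filter it causally, and warp each frame by the difference between the
# smoothed and the measured path.
```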
An abridged version of the test results can be requested by contacting Imint.