Panning – a case in point for distinguishing realistic from robotic motion
Panning is an excellent example of a movement that reads as either artificial or realistic in video. The seemingly logical approach to panning is to take the shortest possible route from point A to point B: a single, robotic motion along a straight line from start to end, at the same speed throughout. Johan explains why this is not how it works in real life and how Imint is addressing it:
“Take a closer look at a movie or watch a professional cameraperson at work. They are trained to follow the leading lines of the scenery, such as the horizon. It also feels less robotic if you make the accelerations and decelerations softer and keep the panning level even along the horizon. Considering how a human naturally turns their head, all of this feels more realistic. Our next-gen video stabilization algorithms now support horizontal panning, and in the future, AI could enable us to identify and follow other leading lines,” says Johan.
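To make the contrast concrete, here is a minimal sketch of the idea in Python. It is purely illustrative and not Vidhance's actual algorithm; the function names and parameters are our own, and the "natural" pan simply applies a smoothstep easing curve to soften the acceleration and deceleration while keeping the pan level along the horizon.

```python
import numpy as np

def linear_pan(start_yaw, end_yaw, num_frames):
    """'Robotic' pan: constant angular speed from start to end."""
    t = np.linspace(0.0, 1.0, num_frames)
    return start_yaw + (end_yaw - start_yaw) * t

def eased_pan(start_yaw, end_yaw, num_frames):
    """'Natural' pan: smoothstep easing gives soft acceleration and
    deceleration, similar to how a person turns their head."""
    t = np.linspace(0.0, 1.0, num_frames)
    eased = t * t * (3.0 - 2.0 * t)  # velocity is zero at both ends
    return start_yaw + (end_yaw - start_yaw) * eased

# Illustrative example: pan 40 degrees over 120 frames,
# keeping the pitch locked so the pan stays level with the horizon.
yaw_per_frame = eased_pan(0.0, 40.0, 120)
pitch_per_frame = np.zeros_like(yaw_per_frame)  # no vertical drift
```

The difference between the two curves is exactly the one Johan describes: the constant-speed version starts and stops abruptly, while the eased version ramps up and settles down the way a human head turn does.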
The journey from pixel-perfect stabilization to a realistic user experience
Video stabilization testing has traditionally had both an objective and a subjective component, with the lab-based objective side normally taking precedence. But what if a smartphone camera deliberately dials back stabilization strength to create a more realistic experience? If users prefer that camera over one that scores higher in objective testing, then the subjective side of testing should take this user experience into account to produce a fair score. Johan sheds light on this shift in priorities from pixel-perfect stabilization to a realistic user experience:
“We’ve worked with some of the leading smartphone hardware manufacturers on the journey toward achieving pixel-perfect stabilization. Now that this goal has largely been achieved, some may see this as the end of the road – but not us. The future journey will transcend pixel-perfect and focus more on a realistic user experience supported by increasingly advanced AI capabilities. This paradigm shift will empower everyday consumers to film professional-quality video. We’re aiming to get out in front of this with our next-gen Vidhance, and other video stabilization stakeholders, like test institutes, will want to hop aboard soon, before it’s too late,” says Johan.
Let’s redefine video stabilization testing together
At Imint, we are leading the transition to a new paradigm for how video stabilization is defined, tested, optimized and used. We’re working on a new generation of our world-leading video enhancement platform, Vidhance, to create more realistic video experiences and enable more flexible optimizations. Learn more in our guide, “Redefining video stabilization quality”.
Nobody yet has all the answers to questions like how video stabilization test criteria should be adapted, but let’s work on them together. We want to share our knowledge and help craft meaningful test criteria for next-gen video stabilization. Contact Johan to continue the dialogue. For inspiration, insights and best practices for the next generation of video stabilization, enter your email address below and subscribe to our newsletter.