An inside look at video quality testing for cameras in motion

When integrating video enhancement features like video stabilization into cameras in motion, testing plays a crucial role in ensuring high video quality, a smooth integration and an outstanding user experience. Video quality testing comes into play at several stages in the product development process, as covered in our guide, “Mastering video stabilization in product development”. Now we’ll take a deeper look at all the state-of-the-art equipment, proven methodologies, creative time savers, hard work and careful consideration that go into video quality testing, with insider insights from our own test center.


Where the magic happens – our test center

In an otherwise bright-colored, glassy and buzzing office just off the main street in Uppsala, Sweden, a multitude of computer screens and phones light up even the darkest of Swedish winter evenings. They also provide rich amounts of heat, when that fourth cup of Earl Grey just isn’t enough.

But one of the rooms is dark and different. Its walls are painted black, and the windows have special blinds to completely block out all external light. The room has all sorts of gadgets and accessories either used directly in video quality testing or to create the right testing environment. This optics lab is a must for a company like Imint, and we’ll tell you more about it later.
At the other end of the office, another room is filled with blinking lights, computers and a plethora of different phones, all humming steadily. It’s loud and warm in there, but never too warm; monitoring systems and cooling ventilation make sure of that. The server room is a common sight at most software companies, but this one has some special parts that we’ll get back to.

Writing a high-performance video technology SDK like Vidhance is very hard, especially when working at the cutting edge. No major performer would go on stage without rehearsing, and a major part of our development cycle consists of similar rehearsals: taking the time to make sure our software maintains the quality that users expect without consuming too much energy. This requires both automatic and manual video quality testing and calibration.

Automatic testing – with inspiration from the industry leader

A well-known company in the camera testing industry is DxOMark. They’ve become very popular in the smartphone industry as camera quality has become more and more of a selling point. DxO makes a lot of editing software, but DxOMark specifically ranks and awards numeric scores for a range of areas of both photo and video quality, such as autofocus, noise and stabilization. These scores are then weighted together into a final numeric score that describes the camera’s overall prowess. Remember how we said before that it’s not all about the megapixels? It may (at least partially) be all about the DxO score, and many of our customers’ devices have scored high on it.
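
To make the weighting step concrete, here is a minimal Python sketch of how per-category results can be rolled up into one composite number. The categories, scores and weights are purely illustrative assumptions on our part, not DxO’s actual methodology.

    # Hypothetical per-category scores and weights; the real categories and
    # weighting used by DxOMark are more elaborate and not reproduced here.
    scores = {"autofocus": 92, "noise": 85, "stabilization": 95}
    weights = {"autofocus": 0.3, "noise": 0.3, "stabilization": 0.4}

    composite = sum(scores[k] * weights[k] for k in scores)
    print(f"Composite score: {composite:.1f}")  # -> Composite score: 91.1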

A hexapod is a high-precision device for movement in all six degrees of freedom, a necessity for reproducible stabilization tests.

The black-painted camera lab holds equipment worth in the range of one million Swedish kronor. We’ve got a shake rig, lighting gear, moving model trains, printed TV test charts and a plethora of other gadgets. Their use cases include calibration, comparing different software versions and other manufacturers’ results, testing new algorithms and reproducing the DxO setup.

We have worked with DxO and been on location for various device tests in the past. For better or worse, the DxO score is the only agreed-upon customer and industry standard for camera quality. It’s not entirely objective, though: some aspects are inherently subjective, and at some point a cut-off decision has to be made about exactly how each of them is measured.

Although we carry the same equipment, there is still some margin of error when measuring locally compared with testing at DxO in France. There may be slight differences in the software used, and it’s easy to over-optimize for what you are already good at when you’ve written the measurement software yourself. More embarrassing sources of error include accidentally moving the lights or the stickers on the floor when cleaning, which corrupts future tests until the mistake is discovered.

However, general scores mean nothing for products like Live Composer, which is based on object tracking and auto zoom technologies. The reason is that there is no industry standard for – and likely no way to quantify – success. Yet we still need rigorous video quality testing, and this is where human subjects come in.

Manual testing – human subjects and real-world focus

Sometimes there’s no objective truth. Although we can agree on how to measure “shakiness” and to a certain degree how good something like automatic white balance is, designing a user interface is a complex process. Deciding what works best in the end requires real-life human testers.

After determining exactly what we want to know, we ask people to perform certain tasks, like using Live Composer to create a video of a moving Lego train. We simultaneously record all screen interactions and comments. Our largest test so far involved 16 people.

Several quantifiable values came out of this approach, such as the time it takes to perform the given task and the number of mistakes or recovery paths taken along the way. The goal, of course, is to minimize these numbers. But other qualitative observations can be far more valuable, such as noticing – and putting into words – what users expect to happen and find intuitive.
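
As a minimal sketch of how such session recordings could be boiled down to numbers, something like the Python below would do. The log file name and column names are hypothetical placeholders, not our actual tooling.

    import csv
    from statistics import mean

    # Hypothetical session log: one row per participant, with the task
    # duration in seconds and the number of mistakes observed.
    with open("live_composer_sessions.csv", newline="") as f:
        sessions = list(csv.DictReader(f))

    durations = [float(row["task_seconds"]) for row in sessions]
    mistakes = [int(row["mistakes"]) for row in sessions]

    print(f"Participants:   {len(sessions)}")
    print(f"Mean task time: {mean(durations):.1f} s")
    print(f"Mean mistakes:  {mean(mistakes):.1f}")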

Model trains aren’t just a kid’s toy; they’re a valuable tool for testing Live Composer. If you haven’t found a Christmas present for our lab crew yet…

For reasons of confidentiality, we invite external test subjects only once a product is already known and more or less released. Before that, we only use employees here at Imint. This set of people is obviously biased: we are all professionals in our field and know exactly what we want the end result to be like. It’s like predicting the World Cup winner by only asking people from one country. Fortunately, so far we’ve seen that the results of internal tests and discussions are well reflected by what other people say and do at later stages.

We also perform tests in simulated real-world scenarios as much as possible. For instance, we take a closer look at performance in especially windy, warm, cold, light and dark conditions. After all, our technology will be used in the real world and needs to be ready for anything our end users might need to do with it.

Why video quality testing is an important and complex task

Like most companies, Imint’s customers set requirements for their vendors, and vendors like us need to ensure that what we deliver meets the standards their users expect. Our customers test the software they buy, of course, but the later a bug is discovered, the more difficult it may be to fix. As a result, it’s important that we are diligent during development. DxO testing takes place late in the process, and the Vidhance integration needs to already be in great shape by then. Even worse is a serious bug discovered in the hands of end users.

Unreleased prototype phones sure aren’t pretty, and the software isn’t always stable, but we make sure the video is always both pretty and stable.

Testing software is a science all on its own. At Imint, it can be even more complicated, not only because we are very rigorous – so are other companies – but because of the hardware involved. We need to test on many different devices from different manufacturers, using different operating systems, chipsets, CPU architectures and other parameters.

That requires a lot of preparation, especially for the integration tests when all components are supposed to come together. We care deeply about variables like battery life and heat dissipation. Both are closely monitored, and they are related: higher temperatures mean higher energy consumption and worse performance. You could say we put much effort into being “cool”.
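
As an illustration of that monitoring, a bare-bones way to sample battery level and temperature on an Android test device could look like the sketch below. It assumes a device reachable over adb; the device serial and polling interval are arbitrary placeholders, and our actual tooling is more elaborate.

    import subprocess
    import time

    def battery_stats(serial: str) -> dict:
        """Read battery level and temperature from an Android device via adb."""
        out = subprocess.run(
            ["adb", "-s", serial, "shell", "dumpsys", "battery"],
            capture_output=True, text=True, check=True,
        ).stdout
        stats = {}
        for line in out.splitlines():
            key, sep, value = line.strip().partition(":")
            if sep:
                stats[key.strip()] = value.strip()
        return {
            "level_percent": int(stats["level"]),
            # dumpsys reports temperature in tenths of a degree Celsius.
            "temp_celsius": int(stats["temperature"]) / 10.0,
        }

    if __name__ == "__main__":
        serial = "0123456789ABCDEF"  # placeholder device serial
        while True:
            s = battery_stats(serial)
            print(f"battery {s['level_percent']}% at {s['temp_celsius']:.1f} °C")
            time.sleep(30)  # sample every 30 seconds during a test run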

How we make the video quality testing process more efficient

When testing our Vidhance video enhancement software, other software on the device isn’t necessarily reliable, especially early in the development cycle. To decrease dependencies and possible crashes, we use newly flashed devices with only the bare minimum installed. Building Android from source can require upwards of 100 GB of disk space (although the end result is much smaller) and a fair bit of time, just for one of many versions.

We found a clever way of managing such sizes efficiently. All required versions are stored on a computer with plenty of disk space, with as much as possible precompiled and with the relevant parts ready to be replaced with Vidhance code. If these files were simply moved to a new computer later, their modification dates would change, and the build system would think everything had changed and rebuild from scratch.

To avoid this, the code is stored in a lightweight, virtual-machine-like environment called a Docker container, with its own isolated file system. Because the file metadata stays intact inside the container, the build system only rebuilds the new parts before assembling the final build. The Docker image also ensures the build environment is identical every time. This is fast and efficient.
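
To make that concrete, kicking off such a build from a Python script could look roughly like this. The image name, paths and build command are hypothetical placeholders rather than our actual setup, and it assumes Docker is installed on the build machine.

    import subprocess

    # Hypothetical image with the precompiled Android tree and toolchain baked in.
    IMAGE = "example/android-build:v1"
    # Host directory holding the Vidhance code that replaces stock components.
    VIDHANCE_SRC = "/srv/builds/vidhance"

    def run_incremental_build() -> None:
        """Run a build inside a pinned Docker image.

        Only the mounted Vidhance sources differ between runs, so the build
        system inside the container sees unchanged timestamps for everything
        else and rebuilds just the new parts.
        """
        subprocess.run(
            [
                "docker", "run", "--rm",
                "-v", f"{VIDHANCE_SRC}:/src/vidhance:ro",  # mount the new code read-only
                IMAGE,
                "bash", "-c", "cd /android && ./build.sh",  # placeholder build entry point
            ],
            check=True,
        )

    if __name__ == "__main__":
        run_incremental_build()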

No cloud, no problem

Testing, and the evaluation of your video quality testing procedures, are continuous processes. There is a constant trade-off between time and the granularity of your bug-catching net, and the later a problem is discovered, the more expensive and stressful it becomes. All the software testing we have talked about here is done locally on our premises, although some tuning and calibration can be performed on location at the customer’s premises, depending on the feature and our agreements.

Some of the automatic tests could certainly be performed in the cloud, but with our need for specialized on-demand devices and GPUs, it’s harder to find solutions. Services like AWS Device Farm enable app testing, but we often want to reprogram the lower levels of the device itself, and mostly on phones not yet released. No such service is known to us (maybe this could be a business idea for someone?).

In any case, it’s no problem for us, as the bottleneck is usually the tests’ runtime on the phone. The cloud would also open up a new attack vector in terms of IT security, which is why we haven’t pursued this avenue.

Seeing is believing

Want to see with your own eyes how much we can improve video quality, save on battery consumption or fine-tune auto zoom and object tracking in Live Composer? Contact us to book a demo of Vidhance and discuss testing and other parts of the process of integrating video stabilization with our video quality experts. For inspiration, insights and best practices for the next generation of video enhancement, enter your email address and subscribe to our newsletter. 

