The subject of auto framing cameras comes up often, and everyone has their own opinion. Over the years, the established players have steadily improved the technology, but there are now several new entrants - so how do we pick the one that is right for us? Read on for Bryan's take on the tech and how the UC Test Lab evaluates auto framing.
Years ago, as a camera developer, I would talk to potential customers about auto framing. Back then, users really had two choices: Polycom or Cisco cameras connected to a codec in a conference room. Many people I talked to said they turned it off, either leaving a full FoV capture of the room or using the remote control to manually PTZ their way to the speaker.

Luckily, the technology has improved to the point where users no longer feel like they have to disable the ability to auto frame participants. However, with all the vendors now providing auto framing cameras, there are just as many different philosophies behind what gets captured, how fast it moves and what causes it to move in the first place. I am a firm believer that users need to test as many cameras as possible to see what works in their specific room(s) and use case(s), knowing that this can lead to choosing different cameras from different vendors for different rooms.
So how does one test auto framing cameras? Here are a few ways I suggest testing them in your rooms. These tests were derived from actual experiences using auto framing cameras.
Accuracy is a pretty easy metric. When someone speaks, does the camera capture them? Test this by having a person speak and see if the camera captures them. If you are the only person in your room, does it capture you if you change seats? I have tested cameras that triangulate between themselves, a voice and a face, and sometimes the camera will struggle if there is a voice disassociated from a face. The test involved playing a voice file from a mobile phone in the corner of the lab while I was sitting at the conference table. The camera we were testing would not capture me, even when speaking, because it recognized another voice. Why does this matter? Open spaces. If there is the potential for extraneous voices in the room, be aware that the framing may struggle. In a group meeting, we test accuracy by having two people talk to each other. Meetings are dynamic. Some cameras try to capture each person when speaking and usually lag behind. Other cameras may opt to stay on one speaker, and others may frame them both until one dominates the conversation or the other stops speaking.
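If you run this test against several cameras over several sessions, it helps to keep a simple tally per camera so the comparison isn't left to memory. Here is a minimal sketch in Python; the trial descriptions and outcomes are made up for illustration, not results from any particular camera:

```python
# Tally capture accuracy across a set of trials for one camera.
# Trial names and outcomes below are hypothetical examples.
from collections import Counter

def accuracy_score(trials):
    """trials: list of (description, captured) pairs, where captured is True/False."""
    counts = Counter(captured for _, captured in trials)
    total = len(trials)
    return counts[True] / total if total else 0.0

trials = [
    ("speaker at head of table", True),
    ("speaker changes seats", True),
    ("voice from phone in corner, silent person at table", False),
    ("two people alternating", True),
]
print(f"Capture accuracy: {accuracy_score(trials):.0%}")  # e.g. 75%
```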
When the camera captures you, how large are you? Participant size matters to the remote viewer, and if the camera captures too much of the room and too little of you, it may lead to a poor experience. We typically take screenshots and measure the percentage of head size in relation to the captured scene. In addition, take note of the person's location relative to the center of the capture. Some meeting services offer their own framing scheme, and it may have a negative effect if the framing from the camera puts the participant off to the side.
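There is no single right number here, but a quick script makes the screenshot comparison repeatable. Below is a minimal sketch for turning a head bounding box, measured by hand from a screenshot or taken from whatever face-detection tool you trust, into a size percentage and a center offset; the function and the example pixel values are assumptions for illustration, not part of any camera's API:

```python
# Score a framing screenshot: how much of the frame the head fills,
# and how far the head center sits from the frame center.

def framing_metrics(frame_w, frame_h, head_x, head_y, head_w, head_h):
    """Return head-size percentage of the frame and the head center's
    horizontal/vertical offset from frame center as fractions of frame size."""
    head_area_pct = (head_w * head_h) / (frame_w * frame_h) * 100
    head_cx = head_x + head_w / 2
    head_cy = head_y + head_h / 2
    offset_x = (head_cx - frame_w / 2) / frame_w   # negative = left of center
    offset_y = (head_cy - frame_h / 2) / frame_h   # negative = above center
    return head_area_pct, offset_x, offset_y

# Example: 1920x1080 screenshot with a 210x260 px head box near the right edge
pct, dx, dy = framing_metrics(1920, 1080, 1450, 300, 210, 260)
print(f"Head fills {pct:.1f}% of the frame, offset ({dx:+.2f}, {dy:+.2f}) from center")
```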
Is the framing fast enough? Is it too fast? We test speed of capture by picking two points just far enough apart that the person walks out of the frame, forcing the camera to adjust. Putting tape on the floor helps if you are testing several cameras over the course of several sessions. A person walks back and forth between the marks in a recorded meeting, waiting for the camera to adjust. When viewing the recording, we measure how long it takes for the camera to recapture. This doubles as another test of accuracy. We do this test both while speaking and while silent, as we have found some cameras can struggle with non-speaking participants.
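A minimal sketch of how those measurements might be logged, assuming the timestamps are read manually off the recording (the pass timings shown are made up for illustration; nothing here talks to a camera):

```python
# Compute recapture times from a recorded walk test.
# Each pair is (time the person leaves the frame, time the camera reframes them).

def recapture_times(events):
    """events: list of (t_left_frame_seconds, t_reframed_seconds) pairs."""
    return [t_reframed - t_left for t_left, t_reframed in events]

# Example: three passes between the tape marks, timestamps noted from the recording
passes = [(12.4, 14.1), (31.0, 33.2), (48.5, 49.9)]
times = recapture_times(passes)
print(f"Recapture times: {times}, average {sum(times) / len(times):.1f}s")
```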
Inadvertent capturing, also known as mistakes, is common. So common that we need to be aware of what may cause a camera to misfire. We have a whiteboard in the lab, and when testing, I'll draw a face on the board. Yes, sometimes the camera captures that. There is also a ring light in the lab, and at times cameras pick up the circular shape and frame it. These may not be issues in your room, but what if you have a glass wall separating the conference room from a busy hallway? You do NOT want your camera capturing people walking down the hallway. In a group test, have a person cough or fake a sneeze. If that person gets captured in a real meeting, it can be embarrassing, and it happens if a camera is too aggressive.
360 degree cameras present another issue that a typical "above/below the display" camera doesn't have to deal with in a normal conference room - capturing a remote person on the display and presenting them as the framed participant in the room. This is a horrible experience. Even if you use an above/below display camera, be careful in conference rooms where ancillary displays showing remote participants hang on walls within the camera's potential field of view.
Hopefully this helps and shows why you need to test cameras not just in one room for proof of concept, but in multiple rooms as there may be differing factors that can lead to a poor experience - or a great one.