A startup has launched a Zoom tool that, it says, can assess the quality of video meetings in real time.
Founded by former Foursquare CEO David Shim, Read AI says its goal is to eliminate bad meetings by helping people adapt their approach on the fly.
The company’s tool extracts video and audio data and spits out a color code that indicates whether sentiment and engagement levels are good, neutral, or bad. This data is presented to all meeting participants, not just the person who activated the tool.
“In the aftermath of the pandemic, people have been thrown in at the deep end when it comes to video conferencing. There hasn’t been a normal education process, so people aren’t trained to understand how to read people on video,” Shim told TechRadar Pro.
“[Using Read] is like looking at your car’s dashboard to check your speed. It gives you time to think about how the call is going and respond accordingly.”
The tool is currently available for anyone wishing to join the waiting list, but is expected to roll out to the official Zoom app store in the coming weeks. The company also says it will aim to bring Read to other popular platforms (like Teams and Meet) soon.
Presented with an app that scans your voice and facial expressions to determine whether you’re paying attention, the immediate instinct may be to recoil. However, during our conversation, Shim went out of his way to stress that Read is not a monitoring tool.
Instead, he says the company has taken a number of steps to ensure the service cannot be used to assess individual contributions and apportion blame, but only as a means of improving the quality of video meetings.
For example, Read posts a chat message at the start of every call to make sure all participants know it’s active. If anyone on the call isn’t comfortable with the idea, they can simply type “unsubscribe” in the chat and the tool will be disabled for everyone.
Shim says the company has also ensured that data privacy is not compromised. Rather than retaining data to improve the performance of its algorithms, Read deletes all data within 24 hours of a call ending.
Finally, while engagement and sentiment data is pulled from each participant’s feed, the information is presented in the dashboard only in aggregate, and therefore cannot be used to call out specific people.
It should be noted, however, that this is not the case for the talk time metric, which highlights the three most and least dominant speakers on the call. The company says the feature is designed to alert people who may have a habit of speaking over others, but it’s harder to justify the section that singles out the quieter participants.
Work in progress
The potential value of a tool like Read is clear, giving users the flexibility to adapt, or even cut short, a meeting that isn’t going to plan. However, the implementation currently leaves much to be desired.
For example, the Read interface is currently rather difficult to interpret. With engagement data broken down into time segments, the ability to derive value from the information depends on the user remembering what was happening at each given moment.
We also found the very presence of the tool off-putting to an extent. This may be the kind of thing people get used to over time, but for now, it can be difficult to focus on what you’re saying without wondering how it might register on the graph.
However, Shim was quick to note that this is the first iteration of an evolving product. It is not a tool that will diagnose issues at a granular level, but one designed to help identify high-level trends across multiple meetings.
Over time, he says, the company will move towards more specific recommendations based on the data it collects. But inevitably, an increase in specificity risks compromising the all-important anonymity, a trade-off Shim is not eager to make.