
AI Video Meeting Intelligence

By Scott, in Video Comms and Artificial Intelligence (AI)

Here's a thought:

Analysing Zoom Calls


Natural language processing (NLP) is now so ubiquitous that many of us take it for granted. Analysing the spoken word has become almost mundane, but what happens when the video data from recorded online meetings can be analysed too? What can happen next?
The Rise of the Superhuman Sales Force

It seems like a long time ago that we would meet in coffee shops before heading into a client meeting at their office. Nowadays all our meetings are conducted digitally online, with tools like Zoom. However, there is a big difference between the meetings we used to have in person, in an office, and the meetings we now have online.

As soon as something becomes digitised it can become fodder for computer-based analysis, and now, with the growth in artificial intelligence (AI), the possibilities for this analysis are becoming almost endless.

Relatively recent advances in AI have led to a massive increase in voice-to-text technology - in fact, I am using Siri to dictate this article. Of course, this kind of technology, now grouped under Natural Language Processing (NLP), is rife in our modern lives, powering things like Amazon Echo and Alexa, Google Home and Assistant, Microsoft Cortana and many other voice analysis products. It has become so prevalent so quickly that we have almost forgotten how powerful it is.
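
To make that concrete, here is a minimal Python sketch of voice-to-text using the open-source SpeechRecognition package. The audio file name is an assumption for illustration, and this is just one of many possible approaches, not how Siri or Alexa work internally.

import speech_recognition as sr

# Transcribe a recorded meeting file (hypothetical name) to text.
recognizer = sr.Recognizer()
with sr.AudioFile("meeting_audio.wav") as source:
    audio = recognizer.record(source)  # read the entire file

# Send the audio to Google's free web speech API for recognition.
print(recognizer.recognize_google(audio))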

Recording online meetings

There are many companies today that help businesses run more smoothly and efficiently by transcribing voice conversations into text, and allowing that text to be shared and analysed after the conversation or meeting. One example that I use all the time is Otter.ai, which is a brilliant app for transcribing conversations. I use it because it allows me to focus completely on the conversation, knowing that I will have a good set of notes to refer to after the call.

AI analysis of video calls

Chorus.ai - Participation analysis and content tagging in Zoom calls

Going beyond Otter.ai's capability, other companies such as Chorus.ai provide this kind of technology with greater depth of enterprise integration. For example, with Chorus.ai you can transcribe entire Zoom meetings and assign them to the relevant entries in your Salesforce CRM.
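
As a rough illustration of the idea (not Chorus.ai's actual integration), a transcript could be attached to a Salesforce record with a few lines of Python using the simple-salesforce package. The credentials, opportunity ID and transcript below are all placeholders.

from simple_salesforce import Salesforce

# Placeholder credentials - a production integration would use OAuth.
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

opportunity_id = "0061234567890ABC"  # hypothetical Salesforce record ID
transcript = "Prospect asked about pricing and a Q3 rollout..."

# Attach the meeting transcript to the opportunity as a Note.
sf.Note.create({
    "ParentId": opportunity_id,
    "Title": "Zoom meeting transcript",
    "Body": transcript,
})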

In October 2020, Zoom launched an integration API called Zapps. Zapps allow third parties to create new integrations with the Zoom platform.

On the same day, Chorus.ai announced their Zapps integration for real-time relationship intelligence in Zoom-based sales meetings.

Currently this integration seems to focus on the voice-to-text (NLP) capabilities of Chorus.ai, but adds real-time tagging and highlighting of keywords. It also provides pre-meeting insights into what happened in other meetings and conversations through its integration with your enterprise's CRM system, and can prompt the sales team in real time with this information.
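
The real-time tagging part is conceptually simple. Here is a minimal Python sketch of keyword spotting over a live transcript stream; the keyword list and transcript chunks are invented for illustration and bear no relation to Chorus.ai's actual models.

# Illustrative keyword list - not Chorus.ai's actual taxonomy.
KEYWORDS = {"pricing", "contract", "competitor", "renewal"}

def tag_transcript(chunks):
    """Yield (timestamp, text, tags) for each transcript chunk."""
    for timestamp, text in chunks:
        words = {word.strip(".,!?").lower() for word in text.split()}
        yield timestamp, text, sorted(KEYWORDS & words)

# Invented transcript chunks, standing in for a live voice-to-text stream.
live_chunks = [
    ("00:01:12", "Can we revisit the pricing before we sign the contract?"),
    ("00:01:30", "Sure, and how does that compare with your current renewal?"),
]

for timestamp, text, tags in tag_transcript(live_chunks):
    if tags:
        print(f"[{timestamp}] tagged {tags}: {text}")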

Video Meeting Analysis

However, I think this is only the tip of the iceberg for online video meeting integrations.

Having been involved in many multi-million pound business deals, I know that every change of intonation, every twitch of someone's face, each glance to a colleague on the other side of the table, or even someone important not turning up to the meeting, can be read as a signal during business conversation and negotiation.

The difference between a good salesperson and an outstanding salesperson is the ability to perceive and comprehend these signals, and use them to their advantage.

Now that all of this is being done digitally online, and much of it is being recorded under the guise of compliance or note-taking, there is a huge amount of video data that can be analysed both during and after the meeting.


How This Will Progress

First we will see more convenience being integrated into these systems, for example using facial recognition to match meeting participants to their LinkedIn profiles. Then we will see emotional awareness technologies come into play to analyse facial responses and non-verbal body language cues: to understand sentiment, to pinpoint supporters and detractors, and to understand what is resonating and what isn't.
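
Face matching of this kind is already within reach of a few lines of code. Here is a minimal sketch using the open-source face_recognition package; the photo files are hypothetical stand-ins, since pulling images from LinkedIn itself would be a separate (and legally fraught) exercise.

import face_recognition

# Hypothetical local copies of participants' profile photos; in the
# scenario above these would come from LinkedIn, which is not queried here.
known_names = ["alice", "bob"]
known_encodings = [
    face_recognition.face_encodings(
        face_recognition.load_image_file(f"profiles/{name}.jpg")
    )[0]  # assumes each profile photo contains exactly one face
    for name in known_names
]

# A single frame grabbed from the video meeting (hypothetical file).
frame = face_recognition.load_image_file("meeting_frame.jpg")

for encoding in face_recognition.face_encodings(frame):
    matches = face_recognition.compare_faces(known_encodings, encoding)
    matched = [name for name, hit in zip(known_names, matches) if hit]
    print("Participant matches:", matched or "unknown")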

Beyond this, we will see attention analysis: for example, who is really paying attention, who is multitasking, and perhaps who is sending messages to other people in the meeting. Perhaps we could even start to see reflections in glasses being analysed to read what's on a participant's screen ... or maybe that's going too far?

I have heard it said many times in the last few months that your choice of background in a video call says a lot about you. I have seen people with carefully constructed bookcase backgrounds, meticulously placed photographs highlighting their achievements, and guitars and hockey sticks hanging nonchalantly behind them. Many others choose simply to blur their background or to display an artificial scene.

What insights could be gained from analysing the background of millions of meetings across hundreds of thousands of companies? Could we create an understanding of how likely a deal is to close, and who is likely to be the one to close it, based on what is on display in the room?
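
As a hint of how such analysis might start, here is a minimal Python sketch that scans a video-call frame for recognisable background objects using a pretrained COCO detector from torchvision. The frame file, object list and confidence threshold are all illustrative assumptions.

import torch
from PIL import Image
from torchvision import models
from torchvision.transforms.functional import to_tensor

# A handful of COCO classes that often appear in home-office backgrounds;
# the pretrained detector knows around 90 categories in total.
INTERESTING = {62: "chair", 63: "couch", 64: "potted plant", 84: "book"}

model = models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# A single frame grabbed from a recorded call (hypothetical file name).
frame = to_tensor(Image.open("meeting_frame.jpg").convert("RGB"))

with torch.no_grad():
    detections = model([frame])[0]

for label, score in zip(detections["labels"], detections["scores"]):
    name = INTERESTING.get(int(label))
    if name and score > 0.7:  # arbitrary confidence threshold
        print(f"Background object: {name} ({score:.0%})")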

Perhaps if we record these, and look for changes and patterns in our participants' chosen backgrounds, we can find other hidden clues about their well-being, their state of mind, and what they are trying to tell us. How could that then be used within negotiations?

Do we see pictures of children? If so, do we assume we can talk about family? Does that help us create greater empathy with our customers? Perhaps we see a pair of skis in the background one month, but not in the next meeting. Would we notice that on our own? Or, through the power of AI and video analysis, do we now have superhuman observation skills that allow us to be creepily specific?

What about when things go wrong? Could it be easily fooled? Could I carefully construct my background, wear specific types of clothes, and repeat certain keywords to elevate my status within my organisation, whether I'm a good salesperson or not?

Natural language processing itself still has a long way to go, and even in dictating this article I had to correct many mistakes. Whilst very useful, my Otter.ai transcriptions are often far from perfect. So even though I believe we are not too far away from having our video meetings analysed in ever-increasing depth and sophistication, given the shortcomings of NLP I think we're in for a bumpy ride with this technology over the next few years.

Your Thoughts

Tell me what you think

Where do you think this is going? How do you feel about a video of you being analysed in this way after a meeting? Would it make you think twice about joining a Zoom meeting? Or would you think again about what is behind you and how that changes over time? Or do you think I'm crazy?


About the Author

Scott


Scott is an Independent Technology Analyst, Content Writer and Connector of interesting people. Scott is a technologist at heart, with a history of technology innovation and marketing leadership roles. As the founder of this website and several other businesses, he is passionate about helping technology companies communicate their relevance and awesomeness in a way that engages and excites everybody. Get in touch with Scott here or connect with him on LinkedIn. Learn Scott's tips for content marketing and download his free template here.