AI and AV: Rise of the machines

Artificial intelligence is generating lots of buzz in other verticals. Tim Kridel explores what AV can learn from those and how vendors such as Avaya and Harman are applying AI.

A decade ago, Steve Jobs introduced the iPhone by explaining why it didn’t include a stylus. “We’re going to use the best pointing device in the world: our fingers,” he said. “We’re born with 10 of them.” 

We’re also born with a voice, which is rapidly emerging as another user interface (UI), including for pro AV systems. That’s largely because of advances in artificial intelligence (AI), which keeps getting better at understanding people no matter how heavy their accent, and even when they use everyday terms instead of industry jargon. That capability is often referred to as “natural-language understanding” or “natural-language processing”.

Another reason is that people’s experiences as consumers set their expectations about what’s possible and preferable at work. The iPhone, for example, introduced a lot of people to concepts such as gesture control and, a few versions later, speech-powered virtual assistants. This familiarity sets the stage for pro AV to tackle challenges such as the bewilderment people feel when they walk into an unfamiliar conference room and try to figure out how to turn on the projector, connect their laptop or lower the blinds.

That’s why in April, Harman partnered with IBM Watson to develop what they call “voice-enabled cognitive rooms” for verticals such as health care and hospitality. The solutions begin shipping this year and include Harman soundbars embedded with IBM Watson’s AI technology, which lets people simply talk to the equipment to get information or get it to do something. 

For example, instead of using a hotel room’s thermostat, guests could say, “Turn up the heat” or “Turn on the air and set it to 20.” Or instead of using the TV remote, they could say, “Turn on CNN.”

“It’s all about making it simple and easy for guests,” says David McKinney, vice president of Harman's Hospitality Customer Solutions unit. “They don’t have to learn the environment. They don’t have to learn a certain terminology. It’s a natural-language type approach to it.”

What’s the business case?

For hotels and other businesses, a big part of AI-powered AV’s appeal is that it helps them save money. For example, convention centres, libraries and other large venues use digital signage to help people find their way around on their own. So they save money because they don’t need as many, or any, staff to help with wayfinding. 

AI has the potential to extend that efficiency to many other areas. Suppose a hotel room has a smart speaker such as an Amazon Echo or Google Home, and it’s connected to multiple departments, including maintenance and housekeeping. Now when guests say, “I need more towels,” or “There’s no hot water,” the system can automatically alert staff, without the need for additional staff at the front desk to field and relay those calls.
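
At its simplest, that kind of request routing is a matter of mapping recognised phrases to departments. Below is a minimal illustrative sketch of the idea; the department names and keyword rules are invented, and a production system would use a trained natural-language-understanding model rather than hard-coded keywords.

```python
# Illustrative sketch: route a transcribed guest request to a hotel
# department using simple keyword rules. Real systems use trained
# natural-language models; this only demonstrates the routing concept.

ROUTING_RULES = {
    "housekeeping": ["towels", "sheets", "pillow"],
    "maintenance": ["hot water", "heating", "broken", "leak"],
    "front desk": ["checkout", "bill", "key"],
}

def route_request(utterance: str) -> str:
    text = utterance.lower()
    for department, keywords in ROUTING_RULES.items():
        if any(keyword in text for keyword in keywords):
            return department
    return "front desk"  # fall back to a human when no rule matches

print(route_request("I need more towels"))    # housekeeping
print(route_request("There's no hot water"))  # maintenance
```

The fallback to the front desk mirrors the point made later in the article: when automation cannot handle a request, a human should take over.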

Signagelive facial recognition example

That scenario is also an example of one way AV firms can add value: by identifying tasks that can be automated. For instance, an AV consultant could analyse the front desk’s inbound calls to determine the 25 most common guest inquiries and then develop an AI solution capable of fielding and routing them without staff involvement. If that analysis also shows how many personnel hours would be saved, it would help justify the project’s budget.
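
That kind of analysis can start very simply: tally historical call records by category and see which inquiries top the list. A sketch, assuming the calls have already been tagged with an inquiry type (the categories here are hypothetical):

```python
from collections import Counter

# Hypothetical call log: each entry is the inquiry category an agent
# assigned to a front-desk call.
call_log = [
    "wifi password", "late checkout", "extra towels", "wifi password",
    "restaurant hours", "wifi password", "late checkout", "extra towels",
]

# The most frequent inquiries are the best candidates for automation.
top_inquiries = Counter(call_log).most_common(3)
print(top_inquiries)
```

The same tallies double as the budget justification: multiplying each category's count by an average handling time gives the personnel hours an automated assistant could save.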

That type of analysis also highlights how AI ties in with another buzzword: big data. Suppose the AI is embedded in wayfinding digital signage, and people keep asking about the same half-dozen places. That could point to a need to update the signage content to anticipate those questions so people no longer feel a need to ask them. 

It also could identify business opportunities. For example, if hotel guests often ask their room’s smart speaker whether there’s an Italian restaurant nearby, maybe it’s time to add one to the property.

“There’s a lot of data that comes out of how these systems are used, [such as] what sorts of commands are coming in,” McKinney says.

Another business driver involves brand reputation. For example, the AI system could be programmed to recognise words that indicate a person’s emotion. If it’s negative, the system could alert a staff member to resolve the problem, all without the person asking for help, and avoiding experiences that lead to negative reviews and lost business.

“If someone says turn on the expletive lights, then you can tell they’re not having the best experience,” McKinney says. “The system could have the customer service team call the guest and make sure they can solve those issues.”
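
A crude version of that emotion check can be done with a word list, though commercial systems such as Watson use trained sentiment models. An illustrative sketch, with an invented word list:

```python
# Illustrative sketch: flag utterances that suggest a frustrated guest
# so staff can be alerted. Real sentiment analysis uses trained models,
# not a hand-written word list.

NEGATIVE_WORDS = {"damn", "broken", "terrible", "useless", "awful"}

def needs_attention(utterance: str) -> bool:
    words = set(utterance.lower().replace("!", "").split())
    return bool(words & NEGATIVE_WORDS)  # any negative word present?

if needs_attention("Turn on the damn lights!"):
    print("Alert customer service team")
```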

At your service

In the enterprise, AI also could help integrators, vendors and their clients provide better user experiences while lowering support costs. For example, in October, Avaya announced the A.I.Connect initiative to develop AI solutions for applications such as contact centres and unified communications. An enterprise AV/IT help desk is a contact centre, so it’s worth looking at how AI use cases from other contact centre applications could be adapted. 

One A.I.Connect partner is Nuance, which specialises in natural language understanding for applications such as interactive voice response (IVR) phone systems and virtual assistants. Nuance has discussed how an AI platform could ingest all of the manuals, FAQs and other collateral for a product and use that to power a virtual assistant to support that product.

In the AV world, one example is an enterprise help desk where the AI-powered virtual assistant fields questions about how to connect a laptop to a projector. Another possibility is an AV integrator or vendor that uses a virtual assistant to support its products. 

In consumer-facing contact centres, virtual assistants can often handle up to 80% of inquiries. So if AV vendors, integrators or their customers can achieve comparable automation, it would free up AV/IT staff to focus on other tasks.

“The variety of use cases for applying AI to improving customer experiences is somewhat staggering,” says Eric Rossman, Avaya vice president, alliances and partnerships. “Companies that are focused on implementing digital channels are looking heavily towards expert systems-based chatbots and virtual assistants, which rely upon semantic analysis, natural language speech recognition and rule-based pattern matching capabilities.” 

Remember the example of a hotel guest who swears about turning on the lights? The AI identified that frustration using sentiment analysis, which a help desk virtual assistant could use to determine that it’s time to transfer the call to a human.

“Sentiment analysis offers insight to the effectiveness of the customer interaction based on speech patterns, timing, volume and key words and phrases used,” Rossman says. “[It has] reached a level of sophistication thanks to AI techniques, providing real-time feedback to the agent as to how receptive a caller may be to the overall tone and tenor of the conversation.”

Of course, virtual assistants won’t be able to handle every inquiry, especially technically complex ones. In those cases, AI still could play a role by taking over some of the work that help desk staff typically do during a call. One possibility is listening on the call for certain keywords, such as product names.
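
In its simplest form, that keyword spotting is a lookup from product names heard on the call to relevant support material. A sketch of the idea; the product names and knowledge-base links below are invented for illustration, not any vendor's actual catalogue.

```python
# Illustrative sketch of 'agent augmentation': scan a live-call
# transcript for known product names and surface related knowledge-base
# articles to the agent. Products and article paths are hypothetical.

KNOWLEDGE_BASE = {
    "soundbar x100": ["kb/x100-setup", "kb/x100-firmware"],
    "projector p40": ["kb/p40-hdmi-troubleshooting"],
}

def suggest_articles(transcript: str) -> list:
    text = transcript.lower()
    suggestions = []
    for product, articles in KNOWLEDGE_BASE.items():
        if product in text:
            suggestions.extend(articles)
    return suggestions

print(suggest_articles("The Projector P40 won't detect my laptop"))
# ['kb/p40-hdmi-troubleshooting']
```

The point Rossman makes is that the same lookup can be fed by mining existing help desk tickets and training material, so the suggestions improve as the knowledge base grows.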

“Being able to proactively place guidance and related resources in the hands of the agent without them having to manually search knowledge bases and other internal sources for those materials only makes the customer interactions go smoother and flow more naturally,” Rossman says. “This ‘agent augmentation’ capability can easily leverage AI-enabled applications that data mine the wealth of knowledge bases, help desk tickets, even internal video training and recorded webinars that a company may have, learning to identify common themes and recurring answers that can form ready-made results for both automated and human-assisted interactions.” 

AV also could adapt voice biometrics, which some contact centres and virtual assistants use to authenticate users so they don’t have to remember a PIN or password. One possibility is a conference room where the AI identifies each presenter by voice and automatically downloads their content from the cloud to the projector or display. That would alleviate the common frustration of trying to figure out an unfamiliar AV system.
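
Once the biometric engine has identified the speaker, the AV side is essentially a lookup from identity to content location. A sketch with the speaker-identification step stubbed out and the presenter-to-content mapping invented:

```python
# Illustrative sketch: after a voice-biometrics engine identifies the
# presenter, fetch their slides for the room display. The identification
# step is a stub; the content mapping and URLs are hypothetical.
from typing import Optional

PRESENTER_CONTENT = {
    "alice": "cloud://decks/alice/q3-review.pptx",
    "bob": "cloud://decks/bob/product-roadmap.pptx",
}

def identify_speaker(audio_sample: bytes) -> str:
    # Stub: a real system would call a voice-biometrics API here.
    return "alice"

def content_for_room(audio_sample: bytes) -> Optional[str]:
    speaker = identify_speaker(audio_sample)
    return PRESENTER_CONTENT.get(speaker)

print(content_for_room(b"..."))  # cloud://decks/alice/q3-review.pptx
```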

New skills required? 

Some of these scenarios might seem a bit outside of pro AV’s traditional wheelhouse. But so are energy efficiency, digital signage content creation and the Internet of Things, which are just three examples of areas that some AV firms have expanded into.

Selling and supporting new technologies usually means AV pros need to add new skills, such as iOS expertise when the iPad emerged as a touchpanel alternative and content source. How many new skills depends on how deep they want to get into a new market and what their vendors offer. AI is no exception.

For instance, integrators selling Harman-IBM Watson solutions wouldn’t have to hire, say, speech scientists to design and support voice-powered systems. Instead, they can focus on installing mics and loudspeakers.

“We’ve built software applications to enable people to install and make it easy to set up a mass deployment,” McKinney says. “[For] integrators doing those sort of control systems already, a lot of their skill sets can [be] ported into that.”

AI also could give integrators and end users new ways to maximise the effectiveness and RoI of traditional AV systems. In retail, for example, AI could analyse camera feeds to determine how certain demographics react to certain content on digital signage. 

“We are seeing a growing number of retailers either adding or looking to add audience measurement technologies to serve two functions,” says Jason Cremins, founder and CEO of Signagelive, which is working with AdMobilize on AI analytics. “The first is to collect viewer data that can be analysed against the proof of play (media logs) and proof of display (device status data) that we collect and report within our platform. Adding proof of view completes the dataset, allowing them to [apply] POS sales data and other internal and external metrics (e.g., weather) to provide a deep insight into the impact of their digital signage network and content strategy.
“The second use case is using the data gathered to dynamically shape and schedule the media playing on the digital signage displays. In this scenario, the scheduled content is adjusted at the point of playback to optimise the content shown based on the insights gathered.” 
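
The second use case reduces to a scheduling rule: given the audience measurements at playback time, choose the content most likely to resonate. An illustrative sketch, with invented demographic categories and playlist names; real platforms such as Signagelive's would combine many more signals.

```python
# Illustrative sketch: pick a digital signage playlist at the point of
# playback based on the audience the camera analytics currently detect.
# Demographic categories and playlist names are invented.

PLAYLISTS = {
    "young_adult": "sportswear_promo",
    "family": "toy_department_promo",
    "default": "brand_loop",
}

def choose_playlist(audience_counts: dict) -> str:
    if not audience_counts:
        return PLAYLISTS["default"]  # nobody watching: run the brand loop
    dominant = max(audience_counts, key=audience_counts.get)
    return PLAYLISTS.get(dominant, PLAYLISTS["default"])

print(choose_playlist({"young_adult": 5, "family": 2}))  # sportswear_promo
print(choose_playlist({}))                               # brand_loop
```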

For retailers and other businesses that use digital signage, one longstanding challenge is quantifying the reach and effectiveness of both the displays’ locations and the content on them. AI enables them to get deeper, actionable insights that wouldn’t be practical or possible if humans did that analysis.

Faster analysis also means businesses can react faster.

“One thing we tell all of our partners is to initially correlate the data to the brief or RFP that will drive the investment in digital signage in the first place,” says Mike Neel, AdMobilize global head of marketing/sales. “We often find that the data that can be provided greatly improves the KPIs associated with the investment in digital signage. 

“Compelling content is by and large dictated by the old adage ‘right place, right time.’ Real-time data helps facilitate that.”
Watson IoT headquarters in Munich
