Japan telecoms giant creates AI filter that calms angry customer voices in call centres

Source: South China Morning Post

Survey shows nearly half of service staff have faced abusive language and threats; SoftBank, working with Tokyo University, plans to start using the technology next year

We all know the frustration of not getting our queries dealt with by customer services, or being kept waiting too long.

Some people take it out on the person dealing with the issue. Imagine being that person.

Now Japanese telecoms giant SoftBank has created an artificial intelligence (AI) filter that masks angry customers’ voices with a softer tone, to ease pressure on staff.

In Japan, customer harassment, or kasu-hara, has increasingly become a problem in the workplace alongside power harassment and sexual harassment.

According to a 2024 survey conducted by Japan’s biggest union, UA Zensen, of about 30,000 staff working in service and other sectors, 46.8 per cent said they had experienced customer anger or intimidation in the past two years.

Incidents included abusive language, repetitive complaints, threats and unreasonable demands for apologies.

SoftBank has been working with Tokyo University on an AI filter that identifies angry customers’ voices and softens them into a less aggressive tone.
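
SoftBank has not disclosed how the filter works internally; the report only describes two steps, detecting anger in the caller’s voice and re-rendering it in a calmer tone. The sketch below is purely illustrative and is not SoftBank’s method: the function names, the threshold and the crude loudness heuristic are assumptions standing in for the trained speech-emotion-recognition and voice-conversion models a real system would need.

```python
import numpy as np

def detect_anger(frame: np.ndarray) -> float:
    """Return an anger score in [0, 1] for a short audio frame.

    Placeholder heuristic: louder speech scores higher. A real system
    would use a trained speech-emotion-recognition model instead.
    """
    loudness = float(np.sqrt(np.mean(frame ** 2)))  # RMS energy
    return min(1.0, loudness * 10.0)

def soften_voice(frame: np.ndarray) -> np.ndarray:
    """Placeholder 'softening': compress amplitude peaks.

    The actual product reportedly changes the voice's timbre and
    intonation, which would require a voice-conversion model rather
    than simple gain control.
    """
    return np.tanh(frame) * 0.6

def process_stream(frames, threshold: float = 0.7):
    """Pass audio frames through unchanged unless they sound angry."""
    for frame in frames:
        if detect_anger(frame) > threshold:
            yield soften_voice(frame)
        else:
            yield frame
```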

SoftBank unveiled the new technology on April 15.

In the product’s demonstration video, a male customer’s angry voice was transformed into one that a Japanese news anchor described as sounding like “an anime dubbing artist”.

The technology is expected to reduce the toll on customer service staff’s mental health and help keep them in their jobs.

In Japan, it is traditionally seen as a virtue to be servile to superiors and customers at work.

However, the situation has gradually improved in recent years.

In 2022, Japan’s Ministry of Health, Labour and Welfare published a manual urging companies to tackle customer harassment.

Some service providers, such as ANA Holdings, the parent of All Nippon Airways, and West Japan Railway, had already unveiled policies on customer harassment.

West Japan Railway told staff they could stop selling products or providing services to customers who verbally or physically abuse them.

Lawyers could also become involved to help employees take legal action against customers.

SoftBank is likely to begin using its AI filter in 2025.

The technology has received widespread support online.

“It is really good to have such technology. However, people should learn to control their temper when talking to customer service staff,” one person said on YouTube.

“It would also be good if AI altered the staff’s voices to sound like an intimidating gangster,” another joked.

A third person said a filter was unnecessary: “The AI should just cut off the call when recognising an angry voice.”
