Voice Technology for HR Arrives

Jun 7, 2019

Amazon Alexa has been a stunning success with 100 million devices sold, so it should be no surprise that this kind of voice-driven technology has made its way into HR tech.

In essence, Alexa-like technology allows employees to issue commands by voice instead of a keyboard. In the old world, if an employee wanted to know when their next shift was, they would have to log in and navigate through menus to find out. With this new technology, they can just ask their smartphone, “When is my next shift?”

You can see an example of how Ceridian Dayforce has implemented this using Google’s natural language processing engine in this one-minute video:

https://youtu.be/vurHzZZQEuU

How the technology works

You can think of this voice technology as having three layers:

  1. Transcribing speech to text — Speech recognition can take speech, which to the computer is just a bunch of sounds, and reliably transcribe it to text. Just a few years ago it was impossible to do this well; now it’s commonplace and robust enough for many consumer and business applications.
  2. Understanding the intent — In traditional computing you have to give commands in an exact, predefined way. Natural language processing (NLP) engines are now smart enough, quite often, to understand the intent behind a command. An employee can say “When is my next shift?” or “Tell me about my shifts” or “Do I have an upcoming shift?” and the NLP engine will understand that all those variations carry the same intent and hence should execute the same command (see the sketch after this list).
     While leading NLP engines like those from Amazon and Google do a remarkable job of understanding everyday language, they work even better when optimized for the domain. So vendors like Ceridian need engineers who can ensure the system is good at recognizing HR-specific commands such as “I’d like my overtime report.”
  3. Executing a particular software command — A software vendor decides which commands can be executed by voice and builds an interface for them. The system can only execute voice commands for which an interface has been built. (This interface is called a “skill” in the Alexa ecosystem.) An employee may say something simple that the natural language processing understands, like “print this,” but unless the vendor has built the print command into the interface, nothing will happen.
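
To make the second and third layers concrete, here is a minimal sketch in Python. The keyword matcher is a toy stand-in for a real NLP engine, and every name in it (match_intent, SKILLS, get_next_shift) is hypothetical rather than anything from Ceridian’s or Amazon’s actual code.

```python
from typing import Callable, Dict, List, Optional

# Layer 2 (toy version): map many phrasings to a single intent.
# A real NLP engine uses trained models, not keyword matching.
INTENT_KEYWORDS: Dict[str, List[str]] = {
    "GetNextShift": ["next shift", "upcoming shift", "my shifts"],
    "GetOvertimeReport": ["overtime report"],
}

def match_intent(utterance: str) -> Optional[str]:
    """Return the intent whose keywords appear in the utterance, if any."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return intent
    return None

# Layer 3: the vendor-built interface ("skills"). Only intents registered
# here can do anything; an understood phrase with no handler goes nowhere.
def get_next_shift() -> str:
    return "Your next shift starts Monday at 9 a.m."  # would query the HR system

SKILLS: Dict[str, Callable[[], str]] = {
    "GetNextShift": get_next_shift,
    # No handler registered for "print this", so saying it has no effect.
}

def handle(utterance: str) -> str:
    intent = match_intent(utterance)
    handler = SKILLS.get(intent) if intent else None
    return handler() if handler else "Sorry, I can't do that yet."

# All three phrasings resolve to the same intent, and hence the same command:
for phrase in ["When is my next shift?",
               "Tell me about my shifts",
               "Do I have an upcoming shift?"]:
    print(handle(phrase))
```

The dispatch table is the key point: voice coverage grows one registered command at a time, which is exactly why a system can understand a phrase perfectly and still do nothing with it.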

The first two layers, which are really the amazing parts, are software tools anyone can go out and buy. As new and as stunning as this technology is, it’s already becoming relatively straightforward to implement.
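
For a sense of what buying the first layer looks like in practice, here is a short sketch that transcribes an audio file with Google’s Cloud Speech-to-Text Python client, one off-the-shelf option among several; the file name and audio settings are illustrative assumptions, not values from the article.

```python
# pip install google-cloud-speech  (Google Cloud credentials must be configured)
from google.cloud import speech

client = speech.SpeechClient()

# Illustrative input: a short recording of an employee's question.
with open("employee_question.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    print(result.alternatives[0].transcript)  # e.g. "when is my next shift"
```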

Limitations

The main problem with voice-driven commands is that it’s hard to know what the system can and cannot do. Many of us have experienced the initial excitement of talking to Apple’s Siri, only to be disappointed by its limitations.

With a menu system you can see the available commands; with voice-driven commands you have to be told the options or make a good guess.

Luckily for businesses, employees are rapidly gaining experience in working around the limitations of these systems through the products they use at home, such as Alexa, Google Home, Apple HomePod, Microsoft’s Cortana, and Samsung’s Bixby.

The future

We can expect the ability to control HR software by voice to take off in the years ahead. One signal of how fast things are moving is the growth of Alexa skills: over the course of 2018, the count went from 26,000 to 57,000. Furthermore, the natural language modules are getting smarter all the time, which will make them ever more useful.

For those of us who remember the days before self-service, and those of us who are still living without mobile access to our HR systems, the rapid onset of voice-driven command systems is a reminder of how fast the world of HR tech is changing. It’s a warning that we need to change our approach to technology so that we can keep up.