Back to school with Microsoft Power Virtual Agents
 

Recently we’ve been doing a lot of work with Microsoft’s Power Virtual Agents (PVA) for a client who already has Office 365 and Azure. This has been a significant learning curve, not because it’s difficult to get started with PVA (it isn’t) but because it’s such a new product that there are a few shortcomings. Some things that should be easy and obvious take quite a bit of working around, and there is some effort involved in creating supporting “Actions” in the sister product, Power Automate.

Luckily, Microsoft has published a full roadmap for PVA, and it looks like all the things we wish were here now have at least been announced as coming some time down the track. Contrast this with Google’s approach to their similar product, Dialogflow, which seems to be “You’ll get what we want, when we want it”. Google does at least now have a release plan, though that is nowhere near as much of a commitment as a roadmap.

When it comes to enterprise clients, the Microsoft approach is much easier to digest - buying into a product that has a planned future. The downside is that the pricing model is very different: Dialogflow allows extensive free use, while PVA has a free 30-day trial followed by a US$1,000/month price tag.

If you would like to compare Dialogflow - which we think is an industry heavyweight - with the new kid on the block, download our comparison here.

 
[Image: DialogFlow vs. Power Virtual Agents comparison - May 2020]
doug maloney
Chatbot exercises
 

When you are creating an automated agent - a chatbot - the plan for rolling out the bot to customers needs to be thought through carefully. There is nothing magic about this - it’s just like rolling out any other software system. First you need to be comfortable that the chatbot works from a functional, technical point of view (i.e. customers can access it and it doesn’t actually break down), and then you need several rounds of progressively deeper user testing.

Those rounds of user testing are there to make sure you’re comfortable having this new automated team member talking to your customers.

Internal User Testing

It is normally fairly easy to conduct some testing with a group of staff or internal users, though this phase is often short-lived: the nominated testers bombard the chatbot with imagined scenarios and quickly exhaust its knowledge.

This type of testing can generate some information on the stability of the chatbot, or misses from the initial training, but in our experience few employees really understand how customers interact with their business, so the conversations between tester and chatbot tend to be quite forced.

That being said, this is a useful first step in flushing out any very obvious issues not revealed during technical testing.

Friendly User Testing

If you are the sort of business where each of your customers contacts you a lot, then it might be feasible to identify a group of users who are familiar with your offering and can provide decent feedback on their experience with your new agent.

On the other hand, if you have only occasional contact with each of your customers, it may be difficult to identify some friendly customers, and they may not be able to provide good feedback.

If you have previously run a friendly user trial for something else, then this might be one of the options to consider. If you haven’t done one before, though, a chatbot project - which by its nature can be a bit open-ended - might not be the one to start with.

Segmented A/B Testing

Offering your new chatbot to only one segment, or group, of customers, then connecting those customers randomly with either a human or the chatbot, can work really well: the human-handled conversations give you a baseline to measure the chatbot against.
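As an illustrative sketch only - the segment label, split ratio and function names below are hypothetical, not part of PVA or any particular contact-centre platform - the random routing at the heart of a segmented A/B trial can be as simple as:

```python
import random

# Hypothetical sketch of segmented A/B routing: customers in the trial
# segment are split at random between the chatbot and a human agent,
# while everyone else continues to reach a human as before.
TRIAL_SEGMENT = "existing-retail-customers"  # assumed segment label
CHATBOT_SHARE = 0.5  # fraction of trial customers routed to the bot

def route_conversation(customer_segment: str) -> str:
    """Return which channel should handle this conversation."""
    if customer_segment != TRIAL_SEGMENT:
        return "human"      # everyone outside the trial segment
    if random.random() < CHATBOT_SHARE:
        return "chatbot"    # randomly chosen trial customers
    return "human"          # control group within the trial segment

# Usage: tally how the split looks over a batch of trial conversations.
counts = {"chatbot": 0, "human": 0}
for _ in range(1000):
    counts[route_conversation(TRIAL_SEGMENT)] += 1
print(counts)
```

The control group - trial-segment customers who still reach a human - is what makes the comparison meaningful, since both groups ask the same kinds of questions.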

 
A chatbot is a new member of the team
 

When you’re considering automating some or all of your interactions with customers, we recommend thinking about how a newly introduced automated agent or chatbot works in with your existing - human - team as well as with your customers.

  1. Start with “Novice” mode - your automated agent has a more experienced team member as a buddy, and listens in on, or reads, a lot of conversations on the side while the buddy deals with the customer conversation. When your trainee agent is sure they’ve understood a question and know the answer, they let their buddy know and take on board any corrective guidance and direction.

  2. Move on to “Trainee” mode - the automated agent talks directly to your customers, handing off to their buddy with plenty of information on how to take over when they get stuck or when the customer asks for human help.

  3. Finally you can go into “Co-worker” mode - the automated agent handles all customer questions by itself, being well trained enough to handle most situations and to know when to escalate to a manager if their customer asks for it.
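The three stages above can be sketched as a simple state model. This is a hypothetical illustration of the idea - PVA doesn’t expose modes under these names, and the function here is ours, not part of any product:

```python
from enum import Enum

class AgentMode(Enum):
    NOVICE = 1     # bot only listens; the human buddy answers
    TRAINEE = 2    # bot answers, hands off to the buddy when stuck
    COWORKER = 3   # bot answers alone, escalating only on request

def who_replies(mode: AgentMode, bot_is_confident: bool,
                customer_asked_for_human: bool) -> str:
    """Decide who answers the next customer message in each mode."""
    if mode is AgentMode.NOVICE:
        return "human"  # bot observes and proposes answers privately
    if customer_asked_for_human:
        return "human"  # always honour an explicit request
    if mode is AgentMode.TRAINEE and not bot_is_confident:
        return "human"  # trainee hands off when it gets stuck
    return "chatbot"

# A trainee that isn't confident hands the conversation to the buddy.
print(who_replies(AgentMode.TRAINEE, bot_is_confident=False,
                  customer_asked_for_human=False))
```

The key design point is that an explicit request for a human always wins, in every mode - the escalation path never disappears as the bot graduates.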

The behaviour of the chatbot needs to be carefully monitored and measured at each stage, to ensure it understands the intent of the customer and is receiving the right direction and training.

Just like a real team member.

 
What do we know? A series of brief posts on chatbot lessons learned
 

A series of fairly brief posts about things we’ve learned while designing, building and managing chatbots, automated conversations and virtual agents …

We will cover the following topics, adding more as we come across new snippets of information.

  • Introducing a chatbot as a new team member

  • Chatbot, Virtual Agent, Automated Assistant - what do you call it?

  • Chatbot personas - cute artifice or figurative realism

  • Chatbot exercises

  • Off-piste - what to do when your customers go off the beaten track

  • Why do customers ignore the prompts?

  • 50% will understand. 50% won’t

 