From data integration to ban lists, what are the frequently asked questions when implementing a feedback program?
Absolutely. Our platform is very integration-friendly and can work with almost anything; we already have data feeds running for other clients. So if you need to integrate with other technologies, existing tools, your CRM or even a data lake, we can.
We offer manual file delivery options that some clients use, especially during a pilot or initial deployment, to get a feedback program going with little or no involvement from internal IT resources.
Automated delivery is important to put in place as soon as you can, because it keeps the process robust and the data fresh.
Other forms of optional integration can add even more value to your Cx program and we discuss those below.
The following optional integrations can be relevant and beneficial to your program if your other platforms (such as your CRM) are integration-friendly.
Sometimes events that happen in your CRM (e.g. a new customer), or that your CRM is notified about (e.g. a call center interaction or a product purchase), are things that you want to trigger a feedback request. To trigger feedback requests you need to send us data (so we know what the trigger is, who to contact, and so on). Sometimes your CRM is the best system to send us that data. If it is, or if it is feeding data to the system that does send to us, it is very helpful to include a customer reference from the CRM in the data sent with the trigger, and ideally a link to that customer record in the CRM. This enables the next type of integration: customer links to your CRM from Cx reporting.
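As a rough illustration, the trigger data described above often travels as a small structured payload. The sketch below assumes a JSON-style payload; every field name and value here is hypothetical, not BigEars' actual trigger API.

```python
# Illustrative only: field names and values below are hypothetical,
# not BigEars' actual trigger API.
def build_trigger_payload(event_type, contact_email, crm_customer_id, crm_link):
    """Assemble the data a CRM event might send to trigger a feedback request."""
    return {
        "trigger": event_type,                # e.g. "new_customer", "support_call"
        "contact": contact_email,             # who to invite for feedback
        "crm_customer_ref": crm_customer_id,  # enables cross-linking in Cx reporting
        "crm_customer_url": crm_link,         # ideally a deep link to the CRM record
    }

payload = build_trigger_payload(
    "support_call",
    "pat@example.com",
    "CRM-00123",
    "https://crm.example.com/customers/CRM-00123",  # hypothetical CRM URL
)
```

Including the `crm_customer_ref` and `crm_customer_url` fields is what makes the click-through from Cx reporting back to the CRM possible.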
If your trigger data includes a CRM customer reference, or, better still, a link to the specific customer record in your CRM system, that information can be included in our Cx tools, such as email alerts, and in Cx reporting, allowing those with permission to click through from a piece of feedback directly to the customer it came from in your CRM.
If your CRM provides APIs or other automated means to augment a customer record, we can insert links to customer feedback into the CRM record, so anyone viewing that customer (if they have permission) can click straight through to their feedback. Useful information to include alongside the feedback links is the date and time of the feedback and whether it was positive or negative. Generally it is good practice to use links rather than full feedback detail in the CRM itself, as this minimises CRM integration and makes the integration more robust over time (e.g. robust to changes in the feedback program).
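A minimal sketch of the link record described above, assuming a generic "attach a note/link" CRM API. The URL, the field names, and the positive/negative threshold are all assumptions for illustration.

```python
from datetime import datetime, timezone

# Hypothetical sketch: your CRM's augmentation API will differ.
def feedback_note(feedback_id, score, collected_at):
    """Build the minimal record to attach to a CRM customer:
    a link to the feedback, its timestamp, and a positive/negative flag."""
    return {
        # Illustrative URL pattern, not a real BigEars endpoint
        "url": f"https://cx.example.com/feedback/{feedback_id}",
        "collected_at": collected_at.isoformat(),
        # Treating 7+ on a 0-9 scale as positive is an assumption
        "sentiment": "positive" if score >= 7 else "negative",
    }

note = feedback_note("F-001", 8, datetime(2024, 5, 1, tzinfo=timezone.utc))
```

Keeping the record to a link plus two small facts, rather than the full feedback text, is what keeps the CRM side of the integration stable when the feedback program itself changes.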
If your organisation supports single sign-on using Auth0, the cross-linking integrations described above can be made even more seamless: a user logged into your CRM would not even need to log in again to access the Cx information, and vice versa.
If you have other feedback programs running and you want to coordinate the stand-down (ban) period, so customers don’t get feedback requests close together, then we can both accept and share ban lists.
If your company has a centralised data lake for cross-department reporting, we can export the feedback data we collect for you into that lake, ready to use. This allows your internal reporting to combine feedback with your other, often quite sensitive internal data (which you typically don't want to make available to your Cx provider). For example:
What is the revenue/profit value of detractors who have cancelled after giving feedback on a bad experience?
What is the revenue/profit value of customers that you rescued and retained by capturing and following up on their feedback?
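Questions like these are answered by joining the exported feedback with internal revenue data inside your own data lake or BI environment. The toy sketch below uses plain Python structures; the field names are hypothetical, and treating scores of 0-6 as detractors is an assumption layered on the 0-9 scale this guide recommends.

```python
# Illustrative sketch: combining exported feedback with internal revenue
# data in your own environment. All field names are hypothetical.
feedback = [
    {"customer_id": "C1", "score": 2, "cancelled": True},
    {"customer_id": "C2", "score": 1, "cancelled": False},
    {"customer_id": "C3", "score": 9, "cancelled": False},
]
annual_revenue = {"C1": 1200.0, "C2": 800.0, "C3": 1500.0}  # sensitive internal data

# "Detractor" here means a score of 0-6; this threshold is an assumption.
lost_revenue = sum(
    annual_revenue[f["customer_id"]]
    for f in feedback
    if f["score"] <= 6 and f["cancelled"]
)
# lost_revenue is the value of detractors who went on to cancel
```

The same join, run the other way (retained customers whose complaints were followed up), answers the second question about rescued revenue.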
If you don’t have an existing Business Intelligence platform that can answer these questions, we have a valued partner Didici, who are doing fantastic work in this area. We would be delighted to introduce you to our friends there.
No matter how you want to ask the "why", we can assist. With a global base of clients, BigEars has developed the ability to reach any audience, in any language.
At BigEars we have developed alerts to ensure that nothing important is missed. Customers who leave a very low score can request a call back and an alert is sent to the person who is responsible for rescuing unhappy customers so this opportunity is not missed. We also generate alerts for any low scores – so your team can follow up where necessary. And alerts are also generated for exceptionally high scores so that the staff members who are doing a great job can be recognised and congratulated.
At BigEars we strongly recommend using open-ended questions, allowing the customer to volunteer extra, unexpected details. You want your customers to focus on what matters to them, not what you think should matter to them. A good example is the following question:
Were you greeted at the door when you entered the store?
Protocol might dictate that staff greet every customer when they walk through the door, but this may not be important (or even welcome) to the customer. The customer might feel that the staff were really nice and warm and friendly – but might not remember if he was immediately greeted when he walked through the door. So a better question might be:
Were the staff helpful and friendly?
A lot of organisations want their feedback systems to have consistency across all media. But remember, a voice survey is going to give you better and more useful data – if you let it! Don’t blow it by trying to use the same questions and rating scale as on your website. There is no one-size-fits-all in customer surveying – make the best use of every channel you have available to you. Your web-based or paper-based survey might be the place to get all that demographic data you wanted – but it won’t give you the unedited, uncensored voice of your customer.
When a company decides to become customer centric, the temptation is to go into customer feedback overdrive and try to collect as much data as possible, as often as possible. This overdrive approach can actually be harmful: it affects the brand, degrades the relationship, and ironically leads to less valuable data. In fact, so many companies are asking for feedback so often that survey response rates have dropped globally over the past decade. The lesson is: don't over-survey your customers, and make it the best possible experience when you do. Surveys should have value. When a customer has a compelling reason, they are happy to take the time to give you considered and insightful feedback, and in turn you have an obligation to use everything you collect really well.
One of the worst things you can do is duplicate questions, forcing the customer to repeat herself. If a customer senses that her time is being wasted repeating information she’s already given, she will become frustrated and resentful, and will feel that you aren’t taking the survey seriously. If you haven’t taken the time to get the questions right, why should she waste her time taking part?
So, a strong edit is necessary when designing your survey. Too often the tough decisions aren’t made at the time of survey-design and what should be five questions becomes twelve or thirteen. Be prepared to give up some of the things you’d like to find out – such as whether or not your staff are greeting every arrival at the door – to achieve your primary goal, which is authentic customer insight.
Ironically, we find that a lot of customer experience feedback programmes are not, in themselves, a good customer experience. Most customers want to give feedback because they want you to improve, but remember that you are asking them to give up their time to help your business. It’s important to make this as easy and painless as possible. Let your customers give feedback on their terms. That means letting them do it at the right time – the time that suits them – and letting them expand freely on their ideas. Think about it from their point of view – they’re ringing to tell you something, let them get on with it!
A good example is demographic questions – it’s great to know the gender and age of the caller, but the customer is not at all interested in that. These questions are a roadblock to the customer and prevent him from saying what’s on his mind. So our recommendation is that if you want to include demographic questions, include them in a pilot or for a short period of time so that you can get a helpful measure – and then turn them off.
Before offering a customer the opportunity to leave feedback we scrub the data against our ban lists. This ensures that any customers you have tagged as not to be contacted will not receive an invite. Any customers who have opted out or unsubscribed will not receive an invite and we manage things so that you can specify the stand down period between a customer giving feedback and then being asked to give feedback again (this can be customised for different companies and different customer groups).
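The pre-invite scrub described above can be sketched as a simple eligibility check. This is an illustrative outline only; the function name, data shapes, and 90-day default stand-down are assumptions, and the real stand-down period is configurable per company and customer group.

```python
from datetime import date, timedelta

# Hypothetical sketch of the pre-invite scrub: not BigEars' actual code.
def eligible_for_invite(customer_id, banned, opted_out, last_feedback, today,
                        stand_down_days=90):
    """Return True only if the customer is not banned, has not opted out,
    and is outside the stand-down window since their last feedback."""
    if customer_id in banned or customer_id in opted_out:
        return False  # tagged "do not contact" or unsubscribed
    last = last_feedback.get(customer_id)
    if last is not None and (today - last) < timedelta(days=stand_down_days):
        return False  # gave feedback too recently: stand down
    return True
```

Shared ban lists from other feedback programs simply feed into the `banned` set, which is how invitations across programs are kept from landing close together.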
We always recommend using a zero-to-nine scale, for a number of reasons.
Anchoring the lower end of your scale at zero avoids confusion about which end of the scale is good. For example, people with a good impression might give you a number one score, feeling that you’re number one! But nobody feels that zero is best. This is particularly important with a voice survey because you lack the visual dimension where by convention the left-hand side is bad and the right-hand side is good.
The reason we recommend a top score of nine and not ten is that there’s no number ten on the phone. This means that to give a top score of ten the customer has to do something slightly different than with all the other scores, which is never a good thing. Furthermore, when a customer enters a one you are forced to wait to see if they follow that up with a zero (making ten). How long do you wait? If you wait too long the customer hears a long dead pause, and if you don’t wait long enough you risk accidentally reversing a potentially very high score (ten) to a very low score (one). In terms of mistakes that can badly corrupt your data, this is one of the worst.
A single-number rating scale is also more reliable on occasions where you have a bad phone line and a key press goes undetected. If a single-digit score is entered and is undetected, our survey will assume nothing has been entered and ask for the rating again. However, if you’ve got a two-digit scale and the zero or the one is missed, this will be entered as a valid answer which is completely wrong.
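The robustness argument above boils down to a simple rule: on a zero-to-nine scale, a key press is either captured whole or missed whole, so anything other than exactly one valid digit can safely trigger a re-prompt. A minimal sketch (the function name is illustrative):

```python
# Sketch of why a 0-9 scale is robust over a phone keypad: there is no
# multi-digit score, so any input that isn't exactly one digit means
# "ask again" rather than "guess". Illustrative only.
def parse_score(key_presses):
    """Return the score for a 0-9 scale, or None to re-prompt the caller."""
    if len(key_presses) != 1 or key_presses[0] not in "0123456789":
        return None  # nothing detected, or noise: re-ask the question
    return int(key_presses[0])
```

Contrast this with a zero-to-ten scale, where a lone "1" is ambiguous: it might be a final answer or the first half of "10", forcing exactly the dead-pause-versus-misread trade-off described above.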
Some people feel that NPS mandates a zero-to-ten score, but it doesn’t. The NPS rating is point-agnostic by its very nature.
There’s never an easy answer to what the response rate is going to be for a new campaign, because it depends on the company’s relationship with its customers. Incentives help, but some companies will struggle to get any feedback because the relationship with the customer is so broken. Other companies who have a good relationship with their customers can get an eighty or ninety percent response rate.
In retail situations where the call to action is a poster or a flier in the store, the attitude of the staff is key in determining the response rate. If your staff understand the value of voice feedback and are supportive of the program, you're much more likely to get a good response rate. If your staff fear the program or are cynical about its goals, then you'll find that you receive few or no responses.
One of the ways we help you to achieve a good response rate is to give you data about which locations have the best customer engagement. This means that you can reinforce across the entire company what certain locations are doing to receive feedback and which locations are consistently getting good feedback. Customer Radio is a great way to convert the cynics among your staff, because once they hear audio clips they’ll want to hear more, and they’ll want to hear clips about their own location. For the staff it becomes an exciting opportunity to impress their employer. They realise that every customer interaction matters – that each and every interaction might get played back to the CEO of the company.
In our experience incentives do improve response rates and are good for programmes, but the key when offering an incentive is to get the balance right. The incentive conveys to the caller that you value their opinion and it can help the customer to overcome the mental barrier that the effort required to give feedback is not worth it. However, if the incentive is too strong you risk attracting feedback that isn’t genuine.
One of the things you can do to make the feedback experience more enjoyable for your customer is to have a senior member of the company, a spokesperson, or a celebrity closely associated with your brand record the voice-over for your survey. This conveys to the customer that her feedback is important to you while maintaining a strong brand presence during the call.
Imagine that the voice your customer hears asking for feedback is that of the company's CEO. Your customer will have a strong sense that someone in the company is going to listen to her feedback and that it's not going to end up gathering dust in a file somewhere.