Communities of practice

Feedback form


#1

How might we make it easy for users to give feedback on part of a digital service?

Do we need to rely on third-party widgets to capture simple comments, queries and scores/ratings for content, or could the Australian Government have its own guidelines and a basic template?

User feedback in government, industry and the private sector is captured in myriad ways, and many of those ways are designed to satisfy only business needs. The kinds of data captured differ too, depending on whatever the service provider cares to measure.

Sometimes feedback is captured through generic contact forms, Net Promoter Scoring, embedded off-brand widgets, surveys/questionnaires, or disruptive (and often inaccessible) modals; sometimes users are forced through many different paths just to leave feedback.

How might feedback mechanisms be:

  • relevant to the scenarios users are in
  • easy and accessible for all users
  • safe to use: no hidden :eyes: nasties or dark patterns at work
  • recognisable - particularly across multiple touchpoints
  • supportive of different government agencies’ feedback and development processes?
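As a rough illustration of what a basic, neutral mechanism like this could validate on submission, here is a minimal sketch. The field names and limits are purely hypothetical, not any official schema:

```python
# Hypothetical sketch of server-side validation for a page-level feedback
# payload. Field names ("rating", "comment", "page_url") and the 500-character
# limit are illustrative assumptions only.

ALLOWED_RATINGS = {"useful", "not-useful"}

def validate_feedback(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    if payload.get("rating") not in ALLOWED_RATINGS:
        errors.append("rating must be 'useful' or 'not-useful'")
    if len(payload.get("comment", "")) > 500:
        errors.append("comment must be 500 characters or fewer")
    if not payload.get("page_url"):
        errors.append("page_url is required for context")
    return errors
```

Keeping the payload this small is one way to stay within the "safe to use" and "relevant to the scenario" criteria above: nothing personal is collected, and the page URL ties the comment to its context.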

#2

Totes agree.
I think there is a strong need for something like this, even if it's just a recommended pattern to start with, because a whole-of-government solution quickly becomes a bigger conversation about shared platforms, data collection, data sovereignty, and all that jazz (Digital Service Standard: No. 4, No. 5, No. 7, No. 11), which is much broader than the Design System.


RE: Net Promoter Score
I believe Net Promoter Score is highly inappropriate for government services.
(I know you weren’t necessarily suggesting it, but just in case people are looking into it.)

How likely is it that you would recommend paying a parking ticket to a friend or colleague?

A more appropriate metric for government services might be the Single Ease Question:

Overall how easy was it to pay your parking ticket?
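For anyone aggregating Single Ease Question responses, a minimal sketch follows. The 7-point scale (1 = very difficult, 7 = very easy) is the standard SEQ format; the function name is my own:

```python
# Hypothetical sketch: aggregating Single Ease Question (SEQ) responses.
# SEQ uses a 7-point scale: 1 = very difficult, 7 = very easy.

def mean_seq(responses: list[int]) -> float:
    """Average SEQ score, silently ignoring out-of-range responses."""
    valid = [r for r in responses if 1 <= r <= 7]
    if not valid:
        raise ValueError("no valid SEQ responses")
    return sum(valid) / len(valid)
```

Because SEQ is asked immediately after a real task attempt, the score reflects an actual experience rather than the imagined scenario NPS asks about.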


#3

Yeah I’m against NPS as it’s not an accurate measure of people’s behaviour; opinions can be made up on the spot when prompted.

There are so many cons to using them; besides, asking a user to imagine and rate some hypothetical task (especially without any previous track record of attempting that task) is wishful thinking and won't give you real data you can use.

I don’t think it’s worth anyone’s time using NPS :smiley:


#4

Here’s something I put together today as a quick wireflow based on what we plan to track and measure re. site satisfaction and use.

*not complete, just a bud of an idea


#5

Nice, @alexandra! I reckon this is pretty clean!

Minor feedback: I know disabled buttons are a bit contentious, as it's frustrating for a user not to know what needs to be done to enable them. That's especially true with the optional fields.


#6

We found the term “helpful” was flavoured by a user’s emotional perception of the service itself rather than the page, in the same way NPS is misleading. We went with “useful” because it didn’t seem to have the same emotional undertones.

We tried free-text feedback, and the trial was stopped due to the extreme frequency of abuse, threats, submission of personal information, and cries for help requiring immediate staff intervention. This could be handled on a small scale, but it does not scale well across millions of customers. As a result we will be using a conditional multiselect as our secondary question to ask why a user feels that way. We are broadly thinking the second question will offer options like:
a) I don't understand the content
b) I disagree with the content
c) It doesn't answer my question
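One hypothetical way to model that conditional multiselect server-side is below. The option identifiers are made up to mirror the a/b/c options above, and the "not-useful" trigger value is an assumption:

```python
# Hypothetical sketch: a follow-up multiselect shown only after a negative
# rating. Identifiers mirror the a/b/c options and are illustrative only.

REASON_OPTIONS = {
    "dont-understand",  # I don't understand the content
    "disagree",         # I disagree with the content
    "not-answered",     # It doesn't answer my question
}

def followup_reasons(rating: str, selected: set[str]) -> set[str]:
    """Accept only predefined reasons; the follow-up applies to negative ratings."""
    if rating != "not-useful":
        return set()  # no second question for positive ratings
    return selected & REASON_OPTIONS
```

Restricting answers to a fixed set is what removes the moderation burden described above: there is simply no free-text channel for abuse or personal information to come through.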

It is also worth noting that asking for someone's email address immediately increases the required form storage security, which disqualifies the information from being stored in govCMS, Google Analytics or any other non-approved third-party cloud platform. In the context of a service delivery agency, and when combined with a free-text field, the email address will also prompt users to expect that their specific service enquiry can be resolved through this form, e.g. "here is my fortnightly earnings, make sure I get paid - joe@blo.gs", and then two weeks later, "I submitted my earnings here but you didn't update my record - joe@blo.gs".


#7

Yes, I prefer 'useful' too, though our user researchers are suggesting 'helpful' is the better approach. We also cannot directly measure usefulness through any means on the site, whereas 'helpful' can indicate a user's inclination to do something with the information.

Something we can't ever control (or wish to control) is a user's emotional state, and this is always a factor to consider when collecting feedback.

Yes, we are very much aware of the scalability concerns with a mechanism of this scope. We do not have the same users or volume as a large-scale public information service, and are developing a system based on an assessment of where we’re headed and what we’re capable of supporting within our existing networks.

Thank you for pointing this out for those who aren’t already aware. Digital Guides are currently doing a tech spike into how we can securely manage data in the cloud (and btw email isn’t required :wink: ) for the particular case of collecting personally-identifiable data.

As this early concept is an iteration on a couple of different feedback mechanisms we've previously developed, we are refining an idea for providing contextual feedback, not setting something up for troubleshooting.