A recent case illustrates the need to think about privacy in both system design and human decision-making. It also shows how keeping user experience (UX) front of mind when designing systems or processes should result in better privacy outcomes too, and maybe help preserve some human dignity along the way.
In DQJ v Secretary, Department of Family and Community Services [2019] NSWCATAD 138, a mix of poor system design and human error caused a distressing disclosure.
DQJ was a homeless woman applying for housing through the Department of Family and Community Services’ online application form. The online form made it mandatory to enter a contact residential address. Thus, even though she had no residential address, in order to lodge her application she had to nominate an address. DQJ therefore listed a previous address. However, she also made clear in the application form that she had no fixed address, and preferred to be contacted by email.
The Terms and Conditions for using the system, which DQJ had to ‘accept’ in order to make her application, said that once an outcome on the housing application had been made, she would receive communication (via email, SMS and/or letter) either shortly after receiving the last required supporting documentation or within two months of lodgement.
The Respondent argued that this constituted DQJ’s ‘consent’ to be contacted at the old address. The Tribunal certainly thought this lent some weight to the Department’s argument.
Personally, I would disagree. Mandatory Terms and Conditions cannot indicate consent, because DQJ had no alternative. She was a homeless woman in need of emergency housing, so if she refused to accept the Ts&Cs she would remain homeless, which is hardly a position from which to offer ‘voluntary’ consent.
Further, the system made mandatory a field which, in the context of applications from potentially homeless people, seems illogical. The system also asked for her preferred contact mechanism, but then did not respect her answer. And importantly, the Ts&Cs said she would be contacted by “email, SMS and/or letter”, which in my view can also be read as meaning that a hard copy letter was not the only available mechanism, especially for someone who had clearly nominated email as her preferred mechanism.
But what actually happened was this. The online system automatically generated a hard copy letter, which was sent to the old address. The letter disclosed that DQJ was homeless and in need of accommodation, and mentioned a health condition relevant to her accommodation needs.
DQJ complained about the letter being sent to the old address. There was no evidence to suggest the letter had been opened by the new occupants, but nor was it returned to the Department, so there was a potential disclosure to whoever received that letter, if they opened and read it.
Although the events that followed were found to be out of scope for this litigation, they are worth understanding from a design and privacy risk management perspective.
After DQJ complained to the Department about it mailing a hard copy letter instead of emailing her as she had requested, the Department conducted an internal review, in which it admitted this conduct was an unauthorised disclosure. The Department apologised and updated the system to show that the old address was no longer in use.
Nonetheless, even after that update, three further letters were sent to the old address. This was described as human error because, unlike the first letter, they were not system-generated. The description of the problem was that:
“the Respondent … end-dated DQJ’s contact address to avoid system generated correspondence being sent and a client specific notification (was) updated within the system advising that DQJ’s is only to receive correspondence by email”.
However, “the notation … had not been read”. The officer responsible for managing the privacy complaint then had to speak to the team leader about staff training, and the agency had to add a pop-up warning to the system.
So we’ve got multiple points of failure here. Poor design of the application form, especially in the context of the Department’s client base. Poor translation of the answers given in the application form about preferred contact mechanism into system-generated outcomes. And then, even when the problem was supposed to have been remedied, poor staff practices meant that notes went unread or were ignored.
In this case, only the disclosure in the first letter was in scope before the Tribunal, along with a separate matter. The complainant had brought her complaint to the Tribunal because she was seeking compensation. Despite the Department admitting in the internal review that this was an unauthorised disclosure, the Tribunal actually found it was authorised, on the basis that DQJ had only expressed her ‘preference’ to be contacted by email, and had not requested that all communication be by email. (There is no mention in the case as to whether this was expressed as an option on the form.) The Tribunal also gave weight to the wording of the Ts&Cs.
Personally, I think that decision puts too much onus on individuals, requiring them to proactively object to something over and above saying “here is how I prefer to be contacted”, when in fact a better system design would have prevented this problem in the first place.
Imagine instead an online housing application form which allows people to say: “I don’t have a fixed address, I am homeless, so please only contact me via this email or this phone number.”
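To make that concrete, here is a minimal sketch of what such a form’s validation logic could look like. The field names and rules are illustrative assumptions on my part, not the Department’s actual system:

```typescript
// Illustrative sketch only: field names and rules are assumptions,
// not the Department's actual form.
interface HousingApplication {
  noFixedAddress: boolean;       // applicant can say "I am homeless"
  residentialAddress?: string;   // only required when noFixedAddress is false
  email?: string;
  phone?: string;
  preferredContact: "email" | "sms" | "letter";
}

function validate(app: HousingApplication): string[] {
  const errors: string[] = [];

  // Don't force a residential address on someone with no fixed address.
  if (!app.noFixedAddress && !app.residentialAddress) {
    errors.push("Please enter a residential address, or tick 'I have no fixed address'.");
  }

  // Whatever channel the applicant nominates must actually be usable.
  if (app.preferredContact === "email" && !app.email) {
    errors.push("Please provide an email address to be contacted by email.");
  }
  if (app.preferredContact === "sms" && !app.phone) {
    errors.push("Please provide a phone number to be contacted by SMS.");
  }
  if (app.preferredContact === "letter" && app.noFixedAddress) {
    errors.push("Letter is not available without a current address; please choose email or SMS.");
  }

  return errors;
}
```

The key design choice is that the address field is only mandatory when the applicant actually has one, and whichever contact channel they nominate must be one the agency can genuinely use.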
So some takeaway lessons here:
- Don’t make data fields mandatory if doing so forces people to give you incorrect data, just to get through a web page.
- Think about UX, or in other words consider your client base when you design points of data collection. For example, if you are in the business of offering emergency housing assistance, do not be surprised that some of your clients are homeless, and therefore will not have a current residential address.
- Offer multiple ways of receiving communications.
- If you are going to let people nominate their preferred contact mechanism, respect their wishes, and only use that mechanism to contact them.
- Make sure that both system-generated and human-generated communications follow the same business rules (see the sketch after this list).
- Ensure that staff are trained in those business rules.
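As foreshadowed above, here is a minimal sketch of how those last three lessons could be enforced in code: a single dispatch function that every communication, whether system-generated or staff-initiated, must pass through. The types and function names are hypothetical, not drawn from the case:

```typescript
// Illustrative sketch: one dispatch function that every communication,
// system-generated or staff-initiated, must pass through.
type Channel = "email" | "sms" | "letter";

interface ClientRecord {
  preferredContact: Channel;
  email?: string;
  phone?: string;
  postalAddress?: string; // absent or end-dated for clients with no fixed address
}

function sendToClient(client: ClientRecord, message: string): void {
  // The business rule lives in one place: always honour the nominated channel.
  switch (client.preferredContact) {
    case "email":
      if (!client.email) throw new Error("No email on file for email-only client.");
      sendEmail(client.email, message);
      break;
    case "sms":
      if (!client.phone) throw new Error("No phone number on file for SMS-only client.");
      sendSms(client.phone, message);
      break;
    case "letter":
      // Refuse to post anything if the address has been end-dated.
      if (!client.postalAddress) throw new Error("No current postal address on file.");
      postLetter(client.postalAddress, message);
      break;
  }
}

// Channel-specific senders are stubbed for the sketch.
function sendEmail(to: string, body: string): void { /* ... */ }
function sendSms(to: string, body: string): void { /* ... */ }
function postLetter(to: string, body: string): void { /* ... */ }
```

Because the nominated channel is enforced in one place, a staff member cannot accidentally post a letter to an end-dated address: the system refuses, rather than relying on someone reading a notation or heeding a pop-up warning.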
This case illustrates the need to upskill staff throughout your organisation to think in terms of ‘privacy by design’ and, indeed, user experience. From the precise way online forms are designed, to the way systems act on the data collected, to the staff who need to stop and think before they use or disclose personal information, at least basic privacy skills, and the ability to stand in the shoes of your customer, are needed at every decision-making point.