Ethics

Image credit: Pixabay

Done correctly, UX design work should benefit both parties involved: the person experiencing it and the person (or business) providing it. But this doesn’t always happen. Businesses ultimately want to drive as many conversions as possible, and this can lead them to act in ways that do produce more revenue but cross some ethical lines.

The ever-increasing extent to which UX work relies on complex data has only exacerbated these risks. Concerns about matters such as personalisation have been around for a long while. You may recall Target’s teen pregnancy fiasco, for instance. However, systems have become vastly more sophisticated in recent years, and some high-profile security flubs have resulted in action being taken to curtail irresponsible use.

In this piece, we’re going to look at why the use of data in UX is concerning, the rise of GDPR and its effect on business policies, and how the future of UX is going to navigate murky ethical waters. Let’s begin.

How the smartphone made UX personal

UX doesn’t really have a ceiling. There’s no conceivable perfect UX upon which you cannot possibly improve. Hence, any technological innovation that comes along provides a fresh opportunity to revise UX standards. This was true with the smartphone. While touchscreen interfaces took some time to become slick enough for use with pocket devices, they changed the game as soon as that hurdle had been overcome.

The standard UX of today is not only heavily inspired by mobile design but is in fact led by it. The advent of mobile-first methodology has seen website and app developers focusing on mobile screens before expanding to suit larger screens. But the influence of smartphone design isn’t only seen in button placement and the like. It can also be seen in the rise of UI customisation and personalisation.

It soon became commonplace for people to carry around browsing devices that uniquely identified them in various ways (their names, their addresses, their locations, their histories, etc.). UX designers then realised that they could use that information to their advantage. They could provide features that used that data to provide better experiences. This in turn makes users happier and more inclined to spend money.

Concerns about personal data

As I see it, it quickly became apparent that there were three major issues with all of this data being so readily accessible:

  • UX designers could get overly familiar and come across as invasive. The Target example was a prime demonstration of how to take UX personalisation too far: there comes a point at which remembering things about users moves past convenience and enters the realm of creepiness.

  • Users didn’t really understand what data was being stored about them. Years back, people might vaguely have recognised that their data was being requested at times, but some assumed it was anonymous, while others likely figured that there were regulations in place and there was nothing to be concerned about.

  • The data collected for UX could be appropriated for other purposes. People would willingly provide large amounts of personal information to brands they trusted. However, that information wouldn’t necessarily be secure: an external party could get hold of it and then use it unethically.

UX designers (and the brands employing them) may well have had good intentions, but there was just too much scope for the information to be exploited for them to be sure that their use of it could be considered unquestionably ethical. (It also bears noting that many of the companies using personal data were likely similarly unworried about the dangers it posed, naively assuming that no one would have the understanding or the inclination to abuse it.)

The introduction of GDPR

Formulated in April 2016 and implemented in May this year, the GDPR (General Data Protection Regulation) was devised to address the various issues with the vastly-expanded use of personal user data. It places significant limits on how much data can be collected, how it must be stored, and what it can be used for. The hope for GDPR in the long term was that it would lay the groundwork for radically-improved procedures.

While it’s much too early to say conclusively if it will prove effective in that role, it cannot be denied that the implementation of GDPR has sent shockwaves across the digital landscape. Panicked businesses met the official implementation date of May 25th with email barrages, pleading for users to provide the knowing consent they now need to conduct their segmented targeting and personalisation, and the significance of personal data has been solidified in the public awareness.

Something tying into this is the notion of legacy responsibility. What happens when you buy a business with an existing database? Suppose that you’re based in London but find a business for sale in Houston that exclusively functions online and comes with a giant spreadsheet of previous customers. Are you responsible for disposing of it? If not, who is?

In the short term, I don’t think it really matters that GDPR is causing a great deal of confusion about when and how it applies. This will hopefully encourage uncertain businesses to err on the side of caution. Just as smartphones set new standards for UX that then carried across to other devices, GDPR may ultimately set a new global standard for UX data storage.

UX, ethics, and transparency

To recap, we’ve thus far looked at how the smartphone made complex personal data available, and how GDPR arrived to regulate its use.

There’s no going back to the days of using devices anonymously. The most important thing the UX industry can do from an ethical standpoint is to operate with as much transparency as can realistically be achieved.

That doesn’t just mean adhering to requirements such as those laid out in GDPR. It also means taking a user-first approach to the design process. It’s perfectly possible for something to be within the letter of the law but not in keeping with the spirit. What really matters is communicating with your users and getting their feedback. The more they know about how their data is being used, the happier they will be. Plus, the more willing they’ll be to provide consent when requested.
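To make the consent-first idea concrete, here is a minimal sketch of what recording and checking per-purpose user consent might look like in code. All of the names here (`ConsentRecord`, `ConsentLog`, the purpose strings) are illustrative assumptions, not part of any specific GDPR library or standard; a real system would also need audit trails, withdrawal handling, and secure storage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One consent decision for one user and one purpose."""
    user_id: str
    purpose: str          # e.g. "personalisation", "email_marketing"
    granted: bool
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ConsentLog:
    """Keeps the most recent consent decision per (user, purpose)."""

    def __init__(self):
        self._records = {}

    def record(self, rec: ConsentRecord):
        self._records[(rec.user_id, rec.purpose)] = rec

    def has_consent(self, user_id: str, purpose: str) -> bool:
        # No record means no consent: default to the privacy-safe answer.
        rec = self._records.get((user_id, purpose))
        return rec is not None and rec.granted

log = ConsentLog()
log.record(ConsentRecord("u42", "personalisation", granted=True))
log.record(ConsentRecord("u42", "email_marketing", granted=False))

print(log.has_consent("u42", "personalisation"))   # True
print(log.has_consent("u42", "email_marketing"))   # False
print(log.has_consent("u99", "personalisation"))   # False: never asked
```

The key design choice, in the spirit of the article, is that the absence of a record is treated as "no consent" rather than "assume yes": personalisation features should check the log and degrade gracefully, not default to using data the user never agreed to share.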

The possibilities for customisation in UX are remarkable, but brands need to be very careful and avoid seeming too casual in using personal data. Brand followers care more than ever about ethical behaviour. We’re in the “age of activism”, as Forbes contends, so be mindful that you don’t allow your brand to fall foul of public opinion.

About the Author:
Kayleigh Alexandra is a content writer for Micro Startups, a site dedicated to giving through growth hacking. Visit the blog for your latest dose of startup, entrepreneur, and charity insights from top experts around the globe. Follow them on Twitter @getmicrostarted.