Technology took a dark turn. Where do we go from here?

Dark patterns, bad habits of technology and UX

Earlier this month, we invited three distinct voices from the Danish tech scene to share their views on the dangers of technology: entrepreneur Imran Rashid, Digital Director & UX specialist Randi Hovmann, and Mark Anthony Fiedel from the Centre for Cyber Security.

As the home of over 350 tech entrepreneurs, we are the first to speak up about all the great things technology brings us. That makes it all the more important to also dare to venture into the dark alleys, and to empower people to navigate the pitfalls and build healthy relationships with technology. Don’t be afraid, be informed, as they say.

Throughout the evening, we approached three aspects of the dangers of tech: 1) Technology as a bad habit, 2) Technology as manipulation of its users, and 3) Technology as an external (and national) threat.

1. Technology as a bad habit

We were joined by Imran Rashid, an experienced entrepreneur and physician, and co-author of “Offline”, a book on how we regain control and become masters of our own digital behavior. On January 1st, Imran is back with “Sunde Vaner” (“Healthy Habits”), a book that explores how we rewire our habits.

Why do we need to talk about habits when it comes to technology? 

Our tech habits, and the way we interact with our devices and the digital services on them, form a fairly new space in our lives. If we don’t become conscious of how we use tech, we risk walking into behavioral labyrinths with our eyes closed, allowing impulses to control our lives.

Would you say that this is a new phenomenon? Have we not witnessed similar worries before, with the emergence of TV, radio, etc.?

On the one hand, you can say that we have adapted our lives to new inventions before. But radio and television did not change our lives the way our digital use does. Today, a new religion is pushed at you by the major tech companies (through their gamification and re-targeting): a fast escape from your life and instant social gratification. All you have to do is swipe. It is problematic how much this logic affects us emotionally, because it is a business model, not a sustainable way of living. For me, it’s an evolutionary U-turn, and Facebook is one of its clearest symbols: marketed as access to knowledge and to other people’s points of view.

That’s what books did when they were first invented. But in this case, the knowledge made available is full of errors.

In your book, you speak about all of us being two humans in one. What does this mean for our relationship to technology and the habits we create around it? 

First, it’s useful to acknowledge that you’re probably not particularly good or bad at controlling your habits. Most people I meet and ask, “Do you use your smartphone too much?”, will answer yes. We live in a culture where we have been taught to trust the idea of free will. But we all contain two humans in one: a rational Homo Sapiens that goes for all the things research says make us happier, such as health and strong relationships, and someone inclined to do what feels great in the short term, whom I like to call Homo Habitus.

How do we escape our Homo Habitus? Is it possible to build a healthy relationship with technology within the near future? 

We are in many ways dealing with this in reverse. We should have started thinking these things through 10 years ago, when digital devices became ingrained in our lives. As we deal with our habits now, we are like kids who need to be schooled. Rewiring a habit takes time. It requires that you define an inner motivation for why you want to change it (e.g. this habit eats an hour from my day that I could have spent with my kid). Then you need to consider the things around the habit: Where are you when you do it? Who are you with?

From there, you can try to change one of those components in your surroundings. Think of it as testing something within yourself (like you would test a product). The testing should feel easy and fun, and you can give yourself a kind of emotional reward for trying. If it feels like pure duty, you’re never going to move on it. Changing a habit is a process, and you need to like that process. Another thing to consider as you form new habits is to build relationships that reinforce them.

So there is a value in learning to like the process itself and not just the reward? 

We are used to fast rewards (getting a like or a match on Tinder), but we need to slow down our understanding of rewards. Reading is a slower reward that has become harder for us to appreciate. When it comes down to it, you need to abandon the need for an outer reward. Choose habits where the process is rewarding in itself. Life is one long process or a series of processes, where the good ones are defined by three things: They develop you, you can shape them, and the process has an end goal that does not only point towards you.

Is forming healthy relationships with technology an individual responsibility or a societal one?

It’s a personal responsibility to deal with these things. You can’t expect your colleagues to help you create that free digital space. All of us need to define boundaries that make sense: where we put up our own “do not disturb” sign. In my case, I might send emails outside office hours because I need to send something to get peace of mind. But for that not to be intrusive, the recipient has to have defined boundaries too. The origin of the word digital is, literally, finger, and sometimes you need to put that finger in your ears, sometimes in the ground, and only sometimes should you swipe with it.

Looking to the future, where do you believe we are headed? What would you prescribe for a healthy relationship with technology?

What I would like is simply a general consciousness about our tech habits. Habits make up your life, so ask: how many of them have you chosen for yourself? On a brighter note, there is increasing demand for tech companies to promote better habits. A whole ethical wave is gaining momentum with Vestager, Techfestival, and governmental working groups, all asking: can we export humanism through technology?

2. Manipulation of consumers and where to draw the line

Randi Hovmann, Digital Director at Operate and a UX specialist, also joined the conversation. Randi shared tools for building products that we can be proud of, and that steer clear of manipulating their users.

You believe that companies have a moral responsibility towards their users. Is this something that is happening today? 

As a commercially driven company, you can be tempted to design your website so that its only purpose is to make people buy your product. But it is up to every entrepreneur to say: “I would like my customers to be conscious of that decision when it happens.” You can choose to create a non-transparent user experience. Or you can, as I believe most entrepreneurs want to, design something that solves a real problem for people.

We already expect transparency from companies on other levels: openness about how much money they make, what they invest in, what data they gather, and what they use it for. A transparent user flow, where you know what you’re going to get, should, and hopefully soon will, be a given, too. Digital businesses will not stop designing manipulative architectures overnight, but there is a need for all of us to be more ethical. On the user side, we may have to be a bit more skeptical towards companies.

How would you define good UX?  

Good UX is based on open, ongoing communication with the user, who is let in on what is going to happen at all times. As an entrepreneur, you need to ask yourself at every meeting with the customer: Is it clear what’s happening now? Do my users know what they’re agreeing to? A set of purchase conditions should not be a wall of text that is impossible to read. No subscription should be difficult to cancel. Any kind of friction you create just to make the user stay a bit longer is not cool. These are the basics.

You have even created a matrix for what can be considered transparent and useful UX, versus hidden and harmful UX.

In terms of setting a direction for your product that is in the interest of your user, you can break it down as a matrix. Along one axis, you move from providing a service to providing a pain. Along the other, from being transparent with the user to being non-transparent. Pains come in many forms: a company that keeps sending the same offer even though the user has shown little interest, or one that moves unclearly, or too fast, from a trial version of a product to pushing the full one. Don’t expect the user to want to pay full price for your product, and don’t expect the user to want to stick around forever.
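
A rough sketch of the matrix as we understood it (the quadrant labels below are our illustration, not Randi’s exact wording):

                          Transparent          Non-transparent
    Provides a service    Good, honest UX      Useful, but opaque
    Provides a pain       Honest friction      Dark patterns

The upper-left quadrant is where you want your product to live; the lower-right is where dark patterns thrive, for example hidden costs or subscriptions designed to be hard to cancel.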

In some cases, it is not even a question of morals. Can it be downright illegal to do shady UX?   

Yes. As an add-on to the matrix, it is worth mentioning that there are not only white, grey, and dark zones of UX and marketing. In certain cases, what happens is downright illegal. Spamming the contact lists of individual users without their permission, for example, is illegal marketing. So is writing fake reviews, a case of illegal documentation. Then there are borderline cases, such as buying access to direct emailing through a premium account. That practice might be legal in the USA, but not in Denmark.

3. Technology as a national security issue 

Our third speaker, Mark Anthony Fiedel, took on the task of telling us about the changing landscape of cyber attacks and the threats that the Danish Defence Intelligence Service fights against on an everyday basis. Mark is Head of Threat Assessment at the Centre for Cyber Security.

One would imagine that the work of the intelligence service is too secret to talk about to the public. Is it not? 

The details of our intelligence work are of course quite secret. But it is also our job to inform the public openly about what we do, to make it clear that it is not like a Hollywood movie, and to strike a balance: not crying “danger, danger” all the time, while being direct about the threats you should be aware of.

We hear a lot about cyber threats, but what is an actual threat and what isn’t? How do you assess this? 

When we assess a threat, whether it’s espionage targeting a state or a company, or an attack for criminal purposes, we look at intention and capacity. On the intention side: states have an incentive to spy on each other, and most cybercrime is financially motivated. Those two types of threats are here to stay. A few individuals may be motivated by creating chaos, but that is more of a stereotype from the movies. Hacktivism perhaps comes closest; however, here the actor also has a message to convey, a political one, or a cause they want to draw attention to. Whatever the attack might be, it plays out in phases: an initial phase, where the attacker looks for vulnerabilities; the actual exploitation; and then the attacker trying to stay in the system for as long as they need to do their business.

One of your examples of what this could look like is WannaCry. What is it? 

WannaCry is ransomware that spread like rings in water: once released, it sought out vulnerable machines to target across the internet. It hit hospitals in the UK, among others, and became a global wake-up call about the need to keep security measures updated. However, as typically happens, awareness drops quite quickly after an attack. WannaCry did a lot of damage, but it was poorly constructed ransomware. The payment part of the setup was inefficient, and it had a ‘kill switch’ that a researcher discovered: a domain that, once registered, halted further infections.
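
As an editorial aside (not part of Mark’s talk), the kill-switch mechanism is simple enough to sketch. This is a minimal, hypothetical illustration in Python; the domain is a placeholder, not the real one:

    import socket

    # Before doing anything else, the malware tried to resolve a hardcoded,
    # unregistered domain. While the domain was unregistered, the lookup
    # failed and the malware proceeded. Once a researcher registered the
    # domain, the lookup succeeded and new infections stood down.
    KILL_SWITCH_DOMAIN = "unregistered-kill-switch.example"  # placeholder

    def kill_switch_engaged() -> bool:
        try:
            socket.gethostbyname(KILL_SWITCH_DOMAIN)
            return True   # domain resolves: stand down
        except socket.gaierror:
            return False  # lookup fails: the original code would proceed

    if __name__ == "__main__":
        print("engaged" if kill_switch_engaged() else "not engaged")

The lesson is mostly accidental: a single hardcoded check, left in by the authors, was enough for one domain registration to blunt a global outbreak.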

Another example is NotPetya. What was that? 

That was another wake-up call, about a month after WannaCry. In this case, the malware tried to destroy data and systems, primarily in Ukraine. It spread via a program used by businesses that paid taxes in Ukraine, hidden as a software update. It also hit Maersk, which lost between 1.6 and 1.9 billion DKK. The victims spanned both the public and private sectors. One long-term consequence could be hesitation to establish new businesses in Ukraine.

Would you say that all new technology brings new vulnerabilities? 

All code is born with vulnerabilities. Today we have more touchpoints that can be attacked, because we have more devices connected to the internet. In the case of IoT, security is often overlooked: you might be busy focusing on the functionalities of your new IoT fridge, which can index the food in it, match it up with a shopping list, etc. Security is not thought of as a primary function. AI also opens up a lot of opportunities, for instance new ways to detect attacks or malicious code. However, it can potentially introduce vulnerabilities as well, and it is still early days. In general, I would love it if Danish developers tried to make security a part of the quality, or the brand, they deliver to the market.

So there is a mission in this for entrepreneurs? What are a few things they could do to take part in safe tech development?

If your data is hosted externally, ask for a defined security standard; if you don’t ask for it, you’re probably not going to get it. If an essential part of your business model or concept is a piece of code, make sure it is stored safely, whether in a cloud or in your own keeping. Have a policy for patching and for whitelisting applications in your business, and the same for passwords. Think of this as an insurance policy: decide what the most valuable part of your company is and protect it. Remember that it is always more expensive to recreate data and systems once the attack has already happened.
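
To make “the same for passwords” concrete, here is a minimal sketch of the kind of basic hygiene Mark alludes to: storing passwords as salted, deliberately slow hashes instead of plain text. The example is ours, not from the talk, and uses only Python’s standard library:

    import hashlib
    import hmac
    import os

    ITERATIONS = 600_000  # deliberately slow, so brute-forcing leaked hashes is costly

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Hash a password with a fresh random salt using PBKDF2-HMAC-SHA256."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
        """Recompute the hash and compare in constant time."""
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(digest, expected)

    # Store (salt, digest) per user; never store the password itself.
    salt, digest = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, digest)
    assert not verify_password("wrong guess", salt, digest)

If an attacker steals the database, they get salts and slow hashes rather than reusable credentials, which is exactly the “insurance policy” logic applied to your users’ accounts.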

The message is clear. With great power comes great responsibility, and technology is no exception. Whether it is the responsibility we have towards ourselves to foster good and healthy habits, towards our consumers to be transparent and honest, or towards our societies to keep them from harm, we are on a mission to get the basics right when it comes to safe and ethical technology.

Further reading:

  1. Pre-order Imran Rashid’s forthcoming book Sunde Vaner.
  2. Randi Hovmann on the dark patterns matrix (in Danish).
  3. The latest threat assessments from the Centre for Cyber Security.

Follow us on Facebook for future events

Read more about the future of working with tech or learn more about the pitfalls of data-driven marketing.