Transcript
There’s a saying that goes, “If you’re not paying for the product, then you are the product.”
Whenever you use a service without paying, it isn't really free. Your data is what allows the service provider to make money, and therefore you are their product.
Under this business model, the company's revenue comes not from the product it develops but from the data it collects behind the scenes. Is there any consensus about the ethics of that business model, especially when it is often hidden from the consumer? It is worth pointing out that these consumers include children. For example, many free games employ the same psychologists who designed the reward systems used in casinos, to get your mind fixated on one thing: "I've got to do one more, I've got to do one more." Is that somehow pernicious or unethical, or is it just an extension of people's weakness?
Most applications rely on active users to monetize. They therefore use psychological tricks that tap into collective weaknesses in human behaviour in order to keep us engaged.
From a behavioural science and psychological perspective, we know that we are less in control than we often think we are. Certain triggers, for example colours, information, and sounds, tend to influence people's behaviour. Many of these companies, particularly social networking sites, spend a great deal of time actively exploring this to ensure that users spend as much time as possible on their sites. As users do so, the companies can collect more data and feed it back into their models to generate further insights for product design and profitability.
We have been talking about the implications for us, but what does this mean for the next generation? We already discussed that children and young adults today have a different perception of privacy than previous generations. With distributed ledger and other technologies, will we be able to control and own our own data and determine who can access it, or will this simply become a new way to entrench the existing power over data?
On the one hand, blockchain and other technologies are, in some respects, more anonymous. Even though they are open, they can still protect users' privacy to the extent the system is designed to do so. So to some degree, FinTech and other technologies actually create greater levels of anonymity than might have existed in the traditional financial system.
But on the other hand, a lot of information that was once private is now public as well. It is a very interesting dichotomy that people will have to live with as they grow older. We think that as our children and our students grow older, they will live in a world that is far less siloed. The distinctions between a bank, a consumer company, a store, and so on will start to blur.
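That dichotomy, pseudonymous participants on a fully public ledger, can be made concrete with a small sketch. The Python snippet below is purely illustrative and not tied to any real blockchain client; the address derivation and ledger structure are simplified assumptions, but they capture the idea that addresses reveal no real-world identity while the transactions themselves are visible to everyone.

```python
# Conceptual sketch only: pseudonymity vs. public transparency on a ledger.
import hashlib
import secrets

def new_address() -> str:
    """Derive a pseudonymous address by hashing a randomly generated secret.

    Real systems derive addresses from public keys, but the privacy property
    illustrated here is the same: the address itself names no real person.
    """
    private_key = secrets.token_bytes(32)           # known only to the owner
    return hashlib.sha256(private_key).hexdigest()[:16]

alice = new_address()
bob = new_address()

# The transfer, however, is recorded on an open ledger that anyone can read.
public_ledger = [
    {"from": alice, "to": bob, "amount": 1.5},      # visible to everyone
]

print("Addresses are pseudonyms:", alice, "->", bob)
print("But the transaction is public:", public_ledger[0])
```

The sketch shows why both halves of the observation hold at once: identity is hidden behind pseudonyms, yet the record of activity is more exposed than in a traditional bank's private database.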
One of David Bishop's favourite fake news clips of all time came from the satirical website The Onion. In 2011, it "revealed" that Facebook was a secret CIA programme designed to get people to post their private information publicly, and that its founder, Mark Zuckerberg, was a CIA agent with the code name Overlord. After years of the agency working to gather private information on people, it realised that everyone now simply posts it voluntarily.
Obviously this was meant as a joke, but the underlying point is that we live in a society where so much is open that the concept of privacy no longer means what it used to.
Speaking of Facebook, we are familiar with the incident in which Facebook was used in attempts to manipulate the 2016 US presidential election. There appears to be considerable evidence tying various entities in Russia to attempts to influence certain election outcomes in the United States.
The same data and psychological analysis that make it easy for large retailers to send you a personalised coupon can also be used to convince you to pick a certain presidential candidate.
Discussion Questions
- How can blockchain and other distributed ledger technologies help us control our own data?