Nowadays, users of the tech industry are seeking solutions they can trust with their personal data. In light of the recent and now all-too-regular cyber attacks on tech giants like Equifax, Facebook, Deloitte, Yahoo, FedEx, Uber, Under Armour and even Pizza Hut, and the poor communication and, in some cases, outright cover-up of these incidents, public confidence in the tech industry has been severely eroded over the last couple of years. The future of the tech industry lies in its ability to build trust with its users, but is trust really enough to keep data truly private?
One of the likeliest security points of failure for any internet user is whom they share their data with. Internet connections are more secure than they have ever been, yet tech giants somehow always find a way to leak, lose or have our data breached. One of the main reasons is that they have identified data mining and reselling as one of the most profitable practices in the industry and have engaged in aggressive strategies to monetize our private lives, often in complete disregard of the very people affected by those actions. In fact, once our private data leaves the servers of a company whose privacy policy we have agreed to and lands with another company with which we have agreed to nothing, it becomes extremely hard to keep track of who is using our data, with whom they are sharing it and for what purpose.
Why Does It Even Matter?
“(Personal) data is the new oil.” Tech companies have indeed systematically commodified data to make millions of dollars off of it, but the users from whom they take it are very rarely compensated or even notified. How can large and very popular tech companies offer their platforms for free while staying perfectly sustainable? Is it only through ads? Nothing is free on the internet: companies exist to make profits, and huge websites have huge maintenance costs. When a platform appears to be free, it is most likely because you are paying with your personal data.
- “Data is the new oil” — Jonathan Taplin, director emeritus of the USC Annenberg Innovation Lab and the author of Move Fast and Break Things: How Google, Facebook, and Amazon Cornered Culture and Undermined Democracy.
On the one hand, this “data economy” is good for partner companies, which can use it to make their products and platforms arguably better for their users. Knowing exactly who is using their platform allows them to craft a much more customized and personal experience. On the other hand, things can get ugly quickly.
Making money off your back
The most common and recurring public concern about personal data sharing is that companies are making millions, if not billions, reselling every single detail of our private lives to whoever is willing to pay for it. Very rarely are users, the very people from whom this data is taken, compensated or even notified. This creates an environment in which it is quite difficult for the public to trust tech companies, as they are mostly kept in the dark even though it affects them directly.
Because it is very difficult, even impractical, to track and limit what happens to our personal data once we share it with a company that in turn shares it with another, users lose any power of consent over how their personal data is used. The best illustration of how this can directly affect anybody is the #DeleteFacebook scandal. In that case, Facebook shared personal data, without its users' consent, with a data analytics company (Cambridge Analytica) that later strategically advised the Republican campaign during the 2016 US presidential election. In other words, every Facebook user, even those who despise the Republican party, unwillingly and without any consent provided strategic data that helped that campaign win the election. This is only one example that has gone public, but rest assured that it is not an isolated case; it is the harsh reality of the data economy. Your own data can and will be used against you, with or without your consent or knowledge.
In the wrong hands
When you decide to trust a company with your personal data, you need to trust not only that it will not abuse or misuse it but also that it will keep it secure. As data has become the new oil, it has become a very valuable target for hackers all around the world. As evidenced by the exponentially rising number of data breaches, hackers have identified personal data as some of the most valuable loot they can get. Whether it is to steal your credit card numbers, identity, email or any other type of private data, criminals try to exploit and breach servers every day. As a general rule of thumb, the bigger a tech company becomes, the more data it will have mined from its users and thus the bigger a target it becomes. Trusting a company means trusting its honesty as well as its technical ability to withstand continuous hacking attempts. On a long enough timeline, the risk of your data being breached is quite high, especially as companies share your data with other companies.
Keeping your private life private
One often overlooked argument against data mining is that privacy is, simply put, a human right. People should always have the choice to keep their private life private and to share their data only if they feel like it. With the current trend in tech, finding solutions that provide a quality service without mining your data is proving increasingly hard.
“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.” - Edward Snowden
The GDPR, A Step in the Good Direction But Is It Enough?
A couple of weeks ago, more than two years of anticipation concluded when the European Union's General Data Protection Regulation (GDPR) finally came into effect. This new EU privacy and personal data protection regulation addresses the sharing of personal data outside the EU and EEA and aims primarily to give citizens and residents control over their personal data within a simplified and stricter regulatory framework. In short, the GDPR demands that companies clearly notify their users of what data they will collect and exactly what they will do with it. It also requires that they take a more responsible approach to how they share their users' data and adopt better strategies to keep it secure. While it is refreshing to see such a strong legal stance in favor of the protection of personal data, it is unfortunately not enough.
Do we really have a choice?
In the first place, many users don't really have a choice about accepting new terms and conditions. After all, most of the time, rejecting them results in the user's service being denied or canceled. In some cases, users must go through the hassle of finding alternative services that share (or don't share) their data in a way they deem responsible, an undertaking that can prove extremely difficult. It can also mean losing accumulated loyalty points, saved data, statistics, progress or even contact with certain people (e.g. on Facebook). Some people may even be forced to accept terms and conditions because they absolutely need a specific piece of software for work. Since the vast majority of popular internet services are now owned by a handful of tech companies (Google, Facebook, Amazon, Microsoft and Apple) that aggressively acquire any emerging platform, opting out of their wide range of services because of how they treat our data can prove a rather difficult task, and one that can lead to a considerably inferior user experience. When all is said and done, the great majority of people will accept any new terms and conditions and forfeit their personal information to the data economy for lack of alternatives.
You still need to trust them!
Most importantly, the GDPR is a legal safeguard, not a technical one. Users still have to trust that companies will actually comply with the regulation, secure their data properly and report incidents honestly, which brings us back to the core problem.
The Need To Do Away With Trusting Strangers
In the tech industry, just like in life, every new problem presents an opportunity waiting to be seized. Digital privacy and data security have only very recently become mainstream discussion topics, so new technologies and practices are set to gain ground. As pointed out in the previous section, the main problem with tech companies has everything to do with trust. Will the company behave honestly? Will it really be able to secure and protect your personal data against attacks? Could there be a way to completely eliminate the need to trust a third party? There actually is!
Blockchain technology allows two or more users to exchange data directly, without any third party facilitating the transaction. Its most popular use case today is as the foundation for cryptocurrencies such as Bitcoin and Ethereum, where the exchanged data has a monetary value attached to it. These cryptocurrencies allow people to transfer money directly to anyone in the world without the need for a third party such as a bank, a payment processor or a company.
However, this groundbreaking technology can be used for much more than virtual currencies. Because it allows users to send almost any kind of data, it can be used to build entire applications on the blockchain, usually called decentralized applications (dApps), on which users interact directly with each other without any third party or company running the software. It is possible, for example, to build a decentralized marketplace where users can buy, sell or trade with each other without having to trust a big company like Amazon, eBay or Alibaba with their personal data. Think P2P torrents, but taken a step further.
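Why can a blockchain replace a trusted third party at all? Because each block cryptographically commits to the one before it, so participants can verify the shared history themselves instead of trusting anyone. The following is a toy sketch of that idea only (the block layout and function names are invented for illustration; real blockchains add consensus, signatures and much more):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic hash of a block's contents
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    # Each new block commits to the hash of the previous block
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

chain = []
add_block(chain, "Alice pays Bob 5 coins")
add_block(chain, "Bob lists an item for sale")

# The links verify as long as nothing is altered
assert chain[1]["prev_hash"] == block_hash(chain[0])

# Tampering with an old block breaks every later link,
# so any peer re-checking the hashes detects the change
chain[0]["data"] = "Alice pays Bob 500 coins"
assert chain[1]["prev_hash"] != block_hash(chain[0])
```

This tamper-evidence is what lets strangers agree on a shared ledger without a central referee; it is also why, as discussed below, everything written to the chain stays there.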
Is Blockchain Technology Really Better at Keeping Your Private Data Private?
Yes and no. Blockchain technology is great for connecting people around the world in an almost unlimited number of use cases, but it has definite drawbacks when it comes to personal data protection. Blockchains are public and permanent data ledgers, meaning that data stored on them remains publicly visible forever. One of the biggest misconceptions about blockchain technology in general is that it is anonymous; it is not. Most blockchains are pseudonymous rather than anonymous: while transactions are not linked to a clear identity (they are instead linked to cryptographic “pseudonyms”), they can be tracked with relative ease and user identities can be revealed.
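A small sketch of what such a cryptographic pseudonym looks like may help. This is a deliberate simplification with invented key values (real coins layer elliptic-curve keys, version bytes and checksums such as Base58Check on top), but the principle is the same: the address reveals no name, yet it is stable, so every transaction made with the same key is linkable to the same pseudonym.

```python
import hashlib

def address_from_pubkey(pubkey: bytes) -> str:
    # Simplified pseudonym: a truncated hash of the public key
    return hashlib.sha256(pubkey).hexdigest()[:40]

alice = address_from_pubkey(b"alice-hypothetical-public-key")
bob = address_from_pubkey(b"bob-hypothetical-public-key")

# No name appears on-chain, but the pseudonym is consistent:
# all of Alice's transactions cluster under the same address.
assert alice == address_from_pubkey(b"alice-hypothetical-public-key")
assert alice != bob
```

That consistency is precisely what blockchain-analysis firms exploit: link one address to one real-world identity (an exchange account, a shipping address) and the entire transaction history unravels.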
Whether private data is safer on some company server than on a public blockchain could certainly be debated. After all, blockchain analysis and tracking is already becoming trivial, as major companies and governmental agencies have invested millions in firms working on tracking crypto transactions; see, for example, the partnership between the United States' Department of Homeland Security and Lockheed Martin, the world's largest defense (military) contractor, as early as 2016 (https://www.coindesk.com/us-government-lockheed-martin-bitcoin-analysis-tool/). This, coupled with the fact that any data stored on a blockchain remains publicly visible forever, makes it extremely likely that all public transactions will one day be fully traced in their entirety by present or future analysis technologies examining past data in retrospect. Do we really want our data out in the open, waiting to be revealed to literally anyone interested in it? Once blockchain analysis technologies and services become powerful and accessible enough, it will be considerably easier to obtain someone's private data stored on a blockchain than to breach a tech giant like Amazon.
Taking Blockchain Technologies to the Next Level
Blockchain technology is definitely a breakthrough in how users can share data directly with each other, but as stated above, it is not ideal in terms of personal data protection. What if this technology could be improved and made private-by-design, and what if its privacy could be dramatically improved by combining it with other decentralized P2P technologies?
Privacy-focused blockchains are good…
The first step toward a more private solution is to make data transfers untraceable. The biggest drawback of regular blockchains is that data transactions are publicly auditable. Fortunately, there are new and groundbreaking developments in blockchain technology through which transactions can be made untraceable by hiding the transaction amounts, the participants, or both. The two main such technologies are RingCT and zero-knowledge proofs, each attacking this privacy issue from a different angle. Note that “transactions”, as referred to here, relates not only to currency transactions but to any type of data transaction made on a blockchain.
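The core idea behind hiding amounts can be illustrated with a much simpler primitive than RingCT or zero-knowledge proofs: a hash commitment. This sketch is not either of those protocols (which rely on far more sophisticated cryptography), but it shows the same pattern of publishing something that conceals a value while still allowing it to be verified later:

```python
import hashlib
import secrets

def commit(amount: int, nonce: bytes) -> str:
    # Publish this digest on the ledger instead of the raw amount;
    # the random nonce prevents guessing the amount by brute force
    return hashlib.sha256(nonce + amount.to_bytes(8, "big")).hexdigest()

def verify(commitment: str, amount: int, nonce: bytes) -> bool:
    # Anyone handed (amount, nonce) can check it matches the commitment
    return commit(amount, nonce) == commitment

nonce = secrets.token_bytes(16)
public_commitment = commit(42, nonce)  # 42 stays hidden from observers

assert verify(public_commitment, 42, nonce)       # correct opening accepted
assert not verify(public_commitment, 43, nonce)   # wrong amount rejected
```

Real confidential-transaction schemes go further, letting the network verify properties of the hidden amounts (e.g. that inputs equal outputs) without anyone ever opening the commitment.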
Private-by-design blockchains are even better!
To be as effective as possible, private transactions should be mandatory. For example, a decentralized application (e.g. a marketplace) with an opt-in privacy protocol is much less effective than one where privacy is mandatory. Opt-in privacy dramatically reduces the number of private transactions made, making them theoretically easier to track and even potentially more suspect.
Not everything needs to be stored on-chain
Blockchain technology is a great way to transact data, but another thing it is not ideal for is storing large pieces of data (e.g. pictures, videos, sound files). As mentioned above, everything stored on a blockchain is stored there permanently. Anyone looking to run a node then needs to download the entire ledger in order to use it, making on-chain data storage both a scaling and a security nightmare.
It is possible for blockchain networks to run alongside other P2P networks that use the same nodes. These native P2P networks can then be connected to their accompanying blockchain and used for data storage. Better still, these P2P networks aren't blockchains, meaning they can have properties not entirely possible with blockchains, such as amnesia. Taking again the example of a decentralized, trustless marketplace: a separate decentralized storage network could run on the same nodes that host its blockchain. It could then be used to store listing pictures, videos or even digital items, and be configured so that it permanently forgets/deletes data after predetermined conditions are met (time, context, user-triggered, etc.), hence the “amnesic” property.
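The simplest of those forgetting conditions, expiry after a fixed time, can be sketched in a few lines. This toy store (the class and its interface are invented for illustration; a real decentralized network would enforce deletion across many nodes) keeps a value only until its time-to-live elapses, after which it behaves as if the data never existed:

```python
import time

class AmnesicStore:
    """Toy key-value store that permanently forgets entries after a TTL."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, stored_at)

    def put(self, key, value):
        self._data[key] = (value, time.monotonic())

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._data[key]  # expired: delete for good
            return None
        return value

store = AmnesicStore(ttl_seconds=0.1)
store.put("listing-photo", b"...jpeg bytes...")
assert store.get("listing-photo") == b"...jpeg bytes..."

time.sleep(0.2)  # wait past the TTL
assert store.get("listing-photo") is None
```

Contrast this with the chain itself: the blockchain records the small, permanent facts (who bought what, payment commitments), while the bulky, sensitive artifacts live in the amnesic layer and disappear once the sale is over.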
It’s not only about transactions and uploaded data!
Another security and privacy aspect that cannot be overlooked when protecting your personal data is your network connection. Most blockchains allow their users to connect through a Tor-routed connection. This effectively hides users' IP addresses from other peers, making it almost impossible to identify users based on their connection information.
Again, as with private blockchain transactions, this IP address anonymization becomes all the more effective when it is enabled by design. If only a small portion of users obfuscate their IP addresses, they may end up looking suspicious or being deemed “of interest” by various parties. “Together we're stronger”!
The Particl Project Is Making That Vision a Reality
With the mission of giving users who want to protect their personal data a viable option, Particl is building a private-by-design decentralized application platform where dApps can be built and used with optimal security and privacy parameters. It is a blockchain/P2P hybrid network and, because technology evolves quite rapidly, it is designed so that it can work with almost any currency and integrate any decentralized P2P storage network (such as IPFS, Blockstack, etc.).
The project has just released the alpha version of its first private-by-design decentralized application, an eBay-like marketplace for products and services, and is working on a developer SDK to make building other decentralized applications as straightforward as possible. The project is built on a foundation of privacy and security, and as such is having its breakthrough privacy protocol (RingCT on the Bitcoin codebase) reviewed for flaws by the NJIT Department of Technology.
Particl aims to decentralize popular web services as well as the cryptocurrency ecosystem and make them private by default, heralding a new, secure, private and trustless way for people to use the internet while keeping their private lives provably private.