Q. Decentralized public infrastructures and alternative platform models, as one finds in projects like Decode, are increasingly emerging as attempts to challenge the status quo of the digital economy. Can you talk about the political vision motivating such projects?

Stefano: In my professional experience, I’ve been part of two such projects: first D-CENT, and then Decode. In fact, there is a kind of continuity between them, as D-CENT was the one that really gave the motivation to proceed with Decode.

As Francesca Bria says, D-CENT was about decentralizing deliberation, and attempting to use digital technology to bolster democratic processes. However, very quickly there was a realization that not only was there a need to have a platform that would facilitate public participation in democracy, but that crucial issues of data sovereignty and data ownership needed to be thought through. So, then came Decode, which focused on continuing to develop the tools and platforms that D-CENT had been producing, but doing so in a way that put the emphasis on collective ownership.

There was a political vision here: to open up the democratic process to citizens and, at the same time, to do it in a way that was not privacy-invasive and that respected people’s ownership of their data. Hence, the main focus of Decode was using structures and technologies that had already been developed by D-CENT or other projects, but then going back to issues of data sovereignty and trying to build in components that ensure data protection and privacy in these various platforms and tools.

So, in Decode we were perfecting, for instance, the way you could cast a vote anonymously: enabling the minimal disclosure of personal data needed to verify identity while preventing people from voting multiple times. This need was felt strongly at the time, as every day we learned of the horrible things that Facebook was up to, and of the problems with the platform business model as employed by Big Tech, much of which revolved around relentless data extractivism.
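The separation between identity-checking and vote-counting can be illustrated with a toy scheme. This is not Decode’s actual protocol (which relied on real cryptographic primitives), and the names `Registrar` and `BallotBox` are hypothetical: a registrar verifies each eligible voter once and hands out a random, unlinkable token, and the ballot box accepts each token exactly once, storing no identities.

```python
import secrets

class Registrar:
    """Verifies identity and issues one anonymous voting token per person.

    The registrar records only WHO has received a token, never WHICH
    token they received, so a cast ballot cannot be linked to a voter.
    """
    def __init__(self, eligible_ids):
        self.eligible = set(eligible_ids)
        self.issued_to = set()      # identities already served
        self.valid_tokens = set()   # tokens the ballot box will accept

    def issue_token(self, identity):
        if identity not in self.eligible or identity in self.issued_to:
            return None             # unknown voter, or already got a token
        self.issued_to.add(identity)
        token = secrets.token_hex(16)
        self.valid_tokens.add(token)
        return token

class BallotBox:
    """Accepts each valid token exactly once; stores votes with no identity."""
    def __init__(self, registrar):
        self.registrar = registrar
        self.spent = set()
        self.votes = []

    def cast(self, token, choice):
        if token not in self.registrar.valid_tokens or token in self.spent:
            return False            # invalid token, or double-vote attempt
        self.spent.add(token)
        self.votes.append(choice)
        return True
```

In a real deployment, token issuance itself must be blinded (for example with blind signatures or zero-knowledge credentials), otherwise a registrar that logged tokens could link ballots back to voters; this sketch only shows the structural idea of separating eligibility from counting.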

Decode wanted to oppose this strongly, and to build alternatives that would use digital technology, and even data, but outside of this extractivist logic. Tools for public participation are examples of the kind of things we worked on, but there were many others built with the same approach. In one instance, an initiative required measuring noise in a square in Barcelona. Our challenge was how to use sensors to generate this data, but in a way that preserved privacy.
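One standard way to publish sensor measurements while preserving privacy, sketched here purely as an illustration (the interview does not say this exact technique was used), is differential privacy: clip each contribution to bound its influence, then add calibrated Laplace noise before the value leaves the device. The function names are hypothetical.

```python
import math
import random

def sample_laplace(scale):
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(max(1.0 - 2.0 * abs(u), 1e-12))

def private_mean(readings_db, epsilon=0.5, lo=30.0, hi=100.0):
    """Differentially private mean of decibel readings.

    Clipping each reading to [lo, hi] bounds any single reading's
    influence on the mean to (hi - lo) / n; that sensitivity is used
    to scale the Laplace noise for an epsilon-DP release.
    """
    clipped = [min(max(r, lo), hi) for r in readings_db]
    sensitivity = (hi - lo) / len(clipped)
    noise = sample_laplace(sensitivity / epsilon)
    return sum(clipped) / len(clipped) + noise
```

With enough readings the noise averages out, so the published value stays useful for the city while no single passer-by’s contribution can be inferred from it.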

 

Q. Could you share what you think are some of the more significant achievements, or the most interesting examples, that have emerged from these projects? Particularly with respect to the public use of data in decentralized technological initiatives.

Stefano: Probably one of the biggest successes of these projects is the Decidim platform, which works as a digital medium for democratic public deliberation. It has been employed at the pan-European level, having received the endorsement of the European Commission and the European Parliament. But there were also many smaller platforms. I remember at one point my institution collaborated with a similar initiative, De Stem van West (the Vote of West), where residents could submit ideas for their neighborhood, and there were three or four such projects across Europe, such as Consul and Your Priorities, trying to promote these types of platforms and use them at the local administration level.

All that said, I always had a sense that the work we were doing was a strong pioneering effort that was still not fully embraced as a mainstream project. There were several initiatives spearheading the movement, pushing towards ownership and decentralization, and I think they were also seen as flagship, innovative endeavors. But at the same time, it was not evident that ordinary people would adopt them en masse. It was clear to us that there was a broad societal need, but we didn’t know if people, or society at large, already felt it, because it’s a bit of a tricky situation.

When you are working in this space and on these issues, you have an awareness of what happens and what the dangers are. It may be totally different, for example, in a commercial environment, where ordinary users rely on apps like Facebook and Instagram for their everyday needs and are so used to these platforms that they have no critical perspective on what these companies are doing. This reality was, and still is, strongly present in society, and so my impression is that most of these initiatives ended up somewhat niche in relation to wide-scale use.

I feel the debate is still going strong. On one hand, there is increasing awareness that dominant platforms and business models are possibly damaging society at large. But I don’t know how much this is making people change their habits. Given the convenience with which most of our needs are fulfilled by the existing commercial platforms, we’ve become dependent on their tools and do not feel the need to use other platforms or to have more control over our data.

The goal should always be to increase people’s awareness on one side, but also their knowledge, so that every citizen feels they have the right and the means to take part in the democratic process. However, it is also a duty, one that can be fulfilled only when you understand more fully what is happening in the technology space, in the economy, in politics, and everywhere else. Thus, one needs to convey information in a way that lets people make their own choices without delegating them to someone else. In my opinion, it is still a battle for the awareness and knowledge of the people.

 

Q. How do you see the relationship between such decentralized models and the development of public policy? What are points of intervention needed in the latter in order to create a more enabling environment for the former?

Stefano: At the end of the day, I am a fan of public intervention. When you look at the field, at innovative new technology for decentralization or for cryptography, there is always this stream of people that say “okay, the technology can regulate itself. One does not need regulations because the technology is decentralized and is basically open to everybody. One does not need any intermediary.” Like with cryptocurrencies, for instance, the idea is that regulation can be encoded in the platform.

But one has to realize that there is no such technology: no technology can regulate itself. Even blockchain and Bitcoin are susceptible to attacks and abuse; they are not invulnerable entities that can resist any and all attempts at misuse, or attacks on the power systems that underpin the whole edifice. Moreover, when you look at what these arrangements are, you don’t find just technology. There is also a whole set of relationships with the societal environment around them, and those have to be protected through regulation.

Indeed, even if you look at projects like D-CENT or Decode, these did not just come about privately, they were European projects. There was somebody in the European Commission who saw the value of promoting such projects, hence they started with the financing. This turned out to be an interesting initiative bringing about innovation, and the next step involved bringing this under the eyes of many more people. The public sector was crucial to making all this happen, and it will also be vital for creating the regulation, the ecosystem, and the funding that can support this kind of stuff.

I look at the GDPR, which has been criticized for not being far-reaching enough. But I think it’s a great first step, since it institutes some hard limits for platforms, and it is that capacity of legislation that has really supported a lot of other positive developments.
But of course, these kinds of technological projects need to be encouraged, not obstructed, by public policy. Ideally, such advances should be supported and financed by legislation, not encumbered by it.

 

Q. What are the principal challenges in safeguarding such public infrastructures and other decentralized models from incursion and capture by private actors/Big Tech?

Stefano: I have an example of something that actually happened during the course of Decode that illustrates this danger. The team developing the cryptography used by Decode was basically acquired en masse by Facebook during the project. It was a full takeover of the team, and they all went because, at the time, Facebook was developing the Libra currency and was going around ruthlessly buying out whatever it needed. There was a complete takeover of this technology that we were working with.

In a way, it is almost like war. Before they were acquired by Facebook, they were working on a project that was for the public good. When they left, they also took the knowledge built with public money and public resources, and that knowledge was now going to be used by Facebook. So, these dangers are very real.

It’s interesting how these technologies evolve as well. If you look at when cryptocurrencies first came out, there were a lot of discussions about how these were meant to basically put the banks out of business. But of course, banks were not willing to let that happen and soon they also entered the space to try and understand these developments and carve a position for themselves. I think that’s what happens to every technology that is either generated by the public sector or by private-public collaboration. As they contain within them possibilities which threaten the private sector, the latter is always incentivized to take steps to see how they can ride the wave or use it to their advantage.

Now, when it comes to what can be done about this, I’m not sure. The experience of the Facebook takeover really showed how little you can do in these situations. Fortunately for us, what basically protected against it was that the knowledge this team had and the things they had developed were all open. So, at least, when the team left, the knowledge did not leave with them. It was possible for other people in Decode to carry forward what they had done because it was all open. So, that is one safeguard – just the openness.

Of course, that is only one form of private players trying to take advantage, and we know that they try to do this in many ways; it’s quite difficult to curb these behaviors in our current landscape. We have to find ways to ensure that with new markets and platforms, the conditions are such that each player is forced to play fairly, even if a public-private company enters. For example, if platforms are built in a way that is privacy-preserving, then companies entering this space will be constrained in how they can access and use the data within the platform, and so will have to find other ways to generate their profits. I’m not totally sure whether this is 100% possible, but I think the creation of strong privacy-preserving platforms can certainly create safeguards that reduce the opportunity to exploit and misuse data.

Ideally, if every player has to abide by the rules, then at least no single company can have its way over the platform in totality.

Moreover, if you ever enter a situation where there is no plurality in the market, with no more than one actor in the stakeholder space, it’s going to be hard. For example, some time ago Facebook wanted to offer internet access to everybody for free, which is extremely controversial, since it means that something that is a human right cannot be achieved through public funding and requires supplementation by a private company. This is a hard discussion to have: on one hand, without such an offer some people will never get to access the internet, or will effectively have to pay a lot for it; on the other hand, there is the threat of the disruptive things Facebook could do with control over free internet access.

 

Q. What are the most significant technical challenges in building these platforms and doing so in a way that generally creates public value?

Stefano: Well, if you look at cryptography, this is a field that has been mature for some time, and recently it has become even better.

Basically, we are at a stage where we can do pretty interesting things without disclosing too much data. Blockchain, for instance, is an interesting technology, and there are a lot of things happening in that space. There’s also Holochain, a related technology that doesn’t have the limitations of blockchains and is not as power-hungry. If you look at privacy, cryptography can really contribute to building things like zero-knowledge proofs, or identity systems that rely on minimal data and allow you to prove your identity without divulging much about yourself. Hence, there is quite some strong infrastructure here that is more or less on the way to becoming public infrastructure.
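To make “prove your identity without divulging much data” concrete, here is a minimal Schnorr identification protocol, a classic zero-knowledge building block. This is an illustrative sketch, not Decode’s implementation, and the toy group parameters below are far too small for real use (real deployments use 256-bit-plus groups or elliptic curves).

```python
import secrets

# Toy parameters: p = 2q + 1 is a safe prime, and g = 4 generates the
# subgroup of prime order q. ILLUSTRATION ONLY - not secure sizes.
P, Q, G = 2039, 1019, 4

def keygen():
    x = secrets.randbelow(Q - 1) + 1   # secret identity key, never shared
    y = pow(G, x, P)                   # public key y = g^x mod p
    return x, y

def prove_commit():
    r = secrets.randbelow(Q)           # fresh randomness per proof
    return r, pow(G, r, P)             # commitment t = g^r mod p

def prove_respond(x, r, c):
    return (r + c * x) % Q             # response s = r + c*x mod q

def verify(y, t, c, s):
    # Accept iff g^s == t * y^c (mod p). The transcript (t, c, s)
    # convinces the verifier that the prover knows x, while revealing
    # nothing about x itself.
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

The verifier learns only that the prover holds the secret behind the public key `y`, and nothing else; using a fresh commitment and challenge for every session prevents replay.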

However, the fact is that data ownership and sovereignty also force a citizen to be more aware of and in control of their data, pushing them to make more of an effort to think about what they are going to do with it. This opens more space for action by the person, but it requires their thinking, and that must be something people genuinely value. It doesn’t work if you are not willing to make the effort to participate; when you delegate thinking, it is because you don’t want the extra burden.


The technical challenges are few, but one of the limitations to scaling these models is that they rely on people being aware, being in control, and being willing to think about what the right thing to do is and what the consequences of their decisions could be, without delegating those decisions to national or commercial entities. For me, the barriers and obstacles lie more in people’s mentality than in technical matters. There are quite a few initiatives to be enacted, but it’s more a matter of getting the right mentality and the right values, and of having people demand these values from platforms.

 

Q. What is the scope for the future of such initiatives? How do you see them evolving over the next few years, and what are the factors most likely to shape the trajectory of their evolution?

Stefano: If we look at what the European Union is trying to do for the future: on one hand, there is a push for the public sector to finance infrastructure projects. For example, we have a Blockchain that everybody can use as an infrastructure. At the same time, there’s a lot of debate in the EU about ethics in Artificial Intelligence. They are tackling, more or less, the broad spectrum, from enabling technological tools to trying not to lose sight of guiding values. Which, I think, is a positive thing.

I don’t know how mainstream they will ever become because although I see a lot of good initiatives in the European Union, it is also subject to a lot of lobbying from the industries. Thus, sometimes it feels like they are okay with these alternative models and allow them to work only because these platforms, at the moment, remain niche. The moment they step out and become a genuine threat, there will be swift attempts to prevent them from growing. Nevertheless, I’m happy about these developments that I see in the European Union’s public funding mechanism, and it certainly holds potential.

Of course, there are also private companies that will not stay still, but I am not entirely sure what exactly they are trying to do. Often, these companies endorse some ethical principle, but then implement things in a way that defeats the purpose of the principle. For example, Google has been announcing, for a couple of years, its willingness to abolish cookies, one of the main tracking technologies that follow user activity on the internet, and thus claims to be embracing privacy. But the question is: how are they going to replace these technologies? Proposals have been made, but there are also many concerns about them. Sure, you are replacing a technology whose dangers people have begun to understand, but you may replace it with something that nobody knows anything about. It may initially appear to preserve your privacy, but it could well turn out to be just another technology you are not shielded from; it will take time to uncover, and people will be unprepared to understand it and unable to protect themselves from it.


Facebook always says that it understands user concerns and is committed to making changes on sensitive points. Yet this is usually a cynical attempt to follow predominant norms and remain under the radar while continuing the same actions as before. The same goes for Apple, which, strangely and positively enough, has become the paladin of privacy for some reason. When Apple announced that it wanted to stop apps from tracking users on its phones, Facebook said that it would change its application to a paid offering on Apple devices, as a kind of threat. So, we don’t know what is on the horizon with these conflicts.

 

Q. In terms of the vision for the governance of these technologies going forward, especially as we try to find ways not only to protect people’s privacy but also to use data publicly and to create projects while preserving privacy, what do you think are the key things we need to keep in mind to make that happen?

Stefano: Yes, the governance aspect is a fundamental point. For example, one of the initiatives of Waag – where I have worked – is what they call ‘The Public Stack’, which would basically replace proprietary technology with public and open counterparts to create a space called the Public Internet. Here, we’re talking about an internet space that is not ruled by closed software and commercial interests. However, there is still an issue: one cannot simply provide an open space and expect it to become inclusive and safe for everybody to participate in. In fact, there is a strong discourse in conferences and talks on these issues about how the open-source community is actually not very inclusive and can be especially threatening to women. Hence, it’s not as if opening the technology and making it public will automatically make it an inclusive space.

You have to strike a balance between strong regulation by the state and its non-interference. I don’t really have a good formula, since merely opening a platform and freeing it from the pervasiveness of Big Tech will not make it a platform that is pleasant for people to participate in or express their opinions on; any fanatical group can still attack them there. Hence, there is still a core issue of governance.


Q. What do you think is the role of other constituencies like activists, civil society, and academia, and what can they contribute to this space?

Stefano: Civil society has quite a significant role in such projects. I’m involved in a project about DNA data ownership where not only concerns of data privacy arise, but also questions about the use of DNA data in research. On one hand, you have privacy mechanisms that allow the owner to decide exactly what shall happen with the data; on the other, you want to make it available for the societal good. This is known as the concept of ‘Data Commons’, and there is quite some understanding of it within civil society organizations.

In this project, we came in contact with an organization for patients named Mijn Data Onze Gezondheid (My Data Our Health), who had the necessary sensitivity and awareness of the importance of autonomy with respect to data. It would be important that these different initiatives are brought together somehow, pushing them to step out of their niche areas, in order to bridge multiple gaps. This will also allow spaces for forming and refining views, pushing towards more momentum for each of these organizations. For example, this organization for patients can bring in more organizations that can enable a more medically refined point of view of the patients and their data usage, while organizations like Waag can supplement the technological and policy aspects of the initiative. Hence, it is almost essential for civil society and technical organizations to get together, form alliances, and push the discussion further.