Governing the Future: The need for standards for digital labour platforms

June 4, 2019

In recent years, digitization has transformed our lives in significant ways – from the way we shop, to how we interact with others, to our working habits. The fact that many of these often transactional activities are now facilitated via digital platforms represents a critical shift in our economy and society.

But while these digital platforms have certainly increased convenience and efficiency in many parts of our lives, the largely unexamined ways in which they are structured and governed – or often not governed – are increasingly raising serious and novel concerns. In particular, digital giants – such as Google, Amazon, Facebook, Apple and Uber – now wield enormous, arguably monopolistic, powers within important sectors of the economy.

Many fear that these companies are too big to be regulated by traditional national governments. But one major reason for this concentration of power is that these firms have been allowed to construct and use digital platforms in ways that put their business interests ahead of the interests of other stakeholders.

The Mowat Centre recently completed a project exploring where standards-based solutions could play a larger role in addressing the challenges raised by the digitization of our economy and society. The resulting report focused on three key areas – data governance, AI and algorithms, and digital platforms. This TLDR highlights some of the key insights from our research on digital platforms.

Key challenges

In many sectors of the economy, digital platforms have reduced transaction costs, facilitated connections, and increased flexibility and opportunity. At the same time, they have often led to decreased competition and increased concentration of power in the hands of a few global players. Worryingly, some have also facilitated the exploitation of digital platform workers.

Some of the reasons why they have produced these negative consequences include:

  • Network effects
    The term “network effects” refers to the phenomenon by which a product or service gains additional value or utility as its usage increases. For example, social network users want to be where their friends already are, and buyers and sellers want to be where the most sellers and buyers already are. Once a particular network attracts a large group of users (often by providing initial short-term incentives) and becomes the dominant player with the most users, switching to other platforms becomes unattractive. These high “switching costs”, and the “stickiness” they produce, have resulted in the concentration of market power for many digital platforms, making it difficult for new entrants to establish themselves and compete on an equal footing.
  • Platform architecture and business models
    The architectures and business models of some digital platforms play an active role in promoting power imbalances. Platforms often deliberately create important information and power asymmetries by setting the terms of interaction for users of the platform. This can include the setting of wages, controlling dispute resolution mechanisms, deciding what information is shared with whom, and collecting user data to influence behaviour and interactions. For instance, on many crowdworking or freelancing labour platforms, as well as ride-sourcing and short-term accommodation platforms, users depend on their work history or reputation data – sometimes for their livelihoods. Since such data is usually not portable, i.e. users cannot transfer it between platforms, switching to a new platform means building one’s reputation from scratch – a major disincentive for many users.
  • Terms of Service (ToS)
    Another way in which platforms gain leverage over users is through their Terms of Service agreements. ToS are often designed to push users to accept them without reading or understanding what they are agreeing to. Not only are these agreements far too long (in many cases, over 10,000 words) and legally complex for the average user to read or comprehend, they also often include clauses which allow platforms to unilaterally change their contents without notice. They are also one-sided – the only choice available to users is to agree or to not use the service – no negotiation is permitted. This is especially problematic in the context of digital labour platforms where workers’ access to the platform may be essential to their livelihoods.
  • Transnational nature
    Many digital platforms operate on a global level. For example, crowdworking platforms enable workers to work remotely from anywhere, and on-demand location-based app companies are not necessarily based in the jurisdictions where they operate. This means that it can be difficult to determine which jurisdiction’s laws should apply in the case of a dispute. Additionally, the geographically dispersed and impersonal nature of the work makes it difficult for workers to organize collectively. In the case of crowdwork, where many workers from across the world compete for the same jobs, solidarity becomes even more difficult as workers come from diverse countries and backgrounds with starkly different economic conditions, such as minimum wage and labour laws. This often results in workers perceiving each other as competitors instead of colleagues, and feeling powerless in negotiating wages because they fear being easily replaced.
  • Regulatory arbitrage
    Platform operators play a key role in governing interactions on their platforms, but they resist being classified as “employers.” Rather, they argue that the users who work on their platforms are “independent contractors” or “self-employed.” This enables the platforms to claim that many labour laws are simply not applicable to digital platform workers. Consequently, these non-standard workers are often placed in incredibly precarious situations. They rarely receive employment protections or benefits from platforms or clients, and their right to collective bargaining is often not recognized. Digital workers are also often paid less than standard workers.

Current governance landscape

There are few existing rule instruments (i.e. laws, regulations or standards) that seek to respond to the specific structural issues related to digital platforms. While legislation such as the European Union’s General Data Protection Regulation (GDPR) has focused on aspects like privacy and consent, standards for platform design are sorely lacking.

Nevertheless, some less formal initiatives are worth highlighting. Many of these are aimed at helping users make informed decisions online, for example, by providing evaluations of various ToS agreements. Creative Commons’ “human readable” versions of its licenses, the “Terms of Service; Didn’t Read” (ToS;DR) project, and watchdog project FairCrowdWork.org’s assessments of the ToS of crowdworking platforms are all examples of initiatives that seek to provide easy-to-read summaries of, or ratings for, the legal terms that govern major websites and Internet services.

In terms of labour laws, traditional rule instruments are largely inadequate to respond to the novel challenges presented by digital labour platforms, and so far no new laws have been adopted to promote decent digital work. That said, the International Labour Organization (ILO), which has overseen the system of international labour standards since 1919, recently launched a “Future of Work initiative” designed to respond to the changing nature of work and advance social justice.

Some other initiatives worth noting include a “Crowdsourcing Code of Conduct” that was developed in 2015 by the German software testing platform, Testbirds. This code sets out principles such as fair payment, only serious tasks, and open and transparent communication. An Ombuds office was established in 2017 to enforce the code and resolve disputes between workers and platforms. Similarly, the UNI Global Union developed “10 Principles for Workers’ Data Rights and Privacy” to protect workers’ rights in an increasingly digitized employment landscape.

Closer to home, the Standards Council of Canada and the CSA Group facilitated a 2017 workshop that resulted in an ISO International Workshop Agreement (IWA 27:2017) entitled “Guiding principles and framework for the sharing economy.” This framework includes high-level principles that can act as a foundation for the governance of the highly digitized sharing economy.

What role can standards play?

The current lack of rule instruments designed to address the issues raised by platforms’ designs and structures suggests that there is room for innovative solutions. In our report, we identify opportunities for standards to be used to make ToS agreements more user-friendly, and to reduce power imbalances between platforms and users. We also suggest developing standards for platform design and architecture to enhance transparency and reduce information asymmetry.

Digital platform workers’ work history and reputation data provide an example of what greater transparency in platform design might look like. By creating industry-wide standards for platforms’ peer-review and ratings systems, standards development organizations (SDOs) could make this data comparable and enable its transfer across platforms. Doing so could give individuals greater control over their data and increase competition between platforms. For example, ride-sourcing drivers currently cannot transfer their five-star ratings and use them on other platforms. A common standard for such rating information could neutralize an important potential excuse for resisting data portability and, in so doing, promote worker mobility and competition on prices and wages between platforms.
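
As a purely hypothetical illustration of what such a common standard could specify, the sketch below defines a portable rating record and a simple normalization step that would let a receiving platform compare scores issued on different scales. The PortableRating type and all of its field names are assumptions made up for this example; they are not drawn from IWA 27:2017 or any existing standard.

```typescript
// Hypothetical sketch of a portable, standardized reputation record.
// Field names and structure are illustrative assumptions only.
interface PortableRating {
  workerId: string;                     // pseudonymous identifier controlled by the worker
  issuingPlatform: string;              // platform that issued the rating
  scale: { min: number; max: number };  // rating scale the platform uses, e.g. 1-5 stars
  score: number;                        // average rating on that scale
  ratingCount: number;                  // number of reviews behind the score
  issuedAt: string;                     // ISO 8601 timestamp, for auditability
}

// Normalizing scores to a common 0-1 range is one simple way a receiving
// platform could compare ratings issued on different scales.
function normalizeScore(rating: PortableRating): number {
  return (rating.score - rating.scale.min) / (rating.scale.max - rating.scale.min);
}

// Example: a driver's 4.8-star rating imported from a hypothetical platform.
const imported: PortableRating = {
  workerId: "worker-123",
  issuingPlatform: "example-rides",
  scale: { min: 1, max: 5 },
  score: 4.8,
  ratingCount: 1250,
  issuedAt: "2019-06-04T00:00:00Z",
};

console.log(normalizeScore(imported)); // ≈ 0.95 on the common 0-1 scale
```

The specific fields matter less than the underlying point: a shared schema and a declared scale are what make reputation data comparable, and therefore portable, across platforms.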

This TLDR is the final post in a three-part series examining the potential for standards-based solutions to play a positive role in governing the digital economy. Other posts examine opportunities for standards in the contexts of data governance and artificial intelligence (AI) and algorithms. All three posts draw on research conducted for a project commissioned by the CSA Group, which produced a report entitled The Digital Age: Exploring the Role of Standards for Data Governance, Artificial Intelligence and Emerging Platforms.

 

Authors

Kiran Alwani
Michael Crawford Urban

Release Date

June 4, 2019
