
Algorithmic Discriminations and New Forms of Protections: An Analysis of the Italian Case

Struttura Mercato del Lavoro, Istituto Nazionale per l’Analisi delle Politiche Pubbliche (INAPP), Corso d’Italia 33, 00198 Roma, Italy
Author to whom correspondence should be addressed.
Soc. Sci. 2023, 12(9), 503
Original submission received: 27 July 2023 / Revised: 23 August 2023 / Accepted: 2 September 2023 / Published: 7 September 2023
(This article belongs to the Special Issue Gender Gaps in Digital Labour Platforms)


This research investigates how to protect workers from discrimination dictated by an algorithm in their contractual conditions. Article 15 of the Italian Workers’ Statute declares invalid any agreement or act aimed at dismissing a worker, discriminating against them in the assignment of qualifications or tasks, transfers, or disciplinary measures, or otherwise prejudicing them because of their union affiliation or activity, or their participation in a strike. These provisions also apply to pacts or acts of discrimination on political, religious, racial, linguistic, sex, disability, age, sexual orientation, or belief grounds. Our work explores the risk of gender or age discrimination in the contractual terms of platform workers in Italy. How can workers’ protections be preserved when decisions are made by an algorithm? The research follows a multidisciplinary methodology. We first analyze both national and international literature and jurisprudence. Then, by means of probit models on INAPP PLUS 2021 survey data, we analyze contract characteristics, in particular the written form of the contract and the hourly minimum wage. Controlling for individual and job characteristics, we find evidence of discrimination according to the gender and age of workers. We conclude with policy recommendations.

1. Introduction

This paper, by means of a multidisciplinary methodology, aims to investigate the presence of gender and age discrimination in the contractual conditions of platform workers in Italy through the analysis of data, and to understand, by analyzing the jurisprudence, how to preserve workers’ protections when decisions are made by an algorithm. The labor market must deal with the risk of new and unique forms of discrimination dictated by IT systems that are opaque rather than transparent and therefore difficult to decipher, let alone to counteract. In fact, considering the important developments of the past few years in information and communication technologies, algorithmic discrimination is currently global and pervasive in scope (Giorgini Pigniatello 2021). The issue of discrimination dictated by algorithms is gaining attention, both in terms of access to the labor market and in terms of contractual conditions. Both forms of discrimination perpetrated in the labor market are well known and tend to harm the most vulnerable groups, such as women, young people, and people of foreign origin (Inapp 2021; Troisi 2012; Van Wolleghem et al. 2022).
The Employment Outlook (OECD 2023) examines emerging cases of the impact of artificial intelligence on the labor market, highlighting the level of uncertainty surrounding the current and especially future impact of AI on the labor market, and the most appropriate policy actions to promote the legitimate use of AI.
According to the OECD report, AI appears to differ from previous digital technological changes in several ways: (I) it significantly expands the range of tasks that can be automated beyond routine non-cognitive tasks; (II) AI is a general-purpose technology, which means that almost all industries and occupations will be affected; and (III) its rate of development is unprecedented.
For this reason, AI is able to take over repetitive and simpler work tasks, reducing the space left for the corresponding jobs.
Ensuring trustworthy AI in the workplace can be challenging, because the technology entails risks, notably for human rights (e.g., privacy, discrimination, and labor rights), job quality, transparency, explainability, and accountability. Moreover, it is important to identify possible risks that currently do not manifest themselves, but which may appear in the near future when new AI systems are developed or applied in new contexts (Salvi Del Piero et al. 2022).
An important initiative at European level took place on 14 June 2023 when the European Parliament approved the Artificial Intelligence Act, which will regulate artificial intelligence while respecting the rights and values of the European Union.
Currently, workforce analytics systems play a leading role, processing people’s profiles by aggregating information that goes beyond what is normally examined in human resources, such as education and work experience. These systems provide companies and organizations with datasets containing information on the behavior of employees in their private lives, outside the places and times of work. In this way, and through data mining software, companies can use this information, collected without the user’s consent, to make inferences related to health or to verify, for example, whether an employee is pregnant. These data can be used in recruitment processes, to select the candidates most suitable for a given profile, but also to manage work shifts, steering which activities should be performed by a given individual. Among the “social pathologies” of the platforms there is, therefore, a risk of discrimination facilitated by the functioning of the algorithm (Giorgini Pigniatello 2021). Algorithms, in fact, if fed with imprecise, partial, or non-representative data about the phenomenon to which they are applied, can produce unclear and distorted results and lead to various forms of discrimination.
Discrimination has always been the subject of legal protection. The first acknowledgment at the international level of the principle of non-discrimination is found in the Charter of the United Nations, the agreement which established the UN, which was signed in San Francisco on June 26, 1945. Later, the ILO Convention n. 111 of 1958, at Art. 1, affirms that the term discrimination includes “any distinction, exclusion or preference made on the basis of race, color, sex, religion, political opinion, national extraction or social origin, which has the effect of nullifying or impairing equality of opportunity or treatment in employment or occupation”. In the European context, some of these principles can be found in the Charter of Nice of 2000, as well as in some articles of the Treaty on the Functioning of the European Union (TFEU) of 2008. In implementing Art. 19 TFEU, some Directives were adopted, specifically Directive 2000/78/EC, the Framework Directive, concerning the protection against discrimination in the context of labor law, which, in Art. 1, states that its purpose is “to lay down a general framework for combating discrimination on the grounds of religion or belief, disability, age or sexual orientation as regards employment and occupation, with a view to putting into effect in the Member States the principle of equal treatment”.
In Italy, the recognition of the principles of equality and non-discrimination dates first to the entry into force of the Constitution of 1948, in Articles 2 and 3. Echoing the constitutional rules, the prohibition of discrimination has been recalled, specified, and broadened in numerous other rules. First, we have the Workers’ Statute, which in Art. 1 refers to religious opinions and personal beliefs; Articles 15 and 16 of the same law refer to protection against discriminatory treatment in hiring, dismissals, demotions, and professional assignments. In fact, Art. 15(b) of the Italian Workers’ Statute declares invalid any agreement or act intended to dismiss a worker, discriminate against him/her in the assignment of titles or roles, transfers, or disciplinary actions, or otherwise prejudice him/her by virtue of his/her union affiliation or activities, or his/her participation in a strike. These provisions also apply to agreements or acts of discrimination based on political, religious, racial, linguistic, gender, disability, age, sexual-orientation, or personal-conviction grounds. Then, Art. 10 of Legislative Decree 276/2003 sets forth the prohibition for authorized or accredited employment agencies and other public and private entities “to carry out any investigation or in any case process data, in order to preselect workers, also with their consent (…), on the basis of religious beliefs”.
A more recent amendment was made with Law 162/2021, published in the Official Gazette on 18 November 2021, in Art. 2, which amended the concept of direct and indirect discrimination in the workplace, thereby modifying Art. 25 of Legislative Decree 198/2006.
This is also confirmed by the caselaw which has begun to look at the issue, finding forms of direct and indirect algorithmic discrimination which perpetuate or even amplify already existing discrimination in the labor market (Perulli 2021).
Existing (non-AI specific) legislation—for instance on discrimination, workers’ rights to organize, or product liability—is an important foundation for regulating workplace AI.
For instance, all OECD member countries have in place laws that aim to protect data and privacy. In some countries, such as Italy, existing anti-discrimination legislation has been successfully applied in Court cases related to the use of AI in the workplace (OECD 2023).
The first decision fully addressing this issue, and one of particular interest, is that of the Court of Bologna, which found indirect discrimination in the conditions of access to the reservation of work shifts through a digital platform, conditions based on statistics related to the “participation and reliability” of workers during peak workloads according to a ranking mechanism. The Court ordered the defendant company to remove the effects of the conduct and to pay the legal costs (Barbera 2021).
Recently, the European Parliament and the Council of the European Union approved the text of a Directive “to strengthen the application of the principle of equal pay for men and women for equal work, or work of equal value” (on the gender pay gap). Among other things, it introduces the principle of pay transparency. The Directive (which, it should be remembered, must be transposed by the individual Member States within two years of its entry into force) provides that companies must provide information on wages to workers who request it, with those workers bound by confidentiality with respect to the information obtained.
According to Inapp-Plus 2021 data, only a small number of platform workers have written employment contracts, in addition to the evidence of inequality between men and women and across age groups, in blatant contrast with Law n. 128 of 2 November 2019 (which converted into law, with amendments, Law Decree n. 101 of 3 September 2019), regarding the food delivery app industry, which requires written contracts for workers in this sector (De Minicis et al. 2020).
The law introduces twin-track regulation: on one side, there is a sectoral framework applying exclusively to autonomous riders, while, on the other, there is an extension of the scope of employment protection regulations, to also include written contracts and the prohibition of piecework pay, through an extension of the concept of hetero-organization (Voza 2017). The latter amendment thus extends the employment regulations to all platform workers, not only food delivery workers, whenever the employment relationship qualifies as hetero-organized.
In addition, the law intervenes only in a limited manner with respect to issues of discrimination, as its provisions deal with them only after the fact, when the discrimination has already occurred (Art. 47 (d/quinquies)). Moreover, no new and adequate safeguards are identified; rather, the existing worker protections against discrimination are generically extended to autonomous platform workers.
Considering the above, the law does not appear particularly effective, at least with respect to written contracts and, as we will demonstrate in the analysis, with respect to the lack of a minimum hourly remuneration, above all for women and young workers. The minimum wage in Italy is not provided by any law, but by several collective agreements. Directive (EU) 2022/2041 of 19 October 2022 on adequate minimum wages was published on 25 October 2022. The objective of the directive is to promote and create favorable conditions in order to guarantee workers in the Member States an adequate minimum wage, which can be ensured by collective agreement or by law. To this end, procedures are established for the adequacy of legal minimum wages, for the promotion of collective bargaining on wage setting, and for improving the effectiveness of the application of minimum wages, regardless of whether they are set legally or contractually.
The main aim of our paper is to understand whether the differences found in the contractual conditions of platform workers, by gender and age, can be defined as discriminatory. To do so, we run a set of probit models and control for the different types of activities performed and for the importance that those activities may have for workers, namely whether the platform job represents their main activity, as well as for individual and location characteristics. We find that, despite our controls, women and younger people working for platforms are discriminated against both with respect to the presence of a written contract and with respect to a minimum hourly wage.
In what follows, we provide a broad contextualization of the topic of algorithmic discrimination through a review of both the literature and the main case law dealing with it (Section 2); we then present our data, methods, and descriptive statistics (Section 3) and the empirical results (Section 4); we then briefly discuss the results (Section 5) and, finally, we conclude with some policy recommendations (Section 6).

2. Context

2.1. Literature Review

The digital economy and digital labor platforms are now a matter of fact and involve both consumers and employers. Digital labor platforms open new markets and offer new earning opportunities, even for those who are out of the labor market. This new actor entered the scene forcefully during the 2020 lockdown and is modifying not only firms’ organizational patterns and processes, but also the employment relations between workers and employers (Donini 2015). The pandemic has arguably accelerated this transition with the increased diffusion of digital platforms and related technologies, but also of big data and algorithms, with positive effects on new forms of earnings, above all for those who have lost their jobs (ILO 2021). Nevertheless, this new model lets firms organize their activities without investing in capital goods or hiring employees. Digital platforms indeed play the role of an “intermediary” between workers and customers and manage the whole process by means of algorithms. Advances in data and computing power, and the way in which employers apply these tools, can improve workplace fairness or, on the contrary, aggravate inequalities in the labor market (Kim 2017).
A central theme in dealing with labor platforms regards working conditions and salaries. Workers’ exposure to in-work poverty has been proven, along with a lack of involvement in collective bargaining and the consequent exclusion from the social protection system (ILO 2021). Following these arguments, the work organized and directed by digital platforms represents a continuation, and indeed a new form, of the well-known process of fragmentation and “fissuring” of the labor market (Guarascio et al. 2021). As is well known, non-standard work is widespread, and starting from the beginning of the 1990s, an extensive debate has developed about how to reshape the regulatory system of work. This debate seems to be mainly concentrated on how to modify single aspects of labor legislation concerning the quality of work and the conditions of non-standard workers. However, the regulation of fundamental labor rights (especially for casual workers and platform workers), such as those related to collective bargaining, does not seem to have received the same attention (De Stefano and Aloisi 2019).
In this context, the crucial questions of precariousness and job insecurity (also in economic terms) are put forth again and pushed to the limit. In labor platforms, workers are considered “a scalable labor force on demand” (Inapp 2019), far away from the idea of a new peer-to-peer collaborative economy allowing young generations to find “easy” earnings through an autonomous activity, without constraints and specific commitments (Codagnone et al. 2016; Inapp 2022b).
Digital labor platforms, theoretically and potentially, can have positive effects on productivity and on the labor market: “They can increase the pool of employers and workers by removing barriers and reducing transaction costs, improving matching, increasing human capital specialization, with potential net welfare effects such as more efficient labor markets and increased employment” (Codagnone et al. 2016). Nevertheless, a (re)distributive impact is far from evident, especially if we consider contractual arrangements, salaries, and hiring efficiency. In particular, some scholars (e.g., Pallais 2014; Stanton and Thomas 2014) point out that processes and choices, as well as the use of ratings in job recruitment, can produce matching frictions, and that entry-level hiring can be inefficient, even for the platform itself.
In terms of composition, some analyses show that, in general, platform workers are younger and better educated. Furthermore, the incidence of non-standard work is recurrent (Guarascio et al. 2021), and evidence shows that platform workers do not work for “pin money”: for a large share of workers, this source of income is the primary one, or an important one among others (Inapp 2022b; Codagnone et al. 2016). Concerning the shares of women and men, women seem to be slightly more represented, though this depends on the type of platform (online labor markets or mobile labor markets) and on the country (Codagnone et al. 2016).
The existence of different forms of gender and ethnic discrimination has been proven, even regarding contract types (Uhlmann and Silberzahn 2014), based on hiring mechanisms, but also on “reputations” and heuristics, and perhaps facilitated by gaps in the regulatory system.
In this framework, algorithms play a key role in hiring, managing, and making decisions. Even if they are created to limit human discretion in decisions, there is a high risk of introducing other forms of bias. One risk is certainly “intentional discrimination” (e.g., by race or sex), legitimated by business reasons and masked by algorithms (Kim 2017). This means that, despite their theoretical neutrality, platforms now (and perhaps from their birth) have an active role in managing content and making decisions. Several scholars report cases of gender discrimination perpetrated by algorithms, particularly in analyses of working life, and data show discrimination in hiring (e.g., Amazon) or in rating (e.g., TaskRabbit), which determines unequal treatment in relation to gender or ethnic groups (Orwat 2020; Curran 2023).

2.2. The Main Caselaw Dealing with Algorithmic Discrimination

The first Italian court decision, and probably the first European decision, regarding algorithmic discrimination was the order of the Court of First Instance of Bologna of 31 December 2020, which decided the case brought by the labor unions Filcams Cgil Bologna, Nidil Cgil Bologna, and Filt Cgil Bologna against the company Deliveroo. The issue raised in the unions’ petition concerned the neutrality of the conduct of digital platforms in relation to possible discrimination in violation of the labor laws. More specifically, as stated above, Filcams Cgil, Nidil Cgil, and Filt Cgil brought the action against Deliveroo holding that Deliveroo’s digital platform, in its evaluation of reputational rankings, could penalize some riders, resulting in the reduction or cessation of the calls to work received by some of these individuals. In this sense, the case focused on those food delivery riders connected to the Deliveroo platform who, in explaining the reasons for their abstentions from work, stated that they had invoked their right to strike. The discovery phase was, however, hampered by a lack of proof regarding the “concrete mechanisms of algorithmic functions which elaborate the riders’ statistics”, which consequently “precludes at the root a more in-depth examination of the issue” (Barbera 2021). Specifically, the Court of Bologna was asked whether the self-service booking (SSB) system that the Deliveroo digital platform used in Italy was discriminatory in nature, at least as operated up to a few months before the order.
The SSB system was governed by the individual contract signed by the rider with the company. This mechanism consisted of a “flexible service of self-service reservation (…) which may be freely used to log into and book sessions for which the rider wants to receive proposals for service. Reservation through the SSB tool is wholly optional, but if used and confirmed, the rider will be guaranteed access to receive shift proposals for the sessions booked. The availability during the booked shift, if not cancelled by the rider in advance and activity during heavy traffic times may constitute elements of preference for the reservation of later sessions” (Barbera 2021). This SSB system allowed riders, every Monday, to access the calendar of the following week and reserve working sessions, i.e., work shifts, in which they wanted to receive proposals for service. Through SSB, the riders were able to choose the available slots and the areas where they would perform their work. Riders were granted access to the reservation calendar at three different times on Mondays, based on their score: beginning at 11:00, 15:00, or 17:00. The ability to access the SSB reservation system at one of these three times depended on two indices: reliability and participation during peak times. These indices were built from several elements assigned values according to the demonstrated willingness to work, such as whether riders logged in at a set time and worked during certain peak/requested hours. Afterwards, they were recalculated by the platform through computational procedures which the discovery phase was unable to shed light on.
The result of this calculation gave a score to each rider, the so-called reputational ranking from which it was deduced that, beginning at 11:00, 15% of the riders with the best statistics could access the calendar, while beginning at 15:00, 25% of those with progressively lower statistics had access to the calendar, and at 17:00, all the other riders could access the calendar. Of course, those who accessed the 11:00 Monday session would have their choice among all the free sessions of the following week and take all the shifts they pleased, leaving little choice to the riders who accessed the later sessions.
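The tiered access rule described above can be sketched in a few lines of code. This is a hypothetical illustration of the mechanism as reported in the order, not Deliveroo’s actual implementation; the function name, rider identifiers, and scores are invented for the example.

```python
# Hypothetical sketch of the SSB tiered-access rule: the top 15% of
# riders by reputational score book at 11:00, the next 25% at 15:00,
# and all remaining riders at 17:00 on Mondays.
def booking_slot(scores):
    """Map each rider to a Monday booking time based on reputational ranking."""
    ranked = sorted(scores, key=scores.get, reverse=True)  # best score first
    n = len(ranked)
    cut1 = round(n * 0.15)          # riders with the best 15% of statistics
    cut2 = cut1 + round(n * 0.25)   # the next 25%
    slots = {}
    for i, rider in enumerate(ranked):
        if i < cut1:
            slots[rider] = "11:00"
        elif i < cut2:
            slots[rider] = "15:00"
        else:
            slots[rider] = "17:00"
    return slots
```

With 20 riders, this rule grants 3 riders access at 11:00, 5 more at 15:00, and the remaining 12 at 17:00; a rider whose score drops, for instance after missing a reserved session, slides toward the later and less favorable tiers, which is exactly the mechanism the unions challenged.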
In the Order under examination, the labor unions asked the Court of Bologna to assess whether this reservation system was discriminatory in nature with reference to possible requests for work, by virtue of the fact that these conditions prevented “the legitimate actions of riders to participate in labor union strike initiatives because it inexorably exposes them to the loss of chances for future employment, marginalizing them in the selection of shifts”.
More specifically, the Court of Bologna took the position that “it appears proven that adhesion to an initiative of collective work stoppages is capable of affecting the riders’ statistics, since the rider who adheres to a strike, thereby abstaining from working activities, which consists in behavior definable as a strike, and thereby not participating in a reserved session, will inevitably result in a reduction of its score under the profile of reliability, and possibly also of participation when the reserved session takes place between the weekend hours of 20:00–22:00”. The result is that, where the food delivery rider decides to adhere to a strike without cancelling the reservation within the contractually agreed time period (at least 24 h before the beginning of the reserved shift), he or she suffers discriminatory treatment, since the statistics and reputational rankings will worsen. This would also occur where the rider decides to avoid the prejudicial effects arising from adhesion to the strike by cancelling the reserved session in advance, because this would allow the platform to substitute the food delivery rider with another, thereby annulling the effects of the strike. According to the Court, there would be a sort of “unconsciousness” or “blindness” of Deliveroo’s platform, as it would be unable to distinguish between trivial and important reasons, such as a strike, injury, illness, or maternity leave.
Other examples of indirect discrimination in the workplace on digital platforms include Uber, where workers’ pay is calculated by an algorithm which determines the amount using criteria such as the availability rate and client evaluations. Both criteria can be subject to distortions due to stereotypes which create inequality. In this example, the widespread prejudice about the unreliability of women drivers influenced client evaluations of their driving to a certain degree and thus resulted in disadvantages in the calculation of their pay.
Another example regards discrimination in the selection phase. The algorithm used by Amazon examined and classified candidates using a set of training data which compared them with the profiles of employees hired in the past. Although gender and ethnicity were not among the variables included in the model, the algorithm automatically correlated these characteristics with the performance of the candidates, thereby effectively discriminating against women. In conclusion, the data reflect the social discrimination present in the labor market and amplify it, thus affecting many individuals.

3. Materials and Methods

3.1. Data

This contribution draws on data from Inapp-Plus 2021. The Inapp-Plus survey has been part of the National Statistical Plan since 2005 and contributes to the production of Official Statistics in Italy. The survey includes only direct respondents (i.e., it does not include proxy respondents, that is, people answering for other family members), which improves the accuracy of detailed and self-perception questions. Inapp-Plus is a sample CATI (computer-assisted telephone interview) survey, representative of the entire national territory. The ninth round of the survey was conducted between March and July 2021 on a sample of over 45,000 individuals aged 18–74 years.
The Inapp-Plus questionnaire contains a specific section devoted to the gig economy, aiming to collect information about the general use of platforms as an instrument for earning money. This section is divided into three subsections: one concerns online product-selling activities; the second is dedicated to labor platforms; the last concerns capital platforms.
Therefore, referring to this section, according to Inapp-Plus data, it is possible to know the number of people claiming they earn money through digital platforms. Specifically, it is worth distinguishing between those who use platforms to sell products (advertising platforms), those who use them to rent property (product platforms or capital platforms), and those who use them to work (lean platforms or platform work). In Italy, the number of people aged between 18 and 74 who declared to have earned an income between 2020 and 2021 using platforms is 2,228,427 (5.2% of the total population). Furthermore, Inapp-Plus estimates that there are 570,521 platform workers, representing a share of 25.6% of the total number of people who earn money through the Internet and 1.3% of the total population aged 18–74 (Inapp 2022a, 2022b).
According to the Inapp-Plus survey results, platform workers are mainly men, and seven out of ten are between 30 and 49 years old.
Looking at the employment status of platform workers, the data show that around 48% of them consider the work for the platform their main activity. For 24%, platform work is a secondary activity. Finally, a share of 28% carries out occasional work activities and earns some money through digital platforms while being “formally” unemployed or inactive. However, according to their own answers, for 68% of respondents the platform work is their main activity. Around seven workers out of ten claim to have an employment contract, especially when their activity for the platform is considered their main one.
According to Inapp-Plus estimates, the overall distribution of platform workers by educational qualification is in line with that of the general population, except for a higher presence of graduates. Some differences can be seen among the three groups of platform workers: while people working on platforms as their main activity show higher levels of education, casual workers are concentrated in the lowest educational qualifications. The activities carried out by platform workers are quite heterogeneous: delivery of packages or meals, domestic work such as cleaning, driving people, and carrying out online tasks (translations, computer programs, image recognition). Indeed, the picture of platform work goes beyond the representation of platform workers as riders.
Applying the ILO classification, which draws a distinction between web-based platforms, where micro-tasks are carried out without location constraints, and location-based platforms, where the assigned tasks are carried out in a specific location, it emerges that 35% of the respondents are involved in activities performed online (Inapp 2022b). This shows the emergence, also in Italy, of completely online platform work, which is less visible and traceable but has reached a share comparable to that of riders.

3.2. Methods, Stylized Facts, and Descriptive Statistics

In what follows, we present some evidence of the differences by gender and age in the main variables of interest in our analysis, namely, the form of contract (written or not), the presence of a minimum hourly wage, the management of the account, and finally the evaluation process.
Women and young people show the lowest quota of workers with a written contract and a minimum hourly wage (Table 1).
Concentrating the analysis on the form of the contract, we found that the management of the account partially explains the presence of unwritten contracts. In fact, unwritten contracts are mainly present where the account is not directly managed by workers. We also found that this explanation is strongly correlated with the gender of workers. Thus, looking at women, only 6% of written contracts relate to activities performed without direct control of the account, while the same share for unwritten contracts is 52% (Figure 1).
Finally, we present some stylized facts about the evaluation process. Figure 2 shows that women are more often evaluated according to subjective criteria, such as customer ratings (46.6% of women, against 37.9% of men). In contrast, men are more often evaluated according to objective criteria, such as the number of tasks completed (55.8% of men and 47.2% of women). It is important to stress that subjective criteria, such as customer ratings, may themselves embed bias.
The effects that the evaluation can produce are, however, similar for men and women (Figure 3), apart from a higher share of women facing non-payment of the service after a bad evaluation (8.6% of women against 3% of men) and a higher share of men facing a worsening of their timetables (27.4% of men against 17.9% of women).
To detect whether the gender and age differences that we observed in the contractual conditions are instead explained by the kind of activities performed, the geographical area of residence of workers, or other individual characteristics, we ran several probit models. Equation (1) shows our basic model. The dependent variables are two dummies: the first takes value “1” if the worker has a written contract and zero otherwise; the second takes value “1” if the worker has a minimum hourly wage and zero otherwise.
Written contract/Minimum hourly wage = gender + age + education + geographical area of residence + type of activity + direct management of the account + main activity (1)
The main predictors are the worker’s gender and age; the control variables are the educational level, the macro geographical area of residence, the type of activity performed, whether the worker directly controls the working account, and, finally, whether the platform activity is the respondent’s main one.
With the aim of detecting gender differences, and following the principle that gender differences are not adequately captured by simple additive dummy variables (D’Ippoliti 2011), we ran our models on the total population and separately on the subsets of women and men.
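As a rough illustration of this estimation strategy, the probit specification in Equation (1) and its average marginal effects could be sketched as follows with statsmodels. All variable names and data below are hypothetical stand-ins, not the actual INAPP-PLUS variables; the gender dummy is dropped when estimating on a single-gender subset, where it would be constant.

```python
# Sketch of the probit model in Equation (1): full sample, then female subset.
# Variable names and data are illustrative only (not the INAPP-PLUS survey).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def probit_ame(df, outcome, with_gender=True):
    """Fit a probit model and return average marginal effects as a DataFrame."""
    terms = ["C(age_class)", "C(education)", "C(area)",
             "C(activity)", "direct_account", "main_activity"]
    if with_gender:
        terms = ["woman"] + terms
    fit = smf.probit(f"{outcome} ~ " + " + ".join(terms), data=df).fit(disp=False)
    return fit.get_margeff(at="overall").summary_frame()

# Synthetic survey-like data
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "woman": rng.integers(0, 2, n),
    "age_class": rng.choice(["18-29", "30-49", "50plus"], n),
    "education": rng.choice(["low", "medium", "high"], n),
    "area": rng.choice(["northeast", "northwest", "center", "south"], n),
    "activity": rng.choice(["products", "meals", "online", "domestic"], n),
    "direct_account": rng.integers(0, 2, n),
    "main_activity": rng.integers(0, 2, n),
})
# Latent index loosely mimicking the signs reported in the paper
latent = (-0.45 * df["woman"] + 0.7 * df["direct_account"]
          + 0.8 * df["main_activity"] + 0.4 * (df["age_class"] == "30-49")
          + rng.normal(size=n))
df["written_contract"] = (latent > 0).astype(int)

ame_all = probit_ame(df, "written_contract")                   # full sample
ame_women = probit_ame(df[df["woman"] == 1], "written_contract",
                       with_gender=False)                      # female subset
print(ame_all.round(3))
```

The same function, called with the minimum-hourly-wage dummy as `outcome` and with the male subset, would reproduce the remaining columns of Table 3.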
In Table 2, we present all the variables involved in our analysis with main descriptive statistics.

4. Results

The average marginal effects of the probit models are shown in Table 3. We found that being a woman decreases the probability of having a written form of the contract by almost 16 percentage points (p.p.) and that of having a minimum hourly wage by 14 p.p.
Being in the middle age class (30–49) increases the probability of having both a written form of the contract (by 15 p.p.) and a minimum hourly wage (by 16 p.p.) compared to being in the youngest age class, in all model specifications. Being in the oldest age class (50 and above) also increases those probabilities if we look at the whole sample. However, for men, being in the oldest age class decreases the probability of having a written form of the contract by 7 p.p. compared to being among the youngest.
As expected, looking at the geographical area of residence, we find that residing in the southern regions of Italy decreases the probability of having a minimum hourly wage. As for the written form of the contract, however, residing in southern regions has a negative effect only for women. A higher education level increases the probability of having a written form of the contract but, surprisingly, especially for men, decreases the chances of having a minimum hourly wage.
According to our models, variables related to the characteristics of the activity performed also explain the contractual conditions. Compared to the delivery of products, delivering meals and carrying out online activities decrease the probability of having a written form of the contract, while domestic work increases the probability of having better contractual conditions in the whole sample. However, if we concentrate only on women, domestic work has a negative effect on both the probability of having a written contract and that of having a minimum hourly wage.
Directly managing the working account increases the probability of having good contractual conditions in all model specifications. However, for women, the size of the effect on the probability of having a written contract is much larger (46 p.p.) than for men (21.5 p.p.).
Finally, as expected, we find that, in all model specifications, the probability of having better conditions increases if the activity carried out through the platform is the worker’s main one.

5. Discussion

Despite controlling for several variables, the type of activity performed, the importance that the job has for the worker, individual characteristics, and the macro-area of residence, the results of our analysis confirm gender discrimination against women in the contractual conditions under analysis, in every specification of our models. Age discrimination against younger workers is also confirmed, but the results additionally show the disadvantage of the oldest group compared to the middle age class.
In running our models, we expected the control variables to capture the effect of gender, thus showing that contractual conditions are determined by objective factors, e.g., the type of activity performed or the importance that the activity has for the worker. Unfortunately, we find that being a woman is in itself associated with different, and worse, conditions.
The finding that directly managing the account increases the chances of having better contractual conditions, especially for women, is here interpreted as a sign of the absence of digital gangmastering. The gangmaster system is an illegal form of employment intermediation, most evident among migrant workers with very low wages in the agricultural sector, but it also characterizes digital labor platforms. Platforms are conceived as a direct relation between the worker and the “digital employer”, so an employment relation mediated by a third, non-authorized actor suggests something illicit. Furthermore, as happens in the agricultural sector, this could also mean that the intermediary withholds part of the worker’s earnings as a “right of intermediation” (Inapp 2022a, 2022b).
Finally, in addition to the multivariate analysis, we also found differences in the evaluation process of platform workers according to their gender: women tend to be evaluated according to subjective criteria, which may increase the risk of discrimination against them. However, further analyses are needed to better understand the reasons behind such differences.

6. Conclusions

The multivariate analysis presented here confirmed gender discrimination in the working conditions of platform workers. Moreover, we detected signs of a digital “caporalato” (digital gangmastering). However, Italian legislation is ill-equipped to combat algorithmic discrimination in practice.
In fact, despite the evidence stressed by scholars and the jurisprudence (cf. Section 2) about the concrete risk of algorithmic discrimination, there are few safeguards for workers. In the Italian legal system, law 128/2019 only provides guarantees for the food-delivery sector, and case law on algorithmic discrimination is limited. Digital workers find themselves in a situation of contractual weakness, and it is difficult for them to take legal action to have their rights protected.
Furthermore, the data provided to the algorithm are not neutral, and this could lead to discrimination on the grounds of sex, race, religion, sexual orientation, etc.
Therefore, the algorithm can amplify the discrimination that already characterizes the Italian labor market in particular. This may be because algorithms are conceived in a system that treats efficiency as a “dogma”, leaving little room for women’s needs in the labor market. In addition, the data entered are usually provided by men, which perpetuates the discrimination and prejudices of the labor market (Barbera 2021).
For these reasons, it is necessary to introduce greater human control over the data fed into algorithms and to avoid fully automated decisions (as also prescribed by art. 22 of the GDPR).
The definition of a European regulatory framework for the protection of a minimum wage is a rather sensitive issue for the Member States, above all for those (such as Italy) that do not have a minimum wage established by law (Casasso 2023). Indeed, the directive was not approved unanimously: Denmark and Sweden, two of the six European Union countries without a legal minimum wage, voted against it, and Hungary abstained.
On 30 March 2023, under the ordinary legislative procedure pursuant to art. 294 of the Treaty on the Functioning of the European Union (TFEU), a legislative resolution of the European Parliament was published on the proposal for a directive. This act is intended to strengthen the application of the principle of equal pay for men and women for equal work or work of equal value.
The Directive is also very important for the fight against discrimination, and we hope that it will also have a favorable impact for the fight against algorithmic discrimination.

Author Contributions

Conceptualization, M.D.A. and S.D.; methodology, M.D.A.; formal analysis, M.D.A.; data curation, F.B.; writing—original draft, M.D.A., S.D. and F.B.; writing—review and editing, M.D.A., S.D. and F.B.; supervision, M.D.A. All authors have read and agreed to the published version of the manuscript.


This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained at the beginning of every interview, in compliance with the GDPR.

Data Availability Statement

All the Inapp-Plus databases are available. Access to the Inapp-Plus datasets is free, and requests must follow the procedure explained here (accessed on 1 September 2023).

Conflicts of Interest

The authors declare no conflict of interest.


The approval process was completed on 4 October 2022. It started in January 2020 with a process of consultations involving the social partners at the European level (Porcheddu 2020), continued with the presentation by the European Commission, on 28 October 2020, of a proposal for a directive (Spattini 2020), and subsequently with negotiations between the Council and the European Parliament to reach agreement on a common position. It finally concluded with the adoption by the European Parliament on 14 September 2022 and the decision of the Council on 4 October 2022.


  1. Barbera, Marzia. 2021. Discriminazioni algoritmiche e forme di discriminazioni. Labour & Law Issues 7: 3–17. [Google Scholar]
  2. Casasso, Agnese. 2023. La Direttiva UE sulla parità retributiva tra uomini e donne: Una questione di trasparenza. Bollettino Adapt, 22 May 2023, n. 19, pp. 1–4. Available online: (accessed on 1 September 2023).
  3. Codagnone, Cristiano, Fabienne Abadie, and Federico Biagi. 2016. The Future of Work in the ‘Sharing Economy’. Market Efficiency and Equitable Opportunities or Unfair Precarisation. JRC Science for Policy Report EUR 27913 EN. Seville: Institute for Prospective Technological Studies. [Google Scholar] [CrossRef]
  4. Curran, Nathaniel Ming. 2023. Discrimination in the gig economy: The experiences of Black online English teachers. Language and Education 37: 171–85. [Google Scholar] [CrossRef]
  5. De Minicis, Massimo, Silvia Donà, and Manuel Marocco. 2020. Il lavoro online in Italia: Gig o Sharing economy? Prime evidenze empiriche da un’indagine Inapp. Sinappsi X: 125–45. [Google Scholar]
  6. De Stefano, Valerio, and Antonio Aloisi. 2019. Fundamental labour rights, platform work and human-rights protection of non-standard workers. In Labour, Business and Human Rights Law. Edited by Janice R. Bellace and Beryl ter Haar. Cheltenham: Edward Elgar Publishing, pp. 359–79. [Google Scholar]
  7. D’Ippoliti, Carlo. 2011. Economics and Diversity. Oxfordshire: Routledge. [Google Scholar]
  8. Donini, Annamaria. 2015. Il lavoro digitale su piattaforma. Labour & Law Issues 1: 51–71. [Google Scholar]
  9. Giorgini Pigniatello, Giacomo. 2021. Il contrasto alle discriminazioni algoritmiche: Dall’anarchia giuridica alle Digital Authorities. 16: 164–85. [Google Scholar]
  10. Guarascio, Dario, Valeria Cirillo, and Fenizia Verdecchia. 2021. I lavoratori delle piattaforme digitali in Europa: Un’analisi empirica. Sinappsi XI: 74–95. [Google Scholar] [CrossRef]
  11. Inapp. 2019. Gli internauti e i Lavoratori on line: Prime Evidenze da Inapp-Plus, Inapp Policy Brief 15, Inapp. Available online: (accessed on 1 September 2023).
  12. Inapp. 2021. Gender Policies Report. Roma: Inapp, December. [Google Scholar]
  13. Inapp, ed. 2022a. Rapporto Plus 2022. Comprendere la Complessità Del Lavoro. Roma: Inapp. Available online: (accessed on 1 September 2023).
  14. Inapp. 2022b. Lavoro Virtuale Nel Mondo Reale: I dati dell’indagine Inapp-PLUS sui Lavoratori Delle Piattaforme in Italia. Inapp Policy Brief 22. Available online: (accessed on 1 September 2023).
  15. International Labour Organization (ILO). 2021. World Employment and Social Outlook. The Role of Digital Labour Platforms in Transforming the World of Work. Geneva: ILO. [Google Scholar]
  16. Kim, Pauline T. 2017. Data-driven discrimination at work. William & Mary Law Review 58: 857–936. [Google Scholar]
  17. OECD, Employment Outlook. 2023. Artificial Intelligence and the Labour Market. July. Available online: (accessed on 1 September 2023).
  18. Orwat, Carsten. 2020. Risks of Discrimination through the Use of Algorithms. Berlin: Federal Anti-Discrimination Agency. [Google Scholar]
  19. Pallais, Amanda. 2014. Inefficient Hiring in Entry-Level Labor Markets. American Economic Review 104: 3565–99. [Google Scholar] [CrossRef]
  20. Perulli, Adalberto. 2021. La discriminazione algoritmica: Brevi note introduttive a margine dell’Ordinanza del Tribunale di Bologna. Lavoro Diritti Europa. n.1/2021. pp. 1–7. Available online: (accessed on 1 September 2023).
  21. Porcheddu, Diletta. 2020. La proposta di un salario minimo: Le possibili iniziative comunitarie e le posizioni delle parti sociali europee. Bollettino ADAPT, 21 September 2020, n. 34, pp. 1–10. Available online: (accessed on 1 September 2023).
  22. Salvi del Pero, Angelica, Peter Wyckoff, and Ann Vourc’h. 2022. Using Artificial Intelligence in the Workplace: What Are the Main Ethical Risks? OECD Social, Employment and Migration Working Paper, n. 273. Paris: OECD Publishing. [Google Scholar] [CrossRef]
  23. Spattini, Silvia. 2020. La proposta europea di salario minimo legale: Il punto di vista italiano e comparato. Bollettino Adapt, 21 September 2020, n. 34, pp. 1–10. Available online: (accessed on 1 September 2023).
  24. Stanton, Christopher, and Catherine Thomas. 2014. Landing the First Job: The Value of Intermediaries in Online Hiring. Available online: (accessed on 1 September 2023).
  25. Troisi, Claudia. 2012. Divieto di discriminazione e forme di tutela. Profili comparatistici. Torino: Giappichelli Editore. [Google Scholar]
  26. Uhlmann, Eric Luis, and Raphael Silberzahn. 2014. Conformity under uncertainty: Reliance on gender stereotypes in online hiring decisions. Behavioral and Brain Sciences 37: 103–4. [Google Scholar] [CrossRef] [PubMed]
  27. Van Wolleghem, Pierre-Georges, Marina De Angelis, and Sergio Scicchitano. 2022. Do Informal Networks Increase Migrants’ Over-Education? Comparing Over-Education for Natives, Migrants and Second Generations in Italy and Assessing the Role of Networks in Generating It. Italian Economic Journal 9: 175–97. Available online: (accessed on 5 September 2023). [CrossRef]
  28. Voza, Roberto. 2017. Il lavoro e le piattaforme digitali: The same old story. WP CSDLE “Massimo D’Antona”.IT 336/2017. pp. 2–19. Available online: (accessed on 1 September 2023).
Figure 1. Presence of a written contract for workers who directly manage the platform, by gender and age (%). Elaboration of the authors on INAPP Plus data 2021.
Figure 2. Evaluation criteria, by gender (%). Elaboration of the authors on INAPP Plus data 2021.
Figure 3. Effects that the evaluation can produce, by gender (%). Elaboration of the authors on INAPP Plus data 2021.
Table 1. Presence of a written contract and a minimum hourly wage by gender and age (%).
| | Men | Women | 18–29 | 30–49 | 50 and above |
|---|---|---|---|---|---|
| Written contract | 74.4 | 53.4 | 50.2 | 75.3 | 60.6 |
| Unwritten contract | 25.6 | 46.6 | 49.8 | 24.7 | 39.4 |
| Minimum hourly wage | 73.7 | 54.8 | 43.4 | 73.9 | 69.0 |
| No minimum hourly wage | 26.3 | 45.2 | 56.6 | 26.1 | 31.1 |
Elaboration of the authors on INAPP Plus data 2021. We also performed t-tests of statistical differences and found that, for both the written contract and the presence of a minimum hourly wage, there is a statistically significant difference in means between men and women (Pr(T > t) = 0.000) and between the oldest (over 29) and youngest (18–29) groups (Pr(T > t) = 0.001).
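The group comparison described in the note to Table 1 can be reproduced, in outline, with a standard two-sample t-test on the binary outcome. The data below are synthetic, drawn to match the shares in Table 1 rather than taken from the survey, and the group sizes are hypothetical.

```python
# Illustrative two-sample t-test on a binary outcome (written contract yes/no),
# comparing men and women as in the note to Table 1. Data are synthetic:
# 0/1 indicators drawn with shares close to 74.4% (men) and 53.4% (women).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
men = rng.binomial(1, 0.744, size=300)
women = rng.binomial(1, 0.534, size=200)

# Welch's t-test (unequal variances)
t_stat, p_value = stats.ttest_ind(men, women, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With gaps of this magnitude, the null hypothesis of equal means is rejected at conventional significance levels, consistent with the Pr(T > t) values reported in the note.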
Table 2. Descriptive statistics.
| Variable | Mean | Standard Deviation | Min | Max |
|---|---|---|---|---|
| Written contract—Yes | 0.695 | 0.460 | 0 | 1 |
| Minimum hourly wage—Yes | 0.693 | 0.461 | 0 | 1 |
| Age: 18–29 | 0.123 | 0.329 | 0 | 1 |
| Age: 30–49 | 0.694 | 0.461 | 0 | 1 |
| Age: 50 and above | 0.183 | 0.386 | 0 | 1 |
| Area of residence: Northwest | 0.227 | 0.419 | 0 | 1 |
| Area of residence: South and islands | 0.450 | 0.497 | 0 | 1 |
| Education: low secondary | 0.355 | 0.478 | 0 | 1 |
| Education: high school | 0.463 | 0.499 | 0 | 1 |
| Education: university or above | 0.182 | 0.386 | 0 | 1 |
| Activity: delivery of products | 0.140 | 0.347 | 0 | 1 |
| Activity: delivery of meals | 0.362 | 0.481 | 0 | 1 |
| Activity: online activities | 0.349 | 0.477 | 0 | 1 |
| Activity: domestic work | 0.092 | 0.289 | 0 | 1 |
| Activity: accompanying someone with the car | 0.057 | 0.232 | 0 | 1 |
| Direct management—Yes | 0.744 | 0.436 | 0 | 1 |
| Main activity—Yes | 0.647 | 0.478 | 0 | 1 |
N = 492 observations (570,521 weighted observations).
Elaboration of the authors on INAPP Plus data 2021.
Table 3. Determinants of the written form of the contract and of the presence of a minimum hourly wage for platform workers. Average marginal effects of probit models.
| | Written Form of Contract | Written Form of Contract—Female | Written Form of Contract—Male | Minimum Hourly Wage | Minimum Hourly Wage—Female | Minimum Hourly Wage—Male |
|---|---|---|---|---|---|---|
| Women | −0.156 *** [0.001] | | | −0.137 *** [0.001] | | |
| Age (reference: 18–29) | | | | | | |
| 30–49 | 0.151 *** | 0.082 *** | 0.181 *** | 0.161 *** | 0.029 *** | 0.185 *** |
| 50 and above | 0.037 *** | 0.234 *** | −0.075 *** | 0.165 *** | 0.202 *** | 0.096 *** |
| Macro-area of residence (reference: Northeast) | | | | | | |
| Northwest | 0.014 *** | −0.221 *** | 0.098 *** | −0.024 *** | 0.068 *** | −0.065 *** |
| Center | 0.024 *** | 0.118 *** | 0.043 *** | 0.024 *** | −0.017 *** | 0.057 *** |
| South and islands | 0.048 *** | −0.099 *** | 0.117 *** | −0.042 *** | −0.063 *** | −0.008 *** |
| Education (reference: low level) | | | | | | |
| Medium level (high school) | 0.020 *** | 0.020 *** | 0.055 *** | −0.044 *** | −0.006 ** | −0.051 *** |
| High (university and above) | 0.020 *** | 0.135 *** | 0.001 | −0.099 *** | 0.017 *** | −0.128 *** |
| Activity (reference: delivery of products) | | | | | | |
| Delivery of meals | −0.012 *** | 0.043 *** | 0.024 *** | 0.126 *** | 0.209 *** | 0.146 *** |
| Online activities | −0.089 *** | −0.232 *** | 0.018 *** | −0.065 *** | −0.215 *** | 0.038 *** |
| Domestic work | 0.078 *** | −0.164 *** | 0.180 *** | 0.056 *** | −0.320 *** | 0.155 *** |
| Accompanying someone with the car and others | −0.052 *** | −0.326 *** | −0.034 *** | −0.185 *** | −0.371 *** | −0.094 *** |
| Direct management of the account | 0.272 *** | 0.463 *** | 0.215 *** | 0.310 *** | 0.305 *** | 0.314 *** |
| Main activity | 0.292 *** | 0.332 *** | 0.254 *** | 0.385 *** | 0.309 *** | 0.392 *** |
** p < 0.05, *** p < 0.01. Standard errors in parentheses. Elaboration of the authors on INAPP Plus data 2021.

Share and Cite

MDPI and ACS Style

De Angelis, M.; Donà, S.; Bergamante, F. Algorithmic Discriminations and New Forms of Protections: An Analysis of the Italian Case. Soc. Sci. 2023, 12, 503.

