Abuse of a dominant position in the “all-digital” era
p. 471-485
1. The digital economy is disrupting the traditional economy, in which algorithms now undeniably contribute to economic progress.
2. Through network effects linked to the free and sophisticated use of mass-collected user data, whether “personal” or “technical”, algorithmic technology, deployed via the business models of the GAFAM and other digital giants [1], has profoundly transformed and reorganised both economic activities and consumer behaviour in just twenty years.
3. Search engines, in particular Google, offer a striking illustration of this [2].
4. However, while algorithms contribute to a certain efficiency of both supply and demand by encouraging the various players to innovate and become more competitive, consumers, states and public authorities are increasingly aware of the potentially harmful power of these digital giants.
5. It is in this context that, in 2010, the European Commission opened an investigation into the Google Shopping service [3] which, after several years, resulted in a fine of 2.42 billion euros to punish repeated abuses of its dominant position committed by the super-platform [4]. It was again on this basis that the American giant was fined 150 million euros in 2019 [5] by the French competition authority (the same year the European Commission imposed a fine of 1.5 billion euros in respect of its AdSense [6] advertising network) and, more recently, a record fine of 220 million euros for again favouring its own services in the online advertising sector [7].
6. These fines illustrate the potential anti-competitive effects of algorithms and the way in which the software operator can, through the configuration and therefore the criteria on which its algorithm is based, influence or even deceive the consumer and thus distort competition.
7. In these circumstances, a good number of artificial intelligence devices could and should be regulated by a competition law – whose preferred tool is the abuse of a dominant position – that has been strongly destabilised by a highly disruptive technology. In this sense, Professors Ezrachi and Stucke were among the first to warn of the dangers that the widespread use of algorithms and artificial intelligence poses for competition [8].
8. At the dawn of the advent of “self-learning” algorithms, in a physically decompartmentalised distribution space, the challenges for states and competition authorities are of several kinds: economic [9], philosophical [10], political [11], diplomatic [12], but also legal.
9. Indeed, competition law confronted with digital technology [13] involves several challenges, such as:
- Understanding the competitive issues related to new business models based on the collection and exploitation of personal data and the creation of a global community of users [14];
- Detecting new types of infringement committed through algorithms [15];
- Using new tools based on mass data, otherwise known as Big Data, and artificial intelligence [16].
10. Faced with these challenges, and unlike antitrust policy in the United States, where the Chicago School has “nipped in the bud” the risks of “false positives” [17] fatal to economic efficiency, it is the “false negatives” [18] that are feared in Europe because “the preservation of a competitive market structure is essential to create the right conditions for innovation” [19].
11. However, as the Commission points out, “competition policy alone cannot solve all the systemic problems that may arise in the platform economy. Within the logic of the single market, additional rules may be needed to ensure innovation, fairness and contestability of the market, as well as public interests that go beyond competition or economic considerations” [20].
12. However, while digital technology and the development of artificial intelligence are now perceived as new factors of competition, as was the case with the development of mass retailing, competition law has been applied without having anticipated or revised its standards.
13. Thus, while at first sight competition law appears destabilised in its tools by digital technology and artificial intelligence, in the light of its flagship instrument – the abuse of a dominant position – it is not at all powerless (I) and could well emerge strengthened from this confrontation with artificial intelligence through an increased “legal” responsibility of its actors (II).
I. Abuse of a dominant position, a weakened instrument but adapted to the “all-digital” era
A. Abuse of a dominant position, an instrument destabilised by artificial intelligence
14. Traditionally, “abuse” occurs when the practice restricts competition and does not respect the limits of normal competition, which is commonly defined as competition on the “merits” [21].
15. However, while abuse of a dominant position is an instrument that can be used against digital giants, it will, by its very nature, often be ineffective.
16. Indeed, the characterisation of this abuse is made extremely difficult by the intrinsic characteristics of algorithms, but also by those of this new digital economy. Moreover, on closer inspection, the sanctions imposed by the European Commission on Google, for example on the referencing market, seem to be justified mainly by the latter’s monopolistic, hegemonic position.
17. And for good reason: the fact that a manufacturer gives priority to the promotion of its own products or services over those of its competitors cannot be challenged in any way. However, in the case of Google’s activities, the European Commission has ruled otherwise.
18. In any event, competition law will not be a vehicle for software neutrality. This is all the more true since competition law cannot, by definition, be concerned with all decision-support systems, since its application is conditional on the exercise of an economic activity [22].
19. In other words, competition law and its flagship instrument, abuse of a dominant position, will not, in the very short term, provide an effective response to certain “structural” problems that affect concentrated markets such as the digital one.
20. The causes of these structural failures are now known. “The ‘winner takes all’ effect results from various factors, such as the two-sided nature of certain markets, the range effects resulting from the establishment of ecosystems, the importance of the data collected, and the low mobility of users due to the significance of switching costs. It is coupled with network effects and incentives for collusion facilitated by the use of algorithms” [23].
21. This is detrimental not only to European businesses but also to European citizens, who are confronted with an offer that is less efficient, in terms of price, choice, services and innovation, than the one that could be made to them.
22. However, the weapons of competition law (abuse of a dominant position), coupled with the natural laws of a new market, should logically allow these markets to open up in the medium term. In this sense, abuse of a dominant position remains a tool capable of dealing with the specificities inherent in the “all-digital” era. Moreover, does it not already capture the particular nature of the business models of the digital giants?
B. Abuse of a dominant position, an instrument adapted to artificial intelligence
23. In the light of the findings and weaknesses of competition law described above, the European Commission distinguishes and contrasts two types of “structural competition” problems [24]:
- “Structural risk to competition”, which can be defined as cases where “competition is not yet affected but is about to be affected because of the behaviour of one or more market players that threatens competition and because of the characteristics of that market”; and
- The “structural lack of competition”, i.e. cases where competition is already affected in the market, either because the market is already dominated by a “super-player” or because it is occupied by several players forming an oligopoly with a significant risk of tacit collusion.
24. The purpose of this distinction would be to better define the time frame for the European Commission’s intervention: urgent in the first case, and less so in the second [25].
25. In this context, and in order to deal with these new competition issues, the European Commission proposes to introduce a new tool into our legal system. By this means, it intends to impose “behavioural and structural” measures on the players in a market when this market is faced with competition problems [26].
26. According to the Commission (and this is a crucial point), the introduction of these measures would not presuppose the finding of a prior infringement. The Commission therefore wishes to leave itself the possibility of sanctioning a dominant company, even though the company’s practices fall within the scope of “normal” competition, i.e. on the merits.
27. In other words, with this paradigm shift and presumption of guilt, it will be up to the actor concerned to prove the absence of any violation. After all, is not the dominant company in question the actor best able to justify its own actions and the operation of its algorithm?
28. In this way, the European Commission wishes to:
- respond to the legal difficulties inherent in the intrinsic nature of algorithms; but above all
- foster innovation by being able to take quick decisions to prevent the erection of barriers to market entry that would make the market less and less contestable.
29. However, the scope of application of such a tool remains to be defined. But is it really an appropriate tool?
30. Competition law is obviously not obsolete and can perfectly well deal with digital services. Article 102 of the Treaty on the Functioning of the European Union has been used on several occasions by the Commission in the digital sector.
31. However, this body of law remains insufficient, because the classic tools of competition law are not enough to make the positions acquired by the most powerful operators contestable. Owing to network effects, the importance of data and cost advantages, market entry is extremely difficult, if not impossible. Competition law must be more directive and must also innovate in order to take the measure of the specificities of algorithms, whose consequences are still largely unknown and, in all likelihood, underestimated.
II. Making digital players accountable: between prevention and detection of abuse of a dominant position
32. The digital giants are regarded as companies that monopolise the market according to the “winner takes all” formula and lock in access by buying up potential new entrants. In order to curb this economic power and anticipate possible abuses by these super-platforms, several innovation-friendly solutions are available to the authorities.
33. For example, it is proposed to introduce so-called “ex-ante” regulation, which would take the form of obligations imposed on platforms (A), or to use “data” to regulate anti-competitive practices (B).
A. The affirmation of guiding principles for the prevention of infringements
34. In order to offer a personalised service to their customers, all digital players need to hold a large amount of data on them. This is once again the reason for the worrying power of the GAFAM and the powerful lobbying in favour of free exploitation of the “data lake” [27].
35. In order to avoid abuses, some academics have proposed strengthening companies’ transparency obligations with regard to their algorithms, and more precisely with regard to the personal data they hold and the use they make of it, when assessing the competitive situation of a company and its market.
36. The objective is to prevent a platform from taking advantage of the opacity of its algorithm, for example by charging excessive prices to both consumers and its partners.
37. However, although strengthening the transparency obligations already present in many texts [28] could prove useful, both to inform consumers and to enable the authorities to ensure that fundamental rights and freedoms are respected, in many respects this strengthening appears to be an unsatisfactory solution.
38. Indeed, algorithmic transparency would by definition lead to a lifting of the trade secrecy [29] that currently protects algorithms, because they cannot be patented for the moment.
39. However, notwithstanding the fact that, in France, the Commercial Code largely provides for the lifting of business secrecy, which, for the record, does not constitute an obstacle to any request based on Article 145 of the Code of Civil Procedure [30], if companies were obliged to disclose their algorithms, anti-competitive practices would probably be facilitated and innovation curbed, since companies would have no difficulty in copying their competitors’ algorithms.
40. However, a complementary solution would lie in the fairness of algorithms, the corollary of which would be the drafting of codes of good conduct by the companies that develop and use them [31]. It should be noted that the draft European regulation [32] insists on and makes mandatory the drafting of such codes of conduct in order to encourage user companies to control their algorithms upstream to prevent anti-competitive risks.
41. Companies would thus be obliged to control their algorithms and to programme them upstream in compliance with competition law (“compliance by design”). The underlying idea is to encourage companies to design and implement their algorithmic systems in an ethical and responsible manner.
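Purely by way of illustration, and without prejudging any actual implementation, a minimal sketch of what “compliance by design” could look like in practice is given below: a hypothetical pricing routine that documents the factors it relies on and refuses, at design time, to use an input associated with a prohibited practice. All names, rules and figures in this sketch are assumptions made for the example.

# Illustrative sketch of "compliance by design": the pricing routine keeps an
# audit trail of the factors it uses and discards, by construction, any input
# flagged as legally prohibited. Names, rules and figures are hypothetical.
from dataclasses import dataclass, field

PROHIBITED_INPUTS = {"competitor_future_price_signal"}  # e.g. a signalled price alignment

@dataclass
class PricingDecision:
    price: float
    factors_used: dict = field(default_factory=dict)  # documentation for later explanation

def set_price(cost: float, demand_index: float, extra_inputs: dict) -> PricingDecision:
    # Design-time guard: separate admissible inputs from prohibited ones.
    used = {k: v for k, v in extra_inputs.items() if k not in PROHIBITED_INPUTS}
    rejected = sorted(set(extra_inputs) & PROHIBITED_INPUTS)
    # Simple, documented pricing rule: cost plus a margin that scales with demand.
    price = round(cost * (1.10 + 0.05 * demand_index), 2)
    return PricingDecision(price=price, factors_used={
        "cost": cost, "demand_index": demand_index,
        "other_inputs": used, "rejected_inputs": rejected})

decision = set_price(10.0, demand_index=1.2,
                     extra_inputs={"stock_level": 40,
                                   "competitor_future_price_signal": 14.9})
print(decision.price, decision.factors_used)

The point of the sketch is simply that the exclusion of the prohibited input and the record of the factors actually used are built into the programme itself, rather than verified after the fact.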
42. Consequently, while the transparency obligation constitutes at first sight a real brake on innovation, it becomes an accelerator when coupled with the obligation of fairness, the two together contributing to a far more decisive obligation: the explainability of software.
43. In other words, companies would thus be obliged to document and explain the operation of their algorithms in the event that an anti-competitive practice is observed, denounced or even anticipated.
44. The advantages of such an “ex-ante” solution would be, at a minimum, twofold:
- It would make it possible to respond to the challenge of acceptability and, more specifically, to the endogenous and exogenous causes [33] inherent in the development of any tool implementing artificial intelligence; and
- It would help to prevent and document possible abuse of a dominant position.
45. This would encourage innovation and speed up any abuse-of-dominant-position proceedings brought by the authorities.
B. Regulation of digital actors through “data”: a tool for detecting infringements
46. Artificial intelligence is a vector of numerous forms of discrimination, clearly highlighted by Professor Malaurie-Vignal [34]. For the sake of completeness, the professor distinguishes between algorithmic discrimination to the detriment of companies and that to the detriment of consumers, which may depend on consumer behaviour, on the characteristics of the sale or even on those of the buyers.
47. However, artificial intelligence remains a multi-faceted tool that, paradoxically, can be the solution to the problems it induces due to its intrinsic characteristics.
48. This is why AI is also perceived as a tool for detecting anti-competitive practices, particularly by governments [35] and administrative authorities, which are now looking at the detection and regulation of such practices through the prism of “data-driven regulation” [36].
49. In other words, in addition to imposing possible new obligations on companies using algorithms, the public authorities are studying the possibility of using artificial intelligence to set up algorithm-analysis programmes, i.e. to collect and exploit data in order to prevent, detect, deter and punish possible anti-competitive practices [37].
50. In this sense, the French financial administration is an example, and even a model, in this field [38]. It is now deploying artificial intelligence tools in the fields of financial evaluation and legality control [39] as part of the fight against fraud [40]. The use of algorithmic devices in the fight against fraud, made permanent in 2017 for companies, has now been extended to checks on individuals. These artificial intelligence devices aim to automate the “targeting of fraud”: they identify the criteria characterising a fraudulent person so as to establish a typical profile that is then applied to a target population [41]. In other words, the tax authorities use automatic data processing to determine patterns that can be used to identify fraudulent individuals or companies.
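Purely by way of illustration – the administration’s actual tools are neither described in this article nor public – the “typical profile” logic referred to above can be sketched as a simple supervised scoring model: a classifier learns the traits of previously confirmed fraud cases and then scores a target population. Every feature, figure and threshold below is a hypothetical assumption.

# Minimal, purely illustrative sketch of "fraud targeting" by profiling:
# a model learns the traits of past confirmed fraud cases, then scores a
# target population. Features, values and the threshold are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per taxpayer: declared income (in 100k euros), number
# of late filings, deduction-to-income ratio, number of related entities.
X_known = np.array([
    [0.30, 0, 0.05, 1],
    [2.50, 3, 0.60, 7],
    [0.45, 1, 0.10, 1],
    [1.80, 4, 0.55, 5],
])
y_known = np.array([0, 1, 0, 1])  # 1 = fraud previously confirmed by an audit

# The "typical profile" is learned from past cases...
model = LogisticRegression().fit(X_known, y_known)

# ...and then applied to the whole target population, yielding a score per taxpayer.
X_population = np.array([
    [0.60, 0, 0.08, 1],
    [2.10, 5, 0.70, 6],
])
scores = model.predict_proba(X_population)[:, 1]
flagged = scores > 0.5  # candidates selected for human review, not automatic sanction
print(scores, flagged)

The sketch also makes the shift described in the following paragraphs visible: the selection rests on statistical correlation with a learned profile, not on the prior finding of any offence.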
51. However, the advent of such technological and technical devices will inevitably contribute to the establishment of fact as law, leading to the migration from a “causal” legal system to a system of practical correlations.
52. In other words, the potential fraudster will be targeted not on the basis of an offence committed, but through a factual analysis of behavioural and/or relational signals predefined by an algorithmic device [42].
53. To put it differently, the detection of fraudsters will result, in this context, not from a suspicion based on presumptions or denunciations [43], but from the analysis of statistical probabilities “self-generated” by the algorithmic device.
54. Artificial intelligence devices will thus produce, from the facts, a new, “algorithmic” form of norm, which will ultimately lead to a mutation of legal reasoning and take the form of a system of practical correlations [44].
55. In theory, regulation by data therefore makes it possible to make the players more accountable, but also strengthens the regulator’s capacity for analysis, speeds up procedures and provides more information to users and civil society [45].
56. The introduction of algorithmic tools to combat anti-competitive practices will thus imply ex-ante surveillance – the anti-competitive practice will be presumed even before it is carried out – and widespread surveillance of all the players in a market targeted by algorithmic tools, in order to identify discriminatory or fraudulent behaviour and, in particular, possible cartels, abuses of dominant positions and even unfair competition.
57. In other words, we would end up with a total reversal of the burden of proof, to the benefit of:
- increased accountability of the various actors;
- the acceptability of a highly disruptive technology;
- faster proceedings to sanction abuses of dominant positions;
- the adaptation of the law to the challenges brought about by the development of artificial intelligence.
58. In the era of “algorithmic administration”, the potential evolution of the control of actors’ anti-competitive practices will therefore most likely lead to a paradigm shift and to a transformation of normativity through the consecration of a presumption of guilt.
59. However, the development of these “ex-ante” tools will require, upstream, an understanding of their deployment context (an adapted digital environment) and of the mechanics of their operation. It also implies that the control of these devices, or even their certification, is an essential prerequisite for their adoption and implementation.
60. In this sense, regulation through data is perfectly complementary to the discussions currently taking place on the transparency and fairness obligations that would be imposed on companies developing and/or using algorithms.
61. However, one uncertainty will remain despite the development of “ex-ante” regulation tools: human intention. It is indeed common sense to state that the introduction of “compliance by design” will not make it possible to enforce all competition rules.
62. Consequently, “ex-ante” measures should be supplemented by “ex-post” measures. Among these, it is proposed, for example, to create new competition tools such as “abuse by use of leverage” [46] or “abuse of a dominant position” [47]. These tools would respectively sanction:
- a company with a dominant position in one market that has used this position to gain dominance in a related market;
- a company which is not yet dominant but which, by its actions, intends to become dominant.
***
63. Today, more than ever before, the lawyer is lagging behind in the face of new technologies. The daily and increasing use of artificial intelligence, as well as the weight of data collection and exploitation, make the implementation of competition law and its flagship instrument, the abuse of a dominant position, ambivalent in many respects.
64. In any case, while the “era of algorithmic administration” induces and will induce certain transformations of “ex-ante” and “ex-post” normativity in order to allow the abuse-of-dominant-position tool to take the measure of this new digital economy, this should not obscure the importance of having a body of coherent and readable texts. Businesses demand it and innovation requires it.
65. In this sense, an adaptation of the existing texts, coupled with an increased responsibility of the various actors, appears to be common sense, considering the crucial role they play in the economy and even in democracy [48].
Footnotes
1 In addition to the GAFAM (Google, Amazon, Facebook, Apple, Microsoft), other super-platforms include Netflix, Airbnb, Tesla, Uber, Twitter, Baidu, Alibaba, Tencent and Xiaomi.
2 See, for more details, S. Merabet, “Towards a Law of Artificial Intelligence”, New Library of Theses, vol. 197, éd. Dalloz, p. 278.
3 See Comm. EU, press release IP/10/1624, 30 Nov. 2010.
4 The European Commission criticised Google for manipulating search results in order to ensure better visibility of its own services, to the detriment of its competitors’ services. EU Comm. Dec. C(2017) 4444 final, 27 June 2017, case AT.39740, Google Inc. and Alphabet Inc.
5 Decision No. 19-D-26 of 19 December 2019 concerning practices implemented in the online search advertising sector.
6 See EU Comm. Dec. 20 March 2019, Google, AT.40411.
7 Commission Decision of 27 June 2017, AT.39740 – Google Search (Shopping).
8 A. Ezrachi and M. E. Stucke, “Artificial Intelligence & Collusion: When Computers Inhibit Competition”, U. Ill. L. Rev. 1775 (2017); see also A. Ezrachi and M. E. Stucke, Algorithmic Collusion: Problems and Counter-Measures, OECD Roundtable on Algorithms and Collusion, DAF/COMP/WD(2017), 25 May 2017.
9 The fear is that winner-take-all dynamics will pre-empt markets and exclude all forms of competition (e.g. Facebook).
10 “Because of the danger that this Internet space, which is supposed to be open to all, will be confiscated by a few” (Daniel Fasquelle).
11 Some companies are becoming so powerful that they are able to directly challenge the states themselves. For example, following the decision of the Australian authorities in 2021 to legislate and force the social network Facebook to pay the media for the use of their content, Facebook blocked all the news pages of the media in Australia on its site.
12 I.e.: because of the current tug of war between the United States, China and the European Union (e.g. the decision of the American government to prohibit Google from equipping Huawei devices).
13 D. Fasquelle, Competition law at a crossroads with the emergence of digital technology: Contrats, conc. consom. 2019, dossier 11.
14 Samia Maouche, “Zoom on the new digital economy department of the Competition Authority”, 27 February 2020.
15 J.-B. Duclercq, Les effets de la multiplication des algorithmes informatiques sur l’ordonnancement juridique: Comm. com. électr. 2015, study 20.
16 A. Vial, The legal qualification of artificial intelligence: From silicon to person?, Droit et Affaires n° 15, Dec. 2018, 4.
17 Because it is difficult to study directly the effect of a practice on total surplus, an evidence test will make it possible to distinguish pro-competitive exclusionary practices from anti-competitive exclusionary practices by offering a simple analysis grid. The evidence test should make it possible to distinguish between practices that have a positive or neutral effect on total surplus and those that have a negative effect. But there is always a risk of error. Two types of error are possible: false positives and false negatives. In the first case, a practice is condemned even though it has a positive effect on the total surplus (it is pro-competitive, so innocent companies are wrongly condemned). In the second case, a practice is not considered illegal even though it reduces the total surplus (it is anti-competitive, so guilty companies are not condemned). Each type of error has a different impact on the economy. False positives can discourage risk-taking and innovation, and may inhibit competition between firms. False negatives tend to open the door to more anti-competitive practices. Choosing a test requires assessing the probability of occurrence of each type of error and their costs, in order to minimise the expected total cost. This choice is a matter of competition policy. The second dimension to be considered in the choice of the test relates to its implementation. This dimension concerns the competition authority, the companies and the judges.
18 The Chicago School’s doctrine is that it is better not to act and not to sanction behaviour that would be anti-competitive (“false negative”), than to wrongly sanction behaviour that would have a positive effect on consumer welfare (“false positive”). In other words, the market will self-regulate through the emergence of disruptive technologies.
19 D. Fasquelle, “Competition law facing the challenge of the digital economy”, Cahiers de droit de l’entreprise n° 3, May 2019, dossier 15.
20 “Shaping Europe’s digital future”: COM(2020) 67 final; see also EU Comm., 19 Feb. 2020, White Paper, “Artificial Intelligence – A European approach based on excellence and trust”: COM(2020) 65 final; “A European strategy for data”: COM(2020) 66 final; D. Bosco, ““Shaping Europe’s digital future”: the European Commission unveils its digital strategy”, Contrats Concurrence Consommation No. 4, April 2020, comm. 70.
21 Article L. 420-2 of the Commercial Code and Article 102 TFEU; “It is the general view of competition authorities in OECD countries that the goal of competition policy is to protect competition, not competitors. In pursuit of this goal, many authorities and courts consistently use the term ‘competition on the merits’ to explain and justify their views on how to distinguish between practices that harm competition and those that enhance it.” (https://www.oecd.org/fr/concurrence/fusions/37503683.pdf); see for further discussion S. Merabet, “Towards a Law of Artificial Intelligence”, New Library of Theses, vol. 197, éd. Dalloz, p. 278.
22 E.g.: the algorithm used by the Parcoursup platform. See for more details S. Merabet, “Towards a Law of Artificial Intelligence”, New Library of Theses, vol. 197, éd. Dalloz, p. 278.
23 A-S. Chone-Grimaldi, Digital Services Act - Towards a new competition and regulation law applicable to the digital sector, The Legal Week General Edition n° 49, 30 Nov. 2020, doctr. 1360.
24 A-S. Chone-Grimaldi, Digital Services Act ‒ Towards a new competition and regulation law applicable to the digital sector, The Legal Week General Edition n° 49, 30 Nov. 2020, doctr. 1360.
25 It seems that the former includes competition problems in still evolving markets such as connected objects, augmented reality, solar drones or mapping for autonomous vehicles, and the latter includes the problem of Google’s domination of the general search market and Facebook’s domination of social networks.
26 For more information, see Aut. conc., Contribution de l’Autorité de la concurrence au débat sur la politique de concurrence et les enjeux numériques, 19 Feb. 2020.
27 A Data Lake is a data repository that allows a very large amount of raw data to be stored in native format for an indefinite period of time.
28 Regulation (EU) 2019/1150 of 20 June 2019 promoting fairness and transparency for businesses using online intermediation services imposes transparency obligations regarding content ranking criteria; Directive (EU) 2019/2161 of 27 November 2019 dealing with platform-consumer relations, obliges the platform to clearly inform consumers of the main parameters determining the ranking of the various offers (art. 3); The Digital Services Act imposes transparency obligations on very large platforms regarding online advertising (art. 30); Article L.111-7 of the Consumer Code imposes transparency obligations regarding the conditions and criteria for ranking and referencing offers on platforms with respect to consumers; Regulation (EU) 2016/679 on the protection of personal data indicates that “controllers”, when implementing “automated decision-making”, will have to be able to provide “useful information concerning the underlying logic” of this automated decision, “as well as the significance and the intended consequences of such processing for the data subject” (art. 13 and 22).
29 Directive 2016/943 of 8 June 2016, transposed into French law by the Law of 30 July 2018 on business secrets (C. com., art. L. 151-1 et seq.).
30 Article 145 of the Code of Civil Procedure: “If there is a legitimate reason to preserve or establish before any trial the proof of facts on which the solution of a dispute may depend, legally admissible measures of inquiry may be ordered at the request of any interested party, on application or in summary proceedings”.
31 The Furman report recommends that the Digital Markets Unit set up a code of conduct for platforms identified as having a strategic status on the market. For its part, the Crémer report recommends that the company itself set up a self-regulation system with a framework for algorithms via “compliance by design”. Marie Malaurie-Vignal, “Algorithms and competition”, Contrats Concurrence Consommation n° 6, June 2021, study 6.
32 The European Commission (EC) published on 21 April 2021 a draft regulation establishing harmonised rules on artificial intelligence (AI).
33 For more information, see S. Merabet, “Towards a Law of Artificial Intelligence”, New Library of Theses, vol. 197, éd. Dalloz, p. 278.
34 See for more details Marie Malaurie-Vignal, “Algorithms and competition”, Contrats Concurrence Consommation n° 6, June 2021, study 6.
35 Autorité de la concurrence, press release, 18 July 2019, G7 common understanding on competition and the digital economy.
36 Aut. conc., AMF, ARAFER, ARCEP, CNIL, CRE and CSA, report, 8 July 2019, Regulation by data.
37 On this point, let us highlight the establishment of the British Competition and Markets Authority and of the Dutch competition authority and, in France, the strengthening of exchanges on data-driven regulation between several regulators, including the Autorité de la concurrence, the Autorité des marchés financiers (AMF) and the Autorité de contrôle prudentiel et de résolution. Aut. conc., Regulation by data: regulators publish joint note, press release, July 2019.
38 Caroline Lequesne-Roth, The fight against fraud in the digital age – Les enjeux du recours à l’intelligence artificielle par l’administration financière, Droit fiscal n° 5, 4 February 2021, 120.
39 See in this sense, A. NOR: CPAE1903196A, Jan. 29, 2019, creating an automated predictive analysis process for controlling State expenditure. The purpose of this system is to assist public accountants with certain verifications concerning: the quality of the authorising officer, the availability of credits, the production of supporting documents, etc. The system will be implemented on an experimental basis in the Brittany Regional Public Finance Department in 2019 and will be extended to the rest of France by 2020-2021.
40 In March 2019, the President of the National Assembly thus announced, among his proposals to “renovate parliamentary life”, the creation of a new tool “LexImpact”, the purpose of which is to allow, in the long term, “any parliamentarian who wants to put a figure on an amendment [...] [to] take care of it himself thanks to the software”, Éric Woerth, Séance en hémicycle, Wednesday 19 June 2019.
41 Rapp. Senate No. 668, 22 July 2020.
42 “The idea is to reconstitute algorithms by reverse engineering in order to understand how they work”. See for more details Marie Malaurie-Vignal, “Algorithms and competition”, Contrats Concurrence Consommation n° 6, June 2021, study 6.
43 Until now, cartels were often disclosed under the leniency procedure, whereby a participant in a cartel who denounced it could obtain a substantial reduction or even no penalty at all.
44 Cf. Supra.
45 Aut. conc., press release, 8 July 2019, Cooperation between regulators.
46 A-S. Chone-Grimaldi, Digital Services Act - Towards a new competition and regulation law applicable to the digital sector, The Legal Week General Edition n° 49, 30 Nov. 2020, doctr. 1360.
47 Ibid.
48 E.g.: the numerous instances of foreign interference during the recent American, French and German elections. See “Why Social Networks Put Democracy at Risk”, https://www.forbes.fr/technologie/pourquoi-les-reseaux-sociaux-mettent-en-danger-la-democratie/
Author
Avocat à la Cour
Docteur en Droit