
Big Data Misuse and European Contract Law

Cihat Börklüce
Published/Copyright: November 21, 2024

Abstract

The dynamics of contractual interactions have been evolving in recent years, as big data introduces new dimensions to previously conventional contracts. This development intensifies the information asymmetry between the dominant and vulnerable parties, posing increasing challenges for consumers and the entirety of European contract law. This paper offers three main contributions to this discourse. First, the utilization of big data introduces a novel form of information asymmetry between contracting parties, further empowering the already dominant party while exacerbating the vulnerability of the weaker party. Second, contemporary European case law and legal scholarship advocate for a harmonious approach in which European regulations and national remedies complement one another in protecting individuals. Lastly, there are already instances indicating the misuse of big data, where multiple individual claims may arise at both European and national levels.

Résumé

In recent years, the scale of contractual interactions has evolved considerably, as big data opens new perspectives compared with traditional contracts. This development accentuates the information asymmetry between the already dominant contracting party and the vulnerable one, making the situation increasingly difficult to monitor, not only for consumers but also for European contract law as a whole. This article makes three key contributions to that debate. First, the use of big data creates a new dimension of information asymmetry between the contracting parties, strengthening the position of the dominant party and further weakening the vulnerable party. Second, current European case law and the specialised legal literature develop a harmonious perspective in which European regulation and national remedies complement one another in protecting individuals. Finally, there are already cases attesting to a misuse of big data that may give rise to several individual claims at both the European and national level.

Zusammenfassung

In recent years, the balance in contractual relationships has shifted significantly, as big data opens up new perspectives for both parties, and it does so for contracts that could previously be regarded as rather conventional. This development deepens the information asymmetry between the already dominant and the vulnerable contracting party, a challenge not only for customers but also for European contract law as a whole. This contribution foregrounds three aspects that have so far been rather underdeveloped in the debate. First, the use of big data leads quite generally to a new, previously unknown degree of information asymmetry between the contracting parties, which further strengthens the dominant party and additionally weakens the vulnerable one. Second, current European case law and the legal literature urge (and this contribution tests the proposition) that European regulation and the remedies of national law should go hand in hand and be better coordinated, so that they complement each other more effectively in protecting the individual. Finally, there are already cases that point to a misuse of big data and in which various individual claims can be contemplated at both the European and the national level.

1 Introduction

E-commerce already accounts for more than one-fifth of the world’s retail trade and is on pace to exceed 25 % by the end of the decade.[1] This is no coincidence, as it brings multiple benefits to both sides of the contract. It is faster, safer, and more convenient, enabling contracts to be concluded regardless of time and place. Another reason for this surge is the use of big data in day-to-day contracting, through the personalization of offers and advertising. Although the term ‘big data’ dates back to the early 2000s,[2] its impact on daily contracting raises novel issues. It introduces interesting topics for discussion – both economic and legislative.

The personalization and automation of offers could create a win-win situation for both parties to a contract. In reality, however, this promise is not fulfilled: the balance is shifting towards the companies’ side rather than fostering balanced prosperity. The power to deploy big data vis-à-vis customers also widens the gap between larger and smaller companies.[3]

This paper is based on a forthcoming dissertation, following, therefore, a similar structure and approach.[4] The original work compares the German and Turkish national laws with an overview of European regulations. The German and Turkish contract laws are fairly similar, because the Turkish civil provisions are based on the Swiss equivalents, so relatively few divergences occur.[5] For this reason and to save space, only the Turkish equivalents of the German codes will be referred to throughout the paper without entering into a comparison.[6]

The paper consists of six sections, including this introduction. Section 2 opens by describing the impact of big data on contract law, highlighting the mechanisms of big data and various practices of misuse. Section 3 then examines the relationship between European regulation and national contract law, with a recent CJEU judgment reflecting the historical development of the debate; it also discusses several European legal safeguards against the misuse of big data (Section 3.3), while Section 4 turns to the corresponding national safeguards. Section 5 provides three examples of past misuse cases from three different countries and companies, illustrating the overall potential for misuse. Section 6 concludes with the findings of the paper.

2 A Shift in the Balance of Power

As the name implies, the two pillars of big data are that it is big and it is data.[7] Accordingly, big data refers not only to an amount of data so enormous that it exceeds the capacity of traditional methods to handle and analyse, but also to the technologies capable of working with such data.[8] Such technologies involve large amounts[9] of complex data[10] that are processed with unprecedented speed[11] and accuracy.[12], [13] Big data has a variety of application areas, such as credit scores,[14] healthcare, logistics, agriculture, city planning, smart homes and devices, supply chains, energy, investments and the optimisation of R&D.[15] Another crucial area is the implementation of big data in e-commerce, where price calculations and other personalisations are carried out in an automated setting.[16] This implementation changes the traditional balance in contracts and creates a challenge for contract law along the way.

2.1 Contributing Factors to the Shift in the Balance

Earlier experiments have shown that having social media accounts has the potential to influence users emotionally and behaviourally.[17] The mere combination of data and platform subscriptions therefore already exerts an influence over the user. This becomes even more apparent when considering the amount of personal data companies extract and process daily for commercial or R&D purposes. In one example, the average was estimated to be 1.7 megabytes of personal data per person each second, or around 143 gigabytes each day.[18] A correlation between the amount and quality of information a company holds and the success of its direct marketing already exists,[19] and the same model naturally applies to big data-associated marketing and contracting.
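As a rough cross-check of that estimate (a back-of-the-envelope calculation, not a figure taken from the cited source; the small gap to the quoted 143 gigabytes is attributable to rounding of the per-second figure):

$$1.7\ \text{MB/s} \times 86{,}400\ \text{s/day} = 146{,}880\ \text{MB/day} \approx 147\ \text{GB/day}.$$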

The application of big data could also evoke trust from the user, as the cultural reputation of algorithmic calculations and recommendations fosters a tendency to believe that such calculations are objective and free of the bias, noise and failures of human intervention.[20] Such misplaced trust might, for example, be a particular drawback for the user in price discrimination practices.[21] It has also been reported that some companies have deliberately set their price multipliers to uneven values, because round multipliers such as 2.0 or 3.0 look artificial and raise distrust in the user.[22]

Misuse of big data is also difficult to spot, since the computations and other personalisations take place in the background before the final transaction, and in most cases the user is not even aware that any personalisation is being undertaken.[23] This is in part why this paper calls for a partially reversed burden of proof in big data-related discrimination claims, where the onus is sometimes on the company to prove that it is not at fault for the discrimination that has occurred and/or that it has taken appropriate measures in its algorithmic practices to prevent such discrimination from taking place.[24]

Larger companies – the so-called old dogs – also enjoy a competitive advantage over their smaller rivals through the exploitation of big data.[25] Not only do they have the ability to tailor their offers, which their competitors do not, but they can also directly target and discriminate against smaller companies. Smaller firms then lose twice: once by being the target of disadvantageous contracts, and again by not having the same contractual opportunities with other customers. Thus, simultaneously, not only contractual but also competition problems arise.[26]

A similar dynamic also plays out against inexperienced users. A savvy customer with enough online experience could, for example, use price comparison techniques to avoid price discrimination, or a VPN service to circumvent steering and geo-blocking practices, whereas an inexperienced user – especially the elderly and teenagers – cannot protect themselves against the same discriminatory activities.[27] Even where a user could avoid discrimination by using such services, the mere opportunity cost of investing the time, money, and effort needed to avoid being discriminated against would make the market as a whole less efficient.[28]

The misuse of big data would mean that the powerful entities become even more powerful, while the vulnerable parties become increasingly disadvantaged. This dynamic affects both B2C and B2B transactions. Consequently, the balance shifts in a manner that contradicts the fundamental principles of contract law, which aim to ensure fairness and equality between contracting parties.

2.2 Big Data and Contracting: What’s New?

It is by no means a new phenomenon that some parties to a contract have more bargaining power than others.[29] One of the main objectives of contract law – and certainly of European law – is to establish a balance between such disparities in negotiating ability. Achieving such a fair balance is proving increasingly challenging given the tools of big data,[30] which tend to favour companies with the resources to harness them over customers and other smaller market participants.[31]

Using such tools, companies can benefit from the following advantages over smaller market players that they would not have, or not to the same extent, had they not been able to harness the capabilities of big data:

  1. Companies gain significant insights into behavioural activities and patterns by monitoring users’ online activities, which aids in predicting their future conduct.[32] This, in turn, enhances the efficiency of direct marketing, a practice companies have been striving to optimize for much of the last century.[33]

  2. They can personalize and tailor individual contracts, manipulating various deal characteristics, from the individualization of advertising to the customization of the consideration.[34] Companies can even steer customers such that certain goods and services are inaccessible to specific individuals, or available but with the user directed towards other options rather than the optimal choice.[35]

  3. Not only can companies influence the contract’s content, but they can also dictate the timing of the transaction.[36] This alters the traditional market dynamic where the buyer decides when to enter the market, shifting the advantage to the seller, who can strategically time advertisements to potential buyers. This increases the likelihood of a successful contract, regardless of its optimality.

  4. These practices are often legitimized by the fact that websites and applications obtain user consent through general terms and conditions and terms of use.[37] Companies thereby also seek to give the appearance, at least in the eyes of their contractual partners, that they can avoid mandatory regulation. They also try to privatise redistribution, even though redistribution is a task for the state, and they do not necessarily give resources back to the general public. This questionable practice, which appears to lack any real authority, is challenged by various national and European codes.

2.3 Different Practices of Misuse

2.3.1 Price Discrimination

Although it takes miscellaneous forms[38] and carries different meanings and outcomes in various areas of law,[39] price discrimination is, in essence, charging different prices to different customers for the same goods and services.[40] There is a fairly wide field of unproblematic cases of price discrimination, such as economies of scale or the gain-from-demand model, where the buyer is entitled to a discount on condition of purchasing a certain quantity of goods or services at once.[41]

The problem begins when the tools of big data assist in tailoring contracts in ways that maximise a company’s profits while forcing customers to pay more for the same good or service for no legitimate reason. To elucidate further, price discrimination in this context manifests itself in two distinct forms. Firstly, as a market mechanism where an increase in demand coupled with a decrease in supply leads to a rise in prices. Secondly, as an automated decision-making process, which is not influenced by supply, demand, or market conditions, but rather is based on a personal assessment of the counterparty.

There is already evidence of this second type of price discrimination. The location of the user,[42] the operating system used,[43] or even the battery status of the device[44] could lead to discrimination. Some offers are personalised with a discount if the user uses a comparison site, presumably because this suggests a more price-conscious user.[45] Price discrimination of such nature triggers several contract law remedies.
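To make the mechanism tangible, the following minimal sketch illustrates how such an automated, counterparty-specific price adjustment could be computed server-side, invisibly to the user. It is a purely hypothetical illustration in Python: the signal names and weightings are assumptions for the sake of the example and are not drawn from any platform or case discussed in this paper.

```python
# Purely hypothetical sketch of the 'second type' of price discrimination: the
# surcharge depends on an assessment of the individual counterparty, not on supply
# or demand. All signals and weightings below are illustrative assumptions.

def personalised_price(base_price: float, user: dict) -> float:
    multiplier = 1.0
    if user.get("region") == "high_income_area":          # location signal (geopricing)
        multiplier += 0.10
    if user.get("operating_system") == "premium_os":       # operating-system signal
        multiplier += 0.05
    if user.get("battery_level", 1.0) < 0.10:              # low battery read as urgency
        multiplier += 0.15
    if user.get("came_from_comparison_site"):              # price-conscious user: small discount
        multiplier -= 0.05
    return round(base_price * multiplier, 2)

# The user only ever sees the final figure; the inputs and weights remain opaque.
print(personalised_price(100.0, {"operating_system": "premium_os", "battery_level": 0.05}))  # 120.0
```

Precisely because nothing in the displayed price reveals which of these inputs was used, the practice is difficult for the affected user to detect or prove – the point made above regarding the burden of proof.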

2.3.2 Violation of Privacy

Another quite problematic form of misuse is the violation of privacy, where a person’s data or even their vulnerabilities and weaknesses are specifically targeted to lure or coerce them into a contract.[46] It appears that companies are already tracking even the most minute details of their potential customers’ online activities, which could add up to 400 pages for a single user.[47] They also exchange such data sets,[48] creating yet another problematic area in data economics.[49]

There is also some evidence of privacy violations where sensitive information is used for advertising purposes, either to optimize the timing[50] or the audience[51] of the advertisement. This discussion of privacy violations brings two major contractual issues to the surface: the first is the sensitive data itself, which is transferred from one company or institution to another,[52] and the second is the individual contract concluded by a person whose privacy has been breached beforehand.[53]

2.3.3 Steering (also through Blocking and Opacity)

A third type of questionable practice is steering, where algorithms personalise search results according to the data they possess and work with, such as the region from which the potential customer connects.[54] The aim is to get more people to choose one deal or one similar service over another. Sometimes steering is so sophisticated that a company practically hides some results from certain users.[55] This first form often follows from the location of the user and may also be termed geoblocking.[56] There is a second, softer version of steering, where all results are available but in a particular order, so that a user is more likely to choose one of the leading results rather than the one that is best for them.[57]

Steering could also take place in a milder form via (or sometimes be intensified by) intransparency/opacity. This problem concerns not only transparency provisions such as Articles 12 and 13 GDPR (Articles 4, 5 and 6 KVKK)[58] but also the information obligations under the UWG (TTK)[59] and the BGB (TBK/TMK).[60], [61] If the disclosure of information would impact the user’s decision (e.g., in the case of price discrimination), the fact that this information is withheld from the user is itself an element of steering. Intransparency – whether as a lack of transparency or as deliberately chosen opacity – could then in itself lead to individual claims, in particular under Article 82 GDPR.[62] It could also have a cumulative effect on the misuse practices, leading to violations of, for example, § 3(2) UWG[63] and §§ 305 et seq or § 123 BGB (§§ 20 et seq or § 36 TBK) – especially where good faith is harmed.[64]

2.3.4 Combination of Misuse Practices

One key feature of big data-related contractual problems is that they rarely occur as a pure, single-layer type of infringement. Instead, they often involve a combination of several simultaneous practices. For example, a website might use steering alongside price discrimination, making the misuse twice as effective by not only showing less relevant results but also making them more expensive.[65] Another example might involve violating users’ privacy by exploiting their vulnerabilities and then using those vulnerabilities to discriminate on price. Even in cases where only one type of practice is apparent, a lack of transparency may suggest a combination of violations, making the practice a potential breach of contract law as well as data protection law.

The combination of practices has two main roles to play in the consideration of potential misuse cases:

  1. First, if a company engages in a combination of improper practices rather than just one, this is significant for the examination of subjective elements such as intent or negligence. It indicates that the company is acting with intent or, at the very least, negligently in allowing multiple misuse practices to occur.

  2. Second, the combination of excessive practices constitutes another distinct type of case. Here, multiple infringements collide and accumulate, leading to a summation effect[66] comparable to that known from the law on general terms and conditions[67] or unfair competition law:[68] milder forms of misuse might not independently reach the threshold of illegality, but when these misuses are combined, they collectively do.

3 Regulation and some Private Law Tools

A brief explanation is necessary at this point to explore the nexus between European regulation and national private law. A recent decision of the Court of Justice of the European Union (CJEU) is used to trace the historical development of the debate (Section 3.2), as it sets out valuable principles for this relationship. The paper then discusses some contract law mechanisms against big data misuse (Sections 3.3 and 4), as German and Turkish law can already offer a variety of remedies against such misuse.

3.1 An Overview of the Discussion

The debate on the connection between public good regulation and private law is not only a historical but also a contemporary one.[69] In Germany, the discussion can be traced back to ordoliberal thought in the mid-20th century[70] and has split into four main approaches, with the fourth gaining momentum in recent years.[71]

The ordoliberal approach regarded private law and the regulation of the public good as two completely distinct paths and was therefore named the separation approach.[72] Next followed the milder, more economic approach, which maintained the separation between the two but treated private law and the regulation of the public good as being on a par.[73]

Following George Akerlof’s celebrated essay ‘The Market for Lemons’,[74] a third approach developed, which correlated private interests with the public good.[75] Akerlof argued, in a nutshell, that information asymmetries in a market are harmful not only to individuals but, over time, also to the market itself. A market that does not function optimally due to information failures could in the long run lead to adverse selection, which could then cause the market as a whole to collapse.[76]

The fourth, mild-regulation (or, as this paper calls it, harmonic) approach is still the subject of ongoing discussion,[77] yet in principle it does not view the two as ‘either-or’[78] propositions but rather as linked and complementary. This paper follows the harmonic approach, also within the framework of big data regulation in the EU and the personal rights of individuals. A recent decision of the CJEU becomes quite central moving forward, as its implications match this approach.

3.2 Recent Case Law of the CJEU: Private Law and Regulation in Harmony

The CJEU recently ruled against the Mercedes-Benz Group AG[79] that car buyers are entitled to compensation for damage caused by the use of so-called thermal windows,[80] based on § 823(2) BGB [§ 49(2) TBK], and even in cases of mere negligence.[81] This precedent has not only overturned the previous decisions of the BGH but has also paved the way for a wave of similar legal actions in the future.[82]

In the words of the CJEU, the European regulations on this matter[83] ‘(…) must be interpreted as protecting, in addition to public interests, the specific interests of the individual purchaser of a motor vehicle vis-à-vis the manufacturer of that vehicle, where that vehicle is equipped with a prohibited defeat device within the meaning of the latter provision.’[84]

The importance of the decision reaches beyond its subject matter, as it has the potential to shift the aforementioned discussion towards the fourth view, namely the mild regulatory approach.[85] Even in cases where the regulation does not confer an applicable individual remedy, EU Member States should acknowledge at least one such remedy[86] within their national norms.

The debate has indeed already had an impact on both European and German legislation, as Article 11a was added to the Unfair Commercial Practices Directive[87] amidst the ongoing discussions described above, encouraging Member States to provide individual protection against unfair commercial practices.[88] This was then implemented into the German UWG as § 9(2) in 2022 for practices directed against consumers.[89]

Building on this decision, some European and German statutes will now be examined, since in cases of big data misuse an individual claim is always available, whether the violation concerns an individual interest or a public interest of the EU. It is also worth mentioning that, for future cases, a specific individual remedy in European regulation will no longer be a necessity, as the decision grants individual claims notwithstanding the regulatory picture at a given time.

3.3 Remedies from European Law Against Big Data Misuse

With regard to several aspects of big data, European regulation offers a wide range of new rules, such as the DSA,[90] the DMA,[91] the Geo-blocking Regulation,[92] the Data Act[93] and the proposed AI Act.[94] This paper, however, identifies three main contractual sources under this topic and then proceeds to discuss national private law claims under the BGB. Although the UWG actually pre-dates the EU regulations, its reforms since 2004 have mainly served to harmonise German unfair competition law with the corresponding European rules.[95] For this reason, the UWG is also considered under the European heading.

3.3.1 Consumer Law

The first possibility is consumer law, mainly under §§ 312 et seq BGB,[96] which is largely based on the Consumer Rights Directive[97] and the Directive on Electronic Commerce.[98] This first body of law adds little specific value to the matter at hand, however, as general consumer rights exist regardless of whether data has been misused.[99] Suffice it to note that consumer rights of return and cancellation are also granted in cases of misuse of big data, provided, of course, that one of the parties is a consumer within the meaning of § 13 BGB[100] [§ 3(1) lit k TKHK].[101] Following the 2022 amendment,[102] § 9(2) UWG also allows individual consumer claims in cases where a practice falling short of professional diligence materially affects the economic behaviour of the consumer.[103]

3.3.2 Article 82 GDPR

A second possible remedy emerges from Article 82 GDPR, in the shape of a right to compensation.[104] It grants any person who has suffered material or non-material damage as a result of a breach of the GDPR the right to receive compensation from the controller or processor for the damage suffered. The right to compensation is also independent of the national laws of the EU Member States.[105] In Germany, for instance, there is already a large number of compensation cases, with the amounts claimed currently ranging from € 25[106] to € 10,000.[107] A recent CJEU ruling against Österreichische Post AG also paves the way for more damages claims under Article 82 GDPR, since the court held that no materiality threshold needs to be met.[108] Thus, any violation of the GDPR could and will be the target of individual lawsuits under Article 82 across the Member States of the EU moving forward.

3.3.3 §§ 3 et seq UWG – Unfair Practices

A third pillar of protection lies in §§ 3 et seq UWG (§§ 54 et seq TTK). The misuse of big data could infringe several provisions of the Act against Unfair Competition. Yet it is highly controversial whether and to what extent these provisions can be applied in the case of big data.[109] For some authors, personalisation is not material information giving rise to a duty of disclosure.[110] For others, in cases of personalisation there is often no false information about the product or the price.[111] Furthermore, the freedom of decision – as understood in the UWG/TTK – differs from the freedom of decision in the BGB/TBK.[112] Similarly, § 3(4) UWG sets the average consumer as the standard for determining whether an unfair practice has been committed, which could cause some shortcomings in this context, given the susceptible groups and users who are more vulnerable to cases of big data misuse.[113]

This paper takes a more lenient approach by first accepting the general view that a blanket prohibition of such practices by the UWG/TTK is indeed not practicable.[114] In determining to what extent a practice resulting from the use of big data amounts to misuse and to what extent it does not, however, the UWG provides viable guidelines, especially after its most recent amendment in 2022.[115]

First, in the case of undue influence as defined in § 4a(1) no 3 UWG [§ 55(1) lit a no 8 TTK], a practice might be described as unfair if it exerts pressure on the market participant, even without the use or threat of physical force, in such a way that the other market participant’s ability to make an informed decision is significantly restricted.[116] In line with the prevailing opinion, classifying personalisation through big data as an exercise of pressure is far-fetched, even where the personalisation leads to cases of misuse, which is why the possibility of invoking § 4a UWG [§ 55(1) lit a no 8 TTK] seems distant.[117]

Another unfair practice could be the withholding (no 1) as well as the unclear (no 2) or untimely (no 3) disclosure of material information, each of which is recognised as an omission under § 5a(2) UWG and prohibited under § 5a(1) UWG [§ 55(1) lit a no 5 et seq TTK].[118] Material information is defined as information which the market participant needs in order to make an informed decision and the omission of which is likely to cause the market participant to take a decision that they would not have taken otherwise. Particularly in cases of steering, or cumulative cases involving steering, the practice could be unfair if the market participant is substantially misled.[119] To avoid this, the big data controller should disclose all key metrics used in its personalisation process that might be classified as material information under § 5a UWG.[120]

Not least, § 3(2) UWG describes practices as unfair if they do not meet the requirements of professional diligence[121] and are likely to significantly change the economic behaviour of consumers.[122] A major shortcoming of the UWG was that it lacked an individual right to damages even though it recognised the particular unfairness to consumers; this was remedied in 2022 by introducing such a claim into § 9(2) UWG.[123] If a person who misuses big data intentionally or negligently engages in an unlawful commercial practice under § 3 UWG and thereby induces consumers to make a transactional decision which they would not have made otherwise, that person is obliged to compensate the resulting damage. Just like compensation under Article 82 GDPR, the UWG now also provides remedies for individual claims – albeit only for consumers.

4 Remedies from National Law Against Big Data Misuse

4.1 § 138(1)/§ 138(2) as Limits to Party Autonomy

One major link between European principles – and likewise the constitution – and private law is § 138 BGB [§ 27(1) TBK], which declares contracts contrary to public policy void.[124] Public policy is defined as the sense of decency of all who think fairly and justly.[125] Defining such an abstract term carries some potential risks, but it also gives private law a chance to combat the misuse of big data. In particular, in cases of privacy violations where sensitive data is used for commercial pursuits, either as part of an advertising strategy or as a tool of pressure over certain users, the transaction could be declared void. This might also apply to data swap agreements between companies, where sensitive personal data is transferred from one company to another.[126]

This limitation of party autonomy might also come in the shape of usury, as some big data misuse practices involve taking financial advantage of the vulnerability of the other side of the contract.[127] Usury is to some degree more concrete than the public policy option, because its elements[128] are easier to identify.[129] First, there needs to be an exploitation of a contract in such a manner that the consideration is disproportionately inflated.[130] This must also result from the exploitation of the inferiority of the other party, either by taking advantage of their lack of experience, their inability to exercise sound judgment, or an emergency.[131]

With a few exceptions, such as rental agreements,[132] the German BGH regards disproportionality as the doubling of the consideration.[133] The earlier observations regarding the power of big data become important for the examination of the subjective elements, as the excessive use of algorithmic capabilities can exploit a lack of experience, especially among older and inexperienced users. In some cases, a more traditional form of usury could also occur by taking advantage of a predicament – as in the Uber low-battery example.[134]
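Expressed as a simple rule of thumb (with purely hypothetical figures chosen for illustration, not taken from any of the cases discussed here), the objective element on the BGH’s view would be satisfied where

$$\frac{\text{personalised price}}{\text{market value of the performance}} \geq 2, \qquad \text{e.g.}\ \frac{€\,100}{€\,50} = 2.$$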

Contrary to the general view, this paper accepts the possibility of usury in cases of misuse of big data.[135] In cases where a user is disproportionately discriminated against due to inexperience or emergency, the obligatory transaction could be void under § 138(2) BGB (§ 28 TBK). In milder cases, where the conditions are not fully met, a usury-like transaction under § 138(1) BGB (§ 27 TBK) could also assist.[136]

4.2 § 134 as Another Limit

The violation of a statutory prohibition can also lead to the nullity of the transaction, unless the statute provides for a different consequence.[137] According to § 2 EGBGB, a statute refers to any legal rule, including national and international norms as well as customary law, with some exceptions.[138]

According to the prevailing opinion, there is no subjective element in § 134 BGB [§ 27(1) TBK]; it is sufficient if the objective elements are fulfilled.[139] All that is required is a prohibiting statute within the meaning of § 2 EGBGB that is not a lex perfecta, and a violation of that statute. § 134 BGB and § 27 TBK thus take incomplete statutes[140] and transform them into leges perfectae by supplying the previously missing legal consequence.[141]

The obligatory transaction is then void, while the transaction of performance generally remains valid; an obligation to return what has been obtained may, however, arise under § 812 BGB (§§ 77 et seq TBK).[142]

All European codes and principles are also statutes within the meaning of § 134 BGB. As long as these norms do not determine their own legal consequences, the questionable legal transactions may be nullified under § 134 BGB.

The review of content under § 134 BGB applies not only to individual contracts between companies and users but also to contracts for the acquisition of such data. For example, in the case of personalised advertising, the acquisition of sensitive data may be void under § 134 BGB in conjunction with the GDPR and the BDSG.[143] An earlier decision of the OLG Frankfurt[144] demonstrated this connection by declaring a data transfer contract between two companies void for violation of § 134 BGB in conjunction with § 28(3) BDSG (old version).

In this sense, the application of § 134 BGB is also essential with regard to the focus of the CJEU decision (Section 3.2 above), as it links European law and national law, especially in cases where European rules fail to provide individual claims.

4.3 § 123 through Deceit by Silence

Contracts concluded through a declaration of intent affected by the misuse of big data could, on the other hand, be voidable on the ground of deceit. § 123 BGB (§ 36 TBK) provides for the voidability of a declaration of intent based on duress or deceit.[145] A misuse of big data leading to deception might appear rather unlikely. However, deception can also occur through the silence of a contracting party if that party is obliged to disclose information in order to avoid misunderstandings or damage to the other party.[146] Facts about fundamental parts of a contract should be disclosed without request, especially when one contracting party has superior knowledge compared to the other.[147]

This principle also holds for contracts in the context of big data misuse: data-driven companies enjoy special leverage over users as a result of excessive information asymmetries.[148] Users lack basic information regarding the role of data in their daily contractual relations, not to mention the technical expertise to fully grasp how their data is being used against them, were they to be provided with such information. Furthermore, European law also obliges data handlers to process and transfer personal data transparently.[149] A failure to disclose such information therefore threatens not only contract law but also the very principles of European data regulation.[150]

Particularly in cases where the privacy of the user is violated by data-driven companies, as well as in cases of transparency problems, this might lead to the voidability of the declaration of intent under § 123 BGB. Companies that use big data as a tool for tailoring contracts should also be held accountable for cases of unjustified discrimination directly resulting from such contracts.

There is no general obligation to disclose all information that might be important to the other party’s decision-making.[151] In determining whether such an obligation exists, principles such as good faith and the prevailing views of the particular business are relevant. If there is information asymmetry, one contracting party’s business or professional inexperience may require the other party to provide information in good faith.[152]

In cases of personalisation based on big data, the essentialia negotii are usually involved, e.g., the price paid. In contrast to the thresholds required for usury under § 138(2) BGB (§ 28 TBK), it could be argued that even a one per cent discrepancy could be meaningful to the user if such a discrepancy results from data-driven personalisation. The information asymmetry between the parties, which puts the user in a considerably inferior position, likewise supports the necessity of disclosure.

Therefore, in cases of big data and the personalisation of contracts, companies should keep the necessary information on the existence of personalisation, as well as the basic metrics used for such calculations (location, purchase history, credit points et cetera), available to the user. This information should be provided in plain and simple language and should straightforwardly convey the essence. For example, phrases such as ‘you will pay 20 % more due to personalisation’, ‘this price is calculated just for you’ or ‘this ad is personalised’ might be considered acceptable, with the option to read more if the user so decides.[153] Otherwise, the user’s declaration of intent is at risk of being avoided under § 123 BGB (§ 36 TBK) within the avoidance period set by § 124(2) BGB (§ 39 TBK).

4.4 Culpa in Contrahendo

As part of the principles of contractual liability within the BGB, culpa in contrahendo, or pre-contractual fault, constitutes an alternative pillar against cases of misuse where the user suffers loss. The BGB deals not only with certain specific cases of culpa in contrahendo, such as the liability of the party declaring the contract void[154] or of an unauthorised agent,[155] but also, since the reform of 2002,[156] with a provision of general application in § 311,[157] which covers further scenarios.[158]

Culpa in contrahendo under § 311 BGB grants a compensation claim for pre-contractual fault and, in extremely limited cases, even a quasi-contractual claim where a contract does not materialise and one of the contracting parties thereby suffers a detriment.[159] The main requirement is that an obligation exists between the parties,[160] which is often established by the initiation of a contract.[161]

Similar to some of the preceding discussions, this paper also accepts culpa in contrahendo as a pivotal instrument against big data-related misuse.[162] Unlike usury, it does not require an obvious disproportion between the performances, and unlike voidability under § 123 BGB (§ 36 TBK), it does not depend on intent as a subjective element. Another advantage is that culpa in contrahendo compensates for the damage caused by the contract while keeping the transaction in force. Furthermore, it covers only the negative interest of the user, putting them in the position they would have been in had there been no misuse, which could serve as a practical countermeasure against such cases.[163]

The general framework of § 311 BGB can be applied to a variety of cases – not only price discrimination but all forms of abuse and even cumulative cases. All misuse cases resulting in damage to the user, where there is a deficit of rationality and this deficit is abused or reinforced by the big data controller, could be covered by culpa in contrahendo.[164] To avoid such liability, companies should disclose the relevant elements beforehand, and the same principles should apply as under § 123 BGB above.[165]

4.5 §§ 305 et seq BGB

The terms and conditions leading to the discriminatory practices could also be invalid under the law on standard terms (AGB law), §§ 305 et seq BGB (§§ 20 et seq TBK).[166] First of all, consumers enjoy stricter protection, as under § 310(3) BGB (§ 4 TKHK), where the requirements are easier to establish and even a term intended for one-time use may constitute a standard term, whereas in ordinary contracts the terms must generally be intended for repeated (typically at least threefold) use.[167]

For other contracts, standard terms are invalid under certain circumstances.[168] According to § 307(1) BGB (§ 21 TBK), such provisions are invalid if, contrary to the requirement of good faith, they unreasonably disadvantage the other party.[169] Such a disadvantage may also result from the provision not being clear and comprehensible, § 307(1) sentence 2 BGB (§ 21 TBK). If certain provisions are removed from the contract, the contract is maintained with the rest of its provisions according to § 306(1) BGB (§ 22 TBK), and the gaps are filled by statutory provisions.

In the context of big data, excessive secondary obligations are also disadvantageous if they are not related to the main obligation and do not bring any additional value to the other party.[170] Thus, if consent is given for additional obligations, such as advertisement obligations, such obligations could also constitute an unreasonable disadvantage under § 307 BGB (§ 21 TBK) and be declared ineffective.

A pivotal feature of the AGB law is that, in the case of cumulative clauses, the clauses taken together may be invalid even if each individual clause would not be declared invalid on its own.[171] Such a multiplier effect could be particularly useful in the case of cumulative personalisation practices, which may appear harmless or negligible in isolation but, taken together, pose an unreasonable disadvantage to the user.

5 International Case Law: A Few Examples

5.1 Meta (Australia)

In the first example, a leaked internal document of Meta[172] Australia claimed that the company had the tools to identify users as young as 14 and to calculate the perfect time to advertise mental support to them.[173] According to the document, the company could also identify users who felt insecure, worthless, stressed, overwhelmed or even stupid and useless.[174] This would allow advertisers to reach them at their most vulnerable by targeting the exact moments when young people needed a confidence boost, which would naturally increase the chances of them clicking on an advertisement and ultimately entering into a contract.

Meta had already been involved in some questionable practices before this point. An earlier study in the US showed the power of social media platforms such as Facebook by proving that users’ emotional states can be transferred to others through regular use of such platforms.[175] Another study showed that just 300 likes on Facebook were enough to predict a user’s traits and future behaviour better than their spouse could.[176] Similarly, Meta was already able to identify 52,000 different data points in the data sets the company was working with in 2014.[177] Another notorious example was the Cambridge Analytica scandal, in which voters were manipulated by political advertising through social media.[178]

In the example of advertising targeted at young people to increase the chances of concluding a contract, several private law and European law institutions are violated.[179] The first contract, between Meta and the advertisers and concerning the insecurities of the users, embodies a violation of privacy and contradicts multiple principles of Article 5 GDPR (Article 4 et seq KVKK), such as lawfulness, purpose limitation and data minimisation. Furthermore, there is no informed consent within the meaning of Article 7 GDPR (Article 6 KVKK). This first B2B contract therefore also violates § 134 BGB (§ 27 TBK) and is invalid. There could also be sanctions against Meta based on Article 83 GDPR.[180]

The second contract is the one concluded by the young user as a result of the advertisement, after their privacy has been violated. For this contract, there should be a right of avoidance under § 123 BGB (§ 36 TBK) on the ground of deceit by silence, as Meta does not disclose a central feature of the respective contract which the user could, in good faith, expect to be disclosed.

Liability based on culpa in contrahendo is also possible, as neither Meta nor the advertisers fulfil their pre-contractual obligations towards the younger users in good faith. If this breach causes damage to the user, e.g., if the user pays more for the contract because they were in need, this damage should also be compensated under § 311 in conjunction with § 249 BGB. If the discrepancy reaches the usury threshold, nullity under § 138(2) BGB (§ 28 TBK) could also apply. Individual claims for damages under Article 82 GDPR and § 9(2) UWG (§ 56 TTK) remain applicable.

5.2 Decolar (Brazil)

There have already been examples of price discrimination depending on several characteristics of the user. In the Uber variant, low battery status multiplies prices;[181] in the Orbitz variant, it is the operating system used that leads to discrimination.[182] The most common of these practices is also known as geopricing, where discriminatory pricing occurs based on the location of the user.[183] The existence of geopricing in the United States has been demonstrated in a previous survey.[184] There has also been scientific proof of the existence and dynamics of this phenomenon.[185]

In the case of Decolar, a large hotel and flight reservation platform operating in South America,[186] users from Brazil were unable to access several search results during the 2016 Olympic Games held in Rio de Janeiro, while the same hotel rooms were accessible to users from Argentina and several other countries. Even when Brazilian users were able to access the same results, the prices were different.[187] For some hotel rooms, the discrepancies reached as much as 500 %.

Not only did Decolar engage in price discrimination in this case, it also practised steering in the form of geoblocking.[188] As a result, Brazil’s Department of Consumer Protection and Defence fined the company 7.5 million Brazilian reais[189] for infringements of Articles 4,[190] 6[191] and 39[192] of the Brazilian Consumer Protection Code (CDC).[193]

In addition to the administrative sanctions, the geopricing/steering practices in the Decolar case also trigger several private law claims for individual users.[194] First of all, the disproportionality threshold of usury is met, as the differences reach up to five times the original price. If the other elements are also present – that is, if the user is in an emergency or extremely inexperienced and the company acts with intent (which is assumed in this particular example) – the transaction could be void under § 138(2) BGB (§ 28 TBK).

The same transaction is also subject to avoidance for deceit by silence if the company does not disclose the necessary information behind the calculation of the price, as the price belongs to the essentialia negotii of a sales contract and such disclosure could be expected in good faith. Again, the assumption is that the company acts at least with intent.

Liability under culpa in contrahendo also arises on the assumption that the company acts at least negligently, and users who have been subjected to such discrimination could be compensated under § 249 BGB. Individual claims for damages under Article 82 GDPR and § 9(2) UWG (§ 56 TTK) remain available.

5.3 Google (France)

One last example came from France immediately after the GDPR came into force. The French National Commission for Information Technology and Freedom (CNIL)[195] fined Google € 50 million for lacking a legal basis for its personalised advertising.[196] Users were required to accept Google’s terms and conditions and privacy policy, otherwise their Android devices would have been unusable. According to the CNIL, Google did provide the necessary information to its users, but the volume of information was too large for users to identify the data processing operations. As a result, Article 12 GDPR (Article 11 KVKK) was violated due to the lack of accessibility of the information.[197]

The CNIL also concluded that consent had not been validly obtained through a positive act but rather by way of an opposition to another procedure,[198] nor was it specific to each processing activity. The total fine of € 50 million was fully upheld by the Conseil d’État[199] in mid-2020.[200]

In addition to sanctions under Article 83 GDPR of up to 4 % of the company’s annual turnover, such practices could also trigger individual claims under Article 82 GDPR. The terms and conditions leading to the questionable practices of data usage and transfer could also be invalid under §§ 305 et seq BGB (§§ 20 et seq TBK). The user could also resort to § 311 BGB (§ 2 TMK) as well as § 9(2) UWG (§ 56 TTK) in cases of damage.

6 Concluding Remarks

The implementation of big data transforms the nature of traditional contracts by introducing another information asymmetry between the contracting parties. This shift grants companies greater dominion over the other party, impacting not only consumers but also smaller enterprises. Consequently, the misuse of big data engenders both competitive and contractual issues between larger and smaller companies.

A recent ruling by the CJEU significantly contributes to the ongoing discourse on the interplay between public policy regulation and national private law. This decision advocates for a more harmonious form of regulation, where the two domains are complementary. The ruling stipulates that there should be at least one remedy under national law if a regulation’s objectives fail to provide personal compensation.

To address price discrimination, several remedies are available, including those related to usury, public policy violations, voidability by silence, and culpa in contrahendo. In cases of privacy violations, there are two primary focal points: the transfer of sensitive data between entities and the individual contract formed post-breach. The initial contract may be voidable due to public policy violations or statutory prohibitions, while the subsequent contract could be voidable due to deceit, with liability arising from culpa in contrahendo. Personal claims under the GDPR and UWG are applicable for both types of misuse.

For steering or mixed cases, liability for culpa in contrahendo may arise if damages occur. Such misuse could also activate European compensation mechanisms under the GDPR and UWG. Beyond individual instances, misuse frequently manifests through a combination of multiple practices. This cumulation has dual effects: it may indicate subjective elements such as negligence or intent, and it may trigger the cumulative effect of AGB or UWG, rendering the collective practices unlawful.

To mitigate liability, companies must utilize big data in strict compliance with individual rights and the public goods enshrined in European principles. A crucial obligation for companies is the disclosure of material information, ensuring that they are exploring, rather than exploiting, the opportunities presented by big data in everyday contracting.


Corresponding author: Cihat Börklüce, LL.M., Humboldt University of Berlin, Berlin, Germany, E-mail:
The author is an LL.M. Alumnus, a YLSY scholar and doctoral candidate at the Humboldt University of Berlin (currently waiting for publication). His assigned place of return after the doctorate is the Civil Law Chair at the Turkish–German University in Istanbul. For questions, comments and suggestions please contact at: cihatborkluce@gmail.com
Published Online: 2024-11-21
Published in Print: 2024-12-17

© 2024 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
