Safeguarding Child Viewers: Legal Strategies for Commercial Sharenting on Social Media in China
Abstract
In the digital age, “commercial sharenting” refers to parents excessively sharing their children’s images and data on social media for profit. Initially motivated by parental pride, this practice is now driven by child-to-child marketing, where young influencers shape their peers’ consumption habits. While regulations protect child influencers’ privacy, a significant gap remains regarding the rights of child viewers. We argue that commercial sharenting threatens children’s right to health under Article 24(1) of the UNCRC, potentially leading to harmful consumer behaviors and identity confusion. In response, China has adopted a fragmented regulatory approach to platform liability. This article advocates for a comprehensive legal framework incorporating content filtering, moderation, and review to regulate commercial sharenting and safeguard children’s rights and interests in China.
Introduction
The term “commercial sharenting” denotes the practice wherein parents excessively share photos and other information about their children on social media platforms with the primary aim of seeking financial profit.1 The earnings may come in the form of immediate payment, the establishment of future business interests for compensation, or through other avenues of current or potential revenue generation.2 Once a certain threshold of followers is reached, parents managing the account can monetize their social media presence and secure sponsorships from well-known brands, such as Walmart, Hasbro, and others.3 Violations of child influencers’ rights in commercial sharenting scenarios, such as privacy invasion and economic exploitation, have been extensively discussed in previous scholarship, prompting lawmakers in various jurisdictions to address this issue by safeguarding the right to privacy and preventing the digital exploitation of child influencers.4 However, apart from digital advertising,5 existing literature rarely provides a systematic and comprehensive focus on the impact of sharing images or information about child influencers on other children behind the scenes.
Children today are both creators and audiences of social media.6 Studies indicate that children spend considerable time watching videos of their favorite influencers, during which they are also exposed to influencer marketing practices.7 Excessive and unhealthy engagement with videos featuring child influencers can have significant adverse effects on child viewers’ physical and mental health, including increased risks of myopia, spinal issues, and psychological distress, which are explored in detail in the third section of this article. Previous literature also indicates that children are easily affected by the content posted by influencers,8 especially peer influencers.9 The fast-paced development of social media opens up fresh avenues for brands to interact with children, integrating embedded advertising formats that seamlessly weave brand messages into captivating media content. This method makes promotional aspects less obvious, thus increasing the difficulty of detection by the target audience.10
This article endeavors to provide a systematic analysis of child viewers within the legal framework of commercial sharenting. Given that content shared by child influencers or their parents falls under the category of user-generated content, platform governance is considered a highly effective strategy for addressing associated issues.11 We argue for the necessity of compelling platforms to assume additional content moderation responsibilities in the context of commercial sharenting. This can be achieved through the establishment of a content governance mechanism, which would include detailed rules for the proper disclosure of sponsored products and strict content classification. A three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting. Firstly, any content in commercial sharenting that risks encroaching on children’s privacy should be prohibited on social media, as it constitutes unlawful behavior. Secondly, social media platforms should be obligated to moderate content to protect child viewers, particularly through advertising laws or any other specific regulations. Finally, the third mechanism could leverage the “Notice and Action” provisions of the European Union’s Digital Services Act (DSA), requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.
The Irrefutable Influence of Child Influencers on Child Viewers
Child-to-child marketing exerts a pervasive influence on targeted children, with child influencers regularly promoting products like toys to their peers on platforms such as YouTube. This trend underscores that child-driven marketing has significant influence on, and economic implications for, those children’s peers.12 Researchers have found that children copy their peers’ behavior, including their consumption patterns,13 a phenomenon referred to as peer modeling or social learning.14 Gaining peer acceptance and approval is important to children.15 From the ages of 6 to 7 in particular, peers become important agents of consumer socialization, and children begin to model their peers’ behavior.16 Even preschool children’s behavior and consumption may already be influenced by their friends.17 This has been shown to be particularly true when the peer is the same age or slightly older.18
The rise of social media introduced a new type of peer endorsement, namely social media influencers, who are now widely acknowledged as a new form of celebrity.19 This new form of peer endorsement can reach a large number of child viewers due to the proliferation of social media, resulting in a wide-ranging impact.20 Due to their relatable and approachable nature, child influencers offer viewers a window into their private lives through social media posts. This fosters a pronounced sense of shared experience between the creator and the audience, positioning child influencers as akin to their peers in the viewers’ eyes.21 According to source attractiveness theory, the driving factors of source attractiveness consist of the source’s similarity, familiarity, and likeability. Specifically, similarity refers to the resemblance between the source and the receiver; in this context, the child influencer and the child viewer share a common identity as children. Familiarity denotes the degree to which the receiver knows the source through repeated exposure; in this scenario, the content created by child influencers, which is often centered around their daily lives and filmed in their homes, fosters a sense of familiarity with their audiences. Likability is defined as the affection felt towards the source, which can result from the source’s physical appearance and behavior.22 Similarity, familiarity, and likability contribute to the receivers’ identification with the source, thus increasing the likelihood of adopting the source’s beliefs, attitudes, and behaviors.23 In this case, the tangible appeal that child influencers have for their young audiences amplifies the likelihood of impressionable children emulating behaviors or making consumer choices based on the marketed content.
Brands hope that the positive associations linked to celebrities will transfer to the brands they endorse.24 Social media offers brands fresh avenues to reach children through embedded advertising formats that seamlessly weave brand messages into captivating media content. Since child viewers and their parents are often less resistant towards endorsements from child influencers, as peers are considered authentic information sources with no commercial interest,25 these promotional aspects become even less obvious and harder for the target audience to detect.26
Detrimental Impacts on Child Viewers
The detrimental effect that social media can have on child viewers emanates from various sources. Considering the gravity of this issue, our attention is directed towards the deleterious impact engendered by child-to-child marketing facilitated by child influencers on social media.
Misguided consumer behavior with undisclosed sponsorship
Based on the reviewed literature and theories, children are easily influenced by their peer models and are likely to imitate their behavior, including the use of products they endorse.27 As child influencers usually show the usefulness of the toys, clothes, or food they endorse, this action may accordingly lead to observational learning from their viewers.28 Thus, child viewers are particularly vulnerable to exploitation through targeted advertising, especially when ads are designed to exploit their developmental vulnerabilities, such as the varying levels of advertising literacy across early childhood (under age 5), middle childhood (6–9 years), and late childhood (10–12 years).29
In 2019, the watchdog group Truth in Advertising (based in Connecticut) filed a complaint with the United States (US) Federal Trade Commission (FTC) against “Ryan Toys Review,” YouTube child influencer Ryan Kaji’s top toy review channel,30 accusing Kaji of deceiving children through “sponsored videos that often have the look and feel of organic content.” Approximately 90% of his videos have featured at least one paid product recommendation targeting preschoolers, a demographic too young to discern between commercial content and genuine reviews.31 As Bram and Catalina have stated, “embedding sponsored content without any disclosure is the biggest harm faced by consumers in this industry.”32 The purposeful directing of advertising content towards children through videos created by child influencers necessitates specific regulatory oversight to ensure ethical and responsible marketing practices.
Right to health
The right to health is recognized as a fundamental right under the United Nations (UN) Convention on the Rights of the Child (UNCRC). Article 24(1) of the UNCRC specifically outlines the right of the child to enjoy the highest attainable standard of health. The content of children’s right to health in the digital age is specifically interpreted in General Comment No. 25 (2021) on children’s rights in relation to the digital environment, which compels Member States to implement regulatory measures against materials and services that have the potential to adversely affect children’s mental or physical well-being. These provisions specifically address targeted or age-inappropriate advertising, marketing strategies, and other pertinent digital services, intending to shield children from exposure to the promotion of unhealthy engagement in social media.33 In the visual world of commercial sharenting, child viewers’ excessive and unhealthy engagement with videos featuring child influencers can have significant adverse effects on their physical and mental health.
Previous research on the impact of celebrity endorsements on children has mainly focused on food marketing. Specifically, research has found that brands use celebrities to endorse energy-dense and nutrient-poor products,34 increasing children’s intake of these less-healthy foods.35 Researchers have examined the impact of social media influencers’ marketing on children’s food intake.36 Results have shown that influencer marketing of unhealthy foods increases children’s immediate food intake, which is direct evidence of the detrimental impact on children’s physical health.
Researchers use “health-related quality of life” (HRQoL),37 an important multidimensional concept measuring the risk of disease precursors, to indicate the health status of the next generation.38 Previous empirical studies indicate that children with less screen time and moderate physical activity had the greatest HRQoL levels.39 According to a 2021 study by the Center for Research on Chinese Youth, 70.8% of respondents often used Douyin (Chinese TikTok), Kuaishou, Huoshan, and other platforms to watch short videos.40 The addiction to social media poses great challenges to the physical and mental health of young people.41 The continuous viewing of screens, often for extended periods, leads to an increased myopia risk and other vision-related issues among children.42 Previous studies indicate a statistically significant correlation between screen time (high vs. low) and myopia.43 Additionally, prolonged sedentary behavior—maintaining improper posture while engrossed in video content—can contribute to spinal problems and musculoskeletal pain in young viewers.44 In addition to physical health concerns, the mental health implications of excessive screen time for children are equally concerning. Researchers have highlighted the adverse effects of internet addiction, characterized by an inability to regulate internet usage, which can lead to significant distress and functional impairment.45 Studies have shown a notable correlation between excessive use of short video applications and addictive behavior,46 as well as a strong association between excessive social media use and depressive symptoms.47
These physical and mental problems of children have escalated to the point of becoming a significant public health concern. Per a report by the World Health Organization (WHO) titled “Public Health Implications of Excessive Use of the Internet, Computers, Smartphones, and Similar Electronic Devices,” adolescents and young adults have emerged as the primary users of the internet and contemporary technologies. Behavioral addictions, characterized by an irresistible urge to repeatedly engage in online activities such as social media, have become a prevalent phenomenon across various jurisdictions.48 The negative impact on child viewers cannot be overlooked and warrants serious consideration due to the potential long-term consequences on their development and well-being.
Identity confusion
Previous studies indicate that children are highly inclined to identify with popular child influencers and adopt their attitudes, beliefs, and behaviors, including those related to brands and products.49 Sources used in advertising can serve as role models that children emulate in forming their identities.50 The likelihood of this process increases when consumers develop a parasocial relationship (PSR) with the source.51 PSR, as introduced by Horton and Wohl,52 refers to the relationships consumers develop with media figures, making them influential sources of information.53 The need for companionship, which is a fundamental driver for relationship formation, begins to emerge during childhood.54 As a result, children are particularly susceptible to developing PSRs with media figures.55 The one-sided emotional connections that individuals form with media personalities, such as popular child influencers, can significantly influence children’s attitudes, beliefs, and behaviors, including their consumption choices.
The proliferation of inappropriate content within commercial sharenting, including the flaunting of luxury items, has the potential to inadvertently endorse materialistic values to child viewers. The promotion of consumer culture by child influencers to their child viewers fosters a mindset centered around consumerism and material possessions, which can have profound adverse effects on influencers’ and viewers’ identity development and well-being.56 This emphasis on material wealth can contribute to feelings of inadequacy, social comparison, and dissatisfaction with one’s own circumstances, potentially harming young audiences’ identity clarity.57 It is essential to recognize and explore the implications of such content within commercial sharenting platforms to promote healthier lifestyles and values among child viewers.
Basic Framework for Protecting the Child Viewer in China
In the digital era, the underlying issue regarding the effective regulation of commercial sharenting or other user-generated content falls on social media platforms’ governance abilities. According to the gatekeeping theory, formally identified by Kurt Lewin,58 the gatekeeper plays a crucial role in determining what information should be allowed to pass to groups or individuals and what should be restricted. Indeed, through the gatekeeping process, the gatekeeper filters and removes unwanted, sensitive, and controversial information. This function serves to exert control over society or a group, guiding it along a path perceived as appropriate or beneficial.59
Moreover, in recent years, platforms have increasingly employed pre-designed algorithms for the organization and recommendation of content.60 This transformation has significantly altered how users engage with online content.61 Take TikTok as an illustrative case: the platform predominantly relies on content interactions within personalized video feeds, with the performance relying heavily on the effectiveness of the recommendation algorithm.62 It is noteworthy that the potential of recommendation algorithms is rooted in the abundant availability of user data, rendering TikTok capable of augmenting the efficiency of content distribution and enhancing the adaptability of the personalized video feed.63 As the primary driving force behind content delivered to users, particularly minors, the legal regulation of recommendation algorithms holds significant importance within the content moderation process.
Legal Regulations Surrounding Platform Governance
To enhance content governance on platforms, China has implemented a series of laws, including the Cybersecurity Law of the People’s Republic of China, the Data Security Law of the People’s Republic of China, etc. Several related administrative regulations and departmental regulatory documents have also been issued in recent years. The Provisions on Ecological Governance of Network Information Content, as deliberated and adopted at the executive meeting of the Cyberspace Administration of China (CAC), came into force on March 1, 2020.64 The Opinions of the Cyberspace Administration of China on Further Pushing Websites and Platforms to Fulfill Their Primary Responsibility for Information Content Management were issued on September 15, 2021, to further prompt websites and platforms to fulfill their primary responsibility for information content management.65
To regulate algorithm-generated recommendations for internet information services, the departmental Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services were issued, which explicitly denote the user’s right to know the algorithm,66 the user’s right of choice,67 and the special protection of minors.68 Further, on August 12, 2022, the CAC issued an initial set of announcements obligating internet service providers to submit information concerning the algorithms of domestic internet information services. This mandate aligns with the Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services, which pertain to algorithm-generated recommendations69 and highlight the responsibility of internet service providers to ensure algorithm security and construct an algorithm accountability mechanism.70 These provisions on algorithms encompass algorithm-powered recommendation services (ARS) utilizing technologies such as generation and synthesis, personalized push, selection sort, search filtering, scheduling decisions, and other algorithmic technologies to deliver information to users.71 As a result, the majority of online platforms that Chinese internet users utilize fall within the regulatory scope that these provisions define.72 The governance of platforms in China also covers other aspects related to the protection of children in the digital sphere, such as the anti-addiction system and special protection of personal information contained in the Law of the People’s Republic of China on the Protection of Minors. Moreover, China’s efforts to protect minors’ rights in cyberspace are also evident in the newly issued Regulation on the Protection of Minors in Cyberspace, which came into force on January 1, 2024.73
Broadly speaking, China has adopted a fragmented regulatory strategy to govern platform liability, and there is still an absence of a dedicated law that comprehensively addresses content moderation. The primary responsibilities of platforms are governed by departmental regulatory documents that have a relatively low legal status.74 Additionally, the rigorous administrative supervision system, which relies on the approach of “further pushing platforms to fulfill responsibility,” may not provide strong incentives for platforms to proactively engage in content moderation. In reality, content governance primarily occurs through “special campaigns” rather than through normalized governance processes.75 For example, the CAC implemented a nationwide campaign to purify the online environment, which was intended to “crack down on wrongdoing and malpractice in live-streaming and short videos” and resolutely curb the manipulation of minors to make profits through live broadcasting and short videos to create child influencers.76 In spite of its immediate effect, this governance approach may face challenges in terms of long-term sustainability.
Specific Law
The previous analysis of children’s right to health reveals that long-term viewing of online videos or live broadcasts, as well as other forms of online social networking, poses inevitable health harms for child viewers. Some special laws have been enacted to address this issue in China. Specifically, Article 74 of the Law of the People’s Republic of China on the Protection of Minors stipulates that network product and service providers shall not provide minors with products and services that induce addiction. Cyber games, online live broadcasts, online audio and video services, online social networking platforms, and other types of network service providers shall set corresponding time limits, authority, and consumption management parameters along with other rules for minors’ use of their services.
Furthermore, social media platforms in China face significant liability if they fail to handle children’s personal information in accordance with the Law of the People’s Republic of China on the Protection of Minors. Under this law, network product and service providers, including social media platforms, that violate these requirements are subject to warnings, confiscation of illegal gains, and fines of between 100,000 yuan and one million yuan.77 The most severe legal consequences include the suspension of their business licenses and the closure of their websites by public authorities.78
Previous empirical studies assessing the impact of an advertising disclosure on minors’ recognition of influencer marketing have demonstrated that the inclusion of an advertising disclosure aids children in recognizing advertising.79 Therefore, the disclosure of sponsorship in influencer endorsements should be required to protect child viewers from misleading advertising. According to Article 9 of China’s Measures for the Administration of Internet Advertising, internet advertisements must be clearly identifiable, ensuring that consumers can recognize them as advertisements when promoting goods or services. This means that all internet advertisements must be explicitly labeled as advertisements to comply with legal requirements.80 If an internet advertisement is not identifiable, the market regulatory department shall order the violator to take corrective action and may impose a fine of not more than 100,000 yuan on the publisher of the advertisement.81
Response and Measures for Protecting Child Viewers in Commercial Sharenting
Regulating Platform Liability
In analyzing China’s fragmented regulatory strategy, it becomes apparent that the existing regulatory strategy is insufficient for the commercial sharenting regime. To effectively safeguard the rights of child influencers and viewers alike, there is a pressing need for an integrated and optimized regulatory scheme. This scheme should encompass a comprehensive framework that addresses the challenges inherent in commercial sharenting while ensuring the protection of children’s rights and interests. As described above, a three-step mechanism could be employed to safeguard both child influencers and child viewers in the realm of commercial sharenting.
Primarily, any content within the realm of commercial sharenting that jeopardizes the privacy of children ought to be strictly prohibited on social media platforms, given their potential to infringe legal boundaries. Secondly, it is imperative that social media platforms bear the responsibility of actively moderating content associated with commercial sharenting to safeguard child viewers’ interests in compliance with the protection of minors via advertising laws and other special laws. Finally, a regulatory mechanism could use the “Notice and Action” provisions outlined in the EU’s DSA for reference. This would entail social media platforms being mandated to promptly remove inappropriate commercial sharenting content upon notification by either public authorities or third parties. The first and second steps represent ex-ante regulations, while the third step serves as an ex-post mechanism to prevent potential harm to child viewers.
Content filtering
To protect child influencers’ rights, social media platforms should prohibit any content that poses a risk of invading children’s privacy or personal information, as it constitutes unlawful behavior. The content disseminated on child influencers’ social media accounts frequently transcends superficial portrayals of their domestic environments, delving into the disclosure of intimate personal details, including specific identifying information such as the child’s name, birth date, and school. Moreover, intimate information, as termed by Professor Leah Plunkett, encompasses geographic locations, daily routines, preferences, and other private details that viewers can glean from the posted content.82 As Fishbein notes, this implies that a child influencer’s private life becomes openly accessible to the public.83 The personal lives of child influencers are laid bare for anyone on the internet to observe, thereby exposing them to significant risks of misconduct by individuals with malicious intent, such as predators, pedophiles, or identity thieves. When a platform detects content that endangers children’s privacy or personal information, such as nude pictures of children, the content should be directly prohibited.
One case in which a social media platform was charged with violating child data protection laws occurred in Ireland. On September 1, 2023, the Data Protection Commission (DPC), Ireland’s supervisory authority, finalized its decision to impose a fine of 345 million euros84 on TikTok, which allegedly violated specific General Data Protection Regulation (GDPR) provisions concerning children’s data protection.85
Content moderation
As the second step, social media platforms should be obligated to moderate content related to commercial sharenting to protect child viewers through laws on the protection of minors, advertising laws, or other specific regulations. Various legal frameworks impose obligations on platforms and other network service providers to manage the amount of time users spend online, as required by instruments such as the Law of the People’s Republic of China on the Protection of Minors,86 while the non-binding Initiative for Preventing Juveniles from Short Video Addiction encourages platforms and providers to disclose product endorsements.
To safeguard children against manipulative covert advertisements in child-to-child marketing conducted by child influencers, advertising laws that include video-sharing platforms in their scope should be promulgated, mandating that sponsored content be appropriately disclosed. In Chinese law, the relevant provision can be found in the first paragraph of Article 9: “An [i]nternet advertisement shall be identifiable so that it can be clearly identified by consumers as an advertisement.”87 However, from examining regulations in the United Kingdom (UK) and the US, it is evident that China lacks specific legislation concerning social media influencer advertising.
The UK’s Communications Act 2003 pertains specifically to the child influencer industry, serving to shield children from exposure to harmful content and ensuring that viewers of influencer content are safeguarded against encountering advertisements without adequate warning.88 Article 319(2)(1) of the Communications Act 2003 prohibits the “use of techniques which exploit the possibility of conveying a message to viewers or listeners or of otherwise influencing their minds, without their being aware or fully aware of what has occurred.”89 This legislation prohibits child influencers from disseminating sponsored content without appropriate disclosure, thereby offering a preventive measure against potential manipulations of uninformed children.
In the US, in November 2019, the FTC issued comprehensive guidelines for endorsements by social media influencers and brands, which also encompass child influencers. The guidelines require that sponsored content be clearly labeled and that endorsements, which include featuring a product or service in a post, tagging or liking brands, “pinning” brands, or commenting on or providing reviews of brands,90 be truthful and not misleading.91 This brief comparative study does not aim to provide a comprehensive analysis of the UK and US approaches to regulating child influencer advertising, but it does seek to highlight the need for more specific legislation on social media advertising in China.
Content review
Finally, the third mechanism could leverage the “Notice and Action” provisions of the EU’s DSA, requiring social media platforms to promptly remove commercial sharenting content upon notification by public authorities or third parties.92 Thus, platforms should establish an ecological content governance mechanism, complete with detailed rules for the strict classification and grading of content. If platforms utilize personalized algorithmic recommendation technology to push information, the content recommended by those algorithms should be transparently communicated to consumers through prominent labels. Platforms should conduct manual reviews or interventions based on their own transparency guidelines, removing unlawful and inappropriate content generated by sharenting. Audiences should also be empowered to disable recommendation services or delete their personal labels, with digital content platforms offering the option to refrain from pushing services based on personal data. To safeguard minors, digital content platforms are prohibited from recommending content to minors or utilizing personally sensitive data for content recommendations.93
Convenient channels for filing complaints and reports should be prominently displayed, accompanied by clear instructions on how to utilize these channels as articulated in China’s Measures for the Administration of Internet Advertising. Article 16 of this 2023 departmental rule requires internet platform operators to monitor and examine their advertising content, and if illegal advertisements are discovered, the operator shall take necessary measures, such as giving notice to request a correction, deleting, blocking, or disconnecting the link(s) to the advertisement(s) along with maintaining relevant records. Internet platform operators should also establish an effective mechanism for receiving and processing complaints and reports. This includes setting up convenient channels for submitting complaints and making public the methods for lodging complaints and reports, ensuring they are handled promptly.94 If Article 16 is violated, the market regulatory authority at or above the county level shall order internet platform operators to take corrective action and impose a fine of not less than 10,000 yuan and not more than 50,000 yuan.95 However, the legal status of this regulation remains relatively low in terms of China’s normative hierarchy, and the penalties are mild, making it difficult to achieve a deterrent effect.
Further, online platforms should be encouraged to develop models tailored to minors and to provide online products and services suitable for this demographic. Public authorities, functioning as external supervisors, do not engage in case-by-case supervision; instead, their role is to examine whether platforms have established the relevant content moderation mechanisms and are taking responsible measures in response to complaints or notices.
The first and second steps constitute ex-ante regulation: proactive measures implemented before harm occurs to either child influencers or child viewers. The third step, by contrast, serves as an ex-post mechanism, operating retrospectively to address harm that child viewers have already suffered. The ex-post mechanism typically involves responsive measures, such as complaint mechanisms, reporting tools, and content removal procedures, to address harmful content or activities identified after the fact. It may also include measures to support affected individuals, including child viewers and their families.
Web Literacy Intervention
In addition to establishing comprehensive platform liability, enhancing children’s web literacy is crucial. Take online advertising literacy as an example: researchers have examined whether an educational vlog can help children aged 11–14 cope with advertising, finding that advertising disclosures enhanced the children’s recognition of advertising.96 Studies testing the impact of advertising disclosures on minors’ recognition of influencer marketing have consistently shown that such disclosures help children recognize commercial promotions.97
Recently, China enacted administrative regulations aimed at promoting web literacy and morality. Under the Regulation on the Protection of Minors in Cyberspace, the Ministry of Education, in conjunction with the State’s cyberspace affairs department, shall incorporate web literacy and morality education into quality-oriented education and formulate indicators for assessing minors’ web literacy and morality. Education departments shall guide and support schools in carrying out web literacy and morality education for minors and in fostering minors’ cybersecurity awareness, digital literacy, behavioral habits, and protection skills, with a focus on forming moral awareness in cyberspace and cultivating the concept of the rule of law in cyberspace.
Conclusion
Commercial sharenting is a global phenomenon of the digital era. The practice of sharenting is not inherently harmful; it often manifests parental love and concern. Yet sharenting fueled by child-to-child marketing adversely affects both child influencers and child viewers. Conventional legal paradigms center predominantly on child influencers. Given the escalating prevalence of child-to-child marketing via social media platforms, however, the influence of child influencers on child viewers cannot be overlooked, particularly its impact on child viewers’ right to health. To regulate oversharing effectively, it is imperative to safeguard the interests of both child influencers and child viewers. Chinese lawmakers have implemented a series of legal regulations concerning platform liability for user-generated content, although these regulations remain a fragmented approach.
To systematically address the issues related to commercial sharenting involving both child influencers and child viewers, as well as other forms of user-generated content, we propose that China take a three-step approach: 1) content filtering, 2) content moderation, and 3) content reviewal. This three-step approach seeks to establish a comprehensive regulatory framework that safeguards child influencers from privacy violations and economic exploitation while ensuring child viewers’ rights are protected through legal compliance and transparent advertising.
- 1
Melanie N. Fineman, “Honey, I Monetized the Kids: Commercial Sharenting and Protecting the Rights of Consumers and the Internet’s Child Stars,” Georgetown Law Journal 111, no. 4 (2023): 847–90.
- 2
Leah A. Plunkett, Sharenthood: Why We Should Think before We Talk about Our Kids Online (MIT Press, 2019), https://doi.org/10.7551/mitpress/11756.001.0001.
- 3
Amber Edney, “‘I Don’t Work for Free’: The Unpaid Labor of Child Social Media Stars,” University of Florida Journal of Law and Public Policy 32, no. 3 (2021–2022): 547–72.
- 4
Stacey B. Steinberg, “Sharenting: Children’s Privacy in the Age of Social Media,” Emory Law Journal 66 (2016): 839; Fineman, “Honey, I Monetized the Kids” (n 1); Rachel Fishbein, “Growing up Viral: ‘Kidfluencers’ as the New Face of Child Labor and the Need for Protective Legislation in the United Kingdom,” Note, George Washington International Law Review 54, no. 1 (2022–23): 127–56.
- 5
Marijke De Veirman, Liselot Hudders, and Michelle R. Nelson, “What Is Influencer Marketing and How Does It Target Children? A Review and Direction for Future Research,” Frontiers in Psychology 10 (3 Dec. 2019), https://doi.org/10.3389/fpsyg.2019.02685.
- 6
Ibid.
- 7
Frans Folkvord et al., “Children’s Bonding with Popular YouTube Vloggers and Their Attitudes toward Brand and Product Endorsements in Vlogs: An Explorative Study,” Young Consumers 20, no. 2 (2019): 77–90, https://doi.org/10.1108/YC-12-2018-0896.
- 8
Carolina Martínez and Tobias Olsson, “Making Sense of YouTubers: How Swedish Children Construct and Negotiate the YouTuber Misslisibell as a Girl Celebrity,” Journal of Children and Media 13, no. 1 (2 Jan. 2019): 36–52, https://doi.org/10.1080/17482798.2018.1517656.
- 9
Albert Bandura, “Social Cognitive Theory in Cultural Context,” Applied Psychology 51, no. 2 (2002): 269–90, https://doi.org/10.1111/1464-0597.00092.
- 10
Liselot Hudders et al., “Shedding New Light on How Advertising Literacy Can Affect Children’s Processing of Embedded Advertising Formats: A Future Research Agenda,” Journal of Advertising 46, no. 2 (2017): 333–49.
- 11
Saad Khan and Mia Lucas, “Platform Governance and Content Moderation: Examining the Role of Social Media Platforms in Content Moderation, Including Policies, Guidelines, and Challenges Related to Regulating News Content,” 8 July 2023 (available via ResearchGate).
- 12
See Catherine Jane Archer and Kate Delmo, “‘Kidfluencer’ culture is harming kids in several ways – and there’s no meaningful regulation of it,” The Conversation, May 1, 2023, https://theconversation.com/kidfluencer-culture-is-harming-kids-in-several-ways-and-theres-no-meaningful-regulation-of-it-204277.
- 13
Bandura, “Social Cognitive Theory in Cultural Context” (n 9).
- 14
Albert Bandura, “Self-Efficacy: Toward a Unifying Theory of Behavioral Change,” Psychological Review 84, no. 2 (Mar. 1977): 191–215, https://doi.org/10.1037/0033-295X.84.2.191.
- 15
Tamara F. Mangleburg, Patricia M. Doney, and Terry Bristol, “Shopping with Friends and Teens’ Susceptibility to Peer Influence,” Journal of Retailing 80, no. 2 (1 Jan. 2004): 101–16, https://doi.org/10.1016/j.jretai.2004.04.005.
- 16
Deborah Roedder John, “Consumer Socialization of Children: A Retrospective Look at Twenty-Five Years of Research,” Journal of Consumer Research 26, no. 3 (Dec. 1999): 183–213, https://doi.org/10.1086/209559.
- 17
Lucy Atkinson, Michelle R. Nelson, and Mark A. Rademacher, “A Humanistic Approach to Understanding Child Consumer Socialization in US Homes,” Journal of Children and Media 9, no. 1 (2 Jan. 2015): 95–112, https://doi.org/10.1080/17482798.2015.997106.
- 18
Gene H. Brody and Zolinda Stoneman, “Selective Imitation of Same-Age, Older, and Younger Peer Models,” Child Development 52, no. 2 (1981): 717–20, https://doi.org/10.2307/1129197.
- 19
Marijke De Veirman, Veroline Cauberghe, and Liselot Hudders, “Marketing through Instagram Influencers: The Impact of Number of Followers and Product Divergence on Brand Attitude,” International Journal of Advertising 36, no. 5 (3 Sept. 2017): 798–828, https://doi.org/10.1080/02650487.2017.1348035.
- 20
Johannes Knoll, “Advertising in Social Media: A Review of Empirical Evidence,” International Journal of Advertising 35, no. 2 (3 Mar. 2016): 266–300, https://doi.org/10.1080/02650487.2015.1021898.
- 21
Alexander P. Schouten, Loes Janssen, and Maegan Verspaget, “Celebrity vs. Influencer Endorsements in Advertising: The Role of Identification, Credibility, and Product-Endorser Fit,” in Leveraged Marketing Communications, eds. Sukki Yoon, Yung Kyun Choi, and Charles R. Taylor (Routledge, 2021).
- 22
W.J. McGuire, “Attitudes and Attitude Change,” in Handbook of Social Psychology, eds. Gardner Lindzey and Elliot Aronson (Random House, 1985), 233–346.
- 23
Michael D. Basil, “Identification as a Mediator of Celebrity Effects,” Journal of Broadcasting & Electronic Media 40, no. 4 (1996): 478–95.
- 24
Grant McCracken, “Who Is the Celebrity Endorser? Cultural Foundations of the Endorsement Process,” Journal of Consumer Research 16, no. 3 (1989): 310–21.
- 25
Lisette de Vries, Sonja Gensler, and Peter S.H. Leeflang, “Popularity of Brand Posts on Brand Fan Pages: An Investigation of the Effects of Social Media Marketing,” Journal of Interactive Marketing 26, no. 2 (1 May 2012): 83–91, https://doi.org/10.1016/j.intmar.2012.01.003.
- 26
Hudders et al., “Shedding New Light” (n 10).
- 27
Peter Suedfeld et al., “Processes of Opinion Change,” in Attitude Change (Routledge, 1968): 29.
- 28
Albert Bandura, Joan E. Grusec, and Frances L. Menlove, “Observational Learning as a Function of Symbolization and Incentive Set,” Child Development 37, no. 3 (1966): 499–506, https://doi.org/10.2307/1126674.
- 29
Esther Rozendaal, Moniek Buijzen, and Patti Valkenburg, “Children’s Understanding of Advertisers’ Persuasive Tactics,” International Journal of Advertising 30, no. 2 (Jan. 2011): 329–50, https://doi.org/10.2501/IJA-30-2-329-350.
- 30
“9-year-old boy named highest-paid YouTube star,” CNN Business, Dec. 23, 2020, https://edition.cnn.com/videos/business/2020/12/23/youtube-2020-highest-paid-ryan-kaji-sot-vpx.hln.
- 31
Tiffany Hsu, “Popular YouTube Toy Review Channel Accused of Blurring Lines for Ads,” New York Times, Sept. 4, 2019, https://www.nytimes.com/2019/09/04/business/media/ryan-toysreview-youtube-ad-income.html.
- 32
Bram Duivenvoorde and Catalina Goanta, “The Regulation of Digital Advertising under the DSA: A Critical Assessment,” Computer Law & Security Review 51 (1 Nov. 2023): 105870, https://doi.org/10.1016/j.clsr.2023.105870.
- 33
Committee on the Rights of the Child, General Comment No. 25, Children’s Rights in Relation to the Digital Environment, U.N. Doc. CRC/C/GC/25 (Nov. 19, 2021), paras. 96–97.
- 34
Marie A. Bragg et al., “Popular Music Celebrity Endorsements in Food and Nonalcoholic Beverage Marketing,” Pediatrics 138, no. 1 (July 2016): e20153977, https://doi.org/10.1542/peds.2015-3977.
- 35
Emma J. Boyland et al., “Food Choice and Overconsumption: Effect of a Premium Sports Celebrity Endorser,” Journal of Pediatrics 163, no. 2 (1 Aug. 2013): 339–43, https://doi.org/10.1016/j.jpeds.2013.01.059.
- 36
Anna E. Coates et al., “Social Media Influencer Marketing and Children’s Food Intake: A Randomized Trial,” Pediatrics 143, no. 4 (1 Apr. 2019): e20182554, https://doi.org/10.1542/peds.2018-2554; Anna Elizabeth Coates et al., “The Effect of Influencer Marketing of Food and a ‘Protective’ Advertising Disclosure on Children’s Food Intake,” Pediatric Obesity 14, no. 10 (2019): e12540, https://doi.org/10.1111/ijpo.12540.
- 37
Arwen M. Marker, Ric G. Steele, and Amy E. Noser, “Physical Activity and Health-Related Quality of Life in Children and Adolescents: A Systematic Review and Meta-Analysis,” Health Psychology 37, no. 10 (Oct. 2018): 893–903, https://doi.org/10.1037/hea0000653.
- 38
Monica Wong et al., “Time-Use Patterns and Health-Related Quality of Life in Adolescents,” Pediatrics 140, no. 1 (1 July 2017): e20163656, https://doi.org/10.1542/peds.2016-3656.
- 39
Dorothea Dumuid et al., “Health-Related Quality of Life and Lifestyle Behavior Clusters in School-Aged Children from 12 Countries,” Journal of Pediatrics 183 (1 Apr. 2017): 178–83.e2, https://doi.org/10.1016/j.jpeds.2016.12.048.
- 40
Center for Research on Chinese Youth, “Report on Preventing the Juvenile from Short Video Addiction Model” (2021), http://www.cycrc.org.cn/kycg/seyj/202105/P020210526576438296951.pdf. Chinese language.
- 41
Lihong Lu et al., “Adolescent Addiction to Short Video Applications in the Mobile Internet Era,” Frontiers in Psychology 13 (10 May 2022), https://doi.org/10.3389/fpsyg.2022.893599.
- 42
Carla Lanca and Seang-Mei Saw, “The Association between Digital Screen Time and Myopia: A Systematic Review,” Ophthalmic and Physiological Optics 40, no. 2 (2020): 216–29, https://doi.org/10.1111/opo.12657.
- 43
Zhiqiang Zong et al., “The Association between Screen Time Exposure and Myopia in Children and Adolescents: A Meta-Analysis,” BMC Public Health 24, no. 1 (18 June 2024): 1625, https://doi.org/10.1186/s12889-024-19113-5.
- 44
Lucas da Costa et al., “Sedentary Behavior Is Associated with Musculoskeletal Pain in Adolescents: A Cross Sectional Study,” Brazilian Journal of Physical Therapy 26, no. 5 (1 Sept. 2022): 100452, https://doi.org/10.1016/j.bjpt.2022.100452.
- 45
Jonathan Burnay et al., “Which Psychological Factors Influence Internet Addiction? Evidence through an Integrative Model,” Computers in Human Behavior 43 (1 Feb. 2015): 28–34, https://doi.org/10.1016/j.chb.2014.10.039.
- 46
Xing Zhang, You Wu, and Shan Liu, “Exploring Short-Form Video Application Addiction: Socio-Technical and Attachment Perspectives,” Telematics and Informatics 42 (1 Sept. 2019): 101243, https://doi.org/10.1016/j.tele.2019.101243.
- 47
Elizabeth J. Ivie et al., “A Meta-Analysis of the Association between Adolescent Social Media Use and Depressive Symptoms,” Journal of Affective Disorders 275 (1 Oct. 2020): 165–74, https://doi.org/10.1016/j.jad.2020.06.014.
- 48
World Health Organization, “Public Health Implications of Excessive Use of the Internet, Computers, Smartphones and Similar Electronic Devices,” WHO meeting report, Tokyo, Japan (27–29 August 2014), Sept. 9, 2015, https://www.who.int/publications/i/item/9789241509367.
- 49
Schouten, Janssen, and Verspaget, “Celebrity vs. Influencer Endorsements” (n 21); Katharina Pilgrim and Sabine Bohnet-Joschko, “Selling Health and Happiness How Influencers Communicate on Instagram about Dieting and Exercise: Mixed Methods Research,” BMC Public Health 19, no. 1 (6 Aug. 2019): 1054, https://doi.org/10.1186/s12889-019-7387-8.
- 50
Cynthia Hoffner and Martha Buchanan, “Young Adults’ Wishful Identification with Television Characters: The Role of Perceived Similarity and Character Attributes,” Media Psychology 7, no. 4 (1 Nov. 2005): 325–51, https://doi.org/10.1207/S1532785XMEP0704_2.
- 51
Mina Tsay-Vogel and Mitchael L. Schwartz, “Theorizing Parasocial Interactions Based on Authenticity: The Development of a Media Figure Classification Scheme,” Psychology of Popular Media Culture 3, no. 2 (2014): 66–78, https://doi.org/10.1037/a0034615.
- 52
Donald Horton and R. Richard Wohl, “Mass Communication and Para-Social Interaction,” Psychiatry (1 Aug. 1956), https://www.tandfonline.com/doi/abs/10.1080/00332747.1956.11023049.
- 53
Alan M. Rubin , Elizabeth M.Perse, and Robert A.Powell, “Loneliness, Parasocial Interaction, and Local Television News Viewing,” Human Communication Research12, no. 2 (1985): 155–80, https://doi.org/10.1111/j.1468-2958.1985.tb00071.x.
- 54
The Handbook of Children, Media, and Development, Wiley Online Library (2008), accessed 11 Oct. 2024, https://onlinelibrary.wiley.com/doi/10.1002/9781444302752.
- 55
Simone M. de Droog, Moniek Buijzen, and Patti M. Valkenburg, “Use a Rabbit or a Rhino to Sell a Carrot? The Effect of Character–Product Congruence on Children’s Liking of Healthy Foods,” Journal of Health Communication 17, no. 9 (1 Oct. 2012): 1068–80, https://doi.org/10.1080/10810730.2011.650833.
- 56
Helga Dittmar, “The Costs of Consumer Culture and the ‘Cage Within’: The Impact of the Material ‘Good Life’ and ‘Body Perfect’ Ideals on Individuals’ Identity and Well-Being,” Psychological Inquiry 18, no. 1 (2007): 23–31.
- 57
Jennifer Ann Hill, “Endangered Childhoods: How Consumerism Is Impacting Child and Youth Identity,” Media, Culture & Society 33, no. 3 (Apr. 2011): 347–62, https://doi.org/10.1177/0163443710393387.
- 58
Kurt Lewin, “Frontiers in Group Dynamics: Concept, Method and Reality in Social Science; Social Equilibria and Social Change,” Human Relations 1, no. 1 (1 June 1947): 5–41, https://doi.org/10.1177/001872674700100103.
- 59
Ibid.
- 60
Eric N. Holmes, Liability for Algorithmic Recommendations, Congressional Research Service, R47753 (Oct. 12, 2023), https://crsreports.congress.gov/product/pdf/download/R/R47753/R47753.pdf.
- 61
Xing Lu, Zhicong Lu, and Changqing Liu, “Exploring TikTok Use and Non-Use Practices and Experiences in China,” in Social Computing and Social Media. Participation, User Experience, Consumer Experience, and Applications of Social Computing, ed. Gabriele Meiselwitz (Springer, 2020), 57–70, https://doi.org/10.1007/978-3-030-49576-3_5.
- 62
Daniel Klug et al., “Trick and Please. A Mixed-Method Study on User Assumptions About the TikTok Algorithm,” in Proceedings of the 13th ACM Web Science Conference 2021, WebSci ‘21 (Association for Computing Machinery, 2021), 84–92, https://doi.org/10.1145/3447535.3462512.
- 63
Pengda Wang, “Recommendation Algorithm in TikTok: Strengths, Dilemmas, and Possible Directions,” International Journal of Social Science Studies 10, no. 5 (2022): 60–66.
- 64
Provisions on Ecological Governance of Network Information Content (promulgated by the Cyberspace Administration of China, Dec. 15, 2019, effective Mar. 1, 2020), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).
- 65
Opinions of the Cyberspace Administration of China on Further Pushing Websites and Platforms to Fulfill Their Primary Responsibility for Information Content Management (promulgated by the Cyberspace Administration of China, Sept. 15, 2021, effective Sept. 15, 2021), LawInfoChina (last visited Nov. 3, 2023) (P.R.C.).
- 66
Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services (promulgated by the Cyberspace Administration of China, Dec. 31, 2021, effective Mar. 1, 2022), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).
- 67
Ibid.
- 68
Ibid.
- 69
Fei Yang and Yu Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China,” Nature Machine Intelligence 4, no. 10 (Oct. 2022): 802–03, https://doi.org/10.1038/s42256-022-00546-9.
- 70
Ibid.
- 71
Provisions on the Administration of Algorithm-generated Recommendations for Internet Information Services (n 66).
- 72
Yang and Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China” (n 69).
- 73
Regulation on the Protection of Minors in Cyberspace (promulgated by the State Council, Oct. 16, 2023, effective Jan. 1, 2024), ChinaLawInfo (last visited Nov. 3, 2023) (P.R.C.).
- 74
Yaojia Tang and Chunhui Tang, “Research on the Allocation of Platform Liability for Network Information Content Governance,” Research on Financial and Economic Issues 6, no. 59 (2023): 72.
- 75
Yang and Yao, “A New Regulatory Framework for Algorithm-Powered Recommendation Services in China” (n 69).
- 76
“Fighting rumors, governance algorithms […] The ‘nationwide campaign to purify the online environment’ in 2022 will focus on correcting this network chaos,” China Daily, Mar. 17, 2022, https://cn.chinadaily.com.cn/a/202203/17/WS623348a5a3101c3ee7acc315.html. Chinese language.
- 77
One Chinese yuan is equal to about US$0.137 (as of Jan. 17, 2025).
- 78
Regulation on the Protection of Minors in Cyberspace (n 73).
- 79
Steffi De Jans, Veroline Cauberghe, and Liselot Hudders, “How an Advertising Disclosure Alerts Young Adolescents to Sponsored Vlogs: The Moderating Role of a Peer-Based Advertising Literacy Intervention through an Informational Vlog,” Journal of Advertising 47, no. 4 (2 Oct. 2018): 309–25, https://doi.org/10.1080/00913367.2018.1539363; Coates et al., “The Effect of Influencer Marketing of Food and a ‘Protective’ Advertising Disclosure on Children’s Food Intake” (n 36).
- 80
Measures for the Administration of Internet Advertising (promulgated by the State Administration for Market Regulation, Feb. 25, 2023, effective May 1, 2023), ChinaLawInfo (last visited Nov. 8, 2023) (P.R.C.).
- 81
Ibid.
- 82
E.g., Tekkerz Kid, “A Very Real Morning Routine! ft Tekkerz Kid JR,” YouTube, 10 July 2021, https://www.youtube.com/watch?v=DBrKc8HBwY [https://perma.cc/3E6V-3W64], showing one of Tekkerz Kid’s YouTube videos that depicts details of the inside of his bedroom.
- 83
Fishbein, “Growing up Viral” (n 4).
- 84
One euro equals US$1.027 (as of Jan. 17, 2025).
- 85
Data Protection Commission, “Irish Data Protection Commission announces €345 million fine of TikTok” (Sept. 15, 2023), https://www.dataprotection.ie/index.php/en/news-media/press-releases/DPC-announces-345-million-euro-fine-of-TikTok.
- 86
Wei Cheng Nian Ren Bao Hu Fa [Law of the People’s Republic of China on the Protection of Minors] (promulgated by the Standing Committee of the National People’s Congress, Oct. 17, 2020, effective June 1, 2021) Standing Comm. Nat’l People’s Cong. Gaz. (P.R.C.). Chinese language.
- 87
Measures for the Administration of Internet Advertising (n 80).
- 88
Communications Act 2003, c. 21 (UK).
- 89
Communications Act 2003, c. 21, s. 319(2) (UK).
- 90
Brownstein Hyatt Farber Schreck, “FTC Issues New Guidelines for Social Media Influencers,” JDSUPRA (28 Jan. 2020), https://www.jdsupra.com/legalnews/ftc-issues-new-guidelines-for-social-40544/.
- 91
Federal Trade Commission, Advertising and Marketing, https://www.ftc.gov/business-guidance/advertising-marketing (last visited Oct. 10, 2023).
- 92
Digital Services Act (EU Regulation 2022/2065) (2022), art. 22.
- 93
Tang and Tang, “Research on the Allocation of Platform Liability for Network Information Content Governance” (n 74).
- 94
Measures for the Administration of Internet Advertising (n 80).
- 95
Ibid.
- 96
De Jans, Cauberghe, and Hudders, “How an Advertising Disclosure Alerts Young Adolescents to Sponsored Vlogs” (n 79).
- 97
Ibid.
© The Author(s), 2025. Published by International Association of Law Libraries
This work is licensed under the Creative Commons Attribution 4.0 International License.
Articles in the same Issue
- Article
- Assessing the Efficacy of the Responsibility to Protect (R2P) Principle amidst the Misuse of Veto Power: A Critical Analysis
- Humanity at the Crossroads: Human Rights Challenges in the Age of Lethal Autonomous Weapon Systems
- Advancements in Space Law: Satellite Communications Industry Regulations and Obligations for Orbital Debris Mitigation
- Hazards of the (Over-)Standardization of Academic Legal Works
- Role of India in Combating Transnational Environmental Crimes
- Safeguarding Child Viewers: Legal Strategies for Commercial Sharenting on Social Media in China
- Editorial Comment
- EDITORIAL COMMENT
- Front Cover (OFC, IFC) and matter
- JLI volume 53 issue 1 Cover and Front matter
- Back Cover (IBC, OBC) and matter
- JLI volume 53 issue 1 Cover and Back matter
- Miscellaneous
- The EU Reexamined: A Governance Model in Transition
- Design Law: Global Law and Practice
- Research Handbook on Asylum and Refugee Policy
- Artificial Intelligence and the Law
- International Calendar
- INTERNATIONAL CALENDAR
- Column
- Behind the Books: Global Insights from Law Librarians