Eight-year-old Anastasia (Nastya) Radzinskaya had 103 million subscribers on YouTube and 6.9 million followers on TikTok as of January 2023. It is estimated that she made $28 million, landing her the title of “sixth-highest-earning YouTube star in the world for 2021.” Her videos feature activities like writing a letter to Santa Claus, decorating her bedroom for Halloween, and singing and dancing with her friends. Unlike most children, however, she has made a considerable profit from brand deals, her merchandise line, and her own NFT collection. While such success is far from the norm for children on social media, it is a common dream. A Harris Poll/LEGO survey of children aged 8 to 12 conducted in the United States, United Kingdom, and China found that 29% of the children wanted to be a YouTuber. The majority of content consists of creators sharing their day-to-day lives, lifehacks, vacation vlogs, and favorite products. While most of this content seems harmless, these posts raise a myriad of concerns about creating and profiting from the “digital footprints” of minors who cannot knowingly and fully consent to online exposure at such an early age.
This Essay identifies a gap in the law—the absence of social media protections for minors—and pinpoints possible solutions. It is necessary to focus on the existing challenges for children who are featured in content created by their guardians while acknowledging that there are overlapping challenges for children who are content creators themselves. This Essay then proposes ways in which legislatures can correct and extend existing laws. The first part of this Essay focuses on privacy matters, statistics that give rise to various privacy concerns, and the shortcomings of the Children’s Online Privacy Protection Act. The second part transitions to child labor protections for child performers and the gross inadequacies of the existing framework. This Essay recommends extending “Coogan accounts” to child influencers, thereby requiring a minor’s monetary earnings to be placed in a protected fund until they come of age. While these concerns and accompanying proposals are not all-inclusive, they advocate for a holistic approach to the online protection of minors on social media. Legislators have an obligation to enact nationwide safeguards for families and, more specifically, for children whose best interests may not be fairly protected by family members who can profit from their publicity. Concurrently, parents have a fiduciary duty to protect the interests of their children. More broadly, companies and cultural customs must adjust to protect and educate children and families about the risks of “sharenting” content on social media.
Terminology
Before proceeding, it is crucial to clarify the terminology in this Essay. At times, the term “parent” is used to describe a child’s legal guardian; however, it is important to recognize that this terminology is not representative of all family structures. “Social media” is used to refer to a myriad of social media platforms including YouTube, TikTok, Instagram, and Twitch. “Vlogging” is used to describe video blogging. “Sharenting” is a term commonly used to describe when parents share content concerning their children online. “Kidfluencer” is used to describe a child who is themselves a social media influencer.
I. Background
The U.S. Supreme Court has “treat[ed] the family as an autonomous entity, implicitly recognizing the parental role in protecting the child from state action.” In the case of Meyer v. Nebraska, the Court deemed a statute prohibiting the teaching of foreign languages to children who had not passed the eighth grade to be unconstitutional, finding the statute infringed upon parents’ liberty to control their children’s education. The Court reasoned that this restriction encroached upon the Due Process Clause of the Fourteenth Amendment, affirming “the right of the individual to contract, to engage in any of the common occupations of life, . . . to . . . establish a home and bring up children, . . . and generally to enjoy those privileges long recognized at common law as essential to the orderly pursuit of happiness by free men.”
Under the Fourteenth Amendment, a state may infringe on the custodial rights of a parent, but only when it adheres to constitutionally adequate procedures. In the case of Prince v. Massachusetts, the Court recognized the state’s authority to “protect the welfare of children” when upholding enforcement of a state child labor law against a child’s guardian who was preaching and selling religious literature with the child on a public road.
Some legal scholars have observed that “[c]ontrary to U.S. Supreme Court holdings, legislators and state courts, when creating and applying child welfare laws, have tended to perceive the family as individuals with separate and frequently conflicting rights, rather than the family as a unit with its own body of rights.” Legal guardians who want to profit from posts featuring their children may have legal interests that are diametrically opposed to those of their children. A content creator who focuses on posting content featuring their children functions as a creative director, manager, and sometimes a business owner looking to receive a return on investment. Whether the plan is to earn pocket change or to become a full-time content creator, the possibility of profiting creates an inherent conflict of interest between legal guardian and child. The child becomes a performer, actor, model, or case study subject to be observed by Internet users at large. Although not all family content creators rise to a high level of fame or controversy, it is essential to enact safeguards to protect minors.
Reinventing the wheel to create safeguards is unnecessary when there are already existing frameworks for child performers, specifically actors. This Essay recommends extending current legislation protecting child performers to child influencers, promoting education about online risks, and closing the gaps that remain in the law. Families need and deserve protection, and children need some form of recourse for harm that they endure as a result of online exposure. It is important to recognize that this approach is somewhat paternalistic and assumes that parents are incapable of protecting their children; however, following all the safety precautions and “doing the right thing” is not enough to prevent collateral harm. Enacting regulations, educating children and families, and changing cultural customs enable families to move forward with guidelines and legal parameters as the industry continues to expand and evolve.
II. Invasion of Privacy
Even though the state entrusts parents with the safety of their children, parents are sometimes the first to expose their children to safety hazards and infringe upon their privacy. Families have an obligation to take reasonable precautions to protect their children’s information online. However, the pervasiveness of social media, coupled with the potential for exploitation, makes it impractical to expect parents alone to safeguard their families. The Children’s Online Privacy Protection Act (COPPA) provides some security against data collection for children under the age of 13, but it fails to account for all minors. The government could implement or incentivize nationwide legislation regulating big tech companies by (1) pushing for further transparency, (2) promoting the creation and implementation of child-friendly versions of sites, and (3) establishing clearer paths to recourse that would serve as safety mechanisms for minors on social media.
A. Safety and General Concerns
Seemingly innocuous posts, such as a child’s dance recital or sendoff to school, can lead to the disclosure of more private facts. Websites, including Whitepages and CheckPeople, allow users to search and reverse search by full name, phone number, city, state, zip code, or place of business. Users can pay a premium price to unlock information including cell phone numbers, addresses, relatives, age of the search target and their relatives, financial records, maiden names, property details, carrier information, and voter registration information. “By tracing a parent’s social media data to voter registration materials, children’s identity can be inferred, including name, location, age and birthday, and religion.” Reviewing privacy settings and restricting who can view the content are only helpful until someone stores a copy of the information. These tips are rendered useless once a child develops a public platform.
As already mentioned, the number of minors on social media has skyrocketed, and parents do not always have the knowledge or the bandwidth to implement safety measures. According to the Mott Children’s Hospital study, “[o]ne in six parents whose child uses social media apps (17%) are not using any parental controls.” Children are susceptible to hazards online, including “[e]xposure to harmful or inappropriate content (e.g., sex, drugs, violence, etc.)”; data privacy concerns; online predators; cyberbullying; excessive or manipulative advertisements; identity theft; risk of being hacked; and interference with sleep, school, exercise, and family time. These harms can be mitigated through the strengthening of existing regulations and the passage of pending legislation.
1. COPPA and the Collection of Data
COPPA was enacted to address some of the aforementioned issues, but only for children under the age of 13. COPPA generally prohibits an app or other online service from collecting, using, or disseminating personal information about children without verifiable consent from a parent. The platform must provide parents with written notice of the platform’s practice for securing and disclosing information. Parents are granted the authority to access their child’s personal information, determine how it is being collected and used, and request to have the information deleted. Platforms cannot “condition[] a child’s participation in a game, the offering of a prize, or another activity on the child disclosing more personal information than is reasonably necessary to participate in such activity.”
Many critics have identified the pitfalls of COPPA, including its vague terms, misplaced focus on children under the age of 13, and obvious loopholes. COPPA applies to websites “directed to children,” but the statute and regulations provide limited guidance for those parameters. Critics have pointed out that website operators generally “must determine for themselves” whether their platform is subject to COPPA requirements. A company’s adherence to COPPA may still not be enough to guarantee adequate protections. Parents are given the authority to set their child’s privacy settings; however, “only 56% of parents reported using privacy settings that limit the collection of data through children’s apps.” Moreover, not all parents have the knowledge to set the privacy settings. According to a study by the Consumer Reports National Research Center, “57% of Web users mistakenly believe that before monitoring their online browsing, companies are legally required to identify themselves, [and] spell out why they’re collecting data and who they intend to share it with.” Although the Federal Trade Commission (FTC) has authority to seek enforcement when companies violate their own privacy policies, the United States does not have a comprehensive privacy law that accounts for data collection.
In his 2022 State of the Union Address, U.S. President Joe Biden advocated for more stringent privacy protections and accountability. He highlighted the need to “strengthen privacy protections; ban targeted advertising to children; [and] demand [that] tech companies stop collecting personal data on our children.” Some senators have introduced additional legislative proposals concerning privacy matters. Senator Ed Markey, one of the authors of COPPA, has pushed for the “Children and Teens’ Online Privacy Protection Act (COPPA 2.0),” an extension of COPPA. The proposed act would implement a “‘Digital Marketing Bill of Rights for Teens’ that limits the collection of personal information of teens” and require companies to obtain user consent before collecting personal information from individuals ages 13 to 16 and “to permit users to eliminate personal information from a child or teen when technologically feasible.” Similarly, the proposed Kids Online Safety Act would offer protections to children younger than 16 and address targeted advertising directed toward minors.
Kalinda Raina, Vice President and Head of Global Privacy at LinkedIn, has recommended establishing education programs to help children understand digital dangers and risks from an early age. The framework for more rigorous requirements exists but needs additional support and public exposure to be passed into law. Enacting meaningful change starts with changing attitudes and norms.
2. Pedophilia and Deepfakes
By virtue of being online, children may be targeted by pedophiles and subjected to deepfakes. Some family content creators have removed content featuring their children when “their channel analytics revealed that a massive portion of viewers were adult men, and their YouTube videos were being embedded on the websites of pedophiles. One creator saw her percentage of male viewers drop from 40 to 17 percent of her audience when she disabled the ability for her videos to be downloaded.”
These same viewers may be responsible for deepfakes, media that has been “manipulated by a computer to superimpose someone’s face, body or voice onto something else.” Even if a child never participated in a given act or activity, their “face, movement and voice could be inserted into all types of video[s].” Images from pictures and videos on social media can be harvested and synthesized to generate deepfakes, some of which contain sexually explicit or pornographic material. In 2019, the AI company Sensity found “70% of targets were private individuals whose photos had been harvested from social media.” As technology currently stands, “decent face swaps can be achieved with as few as 300 images,” a number that many social media accounts far exceed. Machine learning and artificial intelligence remain relatively uncharted territories due to their novelty; however, the harm from image-based sexual abuse is irreparable. Victims often have little recourse due to online anonymity and digital automation.
B. Digital Birth
A 2010 study conducted by AVG, an Internet security company, found that in the U.S., 92% of minors had an online presence before the age of two. More specifically, 34% of babies had their sonogram photos posted online before they were born. Aside from the obvious safety issues, there are concerns about how “[t]he online practice of ‘sharenting’ removes the autonomy of the child and denies them the right to craft their own footprint on a blank online canvas which should be the birth right of every child.” Even though posts can be deleted, search engines and databases “cache the information, providing an opportunity for infinite rediscovery long after any value of the initial disclosure remains.” One study found that “56% of parents shared (potentially) embarrassing information about their children online, 51% provided information that could lead to an identification of their child’s location at a given time, and 27% of participants shared (potentially) inappropriate photos.” Because sharing personal snippets on social media is so commonplace, parents can easily become complacent with their child’s image and data. Some children may not want a paper trail of digital mementos following them throughout their lives.
1. Parasocial Relationships
Online exposure can lead to the development of “parasocial relationships,” or a dynamic in which the viewer forms a one-sided connection to the person on the screen as if they were a real-life mentor or companion. This relationship allows strangers to feel like they have a legitimate connection to the family, and viewers can feel emboldened to comment on private facts about the family’s life. Private family matters can be very emotionally draining, stressful, and traumatizing for a child, and these issues can be magnified when they are projected online.
2. Embarrassment and Emotional Collateral Damage
It goes without saying that sharing intimate memories or milestones can be humiliating for a child once they are old enough to understand what is being shared with the public. Many parents share videos of their child’s milestones, such as their first steps or even their temper tantrums. With constant accessibility to mobile cameras, there is nothing besides a child’s own plea to stop the parent or guardian from filming them at their most vulnerable moments in the home. Sonia Livingstone, a professor of social psychology at the London School of Economics and Political Science, explained: “We interviewed several families where even small children wished their parents would share fewer photos, and consult them more. We observed in a few families that children are even learning to tell their parents to stop.”
Because these children are under the age of 18, they are generally subject to the rules of their parents and cannot legally refuse. When a parent prioritizes clicks over their child’s emotional needs or autonomy, the parent can damage the relationship and erode the child’s trust. Subordinating the child’s needs and desires can also diminish the child’s self-esteem.
A child’s fear of online exposure can lead them to suppress their emotions and reactions during emotional events. This suppression can lead to issues including emotional problems and social deficits resulting from “maladaptive learning, overarousal, unrealistic expectations, and emotional constriction.” Both the emotional impact and the online footprint can affect ongoing and future relationships, educational opportunities, and job prospects. Furthermore, posted content can be leveraged against the child by their peers or other adults in their lives.
Parent and child influencers can incidentally reveal developmental delays, impediments, mental health challenges, disorders, illnesses, and trauma. In May 2020, Myka and James Stauffer became some of the most notorious family vloggers when they announced the decision to rehome their neurodivergent son, Huxley, just short of three years after adopting him from China. In 2016, the Stauffers began their 27-part series about their “adoption journey,” reaching nearly 139,000 views. Within one year of adopting Huxley, the channel flourished, and the Stauffers bought several cars and a house costing over $600,000. The family posted subsequent videos about Huxley’s challenges as a child with autism who suffered a stroke in utero, including how many months it took him to learn how to form one sign in sign language. In one of her videos, Myka credited a brand of laundry detergent with helping her form a bond with her son, who has a detachment disorder. In another video, the family bound the child’s hands with duct tape to prevent him from sucking his thumbs. There has been an abundance of online backlash against the family for crowdfunding the cost of Huxley’s adoption, exposing his personal needs online, publicly treating him as an outsider in front of their biological children, and profiting off of his challenges before rehoming him. The series of highly publicized videos raises several ethical questions about profiting from a minor’s emotional trauma and exposure to the public.
C. Rights of the Parents vs. Their Children
Protections and restrictions surrounding a child’s privacy interests are self-imposed by the child’s parent or guardian, meaning the parent has the potential to undermine their child’s best interests. When it comes to children featured on social media filing lawsuits against their parents, there is a dearth of existing case law. In the future, children may attempt to bring claims arising from the Right to Be Forgotten or the Right to Erasure, but they will likely have difficulty overcoming their legal guardians’ free speech rights and right to privacy.
The Free Speech Clause of the First Amendment affirms that “Congress shall make no law . . . abridging the freedom of speech” and applies to the states by virtue of the Fourteenth Amendment. Parents have significant autonomy to post images of their child online so long as the content is not pornographic. Parents are generally extended the “right to be let alone,” equipping them with the right to parent as they see fit within reason.
D. Where Do We Go from Here?
The state has limited options when it comes to aiding children on social media. One way for the state to intervene is to have public health and technology committees converge to create a public health practices handbook explicating both the health and digital risks of content creation.
The current culture does not find child-focused media repugnant; rather, it voraciously consumes and capitalizes on social media content featuring children. Shifting attitudes seems aspirational at this point in time, but it may be helpful to rely on children’s rights scholars who advocate for recognizing children’s autonomy. Some scholars argue that as a child matures, they should be granted greater deference in controlling their own privacy. By the same token, “risks entailed by privacy are much higher for youth than for younger children.” One sensible practice model considers a child’s age and capacity, acknowledging that children need their space, and that adults should respect their privacy. This boundary means that a parent should delete or erase content featuring their child upon request. The Right to Erasure, which requires platforms to remove images or content featuring minors upon their request, has been implemented in other countries including France. This right has not yet been implemented or enforced by the federal government in the United States.
While the First Amendment prohibits state actors from abridging free speech, it says nothing about social media platforms themselves. It is important for social media platforms to take the initiative to protect children’s privacy by enacting and enforcing their own right of deletion provisions in their privacy policies. Furthermore, these companies should assume a “safety by default” approach to child safety. This could include a data minimization approach to data collection: recording only what is necessary and deleting data once it is no longer useful. Transparency about data collection, third-party platforms, and user analytics is crucial. Consumer notices should be clear and accessible, educating parents and children alike about the risks of sharing information online. In order to enact meaningful change and protect minors, social media platforms must be transparent about the risks of cultivating an online presence and revealing certain information.
III. Child Labor
Social media has burgeoned and transformed into one of the most profitable and accessible career paths. The New York Times recently reported that “brands might pay $10,000 to $15,000 for a promotional Instagram post while a sponsored YouTube video might earn $45,000. A 30- to 90-second shout-out in a longer video can cost advertisers between $15,000 and $25,000.” Given this high level of profitability, it is evident that many children are no longer playing. Rather, they are working, modeling, acting, selling products, marketing their brands, and building fan bases.
Four-year-old Aubrey Jade garnered 243,000 followers on Instagram through regularly posted fashionable outfits. She sported designer accessories and outfits from Gucci to Saint Laurent, posing like an adult woman, popping her hip and wearing red lipstick. One of her captions read: “I just love trench coats and I’m obsessed with this one!! Trench and bag from @amorandlittlegirls Pants and turtleneck @loolous_”. Similar to a child model, Aubrey Jade endorsed products, but she was not offered the same legal protections as a child model or actor.
There is a complete absence of legal labor protections for children like Aubrey on social media. Because children are filmed in the home, not in a studio, it is difficult to track how long they are on-screen. Bee Fisher, the mother of three child influencers, explained her conscientious approach to managing her children’s social media accounts: “I don’t want to have a child 15 years from now sitting in a therapist’s office saying my parents made me take pictures every day. . . . If there’re days they’re totally not into it, they don’t have to be. . . . Unless it’s paid work. Then they have to be there. We always have lollipops on those days.” Not all parents share Bee Fisher’s ethos of prioritizing their children’s wants and capacities.
A. Potential for Physical and Emotional Harm
In 2017, Mike and Heather Martin, managers of the DaddyOFive YouTube channel, lost custody of two of their five children for child neglect. In one of their “prank” videos, the Martins drove their sons to tears by falsely accusing them of making a mess while screaming and swearing at them. In another video, the father taunted his distressed son before pushing him into a bookcase as the child tried to run past him. Even though the parents released public statements apologizing for their behavior, they sidestepped accountability by saying, “We were going for shock value. . . . What you see on our YouTube channel is not a reflection of who we are. . . . It’s a character. It was a show.” Another channel, Fantastic Adventures, garnered more than 250 million views, but behind the scenes, the seven adopted children were being abused and neglected. A wellness check on the family revealed that the children were physically punished and starved for failing to recall their lines for videos. YouTube de-platformed the family.
If a child is not outwardly being abused, it is difficult to quantify how much harm is being inflicted upon the child. Social media can be harmful to a child in ways that are invisible to others but nevertheless detrimental to their psychological development and mental health. When a child is thrust into the limelight, whether by choice or by force, they have the potential to be overworked, sexualized, criticized, shamed, and bullied. The absence of adequate laws and protections leaves children without proper recourse when they are exposed to harm through social media. It is helpful to look at both the Fair Labor Standards Act and Coogan laws, but both are riddled with loopholes.
B. Fair Labor Standards Act
The Fair Labor Standards Act governs child labor, but in most circumstances it does not apply to children who are employed by their parents or guardians. State statutes follow suit and do not apply to children who are posted online by their parents. Existing child performer laws may be helpful in setting potential guidelines for child content creators, but they are arguably deficient because they allow for long hours. New York regulations dictate that “[o]utside of live theater and other live performance, a child performer may be employed no earlier than 5:00 a.m. on any day, no later than 10:00 p.m. on evenings preceding school days, and no later than 12:30 a.m. on the mornings of non-school days,” subject to limitations on working hours. California law generally establishes a ceiling of eight hours of work per day or 48 hours per week. Notably, as of 2022, 17 states did not have any specific protections for child performers. This inconsistency creates a gap in the law and allows parents of child performers to forum shop, meaning they could move to another state to evade labor guidelines. To create adequate labor standards, legislation must (1) pass nationwide to protect all child influencers on social media in the U.S. and (2) set more stringent parameters for hours worked per day. To implement stricter safeguards, states could require parents to obtain a permit in order to maintain their child’s social media account. This level of compliance would apply only to parents who regularly post their child online with the intention of profiting from AdSense or sponsorships. The typical parent who posts photos to social media would not be required to comply with the same regulations.
C. Monetary Earnings
A popular child content creator may not even receive adequate compensation for their labor if their parent mismanages their funds. The United States should adopt nationwide legislation mirroring the California Family Code provisions that require the creation of a Coogan account for child performers. California’s Coogan law generally requires the child’s employer to preserve 15 percent of the child’s gross earnings in a trust for their benefit until they come of age. The California Family Code allows for the parent or legal guardian to be appointed as the trustee, unless a court deems that “the appointment of a different individual, individuals, entity, or entities as trustee or trustees is required in the best interest of the minor.” Under this policy, parents have a fiduciary duty to safeguard the monetary earnings of their children, who may be the breadwinners for the family.
In 2018, California enacted “kidfluencer” legislation that was a shell of the original bill introduced by Assemblyman Kansen Chu. It adds “digital” exhibitions to Labor Code provisions concerning child performers, but it is a significant departure from the bill’s original purpose. The adopted law exempts minors from needing work permits if their performance is unpaid and shorter than one hour. Critics have pointed out that “if work permits aren’t mandatory for kidfluencers, their parents have no legal obligation to open a Coogan account.” These inconsistencies allow for financial abuse by parents or guardians of child influencers even when that is not their intention.
Recent legislation passed by France in 2020 may offer a workable solution: requiring “the majority of a child’s income garnered from social media influencing” to be placed in a “French public sector financial institution, which will hold and manage that money until the child comes of age.” Enacting similar nationwide legislation in the United States could help relieve parents of a portion of their fiduciary duties and shift the obligation from discretionary decision-making to mere compliance with the law.
Conclusion
In the United States, parents are entrusted with the rights of their children, and they are expected to safeguard their children’s best interests. Parents who seek to profit from their children’s online success may incidentally have interests that are antithetical to those of their children, thereby creating an insurmountable conflict of interest. Even when parents are careful to safeguard their children’s best interests, they may not be knowledgeable enough to combat online risks. These complications further exacerbate privacy concerns such as deepfakes and data collection. Posting about children before they can knowingly and fully consent to an online presence raises questions about giving children the autonomy to craft their own digital pathways. In addition, filming children in the home raises a myriad of labor concerns, including working hours and monetary earnings. The existing legal frameworks for child performers offer some guidance, but they have significant shortcomings. Protecting children and families requires more steadfast legislation, a general shift in attitudes toward safeguarding privacy, and holding social media companies and parents accountable.