The FTC Event that Wasn’t: The Attention Economy Workshop Misses an Opportunity for Meaningful Discussion

David Inserra

Tomorrow, June 4, the Federal Trade Commission (FTC) will hold a workshop titled “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families.” As if that title wasn’t clear enough, the event description doubles down on the FTC’s hostility to technology companies: “The event will bring together parents, child safety experts, and government leaders to discuss how Big Tech companies impose addictive design features, erode parental authority, and fail to protect children from exposure to harmful content.” Sadly, the workshop will likely be little more than a one-sided airing of grievances against tech companies. 

But it didn’t need to be this way. The FTC originally took a different approach: the event was first announced as “The Attention Economy: Monopolizing Kids’ Time Online” and featured a more balanced framing and set of participants.

I would know because I was invited to participate in the original event. While I am, of course, disappointed not to be included in the revised one, the real disappointment is that we lost the chance to have a fruitful discussion featuring different perspectives on an important policy issue. Indeed, even within the Trump administration, there seem to be significant differences of opinion between those looking to empower American technological innovation and those who believe tech companies are harming Americans and should be punished.

The FTC hearing, then, is a missed opportunity for experts, users, regulators, and policymakers to engage in a real conversation about how kids use technological products. But while I can’t join the FTC event, I’m thankful that we all benefit from technologies that allow us to keep engaging online. So, what follows is an abridged set of the remarks I would have delivered:

Good afternoon and thank you for having me today on this panel. I’m honored to provide my viewpoint on how technology is being used by kids and the ways policymakers should be thinking about the impact of such technologies. 

Parents and policymakers frequently claim that kids’ use of technology is harmful. From cell phones to social media and video-sharing platforms, modern technologies present both opportunities and challenges. The question this panel seeks to address is what the impacts of new technologies are and how society should handle those impacts.

My remarks will focus on three points: the primary concerns underlying current debates about kids online are similar to those in previous debates about kids and new technologies; those concerns are driven by the nature of the content kids consume rather than the design of the technology; and parents and families are best positioned to navigate the benefits and risks of children using technology.

A Short History of Panics around Kids and Technology 

Recent history is full of examples in which a new technology or use of technology for expressive purposes was deemed harmful to children. With the rise of television following WWII, policymakers and advocacy groups argued that television was responsible for increases in violence and juvenile delinquency. And, similar to today, policymakers found that there wasn’t conclusive evidence that television caused an increase in problematic behavior among children. But that did not stop many advocates from making that argument. 

A Senate report from 1955 worried that both the content and the nature of the technology harmed children. 

“The cumulative effect of crime-and-horror television programs on the personality development of American children has become a source of mounting concern to parents…The subcommittee is aware that no comprehensive, conclusive study has been made of the effects of television on children. [Yet] there is reason to believe that television crime programs are potentially much more injurious to children and young people than motion pictures, radio, or comic books. Attending a movie requires money and the physical effort of leaving the home, so an average child’s exposure to films in the theater tends to be limited to a few hours a week. Comic books demand strong imaginary projections. Also, they must be sought out and purchased. But television, available at a flick of a knob and combining visual and audible aspects into a ‘live’ story, has a greater impact upon its child audience.” 

Many groups, professionals, and academics argued that the content and new nature of television posed a threat to American youth. Noted thinkers like Walter Lippmann made arguments that are not out of place in today’s debate over social media. 

“Censorship is no doubt a clumsy and usually a stupid and self-defeating remedy for such evils. But a continual exposure of a generation to the commercial exploitation of the enjoyment of violence and cruelty is one way to corrode the foundations of a civilized society. For my part, believing as I do in freedom of speech and thought, I see no objection in principle to censorship of the mass entertainment of the young. Until some more refined way is worked out of controlling this evil thing, the risks to our liberties are, I believe, decidedly less than the risks of unmanageable violence.” 

New music is also regularly blamed for corrupting children. Rock music was accused of being linked to satanism and all sorts of other harms by many in society, notably by Tipper Gore and her Parents Music Resource Center. Rap music has continually been accused of pushing kids toward violence or drug use, most recently in modern drill rap. Whatever the specific form of modern music, society has regularly struggled with the types of musical expression our children should engage with, especially as new technologies make it easier for more children to consume it.

But in each case, there is no clear evidence that such expression causes harm to children. Many studies find that there may be a correlation between violent music and violent behavior, but not that violent music causes violence. That has not stopped many from continuing to blame various musical genres for various youth behaviors and harms. 

Similarly, the advent of ever more powerful computing brought with it violent video games that also stood accused of harming youth and inspiring violent, antisocial behavior. Many have blamed video games for acts of violence, but research consistently finds little to no evidence that video games cause children to engage in violence.

New technologies and new forms of expression have frequently been accused of harming children, but the evidence does not support these concerns. The accusations rest both on the violent, immoral, or otherwise dangerous nature of the content and on the inherent design of the technology, such as the ease with which children can access content or its addictive, gamified nature. These same accusations have reemerged in the debate over modern communication technologies and social platforms, and this is where my analysis will turn next.

Concerns about Content vs. Design 

In the modern debate over children and online speech, advocates for greater regulation of speech and technology will often try to distinguish between the content and design of online platforms. This distinction is necessary because First Amendment jurisprudence limits the authority of the government to regulate speech on the basis of the viewpoint of the content. While First Amendment experts will be able to unpack this point further, there has been an attempt to shift the debate to discuss the design features of new technologies rather than the violent, extreme, immoral, or otherwise objectionable nature of the content. Indeed, this panel is named the “attention economy” to refer to the design of new technologies that monopolizes the attention of children. 

But this distinction is one in name only. While new technologies certainly present new design elements and features, the supposed concern around the design of technology platforms is largely due to the types of content that these designs provide to users. To elucidate this point, some hypotheticals are worth considering. 

Imagine that the leading videos being watched by children online were videos about the Roman Empire, the Japanese Tokugawa shogunate, or countless other historical topics. Would we be holding congressional and FTC hearings full of experts accusing technology companies of harming the youth? Or imagine that the largest Facebook groups or subreddits were about gardening, landscaping, and garden-to-table cuisine. Would academics be searching to establish a causal relationship between gardening groups and kids’ aggressive behaviors and conditions? Or consider a world where X and Bluesky are dominated by self-help guides ranging from do-it-yourself instructions for common household tasks to conversations about how to be a more thoughtful, insightful, and humble individual. Would safety advocates and some parents be demanding that platforms change their design to protect children?

The answer to these hypothetical questions is clearly, “No.” Algorithms that serve more history videos, notifications telling children about a new gardening post, or some sort of gamified experience tracking how many DIY projects someone has tackled are not design features that policymakers care about. Just as rap and rock music on the radio and TV, but not classical, have been called harmful, we only care about the algorithms, autoplay, notifications, badges of honor, gamified experiences, and other design features because the content served by such features is found concerning, dangerous, or harmful by parts of our society.

But, some might argue, these new technologies are built to drive engagement and interaction. While it is true that many online platforms include various means of engaging their users, it is also true that many good products have features that make users want to use them. Good books are referred to as “page turners” or books “you can’t put down.” A good movie might be described as “gripping” or “compelling.” Good music might be called “catchy” or said to get “stuck in your head.” A good video game might be called “addictive” or “immersive.” The point is that good products are often designed to draw their users in and give them interesting experiences that leave them craving more. This might be due to the nature of the content or the products’ exciting features.

Social media is no different. It uses various design choices, including appealing curation, placement, and posting features, algorithms that surface interesting content, and notifications that try to tell users when something they may care about has occurred. What is the alternative? Making it difficult to post or view content? Algorithms that purposefully serve uninteresting content? Providing users with no notifications, or notifications about random events they don’t care about? Of course, few would openly suggest that companies should be required to make bad products, yet this is where this line of argument leads.

So, while many advocates, experts, and policymakers may sincerely believe they are concerned about the design of social media platforms, their fundamental concern may really be the viewpoint and nature of the content online.

As a parent, I understand and share those concerns. But the government is neither constitutionally nor practically able to pursue the best interests of my children when it comes to the kinds of content and technology they should engage with. The centrality of empowered parents and children, then, is where I will conclude.

Empowered Parents and Children 

When dealing with potentially harmful content and design features that may advance its reach, the answer must start with parents. Parents are responsible for their children’s curfew, the peers their children spend time with, the books, television, and movies their children consume, the schools they attend, the atmosphere in the home, and the whole host of parenting decisions that have major impacts on the well-being of their children. Thankfully, most parents no longer must worry about threats that previous generations faced, such as the many deadly diseases and conditions that have been addressed by modern science, nutrition, and health care. Instead, parents face the challenge of managing their children’s screen time and the content they see online. As with TV, music, and video games, the challenges facing parents involve popular technology and media widely adopted by children. And the sheer amount of content and new technologies certainly presents greater difficulties.

But no one is better positioned to address those challenges and difficulties than parents. Some children will use these technologies effectively in their studies; others will find them a distraction. Some children can be trusted or will learn to be responsible with new technologies, while others may need greater shepherding and limits. Some children may find such technologies to be a lifeline to communities they otherwise don’t have access to, whether it be because of geographic or ideological isolation. Some children may be kept safer because of such technologies, while others may find certain online situations unsafe or harmful. Whatever the situation and child, parents are best positioned to make decisions for their children. 

And tech companies are increasingly responding to the demands of parents by providing greater parental controls. YouTube Kids and Amazon Kids are kid-specific products that provide greater parental controls and more tailored choices for children. Instagram and TikTok have each significantly expanded the controls and tools built into their apps to help parents manage what their kids do online. Parents can also make use of controls built into devices, such as Apple’s Screen Time, or third-party apps like Qustodio and Net Nanny.

Are these offerings perfect yet? No. There will always be improvements to be made, and parents will have to decide whether the products and controls available meet their needs or should be avoided. But rather than clearly harming children and families, the market for technology products is increasingly providing parents and all individuals with better ways to manage the negatives of being online while expanding the many ways new technologies can benefit our society and our families.

Thank you, and I look forward to the discussion.