Wisconsin lawmakers consider AI policy on youth safety, legal personhood and data centers

Proposals would regulate teen “companion chatbots,” clarify that artificial intelligence cannot be recognized as a legal person under state law, and establish oversight for energy-intensive AI data centers.

by Jean Kiernan Detjen

Artificial intelligence (AI) is no longer theoretical in Wisconsin. It is in teenagers’ bedrooms, in legislative drafting rooms, and in proposals to build massive server complexes across small towns.

As the 2026 legislative session unfolds, lawmakers are confronting AI’s social, legal, and environmental reach – attempting to balance innovation with consumer protection, youth safety, and community oversight.

Youth, algorithms, and emotional dependence

Concern about young people’s exposure to digital platforms – including generative AI chatbots – has become one of the Legislature’s most immediate technology priorities.

A bipartisan Speaker’s Task Force on Protecting Kids, formed last fall, examined online risks and recommended several measures. Among the bills advancing through the Assembly Committee on Children and Families in early 2026 is Assembly Bill 965, which focuses on “companion chatbots” used by minors. These systems retain memory of past interactions and sustain humanlike dialogue, creating emotional bonds that worry parents, child-development specialists, and digital safety experts.

AB 965 would require operators to ensure chatbots do not:

  • Encourage dangerous behavior
  • Discourage seeking professional help
  • Promote illegal activity
  • Supply sexually explicit content
  • Prioritize a child’s preferences over factual accuracy or safety

Operators who violate these rules could face fines up to $25,000 per violation. Parents or guardians could pursue civil action if a child is affected, with enforcement handled by the Wisconsin Department of Agriculture, Trade and Consumer Protection or the Department of Justice.

State Rep. Benjamin Franklin, a co-author of the bill, emphasized the stakes:

"Our youth put a lot of stock into what these artificial intelligence tools tell them. Even with security protections, children logging online are still at risk."

Supporters say the legislation is a necessary guardrail, while critics caution that broad definitions could unintentionally include helpful educational tools or raise privacy concerns.

A personal story

For one Wisconsin family, those concerns became tangible.

Last fall, Sophie, 15, began using a chatbot for homework help. At first, the exchanges were casual: math questions, science assignments, even a few jokes. Over hours of use, the conversation shifted. The AI remembered prior chats, mirrored her tone, and responded instantly – sometimes in ways that reflected her frustrations or anxieties.

"It wasn’t about immediate danger," her mother, Jessica, said. "It was seeing her rely more on a machine than on family and friends."

Sophie’s parents worried less about any single conversation than the cumulative effect: a sustained emotional attachment to a system built to maintain engagement. Lawmakers advancing youth-protection bills argue that guardrails are necessary before these systems become more sophisticated and deeply embedded in daily life.

Critics warn that poorly defined rules could sweep in beneficial educational or therapeutic tools. Supporters counter that drafting precision – not inaction – is the answer.

AI and legal personhood

Another proposal addresses a question that once sounded hypothetical: can AI ever be considered a legal person?

Senate Bill 102, introduced by Sens. André Jacque and Mark Hoan and cosponsored by Reps. Lindee Rae Brill, Barbara Dittrich, and David Armstrong, would explicitly state that AI is not a legal person under Wisconsin law. If enacted, AI systems could not:

  • Own property
  • Enter into marriage (the so-called “robot marriage” provision)
  • Hold positions requiring legal personhood

Supporters describe the measure as preventative: ensuring accountability remains with corporations and individuals rather than shifting to automated systems as algorithms expand into hiring, lending, health care diagnostics, and public administration.

While largely symbolic for now, SB 102 reflects broader unease about how quickly AI capabilities are advancing beyond existing statutory frameworks.

Data centers and community resistance

The most visible AI-related conflict in Wisconsin centers on hyperscale data centers – massive server facilities that require significant electricity, water, and land.

Residents in Mount Pleasant, Mount Horeb, and Beaver Dam have raised concerns about environmental strain, energy demand, water usage, and transparency in local approval processes.

On Feb. 12, Rep. Darrin Madison, D-Milwaukee, introduced legislation to pause new hyperscale data center construction statewide until stronger environmental and planning standards are in place. His proposal would:

  • Create a statewide planning authority
  • Prevent utility costs from being shifted to residential ratepayers
  • Establish community recovery funds
  • Require expanded reporting before projects proceed

Madison and other Democrats argue current oversight mechanisms have not kept pace with the scale of proposed facilities. Republican lawmakers earlier advanced alternative siting and reporting standards, which they say balance economic growth with environmental review.

At recent Capitol hearings, protesters framed the issue in stark terms:

"These data centers aren't providing anything of real value to the people in Wisconsin, so why should we let them destroy our environment over being able to generate stupid deep fake garbage?" said Ash Petrie, one of the demonstrators.

Industry groups dispute that characterization. The Wisconsin Data Center Coalition says the projects represent billions in projected private investment, construction employment, and long-term technical jobs tied to the digital economy.

The debate reflects a familiar Wisconsin tension: economic development incentives weighed against environmental stewardship and local control.

Universities and workforce strategy

While lawmakers debate guardrails, Wisconsin universities are positioning the state for an AI-driven economic future.

On Feb. 12, five Milwaukee institutions – University of Wisconsin–Milwaukee, Marquette University, Medical College of Wisconsin, Milwaukee School of Engineering, and Waukesha County Technical College – announced an expansion of the Northwestern Mutual Data Science Institute. Launched in 2018, the partnership now includes all five institutions to accelerate AI research and workforce preparation.

"What really worries me is the pace of change and its implications on curriculum," said UWM Chancellor Thomas Gibson.

Northwestern Mutual CEO Timothy Gerend said the expansion strengthens research capacity and talent development. Gov. Tony Evers praised the collaboration as aligning economic opportunity with responsible growth.

National context

Wisconsin’s AI policies unfold amid a broader national conversation. Senators Tammy Baldwin and Ron Johnson offer contrasting perspectives. Baldwin emphasizes protecting residents from unintended consequences while promoting workforce development, whereas Johnson stresses balancing economic opportunity with state oversight and states’ rights in shaping AI policy.

Across the country, states including California, New York, and Illinois are adopting age-based safeguards, transparency requirements, and legal clarifications for AI systems. At the federal level, Congress and agencies such as the Federal Trade Commission are considering regulations on AI companions for minors, algorithmic accountability, and data privacy protections. Experts warn that emotionally responsive AI can influence adolescent development, creating a pressing need for thoughtful guardrails. In this context, Wisconsin’s actions – from companion chatbot regulation to data center oversight – position the state at the intersection of innovation, public safety, and civic responsibility.

A structural moment

Taken together, Wisconsin’s AI proposals illustrate how rapidly the technology has moved from novelty to infrastructure:

  • In households, it shapes adolescent behavior and emotional development.
  • In law, it raises questions of liability and accountability.
  • In communities, it tests land-use policy, water rights, and energy planning.
  • In higher education, it is redefining workforce strategy.

Public opinion remains cautious. Even lawmakers emphasizing economic opportunity acknowledge that unregulated technological expansion can produce unintended consequences.

For families like Sophie’s, the issue is immediate and personal. For policymakers, it is institutional – how to set boundaries before systems outpace regulation.

Artificial intelligence is now embedded in Wisconsin’s civic, economic, and social fabric.

The question facing lawmakers is no longer whether AI will influence the state’s future.

It is who will define the terms under which it does.

Wisconsin lawmakers consider AI policy on youth safety, legal personhood and data centers © 2026 by Jean Kiernan Detjen is licensed under CC BY-NC-ND 4.0
