Colorado leads the way in putting people before AI

By Abby Leeper Gibson & Gwen Battis

When you picture someone experiencing homelessness, what comes to mind? For many of us doing this work (social workers, advocates, organizers), we know there’s no single story. People lose housing for all kinds of reasons: stagnant wages, rising rents, medical debt, family trauma, bureaucratic gaps. No two paths look exactly the same, and there’s no one-size-fits-all solution.

That’s why it’s so important that the systems we build—especially the new ones—don’t flatten people into data points or overlook the lived complexity of their lives. But that’s exactly the risk we face as artificial intelligence (AI) becomes more embedded in how decisions are made.

Colorado is taking this risk seriously. This year, it became the first state in the country to pass consumer protections relating to AI, with a specific focus on guarding residents from bias and discrimination. Senate Bill 24-205, also known as Consumer Protections for Artificial Intelligence, will go into effect on Feb. 1, 2026, and, broadly speaking, puts new obligations on companies developing and deploying AI. In short: if a system is making decisions that significantly affect someone’s life—like determining eligibility for housing, employment, or insurance—it must be designed with “reasonable care to protect consumers from any known or reasonably foreseeable risks.” According to the National Association of Attorneys General, the law applies to “high-risk” AI systems that make decisions and provide assessments. It’s a pretty big deal.
What this means for Coloradans

Currently, AI systems are used to sort through rental applications, filter job candidates, assign risk scores to people experiencing homelessness, and make decisions about who gets approved for loans or public assistance. These tools can replicate the same biases our systems have always had—only faster, and behind a curtain of technology that makes discrimination harder to see and harder to challenge.

Historically disadvantaged communities—especially Black and brown Coloradans, LGBTQ+ individuals, and those living without stable housing—are often the first to bear the brunt of these harms. Colorado’s new law won’t fix that overnight. But it’s a crucial first step in acknowledging the danger and putting safeguards in place.

This legislation also sends a clear message: emerging technology doesn’t have to come at the expense of equity. Regulation doesn’t mean stifling innovation—it means guiding it toward fairness, dignity, and public good.

What’s next 

The Denver Basic Income Project is among a coalition of organizations working hard to combat bias and discrimination and to replace harmful narratives with a better understanding of homelessness and economic injustice in the U.S. Unfortunately, it often feels like our country takes one step forward and two (or more) steps back. But, in one regard, Colorado is leaping ahead.

As more industries turn to AI and automation, we need to prepare for deeper economic shifts. Automation is likely to affect employment across all sectors, from logistics to customer service to creative work. While there is potential for upskilling and new job creation down the line, what happens to Coloradans who are displaced in the meantime?

That’s where guaranteed income comes in. 

If we’re serious about building an economy that works for everyone—through technological change, not in spite of it—we need to create a baseline of economic security. Direct cash programs and income floors are not futuristic ideas. They’re practical responses to a rapidly changing world. 

Colorado is leading the way by putting people first in its approach to AI. Let’s keep that momentum going.
