As a cultural anthropologist, I spent the past year examining what it means for workers in Atlanta to be perceived as “AI literate.”

I have seen firsthand how uncertainty around the goal of “AI literacy” has stalled progress in developing systems and policies that protect workers and communities.

Each of the many artificial intelligence (AI)-focused events I attended over the course of my research echoed the promise that Georgia can lead the nation in AI adoption.

On paper, the state is well-positioned to do so: Georgia Tech alone hosts multiple National Science Foundation (NSF)-funded AI institutes, and a growing network of public-private partnerships is eager to brand the Atlanta metropolitan area, in particular, as an AI hub.

Yet, behind the optimism, I heard public officials voice unease about the risks of adopting AI without a clear framework for oversight or accountability. And they are right to be concerned.

Three challenges that obstruct AI literacy implementation


Over the course of my fieldwork, I observed three structural challenges that prevent Georgia from implementing AI in ways that safeguard workers and communities.

First is unequal public infrastructure.

Across the city, there is uneven access to the tools necessary to participate in an AI-driven economy.

In my interviews with experts, I learned that growth in Atlanta’s tech sector is outpacing public investment in basic services.

Notably, internet service runs slower in historically Black neighborhoods in Atlanta than in areas like Buckhead or Midtown.

A business leader framed it as a broader failure in how public investment works, recalling the early days of the internet, when widespread access allowed more people to explore and discover its value.

Without concrete, public-sector investments and policies that ensure access, equity, and opportunity in the tech ecosystem, AI risks reinforcing existing disparities between communities — to the advantage of those who are already privileged.

Second is limited government capacity.

While the public sector bears responsibility for regulating AI, already underresourced and overburdened agencies lack the staffing, funding, and technical expertise to do so effectively.

A former government official now working at an AI startup described a structural constraint on public-sector innovation, noting that “there’s no R&D part of our budget,” unlike private firms such as Apple or Google, which can devote a significant share of their spending to developing new products and services.

Without the resources to innovate internally, government agencies are largely reactive, unable to set standards or enforce accountability in AI adoption, while private firms surge ahead and set the terms of innovation with minimal oversight. Instead of merely reacting to market trends or urging workers to “keep up” with AI, state and local institutions need to set standards for job quality as well as racial equity, enforce safeguards against discrimination, and require transparency and accountability in AI deployment.

Finally, there are regulatory gaps and uncertainty around enforcement. The existing AI-related legislation in Georgia is narrow in scope. Measures such as House Bill 147 focus primarily on the use of AI within state government agencies, rather than addressing how private employers deploy AI in hiring, evaluation, and termination decisions. As a result, some of the most consequential uses of AI in the labor market remain largely unregulated at the state level.

Clear penalties and enforcement on AI front are key

State Sen. Nikki Merritt, D-Grayson, a sponsor of Senate Bill 167 (SB 167), one of Georgia’s first AI accountability bills, told me more about the state’s regulatory gaps. Introduced in the Georgia State Senate in February 2025, SB 167 proposes a framework for responsible AI deployment by private entities and safeguards against algorithmic discrimination in consequential decisions pertaining to employment, housing, and credit.

Sen. Merritt told me that lobbyists swarmed her after the bill’s hearing to inquire about who would enforce it. Under Georgia law, the enforcement of this legislation would fall to the state’s attorney general, just as it does at the federal level. In our conversation, she acknowledged that the details of penalties, reporting mechanisms, and thresholds for violations will need to be clarified for the bill to have its intended effect, especially given industry opposition to AI regulation efforts.

Without clear penalties, enforcement mechanisms, and thresholds, any state-level AI regulation risks being more aspirational than actionable. On a national level, this regulatory vacuum puts Black, low-income, and rural communities at disproportionate risk. Even before the AI boom, public policy was failing to secure the labor protections, equity mandates, and support systems, such as child care, transportation, and digital access, that these communities need to sustain stable, dignified work.

Public officials now face a defining choice: Invest in public infrastructure, regulatory frameworks and enforcement mechanisms that protect workers – or allow AI-driven inequities to harden. Georgians deserve more than aspirational rhetoric. They deserve AI regulations that safeguard them from harm and displacement, rather than sacrificing their well-being to accelerate adoption.

Anuli Akanegbu, Ph.D., is a cultural anthropologist and the author of the report “(404) Job Not Found: What Workforce Training Can’t Fix for Black Atlantans in the Age of AI,” published by the Data & Society Research Institute. Her work examines how Black workers in the U.S. navigate structural inequality, technological change, and economic instability.

