A new poll finds that approximately 80 percent of Americans believe data centers should be required to generate their own energy rather than rely on existing electrical grid infrastructure, underscoring growing public concern over the facilities' massive power consumption.
The survey results come as data centers continue expanding across the United States to support cloud computing, artificial intelligence, and digital services, with many facilities consuming electricity equivalent to small cities. Industry analysts estimate that data centers currently account for roughly 2-3 percent of total U.S. electricity consumption, a figure projected to double within the next decade.
“There’s a clear public sentiment that these massive commercial operations shouldn’t burden existing infrastructure that serves residential communities,” said one energy policy analyst familiar with the polling data. The survey reportedly drew responses from more than 1,000 adults nationwide and was conducted in recent weeks.
Tech companies have increasingly faced scrutiny over their environmental impact and resource consumption. Major cloud providers including Amazon, Microsoft, and Google have made commitments to renewable energy, though most still rely heavily on grid electricity for their expanding data center operations.
Energy experts note that requiring self-generation could accelerate adoption of renewable energy technologies, since data centers would have strong incentives to invest in on-site solar, wind, and battery storage. Industry sources counter that such requirements could significantly raise operational costs and potentially slow technological innovation.
The polling results may influence upcoming legislative discussions, as several states consider regulations addressing data center energy consumption and environmental impact. Lawmakers are expected to weigh public opinion against economic development concerns as the digital economy continues expanding.