The internet often feels weightless: a photo shared, a video streamed, a query answered. But those tiny acts add up. Data centres, networks and the devices we use all burn electricity, and that electricity creates greenhouse gases unless it comes from clean sources. Recent analyses and company reports indicate that a suite of practical, proven green-computing measures, from smarter software and improved cooling to circular hardware and clean power purchasing, can yield substantial reductions in the sector's carbon footprint. A recent review of green computing estimates that energy-saving hardware, AI-driven operations and circular e-waste strategies, applied together at scale, can cut energy consumption by 40–60% without sacrificing performance.

In This Article
- The Promise: What “60%” Really Means and Why it’s Credible
- Companies and People who Cut Carbon, and How they Did It
- How These Cuts are Achieved in Practice — The Tactics that Scale
- Actionable Advice: What CIOs, Engineers, and Even Everyday Users Can Do Now
- Conclusion: Realistic Optimism, Not Magic
The Promise: What “60%” Really Means and Why it’s Credible
When you read “60%” in a headline, it helps to be concrete. That number is not claiming the whole internet’s emissions will fall overnight by that share. Instead, researchers and industry reports mean this: in many real-world setups, a coordinated package of changes — upgrading inefficient servers, using efficient chips, switching to liquid cooling, shifting workloads to times with clean grid power, and buying or generating carbon-free electricity — can cut the operational emissions of particular data centres, clouds or IT estates by a majority compared with a business-as-usual baseline. A 2024 industry analysis from RMI lays out the technical road map — better site selection, high-efficiency systems, liquid cooling and flexible workload management — that together unlocks deep reductions in carbon intensity.
Large tech firms already show what’s possible. Google’s internal use of machine learning to tune cooling reduced cooling energy in some centres by up to 40%, a step-change that reduces total site emissions when combined with other improvements. “We used AI to forecast temperatures and tweak cooling setpoints in real time,” the DeepMind/Google reports explain — a practical example of how software can drive hardware-level savings.
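The reported approach, forecasting temperatures and adjusting cooling setpoints in real time, can be sketched in miniature. This toy proportional rule is an illustration of the general idea only; the function name, gain and temperatures are assumptions, not DeepMind's actual controller.

```python
# Toy sketch of predictive setpoint control: forecast the inlet temperature,
# then nudge the cooling setpoint toward a target. Illustrative only; this is
# not the DeepMind/Google system, and all values here are assumed.
def adjust_setpoint(current_setpoint_c: float, forecast_temp_c: float,
                    target_temp_c: float = 27.0, gain: float = 0.5) -> float:
    """Lower the setpoint when the forecast overshoots the target; raise it
    (spending less cooling energy) when the forecast runs cool."""
    error = forecast_temp_c - target_temp_c
    return current_setpoint_c - gain * error

print(adjust_setpoint(24.0, 29.0))  # → 23.0 (forecast hot: cool harder)
print(adjust_setpoint(24.0, 25.0))  # → 25.0 (forecast cool: relax cooling)
```

The energy saving comes from the second branch: whenever the forecast says the facility will run cool, the controller relaxes cooling instead of holding a conservative fixed setpoint.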
Governments and multilateral bodies are also taking stock. The International Energy Agency’s recent analysis of data-centre energy use provides an updated baseline for how much electricity the sector consumes and where efficiency can be captured — essential context for turning percentage savings into absolute carbon reductions.
Companies and People who Cut Carbon, and How they Did It
Big numbers become believable when you see the people behind them. Digital Realty, a major global data-centre operator, recently described operational and design changes intended to lower direct and indirect emissions substantially: their published goals include a 60% reduction in direct and indirect emissions per square foot by 2030 through a mix of renewables procurement, advanced cooling (including liquid cooling) and AI-driven optimisation. Aaron Binkley (VP of Sustainability) and Shea McKeon (Global Head of Design and Engineering) told Business Insider Africa that the company’s approach relies on engineering, early clean-energy planning and new operational tools — an industry-level example of turning targets into projects.
Another concrete win comes from applied AI in buildings and facilities. In a commercial-building pilot, an AI control system reduced HVAC energy use and cut a measurable share of emissions — demonstrating the same principle that drives data-centre AI projects: better real-time control reduces waste. The lesson is simple and repeatable: add sensing, predict demand, and optimise operations.
Regional and academic projects add complementary evidence. The Rocky Mountain Institute’s technical review and the ITU’s “Greening Digital Companies” work collect dozens of case studies showing measurable energy and carbon reductions when the right package is used (site choice, efficient hardware, circular supply chains and clean electricity). Those documents map technologies to outcomes, which is why policymakers and investors now treat green computing as investable decarbonization.

How These Cuts are Achieved in Practice — The Tactics that Scale
The tactics are straightforward; the challenge is doing them together and at scale.
First, software and operations matter. Smarter workload placement, autoscaling, and AI-based cooling or power scheduling reduce wasted cycles and trim cooling needs. Google’s DeepMind case is an iconic example where software lowered cooling energy by ~40% at some facilities.
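Autoscaling is the simplest of these software levers: size a service to actual demand so servers are not left idling. A minimal sketch of such a rule, with illustrative capacity and headroom numbers that are assumptions rather than any vendor's defaults:

```python
import math

# Sketch of a toy autoscaling rule: run only as many replicas as the current
# load needs, plus headroom. All thresholds here are illustrative assumptions.
def replicas_needed(requests_per_sec: float, capacity_per_replica: float,
                    headroom: float = 0.2, min_replicas: int = 1) -> int:
    """Smallest replica count that serves the load with some headroom."""
    needed = requests_per_sec * (1 + headroom) / capacity_per_replica
    return max(min_replicas, math.ceil(needed))

print(replicas_needed(450, 100))  # → 6 (450 × 1.2 / 100 = 5.4, rounded up)
print(replicas_needed(10, 100))   # → 1 (floor of one replica at low load)
```

Real autoscalers (in Kubernetes or cloud platforms) add smoothing and cooldowns, but the energy logic is the same: fewer running replicas at quiet hours means less electricity drawn and less heat to remove.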
Second, modern hardware and cooling systems pay big dividends. Liquid cooling and advanced chip design reduce server power draw and cut the electricity used per computation. RMI’s roadmap highlights liquid cooling and integrated system design as major levers for lowering the carbon intensity of new AI-grade data centres.
Third, energy sourcing and grid strategy. Buying or directly procuring carbon-free electricity, and designing data-centre loads to align with times when grids are cleaner, reduces the carbon per kilowatt. It’s a classic engineering-meets-policy play: some operators sign long-term renewable contracts; others build onsite or nearby generation. Google and Microsoft both report steady progress in purchasing clean energy and reducing data-centre emissions intensity through such measures.
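Aligning flexible loads with cleaner grid hours can be sketched very simply: given a carbon-intensity forecast, start the deferrable job at the greenest hour. The forecast values below are made-up placeholders, not real grid data.

```python
# Sketch: pick the lowest-carbon hour in a forecast window to start a
# flexible batch job. Intensities (gCO2/kWh per hour) are invented values
# shaped like a sunny day with a midday solar peak.
forecast = {0: 420, 3: 310, 6: 180, 9: 120, 12: 95, 15: 140, 18: 260, 21: 380}

def best_start_hour(forecast: dict) -> int:
    """Return the hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

print(best_start_hour(forecast))  # → 12 (midday, in this toy forecast)
```

The same job run at hour 0 versus hour 12 in this toy forecast would emit more than four times the carbon per kilowatt-hour, which is why operators treat scheduling as a decarbonisation tool, not just a capacity one.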
Fourth, systems thinking: reuse heat and reduce material emissions. Some facilities route waste heat into district heating systems or local industry (a project ACEEE described can reduce heating-sector emissions by up to 60% when waste heat replaces local boilers). Circular asset management — refurbishing servers, extending device life, and responsible e-waste programs — reduces the embodied emissions of the digital economy.
What makes the “up to 60%” claim realistic is that each tactic multiplies others. Software cuts load, hardware reduces consumption, clean energy reduces carbon per kilowatt, and circular practices shrink embodied emissions. Applied together, the combined effect can reach the levels researchers have modelled.
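Because the tactics act on different factors of the same product (compute demand × energy per computation × carbon per kilowatt-hour), their effects compound multiplicatively. A small worked example, with percentages that are illustrative assumptions rather than measured figures:

```python
# Illustration of how independent measures multiply. The individual cuts
# below are assumed round numbers, not results from any study.
software_cut = 0.20      # e.g. autoscaling and placement trim 20% of compute
hardware_cut = 0.25      # e.g. efficient chips/cooling cut 25% of remaining energy
clean_energy_cut = 0.35  # e.g. cleaner power cuts 35% of remaining carbon per kWh

remaining = (1 - software_cut) * (1 - hardware_cut) * (1 - clean_energy_cut)
total_reduction = 1 - remaining
print(f"Combined reduction: {total_reduction:.0%}")  # prints "Combined reduction: 61%"
```

Three individually modest cuts of 20–35% land the combined result in the 60% range — which is why no single silver-bullet technology appears in the research, only packages of measures.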
Actionable Advice: What CIOs, Engineers, and Even Everyday Users Can Do Now
For CIOs and engineering teams: run the stack audit. Measure energy per request, identify idle or underutilised servers, and prioritise software changes (autoscaling, efficient models, model distillation for AI) that immediately reduce compute demand. Consider liquid cooling and procurement strategies for new capacity, and include clean-energy clauses early in site selection. RMI and McKinsey both recommend combining “offence” (technology that reduces emissions across the business) with “defence” (reducing the footprint of IT itself) for the fastest progress.
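The first audit step, measuring energy per request, reduces to simple arithmetic once you have a power reading and a request count for the same window. A sketch, with invented numbers standing in for real telemetry:

```python
# Sketch of a stack-audit metric: joules spent per request, from average
# power draw and request volume over one measurement window. The figures in
# the example are illustrative, not from any real fleet.
def energy_per_request_j(avg_power_w: float, window_s: float,
                         requests: int) -> float:
    """Average energy (joules) consumed per request over the window."""
    return avg_power_w * window_s / requests

# A server drawing 300 W for one hour while serving 90,000 requests:
print(f"{energy_per_request_j(300, 3600, 90_000):.1f} J/request")  # → 12.0 J/request
```

Tracked over time, this one number makes the effect of autoscaling, model distillation or a hardware refresh directly visible, and it flags the idle servers whose per-request cost balloons at low traffic.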
For data-centre operators: treat operations and the grid as partners. Use AI tools to shift flexible workloads to cleaner hours, coordinate with utilities for demand flexibility, and explore waste-heat reuse with local towns or industry. Digital Realty’s roadmap is a useful playbook: engineering design, early renewable planning and cross-department collaboration are the practical steps they’re using to hit big targets.
For investors and policy makers: set clear signals and standards. Certification schemes, procurement rules that value lifecycle emissions, and incentives for low-carbon site selection accelerate the market. The ITU’s 2025 guidance and national energy agencies’ data make the case that policy can direct capital toward the technologies that deliver the largest cuts.
For everyday users and smaller orgs: choose greener cloud regions, turn off idle services, and prefer providers that publish clear emissions and energy-use data. Small steps across millions of users are meaningful — efficient software and responsible cloud choices directly reduce demand on data centres.
A short, practical checklist to start today (work with your engineering and procurement teams): measure energy intensity (e.g. energy per request, or kWh per model-training run), enable autoscaling and scheduled shutdowns, choose cloud regions with lower carbon intensity, request sustainability data from vendors, and plan hardware refreshes toward high-efficiency platforms.
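Once energy intensity is measured, converting it into carbon is one multiplication by the grid's intensity. A sketch of that last checklist step; the 350 gCO2/kWh grid figure is an assumed placeholder, and real regions vary widely:

```python
# Sketch: turn a measured energy figure into approximate emissions.
# The default grid intensity is an assumed placeholder, not a real value.
def training_emissions_kg(energy_kwh: float,
                          grid_g_per_kwh: float = 350.0) -> float:
    """Approximate CO2 emissions (kg) for a job of the given energy use."""
    return energy_kwh * grid_g_per_kwh / 1000.0

# A training run metered at 1,200 kWh on the assumed grid:
print(training_emissions_kg(1200))  # → 420.0 kg CO2
```

Running the same calculation with each candidate cloud region's published intensity is exactly how "choose cloud regions with lower carbon intensity" becomes a concrete, defensible decision.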
Conclusion: Realistic Optimism, Not Magic
“Green computing” is not mysticism; it’s applied engineering, procurement and policy. The evidence base is practical — field studies, industry reports and pilot projects — and shows that when organisations invest coherently across software, hardware, cooling and energy sourcing, they can often cut a majority of operational carbon. According to multiple industry and academic analyses, combining these measures is a credible pathway to large emissions reductions in the digital sector.
If you lead technology, sustainability or policy work, the takeaway is clear: the tools exist, case studies prove the impact, and integrated action will be essential as demand for compute keeps rising. The next step is not another study but practical deployment — and the companies that move now will both lower their emissions and gain resource savings and resilience as a result.