Tech
Study outlines steps for California to reach net-zero emissions by 2045
A 2022 California law mandates net-zero greenhouse gas emissions by 2045 and negative emissions every year thereafter. The state can achieve this but will have to act quickly and thoroughly, and success will require new technologies for sectors difficult to decarbonize, a new Stanford University study finds. The state will need to decarbonize not only cars and electricity but also trucks, trains, planes, agriculture, and factories, while slashing pollution from its oil refineries.
The research team created a new model that projects emissions, society-wide economic costs, and consumption of energy resources under many scenarios for California to reach net-zero emissions by 2045. The model uses data from U.S. federal agencies, national laboratories, California state agencies, past studies, and various other online public sources. (Data sources are provided in the study’s Appendix B.) The model forecasts that 170 gigawatts of new generation and 54 gigawatts of storage will be needed by 2045, compared with California’s current generation capacity of 80 gigawatts, as transportation, buildings, and industry transition from fossil fuels to low-carbon sources of electricity. This expansion of electricity supply will be needed despite expected gains in energy efficiency across many technologies.
The study, published this week in the journal Energy Policy, provides a detailed roadmap for meeting California’s net-zero mandate. First, commercially available technologies can slash the state’s emissions in half. Technologies proven at pilot scale that need commercial development and lower costs could address another 25%. The final quarter will rely on inventions still being worked on in laboratories.
“One key to success will be building an emission-free power grid using a combination of solar, wind, batteries, and sources of clean, firm power like natural gas with carbon capture and storage or nuclear power,” said the study’s senior author, Sally Benson, the Precourt Family Professor of energy science and engineering in the Stanford Doerr School of Sustainability.
The study, which was funded by several industry associations and trade unions impacted by the state’s move to net-zero emissions, also examines some policy and economic implications for the state.
“We will need to build this infrastructure at an unprecedented pace to put proven technologies to work at the scale we need,” added Benson, who was the chief strategist for the energy transition at the White House Office of Science & Technology Policy from 2021 to 2023.
First 52%: Commercial technologies
The necessary technologies already in commercial use that could halve California emissions include renewable electricity generation, batteries for storing that energy, electric passenger vehicles, heat pumps, and machines that produce methane fuel from wastewater, manure, and food and plant waste.
However, significant administrative and logistical barriers could stymie deployment of these technologies at the required speed and scale. The state already faces overwhelmingly long queues to connect new renewable energy generation and grid-scale energy storage to the grid. Local ordinances frequently block permits for new power plants. Other obstacles include the early termination of federal tax credits for EVs and home solar, federal challenges to California’s planned 2035 ban on sales of new gas-powered cars, elevated financing costs, and supply chain disruptions.
“California can build the infrastructure it needs to meet the 2045 mandate, but the state must implement policies to overcome regulatory and logistical barriers,” said the study’s lead author, Joshua Neutel, a Ph.D. student in civil and environmental engineering, a joint department of Stanford’s School of Engineering and Doerr School of Sustainability.
Several readily available measures save more money than they cost to implement, after accounting for state and federal incentives—many of which are slated to end in the coming months. The authors estimate electric passenger vehicles, solar and wind power, reduced in-state oil production, and replacement of fossil-based gas with methane fuel made through anaerobic digestion could eliminate 44% of the state’s greenhouse gas emissions (based on estimated 2045 emissions if the state were to continue business as usual).
Next 25%: Early-stage technologies
The authors estimate a quarter of emissions abatement could come from technologies in the early stages of commercialization, including zero-emission heavy-duty vehicles, clean industrial heating from electricity and hydrogen, and carbon capture and sequestration (CCS).
Eliminating carbon emissions from heavy-duty vehicles could reduce California emissions 12%. However, emission-free trucks still need to improve their range and cargo capacity while reducing charging time and purchase price. Another area in early-stage deployment involves switching several industries from fossil fuels to carbon-free electricity and green hydrogen. This accounts for 5% of emission reductions in the authors’ projections.
CCS entails capturing carbon dioxide directly at the source, such as at gas-fired power plants and factories, and securely sequestering the emissions deep underground. In some hard-to-decarbonize sectors, like oil refining and producing cement, hydrogen, and some electricity, CCS may be the most viable option in the near and medium term, according to the authors. The study confirms prior findings that a limited amount of natural gas power paired with CCS (34 of 170 gigawatts, or about 20% of new generation capacity) could vastly reduce the number and costs of wind and solar farms. Pairing bioenergy with CCS could remove another 2% of emissions from 2019 levels to reach net-zero emissions.
Final 23%: Research-phase technologies
Nascent technologies still in the research phase include decarbonized trains, planes, and boats; low-emission refrigerants; and carbon dioxide removal (CDR) from the atmosphere. Replacing fossil fuels for planes, trains, and boats with electricity, hydrogen, and renewable fuels faces challenges from their weight, cargo capacity, costs, and the limited availability of clean fuels.
Traditional refrigerants are powerful greenhouse gases up to 2,000 times more potent than CO2 during their first 100 years in the atmosphere. Climate-friendly alternatives, possibly including CO2 as a refrigerant, are still in the early stages of development.
CDR will play a significant role, with the researchers’ model projecting that California will need to sequester about 45-75 million tons of CO2 annually by 2045 through CDR, in line with the state’s 2022 forecast. Explored CDR options include bioenergy with CCS and direct air capture plants. The former emits but then sequesters biogenic CO2 through industrial processes like hydrogen and electricity generation. The latter extracts CO2 directly from ambient air and stores it underground.
“If net-zero by 2045 is a binding constraint, then large amounts of CDR will be needed,” said study co-author Sarah Saltzer, managing director of the Stanford Center for Carbon Storage. Current methods for extracting carbon dioxide from ambient air remain costly and energy intensive.
Political and economic implications
The study recommends several policy changes, including streamlining the permitting of, and grid connections for, new generation, energy storage, and power lines. This year, the state has taken initial steps to do this.
The research advises that California should consider incentives for adding CCS to existing natural gas-fired power plants. For example, it could qualify such power plants as one way for utilities to meet the state’s renewable portfolio standard. This could prevent expensive overbuilding of solar power plus batteries.
The study also supports maintaining the state’s EV sales mandate for 100% clean vehicles by 2035 and considering similar policies for building appliances. Policymakers could develop roadmaps for advancing “renewable natural gas” and “renewable diesel,” which are chemically equivalent to fossil-based natural gas and diesel but made from biological feedstocks, the researchers said. These fuels have a limited global supply but could be vital for decarbonizing hard-to-abate sectors.
“Reaching net-zero by 2045 is not so much a challenge in cost,” said Benson, “but a challenge in getting the necessary technologies available in time and establishing the social, political, and economic environment to deploy these technologies rapidly and broadly.”
More information:
Joshua Neutel et al, What will it take to get to net-zero emissions in California?, Energy Policy (2026). DOI: 10.1016/j.enpol.2025.114848
Citation:
Study outlines steps for California to reach net-zero emissions by 2045 (2025, September 28)
retrieved 28 September 2025
from https://techxplore.com/news/2025-09-outlines-california-net-emissions.html
Need One Pair for Hiking, Traveling, and Working Out? Try Gravel Running Shoes
HOKA’s max-stacked Rocket X Trail combines road race shoe energy with boosted grip from a 3-mm lugged outsole. If you’re looking for a fast shoe to go on the attack, this is it. It’s also fantastic for all-round comfort. In testing, I laced up the Rocket X Trail and ran 3 hours (just short of 19 miles) fresh out of the box, across roads, forest gravel trails, some grass, and through some serious water. It delivered efficiency and energy whether I was moving at marathon pace or with heavier, tired, ragged footfalls in the latter miles.
The rockered, supercritical midsole uses HOKA’s liveliest foam, similar to the ones in its race-ready road shoes, along with a carbon plate. The combination makes for a really fun ride that’s smooth, springy, fast, and remarkably consistent. It’s also highly cushioned, so you sacrifice a lot of ground feel for that big-stack springy softness, and it’s less stable over very lumpy terrain. But on open, flat, runnable mixed terrain, it’s excellent.
The lightweight uppers have a race-shoe-ready feel, and after running through ankle-deep flooded sections, they shed water really quickly. This is a pricey road-to-trail shoe, but it’s versatile, and there’s plenty of winter road potential, too.
| Spec | Value |
|---|---|
| Weight | 9.45 oz |
| Heel-to-toe drop | 6 mm |
| Lug depth | 3 mm |
If a Garmin Is Too Expensive, Consider Suunto’s Latest Adventure Watch
It’s always pleasing to see an array of physical buttons, and you get sizable ones here too. You’re not going to miss these wide, flat buttons even when picking up the pace. The silicone strap has a nice stretch to it, and while the button clasp is a bit awkward to get into place, the watch does not budge.
Suunto has jumped on the flashlight trend, with an LED light strip sitting on the front of the case. You can adjust brightness levels, and there are SOS and alert modes that emit a very noticeable pulsating light pattern. I found the light useful when rooting around indoors as well as on nighttime outings.
The biggest change is the introduction of a 1.5-inch, 466 x 466 AMOLED display. This replaces the dull, albeit very visible, memory-in-pixel (MIP) display. Suunto also ditched the solar charging, which required spending a significant amount of time outside to reap its battery benefits.
Adding AMOLED screens to outdoor watches has been contentious, since the older MIP displays are simply more power-efficient. In what Suunto calls daily use, the Vertical 2’s battery life is down by about 10 days compared with the older Vertical.
Still, even if you’re putting its tracking and mapping features to use, you’re not going to be reaching for the charger every few days. After two hours of tracking in optimal GPS mode, the battery only dropped by 2 to 3 percent. The battery drop outside of tracking is also small and the standby performance is excellent as well.
Software Updates
A more streamlined set of smartwatch features helps reserve battery for when it really matters. Unfortunately, you don’t get phone notifications or responses when the watch is paired to an iPhone instead of an Android phone, which probably contributed to my better battery life. There’s also no onboard music player, but you do get a pretty slick set of music playback controls that are accessible during tracking.
Electronic health records are still creating issues for patients | Computer Weekly
Every NHS trust in England needs an electronic patient record (EPR) system in place by March 2026, as part of a government push to digitise the healthcare system.
In many ways, this is long overdue: some trusts were still using pen-and-paper record-keeping until very recently.
EPRs have the potential to massively improve efficiency in the NHS. If working properly, they allow doctors to keep all of their records in one place, speed up prescribing and diagnostics, and make it easier for patients to access their own health information.
But these roll-outs have not been without problems. Concerns have been raised about how far these benefits can actually be realised. Some NHS trusts have experienced issues with integrating new systems and training staff on how to use them.
In the extreme, there have been reports of EPRs creating new problems for hospitals, with evidence suggesting these systems may have contributed to serious harm and even deaths among patients.
NHS trusts have been put in charge of procuring their own EPRs, meaning there are numerous different technology companies involved. Some providers of these systems are large US firms. This includes Oracle Health, provided by the Larry Ellison-led tech giant, and Epic, a tech firm based in Wisconsin.
Contracts can run into nine figures: Guy’s and St Thomas’, a trust in South London, launched a £450m system from Epic in late 2023. Some parts of the NHS have been using them for more than a decade, but a handful are still set to miss the government’s March deadline.
Data access
Pritesh Mistry is a fellow at the King’s Fund, where he researches the impact of digital transformation in the NHS. He says it has had “both positive and negative impacts”.
“In the last few years, we’ve seen doubling down on the focus around digital records,” says Mistry. These are now in place in more than 90% of all trusts, and every GP practice.
“That means we’ve now got [new] data that’s within the healthcare system, which allows us to do other things, like treat populations, and understand and track patient safety,” he says.
Despite this, he cautions some patients are still struggling to get hold of their own data.
“We’ve got a lot of data that’s in silos,” says Mistry. “It doesn’t flow. That’s the biggest challenge: making the data accessible and usable for patients and healthcare professionals to be able to provide care in a way that is joined up and meets with modern expectations.”
He says complaints with new technology haven’t just come from patients.
“We need to recognise that staff are really frustrated,” says Mistry. “Software often crashes. Computers are really slow, and technology adds to their workload, instead of simplifying things.” He caveats that some parts of the NHS are better than others on this.
Safeguarding patient data
Mistry adds that there are safeguards in place to ensure patient data isn’t ending up where it shouldn’t be – such as through data protection rules and procurement requirements.
However, he warns that “we need to make sure we move with the times in terms of what technology is available”. Mistry is more concerned about medical staff inadvertently putting personal information into a large language model, for instance.
“Digital exclusion remains a barrier as well,” he says, adding that these systems have the potential to widen inequalities in healthcare. Those less able to use new technology might struggle to access their records.
“People tend to assume it’s old people [who are most impacted], but that isn’t necessarily true,” says Mistry, instead highlighting the impact of poverty and deprivation, with some still unable to afford internet access.
He argues the NHS should be working to meet people where they are, and provide more “tailored” technology services.
Patient safety
Nick Woodier is a doctor and investigator at the Health Services Safety Investigations Body (HSSIB), which looks into issues with healthcare in the UK. He sees problems arising from how EPRs are deployed by trusts, especially when medical staff overestimate their capabilities.
He uses the example of prescribing medicines: “There’s an assumption that these electronic prescribing systems will stop you [from] doing something catastrophic.”
But this isn’t always the case. In one investigation, the HSSIB found a child had been prescribed nearly 10 times the recommended dose of an anti-coagulant medication, with doctors having assumed the EPR would flag an issue. The child ended up with a bleed on their brain.
Woodier also worries hospitals are not always picking up on when these systems are at fault.
“We will often see where incidents have happened and the contribution of the electronic system has not been recognised,” he says.
Woodier sees this as coming from a culture which prefers to put the blame for safety failures on individuals.
A 2024 investigation by the BBC found more than 126 instances of serious harm related to EPR problems registered across 31 NHS trusts, including three deaths.
The HSSIB has also encountered problems from patients being unable to access their digital records.
“We’ve seen in general practice, for example, some patients telling us that they’ve gone without care – because in their mind, they thought the only way they could access their GP was to fill in an electronic form,” says Woodier.
A spokesperson for NHS England says EPRs are “already having a significant impact on improving safety and care for patients”, for instance, by helping to identify conditions such as sepsis, and preventing medication errors.
“They have replaced outdated and often less-safe paper-based systems, and we are working closely with NHS trusts to ensure they are implemented safely alongside other systems with appropriate training – and are used to the highest quality and safety standards,” the spokesperson adds.
Interoperability
The EPR roll-out has also been criticised for problems with “interoperability” – the ability of different programs and modes of data collection to converse with each other. The patchwork of different systems used by different trusts means data stored in one system might not be useful for a system used by a different part of the NHS.
Woodier says this often happens in communications between hospitals and GP surgeries. This can involve someone manually inputting information from one system to another, which can create risks when data is not being transferred properly, or is missed completely.
“When you introduce a manual operation, that risk increases,” he warns. “The odds are that at some point, somebody won’t do the right thing, because that’s the reality of being human.”
Alex Lawrence, a fellow at the Health Foundation, describes interoperability as a “significant challenge”, which the NHS and technology companies have been “grappling with for a really long time”.
“Some trusts have found it much harder to access their own EPR data than they anticipated, because of where that data is stored,” she adds, referring to research the organisation carried out in 2024.
“If it’s taking you days to pull the data that you need, then it’s already not going to be useful for a lot of the purposes that you might want it for.”
However, Lawrence adds that there have been some steps made in the right direction, notably with the Data (Use and Access) Act, which was passed last year.
“The government is making information standards mandatory for EPR providers, as well as trusts, with the Secretary of State potentially having more powers to enforce those standards,” she says.
The longer term
Going forward, Lawrence would like to see a system involving “patients being empowered with access to their own data, and as far as appropriate, clinicians being able to see all of the history that they need for their patients”.
In an ideal system, different parts of the healthcare system would be able to “share a patient’s data where necessary and appropriate, in an easy and timely way”.
She says they have the “potential to offer enormous value”, but much of their functionality is going unused. “What our qualitative research suggested was that a lot of these systems are still functioning as digital notebooks,” says Lawrence.
Matthew Taylor is the head of the NHS Confederation and NHS Providers, membership bodies for healthcare organisations.
“NHS leaders say the gap between trusts on digital maturity is still stark – and it’s shaping how quickly organisations can move to modern EPRs,” he says.
This gap – combined with the organisational complexity of the healthcare system – means interoperability has “long been a thorn in the NHS’s side”.
Taylor adds that EPRs are not a “once-and-done” job, and argues they will result in savings in the long term, but that it may take around five years to see the benefits.
“Hospitals are housing a huge amount of paper records, and the cost of storing, retrieving and managing those records can run into millions of pounds each year,” he says.
These systems are part of a larger picture, and one facet of the conversation, around the use of artificial intelligence in the NHS. AI models for areas such as research and diagnostics will require extensive and standardised medical data.
Mistry warns these AI tools operate on the basis of “garbage in, garbage out”.
“There is a risk that we roll out AI tools without the underpinning data quality it needs,” he says, adding that this could exacerbate inequalities or biases from using AI.
As Woodier puts it: “We’ve got organisations who are still using archaic computers, have got infrastructure that’s not working, are still on old web systems, or have EPRs that don’t talk to each other. A few [trusts] don’t have EPRs.
“So, actually, are we trying to run before we’ve even managed to walk?”