
Utility Rate

The price per kilowatt-hour (kWh) you pay your electric utility — the single biggest variable in solar financial returns, as higher rates mean larger savings.

Your electricity rate (measured in cents or dollars per kWh) directly determines solar's financial value. The higher your rate, the more each kWh of solar production saves. Average U.S. residential rates range from $0.08/kWh (Louisiana, Oklahoma) to $0.30+/kWh (Hawaii, California, Massachusetts).

Rate increases compound the value of solar over time. Average U.S. electricity rates have risen roughly 2.5–3% annually over the past 20 years. At a 3% annual increase, electricity that costs $0.14/kWh today will cost about $0.25/kWh in 20 years. Solar locks in zero-marginal-cost generation for 25+ years, an effective hedge against utility rate inflation.
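The compounding in that projection is straightforward to check. A minimal sketch (the 0.14 starting rate, 3% escalation, and 20-year horizon are just the figures from the paragraph above, not data for any particular utility):

```python
def escalated_rate(rate_today: float, annual_increase: float, years: int) -> float:
    """Price per kWh after a given number of years of compound escalation."""
    return rate_today * (1 + annual_increase) ** years

# $0.14/kWh today, 3% annual increases, 20 years out:
future = escalated_rate(0.14, 0.03, 20)
print(f"${future:.2f}/kWh")  # about $0.25/kWh, matching the text
```

The same function makes it easy to test other assumptions, e.g. a 2.5% escalation gives a noticeably lower 20-year price, which is why the escalation rate an installer assumes matters in savings projections.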

Rate structures matter beyond the per-kWh charge: fixed monthly charges (customer charges) that you pay regardless of usage, demand charges (based on peak usage in 15-minute intervals, common in commercial accounts), and tiered rates (higher rates for higher usage) all affect solar economics. Understanding your current rate structure requires reading your actual utility bill carefully — not just the "average cost per kWh" summary.
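To see why the "average cost per kWh" summary misleads, it helps to decompose a bill into the pieces solar can and cannot offset. A sketch with hypothetical tier sizes, prices, and fixed charge (real utilities vary widely):

```python
# Hypothetical tiered rate: (kWh in tier, $/kWh). Solar offsets consumption
# at the highest (marginal) tiers first, so its value per kWh can exceed
# the bill's blended average rate.
TIERS = [(500, 0.12), (500, 0.18), (float("inf"), 0.25)]
FIXED_CHARGE = 15.00  # $/month customer charge, paid regardless of usage

def energy_charge(kwh: float) -> float:
    """Energy cost under tiered pricing, filling each tier in order."""
    total, remaining = 0.0, kwh
    for size, price in TIERS:
        block = min(remaining, size)
        total += block * price
        remaining -= block
        if remaining <= 0:
            break
    return total

usage = 1200  # kWh/month
bill = FIXED_CHARGE + energy_charge(usage)
avg_rate = bill / usage
marginal_rate = 0.25  # rate on the last kWh consumed at this usage level
# Solar production first displaces kWh billed at the marginal rate,
# while the fixed charge is unaffected no matter how much solar produces.
```

Under these assumed numbers the blended average rate is well below the $0.25/kWh marginal rate, so valuing solar at the bill's average understates the savings on a tiered plan.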

Real-World Example

The installer's production estimate showed 11,000 kWh/year; at the homeowner's current $0.18/kWh rate, that works out to $1,980/year in savings. The homeowner noted her utility's $15/month fixed charge would remain after installation. Because fixed charges are paid regardless of usage, solar's energy savings stay at the full $1,980/year, but the $180/year in fixed charges continues to appear on her bill and can never be offset by production.
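The arithmetic in the example can be laid out directly (all figures are from the example itself):

```python
production_kwh = 11_000   # installer's annual production estimate
rate = 0.18               # homeowner's current rate, $/kWh
fixed_charge = 15 * 12    # customer charge, $/yr, paid with or without solar

# Solar savings come only from the energy portion of the bill; the fixed
# charge appears in both the "before" and "after" bills, so it cancels out
# of the savings comparison even though it keeps the bill from reaching zero.
energy_savings = production_kwh * rate  # ~ $1,980/yr
```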

Related Terms

Net Metering
Time-of-Use (TOU) Rate
Solar Payback Period
LCOE (Levelized Cost of Energy)