Comparison with Overhead Lines
Performance Advantages
Underground power lines exhibit superior reliability compared to overhead lines primarily due to reduced exposure to environmental and external hazards, resulting in fewer outages overall. Empirical data from global utilities indicate that underground systems experience approximately one-third the outage frequency of overhead systems, with averages of 5.0 outages per 100 km for underground versus 16.1 per 100 km for overhead.[31] In specific U.S. contexts, such as North Carolina utilities from 1998-2002, underground lines showed an interruption rate of 0.3 per mile, half that of overhead lines at 0.6 per mile.[31]
Weather resilience represents a core performance edge, as underground lines are largely insulated from the wind, ice, storms, tree contact, and lightning strikes that frequently disrupt overhead infrastructure. During Hurricane Irma in 2017, Florida Power & Light's undergrounded distribution systems recorded a 4% outage rate, compared to 24% for unhardened overhead systems.[1] Conversions from overhead to underground have yielded measurable reliability gains: lines converted by the utility Wisconsin Public Service from 2012-2021 showed a 95% improvement in the System Average Interruption Duration Index (SAIDI), cutting storm-related outage durations by 137 minutes.[1] Similarly, Virginia Electric and Power Company's projects from 2016-2022 achieved a 99% improvement in the System Average Interruption Frequency Index (SAIFI) and 27% faster restoration times during a 2022 snowstorm.[1]
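The SAIDI and SAIFI indices cited above are defined in IEEE Std 1366 as customer-weighted averages over a utility's outage records. A minimal sketch of both calculations, using entirely hypothetical outage data:

```python
# Illustrative computation of the SAIDI and SAIFI reliability indices.
# The outage records and customer count below are hypothetical.

def saidi(outages, customers_served):
    # SAIDI = total customer-minutes of interruption / customers served
    return sum(n * minutes for n, minutes in outages) / customers_served

def saifi(outages, customers_served):
    # SAIFI = total customer interruptions / customers served
    return sum(n for n, _ in outages) / customers_served

# Each record: (customers interrupted, outage duration in minutes)
outage_log = [(1200, 180), (500, 90), (3000, 240)]
customers = 10_000

print(f"SAIDI: {saidi(outage_log, customers):.1f} min/customer")   # 98.1
print(f"SAIFI: {saifi(outage_log, customers):.2f} interruptions/customer")  # 0.47
```

A "95% SAIDI improvement" in the conversions above means the converted lines' customer-weighted interruption minutes fell to roughly one-twentieth of their prior value.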
Stanford research across U.S. systems finds a broader correlation: a 10% increase in underground line miles corresponds to a 14% reduction in annual customer interruption durations, consistent with undergrounding enhancing grid stability.[1] Underground configurations also mitigate risks from vehicle collisions with poles and wildlife interference, further lowering unplanned downtime.[32] In fire-prone areas, such as those served by Pacific Gas & Electric, undergrounding projects are projected to reduce wildfire ignition risk by 99%.[1] These advantages stem from the physical burial of conductors, which eliminates aerial vulnerabilities while maintaining electrical performance under normal loads.
Performance Disadvantages
Underground power cables exhibit reduced current-carrying capacity, or ampacity, compared to equivalent overhead lines primarily due to constrained heat dissipation mechanisms. While overhead conductors benefit from natural convection and radiation in ambient air, underground cables rely on thermal conduction through surrounding soil or backfill, which imposes higher thermal resistance and limits permissible operating temperatures to prevent insulation degradation.[33][34] This results in underground cables typically requiring 20-50% larger conductor cross-sections to achieve comparable power transmission ratings, or operating at derated levels, particularly in high-load or warm soil conditions where soil moisture and thermal resistivity further exacerbate limitations.[34]
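The thermal constraint described above can be illustrated with a simplified steady-state heat balance, in which conductor losses (I²R) must flow out through the total thermal resistance of insulation and soil. The resistance and temperature figures below are assumed, representative values, not a substitute for a full IEC 60287-style rating calculation:

```python
import math

def ampacity(delta_theta, r_ac, t_thermal):
    # Steady-state balance: I^2 * r_ac = delta_theta / t_thermal,
    # so permissible current falls as soil thermal resistance rises.
    return math.sqrt(delta_theta / (r_ac * t_thermal))

R_AC = 6.0e-5   # ohm/m, assumed AC resistance of a 400 mm^2 copper core
DELTA = 70.0    # K, 90 C XLPE conductor limit minus 20 C soil ambient

moist_soil = ampacity(DELTA, R_AC, 1.5)  # total thermal resistance, K.m/W
dry_soil   = ampacity(DELTA, R_AC, 3.0)  # dry soil roughly doubles it

print(f"moist soil: {moist_soil:.0f} A, dry soil: {dry_soil:.0f} A")  # 882 A, 624 A
print(f"derating: {1 - dry_soil / moist_soil:.0%}")                   # 29%
```

Because ampacity scales with the square root of thermal resistance, a doubling of soil thermal resistivity cuts the rating by roughly 30%, which is why moisture migration away from a hot cable is a significant design concern.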
Fault location and repair processes for underground lines introduce significant performance drawbacks in terms of outage duration and system availability. On overhead lines, visual inspection or a basic patrol can quickly identify damage from events like storms or vegetation contact. Underground faults, often caused by insulation failure, corrosion, or digging strikes, instead require specialized techniques such as time-domain reflectometry, sectionalizing, or excavation to pinpoint, extending mean time to repair from the hours or days typical of overhead systems to weeks or even months in complex urban or long-distance installations.[35][36][37] Empirical data from utilities indicate that while underground lines experience fewer weather-related interruptions, their longer repair times can multiply outage impacts, with some analyses showing lower overall reliability for aging underground infrastructure than for well-maintained overhead equivalents.[38]
Underground systems also demonstrate diminished capacity to handle transient overloads, a critical performance metric during peak demand or contingency events. Overhead lines can temporarily exceed rated ampacity through dynamic thermal ratings aided by cooling airflow, whereas underground cables risk thermal runaway or dielectric breakdown under similar stresses due to slower heat rejection, necessitating more conservative loading practices or additional cooling infrastructure such as forced-ventilation ducts.[39][40] Furthermore, the much higher capacitance of underground AC cables generates large capacitive charging currents and reactive power, reducing effective power-transfer capacity and requiring compensatory devices such as shunt reactors, which add complexity and can limit voltage stability during dynamic operations.[41]
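The reactive-power penalty can be quantified with the standard three-phase charging formula Q = ωCV², where C is the per-phase capacitance per unit length and V the line-to-line voltage. The 0.2 µF/km figure below is an assumed, typical value for high-voltage XLPE cable:

```python
import math

def charging_mvar_per_km(v_ll_kv, cap_uf_per_km, freq_hz=50.0):
    # Three-phase charging power per km: Q = omega * C * V_LL^2
    omega = 2 * math.pi * freq_hz
    return omega * cap_uf_per_km * 1e-6 * (v_ll_kv * 1e3) ** 2 / 1e6

# Assumed 220 kV XLPE cable with ~0.2 uF/km per-phase capacitance
q = charging_mvar_per_km(220, 0.2)
print(f"{q:.2f} MVAr/km")  # 3.04 MVAr/km
```

At roughly 3 MVAr per kilometre, a 50 km circuit generates on the order of 150 MVAr of charging power, which consumes cable current capacity and is the reason long AC cable routes are fitted with shunt reactors or built as HVDC instead.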
Ongoing monitoring and maintenance pose additional performance hurdles, as underground installations preclude routine visual or non-invasive assessments feasible for overhead lines, leading to undetected degradation from factors like water ingress or soil movement that progressively erode long-term efficiency and capacity.[32][40] These constraints collectively contribute to scenarios where underground lines underperform in high-growth or variable-load environments without substantial engineering mitigations.
Cost-Benefit Evaluations
Underground power lines typically incur significantly higher upfront installation costs compared to overhead lines, often ranging from 5 to 10 times greater due to trenching, cabling, and protective measures required. For instance, a 2013 study by the Electric Power Research Institute (EPRI) estimated that undergrounding medium-voltage distribution lines in suburban areas could cost $1.5 to $3 million per mile, versus $200,000 to $500,000 per mile for overhead equivalents. These elevated capital expenditures stem from excavation, specialized insulated cables, and thermal management systems to prevent overheating in buried conduits, factors absent in overhead designs.
Lifecycle cost analyses, however, reveal potential offsets through reduced maintenance and outage-related expenses. Overhead lines demand frequent repairs from weather damage, vegetation management, and corrosion, with annual maintenance costs averaging 1-2% of initial investment, while underground systems require less intervention post-installation, potentially lowering total ownership costs over 30-50 years. A 2018 analysis by the California Public Utilities Commission (CPUC) for wildfire-prone regions projected that undergrounding high-risk overhead lines could yield net savings of $2-5 billion statewide over decades by averting outage costs, which averaged $1-2 per kWh lost in 2017-2018 events. Reliability benefits further enhance economic viability; underground lines experience outage rates 10-20 times lower during storms, translating to avoided customer interruption costs estimated at $10,000-$50,000 per hour for commercial users in urban settings.
Quantitative cost-benefit ratios vary by geography and hazard exposure. In low-risk temperate zones, benefit-cost ratios often fall below 1:1, rendering undergrounding uneconomical without subsidies; a 2020 U.S. Department of Energy report cites ratios of 0.3-0.7 for standard urban retrofits. Conversely, in hurricane- or wildfire-vulnerable areas, ratios can exceed 2:1: after Hurricane Sandy (2012), New Jersey's undergrounding program achieved a 3:1 benefit-cost ratio by reducing annual outage costs from $1.5 billion to under $200 million in affected grids. These evaluations underscore that while upfront burdens deter widespread adoption, with only 10-20% of U.S. distribution lines undergrounded as of 2022, targeted applications in high-impact zones favor investing in resilience over minimizing installation cost.
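The benefit-cost ratios discussed above come from discounting avoided annual costs against the extra capital outlay. A minimal sketch of that calculation; every input below is an assumed, illustrative figure drawn loosely from the cost ranges in this section, not utility data:

```python
# Hedged sketch of a discounted benefit-cost ratio for undergrounding
# one mile of distribution line. All inputs are illustrative assumptions.

def npv(annual_amount, rate, years):
    # Present value of a level annual cash flow over the horizon
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

capex_delta = 2_000_000        # $/mile extra cost to underground (assumed)
avoided_outages = 120_000      # $/mile/yr avoided interruption costs (assumed)
avoided_upkeep = 30_000        # $/mile/yr avoided storm repair and trimming

rate, horizon = 0.05, 40       # discount rate and evaluation horizon (years)

benefits = npv(avoided_outages + avoided_upkeep, rate, horizon)
ratio = benefits / capex_delta
print(f"benefit-cost ratio: {ratio:.2f}")  # 1.29
```

With these assumptions the project barely clears 1:1 over 40 years; doubling the avoided outage costs, as a high-hazard coastal or wildfire zone might, pushes the ratio above 2:1, mirroring the geographic split described above.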
Critics, including utility economists, argue that simplistic models ignore indirect costs such as land-acquisition delays and construction disruptions, which inflated New York City's undergrounding efforts by 20-30% beyond projections in 2014-2019. Peer-reviewed engineering assessments, such as those in IEEE Transactions, emphasize that benefits accrue primarily from demonstrated outage reductions rather than aesthetic or speculative environmental gains, prioritizing verifiable metrics such as SAIDI improvements of 50-80%. Regulatory frameworks increasingly mandate such evaluations, with operators such as the UK's National Grid incorporating discounted cash flow models that show breakeven at 20-40 year horizons in coastal deployments.