Definitions matter. Especially when those definitions come from the federal government. In the case of “broadband,” the definition set by the federal government creates our standard of Internet living. Depressingly, the American government’s definition means ISPs get away with offering very poor levels of “broadband.”
The Federal Communications Commission (FCC) is the agency responsible for defining broadband. The metric they set forms the basis of determining whether the government can say that a household has access to broadband Internet.
Today, that metric is 25 megabits per second (25 Mbps) download and three megabits per second (3 Mbps) upload. Based on that metric, the FCC’s most recent broadband deployment report concluded that broadband is being deployed to all Americans in a reasonable and timely fashion. Mission accomplished; we have solved all the problems of Internet access, right? Obviously not. No one thinks we have actual universal broadband in the United States today. The needs of Internet users long ago surpassed the FCC’s 25/3 metric. It’s possible the metric was out of date from the moment it was established.
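To see how quickly a household outgrows 25/3, consider a rough back-of-the-envelope calculation (the per-application figures below are approximate estimates based on commonly published requirements, not FCC numbers):

One HD video call: roughly 2–3 Mbps upstream, by itself nearly the entire 3 Mbps upload allowance.
Two simultaneous calls (one adult working, one student in class): roughly 4–6 Mbps upstream, about double what 25/3 permits.
Downstream, those two calls plus a single HD video stream (roughly 5 Mbps) and ordinary web use can consume half of the 25 Mbps before anyone starts a large download or a 4K stream (roughly 25 Mbps on its own).

In other words, one household doing ordinary things can exhaust the upload side of the standard almost immediately.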
In short, the FCC’s 25/3 metric is not just useless; it is actively harmful. It masks the rapid monopolization of high-speed access occurring in the United States and obscures the extent to which low-income neighborhoods and rural communities are being left behind. It also papers over the failure of our telecom policy to promote universal broadband. That failure can’t be masked during this pandemic, when millions of Americans are experiencing it firsthand as they try to work, learn, and entertain themselves from home.
How Does This Metric Get Established?
In the last ten years, the FCC has updated its broadband deployment metric twice. The first update came in 2010, when the agency moved from 200 kilobits per second (200 Kbps) symmetrical (a threshold originally established in 1999) to four megabits per second (4 Mbps) download and one megabit per second (1 Mbps) upload. The second came in 2015, when it raised the 4/1 standard to today’s 25/3.
In theory, the FCC sets these standards by studying current consumer needs and the ever-growing demand for bandwidth. If the FCC finds that the current standard is not keeping up with the evolution of networks, advancements in technology, and general consumer expectations, the agency is supposed to update its metric. In practice, the standard has kept pace with none of these.
Section 706 of the Telecommunications Act of 1996 commands the FCC to annually assess the state of broadband deployment in America and, if it finds that “advanced telecommunications capability” is not being deployed to all Americans in a “reasonable and timely fashion,” to take “immediate action to accelerate deployment” by “removing barriers to infrastructure investment and by promoting competition.” In short, if the government’s own data shows that our telecom policy is failing, the FCC is required to rethink its approach and try something to fix it. For the last three years, the FCC has instead declared that everything is fine, because everyone is supposedly getting broadband at the 25/3 metric in a reasonable and timely fashion.
The 25/3 Metric Was Potentially Behind the Curve in 2015
By 2015, a small revolution in broadband access was already underway in fiber to the home. The city of Chattanooga, which deployed the nation’s first gigabit symmetrical fiber network in 2011, had cost-efficiently upgraded its network to 10 gigabits per second (10 Gbps) symmetrical without any major new investment. This was possible because fiber network equipment had advanced so rapidly that the exact same fiber wires laid years earlier could deliver ten times the capacity. As our technical analysis explains, this ability to keep getting faster as hardware improves is unique to fiber, and it is the central reason why all telecom policy should be oriented around universal fiber to the home deployment. In recognition of the benefits community networks were yielding, the FCC did attempt to roll back bans on local government broadband in 2015 as part of its overall effort to fulfill its Section 706 obligation to improve broadband deployment (though the FCC ultimately lost in court).
But the 2015 decision to define broadband as 25/3 was not focused on assessing which communities had fiber and which did not. It reflected the minimum speed necessary to use the applications and services available that year. Even this minimalist approach was opposed by incumbents such as AT&T and Verizon, who argued for retaining the old 4/1 standard. Most of the opposition the FCC faced asserted that 25/3 focused only on “heavy users” and that such a metric was “too high, excessive, or purely aspirational,” despite 25/3 service already being available to 83 percent of Americans at the time. If the FCC had wanted a minimum metric that actually distinguished who has fiber and who does not, the threshold would have needed to be low-latency, 100 megabits per second symmetrical service.
It is clear that the big, old ISPs would have opposed any change that raised the standard for broadband. It is likewise clear that these speeds were not “purely aspirational,” as we’ve seen in the years since, with people moving online for school, work, and community. In fact, the 25/3 metric is downright slow by today’s standards and needs, and is approaching obsolescence.
How the 25/3 Metric Masks Our Growing Broadband Monopoly and Lack of Access Problem
The federal government still assesses our competitive landscape with the 25/3 metric. This metric furthers the absurd notion that slow DSL lines, satellite broadband, cable, wireless, and fiber to the home all count as equal options for broadband access; that is, that the quality of Internet life of someone on an ancient DSL line is equivalent to that of someone who can get fiber. But only cable and fiber to the home actually deliver the high speeds needed for video conferencing, cloud computing, and other major applications and services that demand a robust upload speed.
Some wireless solutions may join the fray with the latest spectrum policy decisions from the FCC, but ultimately even those wireless towers will need fiber. Between cable and fiber to the home, only fiber is capable of scaling into the multi-gigabit era and beyond in a cost-effective way. The same kind of ten-fold advancement Chattanooga saw in 2015 will happen again with fiber, while cable systems struggle to improve at the same pace. And while cable systems lurch their way into the multi-gigabit era, the other legacy options will never reach that destination at all.
We call this gap in future potential between broadband networks the “speed chasm”: not all broadband options are created equal, due to plain old physics. Fiber has an extreme amount of capacity in the wire itself, while copper, coaxial, and wireless options must contend with physical barriers inherent in their mediums. As consumer demand continues to rise, these legacy options will fall out of favor much as dial-up did in the past and DSL is doing today. This coming obsolescence of older networks is why spending money on slow legacy networks today is a giant waste, and why any new network must have fiber at its core in order to remain useful for an extended period of time. But so long as we consider 25/3 sufficient to meet today’s needs, the FCC remains ill-equipped to address the rapid decline of legacy networks, as well as the general failure of fiber networks to replace them across the country, leaving most people with a cable monopoly, or with nothing.
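To get a rough sense of the physics behind that chasm, it helps to look at Shannon’s capacity formula (the spectrum figures below are approximate, offered for illustration rather than drawn from any FCC data):

C ≈ B × log2(1 + S/N)

A link’s maximum throughput C grows with the usable spectrum B of the medium. The twisted-pair copper that DSL runs over offers on the order of tens of megahertz, a modern coaxial cable plant roughly a gigahertz, and the low-loss optical bands of a single strand of fiber on the order of tens of terahertz. That is thousands of times more raw bandwidth, which is why simply swapping the electronics at each end of the same glass, as Chattanooga did, keeps yielding order-of-magnitude speed gains that copper and coax cannot match.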
A Better Way to Assess Broadband
We at EFF support more regular and rigorous means of assessing broadband, as detailed in our recent FCC filing. Rather than waiting an indeterminate amount of time for the FCC to acknowledge that the United States has a real problem, the process should be designed so that every two to three years we assess, using publicly available data on user behavior, what level of broadband is needed to meet the growth in consumption. Five years and counting is an abysmally long time to wait.
We also need to begin assessing the future potential of networks to stay ahead of demand, in order to weed out legacy networks that are no longer relevant. Determining the likely rate of obsolescence is critical to the federal policy goal of maintaining universal broadband access, because it flags where new investment is lagging or completely absent. And perhaps most importantly, we need to start assessing the prices ISPs charge for broadband, to see where consumers are being gouged due to a lack of competition. A recent study by the Open Technology Institute found that the United States has the most expensive and slowest broadband networks among advanced economies. Our median upload speed today is 15 Mbps, while the EU’s is 40 Mbps, and Asia enjoys an eye-popping 500 Mbps thanks to aggressive fiber policies.
There is no good reason why the United States is not a world leader in broadband. Efforts that would reestablish our leadership are underway, such as the House of Representatives’ universal fiber plan and state efforts to create fiber infrastructure programs. But so long as we hold onto useless metrics like the 25/3 federal definition of broadband as the measure of our progress, we will never even take the first step.