Mark Twain is reputed to have said:
History does not repeat itself, but it rhymes.
I was reminded of that this week, though in this case it not only rhymes but uses the same letter: G.
This week, Comcast was called out for advertising its “Xfinity 10G Network,” which mostly does not support 10 gigabits per second (Gbps). Never mind that even if it did, such speeds would probably not be practical for most households, which would struggle to consume even a few hundred megabits per second while streaming several ultra-high-definition movies simultaneously.
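To put that in perspective, here is a rough back-of-the-envelope calculation. It assumes roughly 25 Mbps per ultra-high-definition (4K) stream, a commonly cited streaming recommendation; actual bitrates vary by codec and service:

```python
# Back-of-envelope: how many simultaneous 4K streams fit in a link?
# Assumes ~25 Mbps per 4K stream (a commonly cited recommendation;
# real-world bitrates vary by codec and streaming service).

MBPS_PER_4K_STREAM = 25

def max_streams(link_mbps: int, per_stream_mbps: int = MBPS_PER_4K_STREAM) -> int:
    """Number of simultaneous streams a link can carry at a given per-stream bitrate."""
    return link_mbps // per_stream_mbps

# A 10 Gbps (10,000 Mbps) link could in principle carry 400 such streams.
print(max_streams(10_000))        # 400
# A household running five 4K streams at once uses only ~125 Mbps.
print(5 * MBPS_PER_4K_STREAM)     # 125
```

By this math, even a heavy streaming household would use barely one percent of a true 10 Gbps connection.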
Clicking through to the actual plans revealed a maximum of 2 Gbps, which is probably more than sufficient for most users. Comcast’s advertising also made questionable claims about reliability, which probably matters more to most users, particularly those who work from home.
Unabashed vendor hype is nothing new, particularly in the tech industry. Perhaps somewhat ironically, Comcast’s biggest critic in this case, T-Mobile, played a seminal role in the marketing rebranding of 4G over a decade ago. It may surprise you to learn that the modern definition of 4G was born not in a lab or a standards group but in a Jamaican courtroom.
The Comcast kerfuffle
The current Comcast controversy emerged under the auspices of BBB National Programs’ National Advertising Division (NAD), a US group that works with companies, industry experts, and trade associations in a self-regulatory environment to foster industry best practices in truth in advertising. On October 12th, the NAD determined:
[Comcast had] provided a reasonable basis for its ‘Next Generation’ claim for its ‘Xfinity 10G Network,’ as well as the implied claim that it has already achieved a major technological revolution.
However, NAD also recommended that Comcast discontinue its 10G claims or at least qualify them as aspirational. It also recommended that Comcast discontinue advertising claiming its services would continue running during a power outage, which is not currently technically feasible.
The story started in February when Comcast adopted the new 10G marketing terminology to bring awareness to a new technology roadmap. The NAD concluded that consumers would understand 10G to mean either 10 Gbps speeds or a 10th-generation network. However, only one Comcast plan could reach 10 Gbps, and it required a costly fiber upgrade available in only a few markets.
The NAD also found that Comcast did not provide a reasonable basis to support the claim that the new service was vastly superior to 5G wireless networks. The NAD did say it would allow Comcast to continue with the new branding as long as it explicitly noted the claims were aspirational.
G rhymes with G
The birth of commercial 4G services goes back to a marketing controversy in a Jamaican courtroom in 2011. It all started when local telco Digicel sued upstart LIME for using the term 4G to describe a new phone service built on WiMAX (a wide-area version of Wi-Fi). This new “4G” service was only marginally faster than the 3G services of the time, and it was nowhere close to the engineering community’s current expectations for 4G.
The trouble was that no one had ever defined what exactly 4G was. It was a sort of aspirational vision for whatever came after 3G. The International Telecommunication Union (ITU) was leading efforts to define various wireless standards like HSPA+, CDMA, EDGE, and LTE Advanced for different telco architectures but left the “G” branding to the telcos. It probably did not help matters that Apple had started including ‘G’ in the names of its phones, like the iPhone 3G, which did not support the full speeds of the fastest 3G services available at the time.
An essential argument in the Jamaican case was that then ITU Press Secretary Sanjay Acharya had said in a press conference that no 4G service had been deployed at the time. He later told me that the ITU had not defined 4G and never had a mandate to do so. The problem was that the ITU press team was only supposed to talk about official standards.
To help settle the matter, the ITU issued a follow-up press release saying:
It is recognized that [4G], while undefined, may also be applied to the forerunners of these technologies, LTE and WiMax, and to other evolved 3G technologies providing a substantial level of improvement in performance and capabilities with respect to the initial third generation systems now deployed.
Within weeks, telcos worldwide took the ball and ran with it. The previous engineering consensus was that 4G would support 100 megabits per second (Mbps) speeds. But shortly after the ITU made clear that 4G was undefined, T-Mobile, Verizon, and Sprint all rebranded existing offerings as 4G services that maxed out at 3-12 Mbps.
At the time, Will Strauss, a now-retired industry analyst at Forward Concepts, told me:
The 4G architectures discussed five years ago are not the same 4G that people are talking about today with HSPA+. As it is not a standard, there is nothing wrong with that other than the fact that carriers have obfuscated its meaning for the sake of advertising. The term ‘4G’ is now meaningless. There is no technical definition. There is only a marketing definition, which is anything faster than 384 kbps.
Strauss also observed that marketing rebrandings of existing technology were nothing new: Verizon had previously touted a network upgrade as 3G even though it operated at the same speeds as AT&T’s much larger 2.5G network.
Depending on how the current Comcast case proceeds, it’s reasonable to imagine that other telco vendors may similarly begin rolling out “new” 10G or, heck, even 100G services, with the “G” referring to the generation of software patches meant to fix bugs.
I am not even sure how anyone might practically consume these higher levels of bandwidth. And at least in the US, operators are renowned for imposing usage caps on these services, which penalize customers who find creative ways to make the most of the bandwidth they may be led to believe they have already paid for.
But the fundamental problem with all this marketing is that these upgrades gloss over the lower reliability and availability that come with every new generation of technology. In the last month, my high-speed broadband network has gone offline several times, sometimes for an hour or more. This has been pretty frustrating for Zoom calls, which require well under a megabit per second.
Innovations in wireless networks are even worse. In theory, 5G wireless networks can support up to 20 gigabits per second, although carriers suggest we can reasonably expect 100 megabits per second. That’s when they work. The problem is that these faster speeds often use much higher frequencies, which get blocked by buildings and trees more easily. And sometimes, when the 5G network cuts out, my phone won't automatically switch over to a more reliable 4G or even 3G network.
For the most part, things seem to work better when I turn off 5G by default. At least when it works, it's fast enough for everything I need.