Nicholas Carr was right - IT died, but was resurrected
Summary: The 'does IT matter' meme has resurfaced. It's an important discussion.
Carr was less overtly confrontational in his subsequent book, where he (or his editors) attempted to assuage professional sensitivities by turning the statement into a question. However, his fundamental thesis was unchanged (emphasis added),
That IT’s strategic importance is not growing, as many have claimed or assumed, but diminishing. As IT has become more powerful, more standardized, and more affordable, it has been transformed from a proprietary technology that companies can use to gain an edge over their rivals into an infrastructural technology that is shared by all competitors. Information technology has increasingly become, in other words, a simple factor of production — a commodity input that is necessary for competitiveness but insufficient for advantage.
Note that this was published in 2004, when AWS was just being hatched as a public service and the boom in cloud services, virtualization and the myriad open source projects that created the foundations for containers, big data analytics, machine/deep learning and cloud-native development platforms was still a decade away.
It wasn't until four years later, in his next book, The Big Switch, that Carr discussed the cloud as an emerging information utility. However, it's doubtful even he could have predicted the rapid, democratizing spread of such developments, where the smallest startup with a credit card can tap a planetary-scale computer with capabilities once reserved for government labs and intelligence agencies.
None of this sat well with IT practitioners, who rightly saw a career-stifling threat to their roles as the high priests of technology and gatekeepers to business systems. A favorite argument at the time, which still gets recycled, contended that the business use of IT was so varied that it couldn't possibly be turned into a fungible commodity. However, Carr clearly stated that he was referring to information infrastructure, the purchase, care and feeding of which constituted the vast majority of IT spending and personnel at the time (p. xii, emphasis added).
I use ‘IT’ in what I believe is its commonly understood sense today, as denoting all the technology, both hardware and software, used to store, process, and transport information in digital form. The meaning of ‘IT’ does not encompass the information that flows through the technology or the talent of the people using the technology.
I was reminded that Carr's critique still rankles by Jon Reed's Hits and Misses column, in which he discusses a piece by Stephen O'Grady entitled IT Matters Again: The Enterprise of The Future Present. O'Grady dredges up Carr's original work only to conclude, equivocally, that Carr was both right and wrong, and that it's all irrelevant now anyway since we've semantically moved the goalposts that delineate the role and meaning of IT. Writes O'Grady,
The implication of which is that while all IT may not matter, some may literally spell the difference between organizational life and death. In that sense, then, it’s safe to say we’re living in a post-‘Does IT Matter?’ world.
Employment numbers signal change
The intervening decade-plus couldn’t have proven Carr more right: the traditional enterprise IT role as the builder and maintainer of an organization’s information infrastructure and application systems is dying on the altar of shared, metered, on-demand as-a-service products that provide everything from basic compute and storage infrastructure to sophisticated packaged applications.
While not dinosaurs doomed to extinction, IT operations teams are being relegated to caring for legacy systems, procuring cloud services and configuring remote, rented infrastructure and software.
A core argument for any utility, whether it's the distribution of water and power or IT infrastructure, is the ability of large, integrated providers to operate and distribute commodity or interchangeable services from central plants more efficiently than individual consumers can do on their own. Given that one of the largest components of IT costs is labor, if the utility theory of IT as embodied by cloud service providers is true, then overall IT employment should be declining. Let's look at the numbers.
The most recent CompTIA survey of IT jobs found that the number of information technology jobs at U.S. companies across all sectors of the economy dropped by 90,000 in June, marking the fourth straight month of declines. However, there's an interesting dichotomy, as CompTIA notes in its press release (emphasis added),
Today’s report includes some contradictory signals in IT employment. While the IT sector added jobs, IT employment across all sectors of the economy was down 90,000 jobs.
Thus, while companies specializing in IT products and services (the sector) added jobs, IT employment writ large, predominantly practitioners in other businesses, lost jobs. CompTIA attempts to soften the blow by saying that the demand for technology skills remains strong, which is spin of the highest order since it lumps data scientists and system administrators into the same category of "skilled" worker.
To validate the apparent trend indicated in CompTIA's data, let's look at long-term employment data from the U.S. Bureau of Labor Statistics (BLS). Its market-moving monthly employment report is one of the most significant economic indicators and includes data going back decades. Comparing the total employment numbers for all workers versus those in the information sector (as defined here) shows a disturbing divergence if you're an IT professional. (Unfortunately, the BLS doesn't provide permalinks for extracted data, but the datasets are available here.)
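For anyone who wants to reproduce the comparison rather than click through the BLS site, a minimal sketch follows. It assumes the standard CES series IDs for seasonally adjusted total non-farm employment (CES0000000001) and the information sector (CES5000000001), and depending on the date range you request you may need a free BLS registration key.

```python
import requests

# Sketch: pull seasonally adjusted employment series from the BLS public data API (v2).
# The series IDs below are assumed to be the standard CES codes for total non-farm
# employment and the information sector; adjust them if your definition differs.
SERIES = {
    "CES0000000001": "Total non-farm employment",
    "CES5000000001": "Information sector employment",
}

payload = {"seriesid": list(SERIES), "startyear": "2008", "endyear": "2017"}
# Add "registrationkey": "<your key>" to the payload if the API rejects the request.
resp = requests.post(
    "https://api.bls.gov/publicAPI/v2/timeseries/data/",
    json=payload,
    timeout=30,
)
resp.raise_for_status()

for series in resp.json()["Results"]["series"]:
    # Each observation carries a year, a month code (e.g. "M06") and a value in thousands.
    obs = sorted(series["data"], key=lambda d: (d["year"], d["period"]))
    first, last = float(obs[0]["value"]), float(obs[-1]["value"])
    change = (last - first) / first * 100
    print(f"{SERIES[series['seriesID']]}: {first:,.0f}k -> {last:,.0f}k ({change:+.1f}%)")
```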
Comparing seasonally adjusted data for total non-farm employment with that for the information sector since 2008, i.e. just as the U.S. entered a severe recession, shows a significant divergence. While total employment dropped about 6.3 percent peak to trough, the information sector fell more than twice as much, almost 13 percent. Furthermore, total employment bottomed out 18 months before the information sector's nadir. However, the most telling indicator of change afoot in the IT business is that information sector employment never fully recovered to its pre-recession level and is in fact already declining again, even as the overall jobs market continues its vigorous expansion. Indeed, total employment is now about 7.6 percent higher than its 2008 level.
The dichotomy between IT employment and the overall jobs market is starkly evident when one looks at the ratio between the two. Starting at almost 2.2 percent of the jobs economy, the information sector has been on a steady decline, dropping 0.3 points, or almost one-sixth, to less than 1.9 percent. Looked at another way, if the information sector currently constituted the same share of the jobs market as it did in 2008, there would be an additional 486,000 jobs in IT and related work.
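As a quick check on that last figure, the arithmetic is simply the change in the sector's share multiplied by today's total employment. The sketch below uses the rounded shares quoted above and an assumed current non-farm employment figure, so it only approximates the 486,000-job gap derived from the exact BLS series.

```python
# Back-of-the-envelope version of the share-gap arithmetic above.
# Inputs are rounded stand-ins, not the exact seasonally adjusted BLS values,
# so the result only approximates the ~486,000-job gap cited in the text.

total_nonfarm_now = 146_600_000   # assumed current total non-farm employment
info_share_2008 = 0.022           # information sector's share of all jobs in 2008 (~2.2%)
info_share_now = 0.019            # information sector's share today (~1.9%)

# Jobs the sector would hold today if it had kept its 2008 share of the market.
jobs_gap = (info_share_2008 - info_share_now) * total_nonfarm_now
print(f"Implied shortfall: {jobs_gap:,.0f} jobs")   # roughly 440,000 with these rounded inputs
```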
My take
Thoroughly tracing the roots and ramifications of the changes in IT sector employment is worthy of a business school term paper, but two of the primary catalysts are the rise of rentable cloud infrastructure and application services, along with the related evolution of highly automatable, software-defined infrastructure.
Together, these technologies have devalued most traditional IT roles in systems administration, facilities operations and reactive problem resolution. Such marginalization of vast swaths of the IT landscape has forced the discipline to thoroughly re-examine and profoundly change its mission.
IT was once full of the technology equivalent of construction and maintenance workers, but those jobs have been eliminated by automation, utility service providers and integrated hardware-software products. For example, in a recent survey, CIOs reported that 18 percent of their workloads were running in public clouds, with plans to almost double that share to 34 percent by next year. The Goldman Sachs analysts conducting the survey seemed surprised by the accelerating rate of cloud adoption, noting that over the previous three years the workload share had increased by only 4 points.
Tomorrow's IT, namely the organization at the foundation of new digital business opportunities, aka digital transformation, requires engineers, architects, data scientists, systems analysts and operations research specialists. The crux of O’Grady’s critique of Carr’s thesis is an assumption that such high-level skills were always central to the role of IT.
If narrow [sic] definition is used and IT is taken to mean nothing more than base infrastructure, then Carr’s viewpoint remains correct. If, however, the definition of IT encompasses the entirety of an organization’s technology portfolio and strategy, however, the assertion that IT doesn’t matter could not be less accurate today.
Unfortunately, the dramatically shifting IT job market and the wrenching organizational and personal transformations required to make a new IT that matters again belie that contention. It's easy to assume that the digital transformationists' mantra that "IT is the business" has always been an obvious and universally acknowledged fact, but that assumption conveniently ignores the history of IT as a defined business category and the significant changes happening across its organizational hierarchy, from CIOs to front-line technicians.