Critical Manufacturing and Electricity - The Back Bone’s Connected to the Neck Bone
The Essentials, Fifth Edition
In the first edition of this blog, I noted that “the electric sector underpins every other essential industry sector, and it also relies on many of them. I…think of the overlaps like the Olympic rings – all interlinked, with some overlapping more than others.”
For the next several editions, I’ll continue to focus on each critical infrastructure sector in relation to the electric sector because electricity – which began to be deployed as a service close to 150 years ago – has enabled the progress, convenience and abundance that are hallmarks of modern life. Thereafter, I’ll get into the overlapping policy issues in more detail.
In this edition, I’ll discuss the evolution of the critical manufacturing sector and how inextricably it’s linked to electricity. Before I do that, I want to thank those of you who’ve commented on the blog thus far – whether relaying additional resources, disagreeing with a characterization I’ve made, or offering to help with translating engineering/technical elements. I appreciate the engagement by readers of the blog and hope it will continue.
Now on to critical manufacturing. As you can imagine, “manufacturing” is a huge topic, so I’ll focus on “critical” manufacturing and its relationship to electricity. I found this succinct definition of manufacturing on Investopedia.com:
Manufacturing is the process of turning raw materials or parts into finished goods through the use of tools, human labor, machinery, and chemical processing. It allows businesses to sell finished products at a higher cost than the value of the raw materials used. Large-scale manufacturing allows for goods to be mass-produced using assembly line processes and advanced technologies as core assets.
By contrast, here is the definition of critical manufacturing, as provided by the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA):
Critical Manufacturing is a sector that produces highly specialized parts and equipment that are essential to primary operations in several U.S. industries. It is crucial to the economic prosperity and continuity of the United States, and a direct attack or disruption of it could affect essential functions at the national level and across multiple critical infrastructure sectors.
Before I go into more specifics, a brief (and I promise, brief, this time!) step into the history of manufacturing.
Humans have been developing tools to help with hunting, foraging, and building for tens of thousands of years, but initially did so manually. Just like with transportation and water infrastructure, however, several breakthroughs occurred about 6,000 years ago in Mesopotamia and Egypt. The Mesopotamians invented key machines including the wheel, first used in making pottery. The Egyptians developed rudimentary paper made from papyrus as well as the corresponding production methods enabling exportation throughout what is now the Middle East. The Egyptians also developed bricks for building, enabling the creation of iconic structures like the Pyramids, still standing today.
Fast forward to the second century AD when paper-making became more robust based on Chinese inventions and their eventual assimilation elsewhere. Around 1,000 AD, the Song Dynasty in China was the first known government to issue paper money.
In Europe, a paper mill was first established in Sicily in the 12th century, and the subsequent invention of the spinning wheel led both to developments in textiles and to the ancillary development of printing. Heat was used in casting, with the blast furnace, initially developed by the Chinese about 2,400 years ago, not coming into widespread use in Europe (initially France) until approximately 600 years ago. The Europeans used this technology, initially at least, to build cannons.
While this progress was made during the Middle Ages, a major convergence of scientific, engineering, and analytical breakthroughs in manufacturing did not occur until the middle of the 1800s. Although instruments and machines had been used to aid manufacturing before then (the spinning wheel being a prime example), they were operated manually up to this point in history. The Industrial Revolution, taking place from roughly 1700 to 1930, is typically broken into a few phases to differentiate several key developments, but the hallmark of the era was the economical and widespread use of energy to power machines.
The first major manufacturing industry to fundamentally change the fabric (pun intended) of the world was that of textiles. The power loom and spinning jenny took clothing manufacturing out of homes and into factories, which ultimately enabled cotton (derived from a plant) to overtake wool (derived from an animal).
When engines (powered by coal) were combined with electricity (generated at the time from coal, hydropower, and oil), enabling both the lighting of factories and, eventually, the powering of machines, the stage was set for modern assembly lines. According to Inventionland.com, the first assembly line was patented in 1901 by Ransom Olds. Henry Ford improved on Olds’ methodology to mass-produce cars, which then led to assembly lines in many industries.
Since the Industrial Revolution, two other major developments in manufacturing have occurred. The first is the concept of “lean manufacturing” or “just-in-time manufacturing,” first developed by the Japanese in the 1930s, which made its way West in the 1970s and ‘80s. It was intended to reduce times between suppliers and customers, tightening the supply chain and reducing reliance on stockpiles. The second big development is the introduction of robotics. Interestingly, the first robot was tested in the 1920s – yes, 100 years ago! The concept was to mimic human precision and dexterity in manufacturing processes that still required human interaction. The first industrial robot was put to work by General Motors in 1961, and the robotic “arm” – aptly named the “Stanford Arm” and a forerunner of the articulated arms now seen throughout modern manufacturing – was invented in 1969 by a Stanford engineer named Victor Scheinman.
What is not readily apparent from this timeline is that the less manufacturing relies on human labor, the more energy it requires from elsewhere. And that energy must be increasingly reliable – the precision required to manufacture computer chips, for example, means that any power interruption can damage a chip or render its functionality questionable. Such manufacturers often install onsite backup power and/or additional infrastructure from their electric utilities to get as close to 100 percent reliability as possible.
Conversely, electric utility infrastructure is heavily dependent on critical manufacturing – the sector that takes raw materials and creates steel transformers, copper wires, and wood poles. Electric utilities also rely on telecommunications technologies – circuitry, fiber-optic cable, wireless radio frequency transceivers, etc. And, in an increasingly digitized world, most electric utilities rely on computers, specialized software, and sensors that enable both greater situational awareness on electric grids as well as data collection and analysis (which helps with managing the increasingly granular levels of demand, or load – think EVs, Tesla home batteries, and heat pumps).
While I won’t yet go into detail on current policy matters, I am deliberately digressing here to observe a few things. First, the lean manufacturing practices adopted over the last 40 years have enabled greater efficiencies, resulting in more competitive prices. The downside, however, has been less reliance on stockpiles and, therefore, potential exposure when domestic manufacturers have folded due to offshore competition or when the supply chain is disrupted, as it was during the pandemic. In my opinion, more serious conversations are needed between critical manufacturers and the electric sector to understand these tipping points and how we can avoid problems in the future. The distribution transformer shortage in this country is serious and unresolved. Electric vehicles, which require many of the same inputs as electric infrastructure does, cannot run without electricity (hence the name “electric” vehicles – just sayin’). We have been able to tackle similar issues head on during wartime and should be able to do so now.
Off my soapbox now, here are some other ways that critical manufacturers and electric utilities overlap:
Aging infrastructure. Per a great comment from a reader, not all older infrastructure is problematic – if maintained correctly, some can last decades (most types of generation facilities are in this category, and some poles and wires as well). Some does need replacing, however, and customers can be reluctant to fund the capital investments needed to upgrade, especially given current inflationary trends.
Codependency. If electricity is less reliable, then critical manufacturing will be as well. And vice versa. How is reliability measured in each industry? (A rough sketch of common metrics follows this list.)
Environmental regulation. Both face significant regulation on their impacts on water, air and land – and should continue to collaborate to educate regulators and policy makers on their needs.
Workforce challenges and the knowledge drain that has resulted from retirements in recent years.
Supply chain constraints that impact every aspect of infrastructure deployment and maintenance.
How to best use technology to create efficiencies and minimize expenses.
How to manage the cybersecurity risk that comes with those technology deployments.
Given the workforce challenges, how to hire skilled workers who can understand both technology and the infrastructure itself.
What about physical security challenges?
How will the two sectors deploy AI – are there overlaps?
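On the reliability question above, a minimal illustration may help. Electric utilities commonly report indices from IEEE Standard 1366, such as SAIFI and SAIDI, while manufacturers often track Overall Equipment Effectiveness (OEE). The short Python sketch below shows how these metrics are typically computed; the outage records and factory figures in it are hypothetical, and the calculation is illustrative rather than any particular utility’s or manufacturer’s method.

```python
# Illustrative sketch: two common reliability yardsticks.
# Electric utilities typically report IEEE 1366 indices such as SAIFI and SAIDI;
# manufacturers often track Overall Equipment Effectiveness (OEE).
# All numbers below are hypothetical.

# Each tuple: (customers interrupted, outage duration in minutes)
outages = [(1_200, 90), (300, 45), (5_000, 15)]
customers_served = 50_000

saifi = sum(c for c, _ in outages) / customers_served            # interruptions per customer per period
saidi = sum(c * mins for c, mins in outages) / customers_served  # outage minutes per customer per period
caidi = saidi / saifi                                            # average minutes to restore an interrupted customer

# Manufacturing-side view: OEE = availability x performance x quality
availability = 430 / 480   # run time / planned production time
performance = 0.92         # actual vs. ideal cycle speed
quality = 0.98             # good units / total units produced
oee = availability * performance * quality

print(f"SAIFI={saifi:.2f}, SAIDI={saidi:.1f} min, CAIDI={caidi:.1f} min, OEE={oee:.1%}")
```

The point of putting the two side by side is simple: a long SAIDI on the utility side shows up directly as lost availability, and therefore lower OEE, on the factory floor.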
I firmly believe that these two sectors can and should link arms and figure out how to enable successful outcomes for both industries – getting in front of potentially problematic policies by first educating each other and then policy makers.