Information Technology and Electricity – The Power Couple

In the first edition of this blog, I noted that “the electric sector underpins every other essential industry sector, and it also relies on many of them. I…think of the overlaps like the Olympic rings – all interlinked, with some overlapping more than others.”

Since May 2023, I’ve focused on each critical infrastructure sector in relation to the electric sector because electricity, which began to be deployed as a service close to 150 years ago, has enabled the progress, convenience and abundance that are hallmarks of modern life. Once the series wraps up, I’ll get into the overlapping policy issues in more detail.

For this final edition of the series (woo hoo, we made it!), I’ll cover the IT Sector.  But first, for those of you who’ve been keeping track, you may have noticed that I’ve written 19 newsletters about 16 critical infrastructure sectors. I’m not a math whiz, but I know that doesn’t add up.  Well, the first edition was an overview/intro, so the meat of the topic didn’t get started until the second edition. I added mining as the 17th critical infrastructure sector because, to me, it makes no sense that it isn’t included on the federal government’s list (feel free to comment, but only if you’ve read my overview of the mining sector in edition seven). Then, I split up the emergency services sector into two editions because there was just too much to cover in that disparate sector. Voila. After this edition, I’ll home in on the policy issues I’ve raised and delve into other areas as well – all still focused on the 17 critical infrastructure sectors.

Onward to the IT Sector! The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) defines this sector as follows:

“The Information Technology Sector is central to the nation's security, economy, and public health and safety as businesses, governments, academia, and private citizens are increasingly dependent upon Information Technology Sector functions. These virtual and distributed functions produce and provide hardware, software, and information technology systems and services, and—in collaboration with the Communications Sector—the Internet. The sector's complex and dynamic environment makes identifying threats and assessing vulnerabilities difficult and requires that these tasks be addressed in a collaborative and creative fashion.

Information Technology Sector functions are operated by a combination of entities—often owners and operators and their respective associations—that maintain and reconstitute the network, including the Internet. Although information technology infrastructure has a certain level of inherent resilience, its interdependent and interconnected structure presents challenges as well as opportunities for coordinating public and private sector preparedness and protection activities.”

Hmmm…does this seem a bit nebulous to you as well? Given how specific the definitions of the other sectors are, I’m scratching my head about this one. For me to proceed, I need to know precisely what I’m researching. So, for the first time in this newsletter…drumroll please…I’m going to use a definition other than the one provided by CISA. This one is from Wikipedia. It’s lengthy, but I like it:

Information technology (IT) is a set of related fields that encompass computer systems, software, programming languages and data and information processing and storage. IT forms part of information and communications technology. An information technology system (IT system) is generally an information system, a communications system, or, more specifically speaking, a computer system — including all hardware, software, and peripheral equipment — operated by a limited group of IT users, and an IT project usually refers to the commissioning and implementation of an IT system…The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several products or services within an economy are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, and e-commerce.

Looking at this definition, it’s clear why CISA struggled to define the sector. IT overlaps so significantly with the communications and critical manufacturing sectors that it’s difficult to identify where one ends and the others begin. For our purposes here, and since we covered the communications sector in the fourth edition of this newsletter and the critical manufacturing sector in the fifth, we’re going to focus on the history of computers and the digital technology that enabled the networking of those computers.

History of Computing

Let’s start with a definition of computing from Wikipedia (I looked for others, but liked this best):

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery. It includes the study and experimentation of algorithmic processes, and development of both hardware and software. Computing has scientific, engineering, mathematical, technological, and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, and software engineering.

According to Britannica, the earliest known device created for making calculations is the abacus, likely developed about 4,300-4,700 years ago by the Sumerians in Mesopotamia (located in modern-day Iraq). The Sumerian abacus could be used for basic calculations such as addition, subtraction, multiplication, and division. Also according to Britannica, some historians believe that its creation, combined with the Indian discovery of the number zero, formed the basis of the Hindu-Arabic numeral system we use today.

About 3,000 years ago, the Chinese developed a south-pointing chariot that used a differential gear. The chariot served as a mechanical compass: no matter which way it turned, the pointing statuette on top always pointed south. This concept resurfaced much later in the development of analog computers. Circa 2,300 years ago, the Greek mathematician and physicist Archimedes used the mechanical principle of balance to make large calculations. Wikipedia also notes that the earliest geared computing device uncovered to date is the Antikythera mechanism, used to calculate astronomical positions about 2,100 years ago.

From these early days, it took roughly another 1,700 years to make significant progress in using machines for more advanced calculations. While some scientists and philosophers during the Middle Ages contemplated mechanized devices, and Muslim mathematicians made advances in cryptography (methods to secure communications) during the same period, it wasn’t until the early 1600s that major leaps forward began to occur. The Scottish mathematician John Napier came up with the idea of “logarithms,” which simplified the multiplication of large numbers and were quickly put to use in astronomical calculations. Soon thereafter, in 1620, the English mathematician Edmund Gunter created a device to make these logarithmic calculations, the Gunter scale. It was used widely in navigation and led to the development of the slide rule.
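To see why this was such a leap, here’s a quick worked example of my own (the numbers are just an illustration, not from Napier’s tables). Logarithms turn multiplication into addition:

\[
\log_{10}(3{,}600 \times 5{,}200) = \log_{10} 3{,}600 + \log_{10} 5{,}200 \approx 3.5563 + 3.7160 = 7.2723
\]

Looking up the antilogarithm of 7.2723 in a table gives roughly 18,720,000, which is the product, with no long multiplication required.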

While Leonardo da Vinci had sketched a design for a calculator a century earlier, the next attempt to design and build a calculating machine was made by the German professor Wilhelm Schickard, who described his design to the astronomer and mathematician Johannes Kepler in 1623. Schickard died during the Thirty Years’ War before finalizing his machine, however, and the next person to take up the mantle was the Frenchman Blaise Pascal in the 1640s. According to Britannica, Pascal successfully built, and sold, the first working calculating machine.

In 1671, the German mathematician Gottfried Wilhelm von Leibniz built on Pascal’s design with his “Step Reckoner,” which was able to multiply rather than just add and subtract. Leibniz was a strong proponent of binary numbering, given the simplicity of denoting a number as either “on” or “off,” though his machine itself did not use binary numbers. Much later, in the 20th century, binary numbering married perfectly with electronics: at any instant, a circuit can be read as either “on” or “off.”
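To make that binary-to-electricity connection concrete, here is a tiny sketch of my own in Python (the function name and sample values are purely illustrative, not from any source) that reads each binary digit of a number as a switch that is either on or off:

# Each binary digit maps to a switch that is either "on" (1) or "off" (0),
# the same idea an electrical circuit expresses with current flowing or not.
def as_switches(value: int) -> str:
    bits = format(value, "b")  # e.g. 13 -> "1101"
    return " ".join("ON" if b == "1" else "OFF" for b in bits)

for n in (2, 13, 100):
    print(f"{n:>3} -> {format(n, 'b'):>7} -> {as_switches(n)}")

Strings of those on/off states are, at bottom, all a digital computer stores or manipulates.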

While these mathematicians advanced the idea of building machines to extend human capabilities, especially for repetitious or complicated calculations, the commercial need didn’t arise until the 1800s, at which point innovation accelerated. During the early Industrial Revolution, rapid innovation occurred in nearly every field I’ve discussed in this newsletter, and machines enabled that progress. These strides highlighted the need for ever greater calculating capacity, and it was also during this period that engineers and scientists began to envision computing as more than just calculation.

In 1820, the Frenchman Charles Xavier Thomas de Colmar created the arithmometer, a commercial-grade calculator that remained in use for roughly 90 years. Before that, in 1804, Joseph-Marie Jacquard created a loom that, a century later, had implications for what we think of as modern computers: his loom was the first programmable machine, and programming is one of the four essential elements of a modern computer. Jacquard used punched cards fed into the loom to direct which patterns the machine wove. Fascinating.

In the 1820s in England, the brilliant mathematician Charles Babbage convinced the British government to fund the research and development of a machine far more sophisticated than anything then available for calculating the astronomical tables needed for navigation. His “difference engine” operated digitally, representing the decimal digits 0-9 with toothed wheels and advancing a neighboring wheel when a full rotation was completed. It was also unique in that it could evaluate complex equations and it had a storage mechanism, the second essential element of the modern computers that followed a century later.

Unfortunately, the construction limitations of the time meant that the machine was never fully completed. Babbage nevertheless envisioned another, even more ambitious machine, the “analytical engine,” which would consist of four parts: the mill, the store, the reader and the printer. These map closely onto the four essential components of modern computers, with the mill being the central processing unit, the store the data storage, the reader the input and the printer the output. It was also fully programmable in design.

According to Britannica, Lady Ada Lovelace, a brilliant mathematician herself and of famous parentage, corresponded with Babbage about his engine and was perhaps the only one who fully understood its implications, as she demonstrated in a paper published in 1843, when she was 27 years old. Lady Lovelace noted that, because the machine operated on general symbols rather than on numbers, it established “a link…between the operations of matter and the abstract mental processes of the most abstract branch of mathematical science.” Babbage’s biggest contribution was conceiving that such a machine was possible at all; his second biggest was building storage into its design. Lady Lovelace understood the concept and its application so well that she could program the machine, thus becoming the first known computer programmer.

Even with these advances, Babbage and Lovelace were at least a century ahead of their time, and the analytical engine was never ready for broad commercialization. The Western world instead turned its attention to more specialized and precise calculating machines, with a series of innovations throughout the 1800s that propelled the Industrial Revolution into the 20th century. For example, the typewriter was commercialized in the 1870s by the arms maker E. Remington and Sons. None of these devices had the attributes of a computer the way the analytical engine did, but they advanced our understanding of mechanics and, after the 1870s, of how mechanics interacted with electricity and electronics.

Modern History

In 1924, the Computing-Tabulating-Recording Company became the International Business Machines Corporation, or IBM. IBM’s major foray into computers didn’t happen for another couple of decades, but the ground was laid for its eventual dominance of the industry later in the 20th century. According to Britannica, the theoretical underpinnings needed to fully embrace modern computers were developed at MIT and Harvard in the 1930s and 1940s. Vannevar Bush of MIT built the differential analyzer, widely considered the first modern analog computer, while Howard Aiken of Harvard pioneered digital computing, exploring electromechanical relay circuits and vacuum tubes for speed and reliability.

During the same timeframe, Alan Turing of Cambridge University in England became interested in designing a machine that could tackle any mathematical problem, not just arithmetic. He sought to define a “universal computing machine” before building it (I can relate, not to his brilliance, but to his desire for a working definition as an essential foundation for moving forward). I reviewed a publication that discusses Turing’s machine and proffers a lengthy definition (the entry “Alan Turing” in the Stanford Encyclopedia of Philosophy), but I have to come back to Britannica for a more succinct explanation:

“In particular, it would not be limited to doing arithmetic. The internal states of the machine could represent numbers, but they could equally well represent logic values or letters. In fact, Turing believed that everything could be represented symbolically, even abstract mental states, and he was one of the first advocates of the artificial-intelligence position that computers can potentially ‘think.’”

Turing set out the philosophical and intellectual underpinnings that bolstered the commercial development of modern computers. The first machine to fully combine electronic logic circuits with capacitors storing data as binary numbers (that is, the first electronic digital computer) was developed by John Atanasoff of Iowa State College (now University) and his colleague Clifford Berry between 1937 and 1942, contrary to most accounts that credit the Colossus in England in 1943 or the ENIAC in the U.S. in 1945. Around this time, George Stibitz at Bell Labs in the U.S. demonstrated some of the earliest networked computing, using electromechanical relays accessed over telephone lines, but the technology was too slow for full development at the time.

Colossus was developed in England by the engineer Tommy Flowers and his team at Bletchley Park, the codebreaking center where Alan Turing also worked, to decode German transmissions during World War II. It was successful in doing so, but it was kept secret and dismantled after the war. While the war inspired progress in computing, the computers developed then were built for specific wartime aims. Post-war, scientists and businesses were free to envision and develop general-purpose computers. According to Britannica, a 1946 paper by Arthur Burks, Herman Goldstine, and John von Neumann articulated the need for a single storage location and for computer code that could be read, shared, or modified as data by other programs. This sparked the concept of computer programming languages.

Despite these leaps, in the post-war years computers were still used largely by experts, and largely as glorified calculators rather than for more general purposes. Allowing programs to modify themselves was key to enabling broader use, but many were concerned about the ethics of such a step, foreshadowing our current discussions about the evolution of AI. One such person was Konrad Zuse, who nonetheless developed the first real programming language, one meant for use by machines rather than requiring a human intermediary to translate. Other languages soon followed, including IBM’s FORTRAN, which is still in use today. Grace Murray Hopper, a mathematician and early computer programmer, had campaigned for the acceptance of such languages after the war and throughout the 1950s, and IBM took up the call. COBOL came next, created specifically for business purposes, but both languages were “universal” and could be used on any computer to solve any problem.

From a manufacturing standpoint, the creation of the transistor, a semiconductor device that can amplify, switch and generate electrical signals, was another game-changer and emboldened IBM to step more fully into bringing computers into the business realm. Things then sped up quickly, with second- and third-generation computers arriving in rapid succession. Printing technology improved. Transistor technology improved.

After the U.S.S.R.’s 1957 launch of the Sputnik satellite, the federal government created the Advanced Research Projects Agency (ARPA) in 1958, and computer specialist J.C.R. Licklider was later put in charge of its Information Processing Techniques Office. It was there, in collaboration with others, that networked computing was developed, laying the groundwork for the ARPANET and, eventually, the internet. The need for operating systems to oversee a computer’s operations and programming languages was embraced around this time as well; one of them, UNIX, developed by Bell Labs staff Ken Thompson and Dennis Ritchie, is still widely deployed today.

The idea of a “mini-computer” had been around since 1950, but minicomputers came to prominence in the mid-1960s and were used for about a decade in business and lab settings. Still expensive and slow by today’s standards, they let technologists “tinker” more easily with their own machines. The development of integrated circuits, and eventually the microprocessor (an entire processor on a single chip), spurred a new era of personal computers. Some (but not all) of the prominent companies driving semiconductor innovation were in the San Francisco Bay Area, and a culture of innovation sprang up there in the 1970s. People started building their own computers, including Bill Gates and Paul Allen in the Seattle area, who developed a version of the BASIC programming language for early personal computers, called their company Microsoft, and entered into a partnership with IBM in 1980.

Steve Jobs and Steve Wozniak met through a local computer group and agreed to partner after the established company Hewlett-Packard, where Wozniak worked, passed on his prototype computer. The two instead built Apple Computer together through the 1970s and into the ’80s. In 1984, Apple released the Macintosh, which was a hit (our family bought one in 1987) and launched Apple into the stratosphere.

In the mid-1990s, the newly created World Wide Web began to be built out and its functionality improved. Websites were created with HTML, a simple markup language for web pages (I created one for my boss in Congress at the time!). Computers got smaller and smaller, and the release of the Palm Pilot in 1996, followed by the BlackBerry a few years later, ushered in the era of handheld computing devices.

And the rest is…recent history, with computer technology evolving so rapidly that policies, procedures and ethical considerations cannot always keep up. The networking of computers has ushered in the era of cybersecurity risk, while the program self-modification enabled decades ago laid the groundwork for AI, with its many benefits and drawbacks.

Nexus with the Electric Sector

Throughout this historical overview, I’ve mentioned the nexus with electricity. To state the obvious, computers and computer networks don’t work without electricity and electrical components. The devices themselves must be powered; the manufacturing of computers and their components requires electricity (sometimes massive amounts); and the mining of the inputs to those components takes electricity. The communications networks underpinning the internet and its offshoot, the “cloud,” require electricity. You get my drift.

Other Ways the Sectors Overlap

Here are some other ways that the IT sector and the electric sector overlap:

  • Reliance on transportation. Computers, smartphones, and computer components must be shipped via planes, trains, and trucks, while electric utilities rely on transportation both for day-to-day deliveries of certain types of fuel and for all the components of their grids and power plants.

  • Reliance on critical manufacturing. Both sectors rely on steel and steel products, semiconductors, digital components, copper wire, fiber optic cable, etc.

  • Environmental regulation/climate change. While the IT sector emits relatively little directly, both the electricity required for all aspects of its business and the manufacturing it relies on carry emissions and other environmental implications. The electric sector is heavily regulated from an environmental standpoint.

  • Reliance on natural gas and other fuels. IT facilities rely on these fuels directly for onsite back-up power, while the electric sector uses them to provide the IT sector’s day-to-day power.

  • Reliance on water. Like every other critical infrastructure sector, including the electric sector, the IT sector needs access to water – for its employees, manufacturing processes, and cooling.

  • Workforce challenges. Getting workers educated and up to speed on technological innovations, along with the knowledge drain from recent retirements, affects both sectors’ regular operations.

  • Supply chain constraints. These impact every aspect of infrastructure deployment and maintenance in both sectors. Questions about the outsourcing of manufacturing to other countries are especially concerning to both the IT and electric sectors and are an area for collaboration.

  • How to best use technology to create efficiencies and minimize expenses. Obviously, the IT sector is based on technology and is interested in supplying such technologies to the electric sector. With the proliferation of the cloud and the data centers supporting it, more collaboration is happening between these two sectors, and even more is necessary.

  • How to manage the cybersecurity risk that comes with those technology deployments. Both industries are acutely focused on this and could work even more collaboratively in the future, particularly to protect data centers that are, in turn, powered by utilities.
