Flagler Technologies

The Evolution of Computing Power: From Mainframes to Quantum Computers

Updated: Sep 22, 2023


To explore the history of computing power is almost like gazing into the universe, where millions of combined hours of advancement have seemingly transpired in the blink of an eye. By definition, computing power measures a system's ability to process data and carry out a given number of operations per second. Together with a number of other components, such as the central processing unit (CPU), random-access memory (RAM), storage capacity, operating system, and network infrastructure, this measure determines a computer's speed.

Computers have undoubtedly revolutionized every facet of our lives, from communication to banking and business, as well as data storage for all our most important personal information. Unfathomable progress has been made across every industry in an exponential fashion, and the computer's ultimate capabilities, while impossible to predict, are still being realized every day. But to understand the abilities of computing power today, we have to explore the origins of advancement.

Historical Overview of Computing Power

The original computation devices had humble beginnings and were, by modern standards, wildly inefficient. The United States first put the technology to work when Herman Hollerith developed his electromechanical tabulating machine for the 1890 census, which used thick punch cards to track data for the Census Bureau. In essence, the holes punched into the grid of the card triggered specific circuits when the card was fed into the machine, yielding results much faster than prior census methods. Hollerith would later found the Tabulating Machine Company, which merged into the firm that became IBM.

Though there were a few entrepreneurial inventions with similar purposes in the 19th century, the concept of computing numbers took off in 1936, when Alan Turing published his famed paper, "On Computable Numbers." His theoretical construct, dubbed the "Turing machine," went far beyond tabulating devices and would come to define computer processing as we know it today: a single mechanism capable, in principle, of crunching any sort of complex calculation.

Milestones in the Evolution of Computer Processing

Some of the most crucial periods in the computer's evolution came as systems learned to multitask, thanks to more advanced integrated technologies.

HP's 200A Audio Oscillator

Hewlett-Packard was formed in a Palo Alto garage in 1939 by David Packard and Bill Hewlett, and their HP 200A Audio Oscillator was among the company's first commercial electronic products. It helped bring audio to the cinema when Disney purchased units of the follow-on Model 200B for the theaters premiering Fantasia. As the movement to electronic computing became inevitable, the variety of applications became evident in any profession working with numerical data.

Coding & the First Computer Chip

In 1952, Grace Hopper completed the first compiler, a program that translates human-written instructions into machine code, and her pioneering work in computing languages earned her the nickname "First Lady of Software." By 1958, Jack Kilby had demonstrated the first computer chip, known then as an integrated circuit, with Robert Noyce independently developing his own version soon after. This technology would soon allow what were once mere fractions of saved memory to expand into much larger storage capacities.

We use floating-point operations per second (FLOPS) to measure computing power. Floating-point arithmetic stores each number as a significand and an exponent, letting the decimal point "float" so that very large and very small values can share the same calculations, and it really began to take off when chips and disk drives were introduced. Compared with the outdated punch-grid method of days past, far more data could be processed with greater efficiency, allowing engineers to perform more complex tasks in less time.
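
To make the unit concrete, here is a rough sketch in Python: it times a batch of floating-point multiply-adds and divides the operation count by the elapsed time. This only illustrates what FLOPS means; it is not a real benchmark like LINPACK, and a pure-Python loop vastly understates what compiled code achieves on the same hardware.

```python
import time

n = 5_000_000
x = 1.000001
acc = 0.0

start = time.perf_counter()
for _ in range(n):
    acc = acc * x + 1.0  # one multiply plus one add: 2 floating-point ops
elapsed = time.perf_counter() - start

print(f"~{2 * n / elapsed / 1e6:.0f} MFLOPS (pure-Python loop)")
```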

Mainframes to Microprocessors

Next arrived the mainframe computer, refined over years of tedious processing improvements to machines larger than some rooms. The continued efforts to reduce size and increase productivity led to two important inventions: the dynamic random-access memory (DRAM) chip and the floppy disk. These innovations gave processors much more memory to work with, allowing elaborate computations to be performed with ease, and they made a large impact on personal devices by letting information move easily between machines.

As microprocessors became the norm in the 1970s, smaller and smaller devices harnessed even greater computing power. This all but confirmed one of the most significant observations in computing history, made by Intel co-founder Gordon Moore just a few years prior. Known today as "Moore's Law," it holds that the number of transistors on a microchip doubles approximately every two years, increasing computing power exponentially. The trend has held remarkably well for decades, leading us to an age of unprecedented technical capabilities.
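
Moore's observation is easy to state as arithmetic. Taking the commonly cited figure of roughly 2,300 transistors on Intel's 1971 4004 as a baseline, doubling every two years projects counts that land in the ballpark of real chips:

```python
# Moore's Law as a formula: count(year) = base * 2^((year - base_year) / 2)
def projected_transistors(year, base=2_300, base_year=1971):
    return base * 2 ** ((year - base_year) / 2)

for year in (1971, 1991, 2011, 2023):
    print(year, f"~{projected_transistors(year):,.0f} transistors")
# 2023 projects to roughly 150 billion, near the largest chips shipping today
```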

Exploring the Rise of Supercomputers and Beyond

As central processing units continued to solve equations at quicker rates, the concept of the supercomputer came along: devices with improved circuits, reduced wiring, and processing rates orders of magnitude faster than their predecessors. Even supercomputers continued to one-up each other in their early stages, racing to become the centralized machine utilized in businesses and information centers around the globe.

And with increased processing power came more advanced graphics processing units (GPUs), which allowed visual workloads like photos and video to take on new, more demanding applications. For highly parallel tasks, a GPU's raw throughput can now surpass a CPU's, but modern computers remain heavily reliant on running both in tandem.

With this combination, a new era of computing power dawned for the information age, and soon everyone would have access to more affordable home computers that could be used for anything from learning to running apps and playing games. Without continuous innovations in smaller, better processing chips, laptops, smartphones, and video game consoles would never have stormed onto the scene so quickly.

A more recent form of computing power is the field-programmable gate array (FPGA), an integrated circuit that can be customized to perform specific tasks with inherent flexibility. FPGAs can be reprogrammed as needed, making them adaptable to the size or requirements of a particular computing system. When we talk about robotics, 5G technology, or artificial intelligence, FPGAs are a large part of the conversation.
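
Real FPGAs are configured with hardware description languages such as Verilog rather than Python, but the core idea can be sketched in a few lines: the basic building block is a lookup table (LUT), a tiny memory whose stored bits determine which logic function the block implements, and "reprogramming" the chip amounts to loading new table contents.

```python
# Conceptual sketch only: a 2-input LUT is just 4 stored bits.
def make_lut(truth_table):
    return lambda a, b: truth_table[(a, b)]

AND = make_lut({(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1})
XOR = make_lut({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})

# The same "hardware" becomes a different circuit by rewriting its table.
print(AND(1, 1), XOR(1, 1))  # -> 1 0
```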

Quantum Computing's Potential to Revolutionize Processing Power

Considering we've all but worn out the traditional microchip in terms of processing power, it's no surprise that attention has moved beyond classical supercomputers to quantum computing, which harnesses quantum-mechanical effects to attack problems too difficult for conventional machines. These highly advanced computers take on unprecedented kinds of workloads and process certain tasks at rates of efficiency once thought to be impossible.
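
As a small illustration (a classical simulation with NumPy, not real quantum hardware), the snippet below represents a single qubit as a vector of amplitudes and applies a Hadamard gate, putting it into an equal superposition of 0 and 1, the basic resource quantum algorithms exploit.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
qubit = np.array([1.0, 0.0])                  # state |0>

superposed = H @ qubit
probabilities = np.abs(superposed) ** 2
print(probabilities)  # [0.5 0.5]: measuring yields 0 or 1 with equal odds
```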

And thanks to other pinnacles in processing achievement, such as cloud computing, we now live in a digital environment in which remote hardware keeps the servers running while we access software and a wide array of information and services from the palms of our hands.

Take parallel computing. In traditional serial computation, work passes through one processor sequentially, one action at a time; a parallel computer instead takes on huge datasets by separating the information and performing many individual calculations simultaneously, as instructed by the software. This has been a huge advancement for modeling theoretical and real-world systems: the complex, interrelated structure of these computers mirrors the simultaneous processes happening all around us and helps us make sense of them.
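
Here is a minimal Python sketch of that contrast: counting primes in a range first serially, then with the standard library's multiprocessing pool splitting the same range across four workers.

```python
from multiprocessing import Pool

def count_primes(bounds):
    lo, hi = bounds
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(is_prime(n) for n in range(lo, hi))

if __name__ == "__main__":
    # Serial: one processor walks the whole range, one number at a time.
    serial = count_primes((2, 200_000))

    # Parallel: four workers each take a quarter of the range at once.
    chunks = [(2, 50_000), (50_000, 100_000),
              (100_000, 150_000), (150_000, 200_000)]
    with Pool(4) as pool:
        parallel = sum(pool.map(count_primes, chunks))

    print(serial, parallel)  # same count, computed simultaneously
```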

Distributed computing, on the other hand, spreads the software components needed for a particular function across many machines. The benefit of these systems is that the machines can communicate across local or distant networks, letting all sorts of computing devices act as one cohesive system. This also means you can continue to expand your processing power without losing efficiency, because more computers can always be added to the distributed system.
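
Below is a toy sketch of the idea using only Python's standard-library XML-RPC modules. The ports and the partial_sum service are made up for illustration, and the three "nodes" are threads on one machine standing in for separate hosts on a network; a coordinator splits a job among them and combines the answers.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def start_worker(port):
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    # Each node sums its own slice of the overall range.
    server.register_function(lambda lo, hi: sum(range(lo, hi)), "partial_sum")
    threading.Thread(target=server.serve_forever, daemon=True).start()

ports = [8001, 8002, 8003]
for port in ports:
    start_worker(port)

# Coordinator: hand one 10,000-number chunk to each node.
chunk = 10_000
nodes = [ServerProxy(f"http://localhost:{p}") for p in ports]
total = sum(node.partial_sum(i * chunk, (i + 1) * chunk)
            for i, node in enumerate(nodes))
print(total == sum(range(30_000)))  # True: one answer from three machines
```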

Future of Quantum Computing

Quantum computers promise to work through larger problem spaces than ever before, putting the abilities of advanced computing at our fingertips. But how far can computing power go, and what are the limitations or risks involved? The truth is we aren't really sure, but with Moore's Law still broadly holding, we continue to see devices becoming more connected, more efficient, faster, and smaller.

At the end of the day, the efficiency of the energy and resources used to perform these complicated calculations will be at the forefront of computer engineering. The power of computer processing and the future of concepts like quantum computing are exciting but will be heavily reliant on infrastructure, and the ultimate goals of programming development are likely to follow society's most urgent demands. Governments, tech companies, universities, and many other institutions will be looking to the opportunities the quantum computing age brings, but how we understand and implement it in our daily lives is up to us. What we do know is that it invites nearly limitless expectations.


