When you think about software, you probably picture something intangible: the information stored in your “cloud,” or a program saved on your hard drive that you can’t see. If we’re talking only about the software of 2012, that picture is accurate. Software has become so far removed from “hardware” that it might more accurately be called “noware.”
But software wasn’t always this way. As you’re about to see, technology has come a long way. The earliest devices used to store and process information would probably be considered “hardware” today. By older standards, though, software really did start out as hardware that was simply “softer.” Let’s take a look at how that happened.
Software in Ancient Babylon?
If your idea of “software” can loosely be defined as “computing,” then devices of that kind have been around for a long, long time. Case in point: the abacus. The world’s first known calculator, the abacus was essentially a counting frame that let you perform calculations by sliding beads up and down a rod. Many historians believe the invention was developed by the ancient Babylonians. Later, the Chinese abacus (also known as the “suanpan”) became the most widely used calculating tool in the world before the arrival of the modern calculator.
If you tend to think of those devices as early examples of computing hardware rather than software, the story changes drastically. Modern computing rests on concepts developed over millennia; the use of zero as a number, for example, was pioneered in ancient India, and ideas like it paved the way for the software we have today. But if you want the real story of software’s development through the 20th and 21st centuries, it makes more sense to look at software as a function of the hardware that runs it.
Was the First Software Actually Paper?
Our concept of software still depends somewhat on the hardware that delivers it; throughout the 1990s, for example, you’d have been hard-pressed to find software that wasn’t distributed on a floppy disk to be inserted into the computer. Software itself started out on punched cards: pieces of paper or card stock with holes punched in particular positions, used to feed information into a computer. The machine could read which positions were punched out on each card and record the information accordingly. The idea of “punching in” and “punching out” at work on a time card is a close cousin of the same technique.
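To make the punched-card idea a little more concrete, here is a minimal sketch in Python (purely for illustration; nothing like this ran on the machines of the era) of how a reader might turn punched positions into numbers, assuming a simplified card where each column encodes a single digit by one punch in rows 0 through 9. Real Hollerith cards also used extra “zone” rows to encode letters and symbols.

```python
# A minimal sketch of "reading" a simplified punched card.
# Assumption: each column encodes one digit via a single punch
# in one of the rows labeled 0-9.

def read_column(punched_rows):
    """Return the digit encoded by a column, given the set of punched row positions."""
    if len(punched_rows) != 1:
        raise ValueError("This simplified reader only handles single-punch digit columns")
    (row,) = punched_rows
    return row  # rows 0-9 map directly to the digits 0-9

# A tiny hypothetical "card" holding the value 42 in its first two columns.
card = [{4}, {2}]
print([read_column(column) for column in card])  # -> [4, 2]
```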
Eventually, the interfaces of the computers themselves grew more sophisticated, allowing engineers to develop software on the machine itself. Throughout the 1970s and the surrounding years, software engineering was widely considered a troubled discipline because of the many problems software projects ran into; poorly written software for radiotherapy machines, for example, could endanger people’s lives.
Over time, software development became a far more predictable and consistent process. The arrival of the Internet then heralded a new era of easily shareable software, which has led to the wide-ranging world of applications we use today.
About The Author: James Cofflin works with Arcisphere Technologies, an IBM Rational company offering clients ease of use in the software development industry. Learn more here: softwarelifecyclepros.com