The History of Computer Viruses

Computer viruses have a much longer history than most people would imagine. They predate the modern internet, and the first viruses were purely technical exercises in computer programming. It was not until the advent of large-scale internet use that malicious computer viruses started to appear.

The basic theory that underpins most types of PC virus was outlined in John von Neumann’s scientific paper published back in 1966, titled “The Theory of Self-Reproducing Automata”. Known as the last of the great mathematicians, von Neumann had also worked on the US nuclear program and was instrumental in developing game theory.
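
Von Neumann’s automata were purely mathematical constructs, but the core idea – a program that carries a complete description of itself and uses that description to produce a copy – can be illustrated with what programmers call a quine. The short Python sketch below is only a harmless illustration of self-reproduction in this sense; it is not taken from von Neumann’s paper and has nothing to do with real virus code.

    # A minimal self-reproducing program (quine): when run, it prints
    # an exact copy of the two code lines below this comment.
    s = 's = %r\nprint(s %% s)'
    print(s % s)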

This theory was not put to use until 1971, when the first ever virus was released across the ARPANET network. The virus, called Creeper, spread across the network and infected DEC PDP-10 computers. When a computer was infected it displayed a message reading “I’m the creeper, catch me if you can.” The program was an experiment, and the Reaper program was later released to clean up and remove the Creeper.

The first anonymous virus was the Wabbit, released in 1974, a self-replicating program that led to a computer crash. This was followed by ANIMAL in 1975. The ANIMAL game was bundled with a routine called PERVADE, which reproduced the program in the background and spread it across computers as the game was shared. Although ANIMAL was a non-malicious virus, it exploited holes in the computer’s operating system and left a copy of the program, placed by PERVADE, in all the directories and files that the user had access to.

The Elk Cloner, written in 1982 by 15-year-old school student Rich Skrenta, exploited an issue with the Apple II boot system. The virus is widely viewed as the first large-scale computer virus in the wild. “In the wild” refers to the fact that it was not contained within one lab or network.

The virus spread via the boot disk of the computer, and on every 50th boot from an infected disk it displayed a message in the form of a short poem. Skrenta, who went on to a successful career in computer programming, described Elk Cloner as a dumb practical joke.

After the Elk Cloner infected Apple machines, viruses that infected IBM computers followed. The ARF-ARF virus arrived in 1983; this Trojan horse wiped out the computer’s directory while offering to sort it into alphabetical order. Although the Pakistani Flu virus appeared in 1986, it was the following year that saw a rapid increase in the number of computer infections.

In 1987 the Vienna, Lehigh, Jerusalem, SCA and Christmas Tree Exec viruses all first appeared and attacked different aspects of computer operating systems. Other viruses appeared in different locations around the globe, including the Stoned virus in New Zealand, Ping Pong in Italy and the Cascade virus in the IBM offices in Belgium. This explosion in computer attacks resulted in IBM developing its own anti-virus software for the public; before 1987, IBM’s anti-virus software had been for internal use only.

These early computer infections were only the start of the problems created by PC viruses. The rate and seriousness of infections since the end of the 1980s have resulted in the creation of the computer security industry.

Tony Heywood 2012

Computers At War In The ’70s And ’80s

The ’70s and the ’80s might easily be described as the era of the ‘computer wars’. Every company had a brand new kind of computer, better than the last, that it hoped would change the world. Everyone knew it was only a matter of time before one was adopted as the standard, with all the advantages for software compatibility this would bring – and each was determined for it to be its model that made the big time.

In the late ’70s and early ’80s, two computers nearly became dominant: the Apple II and the Commodore 64. Both of these computers sold in the millions, inspiring an entire generation – they were used for everything from office tasks to games.

It was in 1981, however, that IBM launched its IBM PC, and things really went crazy. IBM’s PC wasn’t patented. IBM went to a small firm named Microsoft to get an operating system for this computer, and ended up with DOS, but Microsoft was willing to license DOS to anyone else who paid its fee. By 1984, ‘IBM PC compatible’ computers were available, and a de facto standard was born. Software makers could finally write their programs for one operating system and one hardware configuration – and any computer that didn’t follow the specification to the letter was quickly left with no programs to run.

In 1990, Microsoft launched Windows 3.0 (the first version of Windows to be really successful), and the PC’s lock on the marketplace was set in stone. The release of the Pentium and Windows 95 finally made it the fastest, cheapest and best system around, and it quickly stopped making sense to develop software for anything else.

From then on, the PC was the dominant computer – today, it is estimated to hold between 95% and 98% of the market, with almost all of the remainder being held by Apple Macintosh computers.

Computer Software – Where It Began

The first theory about software was proposed by Alan Turing in his 1936 paper, “On Computable Numbers, with an Application to the Entscheidungsproblem (Decision Problem).”

The term “software” was first used in print by John W. Tukey in 1958. Colloquially, the term is often used to mean application software.

In computer science and software engineering, software is all information processed by a computer system, including both programs and data.

The academic fields studying software are computer science and software engineering.

The history of computer software is most often traced back to the first documented software bug in 1947. As more and more programs enter the realm of firmware, and the hardware itself becomes smaller, cheaper and faster due to Moore’s law, elements of computing first considered to be software join the ranks of hardware.

Most hardware companies today have more software programmers on the payroll than hardware designers, since software tools have automated many of the tasks of printed circuit board engineers.

Just like the auto industry, the software industry has grown from a few visionaries operating out of their garages with prototypes.

Steve Jobs and Bill Gates were the Henry Ford and Louis Chevrolet of their time, capitalizing on ideas that were already commonly known before they started in the business.

In the case of software development, this moment is generally agreed to be the publication in the early 1980s of the specifications for the IBM Personal Computer by IBM employee Philip Don Estridge. Today, his move would be seen as a form of crowd-sourcing.

Until that time, software was bundled with the hardware by original equipment manufacturers (OEMs) such as Data General, Digital Equipment and IBM. When a customer bought a minicomputer – at that time the smallest computer on the market – it did not come with pre-installed software; the software had to be installed by engineers employed by the OEM.

Computer hardware companies not only bundled their software, they also placed demands on where the hardware could be located: in a refrigerated space called a computer room.

Most companies carried their software on the books at zero dollars, unable to claim it as an asset (this is similar to the financing of popular music in those days).

When Data General introduced the Data General Nova, a company called Digidyne wanted to run Data General’s RDOS operating system on its own hardware clone. Data General refused to license the software (which was hard to do, since it was on the books as a free asset) and claimed its “bundling rights”. The resulting case, Digidyne v. Data General, set a precedent in 1985: the Supreme Court let a 9th Circuit decision stand, and Data General was eventually forced to license the operating system software because restricting the license to DG hardware only was ruled an illegal tying arrangement.

Soon after, IBM ‘published’ its DOS source for free, and Microsoft was born. Unable to sustain the losses from lawyers’ fees, Data General ended up being taken over by EMC Corporation. The Supreme Court decision made it possible to put a value on software and to purchase software patents. The move by IBM was almost a protest at the time, and few in the industry believed that anyone would profit from it other than IBM (through free publicity).

Microsoft and Apple were thus able to cash in on ‘soft’ products. It is hard to imagine today that people once felt that software was worthless without a machine. There are many successful companies today that sell only software products, though there are still many common software licensing problems due to the complexity of designs and poor documentation, leading to patent trolls.

With open software specifications and the possibility of software licensing, new opportunities arose for software tools that then became de facto standards, such as DOS for operating systems, as well as various proprietary word processing and spreadsheet programs. In a similar growth pattern, proprietary development methods became standard software development methodology.

How Computers Have Revolutionized Our World

The world has been completely changed by the advent of personal computers, which have radically transformed the way the world lives, works and conducts business. Computers actually arrived on the scene back in the 1930s as the brainchild of Konrad Zuse, who produced what was then called the Z1 computer. While this was a giant innovation, it was far from the sophisticated machines that we have today.

That computer laid the groundwork for further research by others. In the early 1940s, John Atanasoff and Clifford Berry of Iowa State College produced further innovations with the Atanasoff–Berry Computer. It was capable of simple to moderately complex arithmetic calculations and small but simple computing tasks.

Harvard University then took things a notch higher, partnering with IBM to produce the Harvard Mark I in 1944, one of the world’s first large-scale automatic computers. Even then, computers were still gigantic, occupied huge spaces and used a lot of power.

The world as we know it today has been completely changed by computers. They now sit in almost every office in most major cities of the world. It is hard to imagine business as we know it today without computers; work would be grossly inefficient and cumbersome. Take calculations, for example: it would not be possible for companies that hold a lot of material and inventory to process orders efficiently without computing power.

Computers have also made a giant impact in the field of communication. With the advent of the Internet, the World Wide Web or Information Superhighway, the world has become smaller and has even been referred to as a global village. Phrases such as “global market place” have also been coined to signify how computers have been able to link the world.

We mentioned communications and how computers have made a giant impact in this realm. Messages can now be sent and received via email instantaneously from one end of the globe to the other, which has almost rendered regular snail mail obsolete. Then there are chat rooms, instant messaging and video messaging, which have transformed the way people meet and interact across the globe.

All this has been made possible by computers connected to the Internet via wireless and wired networks. Large firms such as Microsoft, Yahoo and Google have contributed to this by running sophisticated web applications that make it possible to chat in real time with someone on the other side of the globe. This has completely changed the way we conduct meetings and even run classrooms and training.

Businesses have benefitted tremendously owing to the advent of computers. Companies can open virtual branches in other countries without necessarily being there physically, and banks can even run on the Internet without possessing actual premises. Such has been the impact of computers on modern life. They do have their disadvantages, however: some accuse them of making people thoroughly anti-social, and it is easy for some users to develop multiple identities online to mask their true selves.

Evolution Of Computers Through Five Generations

The technological development of the computer over the ages is often described in terms of different generations of computing devices. The very first computers occupied so much space that special janitorial or commercial cleaning services were hired for their maintenance. Broadly, a generation is a stage of improvement in the product development process – a certain ‘leap’ in computer technology that fundamentally changes the way computers operate.

With each successive generation, the internal circuitry has become smaller, more advanced and more versatile than in the preceding generation. As a result of this miniaturization, speed, power and computer memory have increased proportionally. New discoveries continue to change the way we live and work. Currently there are five recognized generations of computers.

The first generation computers relied on vacuum tubes for circuitry and magnetic drums for memory. These were enormous machines, taking up entire rooms. First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. Computers of this generation were initially developed for the atomic energy industry.

The development of the integrated circuit was the hallmark of the third generation of computers. Such a development was necessary because, although the use of transistors in place of vacuum tubes greatly reduced the heat given off, there was still enough heat to damage the computer’s internal components. Computers became accessible to a mass audience for the first time because they were smaller and cheaper than their predecessors.

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Parallel processing involves a system harnessing the processing power of many CPUs working as one, as opposed to von Neumann’s single central processing unit design. Superconductors, an equally innovative development, allow electricity to flow with little or no resistance, greatly improving information flow and reducing heat loss.
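
As a rough illustration of the parallel-processing idea described above, the hypothetical Python sketch below splits one large summation across several worker processes instead of leaving it all to a single CPU. The task, the chunk size and the worker count are arbitrary choices made for the example; this is only a minimal sketch of the concept, not how fifth-generation systems are actually programmed.

    # Illustrative sketch: split one big task (summing a range of numbers)
    # across several CPUs instead of running it on a single processor.
    from multiprocessing import Pool

    def partial_sum(bounds):
        """Sum the integers in [start, end) - one worker's share of the task."""
        start, end = bounds
        return sum(range(start, end))

    if __name__ == "__main__":
        n = 10_000_000
        workers = 4  # arbitrary number of worker processes for the example
        step = n // workers
        chunks = [(i * step, (i + 1) * step) for i in range(workers)]
        chunks[-1] = (chunks[-1][0], n)  # make sure the last chunk reaches n

        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))

        # The parallel result matches the single-CPU computation.
        print(total == sum(range(n)))

The design point is simply that the work is divided into independent pieces, each piece is handled by its own process, and the partial results are combined at the end – many processors cooperating on one problem rather than one processor doing everything in sequence.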