DSL

T1 and T3

A Word Of Caution

Networking: The Intranet

Sharing Printers and Programs

About Sharing Programs

Protecting Yourself And Your Business


E-mail Issues

Computer Care And Maintenance

Protecting The Hardware

Software Maintenance


Chapter 1


The Beginning


When people hear the words "Information Technology," the first things that come to mind are computers and the Internet. It may also bring up words like "network," "intranet," "server," "firewall," "security," as well as more arcane expressions such as "router," "T-1," "Ethernet," or the mysterious and exotic-sounding "VoIP" (pronounced "voyp").


In fact, information technology is all of these things, and more. It's hardly new, however. Information technology is as old as the brain itself, if you think of the brain as an information processor. As for I.T. being a science, even that goes back to the earliest attempts to communicate and store information.


And that is essentially what information technology is: the communication and storage of information, along with the ability to process and make use of the information stored. In this chapter, we'll begin with a brief history of I.T., then look at what it comprises today and at the major types of I.T. systems currently available.


A Short History of Information Technology


As human societies have grown in size and complexity, so has the need to collect, store and transmit information. While it could be argued that brains represent a form of "bio-information technology," the Greek word tekhnologia – from which we get the word "technology" – really refers to scientific or mechanical knowledge, particularly that which involves the use of tools. Therefore, we'll begin our journey with humans' first attempts to record and transmit knowledge through mechanical means.


Prehistoric cave and rock paintings are perhaps the earliest examples of what we might think of as "information technology." Using a combination of tools that included manganese "crayons" and clay colored with various pigments, early humans left images on the walls of a cave near Lascaux, France, and on cliffs in the Algerian Sahara. These have been dated at approximately 18,000 and 8,000 years old, respectively. Unfortunately, there is no way to be certain exactly what message was being communicated (a problem our own descendants 15,000 years from now may very well encounter!).


Since the images depict animals that were commonly hunted at the time, and given the importance of game animals to a hunter-gatherer culture, it's possible that these images were attempts to convey information about such game, or were part of a rite designed to ensure a successful hunt.


The invention of writing systems – including pictograms such as hieroglyphics, alphabetic writing and "syllabic" systems – seems to have taken place at almost the same time as the development of agriculture. Agriculture introduced such formerly unknown concepts as land ownership, advanced trade and the accumulation of wealth, which in turn led to more complex societal structures. As you might expect, this necessitated more detailed and efficient record-keeping. Alphabetic writing has a substantial advantage over pictograms (hieroglyphs), because a relatively limited number of symbols (letters) can be used over and over in infinite combinations to communicate nearly anything. (As you will see later, modern I.T. uses only two of these symbols!)
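To make that last parenthetical concrete: in a modern computer, every letter is ultimately stored using only the two symbols 0 and 1. The short Python sketch below is purely illustrative (the letter "A" and its standard character code are chosen only as an example), and shows how a single letter becomes a pattern of those two symbols.

    # A minimal sketch: a single letter stored using only two symbols, 0 and 1.
    # The letter "A" is chosen purely for illustration.
    letter = "A"

    # ord() returns the numeric code a computer stores for this character (65 for "A").
    code = ord(letter)

    # format(..., "08b") writes that number in binary, using only 0s and 1s.
    bits = format(code, "08b")

    print(f'The letter "{letter}" is stored as the number {code}, i.e. the bits {bits}.')
    # Prints: The letter "A" is stored as the number 65, i.e. the bits 01000001.

Run as written, this prints that the letter "A" is stored as the number 65, or the bit pattern 01000001 – eight of those two symbols arranged in a particular order.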


Preserving and storing such information posed certain challenges: it had to be inscribed either on stone or clay tablets (which were heavy) or on animal skins, wax tablets or papyrus (which weren't durable).


The Hellenistic World


The Classical Greeks were the first people on record to attempt to find scientific, rational explanations for natural phenomena. Some of the earliest proto-computers known were mechanical devices developed by the Greeks. One of these was a form of abacus (which was also developed and used in ancient China), a device that facilitated and simplified mathematical calculation.


Early Programmable Devices


By the time the gradual break-up and fall of the Roman Empire was complete in the year 476 C.E., scientific and technological advances in the Western world had ground to a halt. While much of the scientific knowledge of the Greeks was preserved by Irish monks and Arab scholars, it wasn't until the fifteenth century that principles of engineering were rediscovered and applied to information. The first of these applications was, of course, the printing press. Although the concept of movable-type printing had been developed in China some four hundred years earlier, it was Gutenberg's device of 1447 that revolutionized communications, making it easier and faster to record and disseminate information than ever before. The first truly programmable device would not come along for another 354 years, however.


The Jacquard Loom of 1801 was a product of the Industrial Revolution. This invention used a series of specially punched paper cards that functioned as templates, allowing for the automatic weaving of highly intricate patterns. Those punch cards became very significant to computing in the 1950's, 60's and 70's.


The next development was Charles Babbage's "Analytical Engine" – a fully programmable computer that unfortunately was never actually built. Babbage worked on his designs from 1837 until his passing in 1871. The steam-powered mechanism would also have utilized punch cards, with a central processing unit (CPU) and memory in the form of a system of pegs inserted into rotating barrels.


The Analytical Engine would have been capable of storing 1,000 numbers of up to fifty digits each, and of performing six different mathematical operations, including the calculation of square roots. Babbage's ideas were incorporated into the early electronic computing devices developed in the late 1930's and 1940's, although not all of these were actually programmable. The first truly programmable computers – able to store and use information – did not come into common use until the 1950's, and, yes, they made use of punch cards (those born before 1965 may remember playing with them).
