MODULE 8: STS
LEARNING OBJECTIVES
INTRODUCTION
People now live in a society where the internet, computers, and smartphones have
become essential parts of everyday life, used to access and share information instantly.
This is the Information Age, also known as the Computer Age, Digital Age, or New Media
Age. According to the Merriam-Webster Online Dictionary, the Information Age is the
modern age regarded as a time in which information has become a commodity that is quickly
and widely disseminated and easily available, especially through the use of computer
technology.
The Information Age is the period in which people can best be characterized as highly
technologically advanced and oriented toward the internet and data communication. People's
way of living has changed greatly since the Renaissance, when they began to write realistic
books and not just religious stories, and the Industrial Revolution, when major changes
occurred in agriculture, manufacturing, mining, transportation, and technology. In the present
period, digital technologies have changed every aspect of people's lives, from the way they
work and learn to the way they play and socialize. People can now access information at the
touch of a button. They can do almost everything online: communicating, shopping, paying,
working, learning, watching entertainment, booking, and even ordering food. These
technological advancements have profoundly impacted society and its environment: the
social, economic, and cultural conditions; science and research; and industries including but not
limited to healthcare, education, finance, entertainment, transportation, and media and
communications.
History
The Information Age is closely tied to the advent of personal computers, but many
computer historians trace its beginnings to A Mathematical Theory of Communication, a
1948 paper by Claude E. Shannon, a researcher and mathematician known as the
"Father of Information Theory." The paper proposed that information can be digitized,
that is, quantitatively encoded as a series of ones and zeroes. It showed how all information
media, from telephone signals to radio waves to television, could be transmitted without error
using this single framework.
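Shannon's core insight, that any message can be reduced to a series of ones and zeroes, can be illustrated with a short Python sketch (not part of the original module) that encodes a text message into bits and then recovers it exactly:

```python
def to_bits(message: str) -> str:
    """Encode a text message as a string of ones and zeroes (8 bits per byte)."""
    return "".join(format(byte, "08b") for byte in message.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Recover the original message from its binary encoding."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

bits = to_bits("Hi")
print(bits)             # 0100100001101001
print(from_bits(bits))  # Hi
```

Because the bits carry the message exactly, any medium that can transmit ones and zeroes reliably (a telephone wire, a radio wave, a fiber-optic cable) can carry any kind of information.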
The digitization of information paved the way for the rapid development of modernized
equipment, fiber-optic cables, faster microprocessors, accelerated communication and
information processing, the World Wide Web, email, and mobile technology. As information is
increasingly described in digital form, businesses across many industries have sharpened their
focus on how to capitalize on the Information Age.
Information Technology
Information technology (IT) has been around since the beginning of recorded history,
because people have always needed to communicate and share information in order to grow.
Systems of information (storing, retrieving, manipulating, and communicating information)
have been in place since the Sumerians in Mesopotamia developed writing around 3000 BC.
Information technology, in the modern sense, is defined as the use of computers, storage,
networking, and other physical devices, infrastructure, and processes to create, process, store,
secure, and exchange all forms of electronic data. Typically, IT is used in the context of
enterprise operations as opposed to personal or entertainment technologies.
The term information technology was coined by the Harvard Business Review to
distinguish purpose-built machines designed to perform a limited scope of functions from
general-purpose computing machines that could be programmed for various tasks. As the IT
industry evolved from the mid-20th century, computing capability advanced while device cost
and energy consumption fell, a cycle that continues today as new technologies emerge.
Courtesy of https://www.williamhortonphotography.com
Petroglyph
1) Premechanical (3000 BC – 1450 AD)
The popularity of alphabets paved the way for the development of pens and paper. Writing
started off as marks in wet clay, but paper was later created from the papyrus plant. As
information grew, people realized the importance of organizing it and storing it in
permanent form. The first books were written and kept in libraries; Egyptian scrolls and
book-like bindings of paper were popular ways of writing down information to preserve it.
This period was also marked by the development of the first numbering systems. The first
1–9 system was created around 100 AD, and the number 0 was invented and added around
875 AD. This was followed by the invention of the calculator, then known as the abacus,
the very first information processor.
2) Mechanical (between 1450 and 1840)
New technologies emerged, such as the slide rule, an analog computer used for
multiplying and dividing. Blaise Pascal invented the Pascaline, a very popular mechanical
computer, and Charles Babbage developed the difference engine, which tabulated
polynomial equations using the method of finite differences.
Courtesy of https://www.britannica.com
Difference Engine
Courtesy of sites.harvard.edu
Harvard Mark 1
The first large-scale automatic digital computer in the United States was the Mark
1, created at Harvard University and completed in 1944. This computer was 8 feet high,
50 feet long, and 2 feet wide, and weighed 5 tons. It was programmed using punched cards.
The ENIAC (Electronic Numerical Integrator and Computer) was the first high-
speed digital computer capable of being reprogrammed to solve a full range of computing
problems. It was designed for the U.S. Army to compute artillery firing tables. The machine
was even bigger than the Mark 1, taking up 680 square feet and weighing 30 tons. It
used vacuum tubes to do its calculations.
Courtesy of pinterest.com
ENIAC
There are five main generations of digital computing.
1. The first generation (1942–1955) used vacuum tubes. This period marked the beginning of
the commercial computer age via the UNIVAC (Universal Automatic Computer), the first
commercially available computer. It was developed by the scientists Mauchly and Eckert
for the United States Census Department. Examples of first-generation computers are the
ENIAC and the UNIVAC-1.
2. The second generation (1955–1964) used transistors. Scientists at Bell Laboratories,
namely John Bardeen, Walter Brattain, and William Shockley, developed the transistor
in 1947. Replacing vacuum tubes with transistors greatly decreased the size of
computers. Examples of second-generation computers are the IBM 7094 series, the IBM
1400 series, and the CDC 1604.
3. The third generation (1964–1975) used integrated circuits (ICs). Jack Kilby developed
the concept of the integrated circuit in 1958, an important invention in the computer
field. The first ICs came into use in the early 1960s. An IC measures about ¼ square inch,
and a single IC chip may contain thousands of transistors. Computers became smaller,
faster, more reliable, and less expensive. Examples of third-generation computers
are the IBM 370, the IBM System/360, the UNIVAC 1108, and the UNIVAC 9000.
4. The fourth generation (1975–present) started with the invention of the
microprocessor, a single chip containing thousands of integrated circuits. Ted Hoff
produced the first microprocessor, the Intel 4004, for Intel in 1971. Integrated-circuit
technology improved rapidly: the LSI (Large Scale Integration) and VLSI (Very Large
Scale Integration) circuits were designed, greatly reducing the size of computers. A
modern microprocessor is usually about one square inch and can contain millions of
electronic circuits. Examples of fourth-generation computers are the Apple
Macintosh and the IBM PC.
Courtesy of oldcomputers.net
Apple 2
5. The fifth generation (present and beyond) is based on artificial intelligence (AI).
These computers can understand spoken words, imitate human reasoning, and respond
to their surroundings using different types of sensors. Scientists are constantly working
to increase the processing power of computers, trying to create machines with real
intelligence with the help of advanced programming and technologies. The IBM Watson
supercomputer is an example of a fifth-generation computer. It combines artificial
intelligence (AI) and sophisticated analytical software for optimal performance as a
"question answering" machine. The supercomputer is named after IBM's first CEO,
Thomas J. Watson. Watson processes at a rate of 80 teraflops (80 trillion floating-point
operations per second). To replicate (or surpass) a high-functioning human's ability to
answer questions, Watson accesses 90 servers with a combined data store of over 200
million pages of information, which it processes against six million logic rules. The
system and its data are self-contained in a space that could accommodate 10
refrigerators.
Computer
Types of Computer
Since the advent of the first computer, different types and sizes of computers have offered
different services. A computer can be as big as a building or as small as a laptop, or even a
microcontroller in mobile and embedded systems. Byte-notes.com enumerates four basic
types of computers:
1) Supercomputer
The most powerful computers in terms of performance and data processing are
supercomputers. These are specialized, task-specific computers used by large
organizations for research and exploration; for example, NASA uses supercomputers for
launching and controlling space shuttles and for space exploration. Supercomputers are
very expensive and very large. They are housed in large air-conditioned rooms, and some
can span an entire building.
Uses of Supercomputers:
Space Exploration
o Supercomputers are used to study the origin of the universe and dark matter.
For these studies, scientists use IBM's powerful supercomputer
"Roadrunner" at the Los Alamos National Laboratory.
Earthquake Studies
o Supercomputers are used to study the phenomenon of earthquakes. They
are also used for natural resource exploration, such as natural gas,
petroleum, and coal.
Weather Forecasting
o Supercomputers are used for weather forecasting and to study the nature
and extent of hurricanes, rainfall, and windstorms.
Nuclear Weapons Testing
o Supercomputers are used to run weapon simulations that test the range,
accuracy, and impact of nuclear weapons.
2) Mainframe Computer
Mainframes are not as powerful as supercomputers, but many large firms and government
organizations use this type of computer to run their business operations. Because of their
size, mainframe computers are housed in large air-conditioned rooms. They can process
and store large amounts of data. Banks, big educational institutions, and insurance
companies use mainframe computers to store data about their customers, students, and
insurance policy holders.
3) Minicomputer
Minicomputers are used by small businesses and firms. They are also called "midrange
computers." These are small machines that can fit on a desk, with less processing power
and data storage capacity than supercomputers and mainframes. These computers are not
designed for a single user; individual departments of a large company or organization use
minicomputers for specific purposes. For example, a production department can use
minicomputers to monitor certain production processes.
4) Microcomputer
Microcomputers are the most widely used type of computer, designed for a single user.
Desktop computers, laptops, tablets, and smartphones are all microcomputers.
Influences of the Past on Information Age
The past has greatly influenced the Information Age. The Renaissance Age created the
idea inventions, while too advanced for the time; the basic idea was used to develop modern
inventions. The Renaissance also changed literature. At first, only books that told stories of
religion and religious heroes were written. During the Renaissance, people began to write
realistic books and not just religious stories. People’s mindset about themselves changed. It
was no longer about what humans could do for God, but what humans could do for themselves.
This way of thinking is called humanism.
The Scientific Revolution changed the modern era by introducing important scientists
such as Galileo, Copernicus, and Sir Isaac Newton. Their discoveries paved the way for modern
tools, inventions, and innovations.
1983 Compact discs (CDs) are launched as a new way to store music by the Sony and
Philips corporations.
1987 Larry Hornbeck, working at Texas Instruments, develops DLP® projection—now used
in many projection TV systems.
1989 Tim Berners-Lee invents the World Wide Web.
1990 German watchmaking company Junghans introduces the MEGA 1, believed to be the
world's first radio-controlled wristwatch.
1991 Linus Torvalds creates the first version of Linux, a collaboratively written computer
operating system.
1994 American-born mathematician John Daugman perfects the mathematics that make
iris scanning systems possible.
1994 Israeli computer scientists Alon Cohen and Lior Haramaty invent VoIP for sending
telephone calls over the Internet.
1995 Broadcast.com becomes one of the world's first online radio stations.
1995 Pierre Omidyar launches the eBay auction website.
1996 WRAL-HD broadcasts the first high-definition television (HDTV) signal in the United
States.
1997 Electronics companies agree to make Wi-Fi a worldwide standard for wireless
Internet.
21st century
2001 Apple revolutionizes music listening by unveiling its iPod MP3 music player.
2001 Richard Palmer develops energy-absorbing D3O plastic.
2001 The Wikipedia online encyclopedia is founded by Larry Sanger and Jimmy Wales.
2001 Bram Cohen develops BitTorrent file-sharing.
2001 Scott White, Nancy Sottos, and colleagues develop self-healing materials.
2002 iRobot Corporation releases the first version of its Roomba® vacuum cleaning robot.
2004 Electronic voting plays a major part in a controversial US Presidential Election.
2004 Andre Geim and Konstantin Novoselov discover graphene.
2005 A pioneering low-cost laptop for developing countries called OLPC is announced by
MIT computing pioneer Nicholas Negroponte.
2007 Amazon.com launches its Kindle electronic book (e-book) reader.
2007 Apple introduces a touchscreen cellphone called the iPhone.
2010 Apple releases its touchscreen tablet computer, the iPad.
2010 3D TV starts to become more widely available.
2013 Elon Musk announces "hyperloop"—a giant, pneumatic tube transport system.
2015 Supercomputers (the world's fastest computers) are now a mere 30 times less
powerful than the human brain.
2016 Three nanotechnologists win the Nobel Prize in Chemistry for building miniature
machines out of molecules.
2017 Quantum computing shows signs of becoming a practical technology.
Internet Technology
In the late 1960s, the first practical prototype of the Internet came about through the creation
of ARPANET, or the Advanced Research Projects Agency Network. Originally funded by the U.S.
Department of Defense, ARPANET used packet switching to allow multiple computers to
communicate on a single network. The technology continued to grow in the 1970s after
scientists Robert Kahn and Vinton Cerf developed Transmission Control Protocol and Internet
Protocol, or TCP/IP, a communications model that set standards for how data could be
transmitted between multiple networks. In 1971, Ray Tomlinson invented and developed what is
called electronic mail or email today, by creating ARPANET’s networked email system. The
concept of nearly instantaneous communication between machines within an organization
proved to be so beneficial and practical that the concept soon began to spread. In 1983,
ARPANET adopted TCP/IP, through which researchers assembled the “network of networks”
that became the modern Internet. Over the next few years, America Online (AOL), Echomail,
Hotmail and Yahoo shaped the Internet and email landscape. The online world then took on a
more recognizable form in 1990, when computer scientist Tim Berners-Lee invented the World
Wide Web. While it’s often confused with the Internet itself, the web is actually just the most
common means of accessing data online in the form of websites and hyperlinks. The web
helped popularize the Internet among the public, and served as a crucial step in developing the
vast trove of information that most of us now access on a daily basis.
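The TCP/IP model described above still underpins nearly all Internet communication today. As a minimal illustrative sketch (not part of the original module, and using only the Python standard library with a made-up message), TCP's reliable byte-stream abstraction can be demonstrated on a single machine: the sender writes bytes, and the receiver reads the same bytes, in order, regardless of how they were split into packets along the way.

```python
import socket
import threading

def echo_server(sock: socket.socket) -> None:
    """Accept one connection and echo back whatever arrives."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# TCP hides packet switching behind a reliable byte stream.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, network of networks")
    reply = client.recv(1024)
server.close()
print(reply.decode())  # hello, network of networks
```

The same socket interface works unchanged whether the two endpoints are on one machine or on opposite sides of the globe, which is exactly the interoperability that the TCP/IP standards made possible.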
Social Media
Since the birth of the Internet and the WWW, social media platforms have continuously evolved
(e.g., AOL, Yahoo Messenger, bulletin board forum systems, game-based social networking
sites, Facebook, Myspace, Viber, Skype, etc.). Social media is understood as the different forms
of online communication used by people to create networks, communities, and collectives to
share information, ideas, messages, and other content, such as videos. It has become an
integral part of people’s lives. They use it to connect with friends and family, to catch up on
current events, and, perhaps most importantly, to entertain themselves.
To develop and maintain a conducive online experience, internet etiquette, also known as
netiquette, must be observed. TheSpruce.com enumerates the following netiquette rules:
1. Be nice.
The first rule of internet etiquette is to be kind and courteous. Remember that
whatever you send from your keyboard or your phone is still an extension of you, even
though you're not with others in person. Never flame or rant in a public forum, and
avoid gossiping and cyberbullying.
4. Don’t shout.
Avoid using all caps in any email or post. It comes across as shouting, which is
rude.
5. Use discretion.
Whether you are sending email, instant messaging, commenting on Facebook,
adding images to Snapchat, or posting a message to your blog, you need to remember
that anything you put on the Internet can stay there forever. Even if you remove the
material, someone may have taken a screenshot or copied or saved it. One rule of thumb
many people use is never to post anything you wouldn't want your parents or boss to see.
8. Protect children.
If you allow children to access the Internet, make sure you know what sites they
visit and who their “friends” are.
However, if it continues and you feel as though you are being threatened, contact
the authorities. You need to make sure you protect yourself and your family.
LEARNING ACTIVITY
Reflection:
1. Which developments in the information age brought significant changes in the way you
live your life today?
2. Social media also poses certain risks especially in the dissemination of false or fake
information. As a student, how will you use social media to ensure that you do not
propagate inaccurate and unreliable information?