Information Age

The Information Age[a] is a historical period that began in the mid-20th century. It is characterized by a rapid shift from the traditional industries established by the Industrial Revolution to an economy centered on information technology. The onset of the Information Age is often linked to the invention of the transistor in 1947.[2] This technological advance fundamentally changed how information is processed and transmitted.

According to the United Nations Public Administration Network, the Information Age was made possible by capitalizing on the miniaturization of computers and their components.[3] This allowed modern information systems and internet communication to become the driving force of social evolution.[4]

There is an ongoing debate over whether the Third Industrial Revolution has ended and whether the Fourth Industrial Revolution has already begun, owing to recent advances in fields such as artificial intelligence and biotechnology.[5] This next transition has been suggested to usher in the Imagination Age, the Internet of Things (IoT), and rapid advances in machine learning.

History

The digital revolution converted technology from a continuous analog format to a discrete digital format. This made it possible to produce copies that were identical to the original. In digital communication, for example, repeating hardware could amplify the digital signal and pass it along with no loss of information. Equally important to the revolution was the ability to move digital information easily between different media and to access or distribute it remotely. One turning point of the revolution was the change from analog to digitally recorded music.[6] During the 1980s, the digital format of optical compact discs (CDs) gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.[7]
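The practical difference between the two formats can be illustrated with a minimal, hypothetical sketch (the byte values and noise level below are invented for illustration and are not drawn from the article): a digital copy is bit-for-bit identical and can be verified with a checksum, while each analog re-recording adds a little noise, so copies of copies drift from the original.

```python
import hashlib
import random

# A short "recording": raw bytes for the digital case, float samples for the analog case.
original_bytes = bytes(range(256))

# Digital copy: a byte-for-byte duplicate is indistinguishable from the original,
# which a checksum comparison confirms.
digital_copy = bytes(original_bytes)
assert hashlib.sha256(digital_copy).digest() == hashlib.sha256(original_bytes).digest()

# Analog copy: every re-recording adds a little noise, so each generation drifts
# further from the original signal.
analog_signal = [float(b) for b in original_bytes]

def analog_copy(signal, noise=0.5):
    return [sample + random.uniform(-noise, noise) for sample in signal]

second_generation = analog_copy(analog_copy(analog_signal))

print("digital copy identical:", digital_copy == original_bytes)           # True
print("max analog drift after two copies:",
      max(abs(a - b) for a, b in zip(second_generation, analog_signal)))   # greater than 0
```

This is the property the paragraph above describes: digital repeaters can regenerate the exact bit pattern, whereas analog amplification also amplifies any accumulated noise.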

Previous inventions

Humans have been making tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical clocks. More complex tools started to appear in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the Industrial Revolution had led to mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical, general-purpose computer called the Analytical Engine, but it was never built successfully and was largely forgotten by the 20th century. Most inventors of modern computers were unaware of it.

The Second Industrial Revolution, in the last quarter of the 19th century, produced practical electrical circuits and the telegraph. In the 1880s, Herman Hollerith created electromechanical counting and calculating devices using punch cards and unit record equipment, which became widely used in business and government.

At the same time, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and find answers. These included a tide-predicting machine (1872), differential analyzers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The building of analog computers for specific problems continued into the late 1940s and beyond with FERMIAC for neutron transport, Project Cyclone for military uses, and the Phillips Machine for economic modeling.

Building on the complexity of the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete the Z3 in 1941, the world's first working, programmable, fully automatic digital computer. Also during World War II, Allied engineers built electromechanical bombes to break German Enigma machine codes. The base-10 electromechanical Harvard Mark I, completed in 1944, drew in part on ideas from Charles Babbage's earlier designs.

1947–1969: Origins

In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs.[8] This paved the way for more advanced digital computers. From the late 1940s, universities, the military, and businesses developed computer systems to digitally replicate and automate mathematical calculations that had previously been done by hand, with the LEO being the first commercially available general-purpose computer.

Digital communication became affordable enough for widespread use after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is credited with laying the foundations of digitalization in his landmark 1948 article, "A Mathematical Theory of Communication".[9]
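For context, Shannon's paper introduced the bit as the unit of information and defined the entropy of a message source, the quantity that sets the limit on how compactly data can be encoded. In its standard form (summarized here, not quoted from the article):

\[
H(X) = -\sum_{i} p_i \log_2 p_i ,
\]

where \(p_i\) is the probability of the source emitting its \(i\)-th symbol and \(H(X)\) is measured in bits per symbol.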

In 1948, Bardeen and Brattain patented an insulated-gate transistor (IGFET) with an inversion layer. Their idea forms the basis of today's CMOS and DRAM technology.[10] In 1957 at Bell Labs, Frosch and Derick were able to make planar silicon dioxide transistors.[11] Later, a team at Bell Labs showed a working MOSFET (Metal-Oxide-Semiconductor Field-Effect Transistor).[12] The first key step for the integrated circuit was achieved by Jack Kilby in 1958.[13]

Other important technological steps included the invention of the single-piece integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959,[14] which was made possible by the planar process developed by Jean Hoerni.[15] In 1963, complementary MOS (CMOS) was developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor.[16] The self-aligned gate transistor, which made mass production even easier, was invented in 1966 by Robert Bower at Hughes Aircraft[17][18] and separately by Robert Kerwin, Donald Klein, and John Sarace at Bell Labs.[19]

In 1962, AT&T introduced the T-carrier for long-distance digital voice transmission using pulse-code modulation (PCM). The T1 format carried 24 time-division multiplexed PCM speech signals, each encoded as a 64 kbit/s stream, plus 8 kbit/s of framing information for synchronization. Over the following decades, digitized voice became the standard for nearly all transmissions except the final leg of the connection, where analog remained the norm until the late 1990s.
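The widely cited T1 line rate of 1.544 Mbit/s follows directly from these figures, as a quick check of the arithmetic shows:

\[
24 \times 64\ \text{kbit/s} + 8\ \text{kbit/s} = 1536\ \text{kbit/s} + 8\ \text{kbit/s} = 1544\ \text{kbit/s} = 1.544\ \text{Mbit/s}.
\]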

Following the development of MOS integrated circuit chips in the early 1960s, MOS chips achieved a higher density of transistors and lower manufacturing costs than older bipolar integrated circuits by 1964. MOS chips continued to grow in complexity at the rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The use of MOS LSI chips in computing formed the basis for the first microprocessors, as engineers started to realize that an entire computer processor could fit on a single MOS LSI chip.[20] In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor.[21] It was released by Intel in 1971 and set the stage for the microcomputer revolution that began in the 1970s.
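Moore's law, referenced above, is the observation that the number of transistors that can be placed on a chip doubles at a roughly fixed interval. Using the commonly quoted two-year doubling period (a conventional figure, not a claim made in this article), the trend can be written as

\[
N(t) \approx N_0 \cdot 2^{\,t/2},
\]

where \(N_0\) is the transistor count in a reference year and \(t\) is the number of years since then.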

MOS technology also led to the development of semiconductor image sensors suitable for digital cameras.[22] The first such image sensor was the charge-coupled device (CCD), developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969,[23] based on MOS capacitor technology.[22]

1969–1989: Invention of the Internet, rise of home computers

The public first learned about the ideas that led to the Internet when a message was sent over the ARPANET in 1969. Packet-switched networks like ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using various protocols. ARPANET, in particular, led to the creation of protocols for internetworking, which allowed many separate networks to be connected into one network of networks.

The Whole Earth movement of the 1960s promoted the use of new technology.[24]

The 1970s saw the introduction of the home computer,[25] time-sharing computers,[26] the video game console, and the first coin-operated video games;[27][28] the golden age of arcade video games began with Space Invaders. As digital technology spread and the switch from analog to digital record keeping became the new business standard, a relatively new job became popular: the data entry clerk. Drawn from the ranks of secretaries and typists, data entry clerks converted analog data (such as customer records and invoices) into digital data.

In developed nations, computers became quite common during the 1980s as they found their way into schools, homes, businesses, and industry. Automated teller machines (ATMs), industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all contributed to the spirit of the 1980s. Millions of people bought home computers, making early personal computer makers like Apple, Commodore, and Tandy well-known names. The Commodore 64 is often cited as the best-selling computer of all time, reportedly selling 17 million units[29] between 1982 and 1994.

In 1984, the U.S. Census Bureau began gathering data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and households with children under 18 were nearly twice as likely to own one at 15.3% (middle and upper-middle-class households were the most likely to own one, at 22.9%).[30] By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under 18 owned one.[31] By the late 1980s, many businesses relied on computers and digital technology.

Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the 2G network launched in Finland to meet the unexpected demand for cell phones that had become apparent in the late 1980s.

Compute! magazine predicted that the CD-ROM would be the central part of the revolution, with many household devices reading the discs.[32]

The first true digital camera was created in 1988, and the first were sold in December 1989 in Japan and in 1990 in the United States.[33] By the early 2000s, digital cameras had become more popular than traditional film cameras.

Digital ink and paint also appeared in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in The Little Mermaid (1989) and for all of the studio's animated films between The Rescuers Down Under (1990) and Home on the Range (2004).

1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0

Tim Berners-Lee invented the World Wide Web in 1989.[34] The "Web 1.0 era" ended in 2005, around the time more advanced technologies began to develop at the start of the 21st century.[35]

The first public digital HDTV broadcast was of the 1990 World Cup that June, shown in 10 theaters in Spain and Italy. However, HDTV did not become a standard until the mid-2000s outside of Japan.

The World Wide Web became publicly available in 1991, having previously been accessible only to governments and universities.[36] In 1993, Marc Andreessen and Eric Bina introduced Mosaic, the first web browser capable of displaying inline images[37] and the foundation for later browsers such as Netscape Navigator and Internet Explorer. Stanford Federal Credit Union was the first financial institution to offer online internet banking services to all of its members, in October 1994.[38] In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe.[39] The Internet expanded quickly, and by 1996 it was part of popular culture, with many businesses listing websites in their advertisements.[source?] By 1999, almost every country had an internet connection, and nearly half of Americans and people in several other countries used the Internet regularly.[source?] Throughout the 1990s, however, "getting online" required complicated setup, and dial-up was the only connection type affordable to individuals; the present-day mass Internet culture was not yet possible.

In 1989, about 15% of all U.S. households owned a personal computer.[40] Among households with children, nearly 30% owned a computer in 1989, a figure that reached 65% in 2000.

Cell phones became as common as computers by the early 2000s, with movie theaters starting to show ads asking people to silence their phones. They also became much more advanced than the phones of the 1990s, most of which only made calls or at most allowed for simple games.

Text messaging became widely used globally in the late 1990s, except in the United States of America, where it did not become common until the early 2000s.[source?]

The digital revolution also became truly global during this time. After changing society in the developed world in the 1990s, the digital revolution spread to the general population in the developing world in the 2000s.

By 2000, a majority of U.S. households had at least one personal computer, and a majority had internet access the following year.[41] In 2002, a majority of U.S. survey respondents reported having a mobile phone.[42]

2005–present: Web 2.0, social media, smartphones, digital TV

In late 2005, the number of Internet users reached 1 billion,[43] and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcast format in many countries by the end of the decade. In September and December 2006, respectively, Luxembourg and the Netherlands became the first countries to switch completely from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home.[44] According to Nielsen Media Research estimates, around 45.7 million U.S. households in 2006 (about 40 percent of 114.4 million) owned a dedicated home video game console,[45][46] and by 2015, 51 percent of U.S. households owned one, according to an Entertainment Software Association report.[47][48] By 2012, over 2 billion people used the Internet, twice the number in 2007. Cloud computing had become widely used by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone.[49] By 2016, half of the world's population was connected,[50] and as of 2020 that figure had risen to 67%.[51]

Social and economic impact

The Information Age has led to a knowledge-based society, driven by computerization and the ability to process, share, and access information instantly.[52] The growth of the Internet has played a major role in this change.[53]

In 1996, the Canadian government recognized the profound shift in the economy by publishing a report titled The Information Highway, which summarized the effects of a rapidly changing economic structure:

"The economic shift from the industrial age to the information age is rooted in two intersecting factors: information technology (especially the computer), and the network, or the ability of information technology to connect computers and people in a global web. The ability of the Information Highway to transmit and process vast amounts of data at ever-increasing speeds is a driving force in the restructuring of the global economy. It has been argued that the importance of the Information Highway is that it provides greater opportunities for human resources to be used to create greater wealth. Information Highway technologies allow firms to reduce inventory and organizational costs by managing a variety of business operations in a more efficient manner. Furthermore, these technologies provide opportunities for Canadian firms to gain new clients through an expanded market base, and can result in significant productivity gains, particularly for small and medium-sized firms. Ultimately, the growth of the Information Highway will contribute to economic growth in Canada in the 21st century by providing the necessary infrastructure for the development of the high-value-added knowledge-based economy."[54]

Economic effects

The development of the transistor and the integrated circuit (IC) was essential for the creation of new technologies, including the personal computer and the mobile phone.[55] These inventions created new jobs and changed existing industries, such as the banking sector, which introduced technologies like ATMs and electronic funds transfer, allowing services to be available around the clock.[56] The Information Age has impacted other sectors, including defense, by providing command, control, communications, and intelligence support; manufacturing, through process automation and robotics; and medicine, with improved medical imaging and diagnostic technology.[57]

The Internet has also transformed many industries, leading to new ways of shopping and media consumption.[58]

Telecommuting

The ability to work from home, or telecommuting, has become common. By 2003, 24% of Americans reported they were telecommuting, and over the next three years, the number of telecommuters grew by about 1.5 million people per year.[59] A 2009 survey by the U.S. Bureau of Labor Statistics found that 21% of employed people did some or all of their work from home.[60]

Developing world

The spread of information technology is also seen in the developing world. The number of mobile phones in Africa increased from 15 million in 2002 to 650 million in 2012.[61] This growth has led to increased market efficiency, for example, by giving farmers access to up-to-date pricing information for their goods.[62]

Social and ethical challenges

The Information Age has brought with it several concerns, including the digital divide, which is the gap between those who have access to modern information technology and those who do not.[63] Other issues include challenges to intellectual property and data privacy, as information can be easily copied and shared globally.[64]

Privacy

The easy collection and sharing of personal data, especially on the Internet, has raised major concerns about privacy.[65] The rise of digital technology means that governments and companies can gather massive amounts of information about individuals, leading to worries about surveillance and the misuse of this data.[66]

Intellectual property

The Information Age has made it easier and cheaper to copy digital media, creating conflicts over copyright and other forms of intellectual property.[67] The music, film, and software industries have faced challenges from illegal file-sharing and online piracy. Organizations have responded by introducing Digital Rights Management (DRM) technologies to control how digital content is used, but these measures are often criticized for limiting fair use.

Freedom of speech and censorship

The Internet has provided new opportunities for freedom of speech and sharing information across borders, but it has also enabled new forms of censorship and information control by governments and powerful organizations.[68] Debates continue over who should regulate online content and balance free expression with preventing harm or illegal activity.

Workforce and employment

As technology automates more tasks, there are concerns about its effect on employment and the need for new skills. While some jobs, such as data entry clerk, declined, new ones in areas like software development, data analysis, and network administration emerged.[69] The shift to a knowledge-based economy requires workers to continuously learn new skills to stay relevant.[70]

Further developments and future outlook

The Information Age continues to evolve rapidly. Developments such as the growth of the Internet of Things (IoT), which connects everyday objects to the Internet, and advancements in artificial intelligence (AI) are expected to drive the next phase of this era, potentially leading to the Fourth Industrial Revolution.[5]

The increasing power of technology and the vast amounts of data being generated suggest that the changes seen so far are only the beginning of a larger transformation in society and the global economy.
