Welcome back. In this video,
we'll be learning how huge devices like the Analytical Engine grew,
I mean, shrunk into the computing devices that we use today.
Computing had been developing steadily since the invention
of the Analytical Engine, but it didn't make a huge leap forward until World War II.
Back then, research into computing was super expensive;
electronic components were large, and you
needed lots of them to compute anything of value.
This also meant that computers took up a ton of space and
many efforts were underfunded and unable to make headway.
When the war broke out, governments started pouring money
and resources into computing research.
They wanted to help develop technologies that would give
them advantages over other countries.
Lots of efforts were spun up, and advancements were made in fields like cryptography.
Cryptography is the art of writing and solving codes.
During the war, computers were used to process
secret messages from enemies faster than a human could ever hope to do.
Today, the role cryptography plays in secure communication is
a critical part of computer security which we'll learn more about in a later course.
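To make "writing and solving codes" a bit more concrete, here's a minimal sketch in Python of a Caesar cipher, a simple letter-shifting code that long predates the war; it's a toy illustration only, nothing like the rotor-based Enigma encryption.

    def caesar_shift(message, shift):
        # Shift each letter by `shift` places in the alphabet, wrapping
        # around from Z back to A; spaces and punctuation pass through.
        result = []
        for ch in message:
            if ch.isalpha():
                base = ord('A') if ch.isupper() else ord('a')
                result.append(chr((ord(ch) - base + shift) % 26 + base))
            else:
                result.append(ch)
        return "".join(result)

    secret = caesar_shift("ATTACK AT DAWN", 3)   # "writing" the code: DWWDFN DW GDZQ
    plain = caesar_shift(secret, -3)             # "solving" it: shift back the other way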
For now, let's look at how computers started to make a dramatic impact on society.
First up is Alan Turing,
an English mathematician and now famous computer scientist.
He helped develop the top-secret Bombe machine, which
helped Allied forces decode Axis messages encrypted by the German Enigma machine during World War II.
Breaking the Enigma code is just one example of how
governments started to recognize the potential of computation.
After the war, companies like IBM, Hewlett-Packard,
and others were advancing their technologies into the academic,
business, and government realms.
Lots of technological advancements in computing were made in
the 20th century, thanks to the direct interest of governments,
scientists, and companies carried over from World War II.
These organizations invented new methods to store data in
computers which fueled the growth of computational power.
Consider this: until the 1950s, punch cards were a popular way to store data.
Operators would have decks of ordered punch cards that were used for data processing.
If they dropped the deck by accident and the cards got out of order,
it was almost impossible to get them sorted again.
There were obviously some limitations to punch cards,
but thanks to new technological innovations like magnetic tape and its counterparts,
people began to store more data on more reliable media.
Magnetic tape stored data by magnetizing tiny particles on a reel of tape.
Back in the 1970s and '80s,
people listened to music on vinyl records or cassette tapes.
The cassette, in particular, is a relic that shows how magnetic tape
can store information that a machine can later read back.
This left stacks and stacks of punch cards to collect dust while
their new magnetic tape counterparts began to revolutionize the industry.
I wasn't joking when I said early computers took up a lot of space.
They had huge machines to read data and racks of vacuum tubes that helped move that data.
Vacuum tubes controlled the flow of electricity in
all sorts of electronic equipment, like televisions and radios,
but these specific vacuum tubes were bulky and broke all the time.
Imagine what the work of an I.T.
support specialist was like in those early days of computing.
The job description might have included crawling around
inside huge machines filled with dust and creepy crawly things,
or replacing vacuum tubes and swapping out those punch cards.
In those days, doing some debugging might have taken on a more literal meaning.
Renowned computer scientist Admiral Grace Hopper had
a favorite story involving some engineers working on the Harvard Mark II computer.
They were trying to figure out the source of the problems in a relay.
After doing some investigating,
they discovered the source of their trouble was a moth,
a literal bug in the computer.
The ENIAC was one of the earliest forms of general purpose computers.
It was a wall-to-wall convolution of massive electronic components and wires.
It had 17,000 vacuum tubes and took up about 1,800 square feet of floor space.
Imagine if you had to work with that scale of equipment today.
I wouldn't want to share an office with 1,800 square feet of machinery.
Eventually, the industry started using transistors to control the flow of electricity.
Transistors are now a fundamental component of all electronic devices.
Transistors perform almost the same functions as
vacuum tubes but they are more compact and more efficient.
You can easily have billions of transistors in a small computer chip today.
Throughout the decades, more and more advancements were made.
The very first compiler was invented by Admiral Grace Hopper.
Compilers made it possible to translate
human-readable source code, written in a programming language, into machine code.
In case you didn't totally catch that,
we'll talk more about compilers later in this course.
The big takeaway is that this advancement was
a huge milestone in computing that led to where we are today.
Now, learning programming languages is accessible for almost anyone anywhere.
We no longer have to learn how to write machine code in ones and zeros.
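To give a rough feel for that translation, here's a small sketch using Python's built-in dis module, which prints the lower-level instructions a human-readable function is turned into. (One caveat: Python compiles to bytecode for a virtual machine rather than directly to a CPU's machine code, but the idea of source code in, low-level instructions out, is the same.)

    import dis

    def add(a, b):
        # One line of human-readable source code...
        return a + b

    # ...and the lower-level instructions it's translated into,
    # such as LOAD_FAST and RETURN_VALUE.
    dis.dis(add)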
You get to see these languages in action in
future lessons where you'll write some code yourself.
Side note, if the thought of that scares you,
don't worry, we'll help you every step of the way.
But for now, let's get back to the evolution of computers.
Eventually, the industry gave way to the first hard disk drives and microprocessors.
Then, programming languages started becoming
the predominant way for engineers to develop computer software.
Computers were getting smaller and smaller,
thanks to advancements in electronic components.
Instead of filling up entire rooms like ENIAC,
they were getting small enough to fit on tabletops.
The Xerox Alto was the first computer
that resembled the computers we're familiar with now.
It was also the first computer to implement
a graphical user interface that used icons, a mouse, and windows.
Some of you may remember that the sheer size and cost of
historical computers made it almost impossible for an average family to own one.
Instead, they were usually found in military and university research facilities.
When companies like Xerox started building machines at
a relatively affordable price and at a smaller form factor,
the consumer age of computing began.
Then in the 1970s,
a young engineer named Steve Wozniak invented the Apple I,
a single-board computer meant for hobbyists.
Together with his friend Steve Jobs,
he created a company called Apple Computer.
Their follow up to the Apple I,
the Apple II, was ready for the average consumer to use.
The Apple II was a phenomenal success,
selling for nearly two decades and giving
a new generation of people access to personal computers.
For the first time, computers became affordable for
the middle class and helped bring computing technology into both the home and office.
In the 1980s, IBM introduced its personal computer.
It was released with a primitive version of an operating system
called MS-DOS, or Microsoft Disk Operating System.
Side note, modern operating systems don't just display text anymore;
they have beautiful icons, words,
and images like what we see on our smartphones.
It's incredible how far we've come from
the first operating system to the operating systems we use today.
Back to IBM's PC,
it was widely adopted and made more accessible to consumers,
thanks to a partnership with Microsoft.
Microsoft, founded by Bill Gates and Paul Allen,
eventually created Microsoft Windows.
For decades it was the preferred operating system in the workplace and
dominated the computing industry because it could be run on any compatible hardware.
With more computers in the workplace, the dependence on I.T.
rose and so did the demand for skilled workers who could support that technology.
Not only were personal computers entering the household for the first time,
but a new type of computing was emerging: video games.
During the 1970s and '80s,
coin-operated entertainment machines, found in arcades, became more and more popular.
A company called Atari developed one of
the first coin-operated arcade games in 1972 called Pong.
Pong was such a sensation that people were standing in
lines at bars and rec centers for hours at a time to play.
Entertainment computers like Pong launched the video game era.
Eventually, Atari went on to launch
the Video Computer System, which helped bring personal video game consoles into the home.
Video games have contributed to the evolution of computers in a very real way;
tell that to the next person who dismisses them as a toy.
Video games showed people that computers didn't always have to be all work and no play;
they were a great source of entertainment too.
This was an important milestone for the computing industry,
since at that time,
computers were primarily used in the workplace or at research institutions.
With huge players in the market like Apple Macintosh and
Microsoft Windows taking over the operating systems space,
a programmer by the name of Richard Stallman started
developing a free Unix-like operating system.
Unix was an operating system developed by Ken Thompson and Dennis Ritchie,
but it wasn't cheap and wasn't available to everyone.
Stallman created an OS that he called GNU.
It was meant to be free to use with similar functionality to Unix.
Unlike Windows or Macintosh,
GNU wasn't owned by a single company;
its code was open source, which meant that anyone could modify and share it.
GNU didn't evolve into a full operating system,
but it set a foundation for the formation of one
of the largest open source operating systems:
Linux, which was created by Linus Torvalds.
We'll get into the technical details of Linux later in this course,
but just know that it's a major player in today's operating systems. As an I.T.
support specialist, it's very likely that you'll work with open source software.
You might already be using one like the internet browser Mozilla Firefox.
By the early '90s, computers started getting even smaller.
Then a real game changer made its way onto the scene:
PDAs, or personal digital assistants,
which allowed computing to go mobile.
These mobile devices included portable media players, word processors,
email clients, Internet browsers,
and more, all in one handy handheld device.
In the late 1990s,
Nokia introduced a PDA with mobile phone functionality.
This ignited an industry of pocketable computers or, as we know them today, smartphones.
In mere decades, we went from having computers that weighed tons and
took up entire rooms to having powerful computers that fit in our pockets.
It's almost unbelievable, and it's just the beginning.
If you're stepping into the I.T.
industry, it's essential that you understand how
to support the growing needs of this ever-changing technology.
Computer support 50 years ago consisted of
changing vacuum tubes and stacking punch cards,
things that no longer exist in today's I.T.
world. As computers evolved in both complexity and prevalence,
so did the knowledge required to support and maintain them.
In 10 years, I.T.
support could require working through virtual reality lenses. You never know.
Who knows what the future holds?
But right now, it is an exciting time to be at the forefront of this industry.
Now that we've run down where computers came
from and how they've evolved over the decades,
let's get a better grasp on how computers actually work.