
History & Future of Computers


            To fully understand the fundamentals of information technology, you must first build a solid foundation by learning where computers came from and why they were originally invented.  The history and future of computers have been a long and fascinating journey.  Britannica.com defines a computer as “an apparatus that performs routine calculations automatically” (Pottenger & Hemmendinger, 2018).  It is thought that computers were first conceptualized in the late 1400s by Leonardo da Vinci.  By the late 1800s, machines were being invented to help calculate census data.  Once the first computers were built, computing technology grew at an exponential pace.  The late 1900s saw the rise of personal computing, and the 2000s saw the advent of the mobile computer.  With each evolution of computer technology, more of the science fiction dreams of the past become the technological realities of the future.
            Many historians believe the first plans for a computer were drawn up by Leonardo da Vinci.  That machine was never actually built, but scientists believe the plans would have produced a working computing device.  The first “computer” was not built until 1801, when the French inventor Joseph Marie Jacquard built a weaving loom that used punch cards as the program to automatically weave different patterns.  The world then had to wait almost 100 years for the first computer that performed calculations.  In 1884, Herman Hollerith designed a punch card system that was used to tabulate the 1890 U.S. census.  In 1936, Alan Turing presented the notion of a universal computing device; up until that point, each computer had been built for its own specific purpose.  When the Second World War consumed the world, University of Pennsylvania professors John Mauchly and J. Presper Eckert developed the Electronic Numerical Integrator and Computer (ENIAC), which is largely considered the grandfather of digital computers (Zimmermann, 2017).  During this period, computer components were made of vacuum tubes.  In 1953, Grace Hopper developed one of the first computer languages, which eventually became known as COBOL.  Just five years later, in 1958, Jack Kilby and Robert Noyce unveiled the integrated circuit, which became known as the computer chip.  By the start of the 1970s, each component of the modern computer had been developed or refined.
            Modern computing began in 1981, when IBM released its personal computer with the MS-DOS operating system.  A couple of years later, Apple released the Lisa with the first graphical user interface (GUI).  From there, the modern computer steadily grew in complexity.  After Apple released its GUI operating system, Microsoft announced the product that would compete with Apple’s OS, Microsoft Windows.  Before personal computers were released, only the government and some colleges had their own computers.  During that time, the internet was being developed, and the first dot-com domain was registered by Symbolics Computer Company in 1985.  By the mid-1990s, new microprocessors were allowing personal computers to handle complex graphics and play back digital music files.  Just before the turn of the 21st century, wireless networking, known as “Wi-Fi,” was introduced to the world.  Several years later, AMD released the first consumer 64-bit processor, the Athlon 64.  The 64-bit processor allowed computers to handle far more complex computations and to address far more memory.  Beginning in the 2010s, the Internet of Things (IoT) was introduced to the world.  The IoT consists of networked physical devices such as PCs, vehicles, home appliances, and any other items embedded with electronics that allow them to exchange data.  The future of computing is now about to change the world in much the same way that personal computing altered it in the 1990s.
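            As a rough illustration of why the jump from 32-bit to 64-bit processors mattered for memory, the short Python sketch below compares the theoretical address space of each word size (Python is the language used in the programming posts on this blog).  The figures are architectural limits, not the memory any particular machine actually shipped with, and the helper functions are only illustrative.

# Rough sketch: theoretical addressable memory for 32-bit vs. 64-bit processors.
# A processor with an n-bit flat address can refer to 2**n distinct byte addresses.

def addressable_bytes(address_bits):
    """Return the number of bytes an n-bit flat address space can reference."""
    return 2 ** address_bits

def human_readable(num_bytes):
    """Format a byte count using binary units (KiB, MiB, GiB, ...)."""
    units = ['B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB']
    value = float(num_bytes)
    for unit in units:
        if value < 1024 or unit == units[-1]:
            return '%.0f %s' % (value, unit)
        value /= 1024

for bits in (32, 64):
    limit = addressable_bytes(bits)
    print('%d-bit address space: 2**%d = %d bytes (~%s)' % (bits, bits, limit, human_readable(limit)))

# Printed result:
# 32-bit address space: 2**32 = 4294967296 bytes (~4 GiB)
# 64-bit address space: 2**64 = 18446744073709551616 bytes (~16 EiB)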
            The future of computing will start to bring the dreams of science fiction fans into reality.  There are many different technologies just on the horizon that will change society as we know it, and these devices will change how computer users perceive the information presented to them.  Augmented Reality (AR) overlays digital information on the world a person sees, with the help of wearable mobile devices.  With AR, engineers will be able to work on projects collaboratively and in real time, regardless of their geographical location.  Virtual Reality (VR) takes people to entirely different worlds and realities created with computer software, and it will allow people to visit anywhere in the world without leaving their living room.  Quantum computing is expected to increase computational speed for certain problems by an almost unfathomable factor, and once quantum computers are fully realized, they are expected to reshape how information is encrypted and secured.  The Defense Advanced Research Projects Agency (DARPA) is currently developing “Molecular Informatics,” which uses molecules to perform computerized calculations.
            To fully understand the fundamentals of information technology, a person needs a solid foundation of computer knowledge that begins with the history of computers and why they were invented.  Understanding that history makes it possible to see how modern computers evolved into the electronic toys of current society.  And once a person appreciates modern computing, they will begin to dream about the future of computers and the wonderful worlds they will be able to explore from their living rooms.

References

Pottenger, W. M., & Hemmendinger, D. (2018, May 03). Computer. Retrieved July 2, 2018, from https://www.britannica.com/technology/computer/History-of-computing
Zimmermann, K. A. (2017, September 06). History of Computers: A Brief Timeline. Retrieved July 2, 2018, from https://www.livescience.com/20718-computer-history.html
Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/
