Who Invented the Computer? When Was the Computer First Invented, and How? The History of the Computer

A computer is a device that can store the information we give it and return it whenever we want. Today's computers can follow generalized sets of operations called programs, and these programs enable them to perform a wide variety of tasks. A complete computer containing the hardware, the operating system (the main software), and the peripheral equipment required for "full" operation can be referred to as a computer system. The term can also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster. ENIAC is often cited as the first electronic computer.

Computers have appeared in many different forms throughout history. The first computers of the mid-20th century were the size of a large room and consumed hundreds of times more power than today's machines. By the beginning of the 21st century, computers could fit in a wristwatch and run on a small battery. The main reason they can be built so small is that, by 1969, circuits that fit into very small spaces could be made with semiconductors. Computers gained considerable speed after the Intel 4004, Intel's first microprocessor. Society came to see the personal computer and its portable equivalent, the laptop, as symbols of the information age and to identify them with the very concept of the computer; they are widely used today. The basic working principle of the computer is the binary number system, that is, codes consisting only of 0s and 1s.

The ability to store a desired program and run it at any time is the main feature that makes computers versatile and distinguishes them from calculators. The Church-Turing thesis is a mathematical expression of this versatility and underlines that any computer can perform the tasks of any other. Therefore, whatever their complexity, from pocket computers to supercomputers, they can all perform the same tasks, provided limits of memory and time are set aside.

History of the Computer

Many devices referred to as "computers" in the past do not deserve this name by today's criteria. Originally, the word "computer" was used for things that facilitated the process of calculation. Examples from this early period include the abacus and the Antikythera mechanism (150 BC - 100 BC). Centuries later, in the light of the new scientific discoveries at the end of the Middle Ages, the first in a series of mechanical calculating devices developed by European engineers was built by Wilhelm Schickard (1623).

However, none of these devices meets today's definition of a computer, because none of them was programmable. The punched cards produced by Joseph Marie Jacquard in 1801 to automate the operation of the weaving loom are regarded as one of the first traces of programmability in the development of computers, albeit a limited one. Thanks to these cards supplied by the user, the loom could adapt its operation to the pattern described by the holes on the card.

In 1837, Charles Babbage conceptualized and designed the first fully programmable mechanical computer, which he called the Analytical Engine. However, he was unable to build the machine, for financial reasons and because he could not bring his work on it to completion.

The first large-scale use of punched cards was the tabulating machine designed by Herman Hollerith in 1890 for use in accounting work. The company Hollerith founded later became part of IBM, which would become a global computer giant in the years that followed. By the end of the 19th century, the technologies that would contribute greatly to the development of computing hardware and theory had begun to emerge: punched cards, Boolean algebra, vacuum tubes, and teletype devices.

In the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers. However, these were still far from the precision and reliability of today's digital computers.

Computing machinery continued to improve throughout the 1930s and 1940s, but the digital electronic computer arrived only after the invention of electronic calculating circuits (1937). Important works of this period include the following:

  • Konrad Zuse's "Z machines". The Z3 (1941) was the first machine that operated on binary numbers and could work with real (floating-point) numbers. In 1998 the Z3 was shown to be Turing complete, earning it the title of the first computer.
  • The Atanasoff-Berry Computer (1941) was based on vacuum tubes and used binary numbers as well as capacitor-based memory.
  • The British-built Colossus computer (1944) showed that, despite its limited programmability, the use of thousands of vacuum tubes could yield sufficiently reliable results. It was used in World War II to analyze the encrypted communications of the German armed forces.
  • The Harvard Mark I (1944), a computer with limited programmability.
  • ENIAC (1946), developed for the US Army, worked with decimal numbers and was the first general-purpose electronic computer.

Identifying the shortcomings of ENIAC, its developers worked on a more flexible and elegant solution and proposed what is now known as the stored-program architecture, more commonly called the von Neumann architecture. This design was first described in a 1945 publication by John von Neumann, and the first computer built on it was completed in the United Kingdom (the SSEM). EDVAC, ENIAC's successor built on the same architecture, followed a year later.

Since almost all of today's computers conform to this architecture, it has also come to serve as the modern definition of the word "computer". By this definition, the earlier devices do not count as computers, yet they are still referred to as such in their historical context. Although computer implementations have undergone fundamental changes since the 1940s, most have remained true to the von Neumann architecture.

Vacuum-tube-based computers remained in use throughout the 1950s, before being largely replaced in the 1960s by faster and cheaper transistor-based computers, which made mass production of computers possible on an unprecedented scale. By the 1970s, the adoption of integrated circuits and the development of microprocessors such as the Intel 4004 brought another huge increase in performance and reliability, as well as a reduction in cost. In the 1980s, computers began to appear in the control equipment of many mechanical devices of daily life, such as washing machines; during the same period, personal computers gained popularity. Finally, with the development of the Internet in the 1990s, computers became as familiar a device as the television and the telephone.

According to the von Neumann architecture, computers consist of four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input/output devices.
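
As an informal illustration of how these four components cooperate (this sketch is not from the original article), the short C program below models a toy von Neumann-style machine: a single array serves as memory for both program and data, a loop plays the role of the control unit, and an addition stands in for the ALU. The instruction encoding and the opcodes are invented purely for this example.

    #include <stdio.h>

    /* A minimal sketch of a von Neumann-style machine: one memory array
     * holds both the program and the data, and a fetch-decode-execute
     * loop drives the arithmetic logic unit (ALU).  The instruction
     * encoding and opcodes are hypothetical, chosen for illustration. */

    enum { OP_HALT = 0, OP_ADD = 1 };   /* hypothetical opcodes */

    int main(void) {
        int mem[128] = {0};

        /* Program, stored in memory starting at address 0.
         * Each instruction occupies 4 cells: opcode, dst, src1, src2. */
        int program[] = {
            OP_ADD, 110, 100, 101,   /* mem[110] = mem[100] + mem[101] */
            OP_HALT, 0, 0, 0
        };
        for (int i = 0; i < 8; i++)
            mem[i] = program[i];

        /* Data, stored in the same memory as the program. */
        mem[100] = 40;
        mem[101] = 2;

        /* Control unit: fetch, decode, and execute until HALT. */
        for (int pc = 0; mem[pc] != OP_HALT; pc += 4) {
            if (mem[pc] == OP_ADD)                      /* ALU operation */
                mem[mem[pc + 1]] = mem[mem[pc + 2]] + mem[mem[pc + 3]];
        }

        printf("mem[110] = %d\n", mem[110]);            /* output device */
        return 0;
    }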

Memory

A computer's memory can be thought of as a collection of cells that contain numbers. Each cell can be written to and its contents can be read, and each cell has a unique address. One command might be, for example, to add the contents of cell number 34 to the contents of cell number 5,689 and place the result in cell 78. The numbers a cell holds can stand for anything: a number, a command, an address, a letter, and so on; only the software that uses the cell determines the nature of its content. Most of today's computers use binary numbers to store data, and each cell holds 8 bits (i.e., one byte).

A byte can therefore represent 256 different values, for example the integers from 0 to 255 or from -128 to +127. When several bytes are used side by side (usually 2, 4 or 8), much larger numbers can be stored. The memory of a modern computer contains billions of bytes.
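
To make these ranges concrete, here is a small C sketch (added for illustration, not part of the original article) showing what one byte can hold and how placing several bytes side by side yields a much larger range.

    #include <stdio.h>
    #include <stdint.h>

    /* What a single byte can represent, and how several bytes side by
     * side represent larger numbers. */
    int main(void) {
        uint8_t unsigned_byte = 255;    /* one byte read as 0..255           */
        int8_t  signed_byte   = -128;   /* the same 8 bits read as -128..127 */

        printf("unsigned byte range: 0..%d\n", (int)unsigned_byte);
        printf("signed byte range:   %d..127\n", (int)signed_byte);

        /* Four bytes placed side by side form one 32-bit number, giving
         * 256 * 256 * 256 * 256 = 4,294,967,296 possible values. */
        uint8_t  bytes[4] = { 0x12, 0x34, 0x56, 0x78 };
        uint32_t combined = ((uint32_t)bytes[0] << 24) |
                            ((uint32_t)bytes[1] << 16) |
                            ((uint32_t)bytes[2] <<  8) |
                             (uint32_t)bytes[3];

        printf("four bytes combined: %lu (0x%lX)\n",
               (unsigned long)combined, (unsigned long)combined);
        return 0;
    }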

Computers have three types of memory. The registers inside the processor are extremely fast but have a very limited capacity; they hold the values the processor needs immediately, so that it accesses the much slower main memory less often. Main memory is divided into Random Access Memory (RAM) and Read Only Memory (ROM). RAM can be written to at any time, and its content is preserved only as long as the power stays on. ROM contains pre-loaded information that can only be read, and it preserves this content even without power. For example, programs and data reside in RAM, while the BIOS, which initializes the computer hardware, is stored in ROM.

A final subtype of memory is cache memory. It is located in the processor and is faster than main memory, as well as having a larger capacity than registers.

Input/output (I/O) is the means by which a computer exchanges data with the outside world. Commonly used input units include the keyboard and the mouse; common output units are the screen (monitor), the speaker, and the printer. Hard disks and optical discs, on the other hand, serve both purposes.

Computer networks

Computers have been used since the 1950s to coordinate information between multiple locations. The US military's SAGE system was the first comprehensive example of such a system and pioneered a number of special-purpose commercial systems such as SABRE. In the 1970s, American engineers laid the foundations of what we now call the computer network by connecting computers within a military project (ARPANET). Over time this network was no longer limited to military and academic institutions; it expanded into today's Internet, connecting millions of computers. In the 1990s, computer networking became widespread thanks to the World Wide Web (WWW) protocols developed at the CERN research center in Switzerland, applications such as e-mail, and cheap hardware such as Ethernet.

Hardware

The concept of hardware encompasses all the tangible, physical components of a computer.

Hardware examples
  • Peripheral units (input / output)
      Input: Mouse, Keyboard, Joystick, Image scanner
      Output: Monitor, Printer, Speaker
      Both: Floppy drive, Hard disk, Optical disc
  • Connection units
      Short range: RS-232, SCSI, PCI, USB
      Long range (computer networks): Ethernet, ATM, FDDI

Input / output units

Input/output enables communication between the different functional units (subsystems) of a data-processing system, or between the system and the outside world, through information signals sent via these interfaces.

Inputs are the signals received from these units; outputs are the signals sent to them. I/O devices are used by a person (or another system) to communicate with the computer. For example, the keyboard and the mouse are input devices, while the screen, the speaker, and the printer are output devices. Some devices, such as modems and network cards, handle both input and output.

The keyboard and mouse take the physical movements of the user as input and convert them into signals that the computer can understand. Output units (such as the printer, speaker, and screen) take the output signals produced by the computer and convert them into output that users can see, hear, or read.

In computer architecture, the central processing unit (CPU) and the main memory form the heart of the computer, because the CPU can read data from memory and write data back to it directly with its own instructions. A device such as a floppy drive, by contrast, is reached through I/O. The I/O facilities provided by the CPU are what device drivers build on in low-level programming.

Higher-level operating systems and programming languages let programs work with idealized I/O concepts, separate from the low-level details. For example, the C programming language contains functions for handling a program's I/O; these functions allow data to be read from files and written into them.
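
As a brief example of these standard-library facilities, the following C program writes a line into a file and reads it back using fopen, fprintf, fgets, and fclose; the file name example.txt is chosen arbitrarily here.

    #include <stdio.h>

    /* Write data into a file, then read it back, using the file I/O
     * functions of the C standard library. */
    int main(void) {
        /* Write data into a file. */
        FILE *out = fopen("example.txt", "w");
        if (out == NULL) {
            perror("fopen for writing failed");
            return 1;
        }
        fprintf(out, "Hello, file I/O!\n");
        fclose(out);

        /* Read the data back from the same file. */
        char line[128];
        FILE *in = fopen("example.txt", "r");
        if (in == NULL) {
            perror("fopen for reading failed");
            return 1;
        }
        while (fgets(line, sizeof line, in) != NULL)
            printf("read: %s", line);
        fclose(in);

        return 0;
    }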

Software

The concept of software covers all the intangible components of a computer: programs, protocols, and data are all software.

Software examples
  • Operating systems
      Unix / BSD: UNIX System V, AIX, HP-UX, Solaris (SunOS), FreeBSD, NetBSD, IRIX
      GNU / Linux: Linux distributions
      Microsoft Windows: Windows 3.0, Windows 3.1, Windows 95, Windows 98, Windows NT, Windows CE, Windows XP, Windows Vista, Windows 7, Windows 8, Windows 8.1, Windows 10
      DOS: DOS/360, QDOS, DR-DOS, PC-DOS, MS-DOS, FreeDOS
      Mac OS: Mac OS X
      Embedded and real-time: Embedded operating systems
  • Libraries
      Multimedia: DirectX, OpenGL, OpenAL
      Software libraries: C standard library
  • Data
      Communication protocols: TCP/IP, Kermit, FTP, HTTP, SMTP, NNTP
      Document formats: HTML, XML, JPEG, MPEG, PNG
  • User interface
      Graphical user interface (WIMP): Microsoft Windows, GNOME, KDE, QNX Photon, CDE, GEM
      Text-based user interface: Command line, Shell
  • Applications
      Office: Word processor, Desktop publishing, Presentation software, Database management system, Spreadsheet, Accounting software
      Internet access: Web browser, Email client, Web server, Instant messaging software
      Design: Computer-aided design, Computer-aided manufacturing
      Graphics: Raster graphics editor, Vector graphics editor, 3D modeler, Animation editor, 3D computer graphics, Video editing, Image processing
      Digital audio: Digital audio editor, Audio player
      Software engineering: Compiler, Translator, Interpreter, Debugger, Text editor, Integrated development environment, Performance analysis, Revision control, Software configuration management
      Games: Strategy, Adventure, Puzzle, Simulation, Role-playing, Interactive fiction
      Miscellaneous: Artificial intelligence, Antivirus software, Document manager
