History:
Originally "calculator" referred to someone who performed numerical calculations under the auspices of a mathematician. He probably worked with the help of various mechanical prediction devices such as numerology. An example of an early divination device is the Antikythera, a Greek device believed to have been developed around 87 BC, used to predict the movements of the planets. The technology behind this ingenious device was lost over time.
The European Renaissance brought renewed growth to the fields of mathematics and engineering. Beginning in the early 17th century, a succession of mechanical calculating devices was built using technology originally developed for clocks, and many of the key technologies behind digital computers, such as the punched card and the vacuum tube, emerged in the late 19th and early 20th centuries. The first fully programmable computer was conceived and designed by Charles Babbage in 1837, but owing to a combination of the technological limits of his time, a lack of funds, and an inability to stop tinkering with the design (a trait that has since been the undoing of many a computer engineering project), he was never able to build the machine in full.
In the first half of the 20th century, many scientific computing needs were met by analog computers, which used a direct mechanical or electrical model of the problem as the basis for computation. Such machines fell out of use after the development of the digital computer.
Alan Turing was an English mathematician and logician who is regarded as the father of modern computer science. Through the Turing machine, he made a major contribution to formalizing the concepts of algorithm and computation.
More capable and flexible computing devices were developed in the 1930s and 1940s, gradually incorporating the key features of modern computers, such as digital electronics (largely due to Claude Shannon's work in 1937) and more flexible programmability. It is difficult to single out any one machine on this timeline as the first computer. Notable achievements include Konrad Zuse's Z machines, the secret British Colossus computer, and the American ENIAC.
Aware of ENIAC's shortcomings, its developers produced a more flexible and elegant design, later known as the stored-program architecture, from which all modern computers derive. A number of projects to build computers on this basis were begun in the 1940s, the first of which was the Manchester Small-Scale Experimental Machine; the first to be practically usable, however, was EDSAC.
Vacuum-tube computers remained in use throughout the 1950s, but following the invention of the transistor they were replaced in the 1960s by transistor-based computers, which were cheaper, smaller, and faster. The introduction of integrated-circuit technology in the 1970s greatly reduced the cost of producing computers, bringing the forerunners of today's personal computers within reach of the general public.
A general-purpose computer consists of four main sections:
Arithmetic and logic unit
Control unit
Memory
Input devices and output devices
These sections are connected to one another by buses, usually made up of groups of wires.
The control unit, the arithmetic and logic unit, the registers, the basic input/output circuitry, and the hardware closely connected with them are collectively called the central processing unit (CPU). Early CPUs were built from separate components, but since the mid-1970s they have typically been integrated onto a single integrated circuit, called a microprocessor.
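To make the roles of these four sections concrete, below is a minimal sketch of a toy stored-program machine written in Python. The instruction set, the opcodes, and the small program are invented purely for illustration and do not correspond to any real processor.

# A toy stored-program machine illustrating the four sections of a
# general-purpose computer: memory, control unit, ALU, and input/output.
# The instruction set and program are hypothetical, for illustration only.

def alu(op, a, b):
    """Arithmetic and logic unit: performs arithmetic operations."""
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    raise ValueError(f"unknown ALU operation: {op}")

def run(memory):
    """Control unit: fetches, decodes, and executes instructions from memory."""
    acc = 0          # accumulator register
    pc = 0           # program counter register
    while True:
        instr = memory[pc]          # fetch the next instruction over the "bus"
        opcode, operand = instr     # decode it
        pc += 1
        if opcode == "LOAD":        # copy a value from memory into the accumulator
            acc = memory[operand]
        elif opcode == "STORE":     # copy the accumulator back into memory
            memory[operand] = acc
        elif opcode in ("ADD", "SUB"):  # hand arithmetic off to the ALU
            acc = alu(opcode, acc, memory[operand])
        elif opcode == "PRINT":     # output device: show the accumulator
            print(acc)
        elif opcode == "HALT":      # stop the machine
            break

# Program and data share the same memory (the stored-program idea).
# Cells 0-4 hold instructions; cells 8-9 hold data; cell 10 holds the result.
memory = {
    0: ("LOAD", 8),    # acc = memory[8]
    1: ("ADD", 9),     # acc = acc + memory[9]
    2: ("STORE", 10),  # memory[10] = acc
    3: ("PRINT", None),
    4: ("HALT", None),
    8: 2,              # input data (could come from an input device)
    9: 3,
}

run(memory)            # prints 5

Running the sketch prints 5: the control unit fetches each instruction from memory, the ALU performs the addition, and the result is written back into memory and sent to the output device.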