A computer is a machine that can store and process information. Computers and computing devices from different eras (left to right, top to bottom):
Early vacuum tube computer (ENIAC)
Mainframe computer (IBM System 360)
Smartphone (LYF Water 2)
Desktop computer (IBM ThinkCenter S50 with monitor)
Video game console (Nintendo GameCube)
Supercomputer (IBM Summit)
A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation).
Modern digital electronic computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation; or to a group of computers that are linked and function together, such as a computer network or computer cluster.
A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users.
Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips.
The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g., touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.
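The idea that a program is a stored sequence of arithmetic and logical operations, stepped through by a sequencing and control unit, can be sketched as a toy interpreter. The instruction names, register file, and program below are invented purely for illustration; they do not correspond to any real machine's instruction set:

```python
def run(program):
    """Execute a list of (operation, args) tuples against a tiny register file."""
    registers = {"A": 0, "B": 0, "C": 0}
    pc = 0  # program counter: the sequencing/control element
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":        # LOAD reg, value: place a value in a register
            registers[args[0]] = args[1]
        elif op == "ADD":       # ADD dst, src: arithmetic on register contents
            registers[args[0]] += registers[args[1]]
        elif op == "ADDI":      # ADDI reg, imm: add an immediate constant
            registers[args[0]] += args[1]
        elif op == "JNZ":       # JNZ reg, target: a logical test that changes
            if registers[args[0]] != 0:  # the order of operations (control flow)
                pc = args[1]
                continue
        pc += 1
    return registers

# Compute 5 * 3 by repeated addition: A accumulates, B counts down.
program = [
    ("LOAD", "A", 0),    # 0: accumulator
    ("LOAD", "B", 3),    # 1: loop counter
    ("LOAD", "C", 5),    # 2: constant addend
    ("ADD", "A", "C"),   # 3: A += C
    ("ADDI", "B", -1),   # 4: B -= 1
    ("JNZ", "B", 3),     # 5: repeat from step 3 while B is nonzero
]
print(run(program)["A"])  # prints 15
```

Changing only the stored program, not the machine, changes the task performed; this is the sense in which a single general-purpose device can carry out a wide range of computations.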
Etymology
It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women.
The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "calculating machine" (of any type) is from 1897. The Online Etymology Dictionary indicates that the modern use of the term, to mean "programmable digital electronic computer", dates from 1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine. The name has remained, although modern computers are capable of many higher-level functions.
History
Pre-20th century
Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.), which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers. The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.
The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century.
Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abu Rayhan al-Biruni in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abu Rayhan al-Biruni invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD.
First computer
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand; this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties, as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.
Analog computers
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson.