COMPUTER SCIENCE
Computer science (or computing science) is the study of the theoretical foundations of information and computation, and of their implementation and application in computer systems.[1][2][3] Computer science has many sub-fields; some emphasize the computation of specific results (such as computer graphics), while others relate to properties of computational problems (such as computational complexity theory). Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems. A further subfield, human-computer interaction, focuses on the challenges in making computers and computations useful, usable and universally accessible to people.
History
Main article: History of computer science
The early foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623.[4] Charles Babbage designed a difference engine in the Victorian era (1837 to 1901),[5] helped by Ada Lovelace.[6] Around 1900, punch-card machines were introduced.[7] However, all of these machines were constrained to perform a single task, or at best some subset of all possible tasks.
During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially doubted that computers themselves could constitute a genuine field of scientific study, the idea gradually gained acceptance among the wider academic community during the late 1950s.[9] The now well-known IBM (short for International Business Machines) was part of the computer science revolution of this period, releasing the IBM 704 and later the IBM 709, computers that were widely used during this early exploratory phase. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[9] During the late 1950s, the computer science discipline was still very much in its developmental stages, and such issues were commonplace.
The usability and effectiveness of computing technology have improved significantly over time. Modern society has seen a marked shift from computers being used solely by experts and professionals to a far more widespread user base.
Major achievements
The German military used the Enigma machine during World War II for communications it believed to be secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor contributing to the Allied victory in WWII.[10]
Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. These include:
Applications within computer science
- A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems (see the sketch after this list).[11]
- The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[12]
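The unsolvability result in the first point above can be made concrete with the halting problem. The following minimal sketch, written in Python purely for illustration, assumes a hypothetical decider halts(program, argument) (no such function exists in any real library) and retraces Turing's diagonal argument for why no correct implementation of it is possible.

```python
# Sketch of Turing's diagonal argument that the halting problem is
# undecidable. `halts` is hypothetical: the reasoning below shows that
# no total, correct implementation of it can exist.

def halts(program, argument):
    """Hypothetically decides whether program(argument) eventually stops.

    Any body written here is a placeholder; the construction below shows
    that no implementation can be correct on every input.
    """
    raise NotImplementedError("no such total, correct decider can exist")

def paradox(program):
    """Halts exactly when `program`, run on its own source, does not halt."""
    if halts(program, program):
        while True:   # loop forever if program(program) would halt
            pass
    return            # halt if program(program) would loop forever

# Feeding `paradox` to itself yields a contradiction either way:
# - if halts(paradox, paradox) were True, paradox(paradox) would loop forever;
# - if it were False, paradox(paradox) would halt.
# Hence `halts` cannot exist, so some well-defined problems are
# computationally unsolvable.
```

The contradiction does not depend on the choice of language; Python is used here only as a convenient notation for the argument.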
Applications outside of computing
- The start of the Digital Revolution, which led to the current Information Age and the Internet.[13]
- In cryptography, breaking the Enigma machine was an important factor contributing to the Allied victory in World War II.[10]
- Scientific computing enabled advanced study of the mind, and mapping the human genome became possible with the Human Genome Project.[13] Distributed computing projects such as Folding@home explore protein folding.
- Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale (a brief illustrative sketch follows this list).[14]
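As one illustration of the kind of rule-based technique used in algorithmic trading, the sketch below implements a simple moving-average crossover signal in Python. The price series, window lengths, and function names are purely illustrative assumptions and are not drawn from the cited source.

```python
# Minimal sketch of a moving-average crossover rule, one simple example of
# the statistical techniques used in algorithmic trading. The price series
# and window lengths below are illustrative only.

def moving_average(prices, window):
    """Simple moving average over the last `window` prices at each step."""
    return [
        sum(prices[max(0, i - window + 1): i + 1]) / min(i + 1, window)
        for i in range(len(prices))
    ]

def crossover_signals(prices, short=3, long=5):
    """Emit 'buy' when the short-term average crosses above the long-term
    average, 'sell' on the opposite cross, and 'hold' otherwise."""
    short_ma = moving_average(prices, short)
    long_ma = moving_average(prices, long)
    signals = ["hold"]
    for i in range(1, len(prices)):
        if short_ma[i] > long_ma[i] and short_ma[i - 1] <= long_ma[i - 1]:
            signals.append("buy")
        elif short_ma[i] < long_ma[i] and short_ma[i - 1] >= long_ma[i - 1]:
            signals.append("sell")
        else:
            signals.append("hold")
    return signals

if __name__ == "__main__":
    prices = [100, 101, 103, 102, 105, 107, 106, 104, 103, 105]
    print(crossover_signals(prices))
```

Production trading systems are far more elaborate, combining statistical models, risk controls, and large-scale infrastructure; the sketch only shows the basic shape of an automated trading rule.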
Fields of computer science
As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software.[15][16] The Computer Sciences Accreditation Board (CSAB) – which is made up of representatives of the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers Computer Society, and the Association for Information Systems – identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, computer-human interaction, computer graphics, operating systems, and numerical and symbolic computation as important areas of computer science.