1. Business Data Processing (BDP)
Business data processing is characterized by the need to establish, retain, and process files of data to produce useful information. Generally, it involves a large volume of input data, limited arithmetical operations, and a relatively large volume of output. For example, a large retail store must maintain a record for each customer who purchases on account, update the balance owed on each account, and periodically present a bill to the customer for merchandise purchased. This type of record keeping requires reading a customer's account number, name, address, and previous balance. The bill involves a few basic calculations, and the results are printed and mailed to the customer for collection. Tens of thousands of similar bills are commonly handled in the same way.
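As a rough illustration of this pattern, the short Python sketch below applies the same few calculations to a file of account records. The record fields, the finance-charge rate, and the sample accounts are assumptions made for the example; the text above does not prescribe any particular format.

```python
# A minimal sketch of the billing pattern described above.
# The record fields, finance-charge rate, and sample data are
# illustrative assumptions, not a prescribed format.

from dataclasses import dataclass

@dataclass
class Account:
    number: str
    name: str
    address: str
    previous_balance: float   # balance owed before this billing cycle
    purchases: float          # merchandise purchased this cycle
    payments: float           # payments received this cycle

def prepare_bill(acct: Account, finance_rate: float = 0.015) -> str:
    """Apply a few basic calculations and format a bill for mailing."""
    carried = acct.previous_balance - acct.payments
    finance_charge = round(carried * finance_rate, 2) if carried > 0 else 0.0
    new_balance = round(carried + finance_charge + acct.purchases, 2)
    return (f"Account {acct.number}: {acct.name}, {acct.address}\n"
            f"  Previous balance: {acct.previous_balance:10.2f}\n"
            f"  Purchases:        {acct.purchases:10.2f}\n"
            f"  Payments:         {acct.payments:10.2f}\n"
            f"  Finance charge:   {finance_charge:10.2f}\n"
            f"  Amount now owed:  {new_balance:10.2f}\n")

# Large input volume, a few basic calculations each, large output volume.
accounts = [
    Account("A-1001", "J. Smith", "12 Elm St", 120.00, 45.50, 50.00),
    Account("A-1002", "R. Jones", "9 Oak Ave", 0.00, 210.00, 0.00),
]
for acct in accounts:
    print(prepare_bill(acct))
```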
2. Scientific Data Processing (SDP)
In science, data processing involves a limited volume of input and many logical or arithmetic calculations. Unlike business problems, most scientific problems are non-repetitive, requiring a "one-time" solution. For example, in cancer research, data on cancer patients (collected over a period of time) are analyzed by computer in the search for a possible cure. Although a final cure remains unavailable, computer analysis has saved researchers hundreds of man-years of computation and has brought us a step closer to an answer to the cancer problem. Although scientific data may differ from business data, the processing pattern is quite similar.
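The sketch below illustrates this scientific pattern in Python: a small, one-time set of measurements and a computation-heavy analysis of it. The measurements and the resampling settings are invented purely for the example and have no connection to any real study.

```python
# A minimal sketch of the scientific pattern: limited input volume,
# many arithmetic operations, one-time analysis. All numbers here are
# illustrative assumptions, not real study data.

import random
import statistics

measurements = [4.1, 3.8, 5.0, 4.6, 3.9, 4.4, 4.8, 4.2]   # limited input

def bootstrap_mean_interval(data, n_resamples=10_000, alpha=0.05):
    """Estimate a confidence interval for the mean by heavy resampling."""
    resampled_means = []
    for _ in range(n_resamples):                      # many calculations
        sample = [random.choice(data) for _ in data]
        resampled_means.append(statistics.fmean(sample))
    resampled_means.sort()
    lo = resampled_means[int(alpha / 2 * n_resamples)]
    hi = resampled_means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

low, high = bootstrap_mean_interval(measurements)
print(f"Mean: {statistics.fmean(measurements):.2f}, "
      f"95% interval: [{low:.2f}, {high:.2f}]")
```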
Four Methods of Data Processing
1. Batch processing (contrasted with online processing in the sketch after this list)
- all input has to be ready beforehand
- turnaround time may be long if the job is interleaved with other batch jobs
- a single error may ruin the whole run, but you won't know it until the results are returned
2. Online processing
- a break in communication can leave your session in an unknown state
- you may only be able to run a single session (multiple logins may not be allowed)
- a slow communication line can stall the processing -- you can't get to the next step until the first one is processed
3. Real-time processing
(when done properly I can't think of end-user restrictions, so here are some for programmers)
- operating systems that support it are often proprietary
- languages that support it are not the mainstream ones
- the programming concepts it requires are unfamiliar to the average programmer
4. Distributed processing
- processors may not be available when needed, e.g. due to networking problems
- requires networking
- requires parallel programming, or at least design-time attention to how the tasks are distributed
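To make the contrast between the first two methods concrete, here is a small Python sketch: the batch function requires all of its input up front and rejects the whole run on any bad record, while the online handler deals with one request at a time as it arrives. The record format and the validation rule are assumptions chosen for illustration.

```python
# A small sketch contrasting batch and online processing.
# The record format ("item,quantity") and the validation rule are
# illustrative assumptions, not a prescribed standard.

def process_record(line: str) -> str:
    item, qty = line.split(",")
    return f"{item.strip()}: shipped {int(qty)} units"

def batch_run(lines):
    """All input must be ready beforehand; one bad record ruins the run."""
    for i, line in enumerate(lines, start=1):
        if "," not in line:
            raise ValueError(f"record {i} is malformed; whole batch rejected")
    return [process_record(line) for line in lines]

def online_session(get_next_request):
    """Each request is handled as it arrives; a bad one only fails itself."""
    while (request := get_next_request()) is not None:
        try:
            print(process_record(request))
        except ValueError as err:
            print(f"request rejected: {err}")

# Batch: results only come back after the whole run completes.
print(batch_run(["widgets, 12", "gears, 7"]))

# Online: simulate requests arriving one at a time.
requests = iter(["bolts, 3", "bad record", None])
online_session(lambda: next(requests))
```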
Five Generations of Computers
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; it was delivered to the U.S. Census Bureau in 1951.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips (semiconductors), which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
References
http://www.mbaknol.com/management-information-systems/areas-of-data-processing/
http://www.webopedia.com/DidYouKnow/Hardware_Software/2002/FiveGenerations.asp