Data processing
Two Areas of Data Processing
1. Business Data Processing (BDP)
Business data processing is characterized by the need to establish, retain, and process files of data to produce useful information. Generally, it involves a large volume of input data, limited arithmetical operations, and a relatively large volume of output. For example, a large retail store must maintain a record for each customer who purchases on account, update the balance owed on each account, and periodically present a bill to the customer for merchandise purchased. This type of record keeping requires reading a customer’s account number, name, address, and previous balance. The bill involves a few basic calculations, and the results are printed and mailed to the customer for collection. Tens of thousands of similar bills are commonly handled in the same way.
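The billing workflow above can be sketched in a few lines. This is a toy illustration only; the record layout, field names, and the 1.5% monthly interest rate are assumptions, not taken from any real retail system.

```python
# Hypothetical customer records: large input volume, simple arithmetic,
# large printed output -- the pattern described for business data processing.

def compute_bill(record):
    """Apply the few basic calculations a customer bill needs."""
    subtotal = record["previous_balance"] + sum(record["purchases"])
    interest = record["previous_balance"] * 0.015  # assumed monthly rate
    return round(subtotal + interest, 2)

customers = [
    {"account": 1001, "name": "A. Smith", "previous_balance": 120.00,
     "purchases": [19.99, 45.50]},
    {"account": 1002, "name": "B. Jones", "previous_balance": 0.00,
     "purchases": [250.00]},
]

for rec in customers:
    print(f"Account {rec['account']}: {rec['name']} owes {compute_bill(rec)}")
```

In a real system the loop would run over tens of thousands of records read from a file, but the per-record work would remain this simple.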

2. Scientific Data Processing (SDP)
In science, data processing involves a limited volume of input and many logical or arithmetic calculations. Unlike business problems, most scientific problems are non-repetitive, requiring a “one-time” solution. For example, in cancer research, data on cancer patients (collected over a period of time) are analyzed by computer in the search for a cure. Although a final cure has not been found, computer analysis has saved hundreds of man-years of computation and has brought us a step closer to an answer. Although scientific data may differ from business data, the processing pattern is quite similar.
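A small numerical routine shows the contrast with the business example: tiny input, many arithmetic operations. Newton's method for a square root is used here purely as an illustration of that pattern.

```python
# Scientific-style processing: one input value, many iterations of
# calculation, one output value.

def newton_sqrt(x, iterations=20):
    """Approximate sqrt(x) by repeated refinement of a guess."""
    guess = x / 2.0
    for _ in range(iterations):            # many calculations, little I/O
        guess = 0.5 * (guess + x / guess)
    return guess

print(newton_sqrt(2.0))
```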

Four Methods of Data Processing
1. Batch processing
- all input has to be ready beforehand
- time to get your results may be long if the job is interleaved with other batch jobs
- any error may ruin the whole run, but you won't know it until the results are returned
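The drawbacks above can be seen in a toy batch runner (the job names and tasks are hypothetical): all input is collected up front, the whole batch runs, and a failure only surfaces when the results come back.

```python
# A toy batch processor: jobs are queued in advance and errors are only
# visible in the returned results, after the whole run.

def run_batch(jobs):
    results = []
    for name, fn in jobs:
        try:
            results.append((name, "ok", fn()))
        except Exception as exc:            # not reported until the run ends
            results.append((name, "failed", str(exc)))
    return results

jobs = [
    ("payroll", lambda: 40 * 15.0),
    ("bad-job", lambda: 1 / 0),             # this job fails; we find out later
]

for name, status, value in run_batch(jobs):
    print(name, status, value)
```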

2. Online processing
- a break in communication can leave your session in an unknown state
- you may only be able to run a single session (multiple logins may not be allowed)
- a slow communication line can stall the processing: you can't get to the next step until the first one is processed
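The last point can be sketched as a round-trip session loop (the server function and its delay are stand-ins, not a real protocol): each step must wait for the previous reply before the session can advance.

```python
import time

def server_process(request, delay=0.0):
    """Stand-in for a remote server; a slow line stalls every later step."""
    time.sleep(delay)
    return f"done:{request}"

session = []
for step in ["login", "query", "logout"]:
    reply = server_process(step)      # must wait here before the next step
    session.append(reply)

print(session)
```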

3. Real time processing
(when done properly there are few end-user restrictions, so here are some for programmers)
- operating systems that support it are often proprietary
- languages that support it are not the mainstream ones
- concepts in programming for it are unknown to the average programmer

4. Distributed processing
- processors may not be available when needed e.g. due to networking problems
- requires networking
- requires parallel programming or at least attention, in design, to distributing the tasks
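As a minimal sketch of distributing tasks, Python's standard multiprocessing pool can stand in for a set of networked processors; the task function here is a trivial placeholder.

```python
# Distributed-style processing: the work is split across a pool of worker
# processes, which requires designing the task so it can be partitioned.
from multiprocessing import Pool

def square(n):
    """The task each worker runs independently."""
    return n * n

def distribute(values, workers=2):
    """Split the work across a pool of worker processes."""
    with Pool(processes=workers) as pool:
        return pool.map(square, values)

if __name__ == "__main__":
    print(distribute(range(8)))
```

Unlike real distributed systems, a local pool cannot suffer networking failures, but the need to partition the work and gather the results is the same.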

Five Generations of Computers
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; it was delivered to the U.S. Census Bureau in 1951.

Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on semiconductor (silicon) chips, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

References:
http://www.mbaknol.com/management-information-systems/areas-of-data-processing/
http://www.webopedia.com/DidYouKnow/Hardware_Software/2002/FiveGenerations.asp

