Abstract: Given that the Internet is an engineered system like any other, why should we distinguish "Internet research" from any other study of technology? One answer is that computers are distinctive in their direct and systematic relationship to language. Another is that the Internet, through its layered architecture, is highly appropriable. Even so, the Internet does not cause a revolution or define a wholly separate "cyber" sphere. Instead, due to its distinctive qualities, it participates in somewhat distinctive ways in the ongoing political life of institutions.
When I was going to graduate school at MIT, most of the professors around me were embarrassed to be called computer scientists. Their complaint was this: why should there be a separate field of computer science, any more than there is a separate field of refrigerator science? In their view, computers were just complex physical artifacts like any others. Following Simon (1969), they argued that design principles such as modularity are not specific to software, but are properties of the universe in general. The structures that evolve are modular because modular structures are more stable than others. Computers were simply a special case of these universal laws.
This perspective on computer science differs from the view in most textbooks. In the textbooks, a computer is a device that can compute any function that any particular Turing machine can compute. The professors at MIT would have none of this. Of course the mathematics of computability was interesting, but it reflected only one corner of a much larger space of inquiry. What they found most interesting was not the mapping from single inputs to single outputs but the relationship between the structure of a computational device and the organization of the computational process that arose when the device was set running.
The physical realization of computational processes was, however, only half the story. The other half lay in the analysis of problem domains. This is a profound aspect of computer work -- and all engineering work -- that is almost invisible to outsiders. Computers are general-purpose machines in that they can be applied to problems in any sphere. A system designer might work on an accounting application in the morning and an astronomical simulation in the afternoon. As the problems in these domains are translated into computational terms, certain patterns recur, and engineers abstract these patterns into layers of settled technique.
Here is an example. I once consulted for a company that wanted to automate the design of certain complex mechanical artifacts. The designer of these artifacts might have to make several dozen design decisions. I spent several weeks sitting with an engineer and marching through a stack of manuals for the design of this category of artifacts. We needed the answer to a critical question: in working forward from requirements to design, does the designer ever need to backtrack? That is, is it ever necessary to make a design decision that might have to be retracted later? If backtracking were required, the company's task would become much harder. After we worked several cases by hand, it became clear that backtracking was not only necessary but ubiquitous, and that the company needed to hire someone who could build a general-purpose architecture for the backtracking of parameterized constraints. Backtracking is an example of a structure that recurs frequently in the analysis of problem domains. The resulting analogy among domains can be pursued, and might prove illuminating all around.
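The backtracking pattern described here can be sketched as a small search procedure: decisions are made in order, each checked against the constraints, and a decision is retracted when no later choice can succeed. The variables, domains, and constraints below are hypothetical illustrations, not the actual parameters of the company's artifact.

```python
# A minimal sketch of backtracking search over parameterized design
# decisions. All names and constraints here are invented for illustration.

def backtrack(assignment, variables, domains, constraints):
    """Assign variables in order; retract a decision when no later
    choice can satisfy the constraints (the backtracking step)."""
    if len(assignment) == len(variables):
        return assignment                      # every decision made
    var = variables[len(assignment)]
    for value in domains[var]:
        trial = dict(assignment, **{var: value})
        # Check only constraints whose variables are all decided so far.
        ok = all(pred(trial) for vars_, pred in constraints
                 if set(vars_) <= trial.keys())
        if ok:
            result = backtrack(trial, variables, domains, constraints)
            if result is not None:
                return result
        # Falling through here is the retraction of `value`.
    return None                                # no choice works: backtrack

# Hypothetical design decisions with toy constraints.
variables = ["width", "height", "depth"]
domains = {v: range(1, 5) for v in variables}
constraints = [
    (["width", "height"], lambda a: a["width"] > a["height"]),
    (["width", "height", "depth"],
     lambda a: a["width"] + a["height"] + a["depth"] == 9),
]

solution = backtrack({}, variables, domains, constraints)
print(solution)  # {'width': 3, 'height': 2, 'depth': 4}
```

Note how the early choices width=1 and width=2 must each be retracted once their consequences are explored: this is the ubiquity of backtracking that made the company's problem hard, since a purely feed-forward design procedure would not suffice.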
For the professors at MIT, then, engineering consists of a dialectical engagement between two activities: analyzing the ontology of a domain and realizing that domain's decision-making processes in the physical world. ("Realize" here means "make physically real" rather than "mentally understand".) Ideas about computational structure exist for the purpose of translating back and forth...