The dawn of personal computing in the early 1980s marked the beginning of technology's rapidly growing hold on everyday life. The mechanical world of typewriters, dedicated word processors, and hand-cranked adding machines was quickly left in the wake of the microchip. The good news: you could do more. The bad news: you could be caught in a mire of complex and often confusing computer-based equipment. The study of Human-Computer Interaction (HCI) finally took center stage in the mid-1990s as the World Wide Web, e-mail, and Windows 95 burst onto the scene.

Over the years, computers and programs have become easier to use. Software has become more user friendly, taking advantage of "point and click" design. But minimizing mouse clicks is only part of improving the computing experience. Cleaner, less cluttered "work spaces" are the heart of HCI: they let users focus on the tasks at hand (i.e., doing whatever it is they want to do with computers) without worrying about how to "make things work."

HCI was a major concern at the Xerox Palo Alto Research Center (PARC) in the late 1970s, even if the people involved weren't quite sure at first what HCI was (or the monumental impact it would ultimately have). Those researchers made pioneering studies of how people interacted with technology, then redesigned software (and computers) to improve the "computing experience" and boost productivity. The mouse and icon-based "desktops" (primitive as they were by today's standards) made it easier for people to work with technology that would soon change the computing landscape of our daily lives in the 1980s.

HCI has to do with the space that is created for you to work with technology. I'm not talking about space in the traditional sense of "floor space." I'm referring to the space that envelops you the minute you start concentrating.
That space is what you become "submerged" in when you interact with a computer (or any technological device you need to control). Think of all those VCRs flashing "12:00" when they became part of our lives in the 1980s. People quickly figured out how to press the red record button to record a program "on the fly." But it was a much different story when it came to tasks such as programming the VCR or setting its channels. The futility many people felt while trying to use a VCR typically came down to confusing or poorly written instructions. The same is true of digital watches: setting (or turning off) an alarm can be surprisingly difficult. And it all comes back to HCI. The steps to accomplish tasks with electronic equipment have become more intuitive because product designers now look very closely at HCI and at the space they are creating for you to work in.

Problems setting VCRs and digital watches are only a small part of the picture. Even today, people don't use many of the features on their cell phones or answering machines, simply because they can't figure out what to do. After just a few attempts, a person usually gives up on the "bells and whistles" that sounded so great at first.

The Present
It sounds like everything is rosy so far. User-centered design works well, we have good office information systems, and HCI is a solid discipline (if an unexciting one, since we still like those breakthroughs every few years). Intel recently reorganized itself to align with the major market sectors for PCs today: office, home, medical, and mobile. That's a lot of PCs in new places, and they're almost all running a Star-style WIMP interface. Smart phones today are about as powerful as a midrange PC from eight years ago, and they far outstrip it in media performance. Although only a small amount of smart-phone software exists so far, it is one of the fastest-growing sectors of the industry. Unfortunately, if you've tried interacting with a nontrivial smart-phone...