THE EVOLVING ROLE OF SOFTWARE

Today, software takes on a dual role. It is a product and, at the same time, the vehicle for delivering a product. As a product, it delivers the computing potential embodied by computer hardware or, more broadly, by a network of computers that are accessible by local hardware. Whether it resides within a cellular phone or operates inside a mainframe computer, software is an information transformer—producing, managing, acquiring, modifying, displaying, or transmitting information that can be as simple as a single bit or as complex as a multimedia presentation. As the vehicle used to deliver the product, software acts as the basis for the control of the computer (operating systems), the communication of information (networks), and the creation and control of other programs (software tools and environments).

Software delivers the most important product of our time—information. Software transforms personal data (e.g., an individual’s financial transactions) so that the data can be more useful in a local context; it manages business information to enhance competitiveness; it provides a gateway to worldwide information networks (e.g., the Internet); and it provides the means for acquiring information in all of its forms.

The role of computer software has undergone significant change over a time span of little more than 50 years. Dramatic improvements in hardware performance, profound changes in computing architectures, vast increases in memory and storage capacity, and a wide variety of exotic input and output options have all precipitated more sophisticated and complex computer-based systems. Sophistication and complexity can produce dazzling results when a system succeeds, but they can also pose huge problems for those who must build complex systems.

Popular books published during the 1970s and 1980s provide useful historical insight into the changing perception of computers and software and their impact on our culture. Osborne [OSB79] characterized a “new industrial revolution.” Toffler [TOF80] called the advent of microelectronics part of “the third wave of change” in human history, and Naisbitt [NAI82] predicted a transformation from an industrial society to an “information society.” Feigenbaum and McCorduck [FEI83] suggested that information and knowledge (controlled by computers) would be the focal point for power in the twenty-first century, and Stoll [STO89] argued that the “electronic community” created by networks and software was the key to knowledge interchange throughout the world.

As the 1990s began, Toffler [TOF90] described a “power shift” in which old power structures (governmental, educational, industrial, economic, and military) disintegrate as computers and software lead to a “democratization of knowledge.” Yourdon [YOU92] worried that U.S. companies might lose their competitive edge in software-related businesses and predicted “the decline and fall of the American programmer.” Hammer and Champy [HAM93] argued that information technologies were to play a pivotal role in the “reengineering of the corporation.” During the mid-1990s, the pervasiveness of computers and software spawned a rash of books by “neo-Luddites” (e.g., Resisting the Virtual Life, edited by James Brook and Iain Boal, and The Future Does Not Compute by Stephen Talbot). These authors demonized the computer, emphasizing legitimate concerns but ignoring the profound benefits that have already been realized [LEV95].

During the later 1990s, Yourdon [YOU96] re-evaluated the prospects for the software professional and suggested “the rise and resurrection” of the American programmer. As the Internet grew in importance, his change of heart proved to be correct. As the twentieth century closed, the focus shifted once more, this time to the impact of the Y2K “time bomb” (e.g., [YOU98b], [DEJ98], [KAR99]).

Although the predictions of the Y2K doomsayers were incorrect, their popular writings drove home the pervasiveness of software in our lives. Today, “ubiquitous computing” [NOR98] has spawned a generation of information appliances that have broadband connectivity to the Web to provide “a blanket of connectedness over our homes, offices and motorways” [LEV99]. Software’s role continues to expand.

The lone programmer of an earlier era has been replaced by a team of software specialists, each focusing on one part of the technology required to deliver a complex application. And yet, the same questions asked of the lone programmer are being asked when modern computer-based systems are built:

• Why does it take so long to get software finished?

• Why are development costs so high?

• Why can't we find all the errors before we give the software to customers?

• Why do we continue to have difficulty in measuring progress as software is being developed?

These, and many other questions, are a manifestation of the concern about software and the manner in which it is developed—a concern that has led to the adoption of software engineering practice.
