Programming languages are the main tools of the new-age system designer. While traditional flowcharting techniques give the “old school” programmer-analyst a way to visualize an information system, newly accessible programming languages such as Microsoft’s Visual Basic provide a more dynamic way of developing system applications.
Although computers from one manufacturer tend to have the same machine language, those from different manufacturers do not. Accordingly, different computers have different assemblers and assembly languages. In addition to the conversion of the mnemonic operation code and decimal operand in each instruction into machine code, most assembly languages have functions to facilitate programming, such as combining a sequence of several instructions into one pseudo-instruction. Machine code instructions are then generated from this pseudo-instruction. Programming in assembly languages requires a solid knowledge of computer architecture and is more time-consuming than programming in high-level languages.
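The mnemonic-to-machine-code conversion described above can be hinted at with a toy sketch. This is not a real assembler: the opcode table, the one-operand instruction format, and the sample program are all invented for illustration.

```python
# A toy illustration of assembly: translating mnemonic instructions
# with decimal operands into numeric machine-code words.
# The opcode numbering and instruction set are invented for this sketch.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0x00}

def assemble(lines):
    """Turn lines like 'LOAD 12' into (opcode, operand) machine words."""
    program = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((OPCODES[mnemonic], operand))
    return program

source = ["LOAD 12", "ADD 30", "STORE 45", "HALT"]
machine_code = assemble(source)
print(machine_code)  # [(1, 12), (2, 30), (3, 45), (0, 0)]
```

A real assembler also resolves symbolic labels and expands pseudo-instructions into sequences of machine instructions, as the paragraph above notes.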
FORTRAN (Formula Translation) was developed as a language for numerical analysis computation by John W. Backus and others at IBM and was announced by IBM in 1957. It has been revised several times since then. Even though other languages, such as C, are becoming popular for scientific and engineering computations, FORTRAN is still the language of choice for numerical analysis. In order to extend its applicability to scientific computations beyond numerical analysis, facilities for handling structured data, dynamic data allocation, recursive calls, and other features were added in the version released in 1990, FORTRAN 90.
COBOL (Common Business-Oriented Language) is the most popular language in the business community, including banks and insurance companies. Computer users and manufacturers joined with the U.S. Department of Defense to set up a common programming language for business applications and formed the Conference on Data Systems Languages (CODASYL) in 1959. CODASYL created COBOL to fulfill two major objectives: portability (the ability of programs to be run with minimum modification on computers manufactured by different companies) and readability (the ease with which programs can be read like ordinary English sentences). COBOL has been revised several times since 1959. It can be understood more easily by business people than other languages, and programs written in COBOL are quite portable.
PL/I is a complex language that was suggested by SHARE (a group of IBM computer users) and IBM in 1963. It was initially called NPL (New Programming Language) but later renamed PL/I. IBM announced the first manual of PL/I in 1965. The American National Standards Institute (ANSI) and other organizations have revised it several times since then. PL/I was designed for both scientific/engineering and business problems by integrating the features of FORTRAN, COBOL, and ALGOL, which were popular languages at that time. Many different types of data can be processed, and a large range of arithmetic and other operations are available. In recent years PL/I has been used less frequently.
BASIC (Beginner’s All-Purpose Symbolic Instruction Code) is a general-purpose programming language developed by John G. Kemeny and Thomas E. Kurtz at Dartmouth College, Hanover, N.H., in the mid-1960s. It is one of the simplest high-level languages and can be learned with relative ease, even by schoolchildren and novice programmers. Since about 1980, BASIC has become popular for use on personal computers.
APL (A Programming Language) is based on the book A Programming Language, published in 1962 by Kenneth E. Iverson of IBM. In 1968 IBM announced APL/360 as its first version. APL was initially used for scientific and engineering problems, but, since IBM introduced APLSV (APL Shared Variable) in 1973, APL can handle general files and consequently has been adopted for business programs as well. Statements are expressed with simple notations, and their operational functions are powerful.
Pascal is a language developed by Niklaus Wirth of the Federal Institute of Technology, Zurich, Switzerland, in the late 1960s. It was intended to be a good educational tool for the systematic teaching of programming and to have fast, reliable compilers. Since 1974 the Pascal compiler developed by Wirth has been available to the public and has been used by many universities. Pascal strongly influenced many languages developed later, such as Ada. The language specifications of Pascal are concise, making it easier to learn than many other high-level languages. Complex data structures and algorithms can be described concisely in Pascal, and its programs are easy to read and to debug.
Ada is a high-level language whose development was initiated in 1975 by the U.S. Department of Defense. Ada was intended to be a common language that could be used on the department’s computers, which were produced by many different manufacturers. The first version of the language specification was completed in 1979. It was formally named Ada in honor of Augusta Ada King, Countess of Lovelace and daughter of Lord Byron. (She had worked as an assistant to Charles Babbage in the development of his Analytical Engine and is often credited as being the world’s first computer programmer.) Ada is similar to Pascal but contains many additional features that are convenient for the development of large-scale programs. Because of its abundant features, however, ordinary users may feel awkward using Ada. Thus, Ada has not been widely used in programs other than those for the Department of Defense.
Although C is considered to be a high-level language, it has many low-level features, such as the ability to directly handle addresses and bits. C is, nonetheless, highly portable. It was developed by Dennis M. Ritchie of AT&T Bell Laboratories in 1972. The operating system UNIX has been written almost exclusively in C; previously, operating systems were almost entirely written in assembly or machine code. C has been extensively used on personal and larger computers.
LISP (List Processor) is a language that is powerful in manipulating lists of data or symbols rather than processing numerical data. In this sense, LISP is unique. It requires large memory space and, since it is usually processed by an interpreter, is slow in executing programs. LISP was developed in the late 1950s and early 1960s by a group headed by John McCarthy, then a professor at the Massachusetts Institute of Technology. At that time, LISP was radically different from other languages, such as FORTRAN and ALGOL. Several versions have been developed from the LISP 1.5 introduced by McCarthy; Common LISP, released in 1984, is becoming the de facto standard of LISP.
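LISP’s specialty, manipulating lists of symbols rather than numbers, can be hinted at in another language. The nested expression and the helper below are invented for illustration (in LISP the same structure would be written (A (B C) D)); this is a sketch of the idea, not LISP itself.

```python
# A nested list of symbols, the basic data structure of LISP-style
# symbolic processing. The expression here is invented for illustration.
expr = ["A", ["B", "C"], "D"]

def atoms(e):
    """Recursively collect the atomic symbols in a nested list."""
    if not isinstance(e, list):
        return [e]
    result = []
    for item in e:
        result += atoms(item)
    return result

print(atoms(expr))  # ['A', 'B', 'C', 'D']
```

Recursion over nested lists, as in `atoms`, is the characteristic style of LISP programming mentioned above.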
Fourth-generation languages (4GLs)
These are closer to human language than other high-level languages. 4GLs are intended to be easier for users than machine languages (the first generation), assembly languages (the second generation), and high-level languages (the third generation). Many 4GLs actually incorporate third-generation software as well. 4GLs are used primarily for database management. A typical 4GL statement is FIND ALL RECORDS WHERE NAME IS “SHEHU”. FOCUS, developed by Information Builders, Inc., is a database management system that includes a 4GL. It is popular among COBOL users because FOCUS is similar to COBOL and can be easily used with COBOL programs. Structured Query Language (SQL) and Query By Example (QBE), both produced by IBM, are other examples of 4GLs.
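SQL, one of the 4GLs named above, expresses *what* data is wanted rather than *how* to retrieve it. The sketch below runs a query in the spirit of FIND ALL RECORDS WHERE NAME IS “SHEHU” using Python’s built-in SQLite engine; the table name, columns, and sample rows are invented for this example.

```python
# Running a 4GL-style query with Python's built-in SQL engine.
# The 'records' table and its contents are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (name TEXT, dept TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [("SHEHU", "Payroll"), ("ADA", "Accounts")])

# The declarative request: which records have this name?
rows = conn.execute(
    "SELECT name, dept FROM records WHERE name = ?", ("SHEHU",)
).fetchall()
print(rows)  # [('SHEHU', 'Payroll')]
```

Notice that the query names the desired result only; the database engine, not the programmer, decides how to search the table.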
An important trend in programming languages is support for data encapsulation, or object-oriented code. Data encapsulation is best illustrated by the language Smalltalk, in which all programming is done in terms of so-called objects. An object in Smalltalk or similar object-oriented languages consists of data together with the procedures (program segments) to operate on that data. Encapsulation refers to the fact that an object’s data can be accessed only through the methods (procedures) provided. Programming is done by creating objects that send messages to one another so that tasks can be accomplished cooperatively by invoking each other’s methods. This object-oriented paradigm has been very influential. For example, the language C, which was popular for engineering applications and systems development, has largely been supplanted by its object-oriented extension C++.
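A minimal sketch of data encapsulation, written here in Python rather than Smalltalk: the balance is held inside the object and reached only through the methods provided. The Account class and its method names are invented for this example.

```python
# Encapsulation: the object's data is operated on only through
# its methods. The Account class is invented for illustration.
class Account:
    def __init__(self, owner, balance):
        self.owner = owner
        self._balance = balance      # data reached only via methods below

    def deposit(self, amount):       # a "message" the object understands
        self._balance += amount

    def balance(self):
        return self._balance

acct = Account("Shehu", 100)
acct.deposit(50)                     # programming by sending messages
print(acct.balance())  # 150
```

Because callers never touch `_balance` directly, the object's internal representation can change without breaking the rest of the program, which is the practical payoff of encapsulation.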
An object-oriented version of BASIC, named Visual Basic, is available for personal computers and allows even novice programmers to create interactive applications with elegant graphical user interfaces (GUIs). Microsoft Visual Basic is the most widely used example of this kind of development tool.
In 1995 Sun Microsystems, Inc., introduced Java, yet another object-oriented language. Applications written in Java are not translated into a particular machine language but into an intermediate language called Java Bytecode, which may be executed on any computer (such as those using UNIX, Macintosh, or Windows operating systems) with a Java interpretation program known as a Java virtual machine. Thus Java is ideal for creating distributed applications or Web-based applications. The applications can reside on a server in Bytecode form, which is readily downloaded to and executed on any Java virtual machine. In many cases it is not desirable to download an entire application but only an interface through which a client may communicate interactively with the application. Java applets (small chunks of application code) solve this problem. Residing on Web-based servers, they may be downloaded to and run in any standard Web browser to provide, for example, a client interface to a game or database residing on a server.
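The bytecode idea above, a program compiled once to an intermediate form and then executed by a virtual machine on any host, can be miniaturized as a sketch. The three-instruction stack machine below is invented for illustration; real Java Bytecode is far richer, but the execute-on-any-host principle is the same.

```python
# A hypothetical miniature "virtual machine": the intermediate code is
# a list of (operation, argument) pairs executed by a stack-machine loop.
# The PUSH/ADD/MUL instruction set is invented for this sketch.
def run(bytecode):
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Intermediate code for (2 + 3) * 4, runnable wherever run() is hosted:
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

Portability follows from the same division of labor: only the interpreter loop is machine-specific, while the intermediate code travels unchanged, just as Bytecode is downloaded to any Java virtual machine.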
At a still higher level of abstraction lie visual programming languages, in which programmers graphically express what they want done by means of icons to represent data objects or processes and arrows to represent data flow or sequencing of operations. As yet, none of these visual programming languages has found wide commercial acceptance. On the other hand, high-level user-interface languages for special-purpose software have been much more successful; for example, languages like Mathematica, in which sophisticated mathematics may be easily expressed, or the “fourth generation” database-querying languages that allow users to express requests for data with simple English-like commands. For example, a query such as “Select salary from payroll where employee = ‘Shehu’,” written in the database language SQL (Structured Query Language), is easily understood by the reader. The high-level language HTML (Hypertext Markup Language) allows nonprogrammers to design Web pages by specifying their structure and content but leaves the detailed presentation and extraction of information to the client’s Web browser.
Rapid Application Development (RAD) is a new, highly interactive systems development approach whereby qualitatively better systems can be realized in less working time and at lower cost than with the traditional approach. System development projects are famous for not meeting their delivery dates. Even worse, the system may prove insufficient to meet actual business needs once it is delivered. Strangely enough, though, businesses are often inclined to accept imperfect solutions if they are delivered on time. The RAD approach was developed to help cope with these problems.
The techniques collectively known as RAD were first formalized in the methodology called RIPP (Rapid Iterative Production Prototyping) at DuPont in the mid-1980s. James Martin extended the work done at DuPont and elsewhere into a larger, more formalized process, for which he coined the term Rapid Application Development.
Between 1970 and 1990, most systems development took place in an environment which—though complex—was far simpler technically than the world offered to us by modern technology architectures. The business problems solved by those systems were also relatively simple, and the personnel who worked on the design and implementation of the systems were almost entirely technical personnel.
The world is a different place now. When you add up the technical complexity of client-server systems which cross multiple operating system platforms, the business complexity of the issues those systems address, and the much wider organizational involvement imposed by user-driven requirements definition processes, you end up with a situation that’s far too tangled for technicians to unravel unaided. Systems development in this world demands the cooperation of people from many different parts of an organization. Such fluid movement of people through a project is common practice in most consulting organizations, but it’s far from the norm in many maintenance-oriented shops.
The Benefits of RAD
The principal benefit of RAD is speed of delivery, frequently expressed in terms of the 80/20 rule. The assumption is that 80% of an application can be delivered in 20% of the time. The remaining 20% of an application frequently amounts to ‘copper-plating’ – putting a technical finish on the software product. In RAD, technological or engineering excellence is important, but it is seen as coming second after business excellence.
DSDM – a RAD methodology
In 1992, a number of experienced RAD developers came together and decided to develop a new methodology, to combine the best elements of existing methods and practical experience.
This methodology became the Dynamic System Development Method (DSDM). A consortium of several different companies was set up to maintain it, with DCE as a member. In 1995 DCE introduced the methodology in the Benelux and founded the DSDM consortium there with six other companies.
This unit has taught you about the major programming languages, including the newly accessible Rapid Application Development languages, as tools for system design.