PREFACE TO THE 2013 EDITION

Comments about plans to prepare a second edition of this book varied widely. Some felt that this book is outdated, that nobody is interested in a system of this kind any longer. "Why bother?" Others felt that there is an urgent need for this type of text, which explains an entire system in detail rather than merely proposing strategies and approaches. "By all means!"

Very much has changed in these last 30 years. But even without this change, it would be preposterous to propose and construct a system competing with existing, worldwide "standards". Indeed, very few people would be interested in using it. The community at large seems stuck with these gigantic software systems, and helpless against their complexity, their peculiarities, and their occasional unreliability.

But surely new systems will emerge, perhaps for different, limited purposes, allowing for smaller systems. One wonders where their designers will study and learn their trade. There is little technical literature, and my conclusion is that understanding is generally gained by doing, that is, "on the job". However, this is a tedious and suboptimal way to learn. Whereas sciences are governed by principles and laws to be learned and understood, in engineering experience and practice are indispensable. Does Computer Science teach laws that hold (almost) forever? More than any other field of engineering, it would seem predestined to rest on rigorous mathematical principles. Yet, its core hardly does. Instead, one must rely on experience, that is, on studying sound examples.

The main purpose of, and the driving force behind, this project is to provide a single book that serves as an example of a system that exists, is in actual use, and is explained in full detail. This task drove home the insight that it is hard to design a powerful and reliable system, but much harder still to make it so simple and clear that it can be studied and fully understood. Above everything else, it requires a stern concentration on what is essential, and the will to leave out the rest, all the popular "bells and whistles".

Recently, a growing number of people have become interested in designing new, smaller systems. The vast complexity of popular operating systems makes them not only obscure, but also provides opportunities for "back doors". They allow external agents to introduce spies and devils unnoticed by the user, making the system attackable and corruptible. The only safe remedy is to build a safe system anew from scratch.

Turning now to a practical aspect: the largest chapter of the 1992 edition of this book dealt with the compiler translating Oberon programs into code for the NS32032 processor. This processor is no longer available, nor is its architecture recommendable. Instead of writing a new compiler for some other commercially available architecture, I decided to design my own in order to extend the desire for simplicity and regularity to the hardware. The ultimate benefit of this decision is that not only the software, but also the hardware of the Oberon System is described completely and rigorously. The processor is called RISC. The hardware modules are described exclusively in the language Verilog.

The decision for a new processor was expedited by the possibility of implementing it, that is, of making it concrete and available. This is due to the advent of field-programmable gate arrays (FPGAs), which make it possible to turn a design into a real, functioning processor on a single chip.
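To give a flavor of such descriptions, the following fragment shows what a small hardware module looks like in Verilog: a 32-bit register with a synchronous load. It is a minimal illustrative sketch only; the module and signal names are invented for this example and are not taken from the actual RISC sources.

    module Reg32(
      input clk,              // system clock
      input load,             // load enable
      input [31:0] din,       // data input
      output reg [31:0] dout  // registered output
    );
      // capture the input on the rising clock edge whenever load is asserted
      always @ (posedge clk)
        if (load) dout <= din;
    endmodule

A complete processor is then specified by composing modules of this kind hierarchically.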
As a result, the described system can be realized using a low-cost development board. This board, Xilinx Spartan-3 by Digilent, features 1 MByte of static memory, which easily accommodates the entire Oberon System, including its compiler. It is shown, together with a display, a keyboard, and a mouse, in the photo below. The board is visible in the lower right corner.

The decision to develop our own processor required that the chapters on the compiler and the linking loader be completely rewritten. However, it also provided the welcome chance to improve their clarity considerably. The new processor indeed made it possible to simplify and straighten out the entire compiler.

For a description of a system to be comprehensible, the key element is the notation, formalism, or language in which it is defined. Algol 60, published 50 years ago, was proposed as a publication language, a formalism in which algorithms could be defined without reference to particular computers, or to any mechanism at all. This was a great goal, but so far it has hardly been achieved. Yet, it emphasized the importance of abstraction, to be attained by a notation with a mathematically rigorous foundation. At least, Algol was the first language based on a formally defined syntax. Algol was the result of the early recognition that programs must never be written merely to feed computers, but always to be understood by, and instructive to, people.

In all my past work, I have tried to design a successor to Algol that improves its rigor and at the same time extends its applicability from numerical algorithms to software systems. In a long sequence, starting with Algol, through Pascal, Modula, and Oberon, we have come closer to this goal than ever before, and closer than any other language in existence. The key lay in a continued struggle for sensible simplification. The Oberon language, defined in 1988, underwent a revision in 2007, mostly discarding features that were either duplications or not essential. Adapting the system's source code to the revised language was, besides the change of processor, the second important reason for numerous local changes in this text.