II. Shape Computation Theory

2. Computational Theory

The examination of a domain of phenomena is often supported by a calculus. The phenomena in view can be expressive or natural. For example, natural language is the expressive medium that we use to communicate our thoughts through speech. Our understanding of the logical structure of natural languages relies on the formal models of logic, and the examination of their syntactic properties relies on the models of formalized grammars.

A calculus is a formal environment where calculations of some kind can take place. The propositional calculus is a calculating environment designed to reveal certain characteristics of the logical structure of arguments, and a formal grammar is a calculating environment that reveals the syntactic properties and relations of empirically given expressions. The modern origins of this kind of investigation can be traced to Descartes, Leibniz, Boole, Frege, Russell and Whitehead, Carnap, Leśniewski, Tarski, and the logical empiricists. Analogous studies concerning the syntax of languages were performed in the early 20s by Jespersen, and later, in the 50s and 60s, by Chomsky and the researchers of Artificial Intelligence.

In the preface of his otherwise technical work Der logische Aufbau der Welt, Carnap (1928) places this approach into context with the arts: “We feel an inner affinity between the attitude that lies at the bottom of philosophical work and the spiritual attitude which expresses itself at present in entirely different areas of life: we sense that attitude in currents of contemporary art, especially in architecture, and in movements that seek to give a meaningful shape to human life”. Carnap pursued his interest in art and architecture by lecturing at the Bauhaus, as did other logical empiricists at the Chicago School of Design in the 40s.
Not by accident, during the entire period from the 20s to the 50s, Klee and Kandinsky attempted to introduce methodic thought into their teaching and practice of painting.

The common aspect in the course of all computational theories is the use of calculating systems and the effort to map empirical data onto them. A theory T of this kind includes some calculus C and some set R of rules of syntax and interpretation. In more concise form: T = C ∪ R. For example, in logic and in formal grammars the expressions of natural language are first reduced to strings, to become expressions of the calculus. Then they are treated according to techniques that originate in set theory. Logicians examine how the words fit together so as to preclude the possibility that the premises are true and the conclusion false, and grammarians examine the set of conventions that allow the mechanical generation of a corpus of expressions. In logic, an argument is shown valid by providing a translation into a demonstrably valid argument in the formal language. Atomic sentences and “connectives” like ∨, &, ~, →, ↔, etc. are used to reveal the structure of the argument. In syntax, production rules and transformations are used to generate sentences from finite sets of atomic phonemes.

The most common objection to the computational approach is that a computational theory T fails to reflect the way in which one acts and thinks, and that a calculus can represent, at best, only moments in a system that is continually changing. I think that this objection is reasonable, but it misses the point. The issue is not how to mirror “all” the heuristic and pragmatic aspects of a real process but “some” aspects and features of it. The question is what these aspects and features are in each case, and what calculus is appropriate to express them. Calculating systems are also constructions, and however ingenious many of them may be, they can have little or no interest from a specific empirical viewpoint.
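The logician’s validity test described above can be made concrete in a small sketch. The function below is a hypothetical illustration, not part of the dissertation: it enumerates every truth assignment over the atomic sentences and checks that no assignment makes all premises true while the conclusion is false, using modus ponens (P, P → Q, therefore Q) as the worked argument.

```python
from itertools import product

def is_valid(premises, conclusion, atoms):
    """Semantic validity: True if no truth assignment makes every
    premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample found
    return True

# The connective -> rendered truth-functionally.
implies = lambda a, b: (not a) or b

# Modus ponens: from P and P -> Q, infer Q.
premises = [lambda e: e["P"], lambda e: implies(e["P"], e["Q"])]
conclusion = lambda e: e["Q"]
print(is_valid(premises, conclusion, ["P", "Q"]))            # True

# Affirming the consequent: from Q and P -> Q, infer P (a fallacy).
bad_premises = [lambda e: e["Q"], lambda e: implies(e["P"], e["Q"])]
print(is_valid(bad_premises, lambda e: e["P"], ["P", "Q"]))  # False
```

The second call returns False because the assignment P = false, Q = true makes both premises true and the conclusion false, which is exactly the possibility a valid argument must preclude.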
One cannot just pick a calculus and squeeze the empirical content in. The choice of the appropriate calculus becomes an issue of central importance in the development of a theory, just like choosing the appropriate tool for a task.

Computational design theory was introduced by a group of researchers in the 60s and 70s. Its aim was the use of computational means in design. Some of the proposed means included set theory (Alexander 1964), graph theory (Steadman 1973), Boolean algebra (March 1972), computer generated design (Eastman 1970; Mitchell 1974), formal syntax (Hillier et al. 1976), and shape grammars (Stiny and Gips 1972). In all the above cases, computation was used either for prescription or for description of the behavior of designers. In the prescriptive case, computation was applied through a system of rules providing a norm for further empirical study; in the descriptive case, as a rigorous affirmation that the claims of a hypothesis produce analogous results.
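The rule-governed generation that underlies both formal syntax and, by analogy, shape grammars can be sketched in miniature. The toy grammar below is invented for illustration (the vocabulary and rules are not from the dissertation): a finite set of production rules mechanically generates a corpus of sentences from the start symbol S, the string-based counterpart of a grammar whose rules would instead rewrite shapes.

```python
import random

# A toy context-free grammar: a finite rule set mechanically
# generates expressions from the start symbol S.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["architect"], ["plan"]],
    "V":  [["draws"], ["revises"]],
}

def generate(symbol, rng):
    """Rewrite `symbol` by recursively applying production rules;
    symbols with no rule are terminals and are emitted as-is."""
    if symbol not in RULES:
        return [symbol]
    words = []
    for s in rng.choice(RULES[symbol]):  # pick one expansion
        words.extend(generate(s, rng))
    return words

print(" ".join(generate("S", random.Random(0))))
```

Every derivation yields a five-word sentence of the form "the N V the N"; the calculus fixes the form, and the rule choices at N and V exhaust the corpus the grammar can produce.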
Constructing Design Concepts: A Computational Approach to the Synthesis of Architectural Form. Kotsopoulos S., Ph.D. Dissertation, Massachusetts Institute of Technology, 2005.