
EDITORIAL

Author(s): Imrich Vaško, Shota Tsikoliya
Subject(s): Fine Arts / Performing Arts, Editorial
Published by: Historický ústav SAV, v. v. i.
Keywords: computational architecture; architectural practice; digital architecture; computerised design; parametric design

Summary/Abstract: The development of information technologies and the digitalisation of architectonic tools have radically changed both architectural practice and architectural research. It is currently difficult to name an area of human activity where computer technology lacks a significant role, and architectural design is hardly an exception. Computers shape all stages of architectural creation, from design through optimisation up to production. The development of computer technologies in architectural design is manifested in two main tendencies: the digitalisation of design tools and the digitalisation of design processes. Yet these tendencies are so closely connected that they can only be observed in their mutual relations. And, as in many other fields, a change is underway in architecture’s approach: there is discussion of open systems, soft systems, generative processes and bottom-up methods. The transition from reductionism to a view of the world as an open dynamic system has changed architectural thinking. Though computational tools have been in use in architectural practice for almost half a century and have formed an essential component of most architectural studios over the past two decades, the absence of a clear and coherent theory of digital architecture complicates our view of its various branches and any clear definition of its concepts. Concepts such as parametric design, generative design or computational design lack clarity in their relations and hierarchies. The professional literature distinguishes between “computational design” and “computerised design”, or alternatively “computer-aided design”. The first concept is understood as the process through which specific problems are first conceived as abstract data and their mutual relations are depicted in the form of logical and mathematical formulae.
A concrete architectural problem is replaced by an abstract model, and various solutions are simulated and evaluated. As such, it primarily concerns the digitalisation of the design process itself. The second concept refers to the use of digital tools that harness the calculating power of information technology to compile and organise information already known. Yet these two forms are not opposites, but complementary phenomena. For a better understanding of the mutual relation between digital processes and digital tools, it is necessary to situate them within the wider scheme of architectural practice, theory and technology. The first graphic systems for computational design appeared in the 1960s. Originally, their ambition was to adapt earlier programs used for structural calculations to more purely design-oriented problems. One example from structural engineering is the idea of form-finding, i.e. the process of seeking the optimal form by finding the balance between all of the forces in action. Yet it very soon became clear that optimisation as the main goal could not be achieved, considering the quantity and complexity of the requirements, intentions and limitations encountered in architectonic design. Automated design then became the term used to describe a holistic vision of computers capable of independently grasping and resolving the problems of architecture as a whole. This research programme, one of whose initiators was the MIT professor Nicholas Negroponte, asked to what extent a computer could imitate the behaviour of a human designer in solving architectural problems. The development of cybernetics, systems theory and machine learning led to the idea of the computer not as a replacement for the human, but as an addition or extension.
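The force-balance idea behind form-finding can be sketched in a few lines of code. The following is a minimal illustration only, not a reconstruction of any historical program, and all names and numbers in it are assumptions: the free nodes of a chain fixed at both ends are nudged along the residual force until the elastic pull of their neighbours balances a uniform gravity load, and the equilibrium shape emerges rather than being drawn.

```python
def relax(n=11, steps=20000, k=50.0, g=0.5, damp=0.01):
    """Return node heights of a chain fixed at both ends after relaxation.

    n: number of nodes; k: member stiffness; g: gravity load per node;
    damp: step size along the residual force. All values are illustrative.
    """
    y = [0.0] * n  # start flat; the end nodes y[0] and y[-1] stay fixed
    for _ in range(steps):
        for i in range(1, n - 1):
            # residual force: elastic pull toward neighbours minus gravity
            force = k * (y[i - 1] - 2 * y[i] + y[i + 1]) - g
            # move the free node a small step along the residual force
            y[i] += damp * force
    return y

shape = relax()
# at equilibrium the chain sags symmetrically, deepest at the centre
print([round(v, 3) for v in shape])
```

Repeatedly stepping each node along the out-of-balance force is the essence of dynamic-relaxation form-finding; here it converges to the funicular shape for a uniform load.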
This definition of the role of the computer corresponded with the contemporary writings of Marshall McLuhan on the role of media and the wider concept of the relationship between humans and machines. Architecture, then, was not merely a physical object, but was re-formulated as the complex interaction between material characteristics, social and economic relations, and the formation of spaces, forms and structures. The first graphic design programs, such as Sketchpad by Ivan Sutherland or Lokat (developed at the Harvard School of Design), were based on these assumptions. Computational design was tested on limited problems that would have been hard to address with traditional methods. Later, as computer programs began to target a wider spectrum of architects, the priority became the standardisation of approaches and methods, which in the worst case led to the idea of the program as a digital pen, and in the best case to the creation of digital databanks (Building Information Modelling). Where computerised design started with a specific problem and ended with a building, computational design started with abstract qualities and generative rules, ending in the form of a dynamic system. The development of computing methods in design led to a change in how architects thought about architectural problems: no longer in the quantification of data, but instead in the conception of architectonic form as a system. According to Christopher Alexander, every aspect of form, whether solitary or as part of a pattern, can be conceived as a structure of several components. Each object is a hierarchy of components: the larger define the distribution pattern of the smaller, while even the smaller ones, however solitary they appear, are in fact themselves a pattern of still smaller parts. Conceiving of architectonic phenomena as open, dynamic systems led to a change of concepts similar to that in other, primarily natural-science disciplines.
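Alexander’s notion of form as a hierarchy of components, each itself a pattern of smaller parts, maps naturally onto a recursive data structure. The following hypothetical sketch is only an illustration of that recursive idea; the class, its names and the example tree are our assumptions, not drawn from Alexander’s writings:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """A component of form: atomic, or a pattern of smaller components."""
    name: str
    parts: list["Component"] = field(default_factory=list)

    def depth(self) -> int:
        """Levels of nesting at and below this component."""
        if not self.parts:
            return 1  # an atomic part is its own single level
        return 1 + max(p.depth() for p in self.parts)

# an illustrative hierarchy: larger components pattern the smaller ones
house = Component("house", [
    Component("room", [Component("wall", [Component("brick")])]),
    Component("roof"),
])
print(house.depth())  # → 4
```

The same recursive query works at any level of the hierarchy, mirroring the claim that each apparently solitary part is itself a pattern of smaller parts.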
A general theory of systems emerged in the 1950s in the work of Ludwig von Bertalanffy. While the classic reductionist principle of modern science had come up against its limits even earlier, with new findings in physics at the turn of the 19th and 20th centuries, it was primarily the development of biology and the study of living organisms that revealed the necessity of thinking in systems. Bertalanffy held that deductive methods and a focus on isolated cases were inadequate for biology, since no living organism exists in isolation from its environment, and each can be described only as a complex system of relationships and interactions. General systems theory was formulated as a universal scientific discipline describing methods for the study of organised wholes. Applied to architecture, above all in the work of Christopher Alexander, systems theory had a major impact on how the design process was viewed. Alexander stated that a complex architectonic problem cannot be solved through its division into sub-problems, since the sum of solutions to sub-problems could never lead to a full design. The alternative was the abstraction of the entire problem in all its correlations. Though the idea of architecture as a system had existed long before the creation of computing technologies, it was only the possibilities afforded by digitalisation that expanded the concept of open dynamic systems. Judging from current tendencies in architectural education, academic institutions are not merely platforms for transmitting the knowledge and know-how necessary for professional work, but also a unique environment for research and innovation. New technologies of design and manufacturing are transforming the shape of the discipline, and require a greater role for experiment, testing, prototyping and development.
Within educational institutions, platforms are emerging that focus on investigating the possibilities of computational design and new technologies, in cooperation with related academic fields. Most notable in these platforms is the close cooperation with other branches of science and technical research. The latest issue of A&U presents a wide range of projects that not only examine the use of information technologies, but could not even have appeared without them, or at the very least would have assumed a quite different form. The products of academic research in Europe and beyond, these projects map out the current scene in computational design, detail its methods and speculate on its potential.

  • Issue Year: 52/2018
  • Issue No: 3-4
  • Page Range: 159-161
  • Page Count: 3
  • Language: English, Czech