Intuitive & interactive design using AI & VR
This interdisciplinary project brings together architects, civil engineers, and computer scientists to develop AI-powered tools for early-stage design exploration. Building on the open-source AIXD (AI eXtended Design) toolbox, we combine machine learning with immersive visualization to support intuitive and quantitatively informed design decisions in architecture and engineering.
Design decisions made in the early phases of architecture and engineering projects have a significant impact on performance, cost, and sustainability. Yet current workflows typically explore only a small subset of possible solutions due to time and complexity constraints. Generative AI can rapidly produce large sets of feasible design alternatives, but navigating and understanding this vast solution space remains a challenge.

In this project, we develop new tools that enable designers to explore complex design spaces more intuitively, intelligently, and collaboratively. Building on the open-source toolbox AIXD (AI eXtended Design), we use machine learning to create meaningful embeddings: data-driven, lower-dimensional representations of parametric designs that capture both geometric properties and key performance attributes such as structural efficiency, embodied carbon, or spatial qualities. These embeddings allow for structured comparison and interactive navigation of diverse design alternatives.

To make this exploration accessible and engaging, we pair these AI-driven representations with a Virtual Reality (VR) interface. The immersive environment enhances spatial understanding, enables direct user interaction and feedback, and provides a collaborative space where participants can engage in joint design.

At its core, the project fosters collaborative design between human experts and AI systems, in which designers steer the exploration while the AI reveals hidden potentials and trade-offs.
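To make the embedding idea concrete, the following minimal sketch projects a set of hypothetical parametric designs into a 2D embedding and uses distances in that space to find similar alternatives. It is purely illustrative: the design parameters, the toy performance proxies, and the use of PCA (via SVD) are all assumptions for this example and do not reflect the actual AIXD implementation, which relies on learned (e.g. autoencoder-based) embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parametric designs: each row is [span, height, thickness, n_supports]
params = rng.uniform([10, 2, 0.2, 2], [50, 8, 1.0, 6], size=(200, 4))

# Toy stand-ins for performance attributes (not a real structural simulation):
efficiency = params[:, 1] / params[:, 0]             # height-to-span ratio
carbon = params[:, 2] * params[:, 0] * params[:, 3]  # material-volume proxy

# Combine geometry and performance into one feature vector per design
features = np.column_stack([params, efficiency, carbon])

# Standardize, then project to a 2D embedding via PCA (SVD of the centered data)
centered = (features - features.mean(axis=0)) / features.std(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
embedding = centered @ vt[:2].T  # (200, 2) coordinates for navigation

# Designs that lie close together in the embedding have similar geometry and
# performance, enabling structured comparison of alternatives
dists = np.linalg.norm(embedding - embedding[0], axis=1)
nearest = int(np.argsort(dists)[1])  # design most similar to design 0
```

In the project itself, such low-dimensional coordinates are what a VR interface can map to spatial positions, so that walking toward a cluster of points corresponds to browsing a family of similar design alternatives.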
Scientific goals:
- Create intuitive representations of the design space that help designers understand and compare options, supporting better decision-making.
- Enable interactive exploration in Virtual Reality (VR) by using these representations in immersive environments that improve spatial understanding, engagement and collaboration.
- Demonstrate the value of AI-supported design through AEC use cases, validating the approach in pilot studies with architects and engineers working on early-phase design challenges.
Funding: ETH Foundation (2024-HS-486)
Duration: 01.2025 - 01.2026
Project Leads:
Dr. Luis Salamanca, Dr. Aleksandra Anna Apolinarska, Sophia Kuhn
Researchers:
Richard Danis, Jacky Choi, Panagiotis Karapiperis, Gereon Siévi
Publications:
- AIXD: AI-eXtended Design Toolbox for data-driven and inverse design
- Design Space Exploration and Explanation via Conditional Variational Autoencoders in Meta-Model-Based Conceptual Design of Pedestrian Bridges
- Augmented Intelligence for Architectural Design with Conditional Autoencoders: Semiramis Case Study