IAAC – Master in Robotics and Advanced Construction
Workshop 2.2
Visiting Faculty: Keith Kaseman
Faculty Assistants: Soroush Garivani, Daniela Marquez


Credits: Spatial Futures Lab


Capitalizing on the rapid proliferation of technologies including computer vision, augmented reality (AR), mixed reality (MR) and unmanned aerial systems (UAS), a primary mission of the Georgia Tech Spatial Futures Lab is to weave these often disparate tools into customized multisystem design-production platforms. As such, an interactive cyber-physical design environment involving these technologies has been developed within the Lab, providing a base capacity for live augmented UAS modeling that exploits both manual and automated operations.

Relying upon simultaneous human and machine spatial perception and connecting the digital with the physical through live sensing and feedback, this uniquely configured ROS-enabled platform will serve as the design environment within which workshop participants operate. This workshop aims to exploit the constraints and opportunities at hand, tasking participants with developing various lines of purpose, action and protocol geared towards the production of interoperable design models, operable mixed reality interfaces and corresponding digital structures.

Hardware comprising this cyber-physical design environment includes a high-precision optical motion capture system (www.optitrack.com) installed within a drone net, a Parrot Bebop Power 2 drone, personal smartphones / tablets and the Microsoft HoloLens (v1). Software utilized in the operation of the multisystem includes ROS (www.ros.org), Motive (optitrack.com/products/motive), Rhino 6 (www.rhino3d.com), Grasshopper (www.grasshopper3d.com), and Fologram (www.fologram.com).
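To illustrate the kind of data such a multisystem platform passes between its components, the sketch below shows how a drone's heading might be recovered from the pose quaternion that a motion capture system typically streams (the function name is hypothetical and not part of the workshop codebase; it assumes a Z-up frame and (x, y, z, w) quaternion ordering, as is common in mocap/ROS pipelines):

```python
import math

def yaw_from_quaternion(qx, qy, qz, qw):
    """Heading about the vertical axis, in radians, from a unit quaternion.

    Assumes a Z-up coordinate frame and (x, y, z, w) component order,
    a common convention in motion capture and ROS pipelines.
    """
    # Standard ZYX (yaw-pitch-roll) extraction of the yaw term.
    siny_cosp = 2.0 * (qw * qz + qx * qy)
    cosy_cosp = 1.0 - 2.0 * (qy * qy + qz * qz)
    return math.atan2(siny_cosp, cosy_cosp)

# An identity quaternion (no rotation) yields a heading of zero;
# a 90-degree rotation about Z yields pi/2.
print(yaw_from_quaternion(0.0, 0.0, 0.0, 1.0))
```

In practice a value like this would be updated live as the mocap system tracks the drone, and fed into the parametric model driving the mixed reality interface.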

Participants will be provided with a comprehensive demonstration of the system and tutorials on its operation at the outset of the workshop. Subsequently, all developmental work will be performed primarily through Grasshopper and Fologram, in conjunction with interactive operation of the multisystem cyber-physical design environment. Initial drone operations will range from automated flight algorithms developed at IAAC to manually controlled flight interactions with operable digital design models developed in the Spatial Futures Lab. Workshop aims will revolve around the development of new design protocols enabled by the cyber-physical platform and demonstrations of interactive workflows that produce precisely tuned digital models of spatial configurations and architectural components.
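The automated side of such flight operations can be sketched in its simplest form as a proportional step toward a waypoint: the mocap system supplies the drone's current position, the design model supplies the target, and a clamped velocity command is sent to the drone. This is a minimal illustrative sketch, not the workshop's actual flight code; the gain and speed limit are assumed values:

```python
def velocity_command(current, target, gain=0.8, v_max=0.5):
    """Per-axis proportional velocity command (m/s) toward a target point.

    `current` comes from live tracking, `target` from the design model.
    Gain and speed limit are illustrative, not tuned values.
    """
    cmd = []
    for c, t in zip(current, target):
        v = gain * (t - c)          # proportional term on this axis
        v = max(-v_max, min(v_max, v))  # clamp to the speed limit
        cmd.append(v)
    return cmd

# 1 m offset along X: the 0.8 m/s proportional term is clamped to 0.5 m/s.
print(velocity_command((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```

In a ROS-based setup a loop like this would run at the tracking rate, republishing the command as the drone converges on each waypoint of the digital model.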

Learning objectives

At course completion the student will be able to:
– Operate through various channels within a cyber-physical design environment
– Customize and interact with a mixed reality (MR) interface
– Perform live parametric and geometric operations with a drone
– Design architectural components with UAS and live MR