It’s a good idea to visit the website of MIT’s Senseable City Lab from time to time. It showcases plenty of refreshing projects and experiments that highlight “the increasing deployment of sensors and hand-held electronics in recent years”. The lab investigates the ‘real-time city’ that these rapid technological developments have produced. Recently, the Senseable City Lab and the Aerospace Robotics and Embedded Systems Laboratory (ARES) launched Flyfire, a collaborative project that attempts to transform any ordinary space into a highly immersive and interactive display environment using self-organizing micro helicopters. It can be considered a new step in designing the sky above us.
The idea behind Flyfire is to explore the capabilities of a display system built from a large number of these LED-equipped micro helicopters. The small LEDs allow each helicopter to act as a smart pixel. Through precisely controlled, synchronized movements, the helicopters form an elastic display surface for any desired scenario. Self-stabilizing and precision control technology from the ARES Lab enables real-time adaptation of the pixels’ motion.
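The core idea of a “smart pixel” can be pictured as mapping each lit pixel of a source image to a hover target in 3D space. The sketch below is purely illustrative, since the actual Flyfire control stack has not been published; all function and parameter names here are hypothetical assumptions.

```python
# Hypothetical sketch: turning lit image pixels into 3D hover targets
# for a swarm of LED micro helicopters. Not the actual Flyfire code.

def image_to_targets(lit_pixels, scale=0.5, altitude=2.0):
    """Map lit image pixels (col, row) to hover targets (x, y, z).

    The row axis is flipped so the image appears upright in the air;
    'altitude' is the height of the lowest row above the ground and
    'scale' converts pixel spacing to metres.
    """
    if not lit_pixels:
        return []
    max_row = max(r for _, r in lit_pixels)
    return [(c * scale, 0.0, altitude + (max_row - r) * scale)
            for c, r in lit_pixels]
```

Each helicopter would then fly to its assigned target and hold position there, so the swarm as a whole reproduces the image.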
“The Flyfire canvas can transform itself from one shape to another or morph a two-dimensional photographic image into an articulated shape. The pixels are physically engaged in transitioning images from one state to another, which allows the Flyfire canvas to demonstrate a spatially animated viewing experience.”
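The morphing the quote describes means each pixel physically travels from its position in one image to its position in the next. A minimal way to sketch that transition, assuming paired target positions and a simple linear interpolation (the real system would use more sophisticated trajectory planning):

```python
# Hypothetical sketch of the canvas "morph": each pixel moves from its
# current target to its goal target as t runs from 0.0 to 1.0.

def morph_step(current, goal, t):
    """Linearly interpolate each pixel's (x, y, z) position."""
    return [tuple(a + t * (b - a) for a, b in zip(p, q))
            for p, q in zip(current, goal)]
```

Stepping t from 0 to 1 over a few seconds would make the whole canvas appear to flow from one shape into the other.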
Flyfire serves as an initial step to explore and imagine the possibilities of this free-form display — “a swarm of pixels in a space”, as the scientists call it. Check out the video below to find out more about the project.