fluid interfaces’ platform controls physical objects with simple virtual manipulations
all images courtesy of fluid interfaces

fluid interfaces, a research group at the MIT media lab in cambridge, massachusetts, has created the ‘reality editor’, a new kind of tool that empowers users to connect and manipulate the functionality of physical objects. the interface lets users drag a virtual line from one object to another to create a relationship between them. the program is the result of three years of MIT research into a platform that grants the user maximum control by leveraging human strengths such as spatial coordination, muscle memory, and tool-making.

 

a demonstration of the ‘reality editor’
video courtesy of fluid interfaces

for example, simply pointing the ‘reality editor’ at a lamp and drawing a line to it enables the lamp to be controlled wirelessly, without any preliminary instructions. if users want a timer linked to the light, they can borrow the functionality of an object that already has one, such as a TV, by drawing a line from it to the light. the designers and engineers at fluid interfaces have also open-sourced the underlying ‘open hybrid’ platform, allowing fellow DIY creators to fully integrate the interface into next-generation objects.
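the linking model is simple enough to sketch in code. the snippet below is a minimal, hypothetical illustration, not the actual ‘open hybrid’ API: it assumes each connected object exposes named inputs and outputs, and that ‘drawing a line’ amounts to subscribing one object’s input to another object’s output. the names HybridObject, linkTo and emit are invented for this sketch.

```typescript
// conceptual sketch only: hypothetical types and names, not the real open hybrid API.
// each object exposes named input/output points; 'drawing a line' in the editor
// subscribes one object's input handler to another object's output.

type IOValue = number; // values normalized to 0..1 in this sketch

interface IOPoint {
  name: string;
  subscribers: ((value: IOValue) => void)[];
}

class HybridObject {
  private outputs = new Map<string, IOPoint>();
  private inputs = new Map<string, (value: IOValue) => void>();

  constructor(public readonly id: string) {}

  // declare an output (e.g. a TV's timer) that other objects can listen to
  addOutput(name: string): void {
    this.outputs.set(name, { name, subscribers: [] });
  }

  // declare an input (e.g. a lamp's power switch) with a handler that drives hardware
  addInput(name: string, handler: (value: IOValue) => void): void {
    this.inputs.set(name, handler);
  }

  // publish a new value on an output; every linked input receives it
  emit(name: string, value: IOValue): void {
    this.outputs.get(name)?.subscribers.forEach((fn) => fn(value));
  }

  // 'drawing a line' from this object's output to another object's input
  linkTo(outputName: string, target: HybridObject, inputName: string): void {
    const out = this.outputs.get(outputName);
    const handler = target.inputs.get(inputName);
    if (out && handler) out.subscribers.push(handler);
  }
}

// usage: borrow the TV's timer to switch the lamp
const tv = new HybridObject("tv");
const lamp = new HybridObject("lamp");

tv.addOutput("timer");
lamp.addInput("power", (v) => console.log(`lamp power set to ${v}`));

tv.linkTo("timer", lamp, "power"); // the virtual line drawn in the editor
tv.emit("timer", 1);               // timer fires, lamp receives the value and turns on
```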

re-designed QR codes for recognizing objects
the interface draws lines for connecting actions
even chairs can be outfitted with the platform to offer added statistics