The Apprentice Agents project is an ongoing collaboration with the Social Robotics Lab at Yale University that integrates cognitive and robotic architectures to implement a robotic system that learns elements of its semantic and episodic memory through language interaction with people. The robotic system can extract, represent, and reason over the meaning of the user's natural language utterances.
This work facilitates a bi-directional grounding: implicit robotic skills are grounded in explicit ontological and episodic knowledge, and ontological symbols are grounded in the robot's real-world actions.
The New York Digital Urbanism Lab aims to be a testing ground for smart city initiatives. It is designed to allow for the reconfiguration of space: from the small scale (the individual's interface with the building) to a middle scale (that of the architectural component, such as the wall) to the large scale of the building (including modular electrical schemes to support changing demand). A taxonomy of workspaces allows research clusters to be reconfigured according to changing group sizes, projects, and people's varying needs for privacy and ownership.
This project outlines a site-specific predictive model for coordinating events and configurations of the Civic Forum at Doca de Santos, Lisbon. The model is informed by direct and voluntary human input, gathered through the combination of a mobile app and an urban dashboard display. The application seeks to crowd-source issue-based direct democracy and community events for Doca de Santos. Embedded on-site sensors collect data used to build a predictive model that programs the civic forum according to that data, the time of year, the day of the week, and the time of day.
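The scheduling logic described above can be sketched as a simple frequency model over time slots. This is a minimal illustration only: the slot features (season, weekday/weekend, period of day) and the event labels are assumptions for the example, not the project's actual model or data.

```python
from collections import Counter, defaultdict
from datetime import datetime

def time_slot(ts: datetime) -> tuple:
    """Bucket a timestamp into (season, is_weekend, period of day).

    Season index: 0 = winter, 1 = spring, 2 = summer, 3 = autumn.
    Period: hour // 6 maps 0-5 -> night, 6-11 -> morning,
    12-17 -> afternoon, 18-23 -> evening.
    """
    season = (ts.month % 12) // 3
    is_weekend = ts.weekday() >= 5
    period = ("night", "morning", "afternoon", "evening")[ts.hour // 6]
    return (season, is_weekend, period)

class SlotModel:
    """Predict the most frequently observed configuration for a time slot."""

    def __init__(self):
        # One Counter of configurations per time slot.
        self.counts = defaultdict(Counter)

    def observe(self, ts: datetime, configuration: str) -> None:
        """Record one logged event (from sensors or app input)."""
        self.counts[time_slot(ts)][configuration] += 1

    def predict(self, ts: datetime):
        """Return the most common configuration for this slot, or None."""
        slot = self.counts.get(time_slot(ts))
        if not slot:
            return None
        return slot.most_common(1)[0][0]

# Hypothetical usage with illustrative event labels:
model = SlotModel()
model.observe(datetime(2019, 7, 6, 19), "open-air concert")
model.observe(datetime(2019, 7, 13, 20), "open-air concert")
model.observe(datetime(2019, 7, 20, 18), "community assembly")
print(model.predict(datetime(2019, 7, 27, 19)))  # → open-air concert
```

A production model would of course weigh richer signals (attendance counts, weather, app votes), but the slot-based lookup captures the core idea of programming the forum by time of year, day of week, and time of day.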
This expanded Wiki-visualizer scales up to the size of a room to capitalize on one way in which humans rival computers: recognizing patterns within massive sets of spatialized data. A main goal of the application is to achieve both searchability and serendipity in information search, a goal facilitated by the immersive environment, which allows users to draw on their spatio-temporal memory during knowledge exploration.
STRING is an audiovisual performance piece that takes place in a 360-degree immersive room and involves feedback between the room, the musician, and string instruments. The interactive environment is built with a Max/MSP patch, contact microphones, and a guitar modulated by an amp/filter app.