“Neoperceptions” aims to explore ways of augmenting human perception.
This new art-meets-science project brings together contributions from researchers at the MIT Physics Department and the MIT Media Lab, as well as Arizona State University Sci Hub, the Boston Museum of Science, and the synesthesia-influenced composer Mary Bichner. It is advised by theoretical physicists Frank Wilczek (Nobel laureate and Herman Feshbach Professor of Physics) and Nathan Newman (Lamonte H. Lawrence Professor in Solid State Science).
The “Neoperceptions” team created a new LED-based live-performance technology that displays, in real time, the color patterns a synesthetic artist associates with her music through polymodal synesthesia, allowing audience members to “hear” color and “see” sound much as a synesthete would. The visuals also include intricately designed garments for the performers, made of conductive and light-transmitting materials, that translate the performers’ music into light patterns on their costumes, adding a whole new level to the synesthetic experience. The initiative was funded in part by the Council for the Arts at MIT.
Synesthesia is best described as a crossing of the senses or, as Merriam-Webster defines it, “a concomitant sensation.” Some synesthetes associate sight with sound, others sound with taste, still others taste with touch. In Bichner’s case, she associates each note in the Western musical scale with a specific color, and “sees” splashes of these colors when she hears the corresponding pitches sounded. These color combinations are often the inspiration for her work, and she conveys them to the audience in live performance through video and other means. Through the “Neoperceptions” project, a series of LED-based visual devices and the performers’ costumes will be able to accurately convey these color patterns for the first time.
The prototype for this technology premiered during Bichner’s “Synesthesia Suite” concert at the Charles Hayden Planetarium on April 4, 2019, in the form of specially designed LED panels attached to audio-capture devices and placed strategically around the planetarium’s full 360-degree dome, along with dome projections and illuminated garments. It has since been exhibited at MIT Design Week, the MIT Media Lab, and other events. A future concert featuring an even more advanced version of the technology is planned for fall/winter 2019.