The ASCAP Lab announced that Living Colour founder and guitarist Vernon Reid has joined as artist in residence, unveiled the finalists from its NYC Media Lab Seed Project and previewed several upcoming virtual speaking engagements.

First, Reid will appear at the NYCML Summit on Friday, Oct. 9 in a keynote conversation with Brooke Eplee of The ASCAP Lab and finalists from The ASCAP Lab Seed Project. The panel, “A Musician’s Perspective on Innovation and its Role in the Creative Process,” will explore what’s next in technology and music creation and experience.

Next, the ASCAP Lab, together with New York City Media Lab, sponsored an 11-week open university challenge for graduate students and faculty in creative tech, music, design and technology. Five selected teams, all from New York University, engaged in early-stage R&D to explore new ways to create and experience music using emerging technologies such as spatial audio, computer vision and machine learning tools for object detection. Two of these teams, CHILLER and Madd by Team MiDi, will present their prototypes on Oct. 8 at this week’s New York City Media Lab Summit.

CHILLER developed a tiny device that adds another dimension to the experience of music. Based on the small, low-cost Raspberry Pi computer, their Computer Human Interface for Live Labeling of Emotional Responses (CHILLER) can read the body’s physical and emotional reactions to music, converting them into data that can then be leveraged in a multitude of ways by emerging technologies. LEDs light up when emotional reactions are detected and provide a new way for performers and audiences to interact, whether the performance is virtual, recorded or live in the same room.
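The article doesn't publish CHILLER's implementation, but the behavior it describes — sampling a physiological signal, labeling moments of emotional response and lighting an LED when one is detected — could be sketched roughly as follows. All names, thresholds and the simulated sensor trace here are hypothetical; an actual build would read a sensor attached to the Raspberry Pi's GPIO pins rather than a Python list.

```python
# Hypothetical sketch of CHILLER-style live labeling: sample a
# physiological signal, flag moments where it spikes above a rolling
# baseline, and treat each flagged index as an "LED on" event.
# The sensor is simulated here; on a Raspberry Pi the readings would
# come from a GPIO/ADC-attached sensor instead.
from statistics import mean

def detect_reactions(samples, window=5, threshold=1.5):
    """Return indices where a sample exceeds the rolling baseline
    by `threshold` times -- a crude stand-in for emotional labeling."""
    events = []
    for i in range(window, len(samples)):
        baseline = mean(samples[i - window:i])
        if baseline > 0 and samples[i] > baseline * threshold:
            events.append(i)  # a real device would switch an LED on here
    return events

# Simulated skin-conductance-style trace with a spike at index 8
trace = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 1.1, 0.95, 2.4, 1.2]
print(detect_reactions(trace))  # -> [8]
```

The detected-event indices stand in for the data stream the team describes: once a reaction is labeled, it can drive LEDs, be logged, or be forwarded to other systems.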

Using the 30-year-old keyboard-based MIDI input as a starting point, Madd by Team MiDi created a new device for audio sampling and editing that features a 21st-century, user-friendly interface powered by a sensor that the team built from scratch. Compared with traditional keypad interfaces, the soft-surface sensor captures nuances in a user’s touch, allowing for increased expressiveness and customization of sound.
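Team MiDi's sensor design isn't public, but the idea of capturing touch nuance maps naturally onto MIDI's note-on message, whose 7-bit velocity field (0–127) is how expressiveness is conveyed to downstream instruments. A minimal sketch, assuming a normalized pressure reading (the sensor range and linear mapping are illustrative, not the team's actual calibration):

```python
# Hypothetical sketch: map a soft-surface pressure reading to a raw
# 3-byte MIDI note-on message. Velocity is a 7-bit value (0-127), so
# the continuous pressure signal must be scaled and clamped.

def pressure_to_velocity(pressure, p_min=0.0, p_max=1.0):
    """Linearly map a normalized pressure reading to MIDI velocity 1-127."""
    p = min(max(pressure, p_min), p_max)     # clamp to sensor range
    scaled = (p - p_min) / (p_max - p_min)   # normalize to 0.0-1.0
    return max(1, round(scaled * 127))       # 0 would mean note-off

def note_on(channel, note, velocity):
    """Build a raw MIDI note-on message: status 0x90 | channel, note, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Half-pressure touch on middle C (MIDI note 60), channel 0
msg = note_on(channel=0, note=60, velocity=pressure_to_velocity(0.5))
print(msg.hex())
```

A keypad interface would collapse every touch to the same few velocity values; a continuous pressure sensor lets each byte of that message reflect how the surface was actually struck, which is the expressiveness gain the team describes.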

“Collaborating with these five incredible university teams and seeing them apply their imagination and ingenuity to the ASCAP Lab Seed Project has been inspiring,” said Nick Lehman, ASCAP Chief Strategy and Digital Officer. “We’re excited to nurture this type of emerging talent who will push the envelope of music creation tools and experiences to benefit ASCAP members and drive innovation in the music industry. We want to extend our sincere thanks to everyone who participated and to the NYC Media Lab for their partnership.”

A highlight video on all the ASCAP Lab seed project teams is available at www.ascaplab.com.

Finally, on October 15, ASCAP Lab will present “Can We Learn to Stop Worrying and Love AI?,” a virtual session with Hod Lipson, artificial intelligence expert and Professor of Engineering and Data Science at Columbia University. Lipson will explain why, with a combination of increased investment, computing speed and more, we will soon enter an era of AI “chaos and amazement,” and how this could affect the future of music. Registration for the free virtual session, which is open to ASCAP members, is available at https://bit.ly/3l7pDNa.

For more information on The ASCAP Lab, visit www.ascaplab.com.