Real-Time Sensing and Artificial Intelligence for Live Performative Arts
Workshop
Duration: 12 hours
Wednesday, 06 March 2024 | 18:00 > 21:00 | Classroom 1
Thursday, 07 March 2024 | 18:00 > 21:00 | Classroom 1
Friday, 08 March 2024 | 18:00 > 21:00 | Classroom 1
Saturday, 09 March 2024 | 18:00 > 21:00 | Classroom 1
AIM: This workshop aims to equip live performers and art practitioners with the skills to integrate sensors into creative coding applications and to build live demonstrations driven by real-time sensor data. Participants will learn to collect data from a range of sensors (over standard interfaces such as USB and serial communication), visualize the real-time data stream, and employ pretrained neural networks for data analysis, data augmentation, and creative applications. The framework and language used are openFrameworks (https://openframeworks.cc/) and C/C++.
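As a taste of the kind of plumbing the workshop covers, here is a minimal, framework-free C++ sketch of the first step: turning one line of text, as it might arrive from a sensor over a serial port, into numbers. The comma-separated line format is an assumption for illustration; in openFrameworks the raw bytes would come from the serial API, but the parsing idea is the same.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parse one comma-separated line of sensor readings, e.g. "0.12,0.98,23.5",
// into a vector of floats. Here the line is a plain string so the sketch is
// self-contained; in a real patch it would be read from the serial port.
std::vector<float> parseSensorLine(const std::string& line) {
    std::vector<float> values;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ',')) {
        values.push_back(std::stof(field));  // throws on non-numeric fields
    }
    return values;
}
```

Once the readings are floats, visualizing them is a matter of mapping each value to a position or color in the draw loop.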
AUDIENCE: Live performers and art practitioners with very little programming knowledge. Non-experts can ask ChatGPT for help with coding! We will provide code examples, slides, and help in getting started.
WORKSHOP STRUCTURE:
- Session 1: Creative coding, openFrameworks, the basics of programming, interactivity.
- Session 2: openFrameworks, creating add-ons in OFX, sensors (Kinect, camera), and MIDI (in/out).
- Session 3: Neural networks basics, working with data, and neural networks in OFX.
- Session 4: Let's build a demo together: sensors, OFX, and neural nets - interpretation, augmentation, and generative AI.
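Session 2 touches MIDI in/out. The wire format is simple enough to show in a few lines: a Note On message is three bytes, a status byte (0x90 plus the channel) followed by the note number and velocity. The sketch below only constructs the bytes; actually sending them from openFrameworks would typically go through a MIDI add-on, which is outside this self-contained example.

```cpp
#include <array>
#include <cstdint>

// Build a 3-byte MIDI Note On message: status (0x90 | channel),
// then note number and velocity, both limited to 7 bits per the MIDI spec.
std::array<uint8_t, 3> midiNoteOn(uint8_t channel, uint8_t note, uint8_t velocity) {
    return { static_cast<uint8_t>(0x90 | (channel & 0x0F)),
             static_cast<uint8_t>(note & 0x7F),
             static_cast<uint8_t>(velocity & 0x7F) };
}
```

Mapping a sensor value to the velocity byte is one of the quickest ways to make a performance react to movement.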
METHODOLOGY:
- Hands-on, practical approach.
- Following and building upon a single example.
- Real-time development of a demo project with audience participation.
MATERIALS:
- Participants to use their laptops.
- Provided sensors (radar, camera, lights).
- Course materials: Slides, example add-ons for sensor integration.
- Post-workshop: Release of materials for public access.
TOOLS AND TECHNOLOGIES:
- openFrameworks
- PyTorch/TensorFlow
- Various open-source software
- Various sensors: 24 GHz radar, cameras, Microsoft Kinect (Xbox 360).
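Session 3 introduces neural network basics. The core building block is a single artificial neuron: a weighted sum of the inputs plus a bias, passed through a nonlinearity such as ReLU. The pretrained networks used in the workshop are stacks of many such units; this framework-free sketch shows one of them.

```cpp
#include <vector>

// One artificial neuron: weighted sum of inputs plus bias, then a ReLU
// activation (negative outputs are clamped to zero).
float neuronForward(const std::vector<float>& x,
                    const std::vector<float>& w, float bias) {
    float sum = bias;
    for (std::size_t i = 0; i < x.size() && i < w.size(); ++i) {
        sum += x[i] * w[i];
    }
    return sum > 0.0f ? sum : 0.0f;  // ReLU
}
```

Feeding parsed sensor values into a layer of such neurons is, at its simplest, what "neural networks in OFX" means in practice.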
WORKSHOP FORMAT: Interactive, project-based learning. Emphasis on collaboration among participants of varying skill levels to address different learning curves.
SLIDES AND EXAMPLES (GitHub repo): https://github.com/federicohyo/real-time-sensors-and-artificial-intelligence-for-live-performative-arts/tree/main
Authors
- I am an assistant professor in the Electrical Engineering Department of the Eindhoven University of Technology (TU/e), leading the Neuromorphic Edge Computing Systems (NECS) Lab. My interests lie in neural networks, artificial intelligence, and neurobiological systems. My research is devoted to the development of bio-inspired unconventional computing machines comprising sensory systems and information-processing devices based on bio-inspired neural networks.
- PhD candidate at the NECS Lab of TU Eindhoven, working on neuromorphic engineering, embedded programming, and hardware design.