Dynamic Stochastic Synthesis (DSS) is a direct digital synthesis method invented by Iannis Xenakis that produces a wave of variable periodicity through regular stochastic variation of its wave cycle, resulting in emergent pitch and timbral features. While high-level parametric control of the algorithm enables a variety of musical behaviors, composing with DSS is difficult because its parameters lack basis in perceptual qualities. The Xenos virtual instrument plug-in implements DSS with modifications and extensions that enhance its suitability for general composition. Written in C++ using the JUCE framework, Xenos offers DSS in a convenient, efficient, and widely compatible polyphonic synthesizer that facilitates composition and performance through host-software features, including MIDI input and parameter automation. Xenos also introduces a pitch-quantization feature that tunes each period of the wave to the nearest frequency in an arbitrary scale. Custom scales can be loaded via the Scala tuning standard, enabling both xenharmonic composition at the mesostructural level and investigation of the timbral effects of microtonal pitch sets on the microsound timescale.
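The core idea of DSS, a bounded random walk over the breakpoints of a single wave cycle, can be sketched in a few lines. The following Python sketch is illustrative only, not the Xenos implementation; all names (`StochasticCycle`, `quantize_period`) and parameter choices are assumptions made for the example, including a simple pitch-quantization step in the spirit of the feature described above.

```python
import random

class StochasticCycle:
    """Minimal dynamic stochastic synthesis sketch (illustrative, not Xenos's code).

    A wave cycle is a polyline through n breakpoints. After each cycle, every
    breakpoint's amplitude and duration takes a bounded random step, so pitch
    and timbre drift stochastically from period to period.
    """

    def __init__(self, n=12, sr=44100, step=0.02, seed=0):
        self.sr = sr
        self.step = step
        self.rng = random.Random(seed)
        self.amps = [0.0] * n               # breakpoint amplitudes in [-1, 1]
        self.durs = [sr / (220 * n)] * n    # samples per segment (~220 Hz start)

    def _walk(self, x, lo, hi, scale):
        # Bounded random walk: take a step, then reflect at the barriers.
        x += self.rng.uniform(-scale, scale)
        while x < lo or x > hi:
            x = 2 * lo - x if x < lo else 2 * hi - x
        return x

    def next_cycle(self):
        """Perturb all breakpoints and return one wave cycle as samples."""
        self.amps = [self._walk(a, -1.0, 1.0, self.step) for a in self.amps]
        self.durs = [self._walk(d, 2.0, 64.0, self.step * 10) for d in self.durs]
        out = []
        prev = self.amps[-1]
        for a, d in zip(self.amps, self.durs):
            n = max(1, round(d))
            # Linear interpolation between successive breakpoints.
            out.extend(prev + (a - prev) * i / n for i in range(n))
            prev = a
        return out

def quantize_period(total_samples, sr, scale_hz):
    """Snap a cycle's length to the nearest frequency in scale_hz
    (a simplified stand-in for the pitch-quantization feature)."""
    f = sr / total_samples
    nearest = min(scale_hz, key=lambda s: abs(s - f))
    return sr / nearest
```

Because each period is quantized independently, a microtonal pitch set applied at this timescale shapes timbre as much as pitch, which is the investigation the abstract points to.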
We’ll show you how to design and make 3D printed patterns to create soft 3D crafts like ornaments, decorations, and jewelry.
The workshop is free of charge and all materials will be provided. You must be 18 years or older, and available for both days of the workshop to participate.
Space is limited. To enroll, complete and submit this form: https://bit.ly/experimental-embroidery
For more information, please visit: https://ecl.mat.ucsb.edu/embroidery.
Generating Sound and Organizing Time with Gen~
The gen~ environment for Max/MSP lets us work on sonic algorithms down to sample and subsample levels. Written a decade ago during my doctoral research at MAT, UCSB, it has become a widely used platform for sonic experimentation, interactive arts, product design, and music making by artists such as Autechre, Robert Henke, and Jim O’Rourke. At its heart is the capacity to write whole algorithms that are processed one sample at a time, allowing unique access to filter, oscillator, and other micro-level synthesis designs; each edit made while patching seamlessly regenerates optimized machine code under the hood (which can also be exported for use elsewhere). In this talk I will introduce the gen~ environment, including how and why it was developed, but also where and how it can be applied and what it has inspired. This will draw material from a new book about sonic thinking with signals and algorithmic patterns, with a gamut of synthesis and audio processing examples demystified through patching with gen~.
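What single-sample processing buys you can be illustrated outside Max with a plain loop. This hypothetical Python sketch (not gen~ code) shows a one-pole lowpass whose feedback depends on the previous output sample, the kind of one-sample-delay structure that sample-level patching makes directly expressible.

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sr=44100):
    """Per-sample one-pole lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1]).

    The feedback term inside the loop uses the output from one sample ago;
    block-based environments cannot express this single-sample delay, which
    is exactly the territory gen~ opens up.
    """
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sr)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)   # one-sample feedback: y depends on previous y
        out.append(y)
    return out
```

Fed a unit step, the filter's output rises smoothly toward 1, the classic first-order response.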
Graham Wakefield is an Associate Professor and Canada Research Chair in the department of Computational Arts at York University, where he leads the Alice Lab, dedicated to computational art practice and software development in mixed/hybrid reality. His ongoing research practice applies a deep commitment to the open-endedness of computation—as an art material—as expressed both in new software for artists and musicians (such as the gen~ environment for Max/MSP), as well as immersive artworks of biologically-inspired systems (working with Haru Ji as Artificial Nature). These installations have been exhibited in many international venues and events, including La Gaîté Lyrique/Paris and ZKM/Germany, and his research has been published in the Computer Music Journal, IEEE Computer Graphics & Applications, the International Journal of Human-Computer Studies, Leonardo, ICMC, NIME, SIGGRAPH, ISEA, EvoWorkshops, and many more.
Graham Wakefield's new book (October 2022):
For more information about the MAT Seminar Series, go to:
Space Control is a multitrack workstation dedicated to the design, realization, and mixture of spatial gestures for electroacoustic music composition. With its simple interface and minimal learning curve, it makes quick and powerful spatialization available to users of all experience levels.
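As a taste of the kind of building block such a tool rests on, here is a hedged Python sketch of equal-power stereo panning, one of the most basic spatialization primitives. This is illustrative only and is not Space Control's algorithm; the function name and pan-law choice are assumptions for the example.

```python
import math

def equal_power_pan(sample, pos):
    """Equal-power stereo pan.

    pos in [0, 1]: 0 = hard left, 1 = hard right. Using cos/sin gains keeps
    the total power (left^2 + right^2) constant across the pan range, so a
    moving source does not appear to change loudness mid-trajectory.
    """
    theta = pos * math.pi / 2
    return sample * math.cos(theta), sample * math.sin(theta)
```

A spatial gesture, in these terms, is simply `pos` automated over time; multichannel systems generalize the same idea to more speakers.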
Released in June 2022, Space Control was created by the team of Professor João Pedro Oliveira, acting as project manager, and software developer Raphael Radna. Radna is a PhD candidate in Music Composition at UC Santa Barbara, and is also pursuing a Master of Science degree from the Media Arts and Technology Graduate Program at UCSB.
There is also a Quick Start video available on YouTube:
The project was supported by a Faculty Research Grant from the UCSB Academic Senate.
Parasitic Signals - Coexistence with SARS-CoV-2
This project transforms a striking nanoscale biological phenomenon, the relationship between the SARS-CoV-2 (corona) virus and human molecules, into an interactive audiovisual simulation. In the current pandemic, the SARS-CoV-2 virus is a key interest across all fields of science. In collaboration with scientists at Johannes Kepler University (JKU) in Linz, Austria, we are simulating the relationship between the SARS-CoV-2 virus and human lectin proteins using Atomic Force Microscopy (AFM), which can probe a single molecule to measure the binding force between the virus and the protein. We are creating an interactive audiovisual installation and performance from a data set of interactions between the coronavirus and human protein. The audience will be invited into an immersive space where they can control the behavior of the two biomolecules, and thereby intuitively grasp the biological characteristics of the coronavirus and the human protein.
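One common way to turn measurement series like AFM binding forces into sound is parameter-mapping sonification. The sketch below is purely hypothetical: the function name, the piconewton units, and the linear force-to-pitch mapping are assumptions for illustration, not the project's actual sonification design.

```python
def sonify_force(forces_pn, f_lo=110.0, f_hi=880.0):
    """Hypothetical parameter-mapping sonification.

    Maps a series of binding-force readings (e.g. in piconewtons) linearly
    onto a pitch range, so stronger virus-protein binding sounds higher.
    """
    lo, hi = min(forces_pn), max(forces_pn)
    span = (hi - lo) or 1.0          # avoid division by zero for flat data
    return [f_lo + (f - lo) / span * (f_hi - f_lo) for f in forces_pn]
```

Each returned frequency could then drive an oscillator, one straightforward route from molecular data to an audible gesture.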
This project is not only a demonstration of scientific data and the development of a sonification tool; it also examines the interspecies relationship of parasitism as a mutualistic, long-term one. The coronavirus has had an enormous impact on society as well as on individuals. Through this collaboration, the project continues a series on parasitism in humans, addressing our current and future life with the coronavirus from social, political, and cultural perspectives. As the coronavirus is being extensively researched around the world, this project demonstrates how we might explore and control our coexistence with it in virtual space.
Ars Electronica Center, Linz Austria.
Kepler’s Gardens at JKU Campus, Linz, Austria.
Professor Kuchera-Morin and Dr. Rincon will be joined by Jean Johnstone of UC Berkeley, and will evaluate the impact of arts, culture, and entertainment on the future of California.
About the California 100 Research Grants
California 100 is a new statewide initiative being incubated at the University of California and Stanford University focused on inspiring a vision and strategy for California’s next century that is innovative, sustainable, and equitable. The initiative will harness the talent of a diverse array of leaders through research, policy innovation, advanced technology, and stakeholder engagement. As part of its research stream of work, California 100 is sponsoring 13 research projects focused on the following issue areas:
Gustavo Alfonso Rincon
EmissionControl2 is a granular sound synthesizer. The theory of granular synthesis is described in the book Microsound (Curtis Roads, 2001, MIT Press).
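The central object of granular synthesis is the grain: a short sound particle, typically a few to a hundred milliseconds long, shaped by an envelope and scattered in time to build larger textures. The Python sketch below is an illustrative toy, not EmissionControl2's implementation; its function names and the choice of a sine carrier with a Hann envelope are assumptions for the example.

```python
import math

def make_grain(freq_hz, dur_ms, sr=44100):
    """One sound grain: a sine burst shaped by a Hann envelope.

    The envelope tapers both ends to zero so grains can overlap
    without clicks. Illustrative only, not EmissionControl2's code.
    """
    n = max(2, int(sr * dur_ms / 1000))
    return [math.sin(2 * math.pi * freq_hz * i / sr)            # carrier
            * 0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))   # Hann window
            for i in range(n)]

def scatter_grains(grains_onsets, length):
    """Mix (grain, onset_sample) pairs into one output buffer."""
    out = [0.0] * length
    for grain, onset in grains_onsets:
        for i, s in enumerate(grain):
            if onset + i < length:
                out[onset + i] += s
    return out
```

Varying grain frequency, duration, density, and onset scatter is what produces the clouds and streams of sound that Microsound catalogs.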
Released in October 2020, the new app was developed by a team consisting of Professor Curtis Roads acting as project manager, with software developers Jack Kilgore and Rodney Duplessis. Kilgore is a computer science major at UCSB. Duplessis is a PhD student in music composition at UCSB and is also pursuing a Master's degree in the Media Arts and Technology graduate program.
EmissionControl2 is free and open-source software available at: github.com/jackkilgore/EmissionControl2/releases/latest
The project was supported by a Faculty Research Grant from the UCSB Academic Senate.
Media Arts and Technology (MAT) at UCSB is a transdisciplinary graduate program that fuses emergent media, computer science, engineering, electronic music and digital art research, practice, production, and theory. Created by faculty in both the College of Engineering and the College of Letters and Science, MAT offers an unparalleled opportunity for working at the frontiers of art, science, and technology, where new art forms are born and new expressive media are invented.
In MAT, we seek to define and to create the future of media art and media technology. Our research explores the limits of what is possible in technologically sophisticated art and media, both from an artistic and an engineering viewpoint. Combining art, science, engineering, and theory, MAT graduate studies provide students with a combination of critical and technical tools that prepare them for leadership roles in artistic, engineering, production/direction, educational, and research contexts.
The program offers Master of Science and Ph.D. degrees in Media Arts and Technology. MAT students may focus on an area of emphasis (multimedia engineering, electronic music and sound design, or visual and spatial arts), but all students should strive to transcend traditional disciplinary boundaries and work with other students and faculty in collaborative, multidisciplinary research projects and courses.