
Computation & Creativity
I enjoy exploring the intersection of computation and creativity. I have experience in robotics, coding, and generative AI algorithms that produce novel digital media. As an unpaid intern at the MIT Media Lab, I worked with graduate students on a variety of projects, helping to develop creative tools for myself and for under-resourced kids to learn about creativity & AI. The tools I helped create are used to empower Title I middle school students to create with and learn about AI. I did my most ambitious coding projects during the pandemic.
Texture mapping sketches with GANs
This is the first project I helped out with as a volunteer summer intern at the MIT Media Lab. Doodles to Pictures is a web app developed by graduate student Safinah Ali to help kids understand how AI algorithms, such as Generative Adversarial Networks (GANs), can be used in creative artworks. Different texture models were trained on corresponding data sets, and I gathered lots and lots of images to train them. I wanted kids to be able to produce whimsical, weird, and unexpected results as well as visually pleasing ones, so I chose training images for textures like cats, flowers, birds, reptiles, and shoes. In the web app, you draw the outline of something, choose a texture model, and the algorithm colors your line drawing with that texture, sometimes with surprising or slightly disturbing results, like a fluffy kitty flower! In the AI Literacy workshop on AI & Creativity, students first generated a very short story based on the writing style of an author (e.g., different text-generation models were trained on text from Shakespeare, Dr. Seuss, or Harry Potter). Students then illustrated their short stories with the Doodles to Pictures app. It turns out AI algorithms are already used to generate the short news summaries you read on the Internet!
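The texture-filling idea can be sketched without a neural network at all. Here is a toy stand-in that tiles a small texture patch over the interior of a doodle mask; the real app replaces this tiling step with a trained GAN generator, and the mask, patch, and sizes below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained texture GAN: instead of running a generator
# network, tile a small texture patch over the interior of a doodle.
def apply_texture(shape_mask, texture_patch):
    h, w = shape_mask.shape
    ph, pw, _ = texture_patch.shape
    # Tile the patch until it covers the canvas, then crop to canvas size.
    reps = (h // ph + 1, w // pw + 1, 1)
    tiled = np.tile(texture_patch, reps)[:h, :w, :]
    # Keep the texture only inside the doodle; everything else stays black.
    return tiled * shape_mask[..., None]

# A 32x32 "doodle" (a filled square) and a random 8x8 RGB texture patch.
mask = np.zeros((32, 32))
mask[8:24, 8:24] = 1.0
patch = rng.random((8, 8, 3))

out = apply_texture(mask, patch)
print(out.shape)  # (32, 32, 3)
```

The point of the sketch is the masking structure: the doodle supplies *where* to paint, the model supplies *what* to paint there.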

Doodle Bot
The Doodle Bot is the second project I worked on with Safinah. She developed a high school curriculum on robotics and Arduino coding where students could 3D print, assemble, and program the Doodle Bot to draw different kinds of things. My task was to do the entire hands-on curriculum and provide feedback on whatever I was confused about or where I got stuck. I had taken my first AP Computer Science class in high school, so this was a great opportunity to put some of my early coding skills to work. I enjoyed the 3D printing and hands-on building aspects of the project. And I coded the Arduino microcontroller to have the Doodle Bot draw something, like the letter "A" or a spirograph sort of thing. Safinah eventually piloted this curriculum with high school and college students in Mexico.
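The "spirograph sort of thing" can be sketched in Python as a hypotrochoid traced out as a list of (x, y) waypoints that a drawing robot could follow point-to-point. The actual Doodle Bot runs Arduino code, and the curve parameters here are illustrative, not the ones from the curriculum:

```python
import math

# Hedged sketch: generate waypoints for a spirograph-style curve
# (a hypotrochoid). R, r, d are illustrative parameters.
def hypotrochoid(R=5.0, r=3.0, d=5.0, steps=360):
    points = []
    for i in range(steps):
        # 3 revolutions: with R - r = 2 and r = 3 the curve closes at t = 6*pi.
        t = 2 * math.pi * 3 * i / steps
        x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
        y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
        points.append((x, y))
    return points

path = hypotrochoid()
print(len(path))  # 360 waypoints
```

On the real robot, each waypoint would be converted to motor commands; here the math is the interesting part.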

Explorations in AI and Music using VAEs
I worked with graduate student Stephen Kaputsos on this project. He was developing a curriculum for middle schoolers to learn about how AI can understand and generate sounds. A significant part of the curriculum used music to explore concepts in machine perception, sound representation, and sound generation. Given my background in music and coding, this was right up my alley.
We found a Google Colab notebook that could generate music synthesizer sequences using a generative machine learning technique called a variational autoencoder (VAE). We had to install a virtual machine on my laptop to run the code in Linux. I modified the code to replace the synthesizer with a drumbeat sequencer. Using a DAW (Ableton Live), I created a set of MIDI drum-beat sequences as inputs, and the algorithm interpolates between them to generate new drum sequences. I found that by playing with different parameters, such as the temperature parameter, I could encourage the VAE to be more or less "adventurous" in its interpolation explorations. You can definitely go too far and make the code produce drum sequences that sound like an overly caffeinated six-year-old on Adderall.
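A rough sketch of the two ideas at play, latent-space interpolation and temperature sampling, using made-up latent vectors and decoder logits (the real encoder and decoder belong to the trained VAE):

```python
import numpy as np

rng = np.random.default_rng(42)

def slerp(z0, z1, alpha):
    """Spherical interpolation between two latent vectors, the kind of
    blending MusicVAE-style models use to morph between sequences."""
    cos_omega = np.dot(z0, z1) / (np.linalg.norm(z0) * np.linalg.norm(z1))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    return (np.sin((1 - alpha) * omega) * z0
            + np.sin(alpha * omega) * z1) / np.sin(omega)

def sample_with_temperature(logits, temperature):
    """Higher temperature flattens the distribution -> more 'adventurous'
    picks; lower temperature concentrates on the likeliest drum hit."""
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(logits), p=probs)

# Pretend latents for two encoded drum beats (the real ones come from the VAE).
z_beat_a = rng.standard_normal(16)
z_beat_b = rng.standard_normal(16)
z_mid = slerp(z_beat_a, z_beat_b, 0.5)  # a groove halfway between the two

# Pretend decoder logits over 9 drum classes (kick, snare, hi-hat, ...).
logits = rng.standard_normal(9)
tame = [sample_with_temperature(logits, 0.5) for _ in range(8)]
wild = [sample_with_temperature(logits, 2.0) for _ in range(8)]
```

Sweeping alpha from 0 to 1 walks from one beat to the other; the temperature knob controls how literally the decoder follows its own probabilities.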
I have used this project to explore "composing with AI" in my own computational music projects. You can check out my composition, Algorhythm, on my music page; it incorporates AI-generated drum sequences. I tuned the temperature parameter down to generate the core groove and turned it up to generate the more interesting drum flourishes. The AI was a bit enthusiastic on the hi-hat, but hey, I worked with it. I guess that's part of the adventure in composing with generative AI: you're never quite sure what you're going to get.

A Ryanesque Beatboxing AI
Can I build a Robo-Ryan beatboxer? This project builds on the aforementioned VAE drum generator project, but with a twist. To turn it into a beatboxing generator, I worked with Stephen to train Generative Adversarial Network (GAN) models on recordings of my voice making beatboxing sounds. We used another Google Colab notebook to train the WaveGAN models. I first recorded a corpus of specific beatboxing sounds, about 40 examples of each type (kick, snare, hi-hat, and so on, since beatboxing emulates drumming with your voice). For example, the Ryan-kick model generates slightly different variations of how I make the beatboxing kick sound.
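The corpus-preparation step can be sketched as follows, assuming WaveGAN-style fixed-length clips at 16 kHz (the clip length is an assumption, and the recordings here are random noise standing in for the actual beatbox samples):

```python
import numpy as np

rng = np.random.default_rng(7)
SAMPLE_RATE = 16000      # assumed rate; WaveGAN commonly trains on 16 kHz audio
CLIP_LEN = SAMPLE_RATE   # assume one-second clips per beatbox sound

def prepare_corpus(recordings):
    """Pad or trim every recorded example to a fixed length so each class
    ('kick', 'snare', 'hihat', ...) becomes an (n_examples, CLIP_LEN) array:
    one training set per per-sound GAN model."""
    corpus = {}
    for name, clips in recordings.items():
        fixed = []
        for clip in clips:
            if len(clip) >= CLIP_LEN:
                fixed.append(clip[:CLIP_LEN])
            else:
                fixed.append(np.pad(clip, (0, CLIP_LEN - len(clip))))
        corpus[name] = np.stack(fixed)
    return corpus

# Noise stand-ins for the ~40 recorded examples per sound, varying in length.
raw = {name: [rng.standard_normal(rng.integers(8000, 20000))
              for _ in range(40)]
       for name in ("kick", "snare", "hihat")}
corpus = prepare_corpus(raw)
print(corpus["kick"].shape)  # (40, 16000)
```

Each array then trains its own model, which is why one per-sound model (like Ryan-kick) only knows how to make variations of that one sound.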
The training process was so compute-heavy that it brought my laptop to a grinding halt, so Stephen ran the training on Media Lab machines that were much more powerful than mine. It turns out each model took over 13 hours to train. Eye-opening.
Finally, with the Robo-Ryan beatboxing models in hand and the VAE-interpolated MIDI drum sequences, the last step was to use a Digital Audio Workstation (DAW) to map each MIDI drum sound to the corresponding beatboxing sound produced by the WaveGAN. Voila! This is how I could produce a Ryanesque beatboxing sequence.
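The DAW mapping step can be sketched as a simple sampler: a dictionary from General-MIDI-style drum note numbers to audio clips, plus a render function that mixes a clip into the output buffer at each MIDI event. The clips below are noise bursts standing in for the generated beatbox sounds:

```python
import numpy as np

rng = np.random.default_rng(1)
SAMPLE_RATE = 16000

# General-MIDI-style drum map: MIDI note number -> audio clip.
# Noise bursts here stand in for the GAN-generated beatbox sounds.
samples = {
    36: rng.standard_normal(4000) * 0.5,  # kick
    38: rng.standard_normal(3000) * 0.4,  # snare
    42: rng.standard_normal(1500) * 0.3,  # closed hi-hat
}

def render(events, length_seconds=2.0):
    """Mix (time_in_seconds, midi_note) events into one audio buffer,
    the way a DAW sampler triggers a clip for each incoming MIDI note."""
    out = np.zeros(int(length_seconds * SAMPLE_RATE))
    for t, note in events:
        clip = samples[note]
        start = int(t * SAMPLE_RATE)
        end = min(start + len(clip), len(out))
        out[start:end] += clip[: end - start]
    return out

# A one-bar groove at 120 BPM: kicks on 1 & 3, snares on 2 & 4, hats on eighths.
beat = 0.5  # seconds per beat at 120 BPM
events = [(0.0, 36), (1.0, 36), (0.5, 38), (1.5, 38)] + \
         [(i * beat / 2, 42) for i in range(8)]
audio = render(events)
print(audio.shape)  # (32000,)
```

Swapping the noise bursts for the per-sound model outputs is all it takes to turn any MIDI drum sequence into a beatboxed one.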
This is the longest project I have ever worked on: it took 10 months from conception to the final musical product. There were many frustrating moments in trying to get the code to work, and I watched tons of videos on machine learning with audio. This project was definitely an exercise in tenacity and perseverance. But the AI beatboxer is actually pretty cool. It can generate super-human beatbox riffs because it isn't constrained by human physiology. I generated a bunch of different beatboxing sequences and used them to compose a new original song, Woah. All that hard work and frustration paid off. It's one of my favorite songs, combining vocoded a cappella vocals with the Robo-Ryan beatboxer. Check it out on my music page. The song was highlighted at the MA STEM Week launch event at MIT to show what a high school student can artistically produce with AI.