Audio Visualizer with the p5.js Library
Finally creating a spinning audio visualizer and opening myself up to ultrasonic capabilities.
Since beginning my journey into coding I have wanted to make a spinning audio visualizer. I had seen examples of this project for years, yet was always intimidated by the complexity. Our assignment this week was to use an existing code library and build something with it. Having seen p5.js examples before, this kind of visualizer included, I knew exactly what my direction would be.
I began by simply having audio play and pause. Following along with the fantastic Coding Train YouTube tutorials, this was a breeze. Then I began trying to get a visualizer to analyze the input level and map it accordingly. Although there were many headaches and bumps in the road, normal for any coding attempt I have made, the p5.js library made this part attainable. I started turning that data into a simple line before learning how to create a spinning visualizer. After some visual tweaking I had a working site. To the left you can see the entirety of my JavaScript functionality. One function I want to give special attention to is “touchStarted().” To prototype this visualizer I was using Google Chrome. Although Chrome’s safety and privacy features are fantastic, they often make simple coding experiments like this more complicated. Although I had earlier succeeded in creating a play/pause button, I originally left it out of the visualizer; I felt that such interaction was not needed for a simple proof of data visualization. Chrome, however, does not allow audio or video content to play without user input. The touchStarted() function provided a workaround for this problem.
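Since the full sketch lives in the repo linked at the end of this post, here is only a minimal sketch of the pieces described above: loading a sound file, reading its level with p5.Amplitude, drawing a rotating line whose length follows that level, and using touchStarted() to satisfy Chrome's user-gesture requirement. The file name, canvas size, and mapping values are placeholders rather than the project's actual code.

```javascript
// Minimal spinning amplitude visualizer (assumes a local file "song.wav")
let song;
let amp;
let angle = 0;

function preload() {
  // Load the audio before setup() runs (p5.sound)
  song = loadSound('song.wav');
}

function setup() {
  createCanvas(600, 600);
  background(0);
  // p5.Amplitude reports the current volume level as a value from 0.0 to 1.0
  amp = new p5.Amplitude();
}

function draw() {
  // Read the current level and map it to a line length
  let level = amp.getLevel();
  let len = map(level, 0, 1, 10, 250);

  // Each frame draws a line rotated a bit further around the center,
  // which accumulates into the spinning radial pattern
  translate(width / 2, height / 2);
  rotate(angle);
  stroke(255);
  line(0, 0, len, 0);

  angle += 0.05;
}

function touchStarted() {
  // Chrome blocks audio until a user gesture, so start playback here
  userStartAudio();
  if (song.isLoaded() && !song.isPlaying()) {
    song.play();
  }
}
```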
One of the benefits of this visualizer is its clean and simple look, one that to me creates art from art: taking the artistic aspects of music and creating visually interesting images that the musicians would never have imagined.


Looking ahead, and wanting to analyze microphone input for a future project, I started playing around with the microphone. My plan was to have an internal WAV file and the microphone input each controlling a different line based on its amplitude. Although I was getting both sets of data to appear in the visualizer, introducing the second variable shrank my circle into a tiny dot. I tweaked every function in my JavaScript before the shape grew to a reasonable size.
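For reference, here is a rough sketch of how the two-source version might be wired up, again with a placeholder file name and guessed mapping ranges. One plausible reason the shape collapsed is that a microphone reports much lower levels than a normalized WAV file, so this sketch maps each source over its own range.

```javascript
// Two amplitude sources driving two lines (assumes a local file "song.wav")
let song, mic;
let songAmp;
let angle = 0;

function preload() {
  song = loadSound('song.wav');
}

function setup() {
  createCanvas(600, 600);
  background(0);

  // Microphone input; the browser will ask for permission
  mic = new p5.AudioIn();
  mic.start();

  // Amplitude analyzer tied only to the song, so the mic doesn't affect it
  songAmp = new p5.Amplitude();
  songAmp.setInput(song);
}

function draw() {
  // The mic usually reports much lower levels than a normalized WAV,
  // so each source gets its own mapping range to keep the shape visible
  let songLen = map(songAmp.getLevel(), 0, 1, 10, 250);
  let micLen = map(mic.getLevel(), 0, 0.3, 10, 250);

  // Two lines drawn in opposite directions from the center, one per source
  translate(width / 2, height / 2);
  rotate(angle);
  stroke(255, 100, 100);
  line(0, 0, songLen, 0);
  stroke(100, 100, 255);
  line(0, 0, -micLen, 0);

  angle += 0.05;
}

function touchStarted() {
  userStartAudio(); // satisfy Chrome's user-gesture requirement
  if (song.isLoaded() && !song.isPlaying()) {
    song.play();
  }
}
```

Below are three videos documenting the struggles and successes of this project.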
Looking forward, I want to begin using p5's FFT functionality, in particular the getEnergy() function, to analyze the audio frequencies present. With that data I then want to move into using ultrasonic audio frequencies to create simple YES / NO signals.
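As a starting point for that idea, here is a speculative sketch of how p5's FFT could flag a YES or NO based on the energy in a narrow high-frequency band. The 19 kHz tone and the threshold of 100 are assumptions that would need tuning, and standard 44.1 kHz audio hardware only reaches near-ultrasonic frequencies just above 20 kHz.

```javascript
// Speculative near-ultrasonic YES/NO detector using p5.FFT
let mic, fft;

function setup() {
  createCanvas(400, 200);

  mic = new p5.AudioIn();
  mic.start();

  // FFT with default smoothing and 1024 bins, listening to the mic
  fft = new p5.FFT(0.8, 1024);
  fft.setInput(mic);
}

function draw() {
  background(0);

  // analyze() must run each frame before getEnergy() is read
  fft.analyze();

  // Energy (0-255) in a narrow band near the top of the audible range
  let energy = fft.getEnergy(18500, 19500);
  let detected = energy > 100; // hypothetical threshold for a YES

  fill(255);
  textSize(32);
  text(detected ? 'YES' : 'NO', width / 2 - 30, height / 2);
}

function touchStarted() {
  userStartAudio(); // resume the audio context on first interaction
}
```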
Here is a link to my GitHub repo: https://github.com/calebhammel/CityLineLego