This project explores the convergence of the auditory and visual realms through data analysis, visualization techniques, and sound synthesis. The focus lies in generating compelling audio compositions from time-series datasets to explore the rich interplay between data and auditory perception. We often consider data consumption only through a visual paradigm, but there is increasing evidence that presenting data through multiple mediums can improve the learning and understanding of complex topics. To explore this, I created my own data sonification and visualization application using Ableton Live, Max/MSP, and TouchDesigner, and applied it to datasets capturing information about birth rates and global warming. Completing this application taught me how this approach to communicating data could be applied to other projects in the future.
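At the heart of any sonification pipeline like this one is a parameter mapping from data values to sound. The sketch below is a minimal, hypothetical illustration of that idea in Python, not the project's actual Max/MSP patch: it linearly rescales a time series onto a MIDI pitch range, the kind of mapping a patch could perform before driving instruments in Ableton Live. The function name, note range, and example values are assumptions for illustration only.

```python
# Hypothetical sonification sketch: map a time series onto a MIDI pitch range,
# analogous to the parameter mapping a Max/MSP patch might apply before
# sending notes to Ableton Live.

def map_to_midi(values, low_note=48, high_note=84):
    """Linearly scale each data point into a MIDI note number in [low_note, high_note]."""
    lo, hi = min(values), max(values)
    span = hi - lo or 1  # avoid division by zero for a constant series
    return [round(low_note + (v - lo) / span * (high_note - low_note)) for v in values]

if __name__ == "__main__":
    # Illustrative values only (e.g., yearly temperature anomalies in degrees C)
    series = [0.12, 0.18, 0.26, 0.31, 0.45, 0.54, 0.68]
    print(map_to_midi(series))  # [48, 52, 57, 60, 69, 75, 84]
```

In practice, the resulting note numbers (or the raw scaled values) would be sent to the synthesis environment over MIDI or OSC, so the musical range and scale can be tuned independently of the underlying dataset.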