What Is the Correlation between Sound and Color in RAW Digital Data Formats?

Abstract

The ideas leading to my current body of work incorporate several concepts that I have been researching for a number of years. The work included in the gallery involves sound-converted video: individual video frames are converted to sound files and back through algorithmic code. This conversion process engages concepts such as abundance, collection, organization, reconfiguration, juxtaposition, and aesthetic elevation. In my installation work, video and sound are recorded and processed in real time within the space; the processed video is then projected back onto the space, and the resulting imagery is subject to change through viewer interaction and site specificity. Most recently, my research has led to the study of how data is interpreted through my artistic process, one in which RAW data formats are used in ways that confuse the software packages that edit them. More specifically, raw sound files are imported into graphic editing programs. When raw sound transitions into raw image through this process, the resulting imagery is rendered in a basic, 16-color palette. What dictates this allocation? What assigns these color values? How can this new data be represented in the gallery? In this paper, I describe my process in depth: RAW data formats for both sound and image, the variations and methods used in the data import process, and how color is translated from audio information.
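As a minimal sketch of one way such a translation could work, the snippet below interprets raw audio bytes as 4-bit indexed-color pixels: a 16-color palette means each pixel index fits in four bits, so every audio byte yields two pixels. The palette values and the nibble-to-pixel mapping are illustrative assumptions for this sketch, not the documented behavior of any particular graphics program.

```python
# Hypothetical 16-color palette (VGA-style RGB triples), assumed for illustration.
PALETTE = [
    (0, 0, 0), (128, 0, 0), (0, 128, 0), (128, 128, 0),
    (0, 0, 128), (128, 0, 128), (0, 128, 128), (192, 192, 192),
    (128, 128, 128), (255, 0, 0), (0, 255, 0), (255, 255, 0),
    (0, 0, 255), (255, 0, 255), (0, 255, 255), (255, 255, 255),
]

def audio_bytes_to_pixels(raw: bytes) -> list[tuple[int, int, int]]:
    """Map each raw audio byte to two palette-indexed pixels.

    The high nibble (bits 4-7) selects the first pixel's color,
    the low nibble (bits 0-3) the second's.
    """
    pixels = []
    for b in raw:
        pixels.append(PALETTE[b >> 4])    # high nibble -> first pixel
        pixels.append(PALETTE[b & 0x0F])  # low nibble -> second pixel
    return pixels

# Two raw audio bytes become four colored pixels.
print(audio_bytes_to_pixels(bytes([0x0F, 0xA3])))
```

Under this assumption, the color assignment the abstract asks about is a side effect of bit depth: the importing program simply reuses the audio bytes as palette lookups, so loudness and waveform shape in the sound data directly dictate which of the sixteen colors appears.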

Presenters

Nick LeJeune
Assistant Professor of Interactive Media and Game Design, Communication and Humanities, SUNY Polytechnic Institute, New York, United States

Details

Presentation Type

Creative Practice Showcase

Theme

New Media, Technology and the Arts

KEYWORDS

Raw Data, Data Visualization, Conversion, Collection, Reconfiguration, Interactive, Immersive, Installation