Sihwa Park


InstaSynth 

Listen to Your Colors
Date: May 2017
Categories: Audiovisual Performance, Data Art, Visualization, Sonification

InstaSynth is an image-based sonification project that transforms personal images on Instagram into a unique audiovisual piece. After gathering the 20 most recent photos from my Instagram account via the Instagram API, InstaSynth displays each image in sequence, extracting its 12 dominant colors and pixelating the image based on them. The pixelated image is fragmented and dropped into a rotating, transparent hemispheric structure in the background, composed of 20 vertical bars. Each bar has 12 sections filled with the dominant colors of one image, so InstaSynth builds a hemispheric color palette of the 20 images in three-dimensional space. Each colored bar produces a sound through additive synthesis: the hue values of the bar's 12 colors are mapped onto the frequencies of 12 sine oscillators, and each oscillator's amplitude is determined by the corresponding color's brightness and its dominance within the image. The resulting sound is spatialized according to the bar's position, and the bars are triggered in sequence at a set tempo (BPM). In this way, the hemispheric color palette becomes a color-based sound-synthesis instrument. The project aims to give people a novel artistic experience by deriving abstract visuals and sound from real images.
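The color-to-sound mapping can be sketched in a few lines. This is a minimal illustration, not the actual InstaSynth implementation: the frequency range, sample rate, and note duration are assumptions chosen for the example. Hue is mapped linearly onto an oscillator frequency, and amplitude is the product of the color's brightness (HSV value) and its dominance share; one bar's 12 oscillators are then summed additively.

```python
import colorsys
import numpy as np

SR = 44100  # sample rate in Hz (assumption for this sketch)

def color_to_oscillator(rgb, dominance, f_lo=110.0, f_hi=1760.0):
    """Map one dominant color to a sine oscillator's (frequency, amplitude).

    Hue (0..1) is mapped linearly onto [f_lo, f_hi]; amplitude is the
    color's brightness (HSV value) times its dominance share. The frequency
    range here is illustrative, not the range used in InstaSynth.
    """
    r, g, b = (c / 255.0 for c in rgb)
    h, _, v = colorsys.rgb_to_hsv(r, g, b)
    freq = f_lo + h * (f_hi - f_lo)
    amp = v * dominance
    return freq, amp

def bar_to_sound(colors, dominances, dur=0.5):
    """Additively synthesize one bar's sound from its 12 dominant colors."""
    t = np.linspace(0.0, dur, int(SR * dur), endpoint=False)
    out = np.zeros_like(t)
    for rgb, dom in zip(colors, dominances):
        freq, amp = color_to_oscillator(rgb, dom)
        out += amp * np.sin(2.0 * np.pi * freq * t)
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out  # normalize to avoid clipping
```

Spatialization and BPM-driven triggering would sit on top of this: each rendered buffer is panned by its bar's position in the hemisphere and scheduled on the beat grid.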


AlloSphere Demonstration at the MAT 2017 End of Year Show: Re-habituation

This piece was made and presented for the AlloSphere at UC Santa Barbara, a three-story facility for representing large and complex data through immersive visualization, sonification, and multimodal interaction.