Mobile devices have been a promising platform for musical performance thanks to the various sensors readily available on board. In particular, mobile cameras can provide rich input, as they capture a wide variety of user gestures and environment dynamics. However, raw camera input yields only continuous parameters and requires expensive computation. With Phone with the Flow (PwF), we propose combining camera-based motion/gesture input with touch input in order to filter movement information both temporally and spatially, thus increasing expressiveness while reducing computation time.
PwF is currently implemented as an Android 9.0 app. You can download the APK file here.
How to
Every touch input activates a region of interest (ROI) in the camera image, and the movements captured in the ROIs are then sonified. Sound synthesis can either run directly on the mobile device, with restrictions on synthesis complexity due to its limited computing capabilities, or the motion features can be sent to external musical software via Open Sound Control (OSC) messages. For further information about the mappings used in the app, you can download the Pure Data patch and control it by enabling the OSC output of PwF.
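The pipeline above can be sketched in a few lines: a touch defines an ROI, a simple motion feature (here, mean absolute frame difference inside the ROI) summarizes the captured movement, and the result is packed into an OSC message for external software. This is a minimal illustration, not the app's actual implementation; the OSC address `/pwf/roi/0/energy` and the frame-differencing feature are assumptions made for the example, and the OSC encoding follows the OSC 1.0 message format (null-padded strings, a `,f` type tag, and a big-endian float32).

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float32 argument.

    Per the OSC 1.0 spec, strings are null-terminated and padded to a
    4-byte boundary, and floats are big-endian IEEE 754.
    """
    def pad(s: bytes) -> bytes:
        return s + b"\x00" * (4 - len(s) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def roi_motion_energy(prev, curr, x, y, w, h):
    """Mean absolute grayscale difference inside a touch-defined ROI.

    `prev` and `curr` are row-major lists of pixel rows; (x, y, w, h)
    is the region of interest activated by a touch point.
    """
    total = 0
    for row in range(y, y + h):
        for col in range(x, x + w):
            total += abs(curr[row][col] - prev[row][col])
    return total / (w * h)

# Two tiny 4x4 "frames" with motion only in the top-left 2x2 region.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[0][0] = curr[0][1] = curr[1][0] = curr[1][1] = 100

energy = roi_motion_energy(prev, curr, 0, 0, 2, 2)  # motion feature for this ROI
msg = osc_message("/pwf/roi/0/energy", energy)      # bytes ready to send over UDP
```

In practice the message bytes would be sent over UDP to the machine running the Pure Data patch, which maps the incoming feature onto synthesis parameters.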