for Sound Voltex controller and MAX/MSP
VVD21 is a musical project between arcade game controller and electronic music: a MAX/MSP patch created for a Sound Voltex controller (model FAUCETWO). Sound Voltex is a rhythm game built around virtuosic play on complicated charts. Rhythm games have always been part of my life; I once reached the top ten players across Asia in the game 'BeatStream', as the top player of Taiwan. In the arcade games I have experienced, I found musical potential in the diversity of their simple input sources: buttons and touch screens relate naturally to instrumental practice and composition. By connecting the controller to MAX/MSP, the regular buttons trigger sound inputs while the effect buttons and knobs vary live-processing effects. The vision is the arcade's input system becoming an instrument that incorporates improvisation, freeing gameplay into musical output rather than merely entering designed information.
Input sources: button and touch screen
Input begins with data from human interaction and motion. In the design process of a button controller, the first sketches concern button sizes, quantities, and positions, including variations for hand gestures. The process resembles building an instrument: asking 'how are human mechanics configured, and how do we play with them?' Touch screens follow a similar design approach but with more freedom of motion: they open up the sensing area so that essentially the whole screen can be used, and most importantly, we draw onto them. Whether with arrows, waves, lines, shapes, or even free forms, touch screens bring visual content into the act of playing an instrument.
Charting systems - composition
The charting systems of rhythm games are also their compositional methods: they notate inputs per individual button, or notate motional gestures that ask the player to use certain techniques or make specific directional movements with the hands. So the first question is really "how do we notate for these instruments?" Before going further, consider this idea: what if we brought the concept into today's music, notating inputs rather than pitches and notes? I explored one such examination in my project "MA", notating sound inputs through techniques and motions, but I will leave the details to MA's own description.
Returning to rhythm games' charting systems: whether with buttons or a touch screen, the main element of inputting a signal is what and where to play/hit on the instrument, notating the notes assigned to each button. Imagine a musical staff, but instead of pitches the notation shows hits/inputs, with graphics indicating the buttons or knobs being played (pic 1). The chart is thus based on a specific controller, with designated techniques considered from its range and the specific motions a human can make, no different from a musical instrument.
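The idea of a chart that notates inputs instead of pitches can be sketched as a simple data structure. This is a hypothetical illustration, not the actual chart format of any rhythm game; all field names and control labels are placeholders chosen to match the controller described below.

```python
# Hypothetical sketch: a chart notates *inputs* (which control, when, how)
# rather than pitches. Names and fields are illustrative only.

from dataclasses import dataclass

@dataclass
class ChartEvent:
    beat: float         # when to play, in beats
    control: str        # which input: "A".."D", "FX-L", "FX-R", "L-knob", "R-knob"
    action: str         # "hit" for buttons, "turn" for knobs
    value: float = 0.0  # e.g. knob direction/amount; unused for hits

# A short excerpt: button hits plus a knob turn, read like a staff of inputs.
chart = [
    ChartEvent(0.0, "A", "hit"),
    ChartEvent(0.5, "B", "hit"),
    ChartEvent(1.0, "R-knob", "turn", +0.25),
    ChartEvent(1.5, "D", "hit"),
]

# Reading the chart in time order is reading the piece.
hits = [e.control for e in chart if e.action == "hit"]
print(hits)  # ['A', 'B', 'D']
```

The point of the sketch is that the score addresses controls and motions directly, so the same notation carries both the composition and the playing technique.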
In VVD21 I use the SDVX controller (the controller that the chart in pic 1 is written for), which has four main buttons, two effect buttons, a start button, two knobs, and an extra button. These serve as MIDI inputs sending signals into MAX/MSP, driving sound samples and live-processing effects, treating the controller as an instrument.
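The controller-to-patch layer can be sketched as a small MIDI decoder. This is a minimal sketch under assumptions: I am assuming buttons arrive as note-on messages and knobs as control-change messages, and the specific note/CC numbers below are placeholders, not the patch's real assignments.

```python
# Hypothetical MIDI mapping for the controller. Note/CC numbers are
# placeholders; the real patch's assignments may differ.

BUTTON_NOTES = {36: "A", 37: "B", 38: "C", 39: "D",
                40: "FX-L", 41: "FX-R", 42: "START", 43: "EXTRA"}
KNOB_CCS = {1: "L-knob", 2: "R-knob"}

def decode(msg):
    """Translate a raw (status, data1, data2) MIDI tuple into a control event."""
    status, data1, data2 = msg
    if status == 0x90 and data1 in BUTTON_NOTES:   # note-on -> button hit
        return ("hit", BUTTON_NOTES[data1], data2 / 127)
    if status == 0xB0 and data1 in KNOB_CCS:       # control change -> knob position
        return ("turn", KNOB_CCS[data1], data2 / 127)
    return None                                    # ignore anything else

print(decode((0x90, 36, 127)))  # ('hit', 'A', 1.0)
print(decode((0xB0, 2, 127)))   # ('turn', 'R-knob', 1.0)
```

In MAX/MSP this role is played by objects like `notein` and `ctlin`; the sketch only shows the shape of the translation from raw messages to named controls.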
Variation of charting systems
Rhythm games offer a great variety of charting methods. A judgement line indicates where and when the player should hit the notes as they meet the line. Fundamentally, from the viewer's perspective, notes travel from top to bottom or bottom to top, or left and right in either direction (pic 2); moreover, the judgement system can take geometric shapes (pic 3), such as a circle that notes approach from outside in or the reverse, or it can be placed wherever the visual design desires.
VVD21 MAX/MSP patch
As a percussive instrument, the four buttons (A-D) control the samples being played: the R-knob filters through folders, changing which samples the A-D buttons will play, and the L-knob sets the playback speed of each sample.
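The sample-selection logic described above can be sketched as follows. The folder names, sample files, and the speed range are all hypothetical stand-ins; only the routing idea (R-knob scrolls folders, L-knob scales speed) comes from the patch.

```python
# Hypothetical sketch of the percussive stage: R-knob selects the folder
# that re-assigns what A-D play; L-knob scales playback speed.
# Folder and file names are placeholders.

folders = {
    "perc":  {"A": "kick.wav", "B": "snare.wav", "C": "hat.wav",  "D": "clap.wav"},
    "metal": {"A": "gong.wav", "B": "bell.wav",  "C": "ride.wav", "D": "crash.wav"},
}
folder_names = list(folders)

def select_folder(r_knob):
    """Map a 0.0..1.0 knob position onto the folder list."""
    idx = min(int(r_knob * len(folder_names)), len(folder_names) - 1)
    return folder_names[idx]

def play(button, r_knob, l_knob):
    """Return (sample, speed) for a button hit, as the patch would trigger it."""
    sample = folders[select_folder(r_knob)][button]
    speed = 0.25 + l_knob * 1.75   # assumed speed range: 0.25x .. 2.0x
    return sample, speed

print(play("A", 0.0, 0.5))  # ('kick.wav', 1.125)
```

Turning the R-knob past the halfway point here would switch every button to the "metal" folder, mirroring how a single knob gesture re-voices the whole instrument.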
Processing: Fx-L decides whether the samples pass through spectral filters; if not, the samples stay unchanged and move to the next phase. Fx-R controls which speakers play the sounds; I have set up three speakers: a stereo pair (left and right) and a center.
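The two routing decisions can be sketched together. This is a hypothetical simplification: the spectral filter is only a labeled stage here (in the patch it is a MAX/MSP spectral process), and I am assuming Fx-R simply switches between the stereo pair and the center speaker.

```python
# Hypothetical sketch of the processing stage: Fx-L toggles the spectral-
# filter path, Fx-R chooses the output speakers. The filter is stubbed.

SPEAKERS = ["left", "right", "center"]

def route(sample, fx_l_held, fx_r_held):
    """Return (processing chain, output speakers) for one sample."""
    chain = ["spectral-filter", sample] if fx_l_held else [sample]
    # Assumed behavior: Fx-R sends to center, otherwise to the stereo pair.
    outs = ["center"] if fx_r_held else ["left", "right"]
    return chain, outs

print(route("kick.wav", True, False))
# (['spectral-filter', 'kick.wav'], ['left', 'right'])
```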
Recording playback: this is a live-recording function played only on the stereo speakers. It records unprocessed samples in real time, which are then processed through the right and left knobs.
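The recording stage above can be sketched as a simple capture-then-replay buffer. This is an assumption-laden illustration: I use the L-knob as a stand-in playback-speed control and a 0.5x-1.5x range, neither of which is confirmed by the patch description.

```python
# Hypothetical sketch of the live-recording stage: raw hits are captured
# in real time, then replayed on the stereo pair, reshaped by a knob.

recording = []

def record(sample):
    """Capture an unprocessed sample into the live buffer."""
    recording.append(sample)

def playback(l_knob):
    """Replay the buffer on the stereo pair; assumed 0.5x..1.5x speed range."""
    speed = 0.5 + l_knob
    return [(s, speed, ("left", "right")) for s in recording]

record("kick.wav")
record("snare.wav")
print(playback(0.5))
# [('kick.wav', 1.0, ('left', 'right')), ('snare.wav', 1.0, ('left', 'right'))]
```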
In conclusion, there are many possibilities for further development in musical instrument design, whether in electronic or acoustic fields. Most importantly in this project, the 'customized' musical approach produced outcomes specific to independent instruments. Compositionally, custom charts better express the mechanism of an instrument and better inform its techniques, thus developing further practice. Performance-wise, customized instruments create specific limitations of hand gesture, range, and manner of playing, and such limitations lead to instrument-specific sound gestures. Furthermore, carrying these concepts into today's technological practice, there are many more sources to explore and research: taking our life surroundings as instruments, creating sounds from ordinary life.