Forum – Week 12 – Improvisation Session #2:
Stephen mentioned that things should get a little wild... too far perhaps? Well, doesn’t one week make a world of difference? It often seems to work out this way: whenever I have an adverse reaction to part of the course program, my negativity is appeased by a more satisfying result from a subsequent session. Perhaps it was the unveiling of Jake’s highly evolved creation that added a new dynamic, but it is as if some of the awkwardness of last week’s session has been resolved, and good things are happening as a result.
Despite the relative success of the first two rounds of group improv, I felt it would be beneficial to add some conventional timbre to complement the piano and hacked synthesiser, which were harmonising together nicely. So it was that the electric guitar introduced itself to the soundscape, and in spite of cold hands and rusty technique, it felt and sounded good to play. I didn’t mind the ‘tinny’ quality of the diminutive Daphon amplifier, as it discouraged me from playing fast, thereby providing a more complementary backdrop to the mix.
Also mentioned today was the relentlessly approaching gig at Jive, in which we are all performing. I think it should come off okay after today, and I intend to bring a couple of extra toys to add more timbral variety to our presentation. Until then, looking forward to no work and end-of-year drinks…
Reference:
Whittington, Stephen. “Forum – Week 11 – Improvisation 2.” Workshop presented at EMU space, Level 5, Schultz Building, University of Adelaide, 25th of October, 2007.
Harris, David. “Forum – Week 11 – Improvisation 2.” Workshop presented at EMU space, Level 5, Schultz Building, University of Adelaide, 25th of October, 2007.
Audio Arts – Week 12 – Game Audio Design (4) - Voice Acting:
The AKG is my mic of choice for... everything... To give all an exhibition of my personal voice-acting skills, I have created a comprehensive list of assets, which mirror the ‘Game Masters’ voice information prompts in the menu screen and during game play. The purpose of these non-character voice-overs is to provide statistical game information (some of this is for multiplayer modes, describing team position information, but I will not be creating assets for that side of game play), to reinforce onscreen information vocally, and to offer general in-game encouragement to the human player.
Beware the gauntlet... whoooooo! The existing assets for this group in the Open Arena files were quite dry and in the normal pitch register of human speech. The actual reproduction of these files during the game is quite different, with some obvious pitch shifting and reverberation applied on playback. With this in mind, I tried to be restrained in my application of these two effects, and failed. That being said, the ‘raw’ assets are not so over the top that they won’t fit into the final product with a little tweaking, so I have kept all relevant Cubase and Pro Tools files for this purpose.
The original files were processed in Pro Tools, initially with a little help from the Eventide harmoniser. I have gone for a semi-synthesised quality for the voice-over feature, as it is an ‘unseen’ feature of the game and could plausibly be the voice of a machine. Some further processing was carried out in Plogue to enhance the obscure qualities of the vocal takes, mainly via the KT Granulator and a stereo flanger.
I have no reservations about choosing myself as the “voice-over guy”, as people have mentioned in the past that my spoken voice is quite versatile and interesting. All the same, it was once again an educational experience actually trying to record my voice in the studio, and the asset list is becoming insane…
Click here to link to this week’s MP3s and my updated assets register.
Reference:
Haines, Christian. “Audio Arts – Week 12 – Game Audio Design (4) - Voice Acting.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 23rd of October, 2007.
Creative Computing – Week 11 - Processing – Spatialisation:
Panifried 'Vista' coming soon..
With the onset of a brutal spring cold threatening to bring me down, I have kept things simple this week. Rather than drawing a template from the convoluted quad-channel patch examples, I selected an auto-panning idea that keeps things in the traditional stereo domain.
While this little effect box may appear small, I am leaning toward including multiples of the unit in the project for auto-panning specific frequency ranges within an FFT process. It sounds ambitious, and will probably burn a few Macs along the way, but I think it’s an interesting idea and worth pursuing.
I have included possibly my least sophisticated example track yet, but it gives a straightforward idea of how the effect works. It is basically a wave-cycle-controlled left-to-right-to-left panning process, manipulated in this case by changing the frequency of a simple sine wave. The tremolo effect that can be achieved in simple setups like this is worth a listen. Future experimentation will involve feeding more complex audio signals into the pan control input.
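For the record, the behaviour of the patch can be sketched outside MSP. The following Python fragment is only a rough approximation (the real thing runs as a signal network in Max), assuming an equal-power pan law and a 44.1 kHz sample rate:

```python
import math

def autopan(samples, lfo_freq, sr=44100):
    """Equal-power auto-panner: a sine-wave LFO sweeps a mono
    signal back and forth between the left and right channels."""
    left, right = [], []
    for n, x in enumerate(samples):
        # LFO mapped to 0..1 gives the pan position for this sample
        pos = 0.5 * (1.0 + math.sin(2 * math.pi * lfo_freq * n / sr))
        # Equal-power law keeps perceived loudness roughly constant
        angle = pos * math.pi / 2
        left.append(x * math.cos(angle))
        right.append(x * math.sin(angle))
    return left, right

# One second of a 440 Hz tone panned at 2 Hz; pushing lfo_freq up
# toward audio rate produces the tremolo-like effect mentioned above.
tone = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
L, R = autopan(tone, 2.0)
```

Feeding a complex signal into the pan position, rather than a bare sine, is exactly the kind of experiment flagged above.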
Click here to link to online folder containing MSP patches and an audio example.
Reference:
Haines, Christian. “Creative Computing – Week 11 - Processing – Spatialisation.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 18th of October, 2007.
Forum – Week 11 – “Improvisation 1”:
Grr! Noise. Is it an underrated sonic phenomenon with multiple unexplored musical applications? Yes. Did we lift the lid on any of these hidden miracles today? In a word: no. I’m unsurprised at my perceived lack of interesting musical outcome from today’s session, largely due to the time frames involved. Giving people a couple of months to build an entirely new instrument, with virtually no time to examine its performance possibilities and develop some kind of consistent technique, and then expecting ten to fifteen people to ‘improvise’ musically with these things ad lib, is a tall order. I don’t consider myself an ignoramus, and believe my mind is wide open, but having been involved in many sessions of similar construction up to this point, I’m starting to find the concept tedious. I just haven’t experienced anything worthwhile during exercises of this type.
Perhaps if we had slowly worked on the instruments over the whole year and put together a pre-determined composition with them, based on the results of personal and group experimentation, then something worth listening to might have been produced. In this situation, however, it’s just the same old story, be it today’s direction sheets dictating vagaries, or Harris’s from “Compossible”, or anything that leaves too much open to interpretation. Part of the problem I have with this approach is that it has always been presented to me in a large-ensemble situation. There is a reason why orchestras and big bands have conductors and rigid scores; when big groups get together and improvise individually, with only the vaguest suggestion of approach and no predetermined ‘style’ to base their playing on, it rarely produces more than mediocre results from a musical perspective.
There was a moment of potential interest when Stephen suggested that all who were using sustained tones should aim to create beating frequencies with each other, while the percussive instruments manufactured some sort of polyrhythmic effect out of the result. This did reach a climactic point in which everyone involved harmonised nicely, but it was short-lived.
Perhaps I’m judging too harshly at this early stage, but there is only one week to go and I’ll be very surprised if the results improve in that time...
Reference:
Whittington, Stephen. “Forum – Week 11 – Improvisation 1.” Workshop presented at EMU space, Level 5, Schultz Building, University of Adelaide, 18th of October, 2007.
Forum – Week 10 – Instrument Presentation:
Perhaps this one's past the Beta test..
This week proved mildly frustrating, as some of my instrument’s high-level features, such as the Victorian synthificated piezonified reverb in a can, decided to confuse me with their poorly planned connectivity system. As a result, it was not until the group had left and I was hanging around in the studio by myself that the awe-inspiring power of my creation was unleashed. Next week you shall not all be so lucky.
Some of the efforts on display were impressive, especially (I almost can’t bear to say it) Luke Digance’s, which was not only an aesthetic triumph but also a diverse and interesting take on the bent circuit genre. I must investigate his innovative gain boost feature.
Ben Probert also came to the party prepared, with his intimidating (and I suspect very time-consuming) butchery of an entire Yamaha keyboard. An ironic and humorous touch was the ‘Stereo Victorian Synth’, which is controlled by what used to be a conventional synthesiser; very clever.
Douglas Loudon had also gone to considerable trouble with his, which looked imposing and sounded cool, although the ring modulation effect would have been nice to hear. Hopefully this week all will be functioning as planned (my own device included). I’m not entirely sure what the next two weeks will hold, but I’m looking forward to Thursday…
Reference:
Whittington, Stephen. “Forum – Week 10 – Instrument Presentation.” Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 11th of October, 2007.
Harris, David. “Forum – Week 10 – Instrument Presentation.” Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 11th of October, 2007.
Audio Arts – Week 11 - Game Audio Design (3) – Soundtracks:
A notated result of an ironically rare opportunity to write music during a music degree… The dark and ominous soundtrack I have created this week combines a haunting string melody, a sinister frequency-modulated bass synthesiser bubbling out a demented pedal tone, a light and understated conga riff, and muddy but booming kick-drum stabs. The string sound is courtesy of Reason Adapted, as is the conga loop (the melody is an original of my own creation), and the bass synth is a pad variation of the resonant bass preset in the A-1 synth.
With the cavernous, echoing death-match chamber of Kaos from Open Arena in mind, I wanted a sonic backdrop that would create appropriate discomfort, complementing the environment. The resulting two-minute track achieves this through sparse instrumentation, with each of the four parts contributing a key element. The string melody provides the haunting quality, adding a touch of sophistication to a musical style that often omits melody, or at least buries it deep within the mix. The congas provide a quiet, non-intrusive rhythmic backdrop that should suit the moods of running, walking and sneaking around dark corridors. This is enhanced by the bass synth line, whose modulation values are pumped through an oscillator in 16th notes at randomly varying velocities, altering its timbre in the same time structure. The final waveform of doom is a fat electronic kick, which pulsates every two beats and should simulate the player’s own heartbeat to some extent.
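The 16th-note randomisation is easy to picture as a step sequence. This is only a toy sketch of the idea in Python; the actual part lives inside Reason Adapted, and the tempo, velocity range and seed here are invented for illustration:

```python
import random

def sixteenth_velocities(bars, bpm=120, lo=40, hi=110, seed=7):
    """One (time, velocity) event per 16th note, with the velocity
    drawn at random so the synth's timbre shifts on every step."""
    rng = random.Random(seed)           # seeded for repeatability
    step = 60.0 / bpm / 4               # length of a 16th note in seconds
    return [(round(i * step, 6), rng.randint(lo, hi))
            for i in range(bars * 16)]  # 16 sixteenths per 4/4 bar

events = sixteenth_velocities(bars=2)
```

Mapping each velocity onto a filter or oscillator parameter, rather than loudness alone, is what produces the per-step timbral variation described above.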
Like Muse said, plug in baby… The kick has been muddied by excessive reverb, which allows it to blend into the mix and reduces its dominance. Reverb was the order of the day, as I wanted to complement the environmental qualities of the large stone-walled chamber. Overall, the track loops back to the start cleanly, and should sit well at a modest or relatively low volume in the mix, providing gameplay-enhancing music without detracting from realism.
Click here to link to online folder containing an MP3 of the complete masterwork.
Reference:
Haines, Christian. “Audio Arts – Week 11 - Game Audio Design (3) – Soundtracks.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 16th of October, 2007.
Creative Computing – Week 10 - Processing – Delay:
Focusing on the flange effect for the most part this week, I have created a plug-in-like effect rack that combines delay and flanging. With all parameters set up for MIDI control, it is a versatile effect unit that can respond musically to physical gestures.
My main concern at this stage is the presence of audio clicks while manipulating certain parameters, which was also the case with last week’s attempt at building a phase vocoder. Ideally, I plan to have this and other audio effects embedded in a switchable bpatcher structure, allowing instant access to any single effect at a given time. This would sit in the path of the final audio output stage of my planned virtual instrument, and would avoid much of the screen clutter that can occur with multiple plug-ins. Ultimately it’s one window, showing one effect at a time, with a small set of buttons and key shortcuts to switch between them. Now, for the pressing matter of implementation…
Click here to link to this week’s MP3 example and MSP patch files.
Reference:
Haines, Christian. “Creative Computing – Week 10 - Processing – Delay.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 10th of October, 2007.
Audio Arts – Week 10 - Game Audio Design (1) - Ambience:
The Eventide harmoniser, for all audio solutions... Creating ambience was an interesting task; not really my forte, but the ideas were forthcoming nevertheless. My approach involved starting with vocal weirdness and building things up from there. There is a lot to be said for taking advantage of the dead room and decent microphones, but even more to be said for the Eventide harmoniser. It seems as if the Eventide was designed for this very purpose, and it excels when applied to the enhancement and distortion of original source material.
I can make my voice go quite low anyway, but having the Eventide on hand to drop things a couple of semitones (not too many, as I suspect the GAE will do the same thing again) while retaining good quality (a rare triumph in Pro Tools) makes creating a rich and powerful sonic landscape out of just one vocal track quite painless. The only trouble I faced was with bouncing. I created many vocal narration samples to maximise my studio time, and Pro Tools has a most annoying habit when bouncing with complex TDM plug-ins switched on: it likes to jerk and peak at the start of the sound file, so one must give it half a second of ‘run-up’ to get a clean bounce (where would we be without Cubase?). This equated to a long, drawn-out session consisting of little but individual bounces.
I have gone for two contrasting sonic backdrops, namely one that favours higher frequencies and whispers, and another that favours low rumbles and growling. Both turned out well, so hopefully they will integrate into the game environment seamlessly.
Click here to link to online folder containing MP3s and this week’s asset log upgrade.
Reference:
Haines, Christian. “Audio Arts – Week 10 - Game Audio Design (1) - Ambience.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 9th of October, 2007.
Audio Arts - Week 9 - Game Audio Design (1) – Assets:
So many memories.. The assets I have created are for the following game objects:
- Menu noises
- Shotgun blast
- Frying player in Lava
Menu noises: My approach was to go back to Bidule, as Christian suggested, as the fastest option for creative sound design. I wanted the bleeping sound that indicates mouse clicks in the Open Arena menu screen to be unique, so I sourced the raw audio from a Bidule additive synth and racked up a line of audio processing effects routed through an audio matrix for easy accessibility.
The new bleep is essentially a single harmonised tone from the synth, run through a KT Granulator and stereo flanger, with some tweaking of reverberation, feedback and delay to achieve the desired subtlety.
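A flanger of the kind used here is, at heart, a short delay line whose length is swept by an LFO, with some feedback. A minimal mono sketch in Python follows; the parameter values are assumed for illustration, and this is not the actual plug-in from the session:

```python
import math

def flanger(x, sr=44100, depth_ms=2.0, rate_hz=0.5, feedback=0.4, mix=0.5):
    """Mono flanger: a sine LFO sweeps a few-millisecond delay line,
    and the delayed signal is fed back and mixed with the dry input."""
    max_delay = int(sr * depth_ms / 1000) + 2    # circular buffer size
    buf = [0.0] * max_delay
    out = []
    for n, s in enumerate(x):
        # LFO sweeps the delay between one sample and roughly depth_ms
        d = 1 + 0.5 * (1 + math.sin(2 * math.pi * rate_hz * n / sr)) * (max_delay - 3)
        i, frac = int(d), d - int(d)
        # Linear interpolation between the two nearest delayed samples
        delayed = (buf[(n - i) % max_delay] * (1 - frac)
                   + buf[(n - i - 1) % max_delay] * frac)
        buf[n % max_delay] = s + feedback * delayed   # write with feedback
        out.append((1 - mix) * s + mix * delayed)
    return out

# Smoke test on a short 220 Hz tone at an (assumed) 8 kHz sample rate
tone = [math.sin(2 * math.pi * 220 * n / 8000) for n in range(800)]
wet = flanger(tone, sr=8000)
```

Keeping the feedback well below 1, as here, is what keeps the comb-filter sweep subtle rather than screaming.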
Shotgun blast: This was relatively easy, as I found some reasonable audio to work with on soundsnap.com. I combined a couple of explosive sound files with a diving low tone generated in Bidule, and added a recording of a mechanical lock in use for the shell-ejecting clicks. Reverb was kept to a minimum, as the original game file had little; most probably this will be a variable controlled by the GAE.
Frying player in Lava: This sounded straightforward until I realised the difficulty of creating a non-rhythmic short loop that could potentially repeat several times in a row during game play. I sourced some fire and frying sounds online once again, but as soon as a file introduces a peak of even minuscule difference from the rest of the file, it is immediately noticeable as an unrealistic artefact. Sourcing some altered white noise from Bidule and playing with the mix helped a lot, but it’s far from satisfactory. The main problem is that the original file cannot be altered much in the time domain, and it was very short, due to the ‘jump in, jump out’ scenario that occurs when a player falls in the lava. Oh well, I guess people will want to stay out of there during game play anyway.
Click here to link to online folder containing Mp3 examples and my current assets register.
Reference:
Haines, Christian. “Audio Arts - Week 9 - Game Audio Design (1) – Assets.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 2nd of October, 2007.
Creative Computing - Week 9 - Processing – PFFT:
The main interface of my phase vocoder object.
A long day of successful experimentation followed by a failed master patch sums up my creative computing task. On an individual basis, I have had some interesting experiments with the various tutorial examples of MSP’s FFT capabilities, in particular the Crossover and Phase Vocoder patches. After testing variations of these with many samples and configurations, with varying degrees of success, it was time to integrate some of them into my main granulator patch.
Under the hood of the bpatcher..
At first I tidied up the layout and functionality of the patch’s interface, which was quite messy, then set about calculating the gain structure for the additional audio objects. As far as I can tell, there is nothing wrong with the layout of my patch programming; it just… uses more than 100 percent of the computer’s CPU. I suspect this may be the result of using too many signal-scope-type objects for visual feedback, but I’m not entirely sure at this stage.
All I can say is that it has taken far too much time, and I have had to record some independent phase vocoding, without the granulator, for your sonic scrutiny this week. The result is still worth a listen, and at least displays one of the many interfaces I have created for the exercise in action.
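For readers unfamiliar with the technique, the essential phase-vocoder trick is that the phase difference of an FFT bin between two overlapping analysis frames refines that bin's coarse frequency estimate. Here is a naive Python illustration of just that principle (a single-bin DFT, with arbitrary sample rate and frame sizes); my actual patch uses MSP's pfft~ objects, not this:

```python
import cmath
import math

def dft_bin(frame, k):
    """Naive single-bin DFT: fine for a sketch, far too slow for real use."""
    N = len(frame)
    return sum(x * cmath.exp(-2j * math.pi * k * n / N)
               for n, x in enumerate(frame))

sr, N, hop = 8000, 256, 64
true_freq = 350.0                 # deliberately between bins (bin width 31.25 Hz)
sig = [math.sin(2 * math.pi * true_freq * n / sr) for n in range(N + hop)]

k = round(true_freq * N / sr)     # nearest analysis bin
p0 = cmath.phase(dft_bin(sig[:N], k))
p1 = cmath.phase(dft_bin(sig[hop:hop + N], k))

# Phase advance bin k would show if its frequency were exactly k*sr/N;
# the wrapped deviation from it reveals the sinusoid's true frequency.
expected = 2 * math.pi * k * hop / N
dev = (p1 - p0 - expected + math.pi) % (2 * math.pi) - math.pi
est_freq = (k + dev * N / (2 * math.pi * hop)) * sr / N
```

Scaling these per-bin frequency estimates before resynthesis is what gives a phase vocoder its time- and pitch-manipulation powers.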
Click here to link to online folder containing MP3 and Max files.
Reference:
Haines, Christian. “Creative Computing - Week 9 - Processing – PFFT.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 4th of October, 2007.
Forum – Week 9 – ‘The Bent Leather Band’:
Stuart and Joanne of Melbourne’s ‘Bent Leather Band’ put on a captivating and informative workshop this afternoon. It was certainly interesting to witness the sophisticated heights that home-made electronic instrument making and playing can reach.
Of particular appeal to me was the focus on playability and the evolution of technique. I gave considerable thought to this area during the recent construction of my own instrument, and believe it would have benefited greatly from more time (not to mention money) devoted to it. Watching Stuart engage the light harp with often highly animated, but always controlled, gestures evoked images of Clara Rockmore pulling a complex violin sonata out of a theremin. Of course, this music was very different, but the BLB, like so many new-music exponents, are driven by the pursuit of new sounds and musical concepts, not by a desire to recreate that which has come before.
Perhaps the quality of sound (which was very good) was a key factor, but I found this performance substantially more engaging than many of my past new-music experiences. The fact that the BLB are interested in the visual impact of their performance and stage setting, as well as the sonic result, certainly works in their favour too. The creature-like instruments of meticulously crafted leatherwork and electronics are awe-inspiring to say the least. The notable absence of an ‘aura of defensiveness’ in the likeable personalities of Stuart and Joanne, when communicating about their craft, helped to avoid the tension that can exist beneath the surface of many a new music / art presentation.
Although I disagree on some levels with the notion that ‘art of the past belongs in a museum’ (is it not okay to consider some music to possess a timeless quality, just as relevant today as it was then?), I found the experience rewarding and enlightening, much like the visit from Dr Sardeshmukh in 2006. Looking forward to next week…
Reference:
Favilla, Stuart, and Joanne Cannon. “Forum – Week 9 – ‘The Bent Leather Band’.” Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 4th of October, 2007.
Haines, Christian. “Forum – Week 9 – ‘The Bent Leather Band’.” Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 4th of October, 2007.
Tomczak, Sebastian. “Forum – Week 9 – ‘The Bent Leather Band’.” Workshop presented at EMU Space, Level 5, Schultz Building, University of Adelaide, 4th of October, 2007.
Forum – Week 8 – Completed instrument:
Well, here it is: the ‘Squarewave/Piezo/Vicsynthingy-Mach-1’. It is a combination of manipulable square-wave oscillators, piezo transduction and Victorian synthesiser concepts that functions as a stand-alone instrument (with the exception of external amplification). The output of the square-wave oscillators feeds into a single piezo, which resonates inside a metal can. This is picked up by a second piezo, also inside the can, and fed to an amplifier. A Victorian synth device sits on top of the can, and its sonic result is also picked up by the piezos, merging the two sounds.
Expressive control is provided by LDR and pot attachments, push-button on/off switching devices, and manual Victorian synth control. The can itself can also be tilted to create varying timbres from the altered positioning of the piezos. The end result is an instrument capable of producing a variety of tones and timbres, allowing plenty of performance techniques to be applied ‘on the fly’.
Click here to link to online folder containing a high resolution photo archive and an audio example.
Reference:
Haines, Christian. “Music Technology Forum – Completed Instrument.” Music Technology Forum – Instrument Project. Electronic Music Unit, University of Adelaide, South Australia, September 2007.