There was a relatively short space of time between plugging in the game controller, registering it in SC, and integrating the code into my week 6 patch (not my most exciting effort, but one suited to the exercise). Then an epic battle between man and machine ensued as I struggled to work out why the machine would not allow GUI manipulation code to be executed directly from the HIDDevice action. The reason, it turns out, is that GUI objects may only be touched from the application’s GUI thread – the AppClock – and the HID callback does not run there. If you’re going to insert code into your HID action that updates GUI objects with method calls like “slider1.valueAction(someValue)”, you must ‘wrap’ the code in a function and call .defer on the function, like so:
{slider1.valueAction(someValue)}.defer;
This way the update is queued on the AppClock and executed when the GUI is ready for it, while the machine carries on with its audio synthesis tasks. I’d guess that code evaluated directly from the editor already runs in that context, negating the need to do the same when working with GUI objects interactively.
Anyhow, I’ve finally succeeded in mapping the game controller to the four sliders in this patch. The eight main buttons control slider one, giving obviously crude resolution; the eight-way directional pad does the same for the second; and the analogue sticks drive the remaining two with much sharper resolution, of course.
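For the record, the whole pattern looks something like this – a minimal sketch in the style of the HIDDeviceService helpfile (~slider1, the device index and the 0–255 axis range are my own assumptions; the real patch dispatches on each element’s cookie):

(
HIDDeviceService.buildDeviceList(nil, nil);
~pad = HIDDeviceService.devices[0];    // assuming the controller is first in the list
~pad.queueDevice;
HIDDeviceService.action_({ |productID, vendorID, locID, cookie, val|
    // GUI methods may only run on the AppClock, so wrap and defer:
    { ~slider1.valueAction_(val / 255) }.defer;    // assuming an 8-bit stick axis
});
HIDDeviceService.runEventLoop;
)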
Click here to link to the online folder containing this week’s code and a short MP3.
Audio Arts – Week 11 – Semester 2, 2008: “Student Presentations”

This succeeded in holding my interest without a single change in instrumentation or a hint of Foley work. My only criticism would be that it did little to evoke genuine fear. However, it did convey the excitement of the chase scene well.
The electric guitars slowed to a deathly crawl were a personal highlight for me. The stark nature of some sounds left me a little cold in places, but then again this is a horror movie.
This suitably left of centre approach seems to be coming along nicely. I felt the initial buildup was probably drawn out too long, but this was only a rough edit.
Luke talked, audibly but indecipherably, almost the entire time his film was playing, which made it difficult to take the work in on its own merits. I still like the scream, but I think he shouldn’t get too attached to its ‘raw’ form, and should use it in various places with brutal and destructive effects.
? Composition Student:
Interestingly, but perhaps unsurprisingly, this music – written at the piano but intended for other instruments – produced a weird, incomplete effect. It’s too difficult for me to offer an informed opinion about its intended form from a Sibelius/Kontakt reduction.
Reference:
Harrald, Luke. “Audio Arts – Week 11 – Semester 2, 2008: Student Presentations.” Workshop presented at EMU space, level 5 Schultz building, University of Adelaide, 21st of October 2008.
Forum – Week 11 – Semester 2, 2008: “Robert Moog and Other Things That WOW”
In true prog style: the silliest piano solo you'll ever see/hear...
The robots are coming. This kind of tech-nerd fear-mongering poses a significant question for me: “Where is the higher consciousness going to come from?” I wonder if anyone thinks it will just miraculously happen when technology reaches a given threshold of complexity. Seriously, the amount of programming and engineering that must have gone into the ‘cute’ robot from the Steve Reich clip – to achieve far less than a sub-epsilon-semi-moron level of intelligence – indicates that we have little to worry about, at least in the immediate future.
Far more exciting was the Keith Emerson keyboard solo. I find it perplexing that a great many keyboard players are not interested in embracing the expressive possibilities technology offers them. Don’t great concert pianists wish they could bend a note once in a while? Or add some vibrato? Or throw in a dive bomb? Or dirty it up with a distortion pedal and a wah?
It was interesting to see how some conventional musicians from the 1970s had taken to synthesisers. They may have called it Avant Garde, but their “New York Minimalist” approach to electronic music was a far cry from Stockhausen to my ears (this is a good thing).
Reference:
Whittington, Stephen. “Forum – Week 11 – Semester 2, 2008: Robert Moog and Other Things That WOW.” Workshop presented at EMU space, level 5 Schultz building, University of Adelaide, 23rd of October 2008.
Creative Computing – Week 9 – Semester 2, 2008: “Spatialisation”

For the week 9 component on panning, I have added a reverb synth to my week 7 drum machine that incorporates a ‘SplayZ’ 5.1 panner. Rather than adopting the ‘round and round in a circle’ model that Mr Haines presented on Thursday, this class ‘spreads’ its array of input channels to evenly spaced locations across the output field – be it quad, stereo or anything else. The added GUI elements control the ‘width’ of the spread and the offset of each channel relative to the width setting.
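To document the idea, here’s a minimal sketch of what the class does. The SynthDef name and the placeholder percussive voices are mine – the real synth reads the drum machine’s buses – and in later SuperCollider versions the class is renamed SplayAz:

(
SynthDef(\splayVerb, { |out = 0, spread = 1, center = 0|
    // four placeholder voices standing in for the drum machine's channels:
    var drums = { Decay2.ar(Dust.ar(4), 0.005, 0.2) * PinkNoise.ar }.dup(4);
    var wet = FreeVerb.ar(Mix(drums), 0.4);
    // SplayZ spreads the input array to evenly spaced points across n channels;
    // spread narrows the image and center offsets it:
    Out.ar(out, SplayZ.ar(6, drums ++ [wet], spread, center: center));
}).add;
)
x = Synth(\splayVerb);
x.set(\spread, 0.5, \center, 0.25);    // narrow the image and shift it around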
I got a little carried away and added an auto-beat-maker button to the GUI, which spits out on/off messages to the box grids for fast and easy beat construction. I have also added an ‘update’ button to re-execute the reverb synth code. This lets the user mess with GUI settings and recreate the GUI for testing without having to re-initialise the sample synths – a workaround, as these will not load twice without quitting SC and starting again. Sample Manager appears to have no multi-channel export function that I can find, so I’ve used the freeware encoder L. Harrald drew to our attention a few weeks ago. The SplayZ class forces a curious effect on a drum kit, though not one I’d choose from a musical perspective.
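A note on that reload problem, in case it saves future me some grief: if the samples are held in Buffers, freeing them and waiting for the server before re-running the initialisation code might allow them to load again without restarting SC. Untested against this patch; ~loadSamples is a hypothetical stand-in for my loading block:

(
fork {
    Buffer.freeAll(s);     // free every buffer on the default server
    s.sync;                // wait until the server has actually freed them
    ~loadSamples.value;    // hypothetical: re-run the original sample-loading code
};
)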
Click here to link to this week’s code and a multi-channel sound file of the result.
Reference:
Haines, Christian. “Creative Computing – Week 9 – Semester 2, 2008: Spatialisation.” Lecture presented at tutorial room 408, level 4 Schultz building, University of Adelaide, 16th of October 2008.
Audio Arts – Week 10 – Semester 2, 2008: “Synopsis of My Project”
It doesn't look like much work when you put it like this...
Following Luke’s instruction to embrace and enhance the terror that is Nosferatu, I have taken an approach that works infallibly in all horror movies that adopt it. It can be described as soft, sinister, dark ambience interspersed with brutal transients to let the audience know when there is something onscreen (or about to be) that is truly menacing.
To achieve this I have used a combination of virtual instruments (orchestral strings, piano and synth pads designed for dark ambience) and specially designed samples. The creepy undertone to Nosferatu entering the man’s room is a sample of my own voice in various breathy forms, heavily processed with an Eventide harmoniser. Other samples have been sourced from various instrument banks, online databases and more personal creations.
I believe I have been successful in creating the ‘heart-jolting punches’ that one always receives from a truly suspenseful horror/thriller movie. The more I play with this genre, the more I feel it’s ‘all in the sound’ – horror films would be nothing without rumbling synthesisers, orchestral bass/percussion sections and screeching violins!
One of the most difficult problems I found was lining up footsteps with onscreen action. There is some choppiness or ‘frame jumping’ in this cut that makes realism hard to achieve on occasion.
Reference:
Harrald, Luke. “Audio Arts – Week 10 – Semester 2, 2008: Synopsis of My Project.” Lecture presented at EMU space, level 5 Schultz building, University of Adelaide, 14th of October 2008.
Forum – Week 10 – Semester 2, 2008: “Honours Student Presentations”

The delicate and subtle set of introspective animated short films presented by Poppi and Yisheng Qian (Betty) was quite engaging. It’s refreshing to witness a presentation of work that the artist has truly immersed themselves in for their own reasons. Too often we are presented with student projects that are products of course criteria rather than inspired creation (I include my own efforts in this observation). I’ve toed the line thus far with the “this is what I did with Max/MSP, SC, Audio Arts etc.” course-directed approach. When I see presentations like this one, however, where the person has gone against that model and decided to showcase something they really care about, it makes me wonder if I should have done the same. I mean, I found this presentation very entertaining, and I struggle to imagine anyone having the same reaction to one of my Max, Collider or Concrete pieces, which have felt more like an afterthought to the process of learning how to program than works of sonic art. Perhaps the criteria for student presentations could be opened up to a free-for-all “show us what you like to do with sound” model, rather than “show us what you did for this course last semester, with the painfully limited amount of time you had to work with.”
Reference:
Whittington, Stephen. “Forum – Week 10 – Semester 2, 2008: Honours Student Presentations.” Workshop presented at EMU space, level 5 Schultz building, University of Adelaide, 16th of October 2008.
Forum – Week 9 – Semester 2, 2008: “Student Presentations”
All we need now is a virtual person to play it...
Ben Probert’s controversial second-year Max program was nice to look at, as always. Unfortunately its musical/sonic output was a little underdeveloped. I have an interest in creating self-composing or computer-assisted composition software myself, so despite the simplicity of the program’s music it gave me a few ideas to explore. My own programming experience thus far indicates that moving beyond the simple output of a self-composing tool (as I’m sure Ben will agree) involves an exponentially increasing amount of code.
Luke Digance kept things rhythmic with his SuperColliding effort from last semester. Like my own piece on the day, it sounded a little stark. I think it’s a combination of the Bose speakers, which need extra drive to produce effective bottom end, and the recording space in EMU, which is designed to control sound, not enhance it.
I can understand why Will Revill wanted to present his impressive graphical interface from a former Max project. The lack of sound left me a little cold though. I’m glad I didn’t sign up for the collaborative game project; it sounded like a disorganised nightmare, without even a finished product for the trouble of the few hard-working individuals involved.
Jake Morris: Nice work on the BNU commercial, just watching it increased my energy level…
Reference:
Whittington, Stephen. “Forum – Week 9 – Semester 2, 2008: Student Presentations.” Workshop presented at EMU space, level 5 Schultz building, University of Adelaide, 9th of October 2008.
Creative Computing – Week 8 – Semester 2, 2008: “FFT (2)”

A note to self is necessary for future endeavours of this nature: “use the sliderData/buttonData array system for GUI building.” It’s a difficult decision to make at the beginning of a program; with a weekly exercise such as this one, I didn’t anticipate such a long page of code. The data array method is more confusing to implement initially, but seems far more manageable for GUI building when more than four GUI objects are required in an interface window. So, the rule for future projects will be to draw the proposed GUI roughly on paper and, if it contains 4+ objects, use a data array.
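To make the note concrete, a minimal sketch of the data-array approach in current GUI syntax (the names and layout are invented for illustration; 2008-era SC spells these SCWindow and SCSlider):

(
var w = Window("demo", Rect(100, 100, 300, 140));
var sliderData = [\freq, \amp, \pan, \rate];
// one loop builds and wires every slider, instead of four copy-pasted blocks:
sliderData.do { |name, i|
    Slider(w, Rect(10, 10 + (i * 30), 280, 20))
        .action_({ |sl| "% -> %".format(name, sl.value).postln });
};
w.front;
)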
FFT is a strange area of synthesis: confusing to understand and implement initially, and once this hurdle is overcome, not hugely rewarding sonically. Rather than chase audio bliss with this exercise, I have opted to merely integrate some GUI control into a couple of predefined FFT processes. My program uses the comb filter process and magnitude freezing in combination, allowing both synthesis types to be blended on up to two samples simultaneously. It needs some tweaking in the area of bus management and output mixing, but does an effective job of revealing the various results of these processes on audio data. Click here to link to the online folder containing this week’s SC code and an audio example.
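For reference, a minimal sketch of the two processes chained on one sample (the SynthDef name and parameter defaults are mine; the real patch adds GUI control, bus routing and a second sample):

(
b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");
SynthDef(\combFreeze, { |out = 0, bufnum, freeze = 0, numTeeth = 8, width = 0.5|
    var sig = PlayBuf.ar(1, bufnum, BufRateScale.kr(bufnum), loop: 1);
    var chain = FFT(LocalBuf(2048), sig);
    chain = PV_RectComb(chain, numTeeth, 0, width);    // comb-filter the spectrum
    chain = PV_MagFreeze(chain, freeze);               // hold magnitudes while freeze > 0
    Out.ar(out, IFFT(chain).dup);
}).add;
)
x = Synth(\combFreeze, [\bufnum, b]);
x.set(\freeze, 1);    // freeze the current spectral frame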
Reference:
Haines, Christian. “Creative Computing – Week 8 – Semester 2, 2008: FFT (2).” Lecture presented at tutorial room 408, level 4 Schultz building, University of Adelaide, 9th of October 2008.
Audio Arts – Week 9 – Semester 2, 2008

What impressed me most about the Hoth ice battle scene was, despite the corniness (Village Voice criticism duly noted!), Williams’ ability to switch between motifs as dictated by on-screen action. To perform this with convincing style is extremely difficult.
I have been meaning to experiment with minor/major inversion tricks for compositional sleight of hand for some time; clearly this technique is responsible for the Vader theme’s longevity. Other points of interest in Kalinak’s article were its exposure of the laborious task of orchestration, and the discovery of why so many 1960s movies annoy me. Why can’t American businessmen stick to their own game? This hijacking of film to sell pop records reminds me of a certain 1950s US songwriter who felt threatened by the self-contained juggernaut of Elvis Presley. Said composer put his art on the back burner for a while and (in the traditional entrepreneurial US style of marketing mindless crap to mindless consumers) created some stupid little collector dolls of the King himself. Once these were on the open knick-knack market, they apparently outsold Elvis’ ludicrously high record sales in the first year! It seems a fitting story to set beside a time when the accompanying pop songs of many pictures “out grossed the films they were composed for” (Kalinak 1997: 3).
Reference:
Harrald, Luke. “Audio Arts – Week 9 – Semester 2, 2008.” Lecture presented at EMU space, level 5 Schultz building, University of Adelaide, 7th of October 2008.
Kalinak, Kathryn. “John Williams and ‘The Empire’ Strike Back.” Online article, accessed 9th of October 2008. URL: http://web.archive.org/web/19970516041818/http://citd.scar.utoronto.ca/VPAB93/course/readings/kalinak.html