Thursday, May 29, 2008

Forum – Week 11 – The Philosophy of Technology

This is my modest runabout, for short errands to the shops etc. Some say it's excessive... I say it's mmmaaaaadddd!!!!!

This was a surprisingly engaging discussion with genuine opinions bandied about by numerous students for a change. I guess the idea of guilt attached to the use of technology sparks a reaction in many. I believe capitalism is a deeply flawed system of social structure that rewards greedy ambition with excessive material gain and power.

ABC radio recently featured some talkback with the American economist Robert H. Frank, who has conducted research into whether money feeds happiness. Apparently it does, to a degree. According to the results, happiness can be bought for a household income of around $100,000 per annum in the current economic climate. This apparently enables an average family to pay off a modest mortgage, eat well and purchase any services they need to make life easier and more fun, such as gardening, car washing and entertainment. Beyond this, the researcher claimed, there is no higher level of genuine happiness to be attained. In fact, he stated that excessive earning and lavish lifestyles can be quite damaging, as those living such a life become alienated from the ‘real’ world. If only the capitalist pigs that promote unnecessary consumerism would take a long hard look at themselves…


Reference:

Whittington, Stephen. “Forum – Week 11 – The Philosophy of Technology.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 29th of May 2008.

Harris, David. “Forum – Week 11 – The Philosophy of Technology.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 29th of May 2008.

Tuesday, May 27, 2008

Audio Arts – Week 11 & 12 – “Elder Hall and Orchestra Recording”

Plan image created using the ultra versatile "Cardioid Graffle" software..


As you can see in the image, I have gone with a similar layout to Ray’s from Friday’s concert, but made some subtle changes. Given our selection of mics at EMU, I feel this type of recording would benefit from the use of boutique condensers above all else.

Starting at the front of the stage, I will have a spaced pair of Neumann U87s to complement the fixed A/B configuration that lives in Elder Hall. These should sit around 2 metres above the heads of the seated performers, but I plan to angle the diaphragms around 45 degrees in toward each respective string section. Cardioid polar patterns will be used.
Rather than spot-miking the bass strings, I have opted to use the highly sensitive Rode NTV at a similar height to the U87s, but toward the rear of the bass string group, with the diaphragm facing inward once again. Mirroring this on the left of the stage will be a Neumann U89 that should reinforce the high strings nicely. The U89 will be set to omni for consistency with the Rode NTV's preset polar pattern.

The woodwind will be semi-spot-miked, with two Neumann KM84s angled parallel to the floor and only slightly above the heads of the woodwind players. Hopefully mimicking Ray’s technique here with different mics will reinforce the woodwind without boosting the horns too much. The Rode NTV on the bass strings should complement the U87s in picking up a nice degree of room ambience and audience noise.

This takes care of our maximum of eight inputs at Elder Hall. The outputs should be sent via the FireWire desk into GarageBand, Cubase SX or similar for capture. The session must be at least 24-bit / 48 kHz (bit depth / sample rate) though. I was very surprised that Ray went with standard Red Book format (16-bit / 44.1 kHz) on Friday, as the lower dynamic range could be a real problem with such dynamic music.
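
For the record, the rough arithmetic behind that concern (theoretical figures only, at roughly 6 dB of dynamic range per bit of word length):

16-bit (Red Book): 16 x 6 dB = approx. 96 dB theoretical dynamic range
24-bit: 24 x 6 dB = approx. 144 dB theoretical dynamic range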

Depending on the piece to be performed, there may be a need to follow a score during recording for the purpose of gain riding. Hopefully this will not be the case, but the Rode NTV is particularly sensitive and can overdrive easily.


Reference:

Haines, Christian. “Audio Arts – Week 11 – Elder Hall and Orchestra Recording.” Practical excursion presented at Elder Hall, University of Adelaide, 23rd of May 2008.

Creative Computing - Week 10 - Streams(3)


The big band of the future, represented by an ironically small amount of code..

There was a particularly misleading and confusing section of the help file ‘Events7.help’ this week. Regarding something referred to as a message function, it gave not only a poorly annotated example of the \msgFunc parameter syntax, but a faulty one at that, which crashed after one or two notes, grr! I say it was misleading because the pre-example blurb gave the impression that this was the structure needed to pass information to a non-default synth in an EventStreamPlayer. This is not the case, I discovered an hour later: changing symbols such as \midinote and \db to their respective counterparts in my synth (converting the values to usable form, of course, with .midicps etc.) worked fine.
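
For my own reference, a minimal sketch of the sort of thing that ended up working: give the Pbind keys that match the synth's own argument names and do the conversions by hand. The synth and values below are hypothetical stand-ins, not my actual patch.

// Sketch only: Pbind keys match the SynthDef's argument names, with values
// converted by hand (.midicps, .dbamp) instead of relying on \midinote and \db.
(
SynthDef(\myPing, { |freq = 440, amp = 0.2|
    var env = EnvGen.kr(Env.perc(0.01, 0.4), doneAction: 2);
    Out.ar(0, SinOsc.ar(freq, 0, amp * env) ! 2);
}).add;   // .store / .memStore on older versions of SC
)

(
Pbind(
    \instrument, \myPing,
    \freq, Pseq([60, 64, 67, 72], inf).midicps,   // MIDI note numbers converted to Hz
    \amp, Pseq([-12, -18, -24], inf).dbamp,       // dB values converted to linear amplitude
    \dur, 0.25
).play;
)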

I have embedded my Pbind/Paddp/Pseq/Pxrand/Ptuple thingy in a Task so the random selection is re-executed on every repeat of the sequence. Once I got the Ptuple patch playing with my synth, I modified one of the ‘white keys’ generic music makers from Events6.help to give the sequence some accompaniment. A touch of extra randomness was then added and a formidable self-composing revolution was born.
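
A stripped-down sketch of the Task idea (default synth and a made-up note list, not the actual patch): because the Pbind is rebuilt inside the loop, the Pxrand makes a fresh random selection on every pass.

(
t = Task({
    loop {
        // rebuilding the Pbind each pass means Pxrand re-chooses its notes
        Pbind(
            \midinote, Pxrand([60, 62, 64, 67, 69], 8),
            \dur, 0.25
        ).play;
        (8 * 0.25).wait;   // let those eight notes finish before the next pass
    };
}).play;
)

// t.stop;   // halt the loop when done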


Click here to link to online folder containing this week's .rtf file and MP3 example.


Reference:

Haines, Christian. “Creative Computing – Week 10 – Streams (3).” Lecture presented at tutorial room 408, level 4, Schultz building, University of Adelaide, 22nd of May 2008.

Thursday, May 22, 2008

Forum – Week 10 – “Scratch”

Oh, yeah..


I really enjoyed a second viewing of Doug Pray’s fantastic documentary on the noble art of Scratching. I had forgotten just how humorous it was, and how talented the practitioners involved were. Turntablists Qbert, Shadow, and Mix-Master Mike were equal victors in my ill-informed opinion. The art has grown to a level of incredible complexity in a very short time.

I am most impressed with those who find a medium between successfully executing virtuoso technique and fulfilling their role as performers. This is clearly not only applicable to the art of turntablism but to virtually all music performance – engage with the people, not just the instrument.

I find it interesting that such a wide audience exists for what many may see as an indulgent and alienating art form. As I have clarified for myself in recent studies into the role of virtuoso performers, it seems there will always be a market for some forms of indulgent music that thrive on pushing boundaries of human limitation. For me there is nothing more engaging in a performance than witnessing entertainers really putting themselves out there, going to a place accessible only through tireless dedication and practice. It never ceases to inspire.


Reference:

Whittington, Stephen. “Forum – Week 10 – Scratch.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 22nd of May 2008.

Monday, May 19, 2008

Audio Arts - Week 8/9 – Live Sound and Acoustics

Mix it up - uh!

(1) What do you think the purpose of live recording is?

Live recording at its purest is capturing a genuine moment in time. It is the art of preserving a performance in the clearest form that the environment, equipment and circumstances will allow.

The biggest advantage that live recording should offer is an abundance of performance energy. Whatever the performance type, this should be the group in their element, the moment they all work toward and long for. Public performance always introduces elements of adrenaline, fear and excitement that, in the hands of seasoned professionals, should be focused into an energetic and organic experience. It is difficult to reach the same level of hyperactivity in a sterile recording studio environment, and live recordings, while not always offering note-perfect performances, should give listeners a piece of the group as raw musicians, putting themselves out there without fear of exposing their flaws.

Disadvantages are numerous. There can be little or no control over the reflective properties of the space, so microphone placement must be optimised for the space as it is. All miking techniques are affected by this, so any or all can produce lower quality than could be expected from a studio recording space.
There is generally no dedicated space for monitoring, so the closest available room of suitable size (or smaller) must be settled upon.
Room microphones must take the audience noise into account. The optimum placement for room ambience may expose the mics to isolated noise from a small pocket of the audience, making the crowd sound smaller overall.
The big one, of course, is that the band can only do one take of each song. If it's really bad, there's no second chance.


(2) Where do you think the future of studio and venue acoustics lies?

For lower-end and home recording studios, the new plug-ins and small hardware devices offering dedicated room-correction equalisation seem to offer the only ‘solution’ to the perils of mixing and monitoring in an acoustically inferior space. As the price of labour escalates and the availability of personnel decreases, any room treatment that requires a tradesman’s assistance is a very expensive option, even before an acoustic engineer is considered. Given that software and hardware in this field is only going to improve, perhaps a middle ground between relatively cheap acoustic baffling and software/hardware correction should be the goal for those with tight budgets. Removing some of a room’s reflective issues with baffles could reduce the margin for error in the correction devices, yielding a more accurate result overall.

All of us who study at EMU know the cost incurred in treating the recording space. It quickly chewed through what seemed to be a considerable budget, and the acoustic engineer’s fees ended up compromising the overall result, which was left unfinished. Looking at EMU as a test case for exploring the merits of software like Advanced Room Correction (ARC) from IK Multimedia, or the KRK Ergo system, is a useful exercise in economics. Given some of the software and hardware purchased by EMU over the last year, either of these options would be a relatively small investment as an experiment. Treating the Studio One and Studio Two control rooms would be unrealistically expensive by comparison. Even if the ‘virtual’ solutions are not up to scratch, smaller projects to treat the rooms physically could then be embarked upon with a clearer sense of what is actually necessary. Personally, I think a correction device plus a good number of decent acoustic baffles of varying sizes would be the best option for EMU at this stage. Varying baffle sizes would allow for a degree of experimentation in the control rooms to find a balance between correction and treatment.

Ultimately it seems that physical treatment of the expensive kind is just one more area of sound engineering being absorbed into the virtual world.


References:

Grice, David. “Audio Arts - Week 8/9 – Live Sound and Acoustics.” Lecture presented at Immanuel College Concert Room and Recording Studio, Novar Gardens, SA, 16th of May 2008.

"KRK Systems Releases Ergo, Powerful Room Audio Correction System". 2008.

Multimedia, IK. "Advanced Room Correction System - Chapter 1 - Arc Overview". 2008.

Stavrou, Michael Paul. "Stav's Word: Recording a Live Gig." Audio Technology.45: 64 - 66.

Creative Computing - Week 9 - Streams (2)

etc etc..

This week I have kept the synth simple once again and focused on gaining a fundamental understanding of the subject matter. The 'Event' class was a bit tough, but up to the end of the 'Environment' section I believe I'm on the right track.

I wanted to incorporate the Pbind class in some way, but time has restricted me to using the Pshuf and Pser objects. These are organised as a shuffled list of Psers that live inside a Pshuf object and carry similar but varying data for musical effect. I have included nil values at one index of each Pser list to give some rhythmic variety. The clever touch this week was getting a Task to exist within another Task. This allowed the patch to play through the shuffled sequence via the internal Task and repeat it exactly, or partly, according to a preset variable. The external Task, or 'superTask' as I have named it, then reshuffles the list, and the new list is played through the desired number of times.
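
A bare-bones sketch of that nested-Task shape, just to record the idea (illustrative note lists, the default synth, and \rest symbols standing in for my nil rests):

(
var noteLists, innerRepeats = 2;
noteLists = [
    [60, 64, \rest, 67],
    [62, 65, 69, \rest],
    [59, \rest, 62, 67]
];
~superTask = Task({
    loop {
        var order, len, inner;
        order = noteLists.scramble.flat;   // the superTask reshuffles the list each pass
        len = order.size * 0.25;
        inner = Task({
            innerRepeats.do {
                // the internal Task plays the same shuffled order exactly, innerRepeats times
                Pbind(\midinote, Pseq(order, 1), \dur, 0.25).play;
                len.wait;
            };
        }).play;
        (innerRepeats * len).wait;         // wait for the internal Task before reshuffling
    };
}).play;
)

// ~superTask.stop;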


Click here to link to online folder containing SC file and an MP3 example.


Reference:

Haines, Christian. "Creative Computing - Week 4 - Streams (2)." Lecture presented at tutorial room 408, level 4, Schultz building, University of Adelaide, Thursday 15th of May 2008.

Thursday, May 15, 2008

Forum – Week 9 – Masters Student Presentations:



Sebastian Tomczak’s devotion to the re-texturalisation of sound as a compositional basis seems like a pursuit of endless depth and possibility. However, I can’t help but feel there is something lacking in many of the finished products from a musical point of view. I think the format of the Milk-Crate sessions contributes to this with its narrow time frame. It would be impossible to produce a high quantity of meticulously crafted work in so little time; such things are agonised over for weeks, months and years in my experience. Milk-Crate concerns aside, I was glad to finally hear his water-surface controller in action, but would have enjoyed some lower-frequency action from the output.

Darren Curtis seems to have been busy over the last couple of years, getting involved with what appears to be a combination of installation and live performance sound/visual art. Darren also sources sound from unlikely means, but in more organic(?) areas than Sebastian. I was surprised that his Fringe Festival project’s content was devoted to the planets and not directly related to “Frequency Medicine”. His music was relaxing, and it's interesting to hear mystical stories of Aztec multi-track recording from the dark ages.


Reference:

Tomczak, Sebastian. “Masters Student Presentation.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 15th of May 2008.

Curtis, Darren. “Masters Student Presentation.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 15th of May 2008.

Tuesday, May 13, 2008

Creative Computing - Week 8 - Streams(1):

NY, NY..

The first problem encountered this week was that my previous synths were too temperamental to use in the stream-control exercise. Solution: steal some of their working parts and make a new one. The new synth uses Saw UGens in conjunction with an amplitude envelope and an RLPF. I wanted to create a piece of stream-controlling code that played melodies with accompanying chords, and took the time to create array banks of appropriate MIDI note numbers for this purpose.
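
Roughly, the shape of the new synth (a sketch with made-up parameter values, not the actual SynthDef): a pair of slightly detuned Saw UGens, a percussive amplitude envelope and an RLPF on the way out, plus an array bank of MIDI note numbers to draw on.

(
SynthDef(\sawTone, { |freq = 220, amp = 0.2, cutoff = 1200, rq = 0.3|
    var env = EnvGen.kr(Env.perc(0.02, 0.6), doneAction: 2);   // amplitude envelope frees the synth
    var sig = Saw.ar([freq, freq * 1.005], amp);               // slightly detuned pair of saws
    Out.ar(0, RLPF.ar(sig, cutoff, rq) * env);                 // resonant low-pass filter
}).add;
)

~cMajor = [60, 62, 64, 65, 67, 69, 71, 72];   // one of the MIDI note banks, for example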

Making polyphonic chords in SC is no trivial task, it seems. Time has forced me to settle for chords of randomly compiled notes from the I, ii and V chords of C major, with independently chosen durations. The result is a Steve Reich ‘New York Counterpoint’-like composition that provides a surprising amount of variety for some modulated saw waves (vaguely) in the key of C.
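
A sketch of the chord mechanism (illustrative values, using the \sawTone sketch above): in a Pbind, an array of MIDI notes spawns one synth per note, so choosing a random handful of notes from a randomly selected chord on each event gives cheap polyphony.

(
var chords = [
    [60, 64, 67, 72],   // I  (C major)
    [62, 65, 69, 74],   // ii (D minor)
    [55, 59, 62, 67]    // V  (G major)
];
Pbind(
    \instrument, \sawTone,
    \midinote, Pfunc { chords.choose.scramble.keep(3) },   // three random notes from a random chord
    \dur, Prand([0.25, 0.5, 0.75], inf)                    // independently chosen durations
).play;
)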

The Synth .set method is still causing me trouble, so I shall endeavour to brush up on it during the week.
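
For that revision, the basic .set usage I keep tripping over, sketched with the built-in \default synth:

x = Synth(\default, [\freq, 220]);
x.set(\freq, 330);   // update a running synth's argument by name
x.release;           // fade out via the default synth's gated envelope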


Click here to link to online folder containing .rtf files and an audio example of the NewYorkCounterPointerAtor.


Reference:

Haines, Christian. "Creative Computing - Week 8 - Streams(1)." Lecture presented at tutorial room 408, level 4, Schultz building, University of Adelaide, 8th of May 2008.

Thursday, May 08, 2008

Forum – Week 8 – “Peter Dowdall and Sound Engineering”

It was a long search, but finally: a photo in which she is not engaged in pulling down her pants..



Mr Dowdall presented a scarily realistic overview of his life as a devoted sound engineer. Starting with what seemed like a worst-case scenario, he outlined a past experience of recording a jazz big band in the EMU recording space at the University of Adelaide. The fact that Peter had extensive experience with a space that we are all familiar with proved particularly useful in visualising the various situations that arose (not to mention his fastidious photographic chronology of the session).

I find that many audio engineering lectures, when dealing with technical specifics, tend to cover similar issues repeatedly. Peter’s was no different on the surface – microphone placement, mic suitability, headphone monitoring availability, communication, sound bleed etc. – but his repeated reference to the signal-to-room-noise ratio, which must be considered for every instrument, struck me as something I have not given proper consideration to in the past. Coaching performers to minimise their movement, for instance, is something I have mostly approached as timbral quality control based on the sound a performer makes, not on the other sounds that may become more present as a result of their movement.


Reference:

Dowdall, Peter. “Sound Engineering and Session Management.” Lecture presented at EMU space, level 5, Schultz building, University of Adelaide, 8th of May 2008.

Tuesday, May 06, 2008

Creative Computing – Week 7 – Synthesiser Definitions (2)

Who needs a quad core when your Mac runs at 600%+?


This was another challenging exercise. My biggest issue with programming thus far is memory – mine, not the computer’s. I find that once a file or program reaches a certain level of complexity, it becomes very difficult to remember (regardless of how thoroughly one annotates the code) what each variable does in the various contexts in which it appears.

For step one this week I have stolen Jacob Morris’ code from week 6. I couldn’t quite work out the flow of his SynthDef; it appears to fill an Array with voices which are then mixed together for some kind of industrial-sounding end result. Nevertheless, I’ve added an amplitude envelope as an audible alteration.
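
My reading of the borrowed code, as a rough sketch (almost certainly not Jacob's actual approach): an array of drifting Saw voices summed with Mix, with the added amplitude envelope on the end.

(
SynthDef(\stackedSaws, { |amp = 0.2|
    var env = EnvGen.kr(Env.linen(0.1, 2, 0.5), doneAction: 2);   // the added amplitude envelope
    var sig = Mix.fill(8, { |i|
        Saw.ar(50 * (i + 1) + LFNoise1.kr(2, 10), 0.125)          // eight drifting, roughly harmonic saws
    });
    Out.ar(0, (sig * amp * env) ! 2);
}).add;
)

Synth(\stackedSaws);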

The new SynthDef I have created uses the cheesy piano sample donated by Mr Haines. It is unrecognisable in its new context, as a lot of modulation is performed on the sample playback rate. The final output is passed through a Resonz filter UGen with a modulating resonant bandwidth. Pitch bend / modulation wheels don’t like “c.set” in this patch: they pass data to SC but the SynthDef’s argument symbols are not recognised. They will pass to a generically named “synth”, however.
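
The general approach, sketched with a placeholder sample and made-up modulation values (not the actual patch):

(
b = Buffer.read(s, "sounds/a11wlk01.wav");   // stand-in for the donated piano sample
SynthDef(\mangledPiano, { |bufnum = 0, amp = 0.3|
    var rate = LFNoise1.kr(0.5).range(0.25, 2);                    // wandering playback rate
    var sig  = PlayBuf.ar(1, bufnum, rate * BufRateScale.kr(bufnum), loop: 1);
    var bw   = LFTri.kr(0.2).range(0.05, 0.5);                     // modulating resonant bandwidth
    Out.ar(0, Resonz.ar(sig, 800, bw, amp) ! 2);
}).add;
)

x = Synth(\mangledPiano, [\bufnum, b]);
x.set(\amp, 0.5);   // updating an argument by name on the running synth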


Click here to link to online folder containing this week’s code bundle and an MP3 example.

Reference:

Haines, Christian. “Creative Computing – Week 7 – Synthesiser Definitions (2).” Lecture presented at tutorial room 408, level 4, Schultz building, University of Adelaide, 1st of May 2008.

Thursday, May 01, 2008

Audio Arts – Week 7 – “Location Recording”

The bottom file is an unusual '+1 heavy waveform from the AKG'.

Recording: Schultz Level 10 auditorium
Musician: Robert Keane
Instrument: Euphonium

Techniques attempted:

1) Spaced Pair – approximately 2 metres away, left and right at 45 degrees to the performer, cardioid patterns (AKG on the right):

A spaced pair with two different types of microphone creates an obvious level imbalance, and the U87 had to be backed off to compensate.

2) Mega Spaced Pair – same angles but about 7 metres back, cardioid patterns:

Same situation with levels. I preferred the sound of close miking in this space, as the linoleum floor reflections produced a very ‘harsh’ room sound for distance miking.

3 & 4) AKG front and close, U87 3 m back (cardioid patterns); then AKG front and close, U87 3 m back (omni patterns):

The first of these two is U87 only, as the AKG recorded weakly and only added room noise when boosted – a drawback of running both mic inputs at the same gain. On omni the signal was clearer. The reflections from the room are awful.

5) Mid Side, in middle of room:

No issues with levels – horrible reflections again.

6) Experimental – mics behind obstacles (whiteboard, curtains):

An attempt to fight the room with bad mic placement. It failed to bring out anything but excess button and room noise.

Click here to link to online folder containing MP3 examples for this week.


Reference:

Grice, David. “Location Recording.” Lecture presented at Elder Hall, University of Adelaide, 29th of April 2008.

Forum – Week 7 – “Tristram Cary”


Has Davros finally met his match?

Homage was paid to a most interesting composer and Adelaide University legend today. I had the pleasure of meeting Tristram briefly in 2006, and I have to agree with David Harris’ sentiments regarding his aura of kindness and generosity. He was no longer the imposing figure that David described, age having compromised much of that, but he seemed fascinating nevertheless. Intelligent elderly people often have that effect; my grandfather was the same and my grandmother still is. They compel you to listen and try to take some of their objective experience on board, no matter the subject at hand.

It is almost impossible for our spoilt minds to imagine what it would have been like to take part in the invention of electronic sound generation and manipulation technology. Even today the VCS3 is a fun and versatile device to use, providing a wide range of tones in infinite analogue glory; to hear such things for the first time must have been an overwhelming experience. I found his electronic composition a little on the complex side (little or no repetition, rarely a definable pulse), but careful attention to detail ensured the soundscape never came across as harsh.


Reference:

Whittington, Stephen. “Tristram Cary.” Workshop presented at EMU space, level 5, Schultz building, University of Adelaide, 1st of May 2008.