Friday, April 27, 2007

Audio Arts – Week 7 – “Recording the Piano”



Being the Pianola robot for the day, I felt a little cut off from the sound-engineering side of the fence at times. Then again, I did get to work on my mediocre keyboard technique, which, as you will find out upon listening to the examples below, is just as bad as ever…

Harsh self-criticism aside, the exercise did give me some fresh insight into the art of capturing the soul of a great piano. It just so happened that the tuner was onsite for a regular service of the EMU Steinway as we were setting up mics, so there was no excuse for bad intonation. The piano in question sounds great in any position in the room to my ears, but probably best in the centre, where we chose to record it. That being said, making a great sound and capturing a great sound are two very different things.

Honky Sonata:

This example is the result of a combination of three dynamic mics. The Beta 52 and MD-421 were placed underneath the soundboard on the bass (long) side of the piano, and the SM-57 was placed above and at 45 degrees to the treble strings, close to the keyboard. The first impression this may give is of a quirky, low-quality sound, which is technically correct I guess, but if a honky-tonk piano sound was the desired effect then a successful outcome has been achieved.

Roomy Sonata 1 & 2:

Two examples of the same section: the first uses the mid-side technique but substitutes the bottom mic of the MS pair with an MD-421 under the soundboard, with an NTV as a room mic. The second uses the MS configuration with the NT5s as a stereo pair of room mics.
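As a side note for anyone decoding these MS takes later: the stereo image only appears at mixdown, when the mid and side signals are summed and differenced. Here is a minimal sketch of that decode step in Python (numpy only; the side_gain parameter is my own illustrative addition, not a setting from the session):

    import numpy as np

    def ms_decode(mid, side, side_gain=1.0):
        """Decode a mid-side recording into a left/right pair."""
        mid = np.asarray(mid, dtype=float)                 # forward-facing capsule
        side = np.asarray(side, dtype=float) * side_gain   # figure-8 capsule
        left = mid + side
        right = mid - side
        return left, right

Raising side_gain widens the image; lowering it pulls the result towards mono.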

Prelude no.1:

For this Bach Prelude I have provided two examples of the mid-side technique without a room mic, for easier comparison. The first example has the U87 on top set to figure-8, with the AKG B-ULS underneath set to omni. This seemed preferable to John and Luke at the time, but upon closer inspection of the second example, with the polar patterns of the mics inverted (positions remaining the same), I think it produced a softer, less 'honky' recording.

Prelude no.21:

A simple approach was adopted here, trying out the spaced-pair idea touched on briefly by David G. at the end of Tuesday's lesson. The winner here is clearly configuration 1, which placed the U87 above the strings at around 45 degrees to the soundboard at the treble end, and the NTV at the bass end with similar positioning. The recording is quite rich and resonant considering there were only two mics involved. Switching the positions of the mics produced quite the opposite result, however, so live and learn…

Click here to link to online folder containing audio examples.


Reference:

David Grice. "Audio Arts – Wk 7 – Recording the Piano." Tutorial presented at EMU space and Studio 1, Level 5, Schultz Building, University of Adelaide, 24th April 2007.

Thursday, April 26, 2007

Forum – Wk 7 – “Gender in Music Technology, Can you tell the difference?”



Ben Probert, Douglas Loudon, Amy Sincock, Jacob Morris:

Utilising his increasing knowledge of, and insight into, the historically male-dominated discipline of music education, Ben highlighted some valid points that may contribute to the lack of women in present-day music technology. In particular, I thought the issue of expectation and prejudgement placed upon someone of a specific gender entering a competitive field generally reserved for the other could be a major factor influencing one's initial decision. After all, if it is assumed that the work one produces is always going to be judged harshly as a result of something irrelevant (gender) before it has even begun, then those lacking the courage to present their work to such a hostile society may seek other, friendlier outlets for expression. Not all women have the relentless drive and confidence of Germaine Greer, but I'm sure many have just as much to say.

It is therefore unfortunate that we see fewer women involved in this section of music society. We are surely missing out on the benefit of the talent and originality that many who have shied away from the field may have had to offer. Such is the current situation, however; maybe things will improve with time…


Douglas Loudon, I believe, unintentionally created a monster of audience participation, which threatened to turn into another mess of simultaneous loud debating and opinionated rhetoric. Perhaps presenting an opportunity for the crowd to intervene without first giving himself the chance to present and articulate his points was the can opener? All the same, it seemed to flow with some continuity into Amy's presentation, whose approach was somewhat similar.


Amy remarked on her longing for a utopian society in which concerns of gender equality are obsolete, as all are judged fairly on the merit of the talent they have to offer. This sounds wonderful but is far from the case in society as we know it. I know feminists can seem overbearing and ruthless in their pursuits at times, and some of them may well be over the top when it comes to certain observations and assumptions that they make, but they aren't always behaving that way without reason. If we look just a little into our own political past, the inequality between genders was once at a level that would today be regarded as completely unacceptable. Women didn't gain the right to vote by sitting on their hands and dreaming of a world above such injustice; they continuously rallied, fought and tried their best to raise awareness of the fact that the injustice actually existed.

I think that an aggressive, well-researched and articulated statement is often necessary to get the people who have the power to enforce change to realise that there is a need for it at all. At the end of the day it's usually men who inhabit positions of influence, and if you don't shout they'll just keep on tinkering. I guess one problem with enticing women into such influential positions is that all the aggression and peacock strutting involved puts them off major protesting and politics in the first place. I'm straying a little from music technology, but it's all relative…


Jake, which flavour do you prefer?


Reference:

Stephen Whittington. “Gender in Music Technology, Can you tell the difference?” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 26th April, 2007.

Ben Probert. “Gender in Music Technology.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 26th April, 2007.

Douglas Loudon. “Gender in Music Technology.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 26th April, 2007.

Amy Sincock. “Gender in Music Technology.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 26th April, 2007.

Jacob Morris. “Gender in Music Technology.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 26th April, 2007.

Monday, April 16, 2007

CComputing – Wk 6 – Messaging and Routing

CComputing – Wk 6 – Messaging and Routing: 02/04/07


That's a tidy piece of work, just don't dig too deep…

For those of you using PCs at home and dealing with cross-compatibility issues, take note if you haven't come across this already: on the Macintosh platform there is a limit to how long the file names of your objects can be. Beyond that limit the Mac has one of its interesting moments and changes the letters exceeding the limit to numbers (which only shows in the patcher window when you open the patch by itself, incidentally), and Max then just spits out the old 'no such object' error for the file or files in question. The only solution I could find was to rename the files with fewer characters. I hope this saves the rest of you some hair.
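If you want to catch the problem before Max does, something along these lines will flag any object or patch names that are pushing their luck. This is only a sketch, and the 31-character figure is an assumption based on the old Mac file-name limit, so adjust it to whatever your system actually tolerates:

    import os

    MAX_NAME_LEN = 31  # assumed limit; change it if your Mac tolerates more

    def check_object_names(folder):
        """Print any file names long enough to risk being mangled."""
        for name in os.listdir(folder):
            base, _ext = os.path.splitext(name)
            if len(base) > MAX_NAME_LEN:
                print("Too long (%d chars): %s" % (len(base), name))

    # e.g. check_object_names("Dave's Object")  # the objects folder for this patch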



The random sequencer seems to finally be pushing us in a musical direction with this project. I may have strayed from the requested features a little in my creativity, but I think the result is interesting and musical enough regardless.



Pitch Selection:



The pitch selection in my sequencer is not specific as such, but allows the user to choose between three sections of the keyboard for the random object to play notes from. The random object actually selects these notes from randomly loaded table objects, however, and I have included a facility that allows the user to 'draw' in their preferred pitches and velocities if desired.

I never was any good with chords, maybe this is why...
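As a rough illustration of the selection logic only (not the patch itself; the register split points below are placeholders rather than the exact boundaries I used):

    import random

    # Placeholder keyboard sections -- the actual split points in the patch differ
    SECTIONS = {
        "low":  range(36, 60),    # roughly C2-B3
        "mid":  range(60, 84),    # roughly C4-B5
        "high": range(84, 108),   # roughly C6-B7
    }

    def random_note(section="mid", user_table=None):
        """Return a (pitch, velocity) pair from a user-drawn table or at random."""
        if user_table:                      # values 'drawn in' by the user
            return random.choice(user_table)
        pitch = random.choice(SECTIONS[section])
        velocity = random.randint(40, 110)
        return pitch, velocity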

Note delta time and duration:

I have to admit I overlooked this feature in my design, but I have created an object to deal with one of the issues arising from note lengths: when too many notes are received by a synthesiser it can crash, which is pretty annoying when the sequencer happens to be making some interesting music at the time. My safeguard object is exactly that – a counter that flushes any held notes once it has received 20 MIDI pitches. The feature only kicks in when the sequencer is turned on, so it doesn't annoy the user during normal operation.
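Reduced to plain code, the safeguard amounts to something like the sketch below; the class and method names are mine, invented purely for illustration:

    class NoteFlusher:
        """Flush held notes every N pitches so the synth never chokes on hung notes."""

        def __init__(self, send, limit=20):
            self.send = send           # whatever pushes MIDI out, e.g. a noteout wrapper
            self.limit = limit
            self.count = 0
            self.held = set()
            self.sequencer_on = False  # the guard only counts while the sequencer runs

        def note_on(self, pitch, velocity):
            self.held.add(pitch)
            self.send(pitch, velocity)
            if self.sequencer_on:
                self.count += 1
                if self.count >= self.limit:
                    self.flush()

        def flush(self):
            for pitch in list(self.held):
                self.send(pitch, 0)    # velocity 0 acts as a note-off
            self.held.clear()
            self.count = 0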

Debugging:

This pretty much amounted to re-writing the whole program although certain things could be copied from the previous model. It serves me right for trying to do too much in the early stages, which only served to complicate things beyond belief.


Click here to link to online folder containing a Zip file of the patch and its associated help and object files.

Don’t forget to add the pathways to the ‘Dave’s Help’ and ‘Dave’s Object’ folders in Max file preferences before loading.


Reference:

Christian Haines. "Creative Computing – Week 6 – Messaging and Routing." Lecture presented at Tutorial Room 407, Level 4, Schultz Building, University of Adelaide, 2nd April 2007.

Friday, April 06, 2007

AArts – Wk 6 – “Recording Orchestral Strings”

AArts – Wk 6 – “Recording Orchestral Strings” 03/04/07:


Anna & the Cello:

This was a much more challenging instrument to capture than the sax or flute, I felt. The main issue was audible noise from the performer's natural movements whilst playing. The instrument itself seems prone to producing unwanted noise due to its structure; only the slightest tap with the bow, or even heavy use of the fretting hand on the fingerboard at an inopportune moment, could contaminate the waveform of an otherwise beautiful sequence.

The miking was a moderately different affair to Tuesday's efforts, as we opted to use a couple of room mics (U87 and U89) and compare the difference. Another questionable innovation was the idea of initially putting the KM84 overhead on the right-hand side of the performer. A second overhead was also tried with the Rode NT4 stereo condenser, but I opted to leave it out of all but the glissando and NT4-specific mix-downs, as it brought too much 'honk' into the sonic structure.

The room mics were tested independently at a considerable distance from the source, being on the other side of the room. As usual, I think the U87 was the real winner on the day, as the U89 is considerably less sensitive, which was a bad thing on this occasion, providing a duller, less lively ambience, but it's always worth giving others a chance. The reverberation level was tested in stages by gradually opening more and more curtains to 'open up' the sound. We settled on the north side open, south side closed scenario.

After a take of this setup, the room mics were moved to within a metre of the performer, and the KM84 shifted to her left-shoulder area, facing the upper fingerboard. The general group consensus was that this was the most successful of the day's efforts at mic positioning. I myself am undecided, but it's all subjective anyway.

Make up your own minds, people; it's still a free country for the moment.


Click here to link to online folder of MP3s.


Reference:

David Grice. "Recording Orchestral Strings." Tutorial held at Studio 1 and EMU space, Level 5, Schultz Building, University of Adelaide, 3rd April 2007.

Hugh Robjohns. "Strings Attached." Sound On Sound magazine, Media House, Trafalgar Way, Cambridge, UK, 1999.

Thursday, April 05, 2007

Forum – Wk 6 – “Collaborations Pt 3”

Forum – Wk 6 – “Collaborations Pt 3” 05/04/07:


Luke Harrold and Matthew Phipps: “The 9:13”

I guess the title should read collaborators on collaboration. Today's forum started out as a promising affair, with Luke Harrold's algorithmic accompaniment to the short film 'The 9:13' displaying his talent for original electronic music composition – something he seems to have been a little coy about in our recent Perspectives lectures. A piece (or pieces) driven much more by the texture of various sonic timbres and haunting harmonisation than by any tangible pitch or steady rhythm, it sat very well with the disturbing underlying theme of the film. Did anyone see that episode of Six Feet Under where David Fisher picks up a hitchhiker who then proceeds to torture him physically and emotionally for several hours? It reminded me of that, provoking the same feeling of empathetic helplessness for the victim in question.

The only drawback was that I was too familiar with Adelaide Railway Station and the trains where it was filmed. I found it hard to lose myself in the performance, as I kept noticing variants in the background noise and scenery that rang false to my own extensive experience of them. Such is the adverse consequence of local knowledge, I guess…


David Harris and Pamela Rataj: “Compossible IV for twelve musicians”

Now this was a far superior outcome to our recent group effort to perform this same piece. Why, you ask? Was it the professional musicians? Was it the awareness of each artist regarding what was required of them? (They clearly didn't have the idea presented to them five minutes before being asked to perform the piece, as we did.) Was it collaborating with the visual artist Pamela Rataj that tipped the balance in favour of a rich and rewarding sonic outcome?

I think possibly all of the above are correct to an extent, but I believe the simplest and most influential factor was that he recruited instrumentalists who play instruments capable of producing endless real sustain of pitches – something that none of us could do given what we had to work with a few weeks ago. And not only that, he had a tangible motivation behind his choices of notes and durations, that being the pursuit of random pitches and harmonisation of the microtones between our traditional chromatics. My faith in the results of experimental music is somewhat restored – nice work, Dave.


Poppi Doser and Betty Qian: “Behind the Door”

This work highlighted an interesting angle for consideration: the issue of a language barrier when collaborating with a person of non-English-speaking background. Despite the challenges presented by this situation, electroacoustic musician and composer Poppi Doser and visual animation specialist Betty Qian have created an intimate and beautifully delicate piece of work, utilising elements of fantasy, dream state and, I suspect, some subtle political and social observation. Obviously the collaborative relationship is satisfactory to both involved, as they are currently working on a new project, which I look forward to seeing…


Stephen Whittington: “Collaboration with the Dead”

I forbid you all to defile my masterworks with your vile rock and roll!

Stephen's insights into current musicians' collaborations with deceased composers felt like a many-tiered highbrow argument waiting to happen. After all, despite the will to honour the original composer's vision that may be expressed by various performers, they can never actually communicate with the person themselves, so does this situation really constitute a collaboration as such? I plan on arranging a thrash metal version of the very Sonata No. 8 that Stephen teased us with at the beginning of his talk. Would I not be collaborating with Beethoven because I am using his music outside of its originally intended context? Should black technically be referred to as dark grey, or is there really a point of colour saturation that can be scientifically proven and defined to be black?

Too late, LVB... Man, can this chick shred!

Until next time…


Reference:

Luke Harrold. “Collaboration with Computers”. Presentation and discussion held at EMU, Level 5, Schultz Building, University of Adelaide. 5th April 2007.

David Harris. “Collaboration for Compossible”. Presentation and discussion held at EMU, Level 5, Schultz Building, University of Adelaide. 5th April 2007.

Poppi Doser and Betty Qian. “Behind the Door”. Presentation and discussion held at EMU, Level 5, Schultz Building, University of Adelaide. 5th April 2007.

Stephen Whittington. “Collaboration with the Dead”. Presentation and discussion held at EMU, Level 5, Schultz Building, University of Adelaide. 5th April 2007.

Wednesday, April 04, 2007

CComputing – Wk 5 – “UI Controls and Application State”:


After another exciting bipolar evening involving some 10 or so hours in front of the Max edit window, I have still yet to discover the secret to the adjustable feedback program requested by our commander-in-chief. Nevertheless, I have cleaned up my edit layout (and subsequently re-cluttered it with new objects created for this week's task), and gotten all of the other functions to work.


White or black key selector:

This basically involves sending incoming pitch and velocity data to a pair of select objects via a gate (its status controlled by the key-colour selection in an ubumenu), a stripnote and a modulo, to give select a realistic set of figures to work with. Select then opens or closes another pair of gates, which receive the direct pitch and velocity info from notein, depending on which coloured notes it has received based on the ubumenu preselection of input.
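Stripped of the gate and ubumenu plumbing, the underlying test is just a modulo-12 lookup. A rough Python equivalent, for illustration only:

    BLACK_PITCH_CLASSES = {1, 3, 6, 8, 10}   # C#, D#, F#, G#, A#

    def passes_filter(pitch, velocity, mode="white"):
        """Let a note through only if it matches the selected key colour."""
        if velocity == 0:                    # drop note-offs, as stripnote does
            return False
        is_black = (pitch % 12) in BLACK_PITCH_CLASSES
        return is_black if mode == "black" else not is_black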


MIDI delay:

This step was straightforward: pitch and velocity are allowed to access a pipe object when the delay is switched on. The pipe's delay time in milliseconds can either be typed into the pitch wheel's manual-entry number box or adjusted in real time with the wheel itself. The pitch wheel's output is sent to pipe via a horizontal slider set to multiply the incoming values of zero to 127 by 50, giving a crude but broad range of delay values to work with – it's hours of fun with the auto-shredder…
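The scaling itself is trivial, but here it is as a sketch; the timer is only a stand-in for what pipe does natively in Max:

    import threading

    def wheel_to_delay_ms(wheel_value):
        """Map a 0-127 pitch-wheel value to milliseconds, as the slider does (x 50)."""
        return wheel_value * 50              # 0 ms up to 6350 ms

    def delayed_note(pitch, velocity, wheel_value, send):
        """Re-send the note after the chosen delay -- a crude stand-in for pipe."""
        delay_s = wheel_to_delay_ms(wheel_value) / 1000.0
        threading.Timer(delay_s, send, args=(pitch, velocity)).start()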

Feedback control: not yet established; I will seek assistance.


Keyboard shortcuts:

I haven't gone overboard with these, as the patch is starting to look quite dense and I don't want to confuse the issue too much. Just the important functions have been taken care of so far (a small sketch of the equivalent dispatch table follows the list):

Initialise: i

Panic button: spacebar

Switch to controller entry: c

Switch to keyboard and mouse note entry: k

Turn on/off delay: d

Start/stop the Shredder: s

Count MIDI instrument type down through list: down arrow

Count MIDI instrument type up through list: up arrow
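And, purely for illustration, the same bindings written out as a hypothetical dispatch table (the patch does this with key and select objects rather than code):

    # Hypothetical mapping mirroring the bindings listed above
    SHORTCUTS = {
        "i":    "initialise",
        " ":    "panic",
        "c":    "controller_entry",
        "k":    "keyboard_mouse_entry",
        "d":    "toggle_delay",
        "s":    "toggle_shredder",
        "down": "instrument_down",
        "up":   "instrument_up",
    }

    def handle_key(key, actions):
        """Call the action bound to the pressed key, if there is one."""
        name = SHORTCUTS.get(key)
        if name in actions:
            actions[name]()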


Click here to link to online folder containing text file of patch.


Reference:

Christian Haines. "Creative Computing – Wk 5 – UI Controls and Application State." Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 29th March 2007.