Thursday, May 31, 2007

Forum – Week 12 – “Composition vs. Improvisation”

Here we see a classic example of Bush administration improv, which, interestingly, is used to obscure the result of a failed composition.


Comprovisation? Improvosition? Incomprehensible imposition?!

Where the line between the two blurs, and we can never draw a truly accurate distinction as to where one finishes and the other begins, in the case of music I pose the question: do we really need to know or care? I often find these academic explorations into phenomena that defy rigid characterisation tedious. Given that we were discussing music today, I felt we could have been constructively making music (improvised or otherwise) instead of theorising about what constitutes composition or the other. If this were a politics degree focused on US foreign policy, for instance, and the discussion explored the different warring tribal factions in the Congo with a view to better understanding, a potential end to civil war and the saving of lives, then I would not hesitate to perceive the relevance. When it comes to music, however, this kind of intense scrutiny of “what is composition?” seems a little trivial in a troubled world that could certainly use questioning minds for a more constructive purpose.

All the same, I enjoyed hearing another example of the brilliant sitar player Dr C. Sardeshmukh. It brought back fond memories of the improvisation (or was that composition?) session involving one of last year’s forum ensembles, in which Dr C. challenged the group to a 45-minute duel with the C Lydian scale as his weapon of choice. Click here to view last year’s blog, which contains a link to a badly recorded audio file of the session.

The Ross Bolleter example, although exhibiting an interesting approach to the piano (‘ruined’ piano was Ross’s description), produced repetitive and annoying results to my ears. It was the constant broken or ‘dead’ key noise that did it. Clunk, clunk, clunk. To me this was an unremarkable sound, one most people have probably heard from an old dilapidated piano at some point, and not worthy of repetition in a purely musical context. There were some fascinating bell-like sounds resonating from certain strings that caught my attention positively, but the clunk was always lurking to spoil the moment.


Entering the world of Mr Bungle again was an appealing and frustrating affair, as always. Despite my appreciation of complexity in music on many levels, I just can’t come at the seemingly random compositional approach of Mike Patton. I would describe his method as something in the order of taking many good ideas, throwing them up in the air, and connecting them linearly as they are picked up off the floor. To my ears, it is the sonic equivalent of forcing incompatible pieces of a jigsaw puzzle together. It seems like there’s a coherent picture in there somewhere, and some areas even look stable and appealing, but it lacks the overall structure that I desire from a completed work of art...


Reference:

Stephen Whittington. “Forum – Week 12 – Composition vs. Improvisation.” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 31st May, 2007.

David Harris. “Forum – Week 12 – Composition vs. Improvisation.” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 31st May, 2007.

Wednesday, May 30, 2007

Audio Arts – Week 12 – “Mastering Pt 2”

What, no fire in the sky?


After another round with the MC4, I believe I’ve managed to achieve loudness comparable to the Martin Pullan example of ‘Smoke on the Water’ live. The differences seem to come down to my preference for the ‘smiley face’ EQ sound, which drops out the midrange somewhat, giving a crisper separation between the bottom and top end. I’m relatively satisfied with my kick drum sound, but the snare lacks something in comparison to Martin’s effort.

In attacking the various frequencies with substantially more gain than last week, I suspect there is a little peaking going on here and there. This is often hard to decipher with heavy rock music, as there is desirable distortion incorporated as a deliberate part of the sound. I utilised the MC4’s ability to isolate frequencies, focusing on narrowing the Q a little more than last week. To get that authentic David J Dowling edge, I ended up stacking two Joe Meek equalisers at the insert stage with (I only realised this when I was doctoring the images) exactly the same settings. It feels as if it somehow detracts from my ingenuity to have just added another plug-in rather than fighting what I had for the right sound, but if it’s there, use it, I guess.

The order of plug-ins: would the Joe Meeks have sounded different in the reverse order?
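For what it’s worth, if both stages are behaving as purely linear EQ, the order shouldn’t matter at all: linear filters in series commute, and two identical stages simply double the dB gain at every frequency. Any audible difference from swapping them would have to come from non-linear (saturation) behaviour in the plug-in model. A quick Python sketch of the linear case, using textbook RBJ-cookbook peaking filters rather than anything resembling the actual Joe Meek circuit:

import numpy as np
from scipy import signal

def peaking_eq(f0, gain_db, q, fs=44100.0):
    # RBJ-cookbook peaking biquad; returns (b, a) filter coefficients.
    A = 10 ** (gain_db / 40.0)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

b1, a1 = peaking_eq(100.0, 4.0, 1.0)   # low-end boost
b2, a2 = peaking_eq(8000.0, 3.0, 0.7)  # top-end boost

# Cascade the two stages in both orders (polynomial multiplication of
# the coefficients) and compare the overall frequency responses.
w, h_ab = signal.freqz(np.convolve(b1, b2), np.convolve(a1, a2))
w, h_ba = signal.freqz(np.convolve(b2, b1), np.convolve(a2, a1))
print(np.allclose(h_ab, h_ba))  # True: the linear stages commute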


At the end of the day, I think my result sounds like it has my personal stamp (whether it’s particularly unique or desirable to anyone but me, I don’t know), and Martin’s sounds like it’s supposed to, based on my cultural conditioning as to what commercial music is meant to achieve sonically.


Click here to link to online folder containing the Martin Pullan example and my example.


Reference:

David Grice. “Audio Arts – Week 12 – Mastering Pt 2.” Tutorial presented at Studio 1, Level 5, Schultz building, University of Adelaide, 29th May, 2007.

Friday, May 25, 2007

Audio Arts – Week 11 – “Mastering”


I must say, the MC4 multiband compressor is very impressive. If the power to isolate and manipulate the level and onset of individual frequencies in a soundfile is the key focus of good mastering equipment, then the MC4 certainly seems up to the task.

Interestingly enough, it’s only after applying such a tool to a piece of work that I can really appreciate its value. I think people like me, who work more by the numbers than intuitively, need a very obvious reference point to get an idea of the difference their work has made on a given subject. When listening back to the original soundfile (which I have conveniently put in the online folder so you can hear the difference yourself) after adding the big bottom, MC4, and Joe Meek EQ, I could really hear the muddiness and excessive boom in the bottom end.

I think the fundamental change that has been made to this file is ‘cleanliness’. After really isolating and focussing on the individual elements, I think most of the important sonic events can now be easily interpreted in their own right. As a self-criticism of my novice ability in the field of mastering, I think the end result may sound a little too bright, as the top end tends to bite pretty hard. This can be attributed in part to the settings I chose on the Joe ‘Meekqualiser’, which is a sensitive little unit to say the least.
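As a side note, the basic principle behind a multiband unit like the MC4 is simple enough to sketch, even though Digidesign’s actual algorithm is obviously far more involved: split the signal into bands with crossover filters, change each band’s level independently, then sum the bands back together. A minimal Python illustration (crossover points and gains are invented; a real multiband compressor would replace the static gains with per-band level detection):

import numpy as np
from scipy import signal

fs = 44100
t = np.arange(fs) / fs
# Test signal with energy in three distinct regions: low, mid, high.
x = np.sin(2*np.pi*80*t) + 0.5*np.sin(2*np.pi*1000*t) + 0.25*np.sin(2*np.pi*9000*t)

edges = [200.0, 2000.0]  # two crossover points -> three bands
sos_low = signal.butter(4, edges[0], 'lowpass', fs=fs, output='sos')
sos_mid = signal.butter(4, edges, 'bandpass', fs=fs, output='sos')
sos_high = signal.butter(4, edges[1], 'highpass', fs=fs, output='sos')
bands = [signal.sosfilt(s, x) for s in (sos_low, sos_mid, sos_high)]

# Independent per-band gains in dB; these happen to draw a 'smiley face'.
gains_db = [3.0, -4.0, 3.0]
y = sum(band * 10 ** (g / 20.0) for band, g in zip(bands, gains_db))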

The order of plug-ins used at the insert stage...

By no means have I explored the entire plethora of possibilities offered by the Pro Tools mastering plug-ins, but complicated units like Auto-Tune would need a week of focus to themselves to be fully examined.


Click here to link to online folder containing the original and the mastered file.


Reference:

David Grice. “Audio Arts – Week 11 – Mastering.” Tutorial presented at Studio 2, Level 5, Schultz building, University of Adelaide, 22nd May, 2007.

Thursday, May 24, 2007

Creative Computing – Week 11 – “Distributed Performance”


Dave's Patch.

Luke, Dave and the Fat 32:

Once again Max has succeeded in disrupting my growing confidence by forcing me to spend 1½ hours creating one component of my contribution to this exercise. A few little gripes with organising the reception of MIDI data from the Novation created a bit of a marathon session, but we got there in the end (after moving to a different workstation).
Luke’s patch.

The motivation behind the construction of my patch is simple: create as many pathways for sending MIDI CC data as there are control devices to utilise on the Novation keyboard. I ended up mapping 14 controllers in all to the filter and amplitude ADSR controls of the A1 synthesiser in Cubase on Luke’s machine. I also mapped CCs to the filter resonance and cut-off area to maximise the filter’s influence on the sonic result. In the audio example posted I discarded the amplitude controls in favour of the flanger, which has a more obvious characteristic to manipulate.

The only real patch-building issue with my system was working out how to route data from the relevant controllers and create a list in the correct order for output to Luke’s system. Once the problem was sorted, I simply had to duplicate the controller number selection component 13 times to give me the correct quantity to work with. There was also an issue with the first ctlin object not recognising the pitch wheel, but adding a second one fixed that.
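The routing idea itself is straightforward enough to sketch outside of Max. This is not our patch, just a hedged Python equivalent using the mido MIDI library: each physical control gets its own CC number, and its current value is forwarded as a control_change message. The CC assignments here are invented for illustration; the real mappings into the A1’s filter and ADSR live on the Cubase side.

import mido

# Hypothetical mapping: 14 knobs/sliders -> consecutive CC numbers.
CONTROL_MAP = {knob: 70 + knob for knob in range(14)}

def forward(knob_index, value, port):
    # Send one control's current value (0-127) on its assigned CC number.
    cc = CONTROL_MAP[knob_index]
    port.send(mido.Message('control_change', control=cc, value=value))

# Usage: out = mido.open_output(); forward(0, 64, out)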


Click here to link to online folder containing patches and an audio mixdown example.


Reference:

Christian Haines. “Creative Computing – Week 11 – Distributed Performance.” Tutorial presented at tutorial room 408, Level 4, Schultz building, University of Adelaide, 24th May, 2007.

Forum – Week 11 – “Construction and Deconstruction”

I'm so gonna gunnify you down holy man...

Simon Whitelock:

Simon approached the topic from the interesting standpoint of ‘to build or destroy’ in relation to the ‘jacking’ (Whitelock 2007) of samples by modern artists for the purpose of creating or colouring their works. The delivery was certainly entertaining, as he constantly made adjustments to EQ and volume levels on the mixing desk in true DJ fashion. This routine accompanied the continual barrage of ‘then and now’ audio examples referenced throughout the presentation.

I think the moral point that Simon was trying to make with some of his references (especially the Daft Punk vs Breakwater example) was that he believed the updated or ‘jacked’ version was not created in the spirit of the original. However, when probed for a commitment as to whether or not he believed that jacking was ‘destroying’ the original, or to put it more accurately ‘how’ it was destroying the original, he back-pedalled into the neutral corner. I found it a little disappointing that, when asked one carefully worded question regarding his view, he would alter his position so quickly, especially after displaying such confidence throughout his talk. That being said, it was an amusing and obviously crowd-pleasing presentation, but isn’t that what good DJs do best?


Nathan Shea:

I was certainly surprised by the amount of work that went into recording the black metal album played by Nathan early on. I mean, to get a lo-fi sound, does one really have to go to so much trouble? Unless, of course, one is going for that pure analogue warmth, the kind that can only be sourced from the circuitry of a K-mart cassette recorder? Hmm, I smell a false sense of regressive artistic genius. I’m not an ignorant newcomer to the genre of black metal, as a certain unnamed family member has played numerous examples at me on many occasions. My impression of your average Norwegian loser, who wasn’t dealt the best of hands in life (looks, family, intelligence etc.), belting out this ridiculous excuse for aggressive music from his cold, dark, one-bedroom high-rise dungeon hasn’t changed. The contrast Nathan attempted to draw between black metal and drum and bass seemed a little vague, but I did appreciate some smooth beats after such a repulsive onslaught.


John Delany:

The world of Lustmord is a dark and mysterious one. I have talked about this artist to Mr Delany on many occasions and find his approach to be abstract yet accessible. The third example in particular highlighted the exceptional control that Lustmord has over his craft. Silky, deep sonic undertones provided the perfect vehicle for stabs of velvety yet brutal sub bass impact. We should all be very afraid…


Reference:

Stephen Whittington. “Construction and Deconstruction.” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 24th May, 2007.

John Delany. “Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 24th May, 2007.

Simon Whitelock. “Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 24th May, 2007.

Nathan Shea. “Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 24th May, 2007.

Tuesday, May 22, 2007

Creative Computing – Week 10 – “Interface Design”

Note the half-visible scrolling message?


Nuclear Synthesiser 1.0:

In the interest of dealing with my lack of sanity and sleep deprivation, I have decided to stop tweaking the patch here for now. For what it is, I don’t think the GUI needed a huge amount of work this week, as it was already quite functional. I have cleaned up some of the annoying text messages and replaced them with more informative pop-up help windows. There is some modest animation in the form of a visual keyboard retraction program, which ties in with a relevant scrolling message that is activated whenever the user clicks the button for keyboard display.

A jsui JavaScript dial object has been included as an indicator, giving a visual representation of the clock’s current position in its count through coll’s items. The values fed into the jsui dial had to be mapped to between 0 and 1 (the jsui dial’s range) with a zmap object to enable use in this way.
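In Python terms, all zmap is doing here is a linear rescale with clipping; something like this (the 16-step count is hypothetical):

def zmap(x, in_lo, in_hi, out_lo, out_hi):
    # Linear range mapping, clipped to the output range, like Max's zmap.
    y = out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)
    return max(min(y, max(out_lo, out_hi)), min(out_lo, out_hi))

print(zmap(7, 0, 15, 0.0, 1.0))  # step 7 of 16 -> dial position ~0.467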

Menu items are restricted to the addition of just one this week, which comes up as the first item in the Max menu as “The Nuclear Synthesiser” when menu items are activated by clicking the yellow ‘open’ button in the middle of the GUI. Time has restricted my level of sophistication here, as the tutorials proved more time-consuming and difficult than expected.

If all goes to plan the patch should be bug free – well it was when I left it…


Click here to link to online folder containing zip file of patch and associated files.


Reference:

Christian Haines. “Creative Computing – Week 10 – Interface Design.” Tutorial presented at tutorial room 408, Level 4, Schultz building, University of Adelaide, 17th May, 2007.

Thursday, May 17, 2007

Forum – Week 10 – “Construction and Deconstruction”

Oh well, back to the old drawing board...


Frederick May:

Freddy gave an interesting and entertaining presentation citing the ‘ingredients of song construction’, namely the ingredients of successful pop music, as his relevant subtopic. Those of us who dream of mainstream success would love to imagine an arcane formula, one that would provide clear access to that top 0.000000000001 percent of artists who enjoy a long and successful career in the commercial music industry. How Freddy can argue that such a formula exists without knowing what the seven magic ingredients are (as he quoted from the great sound engineer Michael Stavrou) is a difficult question without, I believe, a convincing answer. I’m certainly not trying to discredit the concept, but apart from the unreferenced formula of pitch, positive lyrics, tonality, and body resonance (which apparently only holds true for happy-go-lucky pop songs), nothing was presented to back up the idea apart from a few varying musical examples that made no. 1, all of which apparently contain ‘the seven mysterious ingredients’.

I think there was a valuable point missed in Freddy’s also unconvincing argument that one could fall asleep listening to the metal band Slipknot unless the visual impact of their live video was presented at the same time. I have been listening to heavy metal music for at least twenty years, starting in the eighties when the genre received very little airplay in Australia – especially on television – and I cannot think of one situation where video footage has had a profound influence on my decision to purchase an album. When I first ‘heard’ Metallica’s Master of Puppets back in 1987, I embraced it as the music I had wanted to ‘hear’ for my entire life. I had been into punk music for a long time preceding this and was overwhelmed by Metallica’s ability to express themselves so aggressively with such astounding musicianship, which goes against the easily accessible three-chord grunge ideal of punk rock. Then and now it gives me all those stimulating feelings that Freddy pointed out in relation to the Slipknot video: excitement, aggression, happiness, a raised heart rate, etc. The point I’m trying to make is that I loved their music for years without ever seeing them, and despite the initial short-lived excitement at the novelty of my first viewing of the film clip for the song ‘One’ in 1989, it has never been about the visual aspect.

As for the ‘valuable’ point I promised to mention at the start of the above paragraph: the reason I think one may make a connection between the importance of visual impact and the consumption of Slipknot’s music is the fact that their craft lacks a certain level of sophistication. Personally, I think it’s no accident or pure musical influence on the senses that they dress up in their scary suits and jump around like maniacs during a performance. Like many of the so-called ‘nu-metal’ bands of the mid-to-late nineties and early this century, they realised they were not going to pull ‘metal’ crowds on the strength of their musicianship alone, so other factors of their performance were exaggerated to compensate.

Dragos Nastasie gave us the breakdown of some layering concepts of techno, new age, and industrial rock. I found the point regarding contextual change in a given sound or melody to be the most useful.

Matt Mazzone proceeded with a disastrously bugged presentation (why do so many people in a music tech course appear overwhelmed when faced with one computer, a CD player, and a less-than-daunting ten-channel mixer?) of some advertising music he had written for the Media Resource Centre. The main example was pleasant enough sounding music, non-intrusive and unchallenging, an approach I wish more advertising companies would adopt.


Reference:

Stephen Whittington. “Forum – Week 10 – Construction and Deconstruction.” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 17th May, 2007.

Frederick May. “Forum – Week 10 – Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 17th May, 2007.

Dragos Nastasie. “Forum – Week 10 – Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 17th May, 2007.

Matt Mazzone. “Forum – Week 10 – Construction and Deconstruction.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 17th May, 2007.

Wednesday, May 16, 2007

Audio Arts – Week 10 – “Mixing Pt 3”

Not the EMU Cort... NOOOO!!!!

The nature of this week’s exercise didn’t exactly call for an intensive mixing session, but I have done a little tweaking to the final result regardless. I recorded the in-house EMU electric guitar played through the in-house Laney amplifier. Not my rig of choice, but a worthy setup for the task.

I close-miked the cabinet with an SM57 in a baffled four-metre-square area in front of the control room of Studio 1. For some ‘space’, I placed a U87 nearby at about 45 degrees off axis to the speaker, set it to omni, and positioned it about 2.5 m off the floor. I then cranked the amp to 11 and had a quick sound check.

After spending too much time (that’s what I get for listening to Ben’s advice) working out the best way to play the axe from inside the control room, and setting up Pro Tools to receive from the Avalon, things were ready to roll. I downloaded a backing track to play to, as I felt it would give a better impression of the instrument in context. The initial signal was terrible, as the Laney’s speaker has issues with being cranked past around 60%. I begrudgingly backed it off and ripped through two five-minute takes of bluesy improvisation.

The first take was recorded through the Avalon preamp. The second went direct to the Control 24. Then both takes were individually routed to the Avalon and re-recorded, to examine the additional ‘warmth’ or ‘character’ that the Avalon may provide. All up, there are four one-minute tracks at the link below that demonstrate these approaches.

Personally, I don’t know if I’m sold on this high-end (not to mention high-priced!) analogue gear. I have never really trusted my own ears when it comes to this end of the sound game, however, so maybe I have good reason. I did add some subtle EQ, compression, reverb and a stereo delay, but they don’t colour the original signal too much, so those in the know should still be able to differentiate.

These are the plug-ins that were applied universally to all tracks via an auxiliary channel.


Please comment if you feel you can tell the difference, as I’d love to get some second opinions.


Click here to link to online folder containing MP3’s of all mix-downs.


Reference:

David Grice. “Audio Arts – Week 10 – Mixing Pt 3.” Tutorial presented at Studio 1, Electronic Music Unit, Level 5, Schultz building, University of Adelaide, 15th May, 2007.

Tuesday, May 15, 2007

Creative Computing – Week 9 – “Data Management and Application Control”


The main user interface.

After a wise decision to copy a rough draft of Christian’s ‘heads up’ this week, organising the data flow was relatively straightforward, if a little time-consuming. I produced a basic patch with the suggested timer providing note duration values, which are packed along with pitch and velocity before being stored in a coll. I attempted to create dual input functionality, which would have allowed the user to input notes with either the controller keyboard or the kslider, but this was abandoned in favour of straight controller keyboard input.

I have once again gone for the metro-driving-a-counter option for playback and overdub recording functionality, as it allows for the manipulation of playback length. The patch is set up to trigger the coll with a ‘length’ message whenever it receives a new list (note information) from pack. This message is read by coll, which sends its length (the number of items inside) out its left outlet. The number is routed to the ‘max count’ input of counter to keep its wraparound length consistent with the coll. The sending of this message has also been added to other objects / functions that influence the total items in the coll, such as the ‘erase note’ function, which allows the user to choose a list number to erase from coll’s current group of items.

The output of coll passes through a route object with the argument ‘list’. This allows route to filter out items that are not in list format, therefore blocking the individual numbers sent out when the length message is fed to coll.
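For anyone who doesn’t read Max patches, here is a rough Python sketch of the same storage and playback logic: notes live in a list of [pitch, velocity, duration] lists (standing in for the coll), the length is re-checked whenever the store changes so the counter wraps at the right point, and non-list items are filtered out the way route filters coll’s length output.

notes = []  # plays the role of the coll

def add_note(pitch, velocity, duration):
    notes.append([pitch, velocity, duration])
    return len(notes)  # like coll answering a 'length' message

def erase_note(index):
    del notes[index]
    return len(notes)  # counter's max count must track this

def playback(ticks):
    # Each metro tick advances a counter that wraps at the coll length.
    for tick in range(ticks):
        item = notes[tick % len(notes)]
        if isinstance(item, list):  # 'route list' lets only lists through
            pitch, velocity, duration = item
            print(f"note {pitch} vel {velocity} for {duration} ms")

add_note(60, 100, 250); add_note(64, 90, 500); add_note(67, 80, 250)
playback(6)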

The logic function control window.

My patch of logic functions follows some simple procedures, such as routing data to particular places if it is above or below a certain threshold. The idea is to create sweeping functions that control MIDI data faders to add interesting and moderately unpredictable timbres. As this is my first real attempt at artistic use of the expr, if, and zmap objects, it is not a particularly sophisticated result, although it does sound interesting.
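A bare-bones sketch of that kind of threshold logic, in Python rather than expr/if boxes (the destinations and the 64 threshold are arbitrary examples, not my actual settings):

def scale(x, in_lo, in_hi, out_lo, out_hi):
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def route_by_threshold(value, threshold=64):
    # Values above the threshold drive one fader, the rest drive another,
    # each rescaled to the full 0-127 MIDI range on the way through.
    if value > threshold:
        return ('filter_cutoff', scale(value, threshold + 1, 127, 0, 127))
    return ('resonance', scale(value, 0, threshold, 0, 127))

for v in (10, 64, 100):
    print(v, route_by_threshold(v))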


This is the A-1 Cubase synth used for the recording.

Click here to link to online folder containing the patch and an audio MP3 of this week’s masterpiece.


Reference:

Christian Haines. “Creative Computing – Week 9 – Data Management and Application Control.” Tutorial presented at tutorial room 408, Level 4, Schultz building, University of Adelaide, 10th May, 2007.

Thursday, May 10, 2007

Audio Arts – Week 9 – “Mixing Pt 2”



Contrasting styles from one set of audio files:

I must say, I do love this kind of exercise. There is nothing like taking a little artistic licence to spice up the average pop song. To demonstrate that I’m not all about decimation, I have included a ‘bare bones mix’ in which I have omitted all instruments and notes that I did not deem necessary to convey the vocal line and emotional impact of the song. This reduced the drum line to a laid-back ballad rhythm that should really contain a rim-shot instead of a snare. The bass has been reduced to mostly single sustained notes, aided in their longevity by a delay line. Only one of the harmonic acoustic lines was used, and the vocal reverb has been enhanced significantly.

For the ‘hyper mix’ I upped the ante with the drum tempo, but in such a way as to let it sit under the vocals as they were in the previous mix. All this extra drum activity needed a bit of competition from the other instruments, so fuzz was added to the bass for some punch, and the electric guitars replaced the acoustics, suitably smeared with some Amplitube dirt. I opted for a little crunch on the vocal line with this one as well, as I think it helps it cut through the mix.


Now for the fun part: the ‘death mix’. This one is not for the faint-hearted, with double kick drumming, aggressive snare rolls, pitch-shifted bass, electric guitar and vocals, over-the-top distortion, and a healthy dose of time stretching to tie it all together in a loose-fitting minute of mayhem. I guess I wanted to give myself a challenge and see just how evil I could get with this innocent little pop tune. If anyone wants to take a guess at the time signature, good luck…



Click here to link to online folder containing MP3s of all mixes.


Reference:

David Grice. “Audio Arts – Week 9 – Mixing Pt 2.” Tutorial presented at Studio 1, Level 5, Schultz building, University of Adelaide, 7th May, 2007.

Forum – Week 8 – “Tristan Louth-Robins – Master’s Project”

"You know Alvin, folding your arms indicates a degree of social tension..."


Alvin Lucier and the concept of focused listening:

Not being quite the ignoramus that I once was, I had actually attended Tristan’s performance of Lucier’s “I am sitting in a room” at last year’s Festival and was suitably spellbound by its hypnotic nature. Hearing a little more, in conjunction with Tristan’s clear-cut explanation, was educational the second time round. It changed my sonic perception once I developed an understanding of how the end result came to be derived. I had previously wondered why people chose to keep performing this same piece, but the phrase articulated by Lucier himself, “different from the room you are in now”, is justification enough, for a different result will be achieved in a different space. Perhaps someone may one day (perhaps someone already has) try performing the piece using a similar room to that utilised in Lucier’s recording, but speaking different words and changing the dynamic articulation. I have yet to hear the experiment performed with the human voice using anything other than a dry, monotonic delivery of the original text.
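The mechanism is easy to caricature in code: each generation is the previous recording played back into, and therefore filtered by, the same room, so the room’s resonances compound pass after pass. A toy numpy sketch, with noise standing in for the spoken text and a completely invented impulse response standing in for the room:

import numpy as np

fs = 8000
rng = np.random.default_rng(0)
speech = rng.standard_normal(fs)  # stand-in for the spoken text

# A crude 'room': a decaying impulse response with a few echoes.
ir = np.zeros(2000)
ir[0] = 1.0
ir[[400, 950, 1600]] = [0.6, 0.4, 0.25]

take = speech
for generation in range(10):  # Lucier used many more passes
    take = np.convolve(take, ir)[:len(speech)]  # re-record through the room
    take /= np.abs(take).max()                  # keep the level sane

# After enough generations, 'take' is dominated by the room's resonances.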

The drum and bass from alpha brain waves seemed a suitable pairing, as producing ultra-low frequencies is a desirable characteristic in many a large drum cylinder. I think the component of Tristan’s presentation that sparked my interest the most was the harmonic bowl resonator concept. There were some very guitar-like harmonics extracted from the setup, and I am always looking for new ways to replicate that particular sound. I like the idea of complementing a guitar recording with similar but unique sounds derived from unusual sources, as opposed to my standard procedure of trawling through sound banks and hoping to strike it lucky.

All in all, an informative presentation and a welcome change from the pace of the last few weeks…


Reference:

Stephen Whittington. “Forum – Week 8.” Workshop presented at EMU space, Level 5, Schultz building, University of Adelaide, 10th May, 2007.

Tristan Louth-Robins. “Master of Music Project.” Student talk presented at EMU space, Level 5, Schultz building, University of Adelaide, 10th May, 2007.

Tuesday, May 08, 2007

Creative Computing – Week 8 – “MIDI Information and Virtual Instrumentation”


The art of sonic decimation...


For this masterpiece I have incorporated a combination of random and sequential number-generating devices for the manipulation of mostly user-selectable controllers in Reason. The new device has been integrated into my random note generator from week seven. The first object tackled was xbendout, which I quickly discovered requires some restraining to prevent MIDI overflow situations. I have included the option of either xbend or standard pitch-bend control, or both at once, for the user.

Next was the modulation wheel controller. After foolishly creating eight different objects and making only subtle variations to their functionality based on controller representation, I realised I was wasting time and changed to a more generic MIDI approach. Sourcing a MIDI controller device list from the Internet was easy, and I dumped its contents into a ubumenu. This is simply routed to the ctlout object and allows the user to select which controller they wish to use. Of course it’s not identical to Reason’s list, but it’s a reasonable substitute for some sort of visual guide.

This approach meant that I could now duplicate the device as many times as was necessary, and the onus would then be on the user to select the controller to route it to. This would have been a much smoother transition had I not used so many receive objects initially. They are now on my ‘to be avoided unless absolutely necessary’ list. This has left the object patches a little cluttered at this stage, but I can work on cleaning them up later.

The data output automation from the control station can be either sequential or random (individually selectable for each controller), and as a counter is used in conjunction with a metro for streaming the numbers, the range of numbers that are output can be easily dictated.
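Stripped of the Max plumbing, each controller lane boils down to something like this Python sketch (the ranges and modes shown are examples only):

import random

class ControllerLane:
    def __init__(self, lo=0, hi=127, mode='sequential'):
        self.lo, self.hi, self.mode = lo, hi, mode
        self.step = lo

    def tick(self):
        # Called on every metro tick: either the next counter step
        # (wrapping at the top of the range) or a random value.
        if self.mode == 'random':
            return random.randint(self.lo, self.hi)
        value = self.step
        self.step = self.lo if self.step >= self.hi else self.step + 1
        return value

sweep = ControllerLane(20, 100, 'sequential')
jitter = ControllerLane(0, 127, 'random')
print([sweep.tick() for _ in range(5)], jitter.tick())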

User notes:

If no controller data is being received by Reason, you may need to manually click on the bottom ubumenu for the controller that you are using. I have implemented a global initialisation feature to deal with this, but it’s having a problem communicating with objects inside a bpatcher.

Below is a link to an MP3 of my newfound music tech virtuosity.


Click here to link to online folder containing a zip file of the patch and an MP3 of its musical result.


Reference:

Christian Haines. “Creative Computing – Week 8 – MIDI Information and Virtual Instrumentation.” Tutorial presented at tutorial room 408, Level 4, Schultz building, University of Adelaide, 4th May, 2007.

Friday, May 04, 2007

Audio Arts – Week 8 – “Mixing Pt 1”

Even Guy gets intimidated by this crowd...

After some serious splicing, fading, stretching and other general audio manipulation tasks, I think I have manufactured a semi-reasonable vocal line from the atonally corrupted stream that we were given to work with (meaning the guitar in the vocal track, not the voice).

The main issue was the onset of heavily strummed guitar chords, which were out of tune and sounded clunky against other instruments in the mix. The only band-aid solution I could muster was to apply fades to the beginning of affected regions in an attempt to remove the problem. While this worked to some extent, it has the unfortunate side effect of making the onset of, and transition between, some vocal regions sound artificial. If one can ignore these quirks, however, it works in the context of the rhythm section and harmony lines.
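For the curious, the fix amounts to nothing more exotic than this numpy sketch: ramp the first few milliseconds of a region up from silence so the clunky onset never sounds at full level. The 10 ms fade length is a guess, not what I actually used on each region.

import numpy as np

def fade_in(region, fade_samples):
    # Ramp the first fade_samples of the region from silence to full level.
    out = region.astype(float)
    n = min(fade_samples, len(out))
    out[:n] *= np.linspace(0.0, 1.0, n)
    return out

# e.g. a 10 ms fade at 44.1 kHz: fade_in(region, int(0.010 * 44100))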

I started the instrumental mix by tweaking the kick and snare with a little redundant-frequency cutting to tighten things up a bit. The same was performed for the cymbals. I then called upon my session management skills and routed the drums to an auxiliary track for some global reverb and volume control over the percussion.


The bass was rolled back with a simple one-band EQ at around 1 kHz to lose some of those muddy mids. I found that I liked the same EQ setting applied to the acoustic guitars as well, so maybe I have an aversion to midrange frequencies. The acoustics received a generous amount of Bomb Factory compression as well.

For the electrics I played around with a couple of different Amplitube settings for the various rhythm and melody lines available. In the end I used a global Amplitube plug-in on the electric guitars’ auxiliary bus, with only one electric guitar track using an alternative, as that tended to highlight its bright melody line. A little reverb and compression were then applied to the group globally.

There was no need for routing with the vocals, as there is only one line, but EQ, reverb, compression, and de-essing were utilised to tweak my final arrangement and make one last attempt at masking that pesky background guitar.

Pro Tools has never crashed on my PC. Ever. Hmm...

Below is an MP3 of the final result, and the EMU drop box will contain my full Pro Tools session.

Click here to link to online folder containing MP3s of the final mixes (variations are in the inclusion / exclusion of electric or acoustic guitars and different pan settings).


Reference:

David Grice. “Audio Arts – Week 8 – Mixing Pt 1.” Tutorial presented at Studio 1, Level 5, Schultz building, University of Adelaide, 1st May, 2007.

Thursday, May 03, 2007

Forum – Week 8 – “Gender in music technology, can you tell the difference? (Pt 2)”


Bradley Leffler made some interesting comparisons as our first presenter today. Personally, I would argue that pop music (if we are talking about the generic boy band / girl band / boy-and-girl band formulas that we see on VH1 today) is in the most mainstream of cases marketed equally at anyone who craves instantly catchy, throwaway music that will satisfy them only until the next big hit is on the airwaves.

As for metal, well, everyone who doesn’t accept that metal is the future of all music is in denial, end of.

Perhaps some differences between a performance of John Cage’s 4’33” by a man as opposed to a woman would be the choice of setting and the gender mix of the attending audiences. As much as we all love to poke fun at this piece, I think it’s arguable that these factors would make the recital different on at least some level.

I don’t necessarily agree that Kraftwerk are ‘dehumanising’ music. If anything, to me they are doing their best to humanise musical sound created with electronic music technology. After all, they present their music in a human-controlled live performance, as if they were a regular four-piece boy band, the difference being that they dance around playing their electronic devices instead of vocally crooning. It may be electronic circuitry that is generating the tones, but it is still being controlled, selected and manipulated by people via various interfaces (such as the pocket calculator).


Laura Gadd made a brief foray into the territory of Amy’s chosen angle from last week. I found her comparison between Pink and Eminem a little odd. While I’m no expert on the singer, as far as pop stars go, everything I have heard about Pink strikes me as atypical of entertainers in the field, as she seems to be an introspective, thoughtful and intelligent person. Eminem, about whom I have heard much more, strikes me by contrast as a violent, homophobic, racist, misogynistic, white-trash and generally awful human being. His style certainly does not represent the statistical norm when it comes to male lyricism, but perhaps in the seedy world of gangsta hip-hop his work is the standard that many aspire to. Examining the lyrical content of two similar artists (male and female respectively) may have exposed something worthy of analysis, but this one just seemed like comparing apples to oranges.


Ben Cakebread’s insight into Queen’s successful marketing to the glam and art-rock audiences of the mid-’70s was certainly interesting and entertaining, but I’m not sure where the gender-in-music-technology angle was prevalent. Queen are certainly known for their willingness to embrace technology, with increasingly sophisticated studio and stage production, so maybe it would have been interesting to find out whether they had worked with any women in the field over the course of their career.

So, how did Queen appeal to a “cock rock” audience? Brian May – pure genius.


Peter Kelly had many a random angle to explore on the issue, but I found the delivery a little too complex and dry to really comment on at this stage…


Reference:

Stephen Whittington. “Music Technology Forum – Week 8 – Gender in Music Technology, can you tell the difference?” Workshop presented at EMU Space, Level 5, Schultz building, University of Adelaide, 3rd May, 2007.

Bradley Leffler. “Music Technology Forum – Week 8 – Gender in Music Technology, can you tell the difference?” Student Talk presented at EMU Space, Level 5, Schultz building, University of Adelaide, 3rd May, 2007.

Laura Gadd. “Music Technology Forum – Week 8 – Gender in Music Technology, can you tell the difference?” Student Talk presented at EMU Space, Level 5, Schultz building, University of Adelaide, 3rd May, 2007.

Ben Cakebread. “Music Technology Forum – Week 8 – Gender in Music Technology, can you tell the difference?” Student Talk presented at EMU Space, Level 5, Schultz building, University of Adelaide, 3rd May, 2007.

Peter Kelly. “Music Technology Forum – Week 8 – Gender in Music Technology, can you tell the difference?” Student Talk presented at EMU Space, Level 5, Schultz building, University of Adelaide, 3rd May, 2007.

Tuesday, May 01, 2007

Creative Computing – Week 7 – “Messaging and Analysis”


The main GUI window.

As you can see, the multislider has indeed been utilised as the main GUI element for control in this patch. The tall vertical multislider object, with 12 sliders moving horizontally between minimum and maximum values, controls pitch probability, with some familiar dials and message boxes to select which octave the respective pitches will come from. Two other multisliders have been embedded in bpatcher objects to save space; they control the probability of the varying velocity and duration values of the said notes. The difference between these two sliders and the pitch controller is that the velocity and duration sliders retain the full range of MIDI values (0–127), whereas the pitch controller has only 12 sliders, so it can only control data for 12 notes at any given time.

I understood the tutorial explanation of sending a bang to the left input of a table object straight away, but when I tried to dig deeper and get my head around the finer details of this ‘quantile’ command, I found the reference manual’s explanation a bit much for my current stage of development. Nevertheless, I know what it does and where I can use it, so do I really need to know the complex maths behind it?

The tempo object was straightforward, and is used to send repeating bangs to the three table objects, which in turn send out an address via quantile-generated probability; this is converted into a MIDI note, velocity or duration, depending on which table it is.
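In plain terms, quantile is doing weighted random selection: the table’s stored values act as weights, and the returned address is picked with probability proportional to its value. A small Python equivalent (the weights below are made up, as if read off the 12 pitch sliders):

import random

def quantile_pick(table):
    # Return an index into table, weighted by the values stored there.
    target = random.uniform(0.0, sum(table))
    running = 0.0
    for address, weight in enumerate(table):
        running += weight
        if running >= target and weight > 0:
            return address
    return len(table) - 1

pitch_weights = [0, 5, 0, 2, 0, 8, 0, 3, 0, 1, 0, 4]
note = 60 + quantile_pick(pitch_weights)  # address -> MIDI pitch
print(note)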


I'm sure there is an easier way, but why forgo 16 amusing hours of pain...

Only pitch info is sent to a histo / table combination at this stage, so only the pitch history can be examined. I have set the patch up so the histo object clears itself when one note reaches 127 hits, to keep the visuals interesting.

Whilst I understood the clocker and iter objects’ functionality in the tutorials, I haven’t yet found a need for them in this patch, but time will tell.

All files are embedded in the patch at this stage, so there should be no need to set up extra pathways.


Click here for a link to the online folder containing a text file of the patch.


Reference:

Christian Haines. “Creative Computing – Week 7 – Messaging and Analysis.” Tutorial presented at tutorial room 408, Level 4, Schultz building, University of Adelaide, 26th April, 2007.