Forum - Week 6 – Physical Computing:
A sign of things to come for me I'm afraid..
The underlying potential of this week’s material promises much. Unfortunately my attempts to engage the technology were marred by two separate chip failures, the second one just before I could record my proven square wave circuit. The result of this hiccup is that I couldn’t really make much sound this week, not a good sign for an aspiring musician (unless you’re Cage).
This little random player idea spat out some interesting results before the second 4093 died on me..
Still determined to salvage something concrete from the exercises, I moved on to the square wave controlled note player. Lo and behold, this required another square wave circuit to be made using the dodgy 4093 chip. Suffice it to say, I couldn’t get there without working hardware, so I made a rough copy of the circuit following the ‘IMHOTEP’ instructions, so that at least I understand its functionality.
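For future me: as far as I understand it, the square wave circuit is a classic Schmitt-trigger relaxation oscillator, one 4093 NAND gate with a feedback resistor and a timing cap. A rough Python sketch of the frequency estimate; the 1.2 factor and the component values are assumptions for illustration, not measurements:

```python
# Rough frequency estimate for a 4093 NAND Schmitt-trigger
# relaxation oscillator (one gate, feedback resistor R, timing
# capacitor C). The constant k ~ 1.2 is only an approximation;
# the true value depends on the chip's hysteresis thresholds
# and the supply voltage.
def osc_freq(r_ohms, c_farads, k=1.2):
    return 1.0 / (k * r_ohms * c_farads)

# e.g. a 100k resistor with a 10 nF cap:
print(round(osc_freq(100e3, 10e-9)))  # ~833 Hz with these assumed values
```

Swapping the fixed resistor for a pot is what turns this into something playable, since frequency scales inversely with R.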
And the good times just kept on rolling..
Moving right along, I came to the ‘HORUS’ voltage to MIDI control patch and circuit. As this one did not require the 4093, I managed to get it up and running. There is something organic about the way the visual sliders onscreen respond to the analogue voltage control you are feeding them. I don’t quite understand why, when only using one analogue input on the Arduino, shifting the wire to input 2, 3, 4… etc. seems to retain control over the last chosen port as well as the new one. To take this discovery to the extreme, I placed it in every analogue input for a short time and sure enough I had one pot controlling eight visual sliders on screen. Some shoddy Max programming perhaps?
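A thought on the ‘ghost’ control: it may not be Max’s fault at all. Unconnected analogue pins float, and an AVR’s ADC shares one sample-and-hold circuit across all inputs, so a disconnected pin tends to read residual charge from the last pin sampled. As for the scaling itself, going from the Arduino’s 10-bit ADC to a 7-bit MIDI value is just a divide-by-eight; a minimal sketch (function name is my own, not anything from the HORUS patch):

```python
# Sketch: scaling a 10-bit Arduino ADC reading (0-1023) down to
# a 7-bit MIDI value (0-127).
def adc_to_midi(adc):
    adc = max(0, min(1023, adc))  # clamp to the ADC's range
    return adc >> 3               # 1024 steps / 8 = 128 steps

print(adc_to_midi(0), adc_to_midi(512), adc_to_midi(1023))  # 0 64 127
```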
Total control, mwahaha!
At least there was some success at the end of another frustrating day of defeat at the hands of technology. There is no sound file this week because, well, there was no sound…
Reference:
Haines, Christian. “Forum – Week 6 – Physical Computing.” Workshop presented at the Audio Lab, Level 4, Schultz Building, University of Adelaide, 30th of August 2007.
Tomczak, Sebastian. “Forum – Week 6 – Physical Computing.” Workshop presented at the Audio Lab, Level 4, Schultz Building, University of Adelaide, 30th of August 2007.
Audio Arts - Week 6 - Sound Asset Design:
Finally, some definition.. Despite my initial reservations about FMOD, I found this to be one of the simplest, most straightforward and constructive tutorials I have experienced to date. I thought I was back at TAFE for a minute…
The only thing that let me down this week was the engine designer thingy with the four sliders for RPM etc; it didn’t seem to affect anything. Probably something simple I’m missing. All other sections of the chapter were very helpful, and for the first time this semester I thought to myself that game audio could actually be fun! I shall quash such hedonistic thought patterns in due course.
Why do you promise so much and deliver so little? FMOD does contain some very intuitive features despite its beta-feel clunkiness. I am at a loss as to why all the standard features of today’s software have been omitted (cut and paste, option-drag duplicate, quick undo etc.) but it’s still manageable. The 3D to 2D morphing capabilities are very effective, as is the ability to use real-world analogies like ‘yards’ (I prefer metres myself) and decibels for x-axis and y-axis placement.
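Speaking of real-world units: the distance placement presumably maps onto the inverse-distance (“logarithmic”) rolloff that 3D audio engines such as FMOD typically default to, where volume is full inside a minimum distance and then gain falls as min_dist / distance. A rough Python sketch of that curve (parameter names are mine, not FMOD’s API):

```python
import math

# Inverse-distance rolloff: 0 dB inside min_dist, then gain
# falling as min_dist / distance, i.e. -6 dB per doubling.
def rolloff_db(distance, min_dist=1.0):
    if distance <= min_dist:
        return 0.0  # no attenuation inside the minimum distance
    return 20.0 * math.log10(min_dist / distance)

print(round(rolloff_db(2.0), 1))   # -6.0 dB at double the distance
print(round(rolloff_db(10.0), 1))  # -20.0 dB at ten times
```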
The lack of FMOD factory sounds didn’t stop me from using all the features with some trusty EMU samples. Note my far-from-Gatling-gun “kick – H H H H – snare” is not all that, but it helped me to understand the functionality of the cycle tutorial.
The ideal gun for a failed servo job.. Click here to link to online folder containing an MP3 example of this week’s effort.
Reference:
Haines, Christian. “Audio Arts - Week 6 - Sound Asset Design.” Lecture presented at Tutorial Room 408, Level 4, Schultz building, University of Adelaide, 28th of August 2007.
Forum – Week 5 – Circuit Bending II:
The complete circuit.. All things forum seemed determined to bring me down this week. The voltage converter circuit didn’t work, the talking parrot delivered paltry results and my pursuits at home left me with nothing on Sunday evening.
Hardwired for sound.. That changed this morning when I decided to give the toy mobile another crack at hard-wiring some bends. I quickly discovered that multiple levels of resistance and an easy on/off switching system were necessary to capture the low-level signal. I dispensed with the 4.5 volts of internal batteries in favour of a 9-volt battery fed via three resistances (two pots and an LDR).
Many pots.. This still required a delicate touch, but the results came through when I discovered that fine-tuning the pots gave a lot of expressive control to the LDR. After so much disappointment it was rewarding to manipulate that annoying operator voice, creating some almost demonic speech fragments. It is currently a combination of hard wired and breadboard circuit connectivity, but this will be converted to a more robust model over the next week.
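For anyone trying something similar, the chain of pots plus LDR is effectively a variable voltage divider sitting in front of the toy, which is why fine-tuning the pots changes how much expressive range the LDR has. A quick Python sketch of the numbers involved; all component values below are invented for illustration, not measured from my circuit:

```python
# The series chain (two pots plus an LDR) as a voltage divider
# in front of the toy. Values are hypothetical.
def divider_out(v_supply, r_series, r_load):
    return v_supply * r_load / (r_series + r_load)

# Say 10k + 4k7 of pot travel plus an LDR sitting at 8k in bright
# light, feeding a toy that looks like a 2k load:
r_chain = 10e3 + 4.7e3 + 8e3
print(round(divider_out(9.0, r_chain, 2e3), 2))  # about 0.73 V
```

Covering the LDR pushes its resistance up and the delivered voltage down, which is where the ‘demonic speech’ control comes from.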
Click here to link to online folder containing this week’s audio example.
Reference:
Haines, Christian. “Forum – Week 5 – Circuit Bending II.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 23rd of August 2007.
Tomczak, Sebastian. “Forum – Week 5 – Circuit Bending II.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 23rd of August 2007.
Creative Computing - Week 5 - Synthesis (2) - Vibrato, FM and Waveshaping:
I think I’ll add trying to be clever and failing miserably to my list of attributes next time I compose a resume. The task of creating an FM synth was not in itself difficult, but trying to add some useful GUI features to a scenario that uses poly~ was a different story.
A new and visually informative interface...
Poly~ is a very confusing and temperamental object to use. Its arguments and supposed input commands seem to be recognised on an almost random basis. If you examine my patch, I have gone to considerable trouble attempting to alleviate some of the clunky and confusing user issues I had with last week’s approach, by trying to incorporate dynamic GUI features that highlight some of the current conditions in poly that one may want to be aware of. These include: which note is the current target, which notes are muted and which notes have their windows open.
One of the poly voice instances.. These features all operate correctly as far as I can tell, but poly~ refuses to accept input to certain voices for no good reason that I can decipher. Despite containing ten instances of the djd-fmsynth~ patch, it will only play back one at a time, and even then only voice 10 or 1, it seems. I have well and truly run out of time to look into this further, but if anyone has suggestions based on their own success, I’ll be glad to listen…
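For completeness, the synthesis itself is the simple two-operator FM recipe: a carrier whose phase is modulated by a second sinusoid, y(t) = sin(2πfct + I·sin(2πfmt)). A minimal Python sketch; I’m assuming this is roughly what djd-fmsynth~ boils down to, the real patch may well do more:

```python
import math

# Minimal two-operator FM: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)),
# where 'index' (I) controls sideband richness.
def fm_sample(t, fc, fm, index):
    return math.sin(2 * math.pi * fc * t
                    + index * math.sin(2 * math.pi * fm * t))

sr = 44100
samples = [fm_sample(n / sr, fc=440.0, fm=220.0, index=3.0)
           for n in range(1024)]
print(all(-1.0 <= s <= 1.0 for s in samples))  # True: bounded output
```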
Click here to link to online folder containing this week’s disappointing result.
Reference:
Haines, Christian. “Creative Computing – Week 5 – Synthesis (2) - Vibrato, FM and Waveshaping.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 23rd of August 2007.
Audio Arts – Week 5 - Audio Engines Analysis:
I really would have liked to achieve all of the exercise outcomes with FMOD this week, but either its first install has some bugs to overcome or I have a lot to learn. I did read a lot of the manual, though, so I wasn’t going in blind.
I have nevertheless created a collage of some audio manipulated within the program. The record mode seemed to work within FMOD, but even though it created .wav files, nothing would recognise them, so I had to settle for recording into Cubase the old-fashioned way. I had some success getting sounds to spawn irregularly, and implemented some over-the-top random pitch shifting and volume control. For a serious effort one would be much more conservative.
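The irregular spawning and randomisation above can be sketched in a few lines; the ranges below are invented and deliberately extreme, like my collage, and none of the names come from FMOD itself:

```python
import random

# Each playback event gets a random delay, pitch offset (in
# semitones) and volume. All ranges are hypothetical.
def random_event(rng):
    return {
        "delay_s": rng.uniform(0.1, 3.0),
        "pitch_semitones": rng.uniform(-12.0, 12.0),
        "volume_db": rng.uniform(-18.0, 0.0),
    }

rng = random.Random(42)  # seeded so the sketch is repeatable
for _ in range(3):
    print(random_event(rng))
```

A serious version would narrow the pitch range to a semitone or two and the volume to a few dB.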
Trying to merge these files into one folder only brings pain.. The main drawback for me was that FMOD wouldn’t let me drag and drop audio into sound definition categories or event categories without crashing. On top of this, the ‘add audio effect’ feature gave me no response whenever I tried selecting it with a parameter file in the event editor – most frustrating. I have told Christian and Peter about the issues, so hopefully the rest of you don’t suffer the same fate.
The files I used came from the freesound website mentioned in this week’s exercise directive. There is certainly a lot to choose from on the site, and the ‘fighting grunts’ and ‘weapon sounds’ are remarkably close to those in a PlayStation game called ‘Tenchu’. Not all are professionally recorded, however. The sword sound in particular contains a lot of background noise, but you get what you pay for, I guess.
Click here to link to online folder containing this week’s MP3 result.
Reference:
Haines, Christian. “Audio Arts – Week 5 - Audio Engines Analysis.” Lecture presented at Tutorial Room 408, Level 4, Schultz building, University of Adelaide, 21st of August 2007.
Forum – Week 4 – Circuit Bending:
Post-tinker..
Upon carefully selecting a $5 Reject Shop radio for destruction, I thought I had made a wise choice that would give immediate results. This was not the case. The circuit board inside an FM radio is substantially more complex than in your average “Force Space (Engrish)” light saber rip-off.
Pre-tinker..
After spending a considerable amount of time trying different variations of point connecting, with and without resistors and potentiometers, all I achieved were varying degrees of volume attenuation. Not being one to let initial disappointment hold me back, I dampened my hands with a cloth and began some “wet hands” circuit manipulation. At first this seemed to give me random radio stations and occasionally a mix of more than one, an effect one could probably get from a cheap radio without ripping it apart. Then I stumbled on the sweet spot, which allowed for some Theremin-like manipulation of sine tones.
Sweeeet!!
The spot was remarkably small, and only the slightest movement of my finger created drastic fluctuations in frequency. It still managed to feel like an organic form of instrument control, however: kind of like a subtle vibrato on a guitar string that produces anything but subtle results.
Click here to link to online folder containing a sound file of the ‘sweet spot’ in action.
Reference:
Haines, Christian. “Forum – Week 4 – Circuit Bending.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 16th of August 2007.
Tomczak, Sebastian. “Forum – Week 4 – Circuit Bending.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 16th of August 2007.
Creative Computing – Week 4 – “Synthesis (1) - Additive, Tremolo, RM and AM”:
If this looks more complex than it should, it's because it is...
I have obviously spent too much time on this one, but you all know how it is once you start. Luckily the tutes were not too time consuming this week so most of my time was spent having fun designing a new patch and GUI.
I haven’t yet been successful in integrating independent amplitude envelopes into all available voices of my additive synth. By the time I thought of implementing them I was already over my allocated time for the task by about two hours. The interface is a bit clunky to use at the moment as well, but I will solve this in the future by placing some universal presets inside that will give fast access to a variety of tones. As it is, it takes a minute before a complex sound can be heard.
The MIDI controller keyboard integration was no problem, but I would like to incorporate a timing system that adjusts the domain of the function depending on the time between note on/off. Also, the keyboard is only really useful for selecting notes for the voices to hold at the moment – it can’t play back all voices itself and change their respective pitches universally. Another job for the to-do list.
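The missing per-voice envelopes are at least conceptually simple, so I’ll record the idea here for later. A hypothetical Python sketch of additive synthesis with an independent envelope per partial; the envelopes are plain linear decays (higher partials dying faster) and every value is illustrative:

```python
import math

# Additive synthesis with an independent amplitude envelope per
# partial. partials is a list of (freq_hz, amplitude) pairs.
def additive_sample(t, partials, dur):
    out = 0.0
    for k, (freq, amp) in enumerate(partials):
        # linear decay, each higher partial decaying a bit faster
        env = max(0.0, 1.0 - (t / dur) * (1.0 + 0.5 * k))
        out += amp * env * math.sin(2 * math.pi * freq * t)
    return out

partials = [(220.0, 1.0), (440.0, 0.5), (660.0, 0.33)]
sr = 44100
samples = [additive_sample(n / sr, partials, dur=1.0)
           for n in range(1024)]
print(len(samples))  # 1024 samples of the summed partials
```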
Note the sfrecord~ object: you’ll need it to write audio dumps to disk, and the argument ‘2’ tells it to create two inputs for stereo. Click here to link to online folder containing a Zip file of this week’s MSP files and an MP3 of some sounds the patch has made.
Reference:
Haines, Christian. “Creative Computing – Week 4 – Synthesis (1) - Additive, Tremolo, RM and AM.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 16th of August 2007.
Audio Arts – Week 4 – Game Engine Overview:
GTA San Andreas was created to run on three different platforms, namely the PlayStation 2, the Xbox, and PCs. As this is a post-2004 game released on the same platforms as its franchise predecessors, the changes to its development environment relating to hardware were minimal. That being said, it is a much bigger game than Vice City or GTA III, so a lot has been put into creating a larger environment that lacks repetition.
There seems to have been a degree of criticism aimed at Rockstar for sticking with the ‘Renderware’ engine for this instalment of the franchise. Complaints are focused on ‘blocky graphics’ and a poor frame rate when the activity on screen gets too busy. It was perhaps a deliberate move on the part of the company to stick with what they knew, given that this was their most ambitious offering to date at the time of release. Changing to a new engine may have created more problems than it would have solved, and perhaps harmed the continuity of the franchise.
As Rockstar seem to be closely guarding the technical details of their development environment, it is worth mentioning some of the real-world research that went into making GTA San Andreas. Teams were deployed with professional photographers to map out much of the city and country environments the company wished to emulate in the game. Although it is set in fictitious cities, their real-world counterparts (San Francisco, Los Angeles, and Las Vegas) are obvious, and this is in no small part due to accurate portrayal of their respective landscapes thanks to painstaking research.
Music has played an interesting role in the GTA series and San Andreas follows the same format. Set in the early nineties, it specifically uses music from that era, which only plays when the player is operating a vehicle with a radio. There doesn’t appear to be much separating the programming of music and sound across the series, so it is unlikely that any new approach to accessing audio and playback has been utilised. There is certainly a lot more sound in San Andreas compared to the first two offerings though, so if anything was needed it may just have been an upgrade to asset management procedures.
What must be said for the way this game has developed over time is that it represents a clear case of pushing an old system, and an old GAE, to its limit – at least graphically. Three instalments of this series have been released for the same consoles and computer platforms, so Rockstar have, in my opinion, made the right decision in sticking with the same GAE, even if it was to become redundant soon after the San Andreas release. With the PlayStation 3, Xbox 360, and PC graphics about to make another quantum leap, why overdevelop for something that’s on its way out?
Reference:
Haines, Christian. “Audio Arts – Week 4 – Game Engine Overview.” Lecture presented at Tutorial Room 408, Level 4, Schultz building, University of Adelaide, 14th of August 2007.
Forum – Week 3 – Breadboarding:
Focusing on the ‘Ring Modulation’ circuit this week, I extracted some interesting results.
While the sonic results often mirrored the square wave circuit (albeit a self modulating version), I found some of the screaming dive-bomb like tones to be similar to that which I often strive to achieve on the electric guitar. The difference is in the purity of the sound as it isn’t prone to interference the way magnetic pickups are.
My main issue with this project was trying to work out exactly what was providing the power. I could disconnect the battery from the power bus on the breadboard and the tones kept coming. At certain settings switching the power on and off at the bus (I soldered the battery terminals to one of the switch devices for convenience) changed the pitch but didn’t cut the power. I suspect the ‘ring’ effect was causing the circuit to use the 9V power in the amplifier to drive the circuit once it got started. I also had to alter the position of the resistor pot from that which was specified in the instructions to get any attenuation of signal parameters.
On top of the slow frequency modulation effects, I also found the rhythmic attributes of this type of modulation appealing. I will certainly be looking into creating a more directly controllable version of this device for my main instrument.
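For the record, ring modulation itself is just multiplication of two signals, which moves the energy to the sum and difference frequencies (fc ± fm) rather than the carrier; the analogue circuit, of course, adds all the lovely misbehaviour described above. A one-line Python sketch of the maths:

```python
import math

# Ring modulation: the output is the product of carrier and
# modulator, so sin(a)sin(b) = (cos(a-b) - cos(a+b)) / 2 puts
# the energy at the sum and difference frequencies.
def ring_mod(t, fc, fm):
    return math.sin(2 * math.pi * fc * t) * math.sin(2 * math.pi * fm * t)

# A 440 Hz carrier against a 100 Hz modulator yields components
# at 340 Hz and 540 Hz:
t = 0.00123
lhs = ring_mod(t, 440.0, 100.0)
rhs = 0.5 * (math.cos(2 * math.pi * 340.0 * t)
             - math.cos(2 * math.pi * 540.0 * t))
print(abs(lhs - rhs) < 1e-9)  # True
```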
Click here to link to online folder containing an audio collage of this week’s effort.
Reference:
Haines, Christian. “Forum - Week 3 – Breadboarding.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 9th of August 2007.
Tomczak, Sebastian. “Forum - Week 3 – Breadboarding.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 9th of August 2007.
Audio Arts – Week 3 – Process and Planning:
Would anyone at Rockstar Games really know what it sounds like to fire a machine gun from a motorcycle during an off-road car chase?
Sticking with the GTA franchise for this week’s exercise, I have compiled an Excel document of categories significant to organising game audio assets. There were a few issues encountered with this, not the least being that Excel is a temperamental program at the best of times, and running it in the lounge room on an old laptop is just painful.
I think the fundamental question it has raised for me is this: do I want to be a part of this industry? Game audio seems to have now surpassed film in terms of the quantity of sounds that need to be made available for a project. Despite this, industry insiders seem to indicate that it is still largely an afterthought in the eyes of major game developers. So it was with scepticism that I read through Mark Lambert’s glowing report of his ‘rewarding’ experience with sound designing the ‘Elder Scrolls IV’.
There is certainly something to be said for the pride one must feel at the completion of such a monumental task, but does the end result justify the sacrifice? Typical high-pressure scenarios were laid out by Mark, such as staying up till 4am working on this and that, only to sleep for three hours and get up to do it all again. I can’t think of anything worse. I find it interesting that the gaming industry has adopted this blanket approach to burning out all individuals involved, although, as much of the industry has developed in workaholic Japan, I probably shouldn’t be surprised. It’s just that: isn’t video gaming, for the most part, the favourite pastime of lazy people? So why are those in the industry (who surely express an interest in working with video games as a result of their own satisfying experiences of entertainment) so willing to rip their fingers to the bone?
Maybe I’m just bitter because Uni seems to take away all my leisure time, while throwing so much work on the schedule that I feel I can’t really do a good job of anything… a-ha! When I leave I’ll be prepared for a soul-destroying six-month career in the game industry!! Pessimistic? Me?
Click here to link to online folder containing the Excel file for this week.
Reference:
Haines, Christian. “Audio Arts – Week 3 – Process and Planning.” Lecture presented at Tutorial room 408, Level 4, Schultz Building, University of Adelaide, 7th of August 2007.
Creative Computing – Week 3 – Polyphony and Instancing:
An initial hurdle presented itself in Max/MSP tradition, by staring me so directly in the face that it was at least an hour before I could interpret the simplest of errors in my programming. I refer to the number following the word ‘args’ in the poly~ object above. Notice it has a decimal point? Apparently this is necessary if one wishes to create numerical values between one and zero. Stating the obvious, Dave? Yes. So why can I never see these things when it counts?
Anyway, the objects for this week are sorted; my GUI is still paralysingly simplistic, but I’m restraining design ambition in anticipation of future confusion and time wasting at the hands of a complicated program written for a simple problem. MSP has some strange representations of signal flow that I wish to become general and accessible knowledge before I crank up the GUI complexity.
The only significant difference between my embedded patch and Christian’s example on Thursday is the inclusion of some extra inputs. I wanted to retain the possibility of a sig~ object being easily inserted into the signal path of cycle~ or phasor~, thus increasing the variety of sounds that may be achieved by wave summing.
In the example patch ‘Wk 3 – Working example’ there is clearly a need for the signal amplitude outputs of the independent patches on screen to be halved, but it is demonstrated within each patch that I know how to achieve this. The example is for the purpose of demonstrating the functionality within each patch, so forgive this external oversight, won’t you?
Click here to link to online folder containing this week’s MSP bundle.
Reference:
Haines, Christian. “Creative Computing – Week 3 – Polyphony and Instancing.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 9th of August 2007.
Forum - Week 2 – Input and output – 02/08/2007:
After completing the underwhelming task of soldering four bits of cable together for ‘fun’, I became determined that actual fun would be the objective of my piezo experimentation. Fun always begins when one breaks a rule or two, so I started by plugging my piezo into a MAINS POWER device, that being an M-box hooked up to the ever faithful Cubase.
Here comes the pain...
Once levels were set suitably low, I began to experiment with scratching the device over various surfaces. This achieved mediocre results and in my frustration I hit the piezo with a paint stripping tool. Perhaps not surprisingly, this act of violence produced a desirably metallic kick drum-like effect that was fat and staccato at the same time, just how I like my kicks to sound. Now we were getting somewhere.
The sound of birth control?..
In the reading for this week there was mention of an American sounding concept called ‘Plasti-Dip’, which could be used for water proofing a piezo device. In lieu of this Dip, an unsuspecting prophylactic was recruited for the task. Dropping this contraption into a honey container half filled with water and blowing bubbles with a straw produced some very creepy results indeed…
And a little help from my friends...
Click here to link to online folder containing an MP3 of this week’s 40-second masterpiece.
Reference:
Collins, Nicolas. “Handmade Electronic Music: The Art of Hardware Hacking.” Routledge, 2006.
Haines, Christian. “Forum - Week 2 – Input and output.” Workshop presented at the Department of Engineering, Intermediate Practical Laboratory, University of Adelaide, 2nd August 2007.
Creative Computing – Week 2 - Signal Switching and Routing:
This week’s patches incorporate my favourite jsui object for setting data values. The cycle~ and phasor~ objects have both been given essentially the same treatment in regard to their embedded functionality. I have created help examples that exploit the phase input potential of both to at least some extent. In response to the request for the inclusion of inlets for data entry to the bpatcher, I have placed a sig~ object above the left input of both the phasor~ and cycle~ examples. This must be connected manually at this stage, as I still have a bit to learn about switching audio pathways on and off.
The mute function is something that I pre-empted last week, but I tried to make some mildly sophisticated improvements, namely removing the pops when the left or right channel is muted. Despite creating an object that encapsulates the functionality of last week’s sel 1 / delay setup, and that can be used to route the on/off signal anywhere, I’m still getting the pops. More on this next week, when I will hopefully have a solution…
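In the meantime, my understanding is that the standard cure for these pops is never to gate a signal instantly, but to multiply it by a short linear ramp (line~ territory in MSP); a few milliseconds is usually enough. A Python sketch of the gain values such a ramp would produce (lengths here are arbitrary):

```python
# Gain values for a de-clicking fade: instead of an instant
# 1 -> 0 step, the gain walks down over ramp_len samples.
def mute_ramp(n_samples, ramp_len, muting=True):
    """Per-sample gain values fading over ramp_len samples."""
    gains = []
    for n in range(n_samples):
        x = min(1.0, n / ramp_len)
        gains.append(1.0 - x if muting else x)
    return gains

fade_out = mute_ramp(8, 4, muting=True)
print(fade_out[0], fade_out[-1])  # 1.0 0.0
```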
Click here to link to online folder containing all relevant Max files.
Reference:
Haines, Christian. “Creative Computing – Week 2 - Signal Switching and Routing.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 2nd of July 2007.
Audio Arts - Wk 2 – Game Audio Analysis:
Grand Theft Auto – Vice City: My personal favourite of the GTA series, due in no small way to my shameless nostalgic attachment to all things V-Rock in the eighties, gets the sound analysis it so rightly deserves this week…
Boot and load: The developer’s logo fades onto screen accompanied by a deeply reverberated, short sustained monophonic vocal note from a woman. A low synth note fades in subsequently. The chattering of a crowd murmurs through, and all sound crossfades into a transistor radio recording of the hit ‘Video Killed the Radio Star’. The screen cuts to an emulation of a Commodore 64 computer screen. The sound of someone typing in a familiar C-64 load command is heard. The command is executed and a cheesy square-wave synth melody plays over another display of the developer’s logo.
Title sequence: Eighties disco music then kicks in for the opening credits and title sequence. Deliberately synthesised instrumentation characterises the composition. It then cuts to a silent loading screen and the player waits patiently for a progress bar to fill up, indicating that the game has finished loading.
Title screen: This game does not have a ‘title screen’ as such.
Interface: Occasional buzzing of controller vibration.
Game in sequence: This game begins with a short film outlining the beginnings of the character’s story, laying the groundwork for future action. It begins in a seedy Mafia den, where gangsters discuss the player’s character, Tommy Vercetti’s, release from prison and decide to put him to work. Dialogue, the soundtrack of another eighties pop hit ‘Take these broken wings’, and the footsteps and clinking of a servant serving drinks are the only audio evident.
Upon cutting to the next scene, some generic Cuban music accompanies the sound of Tommy’s aeroplane landing in Vice City. Ken, the nervous lawyer, is waiting to pick them up. Car doors open and close and Tommy and his boys get in. Ken the lawyer is talking nervously the whole time.
The sound of an approaching chopper is then accompanied by an ambient synth soundtrack as the scene is set for Tommy’s first drug deal.
The next set of sounds heard is as follows:
Noise of the chopper is prevalent the whole time.
Car doors open.
Ken is still talking.
Sound of footsteps.
Crouching of hidden soldiers.
Tommy’s and the drug dealers’ voices.
Gunfire.
Burnout of Ken’s car as he escapes with Tommy in the back.
Chopper takes off.
Game play (action, narratives, interludes, pause) outro:
The next scene flows seamlessly into the actual gameplay. The audio sequence and introduced environment sounds are thus:
Ken still talking.
Car Screeching to a halt.
Tommy and Ken’s short parting conversation.
Bleeps for game instructions then begin.
Environment then kicks in.
Car engine still running.
Car horns beeping.
Pedestrians talking.
Cars passing.
Rain falling.
Birds chirping when rain stops.
Audio types - music, environments, dialogue, narration, sound effects, foley, action sounds:
The collection of sound files for GTA’s in game action is enormous but an abbreviated list should outline the core elements.
Music: All of the in-game music comes from ‘source’ in GTA. The player chooses radio stations when driving cars. Background music may be heard in certain environmental scenarios such as the shopping mall. There is no constant non-diegetic ‘soundtrack’, such as that which can be heard for the duration of many other games.
Environmental sounds: Some have already been listed in relation to the outdoor suburban environment portrayed in GTA. There are also subtle changes to the background hum when the player moves away from the city to indoor or remote environments. One example of this is the above mentioned shopping mall, in which the background noise (and foreground) is sharply reverberated to enhance the realism of an indoor space with many ‘live’ surfaces.
Dialogue: GTA is littered with dialogue all the way through, from pedestrians and associates of the character on the street making idle chatter, to Tommy’s phone conversations and taunting of victims during the execution of ‘mission’ duties.
Narration: The only evidence of narration is Tommy occasionally talking to himself.
Sound effects: For a violent action RPG, the usual suspects are here – Weapon sounds, explosions, car and car crash sounds, helicopters, boats etc. All sound effects can be attributed to ‘real world’ scenarios, so the sound designers have not had to create sound for concepts that do not exist such as laser guns or alien space ships.
Foley: Comprehensive Foley work exists for GTA, from standard footsteps and character shuffling to various surface-contact effects, such as the sound of brushing when Tommy runs too close to a hedge. Object sounds are plentiful, with most sonic reactions from a given action accounted for. The clicking of car doors opening and shutting and a weapon being reloaded are examples.
Reference:
Haines, Christian. “Audio Arts - Wk 2 – Game Audio Analysis.” Lecture presented at Tutorial Room 408, Level 4, Schultz Building, University of Adelaide, 31st of July 2007.