Sunday, April 30, 2006

Week 7

  • Practical 1 - Audio Arts
No Audio Arts this week - public holiday.
  • Practical 2 - Creative Computing - Supercollider FM/AM/SynthDefs [1]

This week we learnt how to write AM and FM synthesis programs. For me it was a huge learning curve, because it meant I had to relearn all the previous theory by actually putting it into practice. It became a very addictive exercise though, and this is what I came up with after 8+ hours.

FM



// Week 7 FM

(
SynthDef ("FM",
{

arg
freq1 = 602,
freq2 = 600
;

var
fm, cf, in
;

cf = MouseX.kr(0.1, 250);
in = MouseY.kr(0.1, 500);

fm = SinOsc.ar(
freq: [freq1,freq2] + // Carrier Frequency
VarSaw.ar(
freq: cf // Control Frequency
,
mul: in // Index
)
,
mul: [0.8] // Overall Amplitude
)
;
Out.ar([0,2], fm);
}
).load(s);
)

a = Synth.new("FM");
a.set(\freq1, 220, \freq2, 221);
a.free;


AM1



// Week 7 AM (1)

(
SynthDef ("AM1",
{

var
am,
ay,
bee,
cee
;

ay = MouseX.kr(1, 70);
bee = MouseY.kr(1, 70);
cee = (ay*ay)+(bee*bee);

am = Saw.ar(
freq: [ay,bee], // Carrier Frequency
mul: [SinOsc.ar(
freq: 1 + cee, // Control Frequency
mul: 0.5
),
Pulse.ar(
freq: cee, // Control Frequency
mul: 0.5
)
]
)
;
Out.ar([0,2], am);
}
).load(s);
)

b = Synth.new("AM1");
b.free;


AM2



// Week 7 AM (2)

(
SynthDef ("AM2",
{

arg
freq1 = 600,
freq2 = 601
;

var
am,
my,
mx
;

my = MouseY.kr(0.1,600);
mx = MouseX.kr(0.1,601);

am = SinOsc.ar(
freq: [freq1,freq2], // Carrier Frequency
mul: [Blip.ar(
freq: my + mx, // Control Frequency
mul: 0.5 ),
Dust.ar(
density: my + mx, // Control Frequency
mul: 0.8 )
]
)
;
Out.ar([0,2], am);
}
).load(s);
)

b = Synth.new("AM2");
b.free;


I think a more effective way of learning (at least for me) would have
been to be given a simple program such as a sinusoidal wave generator,

e.g.

// Sine Wave
(
{
SinOsc.ar(
freq: 440,
mul: 0.5
)
}.scope(1);
)

and then add to the program every week as we learn new theoretical concepts such as arguments, variables, SynthDefs, etc. This way we would be learning the theory but also practically applying it, instead of learning it and then relearning it after 7 weeks of not actually doing anything with it.
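To make that concrete, here is a sketch (names and values my own, not from class) of how the sine patch above might look a few weeks later, once arguments, variables, and SynthDefs have been folded in:

```supercollider
// Sine Wave, grown week by week: wrapped in a SynthDef,
// with an argument (freq) and a variable (osc) added along the way
(
SynthDef("sine",
{
arg freq = 440; // new concept: arguments
var osc; // new concept: variables
osc = SinOsc.ar(
freq: freq,
mul: 0.5
);
Out.ar([0,1], osc);
}
).load(s);
)

c = Synth.new("sine");
c.set(\freq, 220);
c.free;
```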

Sure, we've been doing exercises that involved altering other patches, but by creating a program from scratch (no cutting and pasting), everything has been fused together, consequently increasing my confidence with Supercollider immensely. I actually feel like I want to spend hours and hours a day working on it (not possible, unfortunately) to increase my knowledge. Before, Supercollider was just an idea floating around; now it's a foundation I can build upon.

  • Music Technology Forum - Presentation - Sebastian Tomczak's projects [2]
Sebastian Tomczak is a Music Technology honours student. When he thought about what he wanted to study, he looked back through his years and found a common theme in all his work: creating musical expression from objects of a non-musical nature. He is also interested in musical limitations, and by combining both interests he has conducted (to date) 7 different milkcrate sessions. These "Milkcrate" sessions involve filling up a milkcrate with non-musical items and making as much music as possible from those items in a 24-hour period. He also mentioned his work on sonification, and his recent piece, "Duet for Desk Lamps", which used light from desk lamps to control the amplitude in an AM synthesis patch via a solar panel; a minimal, yet fantastic idea.
  • Music Technology Forum - Workshop - Iannis Xenakis, Gabrielle Manca, and Philip Glass [3]
The first piece we listened to was "Voyage to Andromeda" by Iannis Xenakis. It was constructed by a computer that read a series of different graphs. This piece was an audio/visual overload, and reminded me of a granular or rocky construction that sporadically jumped to different states.

The second piece we heard was "In Flagranti", by Gabrielle Manca. This piece pushed guitar playing to its limits with unconventional playing, and unusual guitar sound effects. It reminded me of animalistic chirping sounds.

The last piece we listened to was "Rubric", by Phillip Glass. This was another interesting piece that contained a lot of arpeggios that started began to remind me of the sound of a 1 hert difference tone. Visually it reminded me of some old black and white footage of soldiers going off to war saying goodbye to their families.

  • References
    [1] Haines, Christian. 2006. Lecture on Supercollider. University of Adelaide, 27 April.
    [2] Tomczak, Sebastian. 2006. Presentation on Sebastian Tomczak's projects. University of Adelaide, 27 April.
    [3] Harris, David. 2006. Workshop on Iannis Xenakis, Gabrielle Manca, and Philip Glass. University of Adelaide, 27 April.

Sunday, April 23, 2006

Week 6 Recording Assignment

For this assignment I chose to record a piano using the Mid-Side (M/S) technique with a pair of Neumann U87 mics. The instrument choice was a little lacking on the adventurous side of things, considering we did this exact exercise in class, but I made this choice because:

1) This is my main instrument, and so it's something I want to learn how to mic well.
2) I wanted to prove to myself that I could successfully use this technique before moving onto other Stereo mic techniques.

I prepared the recording space by clapping my hands and listening to the natural reverb. After opening the curtains, the reverb did increase a little, but not a lot. Still, the extra reverb did make enough difference to satisfy my judgement. Next I prepared the piano by putting carpet under the soft pedal to dampen the sound it makes when it is used.

Lastly, before I got into the recording, I tested for the hot spot on the figure-8 mic. I believe it's usually at the front of the mic, but to be sure I did a test anyway. Honestly, I could barely tell the difference (this could have been because of the original position I had the mic in), but the difference I did hear confirmed that the hot spot was at the front of the mic.

I proceeded to set up the M/S configuration at the spot where I thought I'd get the best piano results. I hit record and played the piano. When I returned to the studio to hear the recording, it sounded as though it had been beaten unconscious (no presence or clear, even frequency response), even after I copied and inverted the figure-8 track. I figured it was probably the position of the microphone.
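For reference, the decode step I was attempting works like this: the mid (cardioid) and side (figure-8) signals combine as left = mid + side and right = mid − side, and duplicating the side track with inverted polarity is what produces the mid − side half. A minimal sketch in Supercollider (my own example; it assumes the mid and side mics arrive on hardware inputs 0 and 1):

```supercollider
// M/S decode sketch: mid mic on input 0, figure-8 side mic on input 1
(
{
var mid, side;
mid = SoundIn.ar(0);
side = SoundIn.ar(1);
[mid + side, mid - side] * 0.5 // [left, right]
}.play;
)
```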

Thankfully Poppi turned up and agreed to play the piano for me whilst I aurally checked for the best mic position. My ears told me the best position was inside the piano lid, close to the soundboard, and closer to the treble strings.

The results from moving the mics from their original position were startling. Suddenly all that was missing in the sound revealed itself.



My only gripe with the recording was that the pedal sound still prevailed in the quieter parts, but overall I am satisfied with the results.



Any constructive criticism is well appreciated.

Monday, April 10, 2006

Week 6

  • Practical 1 - Audio Arts - Live Recording [1]
This week was a repeat of last week’s lessons, but this time in the safe haven of EMU. We recorded the same string quartet with matched mic pairs using the following techniques:

- X/Y (2xNT4)

Sounded: trebly.

- X/Y (2xKMI)

Sounded: distant.

- Omni Spaced (2xU87)

Sounded: wide dynamics, realistic-sounding ambience (omni), and an impressive full sound (a testament to the microphone quality).

- Cardioid Spaced (2xNT3)

Sounded: thin and trebly.

The quartet was placed roughly in the middle of the room facing the southern windows, with the mics pointed toward the quartet, facing the northern windows.

In the studio, David stressed the point of avoiding any peaks (even if they’re only in the yellow) as a safety measure just in case of unexpected increases in amplitude.

A possible solution for this recording was to use the audio from the U87 pair, with some of the NT3 pair added to bring a little of the trebly sound into the mix.

I think this lesson was too easy compared to last week's, and I suspect this somewhat added to a feeling of unfulfilment.
  • Practical 2 - Creative Computing - Supercollider (3) [2]

Ah, good old Supercollider; another week of programming that for some reason I'm not entirely into at the moment. I'm sure that if I really spent the extra time on it I would learn to love it, but for some reason I haven't the motivation to do anything extra other than skimming over the week's readings.

This week we learnt how to record audio from Supercollider into a file for use in other audio applications. We also looked at some of the different noise UGens that are available. The one that really caught my ear was the Dust UGen. It is great for trigger-type processes because of its sporadically sharp wave peaks, but even as an audio source it creates some excellent sounds.
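As a sketch of what I mean by trigger-type processes (my own example, not from class): each Dust impulse can fire a percussive envelope and re-randomise a pitch at the same time.

```supercollider
// Dust as a trigger: random impulses fire a short percussive blip
(
{
var trig, env;
trig = Dust.kr(3); // roughly 3 random triggers per second
env = EnvGen.kr(Env.perc(0.01, 0.3), gate: trig);
SinOsc.ar(TRand.kr(200, 800, trig), mul: env * 0.3) ! 2
}.play;
)
```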

The reading covered the same things we went through in class, but also touched upon mouse control assignment (x/y movement mapping), voltage-control synthesis, vibrato, and envelopes.
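Of those, vibrato is the easiest to sketch: a slow control-rate sine wobbling the frequency of an audio-rate one. The rate and depth values below are my own guesses at something musical:

```supercollider
// Vibrato: a 5 Hz sine modulates the carrier frequency by +/- 8 Hz
(
{
SinOsc.ar(440 + SinOsc.kr(5, mul: 8), mul: 0.3) ! 2
}.play;
)
```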

  • Music Technology Forum - Presentation - Music Technology Forum (1) - What is Music Technology? [3]
This week we had our first forum (more are to be expected) with panelists Stephen Whittington, Mark Carroll, and Tristan Louth-Robins. The forum subject was, "What is Music Technology?". Stephen began the forum by looking at some definitions made by other distinguished academics from around the world. Although they shared shockingly different opinions about what Music Technology is, many seemed to agree that it is a discipline. Stephen then went further, trying to define the word "discipline". His definition of "discipline" loosely fit that of Music Technology, but the term "inter-disciplinary" seemed to fit more comfortably. He also described the history of EMU, and explained how in the past it used to be a general Digital Arts unit, but in more recent times it has morphed into an audio-focused unit.

Mark’s take on Music Technology was that it is more of a hybrid, multi-disciplinary subject. He brought up the argument of vocational versus artistic education, and how, unlike learning the violin, the study of Music Technology isn’t as defined. He said it’s so broad that it’s more up to the individual students to find their own area of study. In relation to this, he also mentioned great opportunities for post-graduate research.

Tristan was third up in the chain of scholars, and tended to agree on points rather than form his own opinions (because by then most of the main issues had already been covered), but he did make some good comments that have unfortunately slipped my mind. His view on Music Technology was less diplomatic, yet he provided us with a more personal view of Music Technology from his own experience as a student.
  • Music Technology Forum - Workshop - Edgar Varese, Milton Babbitt, Barry Truax [4]
This week we examined three different pieces by three different composers.

The first of them was “Equatorial” by Edgar Varese. This piece was a vocal work with orchestral accompaniment. Here are some notes that I took about it:

- Thick, lush, and detailed texture with spotty use of orchestral colour
- Architecturally complex
- Masterful use of instrumentation
- Gave me the sense of brittle moss, crumbling away with the contact of water droplets.

The second piece we listened to was the integral-serial piece, “Ensemble of Synthesisers” by Milton Babbitt. I really enjoyed this piece, particularly because of the complex electronic textures created through the various synthesisers. If the synthesisers were replaced by, say, a piano, I would probably have gotten bored.

The third piece we listened to was, “Wings of Nike” by Barry Truax. I didn't write any notes for it so it was either dead boring, or it was so great I lost track of time. Whatever the case, I don’t remember it so I don’t have anything to say about it here.

  • References
    [1] Grice, David. 2006. Practical on Live Recording. University of Adelaide, 4 April.
    [2] Haines, Christian. 2006. Lecture on Supercollider. University of Adelaide, 6 April.
    [3] Whittington, Stephen, Mark Carroll, and Tristan Louth-Robins. 2006. Forum on the subject "What is Music Technology?". University of Adelaide, 6 April.
    [4] Harris, David. 2006. Workshop on Edgar Varese, Milton Babbitt, and Barry Truax. University of Adelaide, 6 April.

Sunday, April 02, 2006

Week 5

  • Practical 1 - Audio Arts - Stereo recording of String Quartet [1]
This class proved to be an unfortunate turn of events. Our initial plan was to record a live string quartet exploring different types of spaced microphone setups. This is what happened.

9:20 Discovered the portable flash memory recorder didn't have any memory card.

9:30 Replaced it with a Portable DAT recorder.

9:50 We discovered the DAT didn't have phantom power. This ruled out the NT4 and U89 mics. Then Adrian almost semi-saved the day when he revealed two AA batteries that we thought we could use in the stereo NT5, but unfortunately that didn't work for some reason.

9:53 We began recording the quartet using the undesirable Sony stereo mic that came with the DAT. We all decided that it sounded best from up on a chair, so David proceeded to record them from a number of different positions, using himself as a mic stand (we couldn't clip the mic onto a mic stand).

In the end we at least got "a" recording. It's unfortunate we weren't able to try the techniques we were originally planning, but technical difficulties are to be expected in a music technology course. Despite our original plans going under, it was an educational experience nonetheless. This was the first time we stepped out of level 5 to record something, and for that I'm grateful enough.

  • Practical 2 - Creative Computing - Supercollider (2) [2]

This week we mostly went through formatting and commenting conventions, and the related readings were mostly revision (explaining the same sort of things in a different way), which was helpful, I guess. I remember last year I didn't have a problem getting my head around Max/MSP, but I can't say the same about Supercollider. This is probably largely due to the fact that I'm spending half the time on Supercollider this year that I did with Max last year. Instead of reflecting on Supercollider every week, I think it would be more productive (for me at least) if I were given some programming tasks that needed to be solved and submitted every week, similar to the tasks we were set last year for Max. I find putting all the theory into practice really helps. Unfortunately, so far it's (mostly) all been about learning the theory and not the practice, which I'm hoping will change soon.

  • Music Technology Forum - Presentation - Chris Williams [3]
This week we were blessed with the presence of producer/sound engineer Chris Williams. He took us through an insightful schedule of a radio drama he directed and sound-engineered. In regard to the recording and sound engineering, he talked about the acoustics of the recording space at the ABC, the mics he used, and the signal pathway from the mics in the recording space into the studio. Accompanying this, he also showed us some screenshots of the Pro Tools sessions he was using.

This industry-related presentation was a nice balance to the art presentations we've had previously, and I'm definitely looking forward to more industry-related speakers.
  • Music Technology Forum - Workshop - John Cage [4]

For this workshop we listened to John Cage's Music for Carillon (1954), Williams Mix (1952), and 101 (1989). Each piece was quite diverse in instrumentation, and once again they reminded me of Cage's versatility as a composer. The one thing I admire about John Cage is his ability to write experimental music that doesn't just sound like an intellectual experiment, although I can't say this about all his pieces. Out of the three pieces, I would have to say "101" was my favourite. To me it wasn't just ambience, but ambience that contained a journey full of interest. It reminded me of the ambience heard in the original Silent Hill for PlayStation as the player character walked the misty streets of Silent Hill (check out the picture).

  • References
    [1] Grice, David. 2006. Stereo recording of a String Quartet. University of Adelaide, 28 March.
    [2] Haines, Christian. 2006. Overview of Supercollider. University of Adelaide, 30 March.
    [3] Williams, Chris. 2006. Presentation from ABC Producer and Engineer. University of Adelaide, 30 March.
    [4] Harris, David. 2006. Workshop on John Cage. University of Adelaide, 30 March.