Monday, May 15, 2006

Week 11

  • Practical 1 - Audio Arts - Mixing [1]
This week was a bit different from previous weeks in that it was more about analysing and listening than anything else. David played us some of his favourite tracks that he has worked on. It was a great insight to hear his mixes and learn how they were put together. He told us various things about the mixes, such as what sort of reverb he used, doubling, signal routing (e.g. through various effects), and sound design. The sound design part was very interesting to me. My experience with Mikcrate was based on a similar idea to what David described in this particular mix, but the difference was that his mix didn’t sound experimental. In fact, the first time I heard it, it didn’t even occur to me that most (if not all) of the percussion sounds were created using items that aren’t traditionally considered musical.
  • Practical 2 - Creative Computing - SuperCollider (8) [2]
OK, here's my patch. The SynthDefs work, but I couldn't work out how to get the Pbind to execute correctly. I suspect it had something to do with the order of execution, but I'm not sure. I think 12 hours is enough time to spend on this patch; perhaps if I find some time later on, I'll try to fix it (a rough sketch of a possible fix is included after the patch below). Special thanks go out to Adrian and Martin for doing their patches before me so I could use them as a guide for mine.

Here is the actual sound of a single note that the SynthDefs produce.

This is the result the Pbind returned.

I somehow don't think this is the sound I was looking for.

// Week 11

(
// Modulator
SynthDef("Modulator",
{

// Arguments
arg busout = 30,
density = 8
;

// Variables
var modulator
;

// Modulator
modulator = Dust.kr(density)
;

// Output
Out.kr(
bus: busout, // Modulator out control bus 30
channelsArray: modulator
)
}
).store;

// Carrier
SynthDef("Carrier",
{

// Arguments
arg busin = 30,
busout = 20,
freq = 8000,
dur = 2,
leg = 2
;

// Variables
var carrier,
modulator
;

// Modulator
modulator = In.kr(busin, 1);

//Carrier
carrier = SinOsc.ar(
freq: freq,
mul: modulator // Modulator in on bus 30 performing AM on Carrier
)
;

// Envelope
carrier = carrier
*
EnvGen.kr(
Env([0,0.6,1,0], [dur,0.1,0.01]), doneAction: 2
)
;
// Output
Out.ar(
bus: busout, // Sending out audio bus 20
channelsArray: carrier
)
;
}).store;

// Effect
SynthDef("Delay",
{

// Arguments
arg busin = 20,
busout = 0,
mdtime = 0.2,
deltime = 0.2,
dectime = 6
;
// Variables
var delay,
carrier
;

// Modulated Carrier
carrier = In.ar(busin, 1);

// Filter
delay = CombC.ar(
in: carrier,
maxdelaytime: mdtime,
delaytime: deltime,
decaytime: dectime
)
;

// Output
Out.ar(busout, delay)
;

}).store;
)

Synth("Modulator", addAction: \addToTail);
Synth("Carrier", addAction: \addToTail);
Synth("Delay", addAction: \addToTail);

(
// Sequencer
Pbind(
\Instrument, "Delay",
\dectime, Pseq([6, 5, 4, 3, 2, 1, 0.5, 0.25, 0.125, 0.0612], inf)
).play;
)
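
For reference, here is a rough, untested sketch of how the sequencing side might be fixed: the event key needs to be lower case (\instrument, not \Instrument), the Pbind should drive the Carrier rather than the effect, and the Modulator and Delay only need to be created once, since the Pbind spawns its own Carrier synths. The pitch sequence and durations below are just placeholders.

(
// Create the Modulator and the Delay effect once.
~mod = Synth("Modulator"); // writes the AM signal to control bus 30
~fx = Synth("Delay", addAction: \addToTail); // reads audio bus 20, outputs to bus 0

// Let the Pbind spawn a Carrier for each event.
// Pbind-created synths land at the head of the default group, before the Delay at the tail,
// so their output on audio bus 20 should be picked up by the effect.
Pbind(
\instrument, "Carrier", // lower-case event key
\freq, Pseq([8000, 4000, 2000, 1000], inf), // placeholder pitch sequence
\dur, 0.5 // event spacing; the Carrier's dur argument should pick this value up too
).play;
)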

  • Music Technology Forum - Presentation - Presentation by Stephen Whittington [3]
Stephen Whittington took the stage this week and talked to us about his compositions and his current work on distributed music and vocoding.

He started off explaining that before mass communication and travel, humans lived in small, scattered groups. Cultures were isolated, and there were geographical boundaries. Today, thought is conducted on a global scale, and travel that used to take months can now take only a few hours. Stephen is interested in how this shift has translated into musical interaction. With internet-based communication protocols such as VoIP (Voice over Internet Protocol), “Distributed Music” is now a global phenomenon.

Stephen defines Distributed Music as musicians performing together, but not in close proximity to each other. The idea that we are all on the same ‘Spaceship’ is a good example of global thought; Stephen mentioned a quote from someone along these lines, but I can’t remember who.

Stephen’s other main interest lies in vocoding, which ties in with his interest in the human voice and the expression of ‘utterance’.

He presented some of his compositions that used both of these technologies. The one composition that stuck in my mind was “X is Dead”. With this piece he played some audio through a speaker attached to the underside of a piano, which in turn resonated its strings. At the same time, he also played the piano and spoke a series of words.

The other piece I found interesting was his involvement with “Distributed ‘synchronicity’ experiments”. The sonic result of this, I thought, sounded similar to a piece from the Deus Ex: Invisible War soundtrack. I think the fantastic Alexander Brandon had some part in creating that soundtrack. The soundtrack itself is mostly ambient, with some ethnic instruments at times. You can download it for free here. The Deus Ex 1 soundtrack can be downloaded here.

I hear the 2nd years get to have a lecture with him every week on the human voice. I wish I had attended these, as the human voice is also one of my interests. That’s the thing with the Tech course: it changes every year, so there is usually something new to learn from other year levels even if you are a 3rd year. I wish all lecturers were as cool as Stephen is about letting people sit in on their classes… well, most of the lecturers I’ve met are.

  • Music Technology Forum - Workshop - NA
David Harris was sick this week.
  • References
    [1] Grice, David. 2006. Practical on Mixing. University of Adelaide, 23 May.
    [2] Haines, Christian. 2006. Lecture on SuperCollider. University of Adelaide, 25 May.
    [3] Whittington, Stephen. 2006. Presentation on Stephen Whittington's projects. University of Adelaide, 25 May.

2 Comments:

Blogger unknown said...

There are a couple of things wrong with your Pbind code. Firstly, use all lower case for event keys (i.e. \instrument, not \Instrument). Secondly, you don't need to instantiate synths (\addToTail or whatever) if they are controlled by your Pbind; it does this automatically. I would advise that you 'Pbind' the carrier, not the effect, otherwise you may end up with the 'default tone', which it sounds like you have in your second audio sample. If you want to control the effect you can declare two Pbinds and execute them simultaneously:

a = Pbind(\instrument, "Carrier",
// carrier sequencing
// parameters etc.
);

b = Pbind(\instrument, "Delay",
// delay sequencing
// parameters etc.
);

Add some behaviour where the comments are, then select all and execute. Make sense?

4:43 pm, June 17, 2006  
Blogger Tyrell Blackburn said...

Martin, thanks for the help with the code. I'll definitely revisit this code and consider your comments.

My Major (SuperCollider) is due in less than a week, so I must understand how this works.

2:15 pm, June 18, 2006  
