The NIN rebuild was done entirely in GarageBand, starting from Trent's files and using only cut/copy/paste/move, pitch-shifting and a little extra effects-processing.
GarageBand is sleek, but barely adequate for even this crude project. Its current fatal limitation for my own music is that it doesn't do any kind of MIDI out, and thus can't drive external MIDI devices. It's cool that it has its own software instruments, but they're toy sounds compared to my Korg Triton, which I have no interest in reducing to a glorified keyboard controller.
I unloaded a lot of my old random bits of music gear in the big object-purge when Beth moved in, so my setup is now pretty simple: the Triton, a Tascam 788 hard-drive recorder, a guitar, a big guitar multi-effects board, a fretless bass, some microphones, and a pair of Mackie powered monitors. I didn't use the guitar or bass on the new song, so everything other than my voice came from the Triton.
There are five different loops on the drum track, mostly played by hand and then touched up in the event editor. I initially played the piano riff by hand, but couldn't get it to be legato enough, so I ended up transcribing it and then recreating it note by note in step-record. The Triton has a great dual-programmable arpeggiator, so for the burbly bass-line I step-recorded a custom arpeggiator pattern and then played the trigger notes in real-time. Or, more precisely, I first tried to play this on my real bass, but it made my hand bleed, so I slowed down what I was doing and then recreated an even faster version in the arpeggiator. I think everything else I just played by hand, the most complicated bit being some weird ghost noises in the quiet parts that required the ribbon controller and joystick. Another advantage to sequencing this stuff instead of recording it as audio, obviously, is that you can do all the punching in and out you want without any risk of stop/start artifacts, which saves a lot of time in a song like this where most of the individual parts come in and out a lot. All the mixing and effects-processing for the instrument parts were done in the Triton, too. I think in the end I used nine sequencer tracks and nine different synth programs, but I'm not sure what maximum note-polyphony I hit at any one moment.
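For anyone who hasn't step-recorded anything: underneath all the button-pushing, a sequenced part is just a list of timed note events, which is why you can punch, move and pitch-shift it endlessly without any audio seams. A rough sketch of the idea in Python, using the mido library (this has nothing to do with the Triton's actual sequencer, and the notes and tempo are invented):

    from mido import Message, MidiFile, MidiTrack, MetaMessage, bpm2tempo

    # Build a one-track MIDI file the way step-record builds a phrase:
    # one note at a time, each with an explicit length, no performance timing.
    mid = MidiFile(ticks_per_beat=480)
    track = MidiTrack()
    mid.tracks.append(track)
    track.append(MetaMessage('set_tempo', tempo=bpm2tempo(112)))  # invented tempo

    riff = [(60, 240), (63, 240), (67, 480), (63, 240), (60, 720)]  # (note, ticks) pairs, invented
    for note, ticks in riff:
        track.append(Message('note_on', note=note, velocity=96, time=0))
        track.append(Message('note_off', note=note, velocity=0, time=ticks))

    mid.save('riff.mid')

Fixing a part stored this way just means editing the list, so there's nothing to "punch" over in the first place.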
All the Triton stuff was originally done with internal sync, to cut down on button-pushing, but for the vocals the Triton was then slaved to external MIDI sync from the 788. The switch could have been essentially transparent, except that I had insisted on using one drum-loop in 7/4, so I had to recreate a matching tempo-map on the 788.
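If "recreating a matching tempo-map" sounds mysterious, it's mostly arithmetic: MIDI clock ticks 24 times per quarter note regardless of meter, so a 7/4 bar is simply a longer bar, and the two machines only have to agree on where the bars fall. A minimal sketch of the bookkeeping (Python, purely illustrative, with an invented tempo):

    PPQ = 24  # MIDI clock pulses per quarter note, fixed by the MIDI spec

    def bar_starts(bpm, beats_per_bar, num_bars):
        """Start time in seconds, and in clock pulses, for each bar."""
        bar_seconds = beats_per_bar * 60.0 / bpm
        bar_pulses = beats_per_bar * PPQ
        return [(i * bar_seconds, i * bar_pulses) for i in range(num_bars)]

    # e.g. eight bars of 7/4 at an invented 112 bpm:
    for seconds, pulses in bar_starts(112, 7, 8):
        print(f"bar at {seconds:6.2f}s, pulse {pulses}")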
I then did vocal takes ad nauseam on the 788, switching among several recorder tracks to compare takes. I tried some "harmony" vocals, too, which sounded fabulously terrible, so in the end I stuck to a single vocal track, from a single continuous take. The 788 has two multi-effects units of its own, so all the vocal processing was done there: EQ, compression, de-essing, some chorus, a little reverb. The Triton was slaved to the 788 all the way to the final mixdown, so the instrument parts didn't go through any extra generations, and more significantly, remained in fully-editable sequence form even after the vocals were recorded.
The combination of the storm and Beth being away resulted in my violating one cardinal rule of recording in this process. When I went to do the vocals I realized that I'd purged not only my one bedraggled pair of closed-ear headphones, but all my 1/4"-to-1/8" headphone adapters, and thus I couldn't even use my iPod earbuds for monitoring. So I did the vocals with the music actually playing in the room, me facing towards the monitors and the microphone facing away. The directional pattern of the microphone (an Audio-Technica Midnight Blues) turns out to be excellent, and produced almost no audible bleed-through even when soloing the vocal track.
Mastered to CD-R on the 788, trimmed and normalized in Sound Studio on the Mac, converted to mp3 in iTunes.
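Normalizing, incidentally, is nothing fancier than rescaling the whole file so its loudest peak sits just under full scale. A tiny sketch of the generic operation (Python with numpy and soundfile; I have no idea what Sound Studio does beyond this, and the filenames are made up):

    import numpy as np
    import soundfile as sf

    audio, rate = sf.read('mix.wav')      # made-up filename
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio * (0.999 / peak)    # leave a hair of headroom below full scale
    sf.write('mix_normalized.wav', audio, rate)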
There's not much else to say about the compositional process, such as it was. I did most of the drums first, including a little bit of diagramming on paper and a bunch of just pounding on keys. Once I had the loops I went back to paper to figure out a tentative song-structure, sequenced that, and then fiddled with it until it seemed workable. The deepest original philosophical premise for the music was that it be 3:20 long, which I revised to 2:57 because one section felt tedious and I couldn't think of a way to fix it. The rest of the instruments were overlaid on the drums one at a time, with a fairly small amount of iteration since there aren't actually that many places in the song where there's a lot going on at once.
For the vocals I started with nonsense lyrics to work out the melody and meter. Possibly I ended up with some nonsense lyrics in the finished piece, too, but at least they're different than the nonsense lyrics I began with. Sometimes I already have a story in mind before writing anything, but in this case all I had was one word ("Tantalizer"), so it took several drafts before anything even vaguely coherent materialized, and although I know what I think the thing ended up being about, I can't really explain why it ended up being about that.
I always think I could get better melodies if I figured them out on the keyboard instead of just improvising them by singing, but when I try to do that I come up with notes I want to sing but can't, which is unhelpful. Some vocal lessons would increase my options, but then so would virtually any kind of music training, production discipline, writing forethought, etc. But as the last long silence testifies, I have a much bigger problem finding the time and emotional space to make any music, and it's pointless to worry about how bad your music is when you aren't making it.