Friday, August 16, 2013

Game Music Online: Introduction to Game Audio: How Games Are Different from Anything You've Worked on Before

I was talking to a friend at UM today about my game music class, and he pointed me to this video of Brian Schmidt delivering what appears to be the introductory talk at GameSoundCon in 2010. Wow!  I've got to get there!  If you haven't seen this before, give it a view.  It's amazing.  The best part is that Brian is a clear and effective communicator.  Seeing this makes me all the more excited to have him as a virtual speaker in my class.

Here are my notes and reactions to his talk:

First and foremost, it pleases me greatly to have read most of this before.  While his presentation is unique, after a year of intense study there were very few ideas here I hadn't encountered before.

I really like his closing-the-door analogy for game audio.  Basically, you could brush the door closed, slam it, or never close it at all.  In a movie, a fixed medium, we know exactly how it'll sound-- how hard the door's shut, when it's shut, and so on.  In a video game, you don't know when or how, so the audio needs to encompass all of those scenarios and more.
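To make the door example concrete, here's a minimal sketch of how a game might pick a sound based on how the player actually shut the door.  The function name, asset names, and speed thresholds are all my own invention for illustration-- a real audio engine would typically map a continuous parameter to layered samples rather than branch like this:

```python
# Hypothetical sketch: choosing a door-close sound from the player's
# closing speed. Thresholds and asset names are invented for
# illustration, not taken from Brian Schmidt's talk.

def door_close_sound(closing_speed):
    """Return a sound asset name for a given closing speed (m/s)."""
    if closing_speed <= 0.0:
        return None                # door never closed: no sound at all
    elif closing_speed < 0.5:
        return "door_brush.wav"    # gently brushed closed
    elif closing_speed < 2.0:
        return "door_close.wav"    # normal close
    else:
        return "door_slam.wav"     # slammed

# The same event covers every scenario the player might produce:
for speed in (0.0, 0.3, 1.0, 3.5):
    print(speed, door_close_sound(speed))
```

The point of the sketch is that, unlike a film mix, the sound isn't fixed in advance-- the asset is chosen at runtime from the player's action.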

A random thought as he's describing game audio: musically, we need a new way to think about game music analysis, since the music is interactive.  Traditional methods-- key, meter, phrase shape, etc.-- still apply in plenty of cases.  In other ways they need to evolve: a phrase can be broken in many places and a new cue jumped to.  How can music analysis account for this?  I'm not really sure what exactly that means yet, but listening to this made me realize there's much work to be done in this area.  It's interesting to think about here, because I've been realizing that I'm up against some hurdles in the "My Gaming Audio History" section of the blog as I move forward in time to modern games that have adaptive audio.

I knew the term "spotting session" from film audio, but it's good to hear that "event planning" is the game audio equivalent.

Spore-- real-time compositional synthesis.  I know nothing about this game.  I want to check it out.

It's very good for me to hear about how game engines and audio engines interact.  I really didn't know anything about the process, and while his explanation is simple, the personification makes it easy to understand.

I love the turn-of-the-century theater pianist analogy for the audio engine and game sound.  What makes the analogy work is that the theater pianist made music live, often playing from a collection of pieces with varying moods and styles that were easy to jump between as needed for the onscreen action.  In this analogy, the theater pianist represents the audio engine, and his collection of pieces represents the game's audio assets.
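The pianist analogy can be sketched in a few lines of code.  This is a toy illustration of the idea, assuming a made-up engine where the game reports a mood change and the audio engine jumps to a matching cue-- the class and cue names are hypothetical, not anything from the talk:

```python
# Hypothetical sketch of the "theater pianist" idea: an audio engine
# holds a collection of cues tagged by mood and jumps between them as
# the on-screen action changes. All names are invented for illustration.

class AudioEngine:
    def __init__(self):
        self.cues = {}           # mood -> cue name
        self.current_cue = None

    def load_cue(self, mood, cue_name):
        """Add a piece to the pianist's collection."""
        self.cues[mood] = cue_name

    def on_game_event(self, mood):
        """The game engine reports a mood; the pianist switches pieces."""
        cue = self.cues.get(mood)
        if cue and cue != self.current_cue:
            self.current_cue = cue   # a real engine would crossfade here
        return self.current_cue

engine = AudioEngine()
engine.load_cue("calm", "town_theme")
engine.load_cue("danger", "battle_theme")
print(engine.on_game_event("calm"))    # town_theme
print(engine.on_game_event("danger"))  # battle_theme
```

Like the pianist, the engine doesn't know in advance what it will play or for how long-- it just keeps a repertoire ready and responds to the action.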

Brian sums up four unique audio challenges:
1- Lack of precognition for player action
2- Duration of experience is unknown
3- Sounds must vary with action
4- Unlike movies, games have resource constraints

Probably my favorite statement from the video was that "as a game composer you're working on a work in progress."  In interviews, game composers often say that one of the best ways to get inspiration for the music is to play the game.  I'd never considered that they weren't playing the finished version of the game, just a build.  I can identify with this from my work as a musical director on shows.  They often need to be adapted for the venue-- pieces shortened, keys changed, parts re-written, music inserted or removed.  That flexibility, which at times can seem like chaos, makes for an extremely creative environment and requires a fine balance between being flexible in the moment and keeping the big picture in sight.
