Here at Boom Box Post, we conduct monthly Lunch and Learn meetings where a rotating member of the team teaches a lesson to the rest of the studio. Whenever it is my turn to teach a Lunch and Learn lesson, I always try to rack my brain for a topic that I either don’t use on an everyday basis, or would personally like to learn more about. This month I chose Impulse Responses and Convolution Reverb.
I find this particular topic very interesting for a few reasons. First, it requires getting up out of that office chair! I’m always down for active and interactive audio experiences. Second, I love customizable audio options. It isn’t often that you find EXACTLY what you’re looking for, so the ability to create custom reverbs is always very useful. And lastly, it is a topic that I always knew about but have never personally done, so I figured diving right in was the best way to get a proper hands-on Convolution Reverb and Impulse Response experience.
We've been lucky enough here at Boom Box Post to be working on a lot of new series lately. And with new series, come main title sequences. The goal of any great main title is to stick in your head, typically achieved by a catchy, music-driven sequence. So where does that leave us sound designers? Sound effects can be infectious too! Here are some tips to help you succeed in getting sound effects into the next great main title sequence.
At Boom Box Post, we specialize in sound for animation. Although sonic sensibilities are moving toward a more realistic take, we still do a fair amount of work that harkens back to the classic cartoon sonic styles of shows like Tom and Jerry or Looney Tunes. Frequently, this style is one of the most difficult skills to teach new editors. It requires a good working knowledge of keywords to search in the library--since almost all cartoon sound effects are named with onomatopoeic names like “boing”, “bork”, and “bewip” rather than real words--an impeccable sense of timing, and a slight taste for the absurd.
I used to think that you were either funny or not. Either you inherently understood how to cut a sonic joke, or you just couldn’t do it. Period. But, recently, I began deconstructing my own process of sonic joke-telling, and teaching my formula to a few of our editors. I was absolutely floored by the results. It turns out, you can learn to be funny! It’s just a matter of understanding how to properly construct a joke.
The first piece of advice I give any new sound editor is to get Pro Tools and learn the keyboard shortcuts. Forget proficiency in typing; that's child's play. In order to compete in the real world of post-production sound, you need to be FAST. Knowing your way around the keyboard doesn't just shorten your workday, it tells the clients - who expect requests to be carried out quickly - that you are on top of your game.
Basic keyboard shortcuts - switching the tools, changing the view - need to be second nature. But with literally hundreds to learn, there are bound to be a few that have slipped through the cracks. Here are some of the best 'lesser-known' Pro Tools keyboard shortcuts to help speed up your workflow.
We've all been there, right? You're setting up a home studio, and notice that although you finally have all the right gear, your room is sounding less than optimal. Clap once, and you hear a ping-pong of reverberations that make your ears recoil and your heart sick. So, you look up acoustic panels only to find that they're priced for princes. In this post, I'll share with you how to make inexpensive yet high quality acoustic panels on your own.
It's no surprise that parodies and homages of the games of my youth (those popular throughout the '80s) are starting to pop up in the shows I work on. In fact, they've been cropping up in modern cartoons for pretty much my entire career. There are a few reasons for this. First (and most obvious), everything that is old is new again. Retro is always going to be hip, and we have been in a love affair with 'The Decade of Excess' for quite a while now. It's also true that a lot of the talent at the Executive Producer, Director, and Head Writer level these days (those producing the content) lands squarely in the age group for whom these are the things they loved in their youth as well. Lastly, however, you need to consider the style that comes with writing a video game sequence into your animated program. Most modern games both look and sound entirely realistic. So if, for example, you wanted the Teenage Mutant Ninja Turtles to take a break and geek out over a video game together, what fun would it be to have them play something that looks and sounds like a feature film? The fun comes with the retro, both visually and sonically.
Over the past year, Jeff has written two excellent posts on sound effects editorial layout: Downstream: Valuable Sound Designers Think Like Mixers and Speak Volumes Through Well Organized Work. He's laid out the golden rules of sound editorial layout in an easy-to-follow manner, and I highly recommend reading both posts before this one.
But even the clearest rules can be misinterpreted, and scenarios that seem like exceptions can often arise. Even the most seasoned editor will encounter situations where he or she will wonder, "How do I know if this is the best layout?" Here, I want to address some common pitfalls that I've seen and help you solve them.
Our first Glossary of Sound Effects post was so popular we decided it would be fun to expand on it. This time around we not only included more specific search terms, but also a handful of modifiers.
As sound editors and designers, it’s always fun to talk about the techniques and tools we use to create out-of-this-world effects. At Boom Box, we’re often teaching each other new plug-ins to broaden our “sonic toolbox” and take our work to new heights. All of these tools and tricks-of-the-trade are necessary for us to do our job, but it’s important to remember that our job is that of a storyteller. Everything we create (however we choose to create it) must support, and perhaps elevate, the storyline. In my personal experience, I have found that the quality of my work shines when I allow the story to guide my decisions, specifically when editing “toony” effects, backgrounds, and design.
We have been meeting with a lot of candidates lately, both for our internship program as well as to bulk up our freelance roster. In addition to sitting down for a chat or looking over resumes, Kate and I are reviewing a lot of work. Whether editors are aware of it or not, the work in these sessions speaks a lot to their experience level. I've written previously about how to properly present your work with the mixing endgame in mind. However, I haven't yet touched on a topic that time and again seems to need further discussion: how to properly cut backgrounds. Not so much on a technical level (when it comes to how we like to see backgrounds cut, Jessey Drake has already created a great practical guide right here on this blog); it's more an issue of what constitutes a background, an ambience, or simply another sound effect. It seems like such a simple thing, but being able to distinguish these from one another, and thus properly lay out these sounds, seems to be the dividing line between an experienced editor and a novice. Here are some tips on how to be sure your backgrounds are an asset rather than a liability.
Elastic Audio: the myth, the hidden tool, the treasure.
As you might have read in our previous blog post How Do Ears Work?, our brains use our ears to derive sounds from detected frequencies. These frequencies are naturally occurring vibrations that enter our ears, where they are then processed into what we perceive as sounds. But what exactly are these frequencies? And how do they work?
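As a rough illustration (a hypothetical sketch, not something from the original post): a pure tone is simply a vibration that repeats at a single frequency, measured in cycles per second (Hz). A few lines of Python can show what those vibrations look like when represented as digital audio samples:

```python
import math

SAMPLE_RATE = 44100  # samples per second, a common audio rate

def sine_tone(freq_hz, duration_s, amplitude=0.5):
    """Generate the samples of a pure tone: one vibration at a single frequency."""
    n_samples = int(SAMPLE_RATE * duration_s)
    return [
        amplitude * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
        for t in range(n_samples)
    ]

# A 440 Hz tone (concert A) vibrates 440 times every second.
tone = sine_tone(440, 0.01)
```

Higher frequencies pack more of these cycles into each second and are heard as higher pitches; real-world sounds are just many such vibrations layered together.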
A few weeks ago, we wrote a blog post about how the human ear works, and that inspired me to dive deeper into the section about the brain; specifically, psychoacoustics. The study of psychoacoustics, as defined by Merriam-Webster, is “a branch of science dealing with the perception of sound, and the sensations produced by sounds.” Essentially, psychoacoustics is how your brain perceives sound, and if used correctly, it can be an incredibly powerful tool in a sound designer’s arsenal.
Sound is an essential part of all of our lives. It allows us to communicate with others via speech, it helps us to sense imminent danger, and it affords us the enjoyment and entertainment of music. But, how does sound make its way from vibrations in the air to our own auditory perception which we can easily identify and translate? Our bodies are miracles of science, and the answer to that question is fascinating.
At Boom Box Post, we are always doing our best to meet new content creators who are just beginning their professional journey. Not only are their projects incredibly fun and inventive, but we often get to walk them through the process of post-production sound for the first time. For even the most seasoned artists, writers, or producers, this can be daunting territory the first time around.
The following is a primer designed to introduce new content creators to post-production sound. It's an incredibly fun process and the final step in creative storytelling before your content reaches viewers.
The study of how our ears and brain interact in response to sound is called psychoacoustics, or sound perception. As audience members, we can perceive a sound as a pleasing experience, an unpleasant one, or anywhere in between. But this perception isn't formed merely by using our ears. The connections between our ears, brain, and nervous system let us feel the effects of sound with our entire body. This concept of physically hearing and psychologically perceiving sound helps to connect us to the television show, movie, or video game we might be enjoying.
We were so excited to give a talk at this year's Creative Talent Network Animation Expo in Burbank. The talk started with a brief history of sound for animation (a lot of which you can find expertly boiled down here) followed by an overview of the post sound process from beginning to end. We finished up with some video demos of the different layers of sound in our work as well as some of the fun instruments and props we have recorded over the years.
We hoped the panel would prove interesting to content creators looking for information on how to approach the sound process for their own work. To our pleasant surprise (this was our first time doing this after all) the turnout was incredible! The room was filled to capacity and we were bombarded with fantastic questions from a very energetic crowd.
In the 1920s and 1930s, recording equipment was extremely large and heavy, rendering it impossible to take outside of the studio. Unable to record sound effects in the real world, the studios were forced to invent new approaches to creating sound for their animated content. Thus, two different approaches to sound effects were quickly developed.
With the recently released Star Wars: The Force Awakens trailer smashing existing viewing records, and crashing sites like Fandango due to a rush for pre-sale tickets, it is no secret that the hype is strong with this one. On December 18th of this year, hordes of people will be heading to the theaters to witness the newest addition to the Star Wars universe.
Diehard fans know there is a lot to look forward to, but there is a new addition to the Star Wars universe that is easily overlooked: Dolby Atmos. Most theaters still show films in 5.1, but with Atmos becoming increasingly popular as part of a premium film experience, it is worth noting how far technology has come since the first Star Wars film in 1977. Therefore, I would like to focus this week’s blog post on the evolution of mixing formats and how they impact the audience experience.