Sound Effects
Sound effects are the icing on the cake, but in the bizarre world of sound, this icing has the same nutrients found in fruit and vegetables. They have the sexy allure of bared skin at the beach but somehow cover the same skin with the thickest wool in a blinding snowstorm. Sound effects are life, death, and everything in between.
We can think of sound effects in two distinct categories: hard effects and ambience.
Hard effects (SFX) are the foreground of the aural narrative. They are prominent in the mix and are defined and distinct. Another track of thought is to think of a hard effect as an object that generates its own sound, often without direct human interaction. An alarm clock RINGS. A tyrannosaurus rex ROARS. A grenade EXPLODES.
Hard effects can be anything and everything that we hear in the world. Oftentimes in dramatic narratives, hard effects are machines, animals, creatures, vehicles, weapons, electronics, and any object typically larger than a human being. But let us not forget sounds that we hear on a daily basis: doors, smartphone ringers, televisions, and water faucets. They are all in the foreground and work with the dialog to carry the action from scene to scene.
These foreground effects can be recorded in the studio or in the field (which is most likely where you'd find recordists capturing gunshots, explosions, and other exotic effects). In your case, it will probably be a healthy mix of both studio and field recording, and depending on the nature of the sound itself, hard effects can usually be recorded in mono (single channel).
Ambiences on the other hand are the background, the rest of the world that resides in the blur beyond the reaches of focus. Stop for a moment. Turn off the radio and the television and any other foreground noise . . . then, just listen. Whatever sound that remains is your ambience. It could be the hum of the refrigerator, the traffic outside your window, or the ring of crickets in your backyard.
Though these sounds are quite different from one another, they are, in fact, one and the same. They provide an atmospheric, contextual setting and a subtle form of completeness that soothes the nerves. Let's think about film ambiences for a moment. The next time you watch a movie or television program, focus past the dialog and listen to the layers underneath (and they are almost always there). If you were to remove all of the ambiences from a motion picture soundtrack, you would feel it instantly, for ambience fills in all the gaps of audio that so comfort us in the real world.
The beauty of ambiences is that at times, they can creep into the foreground to establish a setting. When first entering a beach environment, the crash of waves might be prominent in a mix to introduce the new environment but later the waves settle into the background when the action or dialog picks up. But remember, despite the change in volume, it's always there.
Ambience can also be a character, infusing a sense of elation, mystery or terror. The horror and science fiction genres are particularly known for their use of ambiences to establish mood.
In terms of actual sound gathering, often the best ambiences are found out in the real world (and to complete the immersive effect, recorded in stereo). While recording a particular sound effect might take a few seconds' time, ambient recordings will often have takes that are minutes, dozens of minutes, or even hours in length. These recording sessions require perfect silence from the recordist so that a clean ambient loop of sound can be established. Out of an hour of recorded material, maybe we can gather only ninety perfect seconds of a particular ambience, but that means our loop is ninety seconds long, which is much easier to edit further than a loop that is only nine seconds.
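One common way to turn such a take into a seamless loop is to crossfade its tail into its head. Here is a minimal numpy sketch of the idea, with seeded synthetic noise standing in for a real field recording; the sample rate, fade length, and all variable names are illustrative assumptions, not a prescribed workflow.

```python
import numpy as np

SR = 48_000  # sample rate (Hz), an assumption

def make_seamless_loop(ambience: np.ndarray, fade_s: float = 2.0) -> np.ndarray:
    """Crossfade the tail of an ambience take into its head so the result
    loops without an audible click at the seam."""
    n = int(fade_s * SR)
    head, tail = ambience[:n], ambience[-n:]
    # Equal-power curves keep perceived loudness steady across the seam.
    t = np.linspace(0.0, np.pi / 2, n)
    seam = tail * np.cos(t) + head * np.sin(t)
    # Seam first, then the untouched middle: the end now flows into the start.
    return np.concatenate([seam, ambience[n:-n]])

# Synthetic noise stands in for a ninety-second recorded take.
take = np.random.default_rng(0).normal(0.0, 0.1, 90 * SR)
loop = make_seamless_loop(take)  # two seconds shorter than the take
```

The longer the usable take, the longer the fade you can afford, which is part of why a ninety-second loop is so much more forgiving than a nine-second one.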
Creating Sound Objects
Synchronization of Image and Sound
To fortify the illusion of the story world, sound is most often synchronized with its expected accompanying image to create the virtual reality. The saying, "See a dog; hear a dog," refers to hearing exactly what we are seeing, forming a redundancy that we normally experience in the real world.
Breaking this image-sound connection -- letting the dog bark offscreen -- can create an atmosphere: say, a neighborhood fight. This flexibility allows the sound designer freedom to interpret the scene and offer the audience more insight than with pure synchronization alone.
A more unusual action is to synchronize the dog's barking with an unexpected image. If a character is gesticulating with exaggerated mouth movements but we hear a dog bark instead of any words synchronized with the lips, this produces a strange, maybe funny, and perhaps threatening effect.
You can make a point of synchronization in several ways. An unexpected double break in both the audio and visual tracks can jolt the audience with the shock of transitioning into an entirely new scene. Another possibility is for separate tracks to converge after running their own courses and gradually or suddenly fall into sync, at which point a greater relatedness emerges between the two sound sources. This can happen within certain audio elements as well as between effects, music, dialogue, and ambience. A third option creates emphasis through physical punctuation: the image cuts to a close-up as the sound grows louder, or a grand orchestral theme accompanies an epic wide shot. Finally, the meaning of a word, a particularly emotional musical chord, or a sound effect imbued with storytelling can strengthen the meaning of the image it synchronizes with.
A "false sync point" does not fulfill the anticipation of matching image and sound. This can happen when, for example, a gun is pointed at someone's head and the camera cuts to another image with the sound of the shot. Our minds fill in this synchronism that doesn't exist on the screen, making our involvement even more intimate because we are participating with the action internally.
Here is an example of how physics complicates synchronization. The speed of sound is much slower than that of light, so thunder reaches us much later than lightning, and the crack of a baseball bat is heard in the outfield bleachers well after the ball is hit. Convention tells us to put all sounds in exact sync with the image, but recognizing the physics of light and sound can sometimes better serve filmic reality. The decision to follow convention or physics depends on the story and the emotional impact of the scene.
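To put a rough number on that delay, the arithmetic is just distance over the speed of sound, converted to frames. A minimal sketch follows; the 24 fps frame rate and the 120 m distance are illustrative assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C
FPS = 24                # film frame rate, an assumption

def sound_delay_frames(distance_m: float) -> float:
    """Frames between seeing an event and hearing it from a given distance
    (light's travel time is negligible by comparison)."""
    return distance_m / SPEED_OF_SOUND * FPS

# Bleachers 120 m from home plate: the crack arrives roughly 8 frames late.
print(round(sound_delay_frames(120.0), 1))  # prints 8.4
```

At that distance the offset is a full third of a second, easily large enough for an audience to notice if the story calls for realism.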
Multiple Meanings of Sound
A single sound can have different meanings depending on its visual context. A sigh could be the last breath on someone's deathbed or a pleasant recognition of a cute baby. A car horn can indicate imminent danger or celebration of a hometown victory. A hiss might come from a rattlesnake or a kettle boiling. This ambiguity is extremely fertile ground for creating sound effects from unusual sources.
You can develop another level of sophistication in storytelling with a "sound arc," which is a single type of sound progressing throughout the entire film. Each time it occurs, the context may shift according to the emotional or character development so that the sound actually represents a flow in the drama.
The intention of a clock ticking, for example, might range from control or anxiety to danger or horror.
The impact or intention of the sound may shift over the course of the film. In The Blair Witch Project, the sounds are quite realistic and linked to the characters' identifiable physical environment until the evening brings unknown noises from the dark woods. Furthermore, when they reach the house at the end, the ambience tracks go completely wild, adding to the full-blown fear of their experience.
The Language of Sound Imagery
We use words to symbolize things, events, and descriptions of the physical world as well as our subjective, emotional states. Sounds can represent more than just the image onscreen.
If we take, for example, the sound of a scream and use it beyond its literal link to a wide-open human mouth, here are some possibilities available within the language of sound imagery.
Simile occurs with the acoustic similarity of two sounds, such as the scream and a siren.
Hyperbole is an obvious and intentional exaggeration, like the scream paired with an alarm clock.
Metaphor suggests comparison of the sound with an idea: the scream with a blinking red light, perhaps.
Allegory represents the abstract through concrete -- for example, in The Shout, a scream is mysteriously held until the climax.
Irony contrasts least-expected opposites, such as the scream with a smile.
Paradox is an apparent contradiction that may express an inner truth -- for example, a scream seeming to come from a cigarette.
Finally, vivification occurs when living traits are ascribed to an inanimate object -- the scream from a doormat, for instance.
These forms of sound language make possible a subtler yet potentially powerful expression of the theme or emotion in the film. They may be more effective if the audience is familiar with the references, so consider your viewers' knowledge base and culture when using these more abstract relationships in your sound design.
Sound Objects
Many times, a single sound that you record or find in a sound effects library is perfect for the moment in the scene. However, a sound that occurs onscreen often consists of a combination of sounds that you will construct. For the sake of this class, we'll call this a sound object.
A sound object consists of two or more distinct sources combined through the gestalt principle of grouping by temporal proximity. In other words, when we hear two sounds simultaneously, they naturally seem to come from the same source. Any separation, even by a fraction of a second, can dispel the illusion of unity.
Adding different sounds to a single sound, such as a door closing, can shift the semantic meaning. Here are some examples of how the image of a door closing can have different meanings when a second sound is added:
A closing door with a gunshot means finality.
The same door sound plus a bell equals an alert.
Layered with a crowd cheer, a closing door could be a celebration.
That door and wailing together imply grief.
A closing door combined with a tiger growl indicates terror.
Layering more sounds adds to the complexity of the waveform and the meaning within the storyline.
In these cases, the mix levels are critical for the final impact, so you must test different ratios of gain on each sound. Which should be more prevalent? This can change during the course of the story to emphasize a transition in the drama.
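As a sketch of testing those gain ratios, the following Python layers two sample-aligned sounds at different relative levels. Seeded numpy noise stands in for hypothetical door and growl recordings; the function name, lengths, and dB values are all illustrative assumptions.

```python
import numpy as np

def layer(primary: np.ndarray, accent: np.ndarray, accent_gain_db: float) -> np.ndarray:
    """Sum two sample-aligned sounds into one 'sound object'. Starting both
    at sample zero keeps them perceptually fused; offsetting the accent by
    even a fraction of a second would split them back into two events."""
    gain = 10 ** (accent_gain_db / 20)  # dB relative to the primary
    n = max(len(primary), len(accent))
    mix = np.zeros(n)
    mix[:len(primary)] += primary
    mix[:len(accent)] += accent * gain
    return mix

# Seeded noise stands in for real recordings of a door slam and a growl.
door = np.random.default_rng(1).normal(0, 0.2, 4_800)
growl = np.random.default_rng(2).normal(0, 0.2, 9_600)

# Test different ratios: a quiet growl hints at terror; a louder one announces it.
subtle = layer(door, growl, accent_gain_db=-12.0)
overt = layer(door, growl, accent_gain_db=-3.0)
```

Auditioning a few such ratios back to back is often the fastest way to decide which element should lead at a given moment in the story.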
The movie Malcolm X, for example, gradually introduces the sounds of gunshots as a political statement. The main character's social activism begins with a burst of fame denoted by popping photoflashes, and as this fame grows, it becomes more dangerous to the status quo. The sounds of flashes then become layered with louder and louder gunshots, portending his assassination because of his provocative politics.
Evolving Sound
Sound is evolving in preproduction on big animated productions like the How to Train Your Dragon movies. In the following NPR interview, Randy Thom describes how he was asked to create vocalizations for each dragon before they were animated, and how this involvement in preproduction in some ways changed the look of the animation. It's a fantastic process, and it is very interesting how they intentionally chose to humanize each dragon.
Creating Sound Events
Storytelling and the Hero’s Journey
At the heart of storytelling is the classic structure of goal-conflict-resolution. In its most contributive form, sound design supports this dramatic evolution by developing themes and rhythms to underscore the narrative structure. Some examples in myth, music, and film illustrate how this structure works. Later in the course, techniques for creating visual and sound maps of a film will offer practical applications.
The great mythologist Joseph Campbell researched ancient stories of world cultures and found a universal blueprint for human transformation, calling it "The Hero's Journey." This is not the only narrative structure that he found, but it certainly remains valid in today's society and storytelling. We see it in such films as The Wizard of Oz, Star Wars, and Gandhi, and it continues to give us hope against all odds that we will succeed in our own quests. The structure consists of three basic acts:
Discontent or imbalance with the home community
Journey into the unknown in search of a treasure or solution, with many battles along the way
Return to the community with something of value as well as an internal transformation of character and story resolution
Story Structure in Music
You can also find the three-act structure in classical music -- in particular, the sonata form. During the Classical and Romantic periods, the sonata form was developed with a distinctive architecture relating to a beginning, middle, and end, also typical of dramatic storytelling. Two thematic ideas are presented in the exposition, and then these are brought into tension in the development through key changes with transforming or opposing themes. In the recapitulation, the composer brings us back to the opening themes and "home" key, closing out the whole piece.
In emphasizing tension and release, change and resolution, the sonata form mirrors the emotional lives of the listeners. They are able to take the nonverbal expedition that is reflected archetypically in the Hero's Journey, in which the protagonist must begin with a set of known themes, virtues, or goals; confront opposition to reaching these goals with both inner and outer conflict along the way; finally find a resolution; and return to home with hard-won treasure and personal transformation. The contradictory pressures and relief of tension are the same that we experience in life, society, and all great storytelling.
Anticipation
The foundation of listening is anticipation: seeking patterns and variations within expectation. This principle of music theory coincides with dramatic storytelling structure, whether in comedy, thriller, or action-adventure. We try to figure out what will happen next and revel in the surprising twists and turns of the plot. These twists ultimately bring us to a climax we are all expecting, even if we don't know the exact outcome.
When we anticipate a musical or dramatic event, we test our hypothesis, allowing for adjustments in our next anticipations. These are the deep relationships that create not only the structure but also the emotion.
Imagine that you need to pay for a movie ticket and you think you have a 10-dollar bill in your wallet. If that is correct, there is no emotion in the act. But if you only have five dollars, that could make you pretty upset; if you find 50 dollars, you would be pleasantly surprised. Emotion emerges in music and storytelling when anticipation is not met.
The distance from the anticipated outcome can affect the audience as well. If the deviations are too big, the structure becomes incoherent and senseless. If the anticipation is always met or there is too little change, then the result is boring or mechanical.
Contradicting Expectation
Meaning arises in music, story, and film when expectation is contradicted. Setting up a joke with a 1-2-3 punch line is a classic twist of expectation, as in the satire of films like Airplane! or Scary Movie. Beethoven developed a theme toward a logical point and then flipped it in a new direction, drawing the audience all the way to the end of a movement before resolving the structure. The contradiction occurs between the conventional, expected structure based on genre -- also considered the gestalt background for the audience -- and the individual story or musical structure, or the gestalt foreground, that actually appears.
The clearest tension combines maximum contradiction with maximum unification between these background and foreground elements. In other words, arousing unfulfilled expectations, heightening suspense, and postponing resolution are all essential to build this tension so that the ending has emotional value.
Violating one moment can intensify the next anticipated and fulfilled one. A classic example: in the horror film, the small fright of a bat screeching followed by the real threat of the howling monster. This pleasure mechanism derived from the unexpected has its basis in our neuronal makeup.
Sound Events
A series of sounds heard sequentially in time can give us a huge amount of information about the world and the dynamics that exist among people, things, and events. The relationships are defined by action-reaction and cause-effect, so the order of these sounds is critical in determining the meaning of the event. For example, if you hear a slap and then a cry, you can assume that someone has been hit and is crying because of the pain. If the order reverses, you might imagine that the crying elicits the slap, which induces the person to stop crying.
In film, the picture most often dictates the order of events, and the sound accompanies this order. But there are many opportunities to use sound as an off-screen element or in the "I Think," "I Know," or "I Don't Know" sound spheres. In this case, the sound is unlinked to the image and free to create its own story.
In this course, we are deconstructing the listening and creative process into small pieces, such as grammatical structure and rules. In your professional sound design work, these elements will not be artificially separated. So consider the definition of the sound event as serving just this learning environment, and practice this approach in isolation from the sound object or the use of plug-ins so that you master this part of the sound design grammar.
Earthquake Example
An earthquake is an example of a series of sound events that add up to a dramatic narrative sequence rather than a simple boom and crash. It can be portrayed in four separate parts, with a few seconds of pause between each.
First you hear a low rumble and shuddering; it grows for a second or two. Then a couple of pottery crashes mix with louder rumbling. Next, a sudden sliding, crashing sound can be achieved by, say, dropping a quantity of small stones on the sloping lid of a cardboard box with a glass jar at the bottom and dropping the pitch an octave or two. Finally, the rumbling noises come in again and then fade out to zero.
You can obtain an effective brooding sensation in between the earthquake sounds with faint, distant voices while laying panic noises like screaming and shouting behind the third, "falling debris" section.
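The four-part earthquake above can be sketched as a simple sequencer that concatenates parts with pauses between them. Seeded numpy noise bursts stand in for the actual recordings, and the two-second gaps, sample rate, and levels are illustrative assumptions.

```python
import numpy as np

SR = 48_000  # sample rate (Hz), an assumption

def silence(seconds: float) -> np.ndarray:
    return np.zeros(int(seconds * SR))

def sequence(parts: list, gap_s: float = 2.0) -> np.ndarray:
    """Concatenate sound-event parts in order, with a pause between each.
    Order is the point: the same parts reversed would tell a different story."""
    out = []
    for i, part in enumerate(parts):
        if i:
            out.append(silence(gap_s))
        out.append(part)
    return np.concatenate(out)

rng = np.random.default_rng(3)
rumble = rng.normal(0, 0.05, 2 * SR)   # 1: low rumble and shuddering grows
crashes = rng.normal(0, 0.20, 1 * SR)  # 2: pottery crashes over louder rumbling
debris = rng.normal(0, 0.40, 1 * SR)   # 3: sudden sliding, falling debris
fadeout = rng.normal(0, 0.05, 2 * SR)  # 4: rumbling returns, then fades to zero
quake = sequence([rumble, crashes, debris, fadeout])  # twelve seconds total
```

In practice each part would itself be a layered sound object (rumble plus glass, stones on cardboard pitched down an octave, and so on); the sequencer only supplies the narrative order and pacing.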
Storytelling and the Hero’s Journey
At the heart of storytelling is the classic structure of goal-conflict-resolution. In its most contributive form, sound design supports this dramatic evolution by developing themes and rhythms to underscore the narrative structure. Some examples in myth, music, and film illustrate how this structure works. Later in the course, techniques for creating visual and sound maps of a film will offer practical applications.
The great mythologist Joseph Campbell researched ancient stories of world cultures and found a universal blueprint for human transformation, calling it "The Hero's Journey." This is not the only narrative structure that he found, but it certainly remains valid in today's society and storytelling. We see it in such films as The Wizard of Oz, Star Wars, and Gandhi, and it continues to give us hope against all odds that we will succeed in our own quests. The structure consists of three basic acts:
Discontent or imbalance with the home community
Journey into the unknown in search of a treasure or solution, with many battles along the way
Return to the community with something of value as well as an internal transformation of character and story resolution
Story Structure in Music
You can also find the three-act structure in classical music -- in particular, the sonata form. During the Classical and Romantic periods, the sonataform was developed with a distinctive architecture relating to a beginning, middle, and end, also typical of dramatic storytelling. Two thematic ideas are presented in the exposition, and then these are brought into tension in the development through key changes with transforming or opposing themes. In the recapitulation, the composer brings us back to the opening themes and "home" key, closing out the whole piece.
In emphasizing tension and release, change and resolution, the sonata form mirrors the emotional lives of the listeners. They are able to take the nonverbal expedition that is reflected archetypically in the Hero's Journey, in which the protagonist must begin with a set of known themes, virtues, or goals; confront opposition to reaching these goals with both inner and outer conflict along the way; finally find a resolution; and return to home with hard-won treasure and personal transformation. The contradictory pressures and relief of tension are the same that we experience in life, society, and all great storytelling.
Anticipation
The foundation of listening is anticipation: seeking patterns and variations within expectation. This principle of music theory coincides with dramatic storytelling structure, whether in comedy, thriller, or action-adventure. We try to figure out what will happen next and revel in the surprising twists and turns of the plot, which ultimately bring us to a climax we are all expecting, even if we don't know the exact outcome.
When we anticipate a musical or dramatic event, we test our hypothesis, allowing for adjustments in our next anticipations. These are the deep relationships that create not only the structure but also the emotion.
Imagine that you need to pay for a movie ticket and you think you have a 10-dollar bill in your wallet. If that is correct, there is no emotion in the act. But if you only have five dollars, that could make you pretty upset; if you find 50 dollars, you would be pleasantly surprised. Emotion emerges in music and storytelling when anticipation is not met.
The distance from the anticipated outcome can affect the audience as well. If the deviations are too big, the structure becomes incoherent and senseless. If the anticipation is always met or there is too little change, then the result is boring or mechanical.
Contradicting Expectation
Meaning arises in music, story, and film when expectation is contradicted. Setting up a joke with a 1-2-3 punch line is a classic twist of expectation, as with the satire in such films as Airplane or Scary Movie. Beethoven developed a theme toward a logical point and then flipped it in a new direction, drawing the audience all the way to the end of a movement before resolving the structure. The contradiction occurs between the conventional, expected structure based on genre -- the gestalt background for the audience -- and the individual story or musical structure that actually appears, the gestalt foreground.
The clearest tension combines maximum contradiction with maximum unification between these background and foreground elements. In other words, arousing unfulfilled expectations, heightening suspense, and postponing resolution are all essential to build this tension so that the ending has emotional value.
A violated expectation in one moment can intensify the next one that is anticipated and fulfilled. A classic example from horror films: the small fright of a screeching bat, followed by the real threat of the howling monster. This pleasure mechanism derived from the unexpected has its basis in our neuronal makeup.
Sound Events
A series of sounds heard sequentially in time can give us a huge amount of information about the world and the dynamics that exist among people, things, and events. The relationships are defined by action-reaction and cause-effect, so the order of these sounds is critical in determining the meaning of the event. For example, if you hear a slap and then a cry, you can assume that someone has been hit and is crying because of the pain. If the order reverses, you might imagine that the crying elicits the slap, which induces the person to stop crying.
In film, the picture most often dictates the order of events, and the sound accompanies this order. But there are many opportunities to use sound as an off-screen element or in the "I Think," "I Know," or "I Don't Know" sound spheres. In this case, the sound is unlinked to the image and free to create its own story.
In this course, we are deconstructing the listening and creative process into small pieces, much like grammatical structures and rules. In your professional sound design work, these elements will not be artificially separated. So treat the definition of the sound event as a device for this learning environment, and practice it in isolation from the sound object and from plug-ins so that you master this part of the sound design grammar.
Earthquake Example
An earthquake is an example of a series of sound events that add up to a dramatic narrative sequence rather than a simple boom and crash. It can be portrayed in four separate parts, with a few seconds of pause between each.
First you hear a low rumble and shuddering; it grows for a second or two. Then a couple of pottery crashes mix with louder rumbling. Next, a sudden sliding, crashing sound can be achieved by, say, dropping a quantity of small stones on the sloping lid of a cardboard box with a glass jar at the bottom and dropping the pitch an octave or two. Finally, the rumbling noises come in again and then fade out to zero.
You can create an effective brooding sensation between the earthquake sounds with faint, distant voices, and lay panic noises such as screaming and shouting behind the third, "falling debris" section.
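The four-part sequence above can be sketched as a timeline of layers mixed into one buffer. This is a hypothetical numpy illustration only -- sine bursts stand in for recorded material, and the frequencies and timings are made up -- not the Adobe Audition workflow used in the assignment. The octave drop works by stretching the samples, the way slowing a tape lowers pitch.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def layer(freq, dur, amp=0.5):
    """Sine burst as a stand-in for a recorded layer (rumble, crash, ...)."""
    t = np.arange(int(SR * dur)) / SR
    return amp * np.sin(2 * np.pi * freq * t)

def drop_octaves(sig, octaves=1):
    """Lower pitch by stretching the signal: doubling its length
    per octave halves every frequency, like slowing a tape."""
    factor = 2 ** octaves
    idx = np.arange(len(sig) * factor) / factor
    return np.interp(idx, np.arange(len(sig)), sig)

def place(mix, sig, start_sec):
    """Mix a layer into the master buffer at a given start time."""
    start = int(start_sec * SR)
    end = min(start + len(sig), len(mix))
    mix[start:end] += sig[:end - start]

# Four parts with pauses between them (illustrative values)
mix = np.zeros(SR * 12)
place(mix, layer(40, 2.0), 0.0)                              # 1. low rumble grows
place(mix, layer(45, 2.0) + layer(900, 2.0, 0.2), 3.0)       # 2. pottery crashes over louder rumble
place(mix, drop_octaves(layer(700, 1.0, 0.4), 2), 6.0)       # 3. sliding debris, pitched down two octaves
place(mix, layer(40, 2.0) * np.linspace(1, 0, SR * 2), 10.0) # 4. rumble fades out to zero
```

In a real session you would of course record each layer (stones on a box lid, pottery breaks) and arrange them on separate tracks; the point here is only the structure of the event: ordered layers, pauses, a pitch drop, and a fade.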
Assignment - Spotting for Sound Effects
Film Project
Download: sfx_spotting_sheet.doc (DOC, 34 KB)
Download: dab_window_burn.zip (ZIP, 76,884 KB)
Today you will be spotting a scene from the SAGU cinematic production of Drawing a Blank for Sound Effects.
- Download the SFX Spotting Sheet above
- Use the Drawing a Blank video clip with SMPTE window burn (the zipped file above contains an Adobe Audition project already set up for you)
- Begin filling in the required information in the spotting sheet
- After completing the spotting sheet, search online databases for SFX, or check out the recording equipment needed and begin recording the needed SFX
- This SFX Spotting Sheet will be due on the final day of class when you submit your Film Project
Trailer Project
Download: sfx_spotting_sheet.doc (DOC, 34 KB)
Today you will also be spotting a movie trailer or video game trailer of your choice for Sound Effects.
- Download the SFX Spotting Sheet above
- Use a movie/video game trailer video clip of your choice with SMPTE Window Burn
- Begin filling in the required information in the spotting sheet
- After completing the spotting sheet, search online databases for SFX, or check out the recording equipment needed and begin recording the needed SFX
- This SFX Spotting Sheet will be due on the final day of class when you submit your Trailer Project