
Not So Scary – Modern Horror Movies and the Lack of Genuine Scares

 

Horror movies have been around since the very beginning of cinema. From F.W. Murnau's classic vampire flick Nosferatu (1922) to Universal Studios' monster movies like Dracula (1931) and Frankenstein (1931), audiences have made watching scary films a long-standing tradition. And, like most other genres, horror has grown and evolved with the times, satisfying the changing tastes of its audiences. In the '50s, we saw the rise of the sci-fi monster movie, and in the '60s and '70s, "schlock" horror became popular, thanks to relaxed restraints on acceptable on-screen violence. It is a genre that has more or less stayed strong in every decade and is more adaptable than most other genres of film. But in recent years, I have noticed a severe drop-off in horror movies that actually leave a mark. It seems that today, studios are more interested in quantity over quality, and it's a trend that is having a negative effect on the genre as a whole. My belief is that studios are using the horror genre as a way to generate a quick influx of cash, knowing that there is a built-in audience of people who will watch horror movies no matter what they are. That's why you see so many horror films quickly drop off after their opening weekend. There seems to be a belief nowadays that you can pass something off as a horror movie if it has one or two big scares, but the reality is that the best horror films don't always rely on things that make us jump out of our seats.
What makes a great horror movie is the use of atmosphere. This has been the case since the very beginning, back when cinema was still silent. F.W. Murnau's silent masterpiece Nosferatu shows exactly how atmosphere can be used to signify terror. In the movie, we see how simple staging and effective use of shadows can be put to terrifying effect. The vampire Count Orlok, played by actor Max Schreck, is able to strike at his victims using just his shadow, an image created simply through the movie's lighting, yet done with chilling effectiveness. Early Hollywood horror films likewise made great use of atmosphere. If you look at a movie like Dracula, there is actually very little on-screen violence present. Instead, the film creates a feeling of dread through the gloomy atmosphere of the vampire's castle. Thanks to that, and Bela Lugosi's iconic performance, you don't need to see the bloodletting of Dracula's victims in order to be scared. This has helped give these movies lasting power over so many years. It's amazing that movies made in the early days of cinema can still be scary, given all the limitations they had. And given all the bad things that have happened to movie vampires in recent years (I'm looking at you, Twilight), I'm glad that Lugosi's version of the Count can still create a chill.
Understandably, the horror genre has had to grow and evolve with the times in order to survive, but for many years there was still an emphasis on atmosphere at play. The more rebellious era of the '70s allowed for more on-screen violence, and while many filmmakers perhaps went a little overboard in this period, there were a few who actually made an impact. Dario Argento created films that were not only gory but also artistically staged, like The Cat o' Nine Tails (1971), Deep Red (1975) and the very twisted Suspiria (1977), which showed how atmosphere could still be used to enhance the gore on film. Director George A. Romero likewise used atmosphere effectively in a sub-genre of horror that he helped create: the zombie flick. Despite the fact that these directors were given more leeway to do what they wanted, what made their early work so effective was how they showed restraint. You can show a lot more in horror movies nowadays, but sometimes what remains unseen becomes the scariest element, and that's why films of this era managed to be effective. The filmmakers knew when to be shocking and when to show restraint, based on what the horror movies that inspired them had done in the past. But as generations of filmmakers become more desensitized to what can be allowed in a horror movie, that sense of restraint goes away.
The problem I see in most modern horror movies is that there is no self-restraint left in them. For the most part, the filmmakers choose to throw atmosphere out the window in favor of "jump scares." A "jump scare" is when something suddenly pops onto the screen out of nowhere in an attempt to make the audience scream and jump all at the same time, usually accompanied by a loud music cue to maximize the effect. A "jump scare" can work when it is used sparingly, but too many films today overuse it, which diminishes its effectiveness over time. One of the best examples of a jump scare is actually in a film that you would consider more of a thriller than a horror movie: Jaws (1975). The scene in question is when scientist Hooper (Richard Dreyfuss) is investigating a shark attack on a fishing boat at night. While he is examining the hole in the bottom of the boat, a severed head pops out suddenly, creating a genuine scare for both him and the audience. This scene is effective because it is unexpected and is built up by the atmosphere of the moment. Also, it is one of the few times that director Steven Spielberg actually uses a "jump scare" in the movie. The fewer times it happens, the more effective it is, and unfortunately that's a lesson few horror filmmakers today understand. When you use a technique too many times, it becomes tiresome and audiences become more aware of it. Unfortunately, too many filmmakers get carried away and have too much fun creating these kinds of "jump scares."
One other problem I have noticed with modern horror films is the over-abundance of CGI. While computer effects can sometimes be helpful in a horror film, like making it look like a character has lost a limb or manipulating an environment in a way that defies physics, there is a larger problem of effects work making moments that should be scary less so. The problem is that most computer effects look too artificial. Of course, when you see puppetry and prosthetic work used in horror movies, they are far from realistic too, but those effects are at least physical in nature and actors can still interact with them. When you see a horror movie use CGI too much, you just know that the actors are reacting to nothing but a green screen effect. A recent movie like Mama (2013) loses all of its effective chills when you see the digital apparition in it appear. This is more apparent in smaller-budget horror films, which you can somewhat excuse due to budget limitations. But when a bigger-budget horror film, like the upcoming Carrie remake, looks so pathetic because of overdone CGI effects, then you begin to see how digital imagery has a negative effect on the genre. Even a good horror film like World War Z suffered from some unnecessary CGI work, which had the unfortunate effect of making the zombies less frightening. If ever there was a place where I wish horror filmmakers would show more restraint, it would be here.
One other problem I see plaguing the horror genre is the lack of original ideas. Today we are seeing an overabundance of the same kinds of ideas used over and over again. Seriously, how many haunted house movies do we need? Not only that, there are far too many remakes and sequels in the horror genre. Do we really need seven Saw movies and four Paranormal Activities? Horror sequels have become so absurdly common that we get ridiculous titles like The Last Exorcism Part II and The Haunting in Connecticut 2: Ghosts of Georgia as a result; and yes, that second title is real. I see it as commerce taking precedence over artistic vision, and the fact that film studios are more likely to invest in something already established than in something new. Every now and then, you do see a movie with a fresh idea come about, like Paranormal Activity in 2007, but even that was driven into the ground with too many follow-ups and diminishing returns.
Remakes are also a negative factor in horror movies today. What you usually see in these horror remakes are films that get rid of all the atmosphere of the originals in favor of upping the gore factor and the scary bits, just because filmmakers now have the ability to show what could only be implied in the past. The problem with this is that it completely misses the point of what made the original films so effective in the first place. A particular example is the terrible remake of John Carpenter's The Thing, which loses all of the substance of the original in favor of just making the film as gory as possible. Gore does not equal scary. Filmmakers like Carpenter knew that, and that's why they used gore sparingly. The sad thing is that remakes try to one-up the originals because the tools today are so much better, but they fail miserably every time.
Thankfully, despite Hollywood's attempts to push the horror genre into more exploitative territory, the classics still hold up all these years later. Even a 90-year-old film like Nosferatu still gives audiences chills to this day. And I think it all comes down to atmosphere. It's like how people tell ghost stories around a campfire. Would you rather listen to the story that builds up to a chilling ending that'll leave you with nightmares, or the one that gets caught up in the gory details and then just ends without a payoff? That's what's being lost in horror movies today. The classics knew how to build their stories around scary ideas, and not just the imagery. The Twilight Zone became popular on television because it presented us with unsettling scenarios that made us anxious the longer we thought about them. Not once did we see the monster on the wing of the plane attack William Shatner in the famous episode; it was the frightening possibilities of what could happen, along with Shatner's paranoid performance, that made the episode scary. The best horror movies have staying power because they knew that their audiences had imaginations capable of filling in the gory details that remained unseen.
So, is horror a dying genre? Of course not. There is an abundance of terrible horror movies out there, but that's only because the market has been flooded. Every now and then, a fresh new idea comes along and not only makes an impact, but also goes on to influence the genre as a whole. One thing I would like to see an end to in the horror genre is the over-abundance of terrible remakes. Just looking at the new Carrie remake trailer makes me laugh, because it takes everything that worked in the original and makes it less subtle. I believe it strongly: CGI, and shaky-cam for that matter, are making horror films less frightening. They are showy techniques that ruin the atmosphere needed for a good horror movie, and I wish more filmmakers would show restraint. I've generally stayed away from recent horror films because of this, and the horror movies I gravitate towards are ones that have been around a long time. If you're wondering which one I consider my favorite, it would be Stanley Kubrick's The Shining (1980). Talk about a film that makes the most out of its atmosphere. I hope that other horror filmmakers take a look at what makes the classics as scary as they are, and learn the effectiveness of restraint. You'd be surprised how far a little scare can go when it's built up well enough.

The Best of the Worst – Why We Have a Good Time Watching Bad Movies

 

If there were ever a place where the word "bad" could be considered a relative term, it would be in the movies. Over the course of film history, we have seen Hollywood and the film industry at large put out an astounding variety of movies, and not all of them have hit their targets the way the filmmakers intended. If you produce hundreds of products in a given year, the odds are that some, if not most, of them are not going to be good. But like most things, one man's trash can be another man's treasure, and that has led to a fascinating occurrence in the film community. Some "bad movies" have actually earned fanbases all their own, finding audiences in some unexpected ways. This has usually been the case with films that built a reputation over time, but nowadays we are seeing intentionally bad movies become phenomenally successful upon their initial releases, as was the case with the premiere of Sharknado on the Syfy channel earlier this summer. How and why "bad movies" find their audiences is still a mystery to many, but what I love about this trend is that it shakes up our preconceived notions about the film industry and makes us reconsider what we find entertaining in the first place.
So, what is it about these "bad" movies that makes them so entertaining to us? The truth is that there is no one thing that defines the success behind these films; usually it's all relative to each individual movie. Sometimes it's the incompetence behind the making of the film that we find so entertaining. Sometimes it's because the film is so out-of-date that it becomes hilarious. Sometimes it's the lack of self-awareness and the ego behind the director's vision. And sometimes it's the filmmakers just not giving a damn what other people think and going all out with their material. The formula has no consistency, and yet we see many films fall into these different categories of "bad" films. Usually the best of these are the ones that fulfill the criteria of a "bad" movie so perfectly that they become memorable and re-watchable. Only in rare cases does this work intentionally; usually the best "bad" films arise from an unexpected accident.
Some of the best "bad" movies have come out of turmoil, which makes their existence all the more fascinating. Usually this applies to movies that were made despite the fact that their filmmakers didn't know what they were doing. One of the most notorious examples of this is the 1966 cult classic Manos: The Hands of Fate. Manos was the creation of Hal Warren, a fertilizer salesman from El Paso, Texas, who made a bet with a screenwriter friend that he could make his own movie without any help from Hollywood. Making good on his wager, Mr. Warren wrote and directed this schlocky horror film centered around a cult leader named the Master, who holds a family hostage in his compound, which is watched over by a lecherous caretaker named Torgo. Hal Warren shot the film with a camera that could only shoot 30 seconds of film at a time with no recorded sound, and most of the movie was shot at night, with the set lighting attracting moths in almost every shot. The finished film is a convoluted mess, and it ended any chance Hal Warren had of pursuing a career in filmmaking. However, many years later, the film was rediscovered by the producers of the show Mystery Science Theater 3000, who featured it on their show and created renewed interest in this odd little film that no one outside of Texas knew about. Manos became a hit afterwards because people were fascinated by how silly this poorly made film was, something the MST3K crew had a hand in. Since then, Manos has earned a reputation as one of the worst films ever made, and that in itself has made it a favorite for people who gravitate towards that sort of thing.
While Manos represents an example of a disaster turned into a success, there are other bad films that have become fan favorites simply by being incredibly dated. These movies make up the majority of what people consider good "bad" films, since most films are a product of their times. Whether people are entertained by their out-of-date nature or merely out of sheer nostalgia, there's no denying that time has a way of changing how we view these kinds of movies. The 1950s have become an era that many film fans find to be full of good trash, mainly due to the rise of the B-movie in this period. Cult hits like The Blob (1958), Creature from the Black Lagoon (1954), Attack of the 50 Foot Woman (1958), and The Thing from Another World (1951) all rode the surge of the sci-fi craze of the post-war years, and while everything about these films, from the visual effects to the acting, feels antiquated today, they still have a camp value that makes them watchable all these years later. The "cheese factor" plays a big role in keeping these films entertaining long after their relevance has diminished. You can see this also in the beach party movies of the early '60s, which are charming despite their paper-thin plots. The one other era that has produced its own distinctive set of dated films is the 1980s, with its collection of dated fantasy pictures and culturally infused fluff, like the He-Man-inspired Masters of the Universe (1987) or the E.T. wannabe Mac and Me (1988). By all accounts, these films should have long been forgotten outside of their era, and yet they have lived on with audiences who still find something entertaining in them.
One of my favorite types of "bad" film is the kind that comes from a complete lack of control by either the director or the performer. There have been some directors who actually gained their reputations as filmmakers by staying within the B-movie community. The most famous of these is Ed Wood Jr., a person whom some have claimed to be the worst director in history. Ed Wood's notable contributions to cinema include the cross-dressing comedy Glen or Glenda (1953), the Bela Lugosi-starring Bride of the Monster (1955), and what many consider the director's "masterpiece," Plan 9 from Outer Space (1959). Whether Ed Wood was earnest in his vision or made his films intentionally bad is still debated, but there is no doubt that Plan 9 is a special kind of "bad": a movie so aggressively cheesy that it is hard not to be entertained by it.
Other filmmakers who were more aware of their B-movie status have still gained an honored reputation with audiences. Roger Corman, a man who prided himself on making movies both fast and cheap, has actually become influential to a whole generation of blockbuster filmmakers. His The Little Shop of Horrors (1960) even inspired a Broadway musical. Also, sometimes an out-there performance can make a "bad" movie worth watching. I would argue that this is the case with most Nicolas Cage films, like Vampire's Kiss (1988) or Ghost Rider: Spirit of Vengeance (2012). One film that has become a cult classic mainly due to a single out-of-control performance is 1981's Mommie Dearest, where Faye Dunaway chews the scenery, in the best way, as a way over-the-top Joan Crawford. Usually a lack of restraint by the filmmakers can sink a film, but these movies prove that it's not always the case.
While many films become cult hits over time, there are a select few that attempt to achieve cult status right away by being intentionally bad. As I stated earlier, Sharknado became an instant hit when it premiered on cable, and having seen the film myself, it's clear that the filmmakers behind it knew exactly what kind of movie they were making. Rarely do you see filmmakers aim for that intentionally "bad" gimmick with their movies, because obviously if audiences don't accept it, then you've just made a bad movie. Director Tim Burton tried to create an homage to B-movie sci-fi with his 1996 film Mars Attacks!, but the film was an odd blend of tongue-in-cheek mockery and earnest storytelling, and the end result doesn't achieve what it set out to do.
But one example of an intentionally bad film that did click with audiences is the campy musical The Rocky Horror Picture Show (1975), a movie that pays homage to campy horror and sci-fi while mixing in '50s rock music and transsexual humor. Rocky Horror tries so hard to be so bad that you would think the whole thing would be a mess; and yet, it remains entertaining and has one of the most dedicated fanbases in the world. I think the reason a movie like Rocky Horror works is that it just doesn't care what people will think about it. It is what it is, and that's why people gravitate to it. It's one of a kind. A movie like Mars Attacks! didn't click as a throwback because it didn't have that same kind of assured belief in itself, and that shows why it is hard to make a bad movie feel good.
When it comes down to it, "bad" movies are usually determined by the tastes of the people who watch them. We have made some of these "bad" movies our favorites because of the value we find in their cheesiness, or because of our fascination with how badly they get things wrong. For a movie to be all-around bad, it has to lack any kind of entertainment value in the end. For those who are wondering, the worst movie I have ever seen, and one I see no redeeming value in, is the 1996 film Space Jam. To me, it was the worst experience I have ever had watching a movie, mainly because I saw it as a blatant, self-serving promotional piece for a sports superstar (Michael Jordan), and it ruined three things on film that I love dearly: Nike, Looney Tunes, and Bill Murray. But I do recognize that the film has its fans, so in the end it all comes down to taste. Still, it is fascinating how our tastes leave room for something as poorly made as a Manos or even the more recent Birdemic: Shock and Terror (2010), a movie that needs to be seen to be believed. There is certainly value in seeing something that we find entertaining, even when it's "bad," and perhaps that is why these films live on the way they do.

Inspired by a True Story – The Process of Showcasing History in Hollywood

 

This week, two very different biopics open in theaters, both ambitious but at the same time controversial. What we have are Ashton Kutcher's Jobs and Lee Daniels' The Butler (you can thank an uptight Warner Bros. for the title of the latter film). Both are attempting to tell the stories of extraordinary men in extraordinary eras, while at the same time delving into what made these people who they are. But what I find interesting is the different receptions that these two movies are getting. Lee Daniels' The Butler is being praised by both audiences and critics (it holds a 73% rating on Rottentomatoes.com at the time of writing this article), while Kutcher's Jobs is almost universally panned. One could argue that it has to do with who's making the movies and who has been cast in the roles, but it also stems from larger lessons we've learned about the difficult task of adapting true-life histories onto film. The historical drama has been a staple of filmmaking from the very beginning of cinema. Today, a historical film is almost always held to a higher standard by the movie-going public, and so it must play by different rules than other kinds of movies. Often it comes down to how accurately a film adheres to historical events, but that's not always an indicator of a drama's success. Sometimes, it may work to a film's advantage to take some liberties with history.
The Butler and Jobs represent the most common form of historical drama: the biopic. In this case the subjects are White House butler Cecil Gaines, portrayed by Oscar-winner Forest Whitaker, and visionary Apple co-founder Steve Jobs. Both are men who hold extraordinary places in history, but in very different ways. Despite the differences in the subjects, it is the history that surrounds them that plays the biggest part in the storytelling. Filmmakers love biopics because they allow them to teach a history lesson while at the same time creating a character study of their subject. Usually the best biopics center around great historical figures, but not always. One of the most beloved biopics of all time is Martin Scorsese's Raging Bull (1980), which tells the story of a washed-up middleweight boxer who was all but forgotten by the public. Scorsese was attracted to the little-known story of boxer Jake LaMotta, and in it he saw a worthwhile cautionary tale that he could bring to the big screen. The common man can be the subject of an epic adventure if his life's story is compelling enough. But there are challenges in making a biopic work within a film narrative.
Case in point: how much of a person's life story do you tell? This can be the most problematic aspect of adapting a true story to the big screen. Some filmmakers, when given the task of creating a biopic of a historical figure, will try to present someone's entire life in a film, from cradle to grave. This sometimes works, as in Bernardo Bertolucci's The Last Emperor (1987), which flashes back to its protagonist's childhood years frequently throughout the narrative. Other times, it works best to focus on one moment in a person's life and use that as the lens for understanding who they were. My all-time favorite film, Lawrence of Arabia (1962), accomplishes that feat perfectly by depicting the years of T.E. Lawrence's life when he helped lead the Arab revolts against the Turks in World War I. The entire 3 1/2 hours of the film never deviates from this period in time, except for a funeral prologue at the beginning, and that is because the film is not about how Lawrence became who he was, but rather about what he accomplished during these formative years of his life. How a film focuses on its subject is based on what the filmmaker wants the audience to learn. Sometimes this can be a problem if the filmmaker doesn't know what to focus on. One example of this is Richard Attenborough's Chaplin (1992), which makes the mistake of trying to cram too much of its subject's life into one film. The movie feels rushed and unfocused, and that hurts any chance it has of capturing the personality of Charlie Chaplin, despite actor Robert Downey Jr.'s best efforts. It's something that must be considered before any biopic is put into production.
Sometimes there are great historical dramas that depict an event without ever centering on any specific person. These are often called historical mosaics. Oftentimes, this is where fiction and non-fiction can mingle together effectively without drawing the ire of historical nitpicking. It's where you'll find history used as a backdrop to an original storyline, with fictional characters participating in a real-life event, sometimes even encountering a historical figure in the process. Mostly, these films depict a singular event using a fictional person as a sort of eyewitness that the audience can identify with. You see this in films like Ben-Hur (1959), where the fictional Jewish prince lives through and bears witness to the life and times of Jesus Christ. More recently, a film like Titanic (1997) brought the disaster to believable life by centering a tragic love story around it. Having the characters in these movies be right in the thick of historical events is the best way to convey an event's significance to an audience, because it adds a human connection to the moment. Titanic and Ben-Hur focus on singular events, but this principle also holds true for a film like Forrest Gump (1994), which moves from one historical touchstone to another. Forrest Gump's premise may be far-fetched and its history a little romanticized, but it does succeed in teaching us about the era, because it comes from that first-hand perspective. It's that perspective that separates a historical drama from a documentary, because it helps ground the fictional elements in our own lives and experiences.
Though most filmmakers strive to be as historically accurate as they can be, almost all of them have to make compromises to make a film work for the big screen. Often, a story needs to trim many of the historical details and even, in some cases, take the extraordinary step of rewriting history. You see this a lot when characters are created specifically for a film as a means of tying the narrative together, either by combining many different people into one amalgam, or by simply inventing a fictional person out of nowhere. This was the case in Steven Spielberg's Catch Me If You Can (2002), which followed the extraordinary life of Frank Abagnale Jr. (played by Leonardo DiCaprio), a notorious con artist. In the film, Abagnale takes on many different identities, but is always on the run from a persistent FBI agent named Carl Hanratty (Tom Hanks). Once finally caught, Abagnale is reformed with the help of Hanratty, and the film's epilogue includes the statement that "Frank and Carl remain friends to this day." This epilogue had to be meant as a joke by the filmmakers, because even though Frank Abagnale is a real person, Carl Hanratty is not. He's an entirely fictional character created as a foil for the main protagonist. It's not uncommon to see this in most films, since filmmakers need to take some liberties to move a story forward and fill in some gaps. Other films do the riskier job of depicting real history and completely changing much of it in service of the story. Mel Gibson's Braveheart takes so many historical liberties that it almost turns the story of Scottish icon William Wallace into a fairy tale; but the end result is so entertaining that you can forgive the filmmakers for making the changes they did.
But while making a few changes can be a good thing, there is a fine line past which it becomes a disservice to a film. It all comes down to tone. Braveheart gets away with more because its subject is so larger than life that it makes sense to embellish the history a bit and make it more legend than fact. Other films run the risk of either being too irreverent to be taken seriously or too bogged down in the details to be entertaining. Ridley Scott crosses that line quite often with his historical epics, and while he comes out on the right side occasionally (Gladiator and Black Hawk Down), he comes up short just as many times (Robin Hood, 1492: Conquest of Paradise, the theatrical cut of Kingdom of Heaven). Part of Scott's uneven record is due to his trademark style, which serves some films fine but feels out of place in others. Tone is also set by the casting of actors, and while some feel remarkably appropriate for their time periods (Daniel Day-Lewis in Lincoln, for example), others feel too modern or awkwardly out of place (Colin Farrell in Alexander). Because historical films are expensive to make, compromises on style and casting are understandable, but they can also do a disservice to the story and strip away any accountability to the history behind it. While stylizing history can sometimes work (Zack Snyder's 300), there are also cinematic styles that will feel totally wrong for a film. Does the shaky camera work, over-saturated color timing, and CGI enhancement of Pearl Harbor (2001) teach you any more about the history of the event? Doubtful.
So, with Lee Daniels' The Butler and Jobs, we find two historical biopics that are being received in very different ways. I believe The Butler has the advantage because we don't know much about the life that Mr. Cecil Gaines lived. What the film offers is a look at history from a perspective that most audiences haven't seen before, which helps shed new light on an already well-covered time period. Jobs has the disadvantage of showing the life of a person we already know everything about, and as a result it brings nothing new to the table. Both films are certainly Oscar bait, as most historical films are, but The Butler at least took on more risk in its subject matter, which appears to have paid off in the end. Jobs just comes off as another failed passion project. What this shows is that successful historical dramas find ways to be both educational and entertaining, and on occasion, inspiring. That's what helps make history feel alive for us, the audience. These films are the closest things we have to time machines, letting us be eyewitnesses to our own history. And when it's a good story, it stays with us for the rest of our lives.

Thrown into the Briar Patch – The Uneasy and Confusing Controversy of Disney’s “Song of the South”

 

What does it take to blacklist a whole film? Walt Disney's 1946 film Song of the South has the dubious distinction of being the only film in the company's history to be declared un-releasable. Many people say it's because of the perception that the film has a racist message and that it sugarcoats and simplifies the issue of slavery in an offensive way. I would argue that it's not right to label a film one way or another without ever having seen it, but unfortunately Disney is reluctant to even let that happen. What is interesting is that by putting a self-imposed ban on the distribution of the film, Disney is actually perpetuating the notion that Song of the South is a dangerous movie, due to the stigma it holds as the one film they refuse to make public. Disney, more than any other media company in the world, is built upon their wholesome image, and for some reason they are afraid to let their guard down and air out their dirty laundry. But is Song of the South really the embarrassment that everyone says it is, or is it merely a misunderstood masterpiece? Thankfully, I have seen the film myself (thank you, Japanese bootlegs and YouTube), so I can actually pass judgment on it, and like most other controversial things, you gain a much different perspective once you remove all the noise surrounding it.
For a film that has gained such a notorious reputation over the years, the actual history of the production is relatively free of controversy. Walt Disney wanted to adapt the Uncle Remus stories, the popular African-American folktales published by Joel Chandler Harris in post-Reconstruction Georgia. Disney said that these stories were among his favorites as a child, and he was eager to bring the moralistic tales to life through animated shorts starring the characters Brer Rabbit, Brer Fox and Brer Bear. The film was a breakthrough production for the Disney company, as it was a mix of live action and animation. Sequences where the live-action character of Uncle Remus interacts with the cast of animated critters were astonishing to audiences, and the visual effects were highly praised at the time; remember, this was almost 20 years before Mary Poppins (1964), which was also a hybrid film. Walt Disney treated the subject material with great reverence and brought in the best talent possible to work on the film, including Oscar-winning cinematographer Gregg Toland (Citizen Kane, The Grapes of Wrath). Disney was especially proud of the casting of James Baskett as Uncle Remus, and he even campaigned heavily to earn Mr. Baskett an Oscar nomination for his performance; Baskett wasn't nominated, but he did win a special honorary Oscar in recognition of his work on the film. The movie was a financial success, and it earned another Oscar for the song "Zip-a-Dee-Doo-Dah," which has become a sort of unofficial anthem for the Disney company.
Surprisingly, the film was re-released regularly for decades afterwards. It even provided the inspiration for what is still one of Disneyland's most popular attractions: Splash Mountain. It wouldn't be until after a short theatrical run in 1986 that Disney began their policy of keeping the film out of the public eye. Not surprisingly, this was also around the same time that a new corporate team, led by Michael Eisner, had taken over operation of the company, and with them came a whole new mindset centered around brand appeal. While Song of the South had sometimes been called out in the past by organizations like the NAACP for its quaint portrayal of post-slavery life, the film was not considered an outright embarrassment. It was merely seen as a product of its time and was much more notable for its animated sequences than for its actual story line. But once Disney made it their policy to shelve the film for good, based on the perception that the film made light of slavery, that's when the controversy started heating up. To this day, Song of the South has yet to receive a home video release here in the United States, and Disney continues to stand by their decision not to make the film public.
Having now seen the actual film, I get the impression that Disney didn't shelve it just because of its content, but rather as an attempt to keep their image as clean as possible. My own impression of the film is this: it's harmless. Don't get me wrong, it is not the most progressive depiction of African-American life in America, and some of the portrayals of the ex-slave characters are certainly out of date to the point of being cringe-inducing. But it's no worse than a film like Gone with the Wind (1939), and that film is considered one of the greatest movies of all time. If Song of the South has a flaw, it's that it's boring. The movie clearly shows Walt Disney's lack of experience in live-action filmmaking, as the main story of the film is very dull and flimsy. Basically, it follows a young Southern boy, played by Disney child star Bobby Driscoll (Peter Pan), as he deals with the break-up of his family and finds solace in the stories told to him by a former slave, Uncle Remus. There's not much more to it than that. Where the film really shines is in its animated sequences, which are just as strong as anything else Disney was making in the post-war era. The art style in particular really stands out and conveys the beauty of the Southern countryside perfectly.
Ultimately, I believe there's a different reason why the film has garnered the reputation it has. Disney is a big company that has built itself around an image. Unfortunately, when you go to certain extremes to keep your image as flawless as possible, it makes other people want to tear it down even more. There are a lot of people out there who hate Disney purely because of that wholesome image, and when they find cracks in the facade, they are going to keep exploiting them whenever possible. Walt Disney himself has been called everything from racist to anti-Semitic, and if you actually dig deeper into any of those claims, you'll find that there's little truth to them and that they usually come from people who worked at rival companies or had a contract dispute with Mr. Disney.
Unfortunately, by trying so hard to sweep so much under the rug, the Disney company opens itself up to these kinds of accusations, and it has no one to blame for them but itself. Walt Disney was not a flawless man by any means, and the company has made embarrassingly short-sighted decisions in the past; hell, they're still making them now (John Carter, The Lone Ranger). But their flaws are no worse than the ones that plague other companies in Hollywood. Just look at the racial stereotypes in old Warner Brothers cartoons; there was an actual war propaganda Looney Tunes short called Bugs Bunny Nips the Nips, which is about as racist as you can get. The difference is that Warner Brothers has not shied away from its past embarrassments, and has made them public while stating the historical context of the productions. As a result, Warner Brothers has largely avoided the "racist" label and its image has been kept intact. For some reason, Disney doesn't want to do that with Song of the South, despite the fact that Disney has made public some of its older shorts that are far more overtly racially insensitive than the movie. There are shorts from the 1930s that showed Mickey Mouse in blackface, and yet they still got a video release as part of the Walt Disney Treasures DVD collection. I think the reason Song of the South didn't get the same treatment is that it's such a polished and earnest production; it's probably easier to dismiss a silly cartoon for its flaws because they're less significant.
Regardless of how accurately it addresses the issues of slavery and the African-American experience, Song of the South should at least be given the opportunity to be seen. It's a part of the Disney company's history whether they like it or not, and sweeping it aside does a disservice to the Disney legacy as a whole. Being a white man, I certainly can't predict what the reaction from the African-American community would be, but is that any excuse to hide the film from them? Maybe black audiences would come to the film with an open mind; quite a few of them, at least. It just doesn't make any sense why this is the film that has been deemed un-watchable when other films like Gone with the Wind, which is very similar content-wise, are heralded as classics. Even D.W. Griffith's The Birth of a Nation (1915) is available on home video, and that film openly endorses the Ku Klux Klan. Song of the South is harmless by comparison, and the worst you can say about it is that it's out of date.
As a film, I would recommend that everyone give it at least one watch, if you can. The animated sequences are definitely worth seeing on their own, and I think some people will appreciate the film as a sort of cinematic time capsule. While the African-American characters are portrayed in a less-than-progressive way, I don't think that's the fault of the actors. James Baskett in particular does the most he can with the role, and it's hard not to like him in the film. He also does double duty, playing both Uncle Remus and the voice of Brer Fox, which shows the range he had as a performer. The music is also exceptional, with songs like "The Laughing Place," "Sooner or Later," "How Do You Do?" and the Oscar-winning "Zip-a-Dee-Doo-Dah"; crowd-pleasers in every way. It's definitely not deserving of the reputation it's gotten. Disney's reluctance to make the film available just goes to show the folly of trying to keep a flawless image, when it would actually serve them better to have it out in the open. Sometimes you just need to take your medicine and let things happen. After all, aren't the people who ride Splash Mountain every day at Disneyland going to wonder someday what film it's all based on?

Nerd Heaven – The Highs and Lows of Marketing at San Diego Comic Con

 

This is no ordinary corporate showcase. In the last decade or so, the San Diego Comic Convention (SDCC), more commonly known as Comic Con, has become a full-fledged festival for the whole of nerd-dom. Not only is it a great place for fans to encounter their favorite artists and filmmakers in person, but it's also a great place for Hollywood to showcase its tent-pole productions to an eager audience. In all, it's a celebration of all forms of media, where the experience of the presentations and panels can often overshadow the actual products themselves. But while everything is all in fun at Comic Con, the business end is what matters the most on the actual show floor. As with all conventions, Comic Con is geared toward marketing. Big studios and publishers get the most attention in the media coverage of the Con, but SDCC started with the small vendors, and they continue to be part of the backbone of the whole show. For everyone involved, there is a lot at stake in these four packed days in mid-July.
Up-and-coming artists, journalists, and filmmakers are just as common among the visitors as they are among the headliners, and the mingling of different talents defines much of the experience at the Con. While many people get excited by the surprises on hand, that excitement can sometimes have difficulty extending beyond the walls of San Diego's Convention Center. Marketing to a crowd of fans is much different than marketing to a general audience. I believe Comic Con works best as a testing ground for marketing strategies in the bigger push of selling a project to the world. Sometimes a lot of buzz can be generated with a surprise announcement or a well-placed tease. One clear example of this at Comic Con this year was the surprise announcement of a Superman and Batman movie coming in 2015. The announcement was a bombshell for the fans who witnessed it live at the convention, and it extended to a media blitz that spread quickly through all news sources that same day. The surprise effectively gained needed attention for a project that has so far only been in the planning stages. Where the risk lies is in the effectiveness of this kind of moment, and there can be no more unforgiving audience than one made up of nerds.
Many of the big studios have figured this out over time, and the planning of their showcases at Comic Con is almost as intricate as the projects they're trying to sell. One thing they have certainly learned is that Comic Con patrons are extremely discerning, and are often even more informed about the different projects than the talents involved. There is a fine line between excitement and scorn within the fan community, and if you fall on the wrong end of that line, it can be brutal. Comic Con is all about fan service, which is no surprise to anyone. This year in particular, there were more instances of stars making appearances in costume than ever before. Spider-Man was there to address the audience in person, which was a special treat considering that the film's star, Andrew Garfield, was the one behind the mask. Avengers villain Loki also showed up to introduce footage from the upcoming Thor sequel, with actor Tom Hiddleston completely in character the whole time. All of these moments make the live presentations far more entertaining, and that in turn helps make the audience even more enthusiastic about the upcoming films. Comic Con is a place where theatrics meets commerce, and where a well-made sales pitch can turn into a fanboy's dream come true.
Given that SDCC started as a showcase for comics, it's no surprise that Marvel and DC are the ones who put on the biggest shows, and are the ones who connect with their audiences better than anyone else, thanks to all their experience. But more recently, the showcases have steered away from the printed page and become more focused on the silver screen. It's not that Comic Con has abandoned the medium that started it all; print comics still have a place on the convention floor. It's just that the movie industry is bigger and more involved, and has seen the benefits of marketing at the convention.
With production budgets rising, Comic Con has become more important than ever as a way to generate enthusiasm for film projects, even ones that have trouble getting attention. Several years ago, Disney made a surprise announcement at Comic Con that a sequel to their cult hit Tron (1982) was in the works, highlighted by a teaser trailer. Little was known or said about the project, and Disney wasn't quite sure if it would go anywhere past development, so the trailer was made as a way to test the waters. The reception they got was overwhelming, especially when it was revealed that the original film's star, Jeff Bridges, was involved, and production went full steam ahead afterwards. Few expected a Tron sequel to be newsworthy, let alone the hot topic of conversation at the convention, but Disney showed that year what a simple surprise could do to generate excitement. Since then, surprises have not only become more frequent; now they are expected.
That leads to some unforeseen consequences in a high-stakes venue like this. When audiences are expecting a surprise to happen at any moment, it puts even more pressure on the marketing teams to deliver the goods. There have been many cases where a production company ends up promising too much and then fails to deliver. A couple of years ago, Guillermo del Toro teased the crowd at a Disney presentation by revealing his involvement in a new Haunted Mansion film, which he promised was going to be more spiritually faithful (no pun intended) to the original Disneyland ride than the Eddie Murphy flop had been. It was an exciting announcement at the time, but several years later, almost nothing new has been heard about the project, and with del Toro taking on more and more new projects, it's becoming obvious that this particular one is probably not going to happen. Other broken promises have included several announcements of a Justice League movie, including one that is still out there and remains to be seen, and news that TV scribe David E. Kelley was going to give Wonder Woman a new TV series, which led to a disastrous pilot episode that never got picked up. This is why production companies need to show good judgment when they present their projects at Comic Con. Once you make a promise, you have to commit. If you don't, no one will take those promises seriously, and the whole aura of a Comic Con surprise will stop working.
In many ways, Comic Con has become a more favorable place for television than film. TV shows like Game of Thrones, The Walking Dead, Dexter, and Doctor Who can benefit from all the same kinds of media buzz that a theatrical film gets at the Con, without the pressure of marketing a massive project with a $250 million budget; although TV budgets are rising too. Comic Con isn't the only platform for marketing a film, but it's certainly one of the biggest, and the stakes are getting higher. In a year like 2013, which has seen numerous underperforming films hit theaters this summer, the pressure is on when it comes to getting the message to resonate beyond the cheering fans in Hall H. I don't envy the people behind the Comic Con presentations one bit, because they have so much resting on their shoulders. And when you're dealing with a fan-base this well informed, it's a wonder how they keep the surprises coming.
I should note that I have yet to attend Comic Con myself. My observations come from an outsider's perspective, though I do follow the live news coverage of the convention every year with great anticipation. I hope to someday see it for myself, just to take in the whole carnival-esque atmosphere of the place. I'm not sure if I'll attend in costume like all the cosplaying regulars there, but then again, "when in Rome..." Overall, there's no doubt that Comic Con is one of the most important institutions we have in our media culture today, and it will continue to be for many years to come. There are comic conventions to be found the world over, but this is the granddaddy of them all, and no other convention has this kind of influence on the film industry.

The Terrible Threes – The Hard Road of Second Sequels

 

The number 3 seems to be unlucky for film franchises. That's the thought that came to mind when I watched The Hangover Part III. Short review: it sucked, and I'm beginning to see how it falls into a pattern. Movie franchises seem to fizzle out around the point that a third entry is released. Unless it's part of a pre-planned trilogy, like The Lord of the Rings, it is very rare to see a second sequel rise to the level of its predecessors. So why do so many filmmakers insist on moving forward with a series that has clearly lost steam after two films? The simple fact is that sequels are easy to make, and unfortunately the law of diminishing returns applies far too often. In many cases, the first and second sequels just repeat the formula of the initial films, which not only shows a loss in creativity, but also defeats the purpose of building up the brand in the first place. Audiences naturally want to see new things when they watch a movie, even when it comes from a sequel. Some sequels do manage to breathe new life into familiar stories, even deviating from the previous ones in wild and interesting ways. But while you can sometimes catch lightning in a bottle twice, it rarely happens a third time.
There are many factors that go into making a great sequel. A sequel has to know what made the first film a success and do exactly the same thing, only bigger. In some cases, a sequel can even far exceed its predecessor. Director James Cameron seems to take that principle to heart when making sequels to his films. In the case of Terminator 2: Judgment Day (1991), he not only continued the story of the first film, but made it bigger and more epic in the process. For many people, it's the movie they think of first when they hear the word "Terminator." It's no simple feat for a sequel to be the definitive entry in a series. A more recent example of this is Christopher Nolan's The Dark Knight (2008), which became so popular that it changed the way superhero movies are marketed today. We no longer look at Nolan's films as the Batman Begins trilogy. Instead, it's considered the Dark Knight trilogy, which is the direct result of the sequel overshadowing the first film.
Though the track record for first sequels is decent, there's less success when it comes to the second sequel. Once a series hits its third entry, that's the point where it begins to show signs of exhaustion. By this point, the filmmakers are almost trapped by their own success, having to keep something fresh and interesting long after the good ideas have been used up. As I mentioned before, unless a series was planned long ahead of time as a trilogy or more, most of the creativity will be spent by the time the third film comes along. It's very hard to be a sequel to a sequel, and audiences can only take so much of the same story before they lose interest.
The genre that seems to suffer the most from this third-film curse is the superhero genre. Superhero films that carry a 3 next to their names have usually ended up being the most criticized by their fans. We've seen this with films like Superman III, Spider-Man 3, X-Men: The Last Stand, and, it appears from this year as well, Iron Man 3. Even Christopher Nolan's critically lauded The Dark Knight Rises failed to deliver for some fans. As is the case with most of these films, they are the follow-ups to some very beloved sequels, ones that fans had hoped these trilogy cappers would build upon. There are a few reasons that could explain why these films have fallen short: one, the audiences' expectations were just too high for the filmmakers to deliver; two, the filmmakers decided to deviate too much from a proven formula as a means to spur their creative juices; or three, the filmmakers had clearly lost interest and were just trying to fulfill their obligations. The worst case is when a series decides that it's ready to be done without the foresight to establish a means of wrapping up the story. This was the case with X-Men: The Last Stand (2006), which haphazardly crammed in a bunch of story points and characters that the film didn't need in order to please fans' expectations, while cutting the story off way too short. The final result was a jumbled mess that ended up pleasing no one, and it hurt the brand for years to come.
Other franchises also suffer from this pattern, but for some very different reasons. Sometimes a series does plan ahead and builds a trilogy off the original film's popularity, leading to the production of two films at once. This, however, is a huge risk, because it puts the pressure on the middle film in the series to deliver; otherwise the third film will be left hanging if the second doesn't work. This has happened on several occasions, such as with the Back to the Future trilogy, the Matrix trilogy, and the Pirates of the Caribbean series. The Pirates films in particular became so notoriously over budget that they helped bring an end to studios shooting sequels simultaneously. While the receptions of these films are mixed, there is no denying that the series lost steam the longer they went on. The same happens with the opposite approach as well, when an unnecessary third film is made many years after the previous sequel. The Godfather Part III (1990) is probably the most famous example, having come 16 years after the last film and extending a story that people thought had been perfectly resolved, for no other reason than to do it all over again.
That’s exactly what most 3rd films end up being: unnecessary.  That’s what I thought when I saw The Hangover Part III.  The series has long since exhausted its potential and is now running on fumes.  Whether the series could have sustained enough interest over three films is another question entirely.  It certainly had enough clout for one sequel.  But whether or not a film series makes it to a 3rd film should entirely be the result of a need to explore the possibilities of the story, and not just the urge to repeat the same formula for the sake of some quick cash.  These films must be able to stand on their own and not just be an extension of what came before.  The best trilogies are ones where each entry has its own identity and can entertain well enough on its own, without feeling like the extended part of a greater whole.  Films like Return of the Jedi, Indiana Jones and the Last Crusade, Goldfinger and The Return of the King are beloved because they entertain while also being an essential part of their overall stories.  And most importantly, they didn’t waste their potential; something the filmmakers behind the Hangover films should’ve considered.

Your Movie is Loading – Digital Innovations and the Resulting Tightened Gap Between Cinema and Home Entertainment

 

home theater
Growing up through the 80’s and 90’s, it was clear that going to the movies and watching one on TV were very different experiences.  But in the years since, technology has revolutionized the ways in which we experience a movie.  Thanks to innovations like movies on demand and digital cameras and projection, the line between the two experiences has been dramatically redrawn.  Film companies can now premiere their projects on multiple platforms, whereas years ago you sometimes had to wait as long as a year after a film left theaters before you could buy it on video.  The accessibility of the internet has influenced that shift more than anything, allowing people to see what they want, when they want, all through video streaming.  Like most new things, this shift in how we watch movies has its pros and cons.  On one hand, it gives exposure to movies and media that normally wouldn’t have been seen years ago; on the other, it causes previous standards of the movie industry to become obsolete and forgotten.  We live in an era where things are changing rapidly, and I wonder if these changes are just trends or are here to stay for good.
One thing that has changed movie watching dramatically is the digitizing of media for home viewing.  Before, we had to buy a tape or a disc to watch a film at home, but nowadays, many people are opting to cut out the middle man and download a film off the internet.  Services like iTunes allow you to purchase a digital version of a film the same day it’s released in stores, and sometimes even earlier on certain exclusives.  It’s a good option for those who don’t want to clutter their shelves with DVD boxes.  This has also changed the rental business, with streaming services like Netflix and Hulu putting old juggernaut rental chains like Blockbuster out of business.  That development alone spells out just how powerful this new trend has become.
But what’s interesting about this change is that the film industry has yet to figure it out.  Accessing a movie through a digital copy or a streaming service can be difficult because there is no standardization.  Certain movies are available on some formats and unavailable on others; it all depends on who has the contract with the retailer.  In years past, at least, there was standardization, with all movies released on the preferred format.  Yes, VHS and Blu-ray had to gain dominance over Betamax and HD DVD in order to become the standard, but once they did, the selection in a video store was only a matter of which title a person wanted.  Nowadays, someone who wants the digital copy of a film has to download multiple media players onto their computer or mobile devices just to watch the movies they want.  And all of these media providers are competitive enough to survive in this market, so any standardization won’t be happening soon.  Perhaps it’s a good thing for there to be competition in the media market, as it leads to more innovation, but it has the consequence of making that market difficult to navigate.
One change I do find troubling is the loss of a movie being an actual physical thing.  It may be strange to think of a movie as an object, but I consider myself a collector as well as a fan of cinema, and when I like a movie, I want to include it in my collection.  I have been collecting movies since childhood, and that has included VHS tapes, DVDs and now Blu-rays.  I am the kind of person who has multiple copies of a single film in different formats, and my library is bound to keep growing for a long time to come.  To me, it’s just a nice feeling to be able to look at a film sitting on my shelf and see it as part of a physical history of cinema.  This is why I haven’t digitized my film collection.  I am far more likely to buy a disc of a film than to download one, mainly because I still prefer holding a movie in my hand, even though I understand the appeal of having everything stored on a computer.  For many people, though, a digital copy is the preferred method, and in many ways it’s the faster and easier mode of distribution.
This trend has definitely changed distribution in Hollywood in a good way.  Some movies in the past struggled to get appropriate distribution, whether because they lacked the funding or because they were just too risky a project for the studios to make a fuss about in the first place.  In some cases, movies would become hits long after their run in theaters, once they were seen on cable or home video; cult classics like Office Space (1999) or Clerks (1994).  Now, it is possible for a film to bypass the pressures of theatrical exhibition and be seen almost immediately on whichever format a person chooses.  This is especially true of documentaries, which can be seen on anything from movie screens to YouTube without losing any of their impact.  Director Kevin Smith saw the potential in this multi-platform model of release and decided to self-distribute his film Red State (2011) outside the Hollywood system.  The success of the release was mixed, but Kevin Smith made waves with the attempt, and it has made multi-platform distribution just as viable a trend as anything else we’ve seen in the past few years.
Another surprising thing technology has done to the film industry is to change the way films are both made and processed.  Digital photography has advanced so much that it’s oftentimes hard to tell whether a movie was shot digitally or not.  Digital projection has certainly taken over cinemas almost completely, as it’s now hard to find a theater that still runs film prints; another sad change, where a film stops existing as a physical thing.  But digital projection has been around long enough that audiences no longer see any real difference, unless they have a trained eye.  The same goes for digital photography.  Digital cameras can now shoot in resolutions so high that they rival, and arguably exceed, the clarity of regular 35mm film.  This has enabled some new advancements in the presentation of movies, like digital 3D and 48 frames per second.  While unique, these trends are sometimes just a gimmick, and they usually depend on the quality of the film to work for an audience.  But the trend has moved in favor of digital photography for a while now.  Only a few filmmakers, like Quentin Tarantino and Steven Spielberg, have stuck by traditional film, while many filmmakers with limited means who want to bypass the film processing phase have embraced the new technology with great enthusiasm.
This has crossed over into television as well, which has made the line between cinema and home entertainment even more blurred.  TV shows today are filmed mostly with digital cameras, and that has significantly changed the kinds of TV productions we see now.  Shows like Game of Thrones and The Walking Dead are produced with such complexity that they are comparable in quality to a theatrical film.  This is thanks to digital camerawork that can replicate the clarity of film and allows for manipulation in post-production, whether through color grading or the addition of visual effects.  Years ago, there was no mistaking the difference between what a film looked like and what a TV show looked like.  They were completely distinct forms of entertainment.  Now the gap has tightened, and it’s probably what has drawn more people toward home viewing.  Can you imagine what shows such as M*A*S*H and Happy Days would be like if they were made with today’s technology?
It’s an interesting tug-of-war that we are seeing today between film and television; one that has been brought about through digital innovations.  While some things will never change, other trends have clearly made things different than what we grew up with.  I, for one, draw my own line on which of these new trends I’m willing to embrace, but I am pleased to see so many advancements made in the last few years.  I certainly like the increased accessibility to films that I would normally have had trouble finding.  Digital photography has also made television a whole lot better in recent years.  But I also miss the experience of working with actual film.  My years as a projectionist gave me a strong appreciation for the look and feel of a film print, and it’s sad to see it become an obsolete tool in film presentation today.  And while digital presentation and video streaming are convenient and innovative, the movie itself is what will make or break the investment in the end.
Ultimately, nothing beats a good time at a movie theater with an auditorium full of people.  Home entertainment may be at a high standard now, and techniques like 3D and high frame rates may be eye-catching, but it all comes down to the human factor.  I enjoy watching a movie, no matter what technology is behind it, as long as it remains entertaining.  And that’s an experience that will always be timeless, even if ticket prices keep climbing painfully, astronomically higher.

Pencils to Pixels – The End of Hand-Drawn Animation?

I have been a fan of animation for as long as I can remember.  My friends growing up would always refer to me as the “Disney” kid, mainly because I was an unashamed fanboy at an early age.  I made an effort to soak up as much as I could of the Disney company’s output during my formative years, and now I am an expert in all things Disney.  Nowadays, I’ve moved beyond just animation and have come to love films of all kinds.  I still share a special fondness for Disney animation all these years later, however.  To me, it was my gateway drug into the world of cinema.  Unfortunately, as I’ve gotten older, the state of animation has moved away from the stories and the styles that I grew up with.  Today, computers have replaced the artist’s sketch pad and hand-drawn animation is almost non-existent.  What troubles me most is that Disney, the studio that set the standard for quality animation, has also been forced to catch up with the current trends, replacing 2D with computer-generated 3D.  As a student of film, I understand that the market dictates what goes into production, and right now hand-drawn animation is not as commercially viable as computer animation; it makes me concerned that the style is now truly gone.
As far as the history goes, animation has been as big a part of cinema as anything else.  In the early days, cartoons were mainly experimental in nature, usually thrown in between feature films at the local cinemas as time-fillers.  But in the 30’s, pioneering filmmakers like Walt Disney proved that animation wasn’t just entertainment, it was art as well.  With films like Snow White and the Seven Dwarfs and Fantasia, Disney animation proved to be just as popular a draw as a John Wayne western or a James Cagney gangster pic.  Other studios also added to the mix, with Warner Bros.’ hilarious Looney Tunes series and UPA’s experimental use of limited animation.  In the 60’s and 70’s, animation started to fall back on a niche audience, mainly dismissed as kid stuff.  Disney still made features, but they were few and far between, and usually done with limited budgets.  This led to the departure of many artists who felt that animation was not being taken seriously enough, like famed independent animation producer Don Bluth (The Secret of NIMH).  In the mid-80’s, Disney considered ending all animated production after their film The Black Cauldron (1985) lost a lot of money at the box office, but new talent and management decided on a wait-and-see policy, which led to the production and release of The Little Mermaid (1989).
The years following the release of The Little Mermaid are commonly known as the Disney Renaissance, and it occurred at just the right time for a fan like me.  Mermaid arrived when Disney was starting to release their catalog of films on home video, and I had already seen a bunch of them by that point.  I had developed a sense of what a Disney film was and what it could be, and The Little Mermaid showed me that the Disney style was not only still around, but thriving.  In the years that followed, I eagerly awaited every new Disney feature; from Beauty and the Beast (1991) to Aladdin (1992) to The Lion King (1994), each a bigger success than the one before it.  These films were my childhood, and to this day I am still an avid fan, collecting each of these films on Blu-ray.  This success also spawned a great revival of animation throughout Hollywood.  There were numerous attempts by other studios to make feature animation at the same level as Disney, and they range from brilliant (The Iron Giant) to admirable (The Prince of Egypt) to mediocre (Rock-a-Doodle) to unwatchable (Quest for Camelot).
Unfortunately, The Lion King was such a colossal hit that it ultimately set the bar too high to match.  Even Disney struggled to follow that success, as the budgets got higher and the returns got lower.  By the time I was in high school, hand-drawn animation had once again started to recede into the background.  At the same time, we began to see the rise of Pixar and the success it achieved with new advances in computer animation.  The turn of the millennium brought about a big sea change, not just in what animated films were being made, but in the perception of what an animated film was.  Today, children are growing up believing that an animated film should look more like Shrek and less like Sleeping Beauty.  It makes me worried that the end truly has come for hand-drawn animation; that not even a mermaid princess can save it now.
There are other people out there, like me, who still hold hand-drawn animation close to their hearts.  In 2009, a few years after Disney’s acquisition of Pixar, there was a noble attempt to bring back the traditional hand-drawn animated musical with The Princess and the Frog.  Unfortunately, the film under-performed and the revival turned out to be only a momentary reprieve.  Princess is a good film, and it did okay business; just not Pixar-sized business.  Audiences did say they were nostalgic for the Disney films of the past, but that kind of success is something you can’t manufacture.  The Little Mermaid was the right film at the right time, and the success that followed was built upon the goodwill that the film delivered.  Princess had too much riding on its shoulders, and that caused the film to suffer in the story department.
One thing that hand-drawn animation needs is a genuine and honest surprise.  One of the last big hits Disney had at the box office was Lilo & Stitch (2002), a film that many of the studio brass brushed off initially, until it found a big audience.  It showed that animation doesn’t need to be a fairy tale to be considered a Disney classic.  If you look at all the Disney films overall, there are only seven or eight fairy tales among them.  Also, the reason Pixar’s films are so successful is not the quality of the computer animation (though it does help), but the emphasis they put on getting the story right.  That’s something you find lacking in most animated features.
Overall, the reason I prefer hand-drawn animation, even over the best Pixar films, is the human touch.  When you watch traditional animation, you are seeing something that was drawn by actual people.  Not that computer animation is easy; I know a lot of computer animators who put a lot of work into what they do.  But when you watch a CG-animated film, you are watching something that was put through a computerized intermediate before it’s put on film.  Some of it looks nice, but I find most of it artificial in movement and texture.  With traditional animation, everything is exaggerated and less bound to reality, which helps make the drawings look more interesting.  There is a subtlety in character movement that you just can’t get in computer animation.  Would the Genie from Aladdin have been better if he had been animated on a computer?  There is a clear fundamental difference between these styles, and neither should replace the other.  Unfortunately, computer animation has claimed victory in the feature department.
Hand-drawn animation has, however, survived in unlikely places, such as television.  There are only a handful of fully computer-animated shows out there, as most are still 2D.  The Simpsons and Family Guy are still animated by hand, as are many shows on Cartoon Network and Nickelodeon.  Even shows animated entirely in the computer, like South Park or the Flash-animated Archer, create a hand-made look in their presentation.  Hand-drawn animation is also still going strong overseas, thanks to the success of anime.  Japanese artists seem to have found the perfect balance, embracing the mechanics of computer effects without abandoning the hand-drawn style altogether.  Hayao Miyazaki’s films in particular represent what modern Disney films could be with the tools that are available today.
 
But, as things stand, animation now belongs to the digital world.  I hope to someday see another revival of hand-drawn animation, but that seems less likely as the very concept of an animated film changes over time.  Seeing this sea change has made me feel more like an adult than anything, as I find my childhood ideals transforming into nostalgia.  I am grateful that Disney still treats their film canon with a great amount of reverence, and my hope is that future generations will accept the animated classics of the past as equal to the films of the present.  It may be a drought right now, but good art always manages to stay timeless.
  

Welcome to CineRamble.com, or How I Learned to Stop Worrying and Love the Movie Blog

Hello, everyone.  My name is James Humphreys.  I am the author and creator of this humble blog.  It may look amateurish now, but this is my first foray into internet literature, so I hope to get better as time goes along.  In any case, this will be the first post that I’ll ever write on this site, so I better make it count.  (Sits alone at the computer)…………(time passes)……….  Is it time for work already?  Shoot.  That was a waste of time.
Okay, seriously, I do have a plan for what I’m going to write here at CineRamble.com.  In case it’s not obvious already from the title and picture I put up, this will be a blog about movies.  I’m very opinionated when it comes to films and filmmaking, and I enjoy sharing my thoughts with anyone: friends, family, and sometimes just the person I’m sitting next to in the movie theater.  Yeah, I’m that guy.  I always try to stay informed and keep up with all the current trends in the media, as hard as that sometimes can be, and I always try to keep an open mind and look for new and exciting things that are developing in Hollywood.  Until now, I’ve had a lot of things to say, but no place to say it; hence my decision to start a blog.
My mission on this site is to present my views on a variety of subjects within the movie realm, through an entertaining and often informative personal perspective.  My posts will mostly be opinion pieces, where I share my own two cents on what’s currently happening.  But I also plan on writing movie reviews for this site, both for current releases and movies already out on video.  I will also write reports about my experiences in the film world itself.  I live in Los Angeles, just over the hill from Hollywood, so there are plenty of potential things for me to report on, such as special screenings, premieres, or exhibitions at the local museums.  I also plan on doing top ten lists and retrospectives; whatever I think would be worth writing about, I will bring it here.
In any case, I’m not looking for agreement on everything I say on this blog; in fact, if you disagree with me on something, I welcome it.  I’m always looking to inspire discussions about movies everywhere I go and I hope that this website is able to do the same.  If I can inspire a passionate rebuke that is able to change my perspective on things, it will be incredibly worthwhile.  My hope is that I can bring people’s attention to movies and ideas that have sometimes fallen through the cracks, and help shed new light on them.  I welcome feedback, because frankly, I’ll need it if I am ever going to get the hang of this.
So, having said all this, I’m going to formally welcome everyone to CineRamble.com, and I look forward to actually getting this thing off the ground.  Expect my first official post in the next week or so.  Hopefully it expands from here into something special.  So, it’s time to take that first step down this rabbit hole and see where it goes.  “LOOK, MEIN VIEWERS, I CAN WALK.”