
Ecstasy for Gold – The Cutthroat Campaigns of Awards Season


Awards season is once again upon us and, as always, there is a lot of debate over which film is deserving of the industry's highest honors. What is interesting about this year, however, is how up in the air it is. For the first time in a long while, there are no clear favorites in this year's Oscar race. In years past, a clear picture would have formed by now of who was leading the pack after the Golden Globes and all the industry guilds had made their choices. But so far, the top honors this year have been spread around, leaving no clear front runner for Best Picture at the Oscars; made all the more confusing after the Producers Guild Awards ended in a tie for the first time in its history, awarding Best Film to both 12 Years a Slave and Gravity. Sure, any accolades for these movies are well deserved and appreciated by their recipients, but it's the Academy Awards that caps off the awards season, and it's what everyone in the industry strives for in the end. That strong desire to win the top award has become such a dominant force in the industry that it has fed a troubling trend of negative campaigning in Hollywood. In recent years, we've seen Oscar campaigns become so overblown and vicious that they would make even Washington insiders queasy. And the sad result is that in the pursuit of the industry's top honors, the movies themselves get lost in the shuffle.
This isn't something new either, but it has developed over time into something bigger. Oddly enough, when the Academy Awards began in the late 1920s, the awards themselves were considered an afterthought. Instead, they marked the conclusion of a banquet dinner held by the Hollywood elite to celebrate the end of the year. Many of the winners at that first ceremony either discarded their Oscars or pawned them off in later years, not foreseeing the significance that those statues would have in the years to come. It wasn't until about four or five years later that the ceremony gained significance, around the time the winners started being announced on the radio, allowing audiences to follow Hollywood's award recipients. Once the ceremonies began to be televised in the 50's, awards season became a full blown cultural event, and it has been a focal point for the industry ever since. Of course, with the whole world now interested in who was winning, it soon led to some of the studios making behind the scenes deals in order to get their movies to the top. One of the earliest examples of questionable campaigning for an award came in the 1940 Oscar race, when producer David O. Selznick, hot off his awards success for Gone With the Wind (1939), pressured entertainment press agents to campaign for his next film, the Hitchcock-directed Rebecca (1940). The aggressive campaigning helped the film win Best Picture, but it failed to win any other major award, which led many people to question whether it deserved the top prize in the first place; especially considering that it beat out the more beloved The Grapes of Wrath (1940) that same year.
This illustrates the major problem I've observed with an overly aggressive awards campaign: the doubt it raises over whether or not the movie deserves what it got. We've seen the Academy Awards honor films that have certainly withstood the test of time (Casablanca (1943), Lawrence of Arabia (1962), and The Godfather (1972), just to name a few), but there are also choices made in other years that have left us wondering what the Academy was thinking. But it's not the final choices that make Oscar campaigning problematic. We all differ when it comes to choosing our picks for the awards, because everyone's tastes are different. What I find to be the problem is the increasingly nasty ways that movie companies try to win their films an award by attacking the competition. In recent years, I've noticed that this has gone beyond the usual "For Your Consideration" campaigning that we commonly get from the studios, and it has devolved into full-fledged mudslinging. Truth be told, I don't even think political campaigns get this cutthroat, but then again, I'm not much of a political observer. This year in particular, we've seen complaints leveled at films for inaccuracies in their historical reenactments and for mischaracterizations of their subjects. While some accusations have merit, there remains the question of whether or not it matters. There are some voters out there who are persuaded by the chatter and would rather let outside forces steer them toward a choice than judge a film on its own strengths, which becomes problematic when that chatter is ill-informed.
The most troubling thing about the recent trend of negative campaigning in awards season is the use of outside voices brought in to give weight to the criticisms of a film. This goes beyond just negative reviews from critics. What we've seen happen recently is the media and press becoming more and more involved in Oscar campaigns. This has included pieces written by scholars and experts that call into question the authenticity of the facts in a film as a way of slamming the movie's credibility. Famed astrophysicist Neil deGrasse Tyson made news a few weeks back when he publicly pointed out the scientific details that the movie Gravity got wrong, which many people in the industry jumped upon to undermine Gravity's chances for some of the top awards. Mr. Tyson later said that he did it just for fun and that he still enjoyed the film immensely, but this seemed to get lost in the controversy that his original comments stirred up. It could be argued that film companies utilize negative campaigning simply because it's easier and more effective, which is probably true, but what it ends up doing is distracting people from what awards season should really be about, which is honoring the best work done by people in the industry that year.
The most dangerous kinds of negative campaigning that I've seen are the ones that have no basis in actual fact. One of my first articles on this blog was an editorial addressing the smear campaign leveled against Quentin Tarantino's Django Unchained. At the time of the film's release, African-American director Spike Lee openly criticized the movie for its pervasive use of the "N-word," denouncing the film as "racist" and an insult to the history of slavery; despite the fact that he hadn't even seen it. Spike Lee's comments were nevertheless used as ammo against the movie during last year's Oscar race, which fortunately had little effect, as the film walked away with two awards: one for its screenplay and one for Supporting Actor Christoph Waltz. The same cannot be said for Kathryn Bigelow's Zero Dark Thirty, however.
Released around the same time as Django, Zero Dark Thirty had a lot of hype built up around it, seeing as how it documented the search for and capture of Osama Bin Laden. The film's hype was a case where Hollywood's connections with political insiders became both a blessing and a curse. Some left-wing studio heads even wanted to fast track the film's release so it would premiere before the 2012 election, in the hopes that it would boost President Obama's chances for reelection. When the film premiered, however, its reception was not what people expected. Bigelow's very frank depiction of the torture used by the CIA to help find Bin Laden angered many people, and criticism of the film shifted from calling it left-wing propaganda to calling it right-wing propaganda. The film's producers rightly argued that politics had nothing to do with the movie's overall depiction, but the damage had already been done. The one-time Oscar front-runner was dealt a significant blow. Kathryn Bigelow was shut out of the Best Director category, and the film ended up winning only one award, for Best Sound Editing, which it had to share in a tie with Skyfall (2012). You could say that Zero Dark Thirty became a victim of its own pre-release hype, but I think the negative campaigning against the film rose to an almost unethical level when political leaders got involved. Just weeks before the Oscars ceremony, Democratic Senator Dianne Feinstein, along with fellow Democrat Carl Levin and Republican John McCain, called for an investigation into the film's development, examining how Bigelow and writer Mark Boal got their information. When the Oscars were over, almost on cue, the investigation was dropped. We may never know if there was some backroom deal involved, but I saw this as an example of awards campaign interference gone too far.
It's troubling to think that some people are so easily persuaded by hype and negative press in the film industry, but it's a result of how valuable these awards have become. It is true that winning an Oscar will increase a film's overall box-office numbers, which may be good for business, but it can be bad for the film's legacy. What is there to gain from a short-term boost in grosses when you're hurting the film's chances of having a long shelf life? There are many examples of movies gaining a negative stigma for winning the top award over more deserving films. The most controversial example would be 1998's Shakespeare in Love, which many people say stole the Best Picture award away from Steven Spielberg's Saving Private Ryan; so much so that new campaign rules were drafted by the Academy when it was revealed how much money Miramax execs Bob and Harvey Weinstein had put into the film's Oscar campaign. Shakespeare did see a boost at the box office in the weeks before and after the awards, but the controversy has unfortunately overshadowed the film itself over the years, which has in turn destroyed its staying power. Time is the best judge of great movies, but the Oscars have only a year-long window for perspective, so their picks often show little foresight in the end. 1999's winner, American Beauty, has almost faded into obscurity over time, while other films from that same year, like The Iron Giant, Fight Club, and The Matrix, have become beloved classics.
Is it right in the end to criticize a film over its content, or its adherence to the facts? My argument is that a movie should be judged solely on its own strength as a movie. The truth is that there is no absolute truth in film; it's all make-believe after all. If a film needs to take some historical liberties in order to tell a more fulfilling story-line, then so be it. What I hate is when controversies come up around a film when they really don't matter in the end. Some controversies this year have erupted over films like Saving Mr. Banks and Captain Phillips because of their white-washed depictions of their main characters, and the negative campaigns against them robbed actors like Tom Hanks and Emma Thompson of recognition that their outstanding performances would've otherwise deserved. So what if aspects of these people's lives are left out of the film; in the end they have nothing to do with the stories that the filmmakers wanted to focus on in the first place. The Wolf of Wall Street has had its own set of controversies, some of which the movie purposely provoked, and yet they didn't affect its chances at the Oscars, which shows that there is a selective bias in the negative campaigning against these films; it all depends on who has something to gain from knocking out the competition.
When the winners of the Oscars are announced this year, my hope is that the voters use their best judgement when they cast their ballots. For the most part, the Academy Awards will never please everybody. Most often, when people say they were upset by the awards, it's because there were few surprises and the whole thing ended up being boring. That's why I am excited about this year's open race, because anybody could win. Unfortunately, the closer the race, the nastier the attacks against each film will be. I think that hype can be a dangerous tool for a film if it is misused, and it will ultimately end up clouding the merits of the movie itself. In the end, Oscar gold is not always a certification of excellence. Great films stand the test of time, while the Oscars are more or less a time capsule of public tastes from a specific year. Sometimes they pick the right Best Picture or performance, sometimes they don't. But what is certain is that negative campaigning is getting uglier and more prevalent every awards season. What I hate is that it has become less about honoring great works of cinema and more about competition, seeing who'll take home the most awards at the end of the night. What seems to be lost in the shuffle is whether or not people actually like the films; the movies themselves are increasingly treated as an afterthought in awards season, with hype and name recognition mattering more in the media's eye. But, in the end, what matters is the entertainment value of it all, and no doubt we'll still be on the edge of our seats each time those envelopes open.

Hollywood Royalty – The Ups and Downs of a Film Acting Career


A lot of work from many different departments goes into making movies, but what usually ends up defining the finished product more than anything is the quality of the actors performing in it. Whether we like it or not, the actors and actresses are what audiences respond to the most; more than the script or the direction itself. Sure, writers and filmmakers can leave an impression and build reputations of their own, but their work is meant to be unseen, part of the illusion of reality. It is the actors who must be in front of the camera the whole time and make you forget that you are watching something constructed for your entertainment. And this is mainly why we hold the acting profession in such high regard. Sure, anybody can get in front of a camera and act, but it takes real skill to make it feel authentic and true to life. Hollywood actors are an interesting lot because of the whole aura of celebrity that surrounds them. They are simultaneously the most beloved and most reviled people in the world, and this is usually a result of the work that they do. What I find fascinating is the way that a film actor's career evolves over time, and how this affects the way we view them in the different roles they take. Some people come into fame unexpectedly, while others work their way up. There are many ways to look at an actor's career, and each offers lessons on how someone can actually make an impact in the business depending on what they do as an actor.
The way movies are cast has certainly changed over the years. When the studio system was at its height in the 30's and 40's, actors were required to be under contract, meaning that they had to work for one studio exclusively. This became problematic whenever an actor or actress coveted a role that was being produced at a competing studio, which excluded them from consideration. Actors also had little choice in what kinds of movies they made, mainly because the studio bosses made those decisions for them. Many of these actors would end up typecast in the roles that the studios believed were the most profitable for them. It wasn't until the establishment of the Screen Actors Guild that actors finally had the ability to dictate the parameters of their contracts, and to have more say in the direction of their careers. Even then, the pressure to be a successful matinee idol was a difficult thing to manage in Hollywood. In many ways, it was often better to be a character actor in these early years than a headliner. A character actor at this time may not have gotten the name recognition or the bigger paydays, but they got more diverse roles and a steadier flow of screen credits. Actors from this era like Peter Lorre, Walter Brennan, and Thelma Ritter enjoyed long-lasting careers mainly because they made the most of their supporting roles and had more leeway in the directions of their careers.
It's the status of a matinee idol that really makes or breaks an actor. Over the many years since the inception of cinema, we've seen actors rise and fall, and in some cases rise again. Making a career out of film acting is a difficult nut to crack, and seeing how cruel the industry can be to actors it considers outdated, it's a wonder that so many people still want to do it. I believe that it's the allure of fame that drives many young up-and-comers to want to be actors, but following a dream does not an actor make. It takes hard work, just like any other field in entertainment. If I can give any advice to someone pursuing an acting career, it's that you should never get into it just because you have the looks of a movie star. Do it because you like performing and being a part of the film-making process. Of course, it's probably not my place to give advice to an actor, seeing as how I have not been on a stage since the eighth grade, and I am looking at this from a writer's point of view. But, from what I've observed in the film community, the best actors out there are the ones who are really engaged in the process, not the ones who are in it just to build up their image. The tricky part, however, is figuring out how to maintain this over time.
Becoming a successful actor in Hollywood has a downside that can be either a minor thing or a major one depending on the person it happens to, and that's the stigma of celebrity. Whether an actor seeks it out or not, by being out in front of the camera, they have exposed themselves to a public life. This isn't a problem if the actor or actress manages their public and private lives well, but if they don't, it'll end up defining their career more than the actual work that they do. Case in point: actor/director Mel Gibson. Mel's career has been negatively impacted by his off-screen troubles, including a nasty break-up with his Russian girlfriend and an anti-Semitic rant during a drunk driving arrest. What's most problematic for Mr. Gibson out of all this is the fact that no matter what he does now, no matter how good, it will always be overshadowed by his own bad behavior. And it is a shame because, in my opinion, he's still a very solid actor. I still love Braveheart (1995) to death, and I think a lot of people are missing out if they haven't seen his work in The Beaver (2011) yet. Or, for that matter, his excellent direction in Apocalypto (2006). Unfortunately, all his hard work is for naught as he continues to alienate more of his audience with his off-screen behavior. This is the downside of celebrity, and whether an actor is deserving of the scorn or not, it will always be a part of the business.
Actors and actresses can also find themselves in a rut simply because they are unable to adapt to the changing course of the industry. This is certainly the case with people who have created their own signature style of acting. Comedic actors in particular fall into this trap. I've noticed that actors who break through in comedies in one decade will almost always lose their audience by the next. Shtick is a deceptive tool in the actor's arsenal, because it helps people achieve stardom right away, but it also anchors them down and keeps them stuck in place. We've seen this happen to many comedic stars, like Eddie Murphy, Mike Myers, and Jim Carrey; and it's starting to become apparent in Sacha Baron Cohen's post-Borat career. The only comedic actors who seem to build long-lasting careers are the ones who choose a dramatic role once in a while, like Bill Murray or Robin Williams. Age also plays a factor in the downfall of people's careers. It usually happens with child actors who can't shake off their youthful image and unfortunately diminish and disappear once they become adults. Making that transition from child actor to adult actor is tough, and it's what usually separates the Elijah Woods from the Macaulay Culkins. It's less common nowadays for older actors to lose their careers than it was many years ago, mainly because movies like Nebraska (2013) give older actors much better roles. But, in the past, the industry was incredibly cruel to aging actors; something highlighted brilliantly in Billy Wilder's classic Sunset Boulevard (1950).
What usually ends up sustaining an actor's or actress's career is their ability to grow as a performer. There's something to the old adage that there is "no role too small." Actors should relish the opportunity to diversify their choices of roles. And usually the ones who have the longest-lasting careers are the people who can play just about anything. Meryl Streep is considered the greatest actress of her generation, and she didn't get there by playing the same kind of character over and over again. She has done comedies, dramas, cartoons; she has played Austrians, Australians, teachers, mothers, daughters, grandmothers, you name it. No one would ever consider her lazy. She has made a living challenging herself as an actor, and while not every role of hers works (Mamma Mia!, for example), she nevertheless commands respect for her efforts. What I respect the most is the ability of an actor or actress to move effortlessly from genre to genre and still treat each part like a role worthy of their talents. That's why I admire actors like Christian Bale, who can go from dark and twisted roles like The Machinist (2004) to playing Batman, or Amy Adams, who can appear in movies as diverse as Paul Thomas Anderson's The Master (2012) and The Muppets (2011) and give each film her best effort. It's always refreshing to see actors who commit themselves to any role they get, which in turn helps to endear them to us as an audience. An invested actor will almost always make a film better.
Nowadays, a bad performance is usually not measured by how misplaced an actor is or by how out of their league they may be. The worst kinds of performances come from actors who are just lazy. The point where an actor works just for the paycheck and nothing more is usually where their career begins to decline. We've seen this with many actors who get too comfortable doing the same role over and over again, or with people who take a job just for the pay and act like the part is beneath them. When this happens, it's usually driven by ego, which is another negative by-product of celebrity. When an actor begins to dictate the terms of their comfort level in a production, rather than trying to challenge themselves as a performer, it usually means they'll put in a lackluster performance, which leads them toward becoming a one-note performer. This sometimes happens to people who hit it big and then become afraid of alienating the new audience they've built. Johnny Depp is an actor that I think has reached this point, having built a wide fan-base from his Pirates of the Caribbean films. The once ground-breaking actor has now fallen victim to his own shtick, and that has negatively impacted his recent slate of films like The Lone Ranger (2013), which shows what happens when you try to play things too safe.
It is remarkable to see these changes in a film actor's career, because they usually happen unexpectedly. Overall, the actor is the one responsible for their own career path, but the market itself can be a wild card in the lives of these people. I for one value the efforts of a strong actor who'll continue to try hard, even when the roles stop being what they are used to. It's something of a miracle to see actors who continue to stay relevant year after year, like Tom Hanks or Sandra Bullock. They've locked into a career path that seems to have worked for them and have managed to maintain their faithful audiences even when they take on more challenging roles. What is also interesting is how Hollywood values a redemption story when it comes to an actor's career. A Hollywood comeback always manages to be a positive thing in the industry, especially when it happens to the least expected people; like Robert Downey Jr. bouncing back from his drug addiction to play Iron Man, or Mickey Rourke pulling himself out of B-movie hell when he made The Wrestler (2008). Film acting careers are probably the least predictable in the industry, and it takes someone with a lot of personal resilience to make one work. If there is anything an up-and-coming film actor should learn, it's that hard work pays off. Don't worry yourself over changing trends, and don't be afraid to act outside your comfort zone. In the end, the best thing you can do is to commit to the role, no matter what it is. Like the great George Burns once said, "Acting is all about sincerity. And if you can fake that, then you've got it made."

Tis the Season – Why Some Films Become Holiday Perennials


We've reached the end of another calendar year, and of course that can only mean it's holiday season once again. Whether we are celebrating Christmas, or Hanukkah, or whatever, it's a time of year when we all gather together and honor family, tradition, and the gift of giving. What's interesting about Christmastime, however, is just how much the holiday tradition is influenced by and centered around holiday-themed movies. A holiday film can pretty much be considered a genre all its own, since so many of them exist and are created specifically to invoke the holiday spirit. Not only that, but they are movies that we continually return to every year around this same time, like it's part of our holiday ritual. This doesn't happen with every Christmas-themed movie, however, since many of them try too hard to hit their mark and fail spectacularly. And yet, we see small films that no one thought much of at first grow into perennial classics over time, and in some cases add to the overall Christmas mythos that defines the season. But how do we as an audience discern the classics from all the rest? What really separates a Miracle on 34th Street from a Jingle All the Way? Quite simply, like with most other movies, it's all determined by what we bring from our own experiences in life when we watch a movie.
The emergence of perennial holiday classics is nothing new in pop culture, and it actually predates the beginning of cinema by a good many years. Literature has contributed holiday-themed stories in both short form and novels for the last couple hundred years, helping to both shape and reinvent Christmas traditions in a very secular fashion. Our modern day physical interpretation of Santa Claus can in fact be attributed to his appearance in "Twas the Night Before Christmas," the 1823 poem by American author Clement Clarke Moore. Moore's nearly 200-year-old poem is still being recited today, and it shows just how much tradition plays a role in keeping a perennial classic alive in the public's eye. Around the same time, acclaimed British novelist Charles Dickens wrote the story of A Christmas Carol, chronicling the tale of Ebenezer Scrooge and his visits from three ghosts on Christmas Eve. Since its original printing in 1843, A Christmas Carol has gone on to be one of the most re-adapted story-lines in history. Perhaps nearly a quarter of all holiday classics can claim to have been influenced by Dickens' classic tale of a dreary old cynic whose heart is warmed by the holiday spirit. Dickens meant for his novel to be a meditation on greed and class inequality, but I have no doubt that he purposefully meant for Christmas traditions to be the healing influence in Scrooge's reawakening. These stories still stand strong so many years later, and they show just how far back our culture began to value Christmas stories and songs as a part of the holiday tradition.
Even from the very outset of cinematic history, we saw films carry on holiday themes. Both Twas the Night Before Christmas and A Christmas Carol provided inspiration for movie-makers many times over, given their already beloved appeal, but some people in Hollywood also saw opportunities to add their own original holiday-themed stories into the mix. When the studio system emerged, the studios were well aware of the marketability of holiday themes. After all, people visited movie theaters frequently during the cold winters, so why not play up the festive mood that everyone was already in? For the most part, movies in these early years celebrated Christmas more frequently in short segments than in full-length story-lines; whether it was capitalizing on a popular new Christmas song in a lavish musical segment, or portraying a Christmas celebration as part of a larger overarching narrative. Many people forget that one of the most popular Christmas tunes ever written, "Have Yourself a Merry Little Christmas," wasn't even from a Christmas-themed movie; rather, it came from the 1944 musical Meet Me in St. Louis. But eventually the Christmas season became such an influential part of our modern cultural tradition that it would inspire films devoted entirely to the holiday spirit.
So, in the years since, we have seen holiday films become standard practice in Hollywood. Every year, it's inevitable that a Christmas movie will be released in time for the holidays. Unfortunately, Christmas movies very rarely achieve classic status. For every one that audiences grow an attachment to, there will be about a dozen more that will likely be forgotten by next December. Evidently, Hollywood's approach to the holiday season is less carefully planned out than any other part of the year. Their approach seems to be throwing whatever has Christmas in the title against the wall and seeing what sticks. Unfortunately, this has led to Christmas being more synonymous with bad movies than good. Some are well meaning films that fall short of their goal, like the Vince Vaughn film Fred Claus (2007) or the odd but charming Santa Claus: The Movie (1985). And then there are ugly, shallow, and distasteful films like Deck the Halls (2006), the Ben Affleck disaster Surviving Christmas (2004), or the deeply disturbing Michael Keaton film Jack Frost (1998), with the creepy-as-hell CG snowman. And the less said about the horrible 2000 version of How the Grinch Stole Christmas, the better. Overall, it is very hard to make an honestly cherished holiday classic in Hollywood, and that's mainly because the business just tries too hard. If you look closely, you'll actually find that a beloved holiday classic may come from the unlikeliest of places.
This was definitely the case with what has become not just one of the best loved Christmas movies, but one of the best movies, period; that film being Frank Capra's It's a Wonderful Life (1946). Capra's movie tells the story of George Bailey (a flawless Jimmy Stewart), a man who has given so much back to his hometown and gotten so little in return that he reaches the verge of suicide in his depression. Through the intervention of a guardian angel on Christmas Eve, George is shown what the world would have been like if he had never lived, rediscovers his value and purpose, and is finally rewarded by those whom he's helped all his life. The film is very uplifting and perfectly illustrates the true impact that the Christmas season has in our lives. With a theme like that, you would think that the movie was a smash hit when it was first released, but instead the movie was a colossal bomb. It bankrupted the company that made it and ruined Frank Capra's directing career from then on. The focus on George Bailey's deepening depression was probably too hard for audiences to take at the time, given that so many soldiers were returning home after the end of WWII. Despite its initial failure, It's a Wonderful Life managed to survive through TV airings, which happened on, naturally, Christmas Eve, and the film not only found its audience but became a seasonal standard. To this day, It's a Wonderful Life is still aired on network TV (the only classic era movie that still is), and audiences from every generation still embrace it warmly, no matter how old fashioned it may be. Pretty good legacy for a film that started off as a failure.
A holiday classic can come from an unlikely place like It's a Wonderful Life, but for many people, what counts as a classic is determined by their own tastes. That's why some people consider romantic comedies set around Christmastime to be holiday classics. Case in point, the movie Love Actually (2003) has grown into a beloved holiday classic, even though its themes are less about Christmas and more about the intertwining relationships between the characters. By standing out as a strong romantic film with a Christmas setting, it serves as an example of two genres working together. Cult movie fans have holiday classics that they cherish too, like the weird, campy Santa Claus Conquers the Martians (1964), which can hold the distinction of being one of the worst movies ever made and incredibly entertaining at the same time. And some people can even claim that Die Hard (1988) counts as a Christmas movie because of its holiday setting. Pretty much, it's whatever we bring with us from our own experiences to the movies that determines what we consider to be entertaining. Just as most people gravitate towards a movie based on their own interests, so too do we see that with holiday films. Hollywood has in some cases picked up on this and has catered to select audiences at Christmastime with genre-specific movies. Usually, it will take the consensus of a large audience to determine which ones stand out as the undisputed classics.
I think where Hollywood hits its mark most often is when it makes a holiday film that appeals to the memories of our own Christmas experiences. The film that I think hit a perfect bulls-eye in this regard, and stands as a true masterpiece of Christmas-themed film-making, is the 1983 classic A Christmas Story. Directed by Bob Clark, and inspired by the autobiographical stories of novelist Jean Shepherd, A Christmas Story perfectly captures the highs and lows of a young boy's experience during the holiday season. Ralphie (Peter Billingsley) is a character relatable to any young boy growing up in small town America, myself included, and watching him try so hard to manipulate his parents into getting him his dream present is something every child will identify with. Couple that with the hilarious performance of Darren McGavin as the Old Man and the iconic Leg Lamp, and you've got the very definition of a holiday classic. But just as A Christmas Story highlights good Christmas memories, there are classic films that center around a disastrous Christmas experience as well. The best example of this would be the very funny and endlessly quotable National Lampoon's Christmas Vacation (1989). We've had just as many Christmases like the Griswold family's as we have like the Parker family's from A Christmas Story, and Christmas Vacation perfectly encapsulates all the bad things that happen at Christmastime without ever losing the holiday spirit underneath. Not to mention it's the last time we ever saw a funny performance out of Chevy Chase.
So, despite the low success rate, we as an audience still seem to find a classic seasonal favorite in every generation. But how does Hollywood keep making bad Christmas movies every year when the movie-going public keeps rejecting all the junk they put out? I think it's because the season itself is such an overwhelming cultural force that most filmmakers don't really care about the product they're making, as long as it's holiday themed and ready to capitalize on the mood of the period. When it comes down to it, a great holiday classic is not determined by how soaked in the holiday spirit it is, but rather by how well its story works. We keep watching It's a Wonderful Life every year because of how inspirational George Bailey's life story is, not because of the Christmastime finale that has come to define it. In fact, the movie is not really about Christmas at all; it's about the life of one man and his value to the world. Other movies become Christmas classics just because of a wintry setting, where the holiday is not even mentioned. And even films that subvert Christmas traditions, like 2003's Bad Santa, have become genuine holiday classics to some people.
I, myself, love a good Christmas movie, and because I'm such an ardent appreciator of movies in general, these films have certainly become a part of my holiday tradition. I return to It's a Wonderful Life and A Christmas Story every year and never get tired of them. And not a year goes by when I don't drop at least one quotable line from Christmas Vacation during the season. I hope every generation gets its own perennial classic that will last for years to come. Just please; no more remakes or sequels. We all saw the backlash that the recent announcement of a sequel to It's a Wonderful Life received. I only wish The Grinch and A Christmas Story had been spared the same fate. Like Christmas dinner itself, there can be too much of a good thing when it comes to Christmas movies.

Flicks and Picks – The End of the Blockbuster Video Era


Though it was long seen coming, it finally became official this last week: Blockbuster Video is no more. While this is a sign of how home entertainment has progressed, mostly for the better, with on-demand and streaming video making it easier for consumers to watch whatever they want, it also brings an end to an institution that has been at the center of many cinephiles' lives. Apart from some independent holdovers here and there, you will rarely find a video store in your local neighborhood today. But back in the day, finding a store devoted to video rentals was as easy as finding a McDonald's. The decline of video stores over the years certainly had to do with the advancements in streaming video, but the dominance of Blockbuster Video as a company played a role as well. In a way, by working so hard to become the top dog of the video rental market, Blockbuster also facilitated its own downfall when the market changed once again. Though the end of Blockbuster was inevitable, and needed to happen, it does leave a gap for those of us who built our love of film by renting from a local video store. The video rental experience, while not exactly life-changing, is something that most film lovers have been through at some point in their lives, and this week it became a thing of the past. In this article, I will look back on the era that Blockbuster Video defined, and what its end means for the future of home entertainment.
In the 1980's, we saw the rise of VHS, which gave studios and filmmakers the ability to make films available for purchase after their theatrical release for the very first time. Before, audiences had to wait for airings on television to see their favorite films again, and that also meant having to put up with commercial breaks. When VHS tapes started to be produced by the studios directly, it led to the creation of a niche market, with stores opening up across the country geared directly toward filling that public appetite. Being able to own a movie as part of a collection is commonplace nowadays, but when home video sales began, it was an exciting new frontier and it had an influence on the film industry almost instantly. Not only did the rise of home video affect the number of theatrical runs that a movie would have, but it also drove the movie studios toward film preservation and restoration, because, of course, presentation matters for home viewing.
But, like most new technology, VCRs were very expensive, and buying a movie to play on one was also not cheap at the time. Some retailers even had to pay prices as high as $100 per movie in order to have it available in their stores. So, in order to get more out of their product, and to give audiences better access to the movies they wanted, video rental services came into being. Like checking a book out from a library, consumers could rent a movie for a certain number of days at a low price. This business model worked extremely well and led to a boom in VCR sales. Video stores popped up all across the country, both locally owned and franchise operated, and home video very quickly became a major part of the film industry as a whole. But it wasn't just studio films that benefited from this new market. Independent producers saw an open opportunity in this new industry, and before long a whole direct-to-video market opened up, thanks to video stores being willing to indiscriminately sell and rent out a whole variety of films as a way to fill their shelves with more product. In these early days, it was very common to see a diverse collection of independent stores in your hometown, as it was in mine. The stores that I grew up with in my hometown of Eugene, Oregon went by such varied names as Silver Screen Video and Flix & Picks, and choosing a rental from these places certainly had an effect on my growing interest in movies at a young age.
But that changed in the mid 90's when the video rental industry became more standardized. Out of this period came a chain of stores known as Blockbuster Video. Blockbuster was founded in 1985 in Dallas, Texas, and started off as just another local retailer like most other stores before it began to expand rapidly. In the late 90's, it was common to find at least one Blockbuster in your area, and by the end of the decade, Blockbuster was unrivaled in the home video market. Their rise had the negative effect of forcing much of the other competition out of business, which benefited them for the time being, but it would come back to bite them in the years ahead. Blockbuster may have been ruthless to the competition, but in becoming the biggest name in the industry, they did manage to do many beneficial things that revolutionized the market. For one thing, they were the first national retailer to offer video game rentals. Their standardization of rental pick-ups and drop-offs also changed the way we rent movies, making the drop-off slot at your local store a life-saver late at night. Blockbuster was also the first chain to work within the film industry to create exclusive promotions and deals on upcoming releases. Despite the lack of choice in rental stores that came with Blockbuster's dominance, I don't believe that consumers cared much as long as Blockbuster still operated efficiently.
Most film lovers will attest that they've probably spent a good amount of time in a Blockbuster store. While many of us could find exactly what we wanted at any time, there was another side effect that changed how we grew up watching movies after spending time in a Blockbuster, and that was the impulse rental. I'm sure most of you out there have come out of a Blockbuster Video at one time or another with a movie you'd never even heard of instead of the one you wanted, simply out of curiosity. Having a variety of choices seems normal now, but not until video rental came about did consumers have that level of control over what they were able to watch. Before, you would have been limited to whatever was playing on TV or in your local cinema, but stores like Blockbuster made consumer choice as simple as a quick scan through their shelves. For many cinephiles, I'm sure that part of their growing love for films started with a surprise choice in the local video store, and with stores as big and as well stocked as Blockbuster, those surprises could come from even the most obscure of titles. Blockbuster was also handy for film students like me whenever we had to watch a film as part of an assignment. Whether it was a film we knew or not, at least we had the comfort of knowing there was a place we could look for it in a hurry.
In the later years, however, the market began to change again. The internet revolutionized video streaming in the latter part of the 2000's, and our reliance on VHS and DVD for home entertainment soon became a thing of the past too. Even though Blockbuster had cleared out all comparable competition, they were ill equipped to take on the likes of a Netflix. What Netflix did was eliminate the middle man in video rentals and have movies sent directly to the home through the mail, which made it unnecessary for anyone to go out to a store and rent a movie anymore. Blockbuster tried its own rent-by-mail service in response, but by then the damage had already been done. Netflix had surpassed Blockbuster as the number one rental service, and the former giant had to begin downsizing in order to survive. Soon, Redbox emerged and took away even more business from Blockbuster, appearing as convenient vending machines in grocery stores for anyone looking for an impulse rental. Like most other forms of retail, the trend has moved towards online shopping, and Blockbuster is one of the biggest to have fallen, mainly because their business model was one that couldn't adapt in the digital age. All that's left of Blockbuster is its still recognizable name, and even that is owned by someone else now (it was purchased by Dish Network in 2011, mainly for the branding of its on-demand service).
Because Blockbuster had already eliminated much of the competition beforehand, the transition to on-demand video renting was actually faster and less rocky. There was no large group of various retailers resisting the changes in the market; only Blockbuster. And now that they are gone, the era of brick-and-mortar video rental shops has ended with them. Sure, there are independent stores in certain areas that still serve nostalgic purposes, but their clientele is limited. Now it is more commonplace to hear that people have a Netflix account rather than a Blockbuster card. But Blockbuster still left a legacy that will not be quickly forgotten, especially among longtime movie aficionados. Many of us can still remember moments when being close to a Blockbuster came in handy; whether it was for a late night impulse rental, or for a quick bit of research, or for merely wanting to see a movie that you missed the first time around. For many people, the first time they watched a particular movie was probably not in a movie theater but through a rental from big blue. I can certainly say that I credit my local Blockbuster for helping me experience so many different types of movies. One of my favorite films of all time (Seven Samurai) came to me through an impulse rental from Blockbuster, and I will always be grateful for that.
So it's a bittersweet end for the onetime giant. Their closure spells the end of an institution that has been a big part of all of our cinematic experiences, but it's a closure that was necessary. Netflix and Redbox are simply better and more convenient services, and Blockbuster was a relic standing in the way. But, as we move forward, will those two also fall prey to the same fate as Blockbuster? My guess is probably not. Blockbuster had the unfortunate circumstance of being the top force in a market that was destined to fall. Netflix and Redbox, however, have benefited from the fact that they stand in direct competition with each other, and that has led to new and creative avenues for both companies. Unlike Blockbuster, Netflix has branched out and generated its own exclusive content, including comedy specials and original shows like House of Cards, which makes it not only a great rental service but also a competitor to broadcast TV. And Redbox is able to make itself available in locations all across the country without having to set up the infrastructure of an entire store chain. With Amazon and Walmart entering the market with their own video streaming services, Amazon Prime and VUDU, it's clear that the rental market is going to continue growing in this new direction. Blockbuster is certainly done as an independent company, but without it ever being there in the first place, the rental business would never have gotten to where it is now, and that's the legacy it ultimately leaves behind.

Apocalyptic Cinema – Making Disasters Entertaining in Movies


One thing we often see in human nature is a destructive impulse; or to be more specific, we all like to see something get destroyed. Whether it's something benign like blowing down a house of cards or something more extreme like the implosion of a building, we just enjoy watching something that was built up be taken down. Hell, we even do it to each other through schadenfreude; whether it's in politics, like the Anthony Weiner scandal, or the rise and fall of a Hollywood star like Lindsay Lohan. Our culture seems to relish destruction as a part of entertainment. I don't necessarily find this to be a bad thing, as long as it doesn't get out of hand. And that's usually what we find in a lot of movies as well. Disaster films have been a staple of movie-making for generations, but in recent years, visual effects work has become sophisticated enough that destruction looks authentic enough to be believable. But when movies become ever more comfortable showing widespread destruction as a part of their storytelling, there starts to be a question about where the line must be drawn. Is it right for us to feel entertained when we see things like the White House or the Capitol Building being destroyed? How about the entire world? In this article, I will look at the highs and lows of disaster film-making and how audiences' reactions to these films reveal the extremes to which people want to be entertained.
A lot of the reason why disaster films exist is that they are a great showcase for special effects. Going all the way back to the silent era, we've seen filmmakers use primitive but successful effects work to create larger-than-life destruction. You could even look at some of the early Biblical epics, like Cecil B. DeMille's 1923 version of The Ten Commandments, as early examples of the disaster movie. The film had a moral message, yes, but I'm sure there were many audience members who saw the film just because they wanted to see the grandiose destruction caused by the ten plagues and the parting of the Red Sea. As special effects became more sophisticated, disaster movies increased in number. Soon films were crafted around some of the most famous disasters in history, like In Old Chicago (1937), which depicted the Great Chicago Fire of 1871, or San Francisco (1936), dramatizing the famous 1906 earthquake. It wasn't until the 1970's, however, that disaster films could be declared a genre all their own. In that period, we saw a glut of disaster related movies made specifically to be epic, star-studded extravaganzas with the latest in special effects work on display. These films included Earthquake (1974), starring Charlton Heston; The Poseidon Adventure (1972), with Gene Hackman and Ernest Borgnine; and The Towering Inferno (1974), with Paul Newman and Steve McQueen, just to name a few.
The rise of the disaster movie genre in the 70's began to die down in the 80's, mainly due to the rise of science fiction and fantasy films as the new showcase for effects work, but the genre lived on as it began to evolve. In the 1990's, we saw the emergence of a filmmaker who would go on to not only redefine the genre, but make it all his own. This filmmaker was German-born director Roland Emmerich, and over the course of his career, nearly 80% of his filmography has been made up of disaster movies. The movie that put him on the map in the film industry actually redefined two genres in one, and that was 1996's Independence Day. The movie was essentially an alien invasion narrative, but what Roland Emmerich did was use the techniques of popular disaster films to make the destruction caused by the aliens look and feel as real as possible. In the movie, we see catastrophic explosions engulf entire cities, destroying landmarks before our very eyes, including the White House itself. This was a film that not only drew upon our greatest fears of total annihilation, but also made them feel completely real. Independence Day was a phenomenal success when it premiered, and it made the disaster genre a force to be reckoned with. As for Emmerich, he has stuck mostly with the genre that made him a player in Hollywood, with mixed results; successful but ludicrous films like Godzilla (1998), The Day After Tomorrow (2004), and 2012 (2009) all fall into the same mold as Independence Day.
But what was interesting about the success of Independence Day was what it revealed about how we react to seeing destruction on film. In the movie, famous landmarks like the Empire State Building are blown to pieces and thousands of people are killed in seconds before our very eyes. And this is what we consider entertaining? Maybe entertaining isn't the right word. I think movies like Independence Day do well because they allow us to face our fears and indulge that sinking feeling of helplessness. It's not so much the scenes of destruction themselves that we find so entertaining, but the framework around them. While watching a disaster movie, we need to feel the impact of the destruction, and that's why so many disaster films have to finish with a happy ending. In Independence Day, the colossal destruction closes the first act of the film. The rest of the movie details how humankind copes with the aftermath, and how they fight off the invaders despite the odds against them. You have to go through a lot of darkness before you can appreciate the light at the end of the tunnel, and that's what has defined the best films in the genre. If a film takes a bleak outlook and doesn't give the audience a satisfying resolution, then it's going to fail. This has been the case with other disaster films, like 2009's Knowing, which leaves everyone dead and the earth uninhabitable at the end; sorry to spoil it for you. Even the laughable 2012 left room for some hope for humanity, and not surprisingly, it did much better.
Disaster films have to thrive on that feeling of hope. We become enthralled when we see something grand get destroyed, but it's what rises from the ashes that makes us feel grateful in the end. That's why we enjoy watching controlled demolitions; old buildings must come down in order to make way for something better. That helps us understand why we accept destruction as entertainment. Many films skirt that line, but the way a disaster film gets the audience on its side is through the characters. Characters in disaster movies must be likable and easy to identify with. It also helps if they are not thinly drawn stereotypes, but fully defined people. Emmerich's films have tended to have lackluster characters, which is why casting makes such a difference in his movies, and in others like them. Independence Day worked well because you had charismatic performances from actors like Jeff Goldblum and Will Smith, who helped balance the film out by creating characters you wanted to root for. Other disaster films tend to miscast their roles, making their characters' story-lines a little harder to swallow. Case in point: John Cusack in 2012. Cusack is a fine actor when a movie calls for it, but when your character is a mild-mannered author who is somehow able to outrun the eruption of a supervolcano, that I have a hard time buying. Now, it's difficult to say that a character needs to be believable in a movie centered around a fictional disaster, but sometimes it does matter. The likability of the characters is what separates the good disaster films from the bad ones, and unfortunately that's something you rarely see done effectively.
For the most part, disaster films exist because they are showcases for the newest techniques in special effects.  The human element in these films is crucial, but it plays a lesser part in the creation of the movies as a whole.  But, when the balance of these films isn’t struck in the right way, then they run the risk of seeming either lackluster or, worse, exploitative.  This became an issue in Hollywood in the aftermath of the September 11th attacks, when New York City saw a level of destruction in real life that we could only comprehend in movies before.  Soon after, the Independence Day style destruction of city-scapes in movies stopped for a while, because that imagery became all too real for us and seeing it on the big screen afterwards would’ve been seen as insensitive.  Now that time has passed, we are seeing that kind of destruction depicted again, but it took a while for us to get there.  What I think determines the acceptability of disaster imagery for audiences is the balance between the level of destruction in the movie and how it functions within the narrative.
Even though it came out months before 9/11, I think the Michael Bay film Pearl Harbor (2001) fell into that unacceptable exploitation category because it didn’t find that right balance.  In the movie, the famous attack is depicted in gruesome detail, but it lacks any resonance because it is just the backdrop to a rather lackluster love triangle plot.  A lot more respect could have been paid to the real men and women who died on that day instead of having everything hinge on fictional characters that we care so little about.  Pearl Harbor felt more like a shallow Hollywood attempt to exploit a tragedy for the purpose of creating a film that showcased impressive production values and matinee idol stars.  In other words, it was a movie driven more by marketing than by actually informing audiences about the real event.  If you don’t find that right balance in a disaster movie, then your film will not be believable, as was the case here.  Pearl Harbor failed as a movie mainly because it knew what it wanted to be, but the filmmakers didn’t know how to make it work.  They were trying to follow in the footsteps of what is still the only disaster film to win the Academy Award for Best Picture: director James Cameron’s Titanic (1997).  The reason why Titanic worked and Pearl Harbor didn’t is that it had a balance to it.  The love story at the center of Titanic, while not the most engrossing, did keep the narrative moving and it did endear the characters involved to the audience before the pivotal event happens.  Also, James Cameron put so much detail into the recreation of the ship’s sinking, and every moment of it is well executed on screen.  No shaky cam or needless destruction is present in the climactic moments of the movie.  It works because the film was, dare I say, respectful to the actual disaster and to the victims of the event as well.
Making disaster movies thoughtful turns out to be one of the secrets to the genre’s success.  Going back to my example film once again, Independence Day, we see that the film works despite its more ludicrous moments by actually having characters work out logical answers to their dilemmas.  It’s not enough to have the characters just move from one disaster to another without explanation, like in 2012, or to have our characters helplessly stand by as the world crumbles around them while injecting stale philosophical points about why it all happened, like in The Day After Tomorrow.  We want to see our characters be problem solvers and actually deal with the apocalypse like it’s something they can come back from.  That’s why, despite its many flaws, Independence Day succeeds.  Mankind coming together to “take those sons of bitches down” is an ultimately inspiring thing.  Whether it’s against nature, or the extraterrestrial, or our own selves, we enjoy watching characters pull themselves out of a struggle.  That’s why I think World War Z succeeded this year, despite all the naysayers who predicted it would fail (myself included).  The movie looked like another exploitative take on the zombie sub-genre, but the finished film was a more thoughtful examination of how the survivors of the catastrophe try to deal with the problem and learn to survive.  Sometimes it helps to treat your audience to a more thoughtful story about survival, rather than just destruction.
Disaster films will always be around as long as there is an audience for them.  And as long as filmmakers treat their audiences’ intelligence with respect, we’ll also see the disaster genre gain more respectability in the film community.  I like the fact that disaster films have become such an accepted part of cinematic history that it’s now commonplace to spoof them as well.  This summer, we got not one, but two comedies centered around apocalyptic events: Seth Rogen’s This is the End and Edgar Wright’s The World’s End.  Both films are hilarious takes on the genre, but they both know what makes a good disaster film work in the end and they exploit those elements perfectly.  It comes down to characters you want to root for and wanting to see them overcome even the complete destruction of society as we know it.  Even though the films are played for laughs, the same basic elements hold true and the filmmakers who made them know that.  Overall, destruction becomes entertainment because we look forward to the process of renewal.  Disaster movies fail if they indulge too heavily in the destructive parts or leave the audience with no satisfying resolution.  It’s human nature to enjoy seeing something blow up, but we also enjoy seeing something good rise out of the rubble of the destruction, and in the end, that’s why we enjoy a good disaster movie.

Not So Scary – Modern Horror Movies and the Lack of Genuine Scares

 

mama
Horror movies have been around since the very beginning of cinema.  From F.W. Murnau’s classic vampire flick Nosferatu (1922) to Universal Studios’ monster movies like Dracula (1931) and Frankenstein (1931), audiences have made watching scary films a long standing tradition.  And, like most other genres, horror has grown and evolved with the times, satisfying the changing tastes of its audiences.  In the 50’s, we saw the rise of the Sci-fi monster movies, and in the 60’s and 70’s, “schlock” horror began to become popular, thanks to relaxed restrictions on acceptable on-screen violence.  It is a genre that has more or less stayed strong in every decade and is more adaptable than almost any other genre of film.  But, in recent years, I have noticed that there has been a severe drop-off in horror movies that actually leave a mark.  It seems that today, studios are more interested in quantity than quality, and it’s a trend that is having a negative effect on the genre as a whole.  My belief is that studios are using the horror genre as a way to generate a quick influx of cash, knowing that there is a built-in audience of people who will watch horror movies no matter what.  That’s why you see so many horror films quickly drop off after their opening weekend.  There seems to be a belief nowadays that you can pass something off as a horror movie if it has one or two big scares; but the reality is that the best horror films don’t always rely on things that make us jump out of our seats.
What makes a great horror movie is the use of atmosphere.  This has been the case since the very beginning, back when cinema was still silent.  F.W. Murnau’s silent masterpiece Nosferatu shows exactly how atmosphere can be used to signify terror.  In the movie, we see how simple staging and effective use of shadows can be used to terrifying effect.  The vampire Count Orlok, played by actor Max Schreck, is able to strike at his victims using just his shadow, an image created simply through the movie’s use of lighting, but done with chilling effectiveness.  Early Hollywood horror films likewise made great use of atmosphere.  If you look at a movie like Dracula, there is actually very little on-screen violence present.  Instead, the film presents a feeling of dread through the gloomy atmosphere of the vampire’s castle.  Thanks to that, and Bela Lugosi’s iconic performance, you don’t need to see the bloodletting of Dracula’s victims in order to be scared.  This has helped to give these movies lasting power over so many years.  It’s amazing that movies made in the early days of cinema can still be scary, given all the limitations they had.  And given all the bad things we’ve seen happen to movie vampires in recent years (I’m looking at you, Twilight), I’m glad that Lugosi’s version of the Count can still create a chill.
Understandably, the horror genre has had to grow and evolve with the times in order to survive, but for many years there was still an emphasis on atmosphere at play.  The more rebellious era of the 70’s allowed for more use of onscreen violence, and while many filmmakers perhaps went a little overboard in this period, there were a few that actually made an impact.  Dario Argento created films that were not only gory but also artistically staged, like The Cat o’ Nine Tails (1971), Deep Red (1975) and the very twisted Suspiria (1977), which showed how atmosphere could still be used to enhance the gore on film.  Director George A. Romero likewise used atmosphere effectively in a sub-genre of horror that he helped create: the zombie flick.  Despite the fact that these directors were given more leeway to do what they wanted, what made their early work so effective was how they showed restraint.  You can show a lot more in horror movies nowadays, but sometimes what remains unseen becomes the scariest element, and that’s why films of this era managed to be effective.  The filmmakers knew when to be shocking and when to show restraint, based on what the horror movies that inspired them had done in the past.  But, as generations of filmmakers become more desensitized to what can be allowed in a horror movie, that sense of restraint also goes away.
The problem that I see in most modern horror movies today is that there is no self-restraint left in them.  For the most part, filmmakers choose to throw atmosphere out the window in favor of “jump scares.”  A “jump scare” is when something suddenly pops onto the screen out of nowhere in an attempt to make the audience scream and jump all at the same time, usually accompanied by a loud music cue to maximize the effect.  A “jump scare” can work when it is used sparingly, but too many films today are overusing it, which diminishes its effectiveness over time.  One of the best examples of a jump scare is actually in a film that you would consider more of a thriller than a horror movie: Jaws (1975).  The scene in question is when scientist Hooper (Richard Dreyfuss) is investigating a shark attack on a fishing boat at night.  While he is examining the hole in the bottom of the boat, a severed head suddenly pops out, creating a genuine scare for both him and the audience.  This scene is effective because it is unexpected and is built up thanks to the atmosphere of the moment.  Also, it is one of the few times that director Steven Spielberg actually uses a “jump scare” in the movie.  The fewer times it happens, the more effective it is, and unfortunately that’s a technique that few horror filmmakers today understand.  When you use a technique too many times, it becomes tiresome and the audience becomes more aware of it.  Unfortunately, too many filmmakers get carried away and have too much fun creating these kinds of “jump scares.”
One other problem I have noticed with modern horror films is the over-abundance of CGI.  While computer effects can sometimes be helpful in a horror film, like making it look like a character has lost a limb or manipulating an environment in a way that defies physics, there is a larger problem of effects work making moments that should be scary less so.  The problem is that most computer effects look too artificial.  Of course, when you see puppetry and prosthetic work used in horror movies, they are far from realistic too, but those effects are at least physical in nature and actors can still interact with them.  When you see a horror movie use CGI too much, you just know that the actors are reacting to nothing but a green screen effect.  A recent movie like Mama (2013) loses all of its chills when you see its digital apparition appear.  This is more apparent in smaller budget horror films, which you can kind of excuse due to budget limitations.  But when a bigger budget horror film, like the upcoming Carrie remake, looks so pathetic because of overdone CGI effects, then you begin to see how digital imagery has a negative effect on the genre.  Even a good horror film like World War Z suffered from some unnecessary CGI work, which had the unfortunate effect of making the zombies less frightening.  If ever there was a place where I wish horror filmmakers would show more restraint, it would be here.
Another problem that I see plaguing the horror genre is the lack of original ideas.  Today we are seeing an overabundance of the same kinds of ideas used over and over again.  Seriously, how many haunted house movies do we need?  Not only that, there are far too many remakes and sequels in the horror genre.  Do we really need seven Saw movies and four Paranormal Activities?  Horror sequels have become so absurdly common that we get ridiculous titles like The Last Exorcism Part II and The Haunting in Connecticut 2: Ghosts of Georgia as a result; and yes, that second title is real.  I see it as commerce taking precedence over artistic vision, and the fact that film studios are more likely to invest in something already established than in something new.  Every now and then, you do see a movie with a fresh idea come about, like Paranormal Activity in 2007, but even that was driven into the ground with too many follow-ups and diminishing returns.
Remakes are also a negative factor in horror movies today.  What you usually see in these horror remakes are films that get rid of all the atmosphere from the originals in favor of upping the gore factor and the scary bits, just because filmmakers now have the ability to show what could only be implied in the past.  The problem with this is that it completely misses the point of what made the original films so effective in the first place.  A particular example is 2011’s The Thing, nominally a prequel to John Carpenter’s film but a remake in all but name, which loses all of the substance of the original in favor of just making the film as gory as possible.  Gore does not equal scary.  Filmmakers like Carpenter knew that, and that’s why they used gore sparingly.  The sad thing is that remakes keep trying to one-up the originals because the tools today are so much better, and they fail miserably nearly every time.
Thankfully, despite the attempts by Hollywood to push the horror genre into more exploitative territories, the classics still hold up all these years later.  Even a 90-year-old film like Nosferatu still gives audiences chills to this day.  And I think that it all comes down to atmosphere.  It’s like how people tell ghost stories around a campfire.  Would you rather listen to the story that builds up to a chilling ending that’ll leave you with nightmares, or the one that gets caught up in the gory details and then just ends without a payoff?  That’s what’s being lost in horror movies today.  The classics knew how to build their stories around scary ideas, and not just the imagery.  The Twilight Zone became popular on television because it presented us with unsettling scenarios that made us anxious the longer we thought about them.  Not once in the famous episode do we see the monster on the wing of the plane attack William Shatner; it was the frightening possibilities, along with Shatner’s paranoid performance, that made the episode scary.  The best horror movies have staying power because they knew that their audiences had imaginations capable of filling in the gory details that remained unseen.
So, is horror a dying genre?  Of course not.  There is an abundance of terrible horror movies out there, but that’s only because the market has been flooded.  Every now and then, a fresh new idea comes along that not only makes an impact, but also goes on to influence the genre as a whole.  One thing that I would like to see an end to in the horror genre is the over-abundance of terrible remakes.  Just looking at the new Carrie remake trailer makes me laugh, because it takes everything that worked in the original and makes it less subtle.  I strongly believe that CGI, and shaky-cam for that matter, are making horror films less frightening.  They are showy techniques that ruin the atmosphere needed for a good horror movie, and I wish more filmmakers would show restraint.  I’ve generally stayed away from recent horror films because of this, and the horror movies that I gravitate towards are ones that have been around a long time.  If you’re wondering which one I consider my favorite, it would be Stanley Kubrick’s The Shining (1980).  Talk about a film that makes the most out of its atmosphere.  I hope that other horror filmmakers take a look at what makes the classics as scary as they are, and learn the effectiveness of restraint.  You’d be surprised how far a little scare can go when it’s built up well enough.

The Best of the Worst – Why We Have a Good Time Watching Bad Movies

 

manos
If there was ever a place where the word “bad” could be considered a relative term, it would be in the movies.  Over the course of film history, we have seen Hollywood and the film industry at large put out an astounding variety of movies, and not all of them have hit their targets the way that the filmmakers had intended.  If you produce hundreds of products within a given year, the odds are that some, if not most, of them are not going to be good.  But like most things, one man’s trash can be another man’s treasure, and that has led to a fascinating occurrence in the film community.  Some “bad movies” have actually earned a fanbase all on their own, finding an audience in some unexpected ways.  This has been the case with films that have built a reputation over time, but nowadays, we are actually seeing intentionally bad movies become phenomenally successful upon their initial releases, as was the case with the premiere of Sharknado on the Syfy channel earlier this summer.  How and why “bad movies” find their audiences is still a mystery to many, but what I love about this trend is that it shakes up our preconceived notions about the film industry, and makes us reconsider what we find entertaining in the first place.
So, what is it about these “bad” movies that makes them so entertaining to us?  The truth is that there is no “one thing” that defines the success behind these films, and usually it’s all relative to each individual movie.  Sometimes it’s the incompetency behind the making of the film that we find so entertaining.  Sometimes it’s because the film is so out-of-date that it becomes hilarious.  Sometimes it’s the lack of self-awareness and ego behind the director’s vision.  And sometimes it’s the filmmakers just not giving a damn what other people think and going all out with their material.  The formula has no consistency, and yet we see many films fall into these different categories of “bad” movies.  Usually the best of these are the ones that fulfill the criteria of a “bad” movie so perfectly that they become memorable and re-watchable.  Only in rare cases does this work intentionally, and usually the best “bad” films arise from an unexpected accident.
Some of the best “bad” movies have come out of turmoil, which makes their existence all the more fascinating.  Usually this is attributed to movies that were made despite the fact that their filmmakers didn’t know what they were doing.  One of the most notorious examples of this is the 1966 cult classic, Manos: The Hands of Fate.  Manos was the creation of Hal Warren, a fertilizer salesman from El Paso, Texas who made a bet with a screenwriter friend of his that he could make his own movie without any help from Hollywood.  Making good on his wager, Mr. Warren wrote and directed this schlocky horror film centered around a cult leader named the Master (pictured above) who holds a family hostage in his compound, which is watched over by a lecherous caretaker named Torgo.  Hal Warren shot the film with a camera that could only shoot 30 seconds of film at a time and recorded no sound, and most of the movie was shot at night, with the set lighting attracting moths in almost every shot.  The finished film is a convoluted mess, and it ended any hope Hal Warren had of pursuing a career in filmmaking.  However, many years later, the film was rediscovered by the producers of Mystery Science Theater 3000, who featured it on their show and created a renewed interest in this odd little film that no one outside of Texas knew about.  Manos became a hit afterwards because people were fascinated by how silly this poorly made film was, something that the MST3K crew had a hand in.  Since then, Manos has earned a reputation for being among the worst films ever made, and that in itself has made it a favorite for people who gravitate towards that sort of thing.
While Manos represents an example of a disaster turned into a success, there are other bad films that have become fan favorites just by being incredibly dated.  These movies usually make up the majority of what people consider good “bad” films, since most films are a product of their times.  Whether people are entertained by these because of their out-of-date nature or merely because of sheer nostalgia, there’s no denying that time has a way of changing how we view these kinds of movies.  The 1950’s have become an era that many film fans find to be full of good trash, mainly due to the rise of the B-movie in this period.  Cult hits like The Blob (1958), Creature from the Black Lagoon (1954), Attack of the 50 Foot Woman (1958), and The Thing from Another World (1951) all rode the surge of the sci-fi craze of the post-war years, and while everything from these films, like the visual effects and the acting, feels antiquated today, they still have a camp value that makes them watchable all these years later.  The “cheese factor” plays a big role in keeping these films entertaining long after their relevance has diminished.  You can see this also in the beach party movies of the early 60’s, which are charming despite their paper-thin plots.  The one other era that has produced its own distinctive set of dated films would be the 1980’s, with its collection of fantasy pictures and culturally infused fluff, like the He-Man-inspired Masters of the Universe (1987) or the E.T. wannabe Mac and Me (1988).  By all rights, these films should have long been forgotten outside of their era, and yet they live on with audiences who still find something entertaining in them.
One of my favorite types of “bad” film is the kind that comes from a complete lack of control from either the director or the performer.  Some directors have actually built their reputations as filmmakers by staying within the B-movie community.  The most famous of these is Ed Wood Jr., a man whom some have claimed to be the worst director in history.  Ed Wood’s notable contributions to cinema have been the cross-dressing comedy Glen or Glenda (1953), the Bela Lugosi-starring Bride of the Monster (1955), and what many consider the director’s “masterpiece,” Plan 9 from Outer Space (1959).  Whether Ed Wood was earnest in his vision or made his films intentionally bad is still debated, but there is no doubt that Plan 9 is a special kind of “bad”: a movie so aggressively cheesy that it is hard not to be entertained by it.
Other filmmakers who were more aware of their B-movie status have still gained an honored reputation with audiences.  Roger Corman, a man who prided himself on making movies both fast and cheap, has actually become influential to a whole generation of blockbuster filmmakers.  His The Little Shop of Horrors (1960) even inspired a hit stage musical.  Also, sometimes a way-out-there performance can make a “bad” movie worth watching.  I would argue that this is the case with most Nicolas Cage films, like Vampire’s Kiss (1988) or Ghost Rider: Spirit of Vengeance (2012).  One film that has become a cult classic mainly due to one “out-of-control” performance is 1981’s Mommie Dearest, in which Faye Dunaway gloriously chews the scenery as a way over-the-top Joan Crawford.  Usually a lack of restraint by the filmmakers can sink a film, but these movies prove that it’s not always the case.
While many films become cult hits over time, there are a select few that attempt to achieve cult status right away by being intentionally bad.  Like I stated earlier, Sharknado became an instant hit when it premiered on cable, and having seen the film myself, it’s clear that the filmmakers behind it knew what kind of movie they were making.  Rarely do you see filmmakers aim for that intentionally “bad” gimmick in their movies, because obviously if audiences don’t accept it, then you’ve just made a bad movie.  Director Tim Burton tried to create a homage to B-movie sci-fi with his 1996 film Mars Attacks!, but the film was an odd blend of tongue-in-cheek mockery and earnest storytelling, and the end result doesn’t achieve what it set out to do.
But, one example of an intentionally bad film that did click with audiences is the campy musical The Rocky Horror Picture Show (1975), a movie that pays homage to campy horror and sci-fi while mixing in 50’s rock music and trans-sexual humor.  Rocky Horror tries so hard to be so bad, you would think that the whole thing would be a mess; and yet, it remains entertaining and it has one of the most dedicated fanbases in the world.  I think the reason a movie like Rocky Horror works is that it just doesn’t care what people think about it.  It is what it is, and that’s why people gravitate to it.  It’s one of a kind.  A movie like Mars Attacks! didn’t click as a throwback because it didn’t have that same kind of assured belief in itself, and that shows why it is hard to make a bad movie feel good.
When it comes down to it, “bad” movies are usually determined by the tastes of the people who watch them.  We have made some of these “bad” movies our favorites because of the value we find in their cheesiness, or because of our fascination with how badly they get things wrong.  For a movie to be all-around bad, it has to lack any kind of entertainment value in the end.  For those who are wondering, the worst movie that I have ever seen, and one I see no redeeming value in, is the 1996 film Space Jam.  To me, it was the worst experience I have ever had watching a movie, mainly because I saw it as a blatant, self-serving promotional piece for a sports superstar (Michael Jordan), and it ruined three things on film that I love dearly: Nike, Looney Tunes, and Bill Murray.  But, I do recognize that the film does have its fans, so in the end it all comes down to taste.  And it is fascinating how our tastes leave room for something as poorly made as a Manos or even the more recent Birdemic: Shock and Terror (2010), a movie that needs to be seen to be believed.  There is certainly value in anything we find entertaining, no matter how badly it was made, and perhaps that is why these films live on the way they do.

Inspired by a True Story – The Process of Showcasing History in Hollywood

 

jobs3
This week, two very different biopics open in theaters, both ambitious but at the same time controversial.  What we have are Ashton Kutcher’s Jobs and Lee Daniels’ The Butler (you can thank uptight Warner Bros. for the title of the latter film).  Both are attempting to tell the stories of extraordinary men in extraordinary eras, while at the same time delving into what made these people who they are.  But what I find interesting is the different kinds of receptions that these two movies are receiving.  Lee Daniels’ The Butler is being praised by both audiences and critics (it’s holding a 73% rating on Rotten Tomatoes at the time of writing this article) while Kutcher’s Jobs is almost universally panned.  One could argue that it has to do with who’s making the movies and who has been cast in the roles, but it also stems from larger lessons that we’ve learned about the difficult task of adapting true-life histories onto film.  The historical drama has been a staple of film-making from the very beginning of cinema.  Today, a historical film is almost always held to a higher standard by the movie-going public, and so it must play by different rules than other kinds of movies.  Often it comes down to how accurately a film adheres to historical events, but that’s not always an indicator of a drama’s success.  Sometimes, it may work to a film’s advantage to take some liberties with history.
The Butler and Jobs make up what is the most common form of historical drama: the biopic.  In this case, the subjects are White House butler Cecil Gaines (a character inspired by real-life butler Eugene Allen), portrayed by Oscar winner Forest Whitaker, and visionary Apple co-founder Steve Jobs.  Both are men who hold extraordinary places in history, but in very different ways.  Despite the differences in the subjects, it is the history that surrounds them that plays the biggest part in the story-telling.  Filmmakers love biopics because they allow them to teach a history lesson while at the same time creating a character study of their subject.  Usually the best biopics center around great historical figures, but not always.  One of the most beloved biopics of all time is Martin Scorsese’s Raging Bull (1980), which tells the story of a washed-up middleweight boxer who was all but forgotten by the public.  Scorsese was attracted to this little known story of boxer Jake LaMotta, and in it he saw a worthwhile cautionary tale that he could bring to the big screen.  The common man can be the subject of an epic adventure if his life’s story is compelling enough.  But there are challenges in making a biopic work within a film narrative.
Case in point: how much of the person’s life story do you tell?  This can be the most problematic aspect of adapting a true story to the big screen.  Some filmmakers, when given the task of creating a biopic of a historical figure, will try to present someone’s entire life in a film, from cradle to grave.  This sometimes works, as in Bernardo Bertolucci’s The Last Emperor (1987), which flashes back frequently to its protagonist’s childhood years throughout the narrative.  Other times, it works best just to focus on one moment in a person’s life and use that as the lens for understanding who they were.  My all-time favorite film, Lawrence of Arabia (1962), accomplishes that feat perfectly by depicting the years of Major T.E. Lawrence’s life when he helped lead the Arab revolts against the Turks in World War I.  The entire 3 1/2 hours of the film never deviates from this period in time, except for a funeral prologue at the beginning, and that is because the film is not about how Lawrence became who he was, but rather about what he accomplished during these formative years of his life.  How a film focuses on its subject depends on what the filmmaker wants the audience to learn.  Sometimes this can be a problem if the filmmaker doesn’t know what to focus on.  One example of this is Richard Attenborough’s Chaplin (1992), which makes the mistake of trying to cram too much of its subject’s life into one film.  The movie feels too rushed and unfocused, and that hurts any chance the movie has of understanding the personality of Charlie Chaplin, despite actor Robert Downey Jr.’s best efforts.  It’s something that must be considered before any biopic is put into production.
Sometimes there are great historical dramas that depict an event without ever centering on any specific person.  These are often called historical mosaics.  Oftentimes, this is where fiction and non-fiction can mingle together effectively without drawing the ire of historical nitpicking.  It’s where you’ll find history used as a backdrop to an original story-line, with fictional characters participating in a real life event; sometimes even encountering a historical figure in the process.  Mostly, these films will depict a singular event using a fictional person as a sort of eyewitness that the audience can identify with.  You see this in films like Ben-Hur (1959), where the fictional Jewish prince lives through and bears witness to the life and times of Jesus Christ.  More recently, a film like Titanic (1997) brought the disaster to believable life by having a tragic love story centered around it.  Having the characters in these movies be right in the thick of historical events is the best way to convey the event’s significance to an audience, because it adds a human connection to the moment.  Titanic and Ben-Hur focus on singular events, but the same principle holds true for a film like Forrest Gump (1994), which moves from one historical touchstone to another.  Forrest Gump’s premise may be far-fetched and its history a little romanticized, but it does succeed in teaching us about the era, because it comes from that first-hand perspective.  It’s that perspective that separates a historical drama from a documentary, because it helps to ground the fictional elements in our own lives and experiences.
Though most filmmakers strive to be as historically accurate as they can be, almost all of them have to make compromises to make a film work for the big screen.  Often, a story needs to trim many of the historical details and even, in some cases, take the extraordinary step of rewriting history.  You see this a lot when characters are created specifically for a film as a means of tying the narrative together; either by creating an amalgam of many different people into one person, or by just inventing a fictional person out of nowhere.  This was the case in Steven Spielberg’s Catch Me If You Can (2002), which followed the extraordinary life of Frank Abagnale Jr. (played by Leonardo DiCaprio), a notorious con-artist.  In the film, Abagnale takes on many different identities, but is always on the run from a persistent FBI agent named Carl Hanratty (Tom Hanks).  Once finally caught, Abagnale is reformed with the help of Hanratty, and the film’s epilogue includes the statement that “Frank and Carl remain friends to this day.”  This epilogue had to be meant as a joke by the filmmakers, because even though Frank Abagnale is a real person, Carl Hanratty is not; he’s a fictional character, an amalgam of the real agents who pursued Abagnale, created as a foil for the main protagonist.  It’s not uncommon to see this, since filmmakers need to take some liberties to move a story forward and fill in some gaps.  Other films do the riskier job of depicting real history and completely changing much of it in service of the story.  Mel Gibson’s Braveheart (1995) takes so many historical liberties that it almost turns the story of Scottish icon William Wallace into a fairy tale; but the end result is so entertaining that you can forgive the filmmakers for making the changes they did.
But while making a few changes is a good thing, there is a fine line past which it becomes a disservice to a film.  It all comes down to tone.  Braveheart gets away with more because its subject is so larger than life that it makes sense to embellish the history a bit, making it more legend than fact.  Other films run the risk of either being too irreverent to be taken seriously or too bogged down in the details to be entertaining.  Ridley Scott crosses that line quite often with his historical epics, and while he occasionally comes out on the right side (Gladiator and Black Hawk Down), he comes out on the wrong side just as often (Robin Hood, 1492: Conquest of Paradise, the theatrical cut of Kingdom of Heaven).  Part of Scott’s uneven record is due to his trademark style, which serves some films fine but feels out of place in others.  Tone is also set by the casting of actors, and while some feel remarkably appropriate for their time periods (Daniel Day-Lewis in Lincoln, for example), others feel too modern or awkwardly out of place (Colin Farrell in Alexander).  Because historical films are expensive to make, compromises on style and casting are understandable for getting a film made, but they can also do a disservice to the story and strip away any accountability to the history behind it.  While stylizing history can sometimes work (Zack Snyder’s 300), there are also cinematic styles that will feel totally wrong for a film.  Does the shaky camera work, over-saturated color timing and CGI enhancement of Pearl Harbor (2001) teach you any more about the history of the event?  Doubtful.
So, with Lee Daniels’ The Butler and Jobs, we find two historical biopics that are being received in very different ways.  I believe The Butler has the advantage because we don’t know that much about the life of the man who inspired it.  What the film offers is a look at history from a perspective that most audiences haven’t seen before, which helps to shed some new light on an already well covered time period.  Jobs, by contrast, has the disadvantage of showing the life of a person we already know so much about, and as a result it brings nothing new to the table.  Both films are certainly Oscar-bait, as most historical films are, but The Butler at least took on more risks in its subject matter, which appears to have paid off in the end.  Jobs just comes off as another failed passion project.  What it shows is that successful historical dramas find ways to be both educational and entertaining; and on occasion, inspiring.  That’s what helps to make history feel alive for us, the audience.  These films are the closest thing we have to time machines, letting us be eyewitnesses to our own history.  And when it’s a good story, it stays with us for the rest of our lives.

Thrown into the Briar Patch – The Uneasy and Confusing Controversy of Disney’s “Song of the South”

 

songofthesouth
What does it take to blacklist a whole film?  Walt Disney’s 1946 film Song of the South has the dubious distinction of being the only film in the company’s history to be declared un-releasable.  Many people state that it’s because of the perception that the film has a racist message and that it sugarcoats and simplifies the issue of slavery in an offensive way.  I would argue that it’s not right to label a film one way without ever having seen it, but unfortunately Disney is reluctant to even let that happen.  What is interesting is that by putting a self-imposed ban on the distribution of the film, Disney is actually perpetuating the notion that Song of the South is a dangerous movie, due to the stigma it holds as the one film that they refuse to make public.  Disney, more than any other media company in the world, is built upon their wholesome image, and for some reason they are afraid to let their guard down and air out their dirty laundry.  But is Song of the South really the embarrassment that everyone says it is, or is it merely a misunderstood masterpiece?  Thankfully, I have seen the film myself (thank you, Japanese bootlegs and YouTube), so I can actually pass judgment on it, and like most other controversial things, you gain a much different perspective once you remove all the noise surrounding it.
For a film that has gained such a notorious reputation over the years, the actual history of the production is relatively free of controversy.  Walt Disney wanted to adapt the Uncle Remus stories, popular African-American folktales published by Joel Chandler Harris in post-Reconstruction Georgia.  Disney said that these stories were among his favorites as a child, and he was eager to bring the moralistic tales to life through animated shorts starring the characters Brer Rabbit, Brer Fox and Brer Bear.  The film was a breakthrough production for the Disney company, as it was a mix of live action and animation.  Sequences where the live action character of Uncle Remus interacts with the cast of animated critters were astonishing to audiences, and the visual effects were highly praised at the time; remember, this was almost 20 years before Mary Poppins (1964), another live action/animation hybrid.  Walt Disney treated the subject material with great reverence and he brought in the best talent possible to work on the film, including Oscar-winning cinematographer Gregg Toland (Citizen Kane, The Grapes of Wrath).  Disney was especially proud of the casting of James Baskett as Uncle Remus, and he even campaigned heavily to earn Mr. Baskett an Oscar nomination for his performance; Baskett wasn’t nominated, but he did win a special honorary Oscar in recognition of his work on the film.  The movie was a financial success and it earned another Oscar for the song “Zip-a-Dee-Doo-Dah,” which has become a sort of unofficial anthem for the Disney company.
Surprisingly, the film would be re-released regularly for decades afterwards.  It even provided the inspiration for what is still one of Disneyland’s most popular attractions: Splash Mountain.  It wouldn’t be until after a short theatrical run in 1986 that Disney began their policy of keeping the film out of the public eye.  Not surprisingly, this was also around the same time that a new corporate team, led by Michael Eisner, had taken over operation of the company, bringing with them a whole new mindset centered around brand appeal.  While Song of the South had sometimes been called out in the past by organizations like the NAACP for its quaint portrayal of post-slavery life, the film was not considered an outright embarrassment.  It was merely seen as a product of its time and was much more notable for its animated sequences than for its actual story line.  But once Disney made it their policy to shelve the film for good, based on the perception that the film made light of slavery, that’s when the controversy started heating up.  To this day, Song of the South has yet to receive a home video release here in the United States, and Disney continues to stand by their decision not to make the film public.
So, having seen the actual film, I get the impression that Disney didn’t ban it just because of its content; rather, the ban was an attempt to keep their image as clean as possible.  My own impression of the film is this: it’s harmless.  Don’t get me wrong, it is not the most progressive depiction of African-American life in America, and some of the portrayals of the ex-slave characters are certainly out of date to the point of being cringe-inducing.  But it’s no worse than a film like Gone with the Wind (1939), and that film is considered one of the greatest movies of all time.  If Song of the South has a flaw, it would be that it’s boring.  The movie clearly shows Walt Disney’s lack of experience in live action film-making, as the main story of the film is very dull and flimsy.  Basically, it follows the life of a young Southern boy, played by Disney child star Bobby Driscoll (Peter Pan), as he deals with the break-up of his family and finds solace in the stories told to him by a former slave, Uncle Remus.  There’s not much more to it than that.  Where the film really shines is in its animated sequences, which are just as strong as anything else Disney was making in the post-War era.  The art style in particular really does stand out, and conveys the beauty of the Southern countryside perfectly.
Ultimately, I believe that there’s a different reason why the film has garnered the reputation that it has.  Disney is a big company that has built itself around an image.  Unfortunately, when you go to such extremes to keep your image as flawless as it can be, it’s going to make other people want to tear that image down even more.  There are a lot of people out there who hate Disney based on their wholesome image alone, and when they find cracks in that facade, they are going to exploit them whenever possible.  Walt Disney himself has been called everything from racist to anti-Semitic, but if you actually dig deeper into any of those claims, you’ll find that there’s little truth to them and that they usually originated with people who came from rival companies or had a contract dispute with Mr. Disney.
Unfortunately, by trying so hard to sweep so much under the rug, the Disney company opens itself to these kinds of accusations; and they have no one to blame for that but themselves.  Walt Disney was not a flawless man by any means, and the company has made embarrassingly short-sighted decisions in the past; hell, they’re still making them now (John Carter, The Lone Ranger).  But their flaws are no worse than the ones that plague other companies in Hollywood.  Just look at the racial stereotypes in old Warner Brothers cartoons; there was an actual war propaganda Looney Tunes short called Bugs Bunny Nips the Nips, which is about as racist as you can get.  The only difference is that Warner Brothers has not shied away from its past embarrassments, and has made them public while stating the historical context of those productions.  As a result, Warner Brothers has largely avoided the “racist” label and their image has been kept intact.  For some reason, Disney doesn’t want to do that with Song of the South, despite the fact that Disney has made public some of their older shorts that are far more overtly racially insensitive than the movie.  There are shorts from the 1930’s that showed Mickey Mouse in blackface, and yet they still got a video release as part of the Walt Disney Treasures DVD collection.  I think the reason Song of the South didn’t get the same treatment is that it’s such a polished and earnest production; it’s easier to dismiss a silly cartoon for its flaws because they seem less significant.
Regardless of how accurately it addresses the issues of slavery and the African-American experience, Song of the South should at least be given the opportunity to be seen.  It’s a part of the Disney company’s history whether they like it or not, and to sweep it aside does a disservice to the Disney legacy as a whole.  Being a white man, I certainly can’t predict what the reaction from the African-American community will be, but is that any excuse to hide the film from them?  Maybe black audiences will come to the film with an open mind; quite a few at least.  It just doesn’t make any sense that this is the film deemed un-releasable when a film like Gone with the Wind, which is very similar content-wise, is heralded as a classic.  Even D.W. Griffith’s The Birth of a Nation (1915) is available on home video, and that film openly endorses the Ku Klux Klan.  Song of the South is harmless by comparison, and the worst that you can say about it is that it’s out of date.
As a film, I would recommend that everyone give it at least one watch, if you can.  The animated sequences are definitely worth seeing on their own, and I think some people will appreciate the film as a sort of cinematic time capsule.  While the African-American characters are portrayed in a less than progressive way, I don’t think that it’s the fault of the actors.  James Baskett in particular does the most that he can with the role, and it’s hard not to like him in the film.  He also does double duty playing both Uncle Remus and the voice of Brer Fox, which shows the range that he had as a performer.  The music is also exceptional, with songs like “Everybody’s Got a Laughing Place,” “Sooner or Later,” “How Do You Do?” and the Oscar-winning “Zip-a-Dee-Doo-Dah”; crowd-pleasers in every way.  It’s definitely not deserving of the reputation it’s gotten.  Disney’s reluctance to make the film available just goes to show the folly of trying to keep a flawless image, when it would actually serve them better to have it out in the open.  Sometimes you just need to take your medicine and let things happen.  After all, aren’t the people who ride Splash Mountain every day at Disneyland going to wonder someday what film it’s all based on?

Nerd Heaven – The Highs and Lows of Marketing at San Diego Comic Con

 

spiderman
This is no ordinary corporate showcase.  In the last decade or so, the San Diego Comic Convention (SDCC), more commonly known as Comic Con, has become a full-fledged festival for the whole of Nerd-Dom.  Not only is it a great place for fans to encounter their favorite artists and filmmakers in person, but it’s also a prime venue for Hollywood to showcase their tent-pole productions to an eager audience.  In all, it’s a celebration of all forms of media, where the experience of the presentations and panels can often overshadow the actual products themselves.  But, while everything is all in good fun at Comic Con, the business end is what matters the most on the actual show floor itself.  As with all conventions, Comic Con is geared toward marketing.  Big studios and publishers get the most attention in the media coverage of the Con, but SDCC started with the small vendors and they continue to be the backbone of the whole show.  For everyone involved, there is a lot at stake in these four packed days in mid-July.
Up and coming artists, journalists, and filmmakers are just as common amongst the visitors as they are among the headliners, and the mingling of different talents defines much of the experience at the Con.  While many people get excited by the surprises on hand, that excitement can sometimes have difficulty extending outside the walls of San Diego’s Convention Center.  Marketing to a crowd of fans is much different than marketing to a general audience.  I believe Comic Con works best as a testing ground for marketing strategies in the bigger push of selling a project to the world.  Sometimes, a lot of buzz can be generated with a surprise announcement or with a well-placed tease.  One clear example of this at Comic Con this year was the surprise announcement of a Superman and Batman movie coming in 2015.  The announcement was a bombshell for the fans who witnessed it live at the convention, and it extended into a media blitz that spread quickly through all the news outlets that same day.  This surprise effectively gained needed attention for a project that has only been in the planning stages so far.  Where the risk lies is in the effectiveness of this kind of moment, and there can be no more unforgiving audience than one made up of nerds.
Many of the big studios have figured this out over time, and the planning of their showcases at Comic Con is almost as intricate as the projects they’re trying to sell.  One thing they have certainly learned is that Comic Con patrons are extremely discerning, and are often even more informed about the different projects than the talents involved.  There is a fine line between excitement and scorn within the fan community, and if you fall on the wrong end of that line, it can be brutal.  Comic Con is all about fan service, which is no surprise to anyone.  This year in particular, there were more instances of stars making appearances in costume than ever before.  As you can see in the photo above, Spider-Man was there to address the audience in person, which was a special treat considering that the film’s star, Andrew Garfield, was the one behind the mask.  Avengers villain Loki also showed up to introduce footage from the upcoming Thor sequel, with actor Tom Hiddleston completely in character the whole time.  All of these moments make the live presentations far more entertaining, and that in turn helps to make the audience even more enthusiastic about the upcoming films.  Comic Con is a place where theatrics meet commerce, and where a well-made sales pitch can turn into a fanboy’s dream come true.
Given that SDCC started as a showcase for comics, it’s no surprise that Marvel and DC are the ones who put on the biggest shows, and, with all their experience, the ones who connect with their audiences better than anyone else.  But more recently, the showcases have steered away from the printed page and have been more focused on the silver screen.  It’s not that Comic Con has abandoned the medium that started it all; print comics still have a place on the convention floor.  It’s just that the movie industry is bigger and more involved, and has seen the benefits of marketing at the convention.
With production budgets rising, Comic Con has become more important than ever as a way to generate enthusiasm for film projects; even ones that have trouble getting attention.  Several years ago, Disney made a surprise announcement at Comic Con that a sequel to their cult hit Tron (1982) was in the works, highlighted by a teaser trailer.  Little was known about the project, and Disney wasn’t quite sure if it would go anywhere past development, so the trailer was made as a way to test the waters.  The reception they got was overwhelming, especially when it was revealed that the original film’s star, Jeff Bridges, was involved, and production went full steam ahead afterwards.  Few expected a Tron sequel to be newsworthy, let alone the hot topic of conversation at the convention, but Disney showed that year what a simple surprise could do to generate excitement.  Since then, surprises have not only become more frequent, but now they are expected.
That sometimes leads to unforeseen consequences in a high-stakes venue like this.  When audiences are expecting a surprise to happen at any moment, it puts even more pressure on the marketing teams to deliver the goods.  There have been many cases when a production company ends up promising too much and then fails to deliver.  A couple years ago, Guillermo del Toro teased the crowd at a Disney presentation by revealing his involvement in a new Haunted Mansion film, which he promised was going to be more spiritually faithful (no pun intended) to the original Disneyland ride than the Eddie Murphy flop had been.  It was an exciting announcement at the time, but years later, almost nothing new has been heard about it, and with Del Toro taking on more and more new work, it’s becoming obvious that this particular film is probably not going to happen.  Other broken promises have included several announcements of a Justice League movie, including one that is still out there and remains to be seen, as well as news that TV scribe David E. Kelley was going to give Wonder Woman a new TV series, which led to a disastrous pilot episode that never got picked up.  This is why production companies need to show good judgment when they present their projects at Comic Con.  Once you make a promise, you have to commit.  If you don’t, no one will take those promises seriously, and the whole aura of a Comic Con surprise will stop working.
In many ways, Comic Con has become a more favorable place for television than film.  TV shows like Game of Thrones, The Walking Dead, Dexter, and Doctor Who can benefit from all the same kinds of media buzz that a theatrical film can get at the Con, without having the pressure of marketing a massive project with a $250 million budget; although TV budgets are rising too. Comic Con isn’t the only platform for marketing a film, but it’s certainly one of the biggest and the stakes are getting higher.  In a year like 2013, which has seen numerous under-performing films hitting theaters this summer, the pressure is on when it comes to getting the message to resonate beyond the cheering fans in Hall H.  I don’t envy the people behind the Comic Con presentations one bit, because they have so much resting on their shoulders.  And when you’re dealing with a fan-base as well informed as those in the fan community, it’s a wonder how they can keep the surprises coming.
I should note that I have yet to attend Comic Con myself.  My observations are from an outsider’s perspective, though I do follow the live news coverage of the convention every year with great anticipation.  I hope to someday see it for myself, just to take in the whole carnival-esque atmosphere of the place.  I’m not sure if I’ll attend it in costume like all the cosplaying regulars there, but then again, “when in Rome…”.  Overall, there’s no doubt that Comic Con is one of the most important institutions we have in our media culture today, and it will continue for many years to come.  There are comic conventions to be found the world over, but this is the grand-daddy of them all, and no other convention has this kind of influence on the film industry.  Plus, where else are you going to see cool stuff like this: