Tag Archives: Editorials

Tis the Season – Why Some Films Become Holiday Perennials

[Image: It's a Wonderful Life]

We've reached the end of another calendar year, and of course that can only mean it's the holiday season once again.  Whether we are celebrating Christmas, Hanukkah, or something else, it's a time of year when we all gather together and honor family, tradition, and the gift of giving.  What's interesting about Christmastime, however, is just how much the holiday tradition is actually influenced by and centered around holiday-themed movies.  The holiday film can pretty much be considered a genre all its own, since so many of them exist and are created specifically to evoke the holiday spirit.  Not only that, but they are movies we continually return to every year around this same time, as if it's part of our holiday ritual.  This doesn't happen with every Christmas-themed movie, however, since many of them try too hard to hit their mark and fail spectacularly.  And yet we see small films that no one thought much of at first grow into perennial classics over time, and in some cases add to the overall Christmas mythos that defines the season.  But how do we as an audience discern the classics from all the rest?  What really separates a Miracle on 34th Street from a Jingle All the Way?  Quite simply, as with most other movies, it's determined by what we bring from our own experiences in life when we watch a film.
The emergence of perennial holiday classics is nothing new in pop culture and actually predates the beginning of cinema by a good many years.  Literature has contributed holiday-themed stories in both short form and novels for the last couple hundred years, helping to both shape and reinvent Christmas traditions in a very secular fashion.  Our modern physical interpretation of Santa Claus can in fact be attributed to his appearance in "'Twas the Night Before Christmas," the 1823 poem by American author Clement Clarke Moore.  Moore's nearly 200-year-old poem is still being recited today, and it shows just how much tradition plays a role in keeping a perennial classic alive in the public's eye.  Around the same time, acclaimed British novelist Charles Dickens wrote A Christmas Carol, chronicling the tale of Ebenezer Scrooge and his visits from three ghosts on Christmas Eve.  Since its original printing in 1843, A Christmas Carol has gone on to become one of the most re-adapted storylines in history.  Perhaps nearly a quarter of all holiday classics can claim to have been influenced by Dickens' classic tale, in which a dreary old cynic has his heart warmed by the holiday spirit.  Dickens meant for his novel to be a meditation on greed and class inequality, but I have no doubt that he purposefully meant for Christmas traditions to be the healing influence in Scrooge's reawakening.  These stories continue to stand strong so many years later, and they show just how far back our culture began to value Christmas stories and songs as a part of the holiday tradition.
Even from the very outset of cinematic history, we saw films carry on holiday themes.  Both 'Twas the Night Before Christmas and A Christmas Carol provided inspiration for movie-makers many times over, given their already beloved appeal, but some people in Hollywood also saw opportunities to add their own original holiday-themed stories into the mix.  When the studio system emerged, it was very well aware of the marketability of holiday themes.  After all, people usually visited movie theaters frequently during the cold winters, so why not play up the festive mood that everyone was already in?  For the most part, movies in these early years celebrated Christmas more frequently in short segments than in full-length storylines; whether it was capitalizing on a newly popular Christmas song in a lavish musical number, or portraying a Christmas celebration as part of a larger overarching narrative.  Many people forget that one of the most popular Christmas tunes ever written, "Have Yourself a Merry Little Christmas," wasn't even from a Christmas-themed movie; rather, it came from the 1944 musical Meet Me in St. Louis.  But eventually the Christmas season became such an influential part of our modern cultural tradition that it would inspire films devoted entirely to the holiday spirit.
So, in the years since, we have seen holiday films become standard practice in Hollywood.  Every year, it's inevitable to see a Christmas movie released in time for the holidays.  Unfortunately, Christmas movies very rarely achieve classic status.  For every one that audiences grow attached to, there will be about a dozen more that will likely be forgotten by next December.  Evidently, Hollywood's approach to the holiday season is less carefully planned out than any other part of the year.  Their approach seems to be throwing whatever has Christmas in the title up against the wall and seeing what sticks.  Unfortunately, this has led to Christmas being more synonymous with bad movies than good.  Some are well-meaning films that fall short of their goal, like the Vince Vaughn film Fred Claus (2007) or the odd but charming Santa Claus: The Movie (1985).  And then there are ugly, shallow, and distasteful films like Deck the Halls (2006), the Ben Affleck disaster Surviving Christmas (2004), or the deeply disturbing Michael Keaton film Jack Frost (1998), with its creepy-as-hell CG snowman.  And the less said about the horrible 2000 live-action How the Grinch Stole Christmas, the better.  Overall, it is very hard to make an honestly cherished holiday classic in Hollywood, and that's mainly because the business just tries too hard.  If you look closely, you'll actually find that a beloved holiday classic may come from the unlikeliest of places.
This was definitely the case with what has become not just one of the best-loved Christmas movies, but one of the best movies, period: Frank Capra's It's a Wonderful Life (1946).  Capra's movie tells the story of George Bailey (a flawless Jimmy Stewart), a man who has given so much back to his hometown and gotten so little in return, driven to the verge of suicide by his despair.  Through the intervention of a guardian angel on Christmas Eve, George is shown what the world would have been like if he had never lived; he rediscovers his value and purpose and, as it turns out, is finally rewarded by those whom he's helped all his life.  The film is very uplifting and perfectly illustrates the true impact that the Christmas season has on our lives.  With a theme like that, you would think the movie was a smash hit when it was first released, but instead it was a colossal bomb.  It bankrupted the company that made it and derailed Frank Capra's directing career from then on.  The focus on George Bailey's deepening depression was probably too hard for audiences to take at the time, given that so many soldiers were returning home after the end of WWII.  Despite its initial failure, It's a Wonderful Life managed to survive through TV airings that happened on, naturally, Christmas Eve, and the film not only found its audience but became a seasonal standard.  To this day, It's a Wonderful Life is still aired on network TV (one of the few classic-era movies that still is), and audiences from every generation still embrace it warmly, no matter how old-fashioned it may be.  Pretty good legacy for a film that started off as a failure.
A holiday classic can come from an unlikely place like It's a Wonderful Life, but for many people, what counts as a classic is determined by their own tastes.  That's why some consider a romantic comedy set around Christmastime to be a holiday classic.  Case in point: Love Actually (2003) has grown into a beloved holiday staple, even though its themes are less about Christmas and more about the intertwining relationships between the characters.  By standing out as a strong romantic film with a Christmas setting, it works as an example of two genres reinforcing each other.  Cult movie fans even have holiday classics that they cherish, like the weird, campy Santa Claus Conquers the Martians (1964), which can hold the distinction of being one of the worst movies ever made and incredibly entertaining at the same time.  And some people can even claim that Die Hard (1988) counts as a Christmas movie because of its holiday setting.  Pretty much, it's whatever we bring with us from our own experiences to the movies that determines what we find entertaining.  Just as most people gravitate toward a movie based on their own interests, so too do we see that with holiday films.  Hollywood has in some cases picked up on this and catered to select audiences at Christmastime with genre-specific movies.  Usually, though, it takes the consensus of a large audience to determine which ones will stand out as the undisputed classics.
I think where Hollywood hits its mark most often is when it makes a holiday film that appeals to the memories of our own experiences of Christmas.  The film that I think hit a perfect bulls-eye in this regard, and stands as a true masterpiece of Christmas-themed filmmaking, is the 1983 classic A Christmas Story.  Directed by Bob Clark, and inspired by the semi-autobiographical stories of author Jean Shepherd, A Christmas Story perfectly captures the highs and lows of a young boy's experience during the holiday season.  Ralphie (Peter Billingsley) is a character relatable to any young boy growing up in small-town America, myself included, and seeing how hard he tries to manipulate his parents into getting him his dream present is something every child will identify with.  Couple that with the hilarious performance of Darren McGavin as the Old Man and the iconic leg lamp, and you've got the very definition of a holiday classic.  But just as A Christmas Story highlights good Christmas memories, we also see classic films that center on a disastrous Christmas experience.  The best example of this would be the very funny and endlessly quotable National Lampoon's Christmas Vacation (1989).  We've had just as many Christmases like the Griswold family's as we have like the Parker family's from A Christmas Story, and Christmas Vacation perfectly encapsulates all the bad things that happen at Christmastime without ever losing the holiday spirit underneath.  Not to mention, it's the last time we ever saw a funny performance out of Chevy Chase.
So, despite the low success rate, we as an audience still seem to find a classic seasonal favorite in every generation.  But how does Hollywood keep making bad Christmas movies every year when the movie-going public keeps rejecting all the junk they put out?  I think it's because the season itself is such an overwhelming cultural force that most filmmakers don't really care about the product they're making, as long as it's holiday-themed and ready to capitalize on the mood of the period.  When it comes down to it, a great holiday classic is not determined by how soaked in the holiday spirit it is, but rather by how strong its story is.  We keep watching It's a Wonderful Life every year because of how inspirational George Bailey's life story is, not because of the Christmastime finale that has come to define it.  In fact, the movie is not really about Christmas at all; it's about the life of one man and his value to the world.  Other movies become Christmas classics on the strength of a wintry setting alone, where the holiday is barely even mentioned.  And even films that subvert Christmas traditions, like 2003's Bad Santa, have become genuine holiday classics to some people.
I, myself, love a good Christmas movie, and because I'm such an ardent appreciator of movies in general, these films have certainly become a part of my holiday tradition.  I return to It's a Wonderful Life and A Christmas Story every year and never get tired of them.  And not a year goes by when I don't drop at least one quotable line from Christmas Vacation during the season.  I hope every generation gets its own perennial classic that will last for years to come.  Just please: no more remakes or sequels.  We all saw the backlash that the recent announcement of an It's a Wonderful Life sequel received.  I only wish The Grinch and A Christmas Story had been spared the same fate.  Like Christmas dinner, there can be too much of a good thing when it comes to Christmas movies.

Flicks and Picks – The End of the Blockbuster Video Era

[Image: Blockbuster Video]

Though it was long seen coming, it finally became official this last week: Blockbuster Video is no more.  While this is a sign of how home entertainment has progressed, mostly for the better, with on-demand and streaming video making it easier for consumers to watch whatever they want, it also brings an end to an institution that has been at the center of many cinephiles' lives.  Apart from some independent holdovers here and there, you will rarely find a video store in your local neighborhood today.  But back in the day, finding a store devoted to video rentals was as easy as finding a McDonald's.  The decline of video stores over the years certainly had to do with the advancements in streaming video, but the dominance of Blockbuster Video as a company played a role as well.  In a way, by working so hard to become the top dog of the video rental market, Blockbuster also facilitated its own downfall when the market changed once again.  Though the end of Blockbuster was inevitable, and needed to happen, it does leave a gap for those of us who built our love of film through renting from the local video store.  The video rental experience, while not exactly life-changing, is something most film lovers have been through at some point in their lives, and this week it became a thing of the past.  In this article, I will look back on the era that Blockbuster Video defined, and what its end means for the future of home entertainment.
In the late '70s and early '80s, we saw the emergence of VHS, which gave studios and filmmakers the ability to make films available for purchase after their theatrical release for the very first time.  Before, audiences had to wait for airings on television before they could see their favorite films again, and that also meant having to put up with commercial breaks.  When VHS tapes started to be produced by the studios directly, it led to the creation of a niche market, with stores opening up across the country geared directly toward filling that public appetite.  Being able to own a movie as part of a collection is commonplace nowadays, but when home video sales began, it was an exciting new frontier, and it had an influence on the film industry almost instantly.  Not only did the rise of home video affect the number of theatrical runs that a movie would have, but it also drove the movie studios toward film preservation and restoration, because of course presentation matters for home viewing.
But, as with most new technology, VCRs were very expensive, and buying a movie to play on one was also not cheap at the time.  Some retailers even had to pay prices as high as $100 per movie in order to have it available in their stores.  So, in order to get more out of their product, and to give audiences better access to the movies they wanted, video rental services came into being.  Like checking a book out from a library, consumers could rent a movie for a certain number of days at a low price.  This business model worked extremely well and led to a boom in VCR sales.  Video stores popped up all across the country, both locally owned and franchise operated, and home video very quickly became a major part of the film industry as a whole.  But it wasn't just studio films that benefited from this new market.  Independent producers saw an opening in this new industry, and before long a whole direct-to-video market emerged, thanks to video stores being willing to indiscriminately sell and rent out a wide variety of films as a way to fill their shelves with more product.  In those early days, it was very common to see a diverse collection of independent stores in your hometown, as it was in mine.  The stores I grew up with in Eugene, Oregon went by such varied names as Silver Screen Video and Flix & Picks, and choosing a rental from these places certainly had an effect on my growing interest in movies at a young age.
But that changed in the mid '90s, when the video rental industry became more standardized.  Out of this period came a chain of stores known as Blockbuster Video.  Blockbuster was founded in 1985 in Dallas, Texas, and started off as just another local retailer like most other stores before it began to expand rapidly.  By the late '90s, it was common to find at least one Blockbuster in your area, and by the end of the decade, Blockbuster was unrivaled in the home video market.  Its rise had the negative effect of forcing much of the competition out of business, which benefited the company for the time being, but it would come back to bite them in the years ahead.  Blockbuster may have been ruthless to the competition, but in becoming the biggest name in the industry, it did manage to do many beneficial things that revolutionized the market.  For one thing, it was the first national retailer to offer video game rentals.  Its standardization of rental pick-ups and drop-offs also changed the way we rent movies, making the drop-off slot at your local store a life-saver late at night.  Blockbuster was also the first chain to work within the film industry to create exclusive promotions and deals on upcoming releases.  Despite the lack of choice in rental stores that came with Blockbuster's dominance, I don't believe consumers cared much, as long as Blockbuster still operated efficiently.
Most film lovers will attest that they've probably spent a good amount of their time in a Blockbuster store.  While many of us could find exactly what we wanted at any time, there was another side effect that changed how we grew up watching movies after spending time in a Blockbuster, and that was the impulse rental.  I'm sure most of you have, at one time or another, walked out of a Blockbuster Video with a movie you'd never even heard of instead of the one you wanted, simply out of curiosity.  Having a variety of choices seems normal now, but not until video rental came about did consumers have that level of control over what they watched.  Before, you would have been limited to whatever was playing on TV or at your local cinema, but stores like Blockbuster made consumer choice as simple as a quick scan through the shelves.  For many cinephiles, I'm sure that part of their growing love of film started with a surprise choice at the local video store, and with stores as big and as well stocked as Blockbuster, those surprises could come from even the most obscure titles.  Blockbuster was also handy for film students like me whenever we had to watch a film as part of an assignment.  Whether it was a film we knew or not, at least we had the comfort of knowing there was a place we could look for it in a hurry.
In later years, however, the market began to change again.  The internet revolutionized video streaming in the latter part of the 2000s, and our reliance on VHS and DVD for home entertainment soon became a thing of the past too.  Even though Blockbuster had cleared out all comparable competition, it was ill equipped to take on the likes of a Netflix.  What Netflix did was eliminate the middleman in video rentals and have movies sent directly to the home through the mail, which made it unnecessary for anyone to go out to a store and rent a movie anymore.  Blockbuster tried its own rent-by-mail service in response, but by then the damage had already been done.  Netflix had surpassed Blockbuster as the number one rental service, and the former giant had to begin downsizing in order to survive.  Soon, Redbox emerged and took away even more business from Blockbuster, appearing as convenient vending machines in grocery stores for anyone looking for an impulse rental.  Like most other forms of retail, the trend has moved toward online shopping, and Blockbuster is one of the biggest to have fallen, mainly because its business model was one that couldn't adapt in the digital age.  All that's left of Blockbuster is its still-recognizable name, and even that is owned by someone else now (it was purchased by DISH Network in 2011, which has used the branding for its on-demand service).
Because Blockbuster had already eliminated much of the competition, the transition to on-demand video renting was actually faster and less rocky.  There was no large group of various retailers resisting the changes in the market; only Blockbuster.  And now that they are gone, the era of brick-and-mortar video rental shops has ended with them.  Sure, there are independent stores in certain areas that still serve nostalgic purposes, but their clientele is limited.  Now it is far more common to hear that people have a Netflix account than a Blockbuster card.  But Blockbuster still left a legacy that will not be quickly forgotten, especially among longtime movie aficionados.  Many of us can still remember moments when being close to a Blockbuster came in handy, whether it was for a late-night impulse rental, a quick bit of research, or merely wanting to see a movie you missed the first time around.  For many people, the first time they watched a particular movie was probably not in a movie theater but through a rental from big blue.  I can certainly say that I credit my local Blockbuster for helping me experience so many different types of movies.  One of my favorite films of all time (Seven Samurai) came to me out of an impulse rental from Blockbuster, and I will always be grateful for that.
So it's a bittersweet end for the onetime giant.  Its closure spells the end of an institution that has been a big part of our cinematic experiences, but it's a closure that was necessary.  Netflix and Redbox are simply better and more convenient services, and Blockbuster was a relic standing in the way.  But as we move forward, will those two also fall prey to the same fate as Blockbuster?  My guess is probably not.  Blockbuster had the unfortunate circumstance of being the top force in a market that was destined to fall.  Netflix and Redbox, however, have thrived on the fact that they stand in direct competition with each other, and that has led to new and creative avenues for both companies.  Unlike Blockbuster, Netflix has branched out and generated its own exclusive content, including comedy specials and original shows like House of Cards, which makes it not only a great rental service but also a competitor to broadcast TV.  And Redbox is able to make itself available in locations all across the country without having to set up the infrastructure of an entire store chain.  With Amazon and Walmart entering the market with their own video streaming services, Amazon Prime and VUDU, it's clear that the rental market is going to continue growing in this new direction.  Blockbuster is certainly done as an independent company, but without it ever being there in the first place, the rental business would never have gotten to where it is now, and that's the legacy it will ultimately leave behind.

Apocalyptic Cinema – Making Disasters Entertaining in Movies

[Image: Independence Day]

One thing we often see in human nature is the destructive impulse; or to be more specific, we all like to see something get destroyed.  Whether it is something benign like blowing down a house of cards or something more extreme like the implosion of a building, we just enjoy watching something that was built up be taken down.  Hell, we even do it to each other through schadenfreude, whether it's in politics with the Anthony Weiner scandal, or the rise and fall of a Hollywood star like Lindsay Lohan.  Our culture seems to relish destruction as a part of entertainment.  I don't necessarily find this to be a bad thing, as long as it doesn't get out of hand.  And that's usually what we find in a lot of movies as well.  Disaster films have been a staple of movie-making for generations, but in recent years, visual effects work has become sophisticated enough that destruction looks authentic enough to be believable.  But when movies become ever more comfortable showing widespread destruction as a part of their storytelling, the question arises of where the line must be drawn.  Is it right for us to feel entertained when we see things like the White House or the Capitol Building being destroyed?  How about the entire world?  In this article, I will look at the highs and lows of disaster filmmaking and how audiences' reactions to these films reveal the extremes to which people want to be entertained.
A lot of the reason disaster films exist is that they are a great showcase for special effects.  Going all the way back to the silent era, we've seen filmmakers use primitive but successful effects work to create larger-than-life destruction.  You could even look at some of the early Biblical epics, like Cecil B. DeMille's 1923 version of The Ten Commandments, as early examples of the disaster movie.  The film had a moral message, yes, but I'm sure there were many audience members who saw the film just because they wanted to see the grandiose destruction caused by the ten plagues and the parting of the Red Sea.  As special effects have become more sophisticated, disaster movies have multiplied.  Soon films were crafted around some of the most famous disasters in history, like In Old Chicago (1937), which depicted the Great Chicago Fire of 1871, or San Francisco (1936), dramatizing the famous 1906 earthquake.  It wasn't until the 1970s, however, that disaster films could be declared a genre all their own.  In that period, we saw a glut of disaster-related movies made specifically to be epic, star-studded extravaganzas, with the latest in special effects work on display.  These films included Earthquake (1974), starring Charlton Heston; The Poseidon Adventure (1972), with Gene Hackman and Ernest Borgnine; and The Towering Inferno (1974), with Paul Newman and Steve McQueen, just to name a few.
The rise of the disaster movie genre in the '70s began to die down in the '80s, mainly due to the rise of science fiction and fantasy films as the showcase for effects work, but the genre lived on as it evolved.  In the 1990s, we saw the emergence of a filmmaker who would go on to not only redefine the genre, but make it all his own.  That filmmaker was German-born director Roland Emmerich, and over the course of his career, a large share of his filmography has been made up of disaster movies.  The movie that put him on the map in the film industry actually redefined two genres in one: 1996's Independence Day.  The movie was essentially an alien invasion narrative, but what Roland Emmerich did was use the techniques of popular disaster films to make the destruction caused by the aliens look and feel as real as possible.  In the movie, we see catastrophic explosions engulf entire cities, destroying landmarks before our very eyes, including the White House itself.  This was a film that not only drew upon our greatest fears of total annihilation, but made them feel completely real.  Independence Day was a phenomenal success when it premiered, and it made the disaster genre a force to be reckoned with.  As for Emmerich, he has stuck mostly with the genre that made him a player in Hollywood, with mixed results, with successful but ludicrous films like Godzilla (1998), The Day After Tomorrow (2004), and 2012 (2009) all falling into the same mold as Independence Day.
But what was interesting about the success of Independence Day was that it revealed something about how we react to seeing destruction on film.  In the movie, famous landmarks like the Empire State Building are blown to pieces and thousands of people are killed in seconds before our very eyes.  And this is what we consider entertaining?  Maybe entertaining isn't the right word.  I think movies like Independence Day do well because they allow us to face our fears and indulge that sinking feeling of helplessness.  It's not so much the scenes of destruction themselves that we find so entertaining, but the framework around them.  While watching a disaster movie, we need to feel the impact of the destruction, and that's why so many disaster films have to finish with a happy ending.  In Independence Day, the colossal destruction closes the first act of the film.  The rest of the movie details how humankind copes with the aftermath, and how they fight off the invaders despite the odds against them.  You have to go through a lot of darkness before you can appreciate the light at the end of the tunnel, and that's what has defined the best films in the genre.  If a film takes a bleak outlook and doesn't offer a satisfying resolution, then it's going to fail.  This has been the case with other disaster films, like 2009's Knowing, which leaves everyone dead and the earth uninhabitable at the end; sorry to spoil it for you.  Even the laughable 2012 left room for some hope for humanity, and not surprisingly, it did much better.
Disaster films have to thrive on that feeling of hope.  We become enthralled when we see something grand get destroyed, but it's what rises from the ashes that makes us feel grateful in the end.  That's why we enjoy watching controlled demolitions; old buildings must come down in order to make way for something better.  That helps us understand why we accept destruction as entertainment.  Many films skirt that line, but the way a disaster film gets the audience on its side is through the characters.  Characters in disaster movies must be likable and easy to identify with.  It also helps if they are not thinly drawn stereotypes, but fully defined people.  Emmerich's films have tended to have lackluster characters, which is why casting makes such a difference in his movies and others like them.  Independence Day worked well because you had charismatic performances from actors like Jeff Goldblum and Will Smith, who helped balance the film out by creating characters you wanted to root for.  Other disaster films tend to miscast their roles, making their characters' storylines a little harder to swallow.  Case in point: John Cusack in 2012.  Cusack is a fine actor when a movie calls for it, but when your character is a mild-mannered author who is somehow able to outrun the eruption of a supervolcano, that I have a hard time buying.  Now, it's difficult to say that a character needs to be believable in a movie centered around a fictional disaster, but sometimes it does matter.  Likability of the characters is what separates the good disaster films from the bad ones, and unfortunately that's something you rarely see done effectively.
For the most part, disaster films exist because they are showcases for the newest techniques in special effects.  The human element in these films is crucial, but it plays a lesser part in the creation of the movies as a whole.  When the balance of these films isn't struck in the right way, they run the risk of seeming either lackluster or, worse, exploitative.  This was an issue in Hollywood in the aftermath of the September 11th attacks in New York City, when we saw a level of destruction in real life that we could previously only comprehend in movies.  Soon after, the Independence Day-style destruction of cityscapes in movies stopped for a while, because that imagery had become all too real for us, and seeing it on the big screen would have seemed insensitive.  Now that time has passed, we are seeing that kind of destruction depicted again, but it took a while to get there.  What I think makes audiences understand the level of acceptability in disaster imagery is the balance between the level of destruction in the movie and how it functions within the narrative.
Even though it came out months before 9/11, I think the Michael Bay film Pearl Harbor (2001) fell into that unacceptable, exploitative category because it didn't find that right balance.  In the movie, the famous attack is depicted in gruesome detail, but it lacks any resonance because it is just the backdrop to a rather lackluster love triangle plot.  A lot more respect could have been paid to the real men and women who died that day instead of having everything hinge on fictional characters we care so little about.  Pearl Harbor felt more like a shallow Hollywood attempt to exploit a tragedy for the purpose of creating a film that showcased impressive production values and matinee idol stars.  In other words, it was a movie driven more by marketing than by actually informing audiences about the real event.  If you don't find that right balance in a disaster movie, then your film will not be believable, as was the case here.  Pearl Harbor failed as a movie mainly because it knew what it wanted to be, but the filmmakers didn't know how to make it work.  They were trying to follow in the footsteps of what has ultimately been the only disaster film to date to win the Academy Award for Best Picture: director James Cameron's Titanic (1997).  The reason Titanic worked and Pearl Harbor didn't was that Titanic had balance.  The love story at the center of Titanic, while not the most engrossing, did keep the narrative moving, and it endeared the characters involved to the audience before the pivotal event happens.  Also, James Cameron put enormous detail into the recreation of the ship's sinking, and every moment of it is well executed on screen.  No shaky-cam or needless destruction is present in the climactic moments of the movie.  It works because the film was, dare I say, respectful to the actual disaster and to the victims of the event as well.
Making disaster movies thoughtful turns out to be a secret of the genre's success.  Going back to my example film once again, Independence Day, we see that the film works despite its more ludicrous moments by actually having characters work out logical answers to their dilemmas.  It's not enough to have the characters just move from one disaster to another without explanation, like in 2012, or to have them stand helplessly by as the world crumbles around them while injecting stale philosophical points about why it all has happened, like in The Day After Tomorrow.  We want to see our characters be problem solvers and actually deal with the apocalypse like it's something they can come back from.  That's why, despite its many flaws, Independence Day succeeds.  Mankind coming together to help "take those sons of bitches down" is an ultimately inspiring thing.  Whether it's against nature, or the extraterrestrial, or against ourselves, we enjoy watching characters pull themselves out of a struggle.  That's why I think World War Z succeeded this year, despite all the naysayers who predicted it would fail (myself included).  The movie looked like another exploitative take on the zombie sub-genre, but the finished film was a more thoughtful examination of how the survivors of the catastrophe try to deal with the problem and learn to survive.  Sometimes it helps to treat your audience to a more thoughtful story about survival, rather than just destruction.
Disaster films will always be around as long as there is an audience for them.  And as long as filmmakers treat their audiences' intelligence with more respect, we'll also see the disaster genre gain more respectability in the film community.  I like the fact that disaster films have become such an accepted part of cinematic history that it's now commonplace to spoof them as well.  This summer, we got not one but two comedies centered around apocalyptic events: Seth Rogen's This is the End and Edgar Wright's The World's End.  Both films are hilarious takes on the genre, but they both know what makes a good disaster film work in the end, and they exploit those elements perfectly.  It comes down to characters you want to root for and wanting to see them overcome even the complete destruction of society as we know it.  Even though the films are played for laughs, the same basic elements hold true, and the filmmakers who made them know that.  Overall, destruction becomes entertainment because we look forward to the process of renewal.  Disaster movies fail if they indulge too heavily in the destructive parts or leave the audience with no satisfying resolution.  It's human nature to enjoy seeing something blow up, but we also enjoy seeing something good rise out of the rubble, and in the end, that's why we enjoy a good disaster movie.

Not So Scary – Modern Horror Movies and the Lack of Genuine Scares

 

[Image: Mama]
Horror movies have been around since the very beginning of cinema.  From F.W. Murnau's classic vampire flick Nosferatu (1922) to Universal Studios' monster movies like Dracula (1931) and Frankenstein (1931), audiences have made watching scary films a long-standing tradition.  And, like most other genres, horror has grown and evolved with the times, satisfying the changing tastes of its audiences.  In the '50s, we saw the rise of the sci-fi monster movie, and in the '60s and '70s, "schlock" horror became popular, thanks to relaxed restrictions on acceptable on-screen violence.  It is a genre that has more or less stayed strong in every decade and is more adaptable than almost any other genre of film.  But in recent years, I have noticed a severe drop-off in horror movies that actually leave a mark.  It seems that today, studios are more interested in quantity than quality, and it's a trend that is having a negative effect on the genre as a whole.  My belief is that studios are using the horror genre as a way to generate a quick influx of cash, knowing that there is a built-in audience of people who will watch horror movies no matter what.  That's why you see so many horror films drop off quickly after their opening weekend.  There seems to be a belief nowadays that you can pass something off as a horror movie if it has one or two big scares, but the reality is that the best horror films don't rely on things that make us jump out of our seats.
What makes a great horror movie is the use of atmosphere.  This has been the case since the very beginning, back when cinema was still silent.  F.W. Murnau's silent masterpiece Nosferatu shows exactly how atmosphere can be used to signify terror.  In the movie, we see how simple staging and the effective use of shadows can terrify.  The vampire Count Orlok, played by actor Max Schreck, is able to strike at his victims using just his shadow, an image created simply through the movie's lighting, but done with chilling effectiveness.  Early Hollywood horror films likewise made great use of atmosphere.  If you look at a movie like Dracula, there is actually very little on-screen violence present.  Instead, the film creates a feeling of dread through the gloomy atmosphere of the vampire's castle.  Thanks to that, and to Bela Lugosi's iconic performance, you don't need to see the bloodletting of Dracula's victims in order to be scared.  This has helped give these movies lasting power over so many years.  It's amazing that movies made in the early days of cinema can still be scary, given all the limitations they had.  And given all the bad things we've seen happen to movie vampires in recent years (I'm looking at you, Twilight), I'm glad that Lugosi's version of the Count can still create a chill.
Understandably, the horror genre has had to grow and evolve with the times in order to survive, but for many years there was still an emphasis on atmosphere at play.  The more rebellious era of the '70s allowed for more onscreen violence, and while many filmmakers perhaps went a little overboard in this period, there were a few who actually made an impact.  Dario Argento created films that were not only gory but also artistically staged, like The Cat o' Nine Tails (1971), Deep Red (1975), and the very twisted Suspiria (1977), which showed how atmosphere could still be used to enhance the gore on film.  Director George A. Romero likewise used atmosphere effectively in a sub-genre of horror that he helped create: the zombie flick.  Despite the fact that these directors were given more leeway to do what they wanted, what made their early work so effective was how they showed restraint.  You can show a lot more in horror movies nowadays, but sometimes what remains unseen is the scariest element, and that's why films of this era managed to be effective.  The filmmakers knew when to be shocking and when to show restraint, based on what the horror movies that inspired them had done in the past.  But as generations of filmmakers become more desensitized to what can be shown in a horror movie, that sense of restraint goes away.
The problem I see in most modern horror movies is that there is no self-restraint left in them.  For the most part, the filmmakers choose to throw atmosphere out the window in favor of "jump scares."  A "jump scare" is when something suddenly pops onto the screen out of nowhere in an attempt to make the audience scream and jump all at once, usually accompanied by a loud music cue to maximize the effect.  A "jump scare" can work when it is used sparingly, but too many films today overuse it, which diminishes its effectiveness over time.  One of the best examples of a jump scare is actually in a film you would consider more of a thriller than a horror movie: Jaws (1975).  The scene in question is when scientist Hooper (Richard Dreyfuss) is investigating a shark attack on a fishing boat at night.  As he examines the hole in the bottom of the boat, a severed head suddenly pops out, creating a genuine scare for both him and the audience.  The scene is effective because it is unexpected and is built up by the atmosphere of the moment.  Also, it is one of the few times that director Steven Spielberg actually uses a "jump scare" in the movie.  The fewer times it happens, the more effective it is, and unfortunately that's a technique few horror filmmakers today understand.  When you use a technique too many times, it becomes tiresome and audiences become more aware of it.  Unfortunately, too many filmmakers get carried away and have too much fun creating these kinds of "jump scares."
Another problem I have noticed with modern horror films is the over-abundance of CGI.  While computer effects can sometimes be helpful in a horror film, like making it look like a character has lost a limb, or manipulating an environment in a way that defies physics, there is a larger problem of effects work making moments that should be scary less so.  The problem is that most computer effects look too artificial.  Of course, the puppetry and prosthetic work used in horror movies is far from realistic too, but those effects are at least physical in nature, and actors can still interact with them.  When you see a horror movie use CGI too much, you just know the actors are reacting to nothing but a green screen.  A recent movie like Mama (2013) loses all of its chill the moment the digital apparition appears.  This is more apparent in smaller-budget horror films, which you can kind of excuse due to budget limitations.  But when a bigger-budget horror film, like the upcoming Carrie remake, looks so pathetic because of overdone CGI effects, then you begin to see how digital imagery has a negative effect on the genre.  Even a good horror film like World War Z suffered from some unnecessary CGI work, which had the unfortunate effect of making the zombies less frightening.  If ever there was a place where I wish horror filmmakers would show more restraint, it would be here.
One more problem I see plaguing the horror genre is the lack of original ideas.  Today we are seeing the same kinds of ideas used over and over again.  Seriously, how many haunted house movies do we need?  Not only that, there are far too many remakes and sequels in the horror genre.  Do we really need seven Saw movies and four Paranormal Activities?  Horror sequels have become so absurdly common that we end up with ridiculous titles like The Last Exorcism Part II and The Haunting in Connecticut 2: Ghosts of Georgia; and yes, that second title is real.  I see it as commerce taking precedence over artistic vision, and as proof that film studios are more likely to invest in something already established than in something new.  Every now and then, you do see a movie with a fresh idea come about, like Paranormal Activity in 2007, but even that was driven into the ground with too many follow-ups and diminishing returns.
Remakes are also a negative factor in horror movies today.  What you usually see in these horror remakes are films that strip away all the atmosphere of the originals in favor of upping the gore and the scares, just because filmmakers can now show what could only be implied in the past.  The problem is that this completely misses the point of what made the original films so effective in the first place.  A particular example is the terrible remake of John Carpenter's The Thing, which loses all of the substance of the original in favor of just making the film as gory as possible.  Gore does not equal scary.  Filmmakers like Carpenter knew that, and that's why they used gore sparingly.  The sad thing is that remakes keep trying to one-up the originals because the tools today are so much better, and they fail miserably every time.
Thankfully, despite Hollywood's attempts to push the horror genre into more exploitative territory, the classics still hold up all these years later.  Even a 90-year-old film like Nosferatu still gives audiences chills to this day.  And I think it all comes down to atmosphere.  It's like telling ghost stories around a campfire.  Would you rather listen to the story that builds up to a chilling ending that'll leave you with nightmares, or to one that gets caught up in the gory details and then just ends without a payoff?  That's what's being lost in horror movies today.  The classics knew how to build their stories around scary ideas, and not just imagery.  The Twilight Zone became popular on television because it presented us with unsettling scenarios that made us more anxious the longer we thought about them.  Not once in the famous episode did the monster on the wing of the plane actually attack William Shatner; it was the frightening possibilities of what could happen, along with Shatner's paranoid performance, that made the episode scary.  The best horror movies have staying power because they knew their audiences had imaginations capable of filling in the gory details that remained unseen.
So, is horror a dying genre?  Of course not.  There is an abundance of terrible horror movies out there, but that's only because the market has been flooded.  Every now and then, a fresh new idea comes along and not only makes an impact, but goes on to influence the genre as a whole.  One thing I would like to see an end to in the horror genre is the over-abundance of terrible remakes.  Just watching the new Carrie remake trailer makes me laugh, because it takes everything that worked in the original and makes it less subtle.  I believe it strongly: CGI, and shaky-cam for that matter, are making horror films less frightening.  They are showy techniques that ruin the atmosphere needed for a good horror movie, and I wish more filmmakers would show restraint.  I've generally stayed away from new horror films because of this, and the horror movies I gravitate toward are ones that have been around a long time.  If you're wondering which one I consider my favorite, it's Stanley Kubrick's The Shining (1980).  Talk about a film that makes the most out of its atmosphere.  I hope other horror filmmakers take a look at what makes the classics as scary as they are, and learn the effectiveness of restraint.  You'd be surprised how far a little scare can go when it's built up well enough.

The Best of the Worst – Why We Have a Good Time Watching Bad Movies

 

[Image: Manos: The Hands of Fate]
If there was ever a place where the word "bad" could be considered a relative term, it would be in the movies.  Over the course of film history, we have seen Hollywood and the film industry at large put out an astounding variety of movies, and not all of them have hit their targets the way the filmmakers intended.  If you produce hundreds of products in a given year, the odds are that some, if not most, of them are not going to be good.  But like most things, one man's trash can be another man's treasure, and that has led to a fascinating occurrence in the film community.  Some "bad movies" have actually earned fanbases all their own, finding an audience in some unexpected ways.  This has been the case with films that have built a reputation over time, but nowadays, we are actually seeing intentionally bad movies become phenomenally successful upon their initial release, as was the case with the premiere of Sharknado on the Syfy Channel earlier this summer.  How and why "bad movies" find their audiences is still a mystery to many, but what I love about this trend is that it shakes up our preconceived notions about the film industry and makes us reconsider what we find entertaining in the first place.
So, what is it about these "bad" movies that makes them so entertaining to us?  The truth is that there is no one thing that defines the success of these films; usually it's relative to each individual movie.  Sometimes it's the incompetence behind the making of the film that we find so entertaining.  Sometimes it's because the film is so out-of-date that it becomes hilarious.  Sometimes it's the lack of self-awareness and the ego behind the director's vision.  And sometimes it's the filmmakers just not giving a damn what other people think and going all out with their material.  The formula has no consistency, and yet we see many films fall into these different categories of "bad" film.  Usually the best of these are the ones that fulfill the criteria of a "bad" movie so perfectly that they become memorable and re-watchable.  Only in rare cases does this work intentionally; usually the best "bad" films arise from an unexpected accident.
Some of the best "bad" movies have come out of turmoil, which makes their existence all the more fascinating.  Usually this applies to movies that were made despite the fact that their filmmakers didn't know what they were doing.  One of the most notorious examples is the 1966 cult classic Manos: The Hands of Fate.  Manos was the creation of Hal Warren, a fertilizer salesman from El Paso, Texas, who made a bet with a screenwriter friend that he could make his own movie without any help from Hollywood.  Making good on his wager, Mr. Warren wrote and directed this schlocky horror film centered around a cult leader named the Master (pictured above) who holds a family hostage in his compound, which is watched over by a lecherous caretaker named Torgo.  Hal Warren shot the film with a camera that could only capture about 30 seconds of footage at a time with no recorded sound, and most of the movie was shot at night, with the set lighting attracting moths in almost every shot.  The finished film is a convoluted mess, and it ended any shot Hal Warren had at a career in filmmaking.  However, many years later, the film was rediscovered by the producers of Mystery Science Theater 3000, who featured it on their show and created a renewed interest in this odd little film that no one outside of Texas knew about.  Manos became a hit afterwards because people were fascinated by how silly this poorly made film was, something the MST3K crew had a hand in highlighting.  Since then, Manos has earned a reputation as one of the worst films ever made, and that in itself has made it a favorite for people who gravitate toward that sort of thing.
While Manos represents an example of a disaster turned into a success, there are other bad films that have become fan favorites simply by being incredibly dated.  These movies usually make up the majority of what people consider good "bad" films, since most films are products of their times.  Whether people are entertained by their out-of-date nature or merely by sheer nostalgia, there's no denying that time has a way of changing how we view these kinds of movies.  The 1950s have become an era that many film fans find to be full of good trash, mainly due to the rise of the B-movie in that period.  Cult hits like The Blob (1958), Creature from the Black Lagoon (1954), Attack of the 50 Foot Woman (1958), and The Thing from Another World (1951) all rode the surge of the sci-fi craze of the post-war years, and while everything about these films, from the visual effects to the acting, feels antiquated today, they still have a camp value that makes them watchable all these years later.  The "cheese factor" plays a big role in keeping these films entertaining long after their relevance has diminished.  You can see this also in the beach party movies of the early '60s, which are charming despite their paper-thin plots.  The other era that has produced its own distinctive set of dated films is the 1980s, with its collection of dated fantasy pictures and culturally infused fluff, like the He-Man-inspired Masters of the Universe (1987) or the E.T. wannabe Mac and Me (1988).  By all accounts, these films should have long been forgotten outside of their era, and yet they have lived on with audiences who still find something entertaining in them.
One of my favorite types of "bad" film is the kind that comes from a complete lack of control on the part of either the director or the performer.  There have been some directors who actually gained their reputations as filmmakers by staying within the B-movie community.  The most famous of these has become Ed Wood Jr., a person some have claimed to be the worst director in history.  Ed Wood's notable contributions to cinema include the cross-dressing comedy Glen or Glenda (1953), the Bela Lugosi-starring Bride of the Monster (1955), and what many consider the director's "masterpiece," Plan 9 from Outer Space (1959).  Whether Ed Wood was earnest in his vision or made his films intentionally bad is still debated, but there is no doubt that Plan 9 is a special kind of "bad": a movie so aggressively cheesy that it is hard not to be entertained by it.
Other filmmakers who were more aware of their B-movie status have still gained an honored reputation with audiences.  Roger Corman, a man who prided himself on making movies both fast and cheap, has actually become influential to a whole generation of blockbuster filmmakers.  His The Little Shop of Horrors (1960) even inspired a hit stage musical.  Also, sometimes a way-out-there performance can make a "bad" movie worth watching.  I would argue that this is the case with most Nicolas Cage films, like Vampire's Kiss (1988) or Ghost Rider: Spirit of Vengeance (2012).  One film that has become a cult classic mainly due to one out-of-control performance is 1981's Mommie Dearest, where Faye Dunaway chews the scenery, in a good way, as a wildly over-the-top Joan Crawford.  Usually a lack of restraint by the filmmakers can sink a film, but these movies prove that it isn't always the case.
While many films become cult hits over time, there are a select few that attempt to achieve cult status right away by being intentionally bad.  As I stated earlier, Sharknado became an instant hit when it premiered on cable, and having seen the film myself, it's clear that the filmmakers behind it knew exactly what kind of movie they were making.  Rarely do you see filmmakers aim for that intentionally "bad" gimmick, because obviously if audiences don't accept it, then you've just made a bad movie.  Director Tim Burton tried to create an homage to B-movie sci-fi with his 1996 film Mars Attacks!, but the film was an odd blend of tongue-in-cheek mockery and earnest storytelling, and the end result doesn't achieve what it set out to do.
But one example of an intentionally bad film that did click with audiences is the campy musical The Rocky Horror Picture Show (1975), a movie that pays homage to campy horror and sci-fi while mixing in 50’s rock music and transsexual humor.  Rocky Horror tries so hard to be so bad that you would think the whole thing would be a mess; and yet, it remains entertaining and has one of the most dedicated fanbases in the world.  I think the reason a movie like Rocky Horror works is that it just doesn’t care what people will think about it.  It is what it is, and that’s why people gravitate to it.  It’s one of a kind.  A movie like Mars Attacks! didn’t click as a throwback because it didn’t have that same kind of assured belief in itself, and that shows why it is hard to make a bad movie feel good.
When it comes down to it, “bad” movies are usually determined by the tastes of the people who watch them.  We have made some of these “bad” movies our favorites because of the value we find in their cheesiness, or because of our fascination with how badly they get things wrong.  For a movie to be all-around bad, it has to lack any kind of entertainment value in the end.  For those who are wondering, the worst movie that I have ever seen, and one I see no redeeming value in, is the 1996 film Space Jam.  To me, it was the worst experience I have ever had watching a movie, mainly because I saw it as a blatant self-serving promotional piece for a sports superstar (Michael Jordan), and it ruined three things on film that I love dearly: NIKE, Looney Tunes, and Bill Murray.  But I do recognize that the film has its fans, so in the end it all comes down to taste.  Still, it is fascinating how our tastes leave room for something as poorly made as a Manos or even the more recent Birdemic: Shock and Terror (2010), a movie that needs to be seen to be believed.  There is certainly value in seeing something that we find entertaining, and perhaps that is why these films live on the way they do.

Inspired by a True Story – The Process of Showcasing History in Hollywood

 

This week, two very different biopics open in theaters, both ambitious but at the same time controversial.  What we have are Ashton Kutcher’s Jobs and Lee Daniels’ The Butler (you can thank uptight Warner Bros. for the title of the latter film).  Both are attempting to tell the stories of extraordinary men in extraordinary eras, while at the same time delving into what made these people who they are.  But what I find interesting is the different receptions that these two movies are receiving.  Lee Daniels’ The Butler is being praised by both audiences and critics (it’s holding a 73% rating on Rottentomatoes.com at the time of writing this article), while Kutcher’s Jobs is being almost universally panned.  One could argue that it has to do with who’s making the movies and who has been cast in the roles, but it also stems from larger lessons that we’ve learned about the difficult task of adapting true-life histories onto film.  The historical drama has been a staple of film-making from the very beginning of cinema.  Today, a historical film is almost always held to a higher standard by the movie-going public, and so it must play by different rules than other kinds of movies.  Often it comes down to how accurately a film adheres to historical events, but that’s not always an indicator of a drama’s success.  Sometimes, it may work to a film’s advantage to take some liberties with history.
The Butler and Jobs represent the most common form of historical drama: the biopic.  In this case the subjects are White House butler Cecil Gaines, portrayed by Oscar-winner Forest Whitaker, and visionary Apple Computer co-founder Steve Jobs.  Both are men who hold extraordinary places in history, but in very different ways.  Despite the differences in the subjects, it is the history that surrounds them that plays the biggest part in the storytelling.  Filmmakers love biopics because they allow them to teach a history lesson while at the same time creating a character study of their subject.  Usually the best biopics center around great historical figures, but not always.  One of the most beloved biopics of all time is Martin Scorsese’s Raging Bull (1980), which tells the story of a washed-up heavyweight boxer who was all but forgotten by the public.  Scorsese was attracted to this little-known story of boxer Jake LaMotta, and in it he saw a worthwhile cautionary tale that he could bring to the big screen.  The common man can be the subject of an epic story if his life is compelling enough.  But there are challenges in making a biopic work within a film narrative.
Case in point: how much of the person’s life story do you tell?  This can be the most problematic aspect of adapting a true story to the big screen.  Some filmmakers, when given the task of creating a biopic of a historical figure, will try to present someone’s entire life in a film, from cradle to grave.  This sometimes works, as in Bernardo Bertolucci’s The Last Emperor (1987), which flashes back to its protagonist’s childhood years frequently throughout the narrative.  Other times, it works best to focus on one moment in a person’s life and use that as the lens for understanding who they were.  My all-time favorite film, Lawrence of Arabia (1962), accomplishes that feat perfectly by depicting the years of Major T.E. Lawrence’s life when he helped lead the Arab revolts against the Turks in World War I.  The entire 3 1/2 hours of the film never deviates from this period, except for a funeral prologue at the beginning, and that is because the film is not about how Lawrence became who he was, but rather about what he accomplished during these formative years of his life.  How a film focuses on its subject is based on what the filmmaker wants the audience to learn.  This can be a problem if the filmmaker doesn’t know what to focus on.  One example is Richard Attenborough’s Chaplin (1992), which makes the mistake of trying to cram too much of its subject’s life into one film.  The movie feels rushed and unfocused, and that hurts any chance it has of understanding the personality of Charlie Chaplin, despite actor Robert Downey Jr.’s best efforts.  It’s something that must be considered before any biopic is put into production.
Sometimes there are great historical dramas that depict an event without ever centering on any specific person.  These are often called historical mosaics.  Oftentimes, this is where fiction and non-fiction can mingle together effectively without drawing the ire of historical nitpicking.  It’s where you’ll find history used as a backdrop to an original story-line, with fictional characters participating in a real-life event, sometimes even encountering a historical figure in the process.  Mostly, these films will depict a singular event using a fictional person as a sort of eyewitness that the audience can identify with.  You see this in films like Ben-Hur (1959), where the fictional Jewish prince lives through and bears witness to the life and times of Jesus Christ.  More recently, a film like Titanic (1997) brought the disaster to believable life by centering a tragic love story around it.  Having the characters in these movies be right in the thick of historical events is the best way to convey an event’s significance to an audience, because it adds a human connection to the moment.  Titanic and Ben-Hur focus on singular events, but the same principle holds true for a film like Forrest Gump (1994), which moves from one historical touchstone to another.  Forrest Gump’s premise may be far-fetched and the history a little romanticized, but it does succeed in teaching us about the era, because it comes from that first-hand perspective.  It’s that perspective that separates a historical drama from a documentary, because it helps to ground the imagination behind the fictional elements in our own lives and experiences.
Though most filmmakers strive to be as historically accurate as they can be, almost all of them have to make compromises to make a film work on the big screen.  Often, a story needs to trim much of the historical material and even, in some cases, take the extraordinary step of rewriting history.  You see this a lot when characters are created specifically for a film as a means of tying the narrative together, either by combining many different people into a single amalgam, or by simply inventing a fictional person out of nowhere.  This was the case in Steven Spielberg’s Catch Me If You Can (2002), which followed the extraordinary life of Frank Abagnale Jr. (played by Leonardo DiCaprio), a notorious con artist.  In the film, Abagnale takes on many different identities, but is always on the run from a persistent FBI agent named Carl Hanratty (Tom Hanks).  Once finally caught, Abagnale is reformed with the help of Hanratty, and the film’s epilogue includes the statement that “Frank and Carl remain friends to this day.”  This epilogue had to be meant as a joke by the filmmakers, because even though Frank Abagnale is a real person, Carl Hanratty is not.  He’s an entirely fictional character created as a foil for the main protagonist.  It’s not uncommon to see this in most films, since filmmakers need to take some liberties to move a story forward and fill in some gaps.  Other films do the riskier job of depicting real history and changing much of it in service of the story.  Mel Gibson’s Braveheart (1995) takes so many historical liberties that it almost turns the story of Scottish icon William Wallace into a fairy tale; but the end result is so entertaining that you can sometimes forgive the filmmakers for the changes they made.
But while making a few changes can be a good thing, there is a fine line beyond which it becomes a disservice to a film.  It all comes down to tone.  Braveheart gets away with more because its subject is so larger than life that it makes sense to embellish the history a bit and make it more legend than fact.  Other films run the risk of either being too irreverent to be taken seriously or too bogged down in the details to be entertaining.  Ridley Scott crosses that line quite often with his historical epics, and while he comes out on the right side occasionally (Gladiator and Black Hawk Down), he lands on the wrong side just as often (Robin Hood, 1492: Conquest of Paradise, the theatrical cut of Kingdom of Heaven).  Part of Scott’s uneven record is due to his trademark style, which serves some films fine but feels out of place in others.  Tone is also set by the casting of actors, and while some feel remarkably appropriate for their time periods (Daniel Day-Lewis in Lincoln, for example), others feel too modern or awkwardly out of place (Colin Farrell in Alexander).  Because historical films are expensive to make, compromises on style and casting are understandable, but they can also do a disservice to the story and shed accountability to the history behind it.  While stylizing history can sometimes work (Zack Snyder’s 300), there are also cinematic styles that will feel totally wrong for a film.  Does the shaky camera work, over-saturated color timing, and CGI enhancement of Pearl Harbor (2001) teach you anything more about the history of the event?  Doubtful.
So, with Lee Daniels’ The Butler and Jobs, we find two historical biopics that are being received in very different ways.  I believe The Butler has the advantage because we don’t know much about the life that Mr. Cecil Gaines lived.  What the film offers is a look at history from a perspective that most audiences haven’t seen before, which helps to shed some new light on an already well-covered time period.  Jobs has the disadvantage of showing the life of a person we already know everything about, and as a result it brings nothing new to the table.  Both films are certainly Oscar bait, as most historical films are, but The Butler at least took on more risks in its subject matter, which appears to have paid off in the end.  Jobs just comes off as another failed passion project.  What it shows is that successful historical dramas find ways to be both educational and entertaining; and on occasion, inspiring.  That’s what helps to make history feel alive for us, the audience.  These films are the closest thing we have to time machines, letting us be eyewitnesses to our own history.  And when it’s a good story, it stays with us for the rest of our lives.

Thrown into the Briar Patch – The Uneasy and Confusing Controversy of Disney’s “Song of the South”

 

What does it take to blacklist a whole film?  Walt Disney’s 1946 film Song of the South has the dubious distinction of being the only film in the company’s history to be declared un-releasable.  Many people state that it’s because of the perception that the film has a racist message and that it sugarcoats and simplifies the issue of slavery in an offensive way.  I would argue that it’s not right to label a film one way without ever having seen it, but unfortunately Disney is reluctant to even let that happen.  What is interesting is that by putting a self-imposed ban on the distribution of the film, Disney is actually perpetuating the notion that Song of the South is a dangerous movie, due to the stigma it holds as the one film that they refuse to make public.  Disney, more than any other media company in the world, is built upon their wholesome image, and for some reason they are afraid to let their guard down and air out their dirty laundry.  But is Song of the South really the embarrassment that everyone says it is, or is it merely a misunderstood masterpiece?  Thankfully, I have seen the film myself (thank you, Japanese bootlegs and YouTube), so I can actually pass judgment on it, and like most other controversial things, you gain a much different perspective once you remove all the noise surrounding it.
For a film that has gained such a notorious reputation over the years, the actual history of the production is relatively free of controversy.  Walt Disney wanted to adapt the Uncle Remus stories, the popular African-American folktales published by Joel Chandler Harris in post-Reconstruction Georgia.  Disney said that these stories were among his favorites as a child, and he was eager to bring their moralistic tales to life through animated shorts starring the characters Brer Rabbit, Brer Fox, and Brer Bear.  The film was a breakthrough production for the Disney company as a mix of live action and animation.  Sequences where the live-action Uncle Remus interacts with the cast of animated critters astonished audiences, and the visual effects were highly praised at the time; remember, this was almost 20 years before Mary Poppins (1964), itself a hybrid film.  Walt Disney treated the subject material with great reverence and brought in the best talent possible to work on the film, including Oscar-winning cinematographer Gregg Toland (Citizen Kane, The Grapes of Wrath).  Disney was especially proud of the casting of James Baskett as Uncle Remus, and he even campaigned heavily to earn Mr. Baskett an Oscar nomination for his performance.  Baskett wasn’t nominated, but he did win a special honorary Oscar in recognition of his work on the film.  The movie was a financial success and it did earn another Oscar for the song “Zip-a-Dee-Doo-Dah,” which has become a sort of unofficial anthem for the Disney company.
Surprisingly, the film would be re-released regularly for decades afterwards.  It even provided the inspiration for what is still one of Disneyland’s most popular attractions: Splash Mountain.  It wouldn’t be until after a short theatrical run in 1986 that Disney began their policy of keeping the film out of the public eye.  Not surprisingly, this was also around the same time that a new corporate team, led by Michael Eisner, had taken over operation of the company, bringing with them a whole new mindset centered around brand appeal.  While Song of the South had sometimes been called out in the past by organizations like the NAACP for its quaint portrayal of post-slavery life, the film was not considered an outright embarrassment.  It was merely seen as a product of its time and was much more notable for its animated sequences than for its actual story line.  But once Disney made it their policy to shelve the film for good, based on the perception that it made light of slavery, that’s when the controversy started heating up.  To this day, Song of the South has yet to receive a home video release here in the United States, and Disney continues to stand by their decision not to make the film public.
So, having seen the actual film, I get the impression that Disney didn’t ban it just because of its content, but rather as an attempt to keep their image as clean as possible.  My own impression of the film is this: it’s harmless.  Don’t get me wrong, it is not the most progressive depiction of African-American life in America, and some of the portrayals of the ex-slave characters are certainly out of date to the point of being cringe-inducing.  But it’s no worse than a film like Gone with the Wind (1939), and that film is considered one of the greatest movies of all time.  If Song of the South has a flaw, it’s that it’s boring.  The movie clearly shows Walt Disney’s lack of experience in live-action film-making, as the main story is very dull and flimsy.  Basically it follows a young Southern boy, played by Disney child star Bobby Driscoll (Peter Pan), as he deals with the break-up of his family and finds solace in the stories told to him by a former slave, Uncle Remus.  There’s not much more to it than that.  Where the film really shines is in its animated sequences, which are just as strong as anything else Disney was making in the post-war era.  The art style in particular really stands out, and conveys the beauty of the Southern countryside perfectly.
Ultimately, I believe there’s a different reason why the film has garnered the reputation that it has.  Disney is a big company that has built itself around an image.  Unfortunately, when you go to such extremes to keep your image as flawless as it can be, it’s going to make other people want to tear it down even more.  There are a lot of people out there who hate Disney purely because of that wholesome image, and when they find cracks in the facade, they are going to keep exploiting them whenever possible.  Walt Disney himself has been called everything from racist to anti-Semitic, and if you actually dig deeper into any of those claims, you’ll find that there’s little truth to them and that they usually trace back to people who came from rival companies or had a contract dispute with Mr. Disney.
Unfortunately, by trying so hard to sweep so much under the rug, the Disney company opens itself up to these kinds of accusations, and they have no one to blame for that but themselves.  Walt Disney was not a flawless man by any means, and the company has made embarrassingly short-sighted decisions in the past; hell, they’re still making them now (John Carter, The Lone Ranger).  But their flaws are no worse than the ones that plague other companies in Hollywood.  Just look at the racial stereotypes in old Warner Brothers cartoons; there was an actual war propaganda Looney Tunes short called Bugs Bunny Nips the Nips, which is about as racist as you can get.  The only difference is that Warner Brothers has not shied away from its past embarrassments, and has made them public while stating the historical context of their productions.  As a result, Warner Brothers has largely avoided the “racist” label and kept its image intact.  For some reason, Disney doesn’t want to do that with Song of the South, despite the fact that Disney has made public some of their older shorts that are far more overtly racially insensitive than the movie.  There are shorts from the 1930’s that showed Mickey Mouse in blackface, and yet they still got a video release as part of the Walt Disney Treasures DVD Collection.  I think the reason why Song of the South didn’t get the same treatment is that it’s such a polished and earnest production; it’s probably easier to dismiss the silly cartoons for their flaws because they’re less significant.
Regardless of how accurately it addresses the issues of slavery and the African-American experience, Song of the South should at least be given the opportunity to be seen.  It’s a part of the Disney company’s history whether they like it or not, and to sweep it aside is doing a disservice to the Disney legacy as a whole.  Being a white man, I certainly can’t predict what the reaction from the African-American community will be, but is that any excuse to hide the film from them?  Maybe black audiences will come to the film with an open mind; quite a few at least.  It just doesn’t make any sense why this is the film that has been deemed un-watchable when a film like Gone with the Wind, which is very similar content-wise, is heralded as a classic.  Even D.W. Griffith’s The Birth of a Nation (1915) is available on home video, and that film openly endorses the Ku Klux Klan.  Song of the South is harmless by comparison, and the worst that you can say about it is that it’s out of date.
As a film, I would recommend that everyone give it at least a watch, if you can.  The animated sequences are definitely worth seeing on their own, and I think some people will appreciate the film as a sort of cinematic time capsule.  While the African-American characters are portrayed in a less than progressive way, I don’t think that it’s the fault of the actors.  James Baskett in particular does the most that he can with the role, and it’s hard not to like him in the film.  He also does double duty playing both Uncle Remus and the voice of Brer Fox, which shows the range that he had as a performer.  The music is also exceptional, with songs like “The Laughing Place,” “Sooner or Later,” “How Do You Do?” and the Oscar-winning “Zip-a-Dee-Doo-Dah”; crowd-pleasers in every way.  It’s definitely not deserving of the reputation it’s gotten.  Disney’s reluctance to make the film available just goes to show the folly of trying to keep a flawless image, when it would actually serve them better to have it out in the open.  Sometimes you just need to take your medicine and let things happen.  After all, aren’t the people who ride Splash Mountain every day at Disneyland going to wonder someday what film it’s all based on?

Nerd Heaven – The Highs and Lows of Marketing at San Diego Comic Con

 

[Photo: Spider-Man addressing the crowd at San Diego Comic Con]
This is no ordinary corporate showcase.  In the last decade or so, the San Diego Comic Convention (SDCC), more commonly known as Comic Con, has become a full-fledged festival for the whole of nerd-dom.  Not only is it a great place for fans to encounter their favorite artists and filmmakers in person, but it’s also a great place for Hollywood to showcase their tent-pole productions to an eager audience.  In all, it’s a celebration of all forms of media, where the experience of the presentations and panels can often overshadow the actual products themselves.  But while everything is all in fun at Comic Con, the business end is what matters most on the actual show floor.  As with all conventions, Comic Con is geared toward marketing.  Big studios and publishers get the most attention in the media coverage of the Con, but SDCC started with the small vendors, and they continue to be part of the backbone of the whole show.  For everyone involved, there is a lot at stake in these four packed days in mid-July.
Up-and-coming artists, journalists, and filmmakers are just as common among the visitors as they are among the headliners, and the mingling of different talents defines much of the experience at the Con.  While many people get excited by the surprises on hand, that excitement can sometimes have difficulty extending outside the walls of San Diego’s Convention Center.  Marketing to a crowd of fans is much different than marketing to a general audience.  I believe Comic Con works best as a testing ground for marketing strategies in the bigger push of selling a project to the world.  Sometimes a lot of buzz can be generated with a surprise announcement or a well-placed tease.  One clear example at Comic Con this year was the surprise announcement of a Superman and Batman movie coming in 2015.  The announcement was a bombshell for the fans who witnessed it live at the convention, and it extended into a media blitz that spread quickly through all news sources that same day.  The surprise effectively gained needed attention for a project that so far has only been in the planning stages.  Where the risk lies is in the effectiveness of this kind of moment, and there can be no more unforgiving audience than one made up of nerds.
Many of the big studios have figured this out over time, and the planning of their showcases at Comic Con is almost as intricate as the projects they’re trying to sell.  One thing they have certainly learned is that Comic Con patrons are extremely discerning, and are often even more informed about the different projects than the talents involved.  There is a fine line between excitement and scorn within the fan community, and if you fall on the wrong end of that line, it can be brutal.  Comic Con is all about fan service, which is no surprise to anyone.  This year in particular, there were more instances of stars making appearances in costume than ever before.  As you can see in the photo above, Spider-Man was there to address the audience in person, which was a special treat considering that the film’s star, Andrew Garfield, was the one behind the mask.  Avengers villain Loki also showed up to introduce footage from the upcoming Thor sequel, with actor Tom Hiddleston completely in character the whole time.  All of these moments make the live presentations far more entertaining, and that in turn helps to make the audience even more enthusiastic about the upcoming films.  Comic Con is a place where theatrics meets commerce, and where a well-made sales pitch can turn into a fanboy’s dream come true.
Given that SDCC started as a showcase for comics, it’s no surprise that Marvel and DC put on the biggest shows, and that, through all their experience, they connect with their audiences better than anyone else.  But more recently, the showcases have steered away from the printed page and have focused more on the silver screen.  It’s not that Comic Con has abandoned the medium that started it all; print comics still have a place on the convention floor.  It’s just that the movie industry is bigger and more involved, and has seen the benefits of marketing at the convention.
With production budgets rising, Comic Con has become more important than ever as a way to generate enthusiasm for film projects, even ones that have trouble getting attention.  Several years ago, Disney made a surprise announcement at Comic Con that a sequel to their cult hit Tron (1982) was in the works, highlighted by a teaser trailer.  Little was known or said about the project, and Disney wasn’t quite sure if it would go anywhere past development, so the trailer was made as a way to test the waters.  The reception they got was overwhelming, especially when it was revealed that the original film’s star, Jeff Bridges, was involved, and production went full steam ahead afterwards.  Few expected a Tron sequel to be newsworthy, let alone the hot topic of conversation at the convention, but Disney showed that year what a simple surprise could do to generate excitement.  Since then, surprises have not only become more frequent; now they are expected.
That sometimes leads to unforeseen consequences in a high-stakes venue like this.  When the audiences are expecting a surprise to happen at any moment, it puts even more pressure on the marketing teams to deliver the goods.  There have been many cases where a production company ends up promising too much and then fails to deliver.  A couple years ago, Guillermo del Toro teased the crowd at a Disney presentation by revealing his involvement in a new Haunted Mansion film, which he promised was going to be more spiritually faithful (no pun intended) to the original Disneyland ride than the Eddie Murphy flop had been.  It was an exciting announcement at the time, but several years later, almost nothing new has been heard about the project, and with del Toro taking on more and more new projects, it’s becoming increasingly obvious that this particular one is probably not going to happen.  Other broken promises have included several announcements of a Justice League movie, including one that is still out there and remains to be seen, and news that TV scribe David E. Kelley was going to give Wonder Woman a new TV series, which led to a disastrous pilot episode that never got picked up.  This is why production companies need to show good judgment when they present their projects at Comic Con.  Once you make a promise, you have to commit.  If you don’t, no one will take those promises seriously, and the whole aura of a Comic Con surprise will stop working.
In many ways, Comic Con has become a more favorable place for television than film.  TV shows like Game of Thrones, The Walking Dead, Dexter, and Doctor Who can benefit from all the same kinds of media buzz that a theatrical film can get at the Con, without the pressure of marketing a massive project with a $250 million budget; although TV budgets are rising too.  Comic Con isn’t the only platform for marketing a film, but it’s certainly one of the biggest, and the stakes are getting higher.  In a year like 2013, which has seen numerous under-performing films hit theaters this summer, the pressure is on when it comes to getting the message to resonate beyond the cheering fans in Hall H.  I don’t envy the people behind the Comic Con presentations one bit, because they have so much resting on their shoulders.  And when you’re dealing with a fan base this well informed, it’s a wonder how they keep the surprises coming.
I should note that I have yet to attend Comic Con myself.  My observations are from an outsider’s perspective, though I do follow the live news coverage of the convention every year with great anticipation.  I hope to someday see it for myself, just to take in the whole carnival-esque atmosphere of the place.  I’m not sure if I’ll attend in costume like all the cosplaying regulars there, but then again, “when in Rome…”  Overall, there’s no doubt that Comic Con is one of the most important institutions we have in our media culture today, and it will continue to be for many years to come.  There are comic conventions to be found across the world, but this is the granddaddy of them all, and no other convention has this kind of influence on the film industry.

The Terrible Threes – The Hard Road of Second Sequels

 

The number 3 seems to be unlucky for film franchises.  That’s the thought that came to mind when I watched The Hangover Part III.  Short review: it sucked, and I’m beginning to see how it fits into a pattern.  Movie franchises seem to fizzle out around the point that a third entry is released.  Unless it’s part of a pre-planned trilogy, like The Lord of the Rings, it is very rare to see a second sequel rise to the level of its predecessors.  So why do so many filmmakers insist on moving forward with a series that has clearly lost steam after two films?  The simple fact is that sequels are easy to make, and unfortunately the law of diminishing returns applies far too often.  In many cases, the first and second sequels just repeat the formula of the initial film, which not only shows a loss of creativity, but also defeats the purpose of building up the brand in the first place.  Audiences naturally want to see new things when they watch a movie, even when it comes from a sequel.  Some sequels do manage to breathe new life into familiar stories, or even deviate from the previous ones in wild and interesting ways.  But while you can sometimes catch lightning in a bottle twice, it rarely happens a third time.
There are many factors that go into making a great sequel.  A sequel has to know what made the first film a success and do the same thing, only bigger.  In some cases, a sequel can even far exceed its predecessor.  Director James Cameron seems to take that principle to heart when making sequels to his films.  With Terminator 2: Judgment Day (1991), he not only continued the story of the first film, but made it bigger and more epic in the process.  For many people, it’s the movie they most think about when they hear the word “Terminator.”  It’s no simple feat for a sequel to be the definitive entry in a series.  A more recent example would be Christopher Nolan’s The Dark Knight (2008), which became so popular that it changed the way we market superhero movies today.  We no longer look at Nolan’s films as the Batman Begins trilogy.  Instead, it’s considered the Dark Knight trilogy, which is the direct result of the sequel overshadowing the first film.
Though the track record for a first sequel is good, there’s less success when it comes to the second sequel.  Once a series hits its third entry, that’s the point where it begins to show signs of exhaustion.  By this point, filmmakers are almost trapped by their own success, having to keep something fresh and interesting long after the good ideas have been used up.  Like I mentioned before, unless a series was planned long ahead of time as a trilogy or more, most of the creativity will be spent by the time the third film comes along.  It’s very hard to be a sequel to a sequel, and audiences can only take so much of the same story before they lose interest.
The genre that seems to suffer the most from this third-film curse is the superhero genre.  Usually the superhero films that carry a 3 next to their name have ended up being the most criticized by their fans.  We’ve seen this with films like Superman III, Spider-Man 3, X-Men: The Last Stand, and, it appears, this year’s Iron Man 3 as well.  Even Christopher Nolan’s critically lauded The Dark Knight Rises failed to deliver for some fans.  As is the case with most of these films, they are the follow-ups to some very beloved sequels, ones that fans had hoped these trilogy cappers would build upon.  There are a few reasons that could explain why these films have fallen short: one, the audiences’ expectations were just too high for the filmmakers to meet; two, the filmmakers deviated too much from a proven formula as a means to spur on their creative juices; or three, the filmmakers had clearly lost interest and were just fulfilling their obligations.  The worst case is when a series decides that it’s ready to be done without the foresight of establishing a way to wrap up the story.  This was the case with X-Men: The Last Stand (2006), which haphazardly crammed a bunch of story points and characters into a film that didn’t need them in order to meet fans’ expectations, while cutting the story off way too short.  The final result was a jumbled mess that pleased no one and hurt the brand for years to come.
Other franchises also suffer from this pattern, but for some very different reasons.  Sometimes a series does plan ahead and builds a trilogy off the original film’s popularity, leading to the production of two films at once.  This, however, is a huge risk, because it puts the pressure on the middle film in the series to deliver; otherwise the third film will be left hanging if it doesn’t work.  This has happened on several occasions, such as with the Back to the Future trilogy, the Matrix trilogy, and the Pirates of the Caribbean series.  The Pirates films in particular became so notoriously over-budgeted that it effectively ended the practice of studios shooting sequels simultaneously.  While the receptions of these films were mixed, there was no denying that the series lost steam the longer they went on.  The same happens in the opposite case as well, when an unnecessary third film is made many years after the previous sequel.  The Godfather Part III (1990) is probably the most famous example, having come 16 years after the last film and extending a story that people thought was perfectly resolved, for no reason other than to do it all over again.
That’s exactly what most third films end up being: unnecessary.  That’s what I thought when I saw The Hangover Part III.  The series has long since exhausted its potential and is now running on fumes.  Whether the series could have sustained enough interest over three films is another question entirely.  It certainly had enough clout for one sequel.  But whether or not a film series makes it to a third film should be entirely the result of a need to explore the possibilities of the story, and not just a way to repeat the same formula for some quick cash.  These films must be able to stand on their own and not just be an extension of what came before.  The best trilogies are ones where each entry has its own identity and can entertain well enough on its own without feeling like an extended part of a greater whole.  Films like Return of the Jedi, Indiana Jones and the Last Crusade, Goldfinger, and The Return of the King are beloved because they entertain while also being an essential part of their overall stories.  And most importantly, they didn’t waste their potential; something the filmmakers behind the Hangover films should have considered.

Your Movie is Loading – Digital Innovations and the Resulting Tightened Gap Between Cinema and Home Entertainment

 

Growing up through the 80’s and 90’s, it was clear that going to the movies and watching one on TV were very different experiences.  But in the years since, technology has revolutionized the ways in which we experience a movie.  Thanks to innovations like movies on demand and digital cameras and projection, the line between the two experiences has been redefined.  Film companies can now premiere their projects on multiple platforms, whereas years ago you sometimes had to wait as long as a year after a film left theaters before you could buy it on video.  The accessibility of the internet has influenced that shift more than anything, allowing people to see what they want, when they want, all through video streaming.  Like most new things, this shift in how we watch movies has its pros and cons.  For one thing, it gives exposure to movies and media that normally wouldn’t have been seen years ago, while at the same time causing previous standards of the movie industry to become obsolete and forgotten.  We live in an era where things are changing rapidly, and I wonder whether these changes are just trends or are here to stay for good.
One thing that has changed movie watching dramatically is the actual digitizing of media for home viewing.  Before, we had to buy a tape or a disc to watch a film at home, but nowadays many people are opting to cut out the middle man and download a film off the internet.  Services like iTunes allow for the purchase of a digital version of a film the same day it’s released in stores, and sometimes even earlier on certain exclusives.  It’s a good place to purchase a movie for those who don’t want to clutter their shelves with DVD boxes.  This has also changed the rental business, with services like Netflix and Hulu putting old juggernaut rental chains like Blockbuster out of business.  That development alone spells out just how powerful this new trend has become.
But what’s interesting about this change is that the film industry has yet to figure it out.  Accessing a movie through a digital copy or a streaming service is difficult because there is no standardization.  Certain movies are available on some platforms and unavailable on others, and it all depends on who has the contract with the retailer.  In years past, at least, you had standardization, with all movies released on the preferred format.  Yes, VHS and Blu-ray had to gain dominance over Betamax and HD DVD in order to become the standard, but once they did, the selection in a video store became only a matter of which title a person wanted.  Nowadays, someone who wants the digital copy of a film has to download multiple media players onto their computer or mobile devices just to watch the movies that they want.  And all of these media providers are competitive enough to survive in this market, so standardization will not be happening soon.  Perhaps it’s a good thing for there to be competition in the media market, as it leads to more innovation, but it has the consequence of making the market difficult to navigate.
One of the things that I do find to be a troubling change is the loss of a movie as an actual physical thing.  It may be strange to think of a movie as an object, but I consider myself a collector as well as a fan of cinema, and when I like a movie, I want to include it in my collection.  I have been collecting movies since childhood, and that has included VHS tapes, DVDs, and now Blu-rays.  I am the kind of person who has multiple copies of a single film in different formats, and my library is bound to keep growing for a long time to come.  To me, it’s just a nice feeling to be able to look at a film sitting on my shelf and see it as part of a physical history of cinema.  This is why I haven’t digitized my film collection.  I am far more likely to buy a disc of a film than download one, mainly because I still prefer holding a movie in my hand, even though I do understand the appeal of having everything stored on a computer.  For many people, though, a digital copy is the preferred method, and in many ways it’s the faster and easier mode of distribution.
This trend has definitely changed distribution in Hollywood in a good way.  Some movies in the past struggled to get appropriate distribution, whether because they lacked the funding or because they were just too risky a project for the studios to make a fuss about in the first place.  In some cases, movies would become hits long after their run in theaters, once they were seen on cable or home video; cult classics like Office Space (1999) or Clerks (1994).  Now it is possible for a film to bypass the pressure of a theatrical exhibition and be seen almost immediately on whichever format a person chooses.  This is especially true of documentaries, which can be seen on anything from movie screens to YouTube without losing any of their impact.  Director Kevin Smith saw the potential in this multi-platform model of release and decided to self-distribute his most recent film, Red State (2011), outside the Hollywood system.  The results of the release were mixed, but Kevin Smith did make waves with the attempt, and it has made multi-platform distribution as viable a trend as anything else we’ve seen in the past few years.
Another surprising thing that technology has done to the film industry is change the way films are both made and processed.  Digital photography has advanced so much that it’s oftentimes hard to tell whether a movie was shot digitally or not.  Digital projection has certainly taken over cinemas completely, as it’s now hard to find a place that still runs film prints; another sad change, where a film stops existing as a physical thing.  But digital projection has been around long enough that audiences no longer see any real difference, unless they have a trained eye.  The same goes for digital photography.  Digital cameras are now able to shoot in such high resolutions that they can actually exceed the clarity of regular 35mm film.  This has enabled some new advancements in the presentation of movies, like digital 3D and 48 frames per second.  While unique, these trends are sometimes just a gimmick, and usually depend on the quality of the film to work for an audience.  But the trend has moved in favor of digital photography for a while now.  Only a few filmmakers, like Quentin Tarantino and Steven Spielberg, have stuck by traditional film, but many filmmakers with limited means who want to bypass the film-processing phase are embracing the new technology with great enthusiasm.
This has crossed over into television as well, which has made the line between cinema and home entertainment even more blurred.  TV shows today are filmed mostly with digital cameras, and that has significantly changed the kinds of TV productions we see now.  Shows like Game of Thrones and The Walking Dead are produced with such complexity that they can be comparable in quality to a theatrical film.  This is thanks to digital camerawork that is able to replicate the clarity of film and allows for manipulation in post-production, whether through color grading or the addition of visual effects.  Years ago, there was no mistaking the difference between what a film looked like and what a TV show looked like.  They were completely distinct forms of entertainment.  Now the gap has tightened, and it’s probably what has drawn more people toward home viewing.  Can you imagine what shows such as M*A*S*H and Happy Days would be like if they were made with today’s technology?
It’s an interesting tug-of-war that we are seeing today between film and television, one that has been brought about through digital innovation.  While some things will never change, there are other trends that have clearly made things different from what we grew up with.  I, for one, have my limits on what I’m willing to embrace among these new trends, but I am pleased to see so many advancements made in the last few years.  I certainly like the increased accessibility to films that I would normally have had trouble finding.  Digital photography has also made television a whole lot better in recent years.  But I also miss the experience of working with actual film.  My years as a projectionist gave me a strong appreciation for the look and feel of a film print, and it’s sad to see it become an obsolete tool in film presentation today.  Also, while digital presentation and video streaming are convenient and innovative, the movie itself is what will make or break the investment in the end.
Ultimately, there’s nothing that beats a good time at a movie theater with an auditorium full of people.  Home entertainment may be at a high standard now, and techniques like 3D and high frame rates may be eye-catching, but it ultimately comes down to the human factor.  I enjoy watching a movie, no matter what technology is behind it, as long as it remains entertaining.  And that’s an experience that will always be timeless, even if ticket prices keep climbing to painful, astronomical heights.