Counter-Programming – The Simple Pleasures of Art House Cinema

When we go to the movies, we are above all looking to be entertained, and Hollywood is usually expected to deliver on that front. However, even with millions of dollars invested and hundreds of man-hours spent in production, there's still a good chance that whatever Hollywood puts out from week to week will fall flat. You could probably chalk that up to the homogeneous nature of the business, where studios try to copy one another's successes and few chances are taken. How many loud and vacuous action films are we presented with every summer? Eventually, the movie-going public grows tired of the same old thing and wants to see something new on the big screen. This has led to a special niche market in cinema called the art house scene. An art house cinema is usually a small venue, sometimes made up of only one or two screening rooms, that presents films made outside of the studio system and/or on a smaller budget. Commonly, an art house cinema is the only place in your local community that screens international, foreign releases. Also, if you are an up-and-coming filmmaker, an arts cinema might be where all your hard work finally receives its first viewing. What I believe makes art cinemas special most of all is that they provide a welcome alternative to the multi-screen cineplex experience. While an art house may be quieter and classier, it's still no less of a place to be entertained. Art house cinemas give us the ability to discover and enjoy something new, and serve as a welcome communal ground for cinephiles and casual viewers alike; bringing the idea of cinema alive in ways that the big guys can't.

Art house cinema has been around for a while, but it didn't become a common thing until the latter half of the last century. The studio system more or less kept all theatrical presentations under its strict control up until the 1950's. Until that time, movie theaters were contracted to release only whatever the studios were making. In addition, back in those days, movie-going was a more casual experience, with people coming and going as they pleased, not caring if they had seen the program in its entirety. But with the dissolving of the old studio system in the 50's and 60's, local cinemas were freed up to showcase whatever they wanted. Most often they would still showcase a studio film, but if the demand was there, a community cinema could show something out of the ordinary. Usually this would be an internationally acclaimed film from abroad, like the works of Japanese filmmaker Akira Kurosawa or Swedish director Ingmar Bergman. This period also saw the rise of the independent film. Independent cinema became a great way to tap into the changing cultural landscape in America and spread it across multiple markets. While most Middle American cinemas stuck with the same old studio releases, many local theaters in big cities and college towns across the country started to specialize their programming around audiences wanting to see these new, progressive films. Thus we saw the beginning of specialty art house theaters. Some presented films that you could instantly classify as art, while others were clearly geared towards exploitation. Whatever class of audience these theaters were catering to, there's no doubt that it was a change that would never go away.

There was, however, a time when art house cinema did see a decline. In the 1970's, Hollywood began to embrace independent filmmakers and brought them more into the mainstream. No longer were the likes of Francis Ford Coppola and Martin Scorsese working outside of the system; in these years, they became the system. Cinema as a whole changed dramatically during this time, and the line between independent film-making and studio film-making became increasingly blurred. With the arrival of Steven Spielberg and George Lucas in the mid-to-late 70's, we began to see the rise of the blockbuster, with the phenomenal premieres of Jaws (1975) and Star Wars (1977). This era again changed the way films were presented, with the beginnings of the multiplex business. Now it was commonplace for one venue to present multiple movies at the same time, sometimes on 10 screens or more, depending on the demographics of where you lived. Unfortunately, with the arrival of the multiplex, we saw a rise in movies made within the system and a diminishing of the independent film market. Exploitation films all but vanished during this time, and foreign releases became much more of a niche market; increasingly harder to find in smaller communities. Multiplexes would certainly dominate the film-going experience in the years to come, and the convenience of their availability would keep Hollywood secure in its ability to reach audiences all across America. But the art house was not blotted out completely, and it would come back in a big way.

In the late 80's and early 90's, independent film-making made a comeback, as cinematic tools and knowledge had become more accessible in the years since the last independent boom. New voices like Richard Linklater, Gus Van Sant, Kevin Smith, and Quentin Tarantino rose out of this movement, and even though their popularity has earned them spots in the mainstream, they still maintain their independent spirit to this day. This period also saw the rise of the festival circuit, which started largely with Robert Redford's Sundance Film Festival; a showcase primarily meant to highlight independent filmmakers. What is great about the independent film festival circuit in America is that it's almost entirely hosted by these art cinemas across the country. Sure, Cannes and Toronto hosted film festivals long before Sundance, but those showcases haven't tapped into the art house experience in the same way. In many ways, you need the intimate atmosphere of an arts cinema in order to really appreciate a unique, independent art film. Sundance brought that to the attention of Hollywood, and in the years since, the studios have actually invested in the art house market by creating their own independent off-shoots, like Universal's Focus Features, Paramount Classics, Fox Searchlight, and Warner Independent. Not only are film festivals and art house premieres a great place to show off something unique, they are also great places to try out a potential awards winner. In fact, many recent Best Picture winners at the Oscars have come from the art house market, like 2011's The Artist. It's one of the reasons why arts cinema has matured to the point of being a permanent fixture in the cinematic community.

So what makes going to an arts cinema so different from attending a multiplex, given how much cooperation there now is between the two? For one thing, art house cinemas not only present audiences with unique, outside-the-norm films, they also go out of their way to preserve some of the old-fashioned cinematic experience as well. Indeed, most art house theaters are actually remodeled from the old movie houses of yesteryear. Long before multiplexes became the standard, films were primarily showcased in large and often ornate auditoriums, much like theaters built for stage productions. In the years that followed the arrival of multiplexes, many of these old “movie houses” fell into disrepair and/or were re-purposed into something else; forgotten and lost to time. But thanks to the rise of the art house scene, many of these old movie houses were spared and given new life. With the help of independent investors and a dedicated community of movie fans, you can find many of these places preserved and maintained as a unique cinematic experience. Because of this, art cinemas have managed to present a look at the future of cinema while still honoring the past. In this sense, art cinema takes on a whole new meaning, as the theaters themselves could be classified as works of art. And the fact that these old-fashioned structures are used to showcase films that could generally be seen as button-pushing is also a subversive treat. It's interesting to think that you may be watching something as disturbing as the new Lars von Trier film in the same place where your grandparents watched Shirley Temple sing and dance long ago. And for many cinephiles, that is indeed one of the many pleasures of the experience.

However, art cinema is not just limited to preserving an old-fashioned cinematic treasure. Indeed, if you have the money and the passion to create, you can turn almost anything into an arts cinema. This has been true in college towns across America for years, where arts cinema has always been alive. College students benefit from having a community that honors intellectual curiosity, so their demand for independent cinema has enabled film distributors to cater to this audience. And if there is no infrastructure in place to serve this group of cinephiles, no doubt some bright entrepreneur will find a way. That's why in many college towns across America you will find art cinemas built into re-purposed buildings that were never meant for movies before. These include old office space, restaurants, warehouses, and even schools. In my hometown of Eugene, Oregon, the local arts cinema is actually built out of a defunct Episcopalian church, complete with the arched ceiling still intact. Seeing a movie in one of these places adds to the whole cinematic experience, given the uniqueness of the surroundings. But it's not just unique places like a re-purposed church where you'll find art cinemas thriving. The rise of the cineplex also led to an inevitable decline, which brought the closure of many cineplex theaters across the country. Again, independent investors stepped in, and some of these old multiplexes now specialize in art house films. That's become the case in Los Angeles, where I currently live, as independent theater owners like the Laemmle company have brought art films to old, defunct multiplexes. Overall, arts cinema has become an avenue for any entrepreneurial movie fan to try out new things and make cinematic experiences unique for anyone looking to find them.

Art house cinema has also become the perfect place to present more than just movies. It has become the home for communal activities centered around the movie industry. Usually, an art house cinema is where filmmakers and movie-goers can interact, either one on one or through moderated Q&As. Film festivals commonly present these interactions, but sometimes a special appearance can be made as part of a lecture within a college seminar, or a premiere screening. If these experiences can be found in your local arts cinema, then it is a one-of-a-kind experience well worth taking. Here in LA, there is a special program called American Cinematheque, which presents special screenings in old Hollywood movie houses, complete with in-depth analyses by film-makers and film scholars alike, all with the purpose of educating the public about the importance of cinema. It's a worthwhile, year-long program that I recommend people check out, and if there are similar programs held in your local community, all the better. But unique communal experiences in an art house theater aren't just limited to education. Sometimes it can be a party too. The Rocky Horror Picture Show (1975) has become one of the biggest cult films of all time, largely because it has been in continuous release for almost 40 years as a staple of the art house circuit and an interactive experience for its audience. Not only do audiences watch the movie, they participate in the action on screen, singing along and even reenacting moments in the film as they happen; and this unique experience has been endorsed and promoted by the art house scene for years. It's also the only place where you'll be able to find it. Once again, one of the many unique pleasures of an art house cinema experience.

But why do we love art cinema so much today, even more than the tent-pole releases that come out of Hollywood? I think it's because it offers a much needed alternative. Variety is what keeps the industry alive, and indeed, when awards season comes around, it's the small independent market that is put into the spotlight. Art cinema still holds only a small slice of the whole Hollywood pie, but it's one that nowadays is central to the whole piece. If anything has been lost in the shuffle, it's the niche markets catering to very specific audiences. Exploitation films have become a thing of the past, and noble attempts to recapture that experience, like Quentin Tarantino and Robert Rodriguez's Grindhouse (2007), still come up short. Also, with more involvement by major studios in independent film-making, are we truly seeing movies that are as groundbreaking as they used to be? Whatever the case, an art house cinematic experience is still a great alternative to the same old multiplex junk, and more often than not, you'll still watch something out-of-the-ordinary there. What I love best are the little discoveries; a movie that you never thought much of before that, after seeing it in an art house cinema, ends up changing your life. I've certainly had my fair number of experiences with these surprising gems over the years, like 2008's In Bruges or 2011's Drive. In many cases, I found these special discoveries for no other reason than that I wanted to watch them on a whim, and that's the kind of specialty that art cinemas offer. I don't believe I would be the same kind of film buff I am now had I not had a local art house theater in my community. So, if you're not interested in yet another dumb Transformers movie, I recommend that you search out an arts cinema in your area and find something more interesting to watch. That life-changing film could very well be in an art house theater near you.

The Good Old West – Why Modern Revisionist Westerns are Failing

If there has been a literary and cinematic contribution to modern society that can be classified as distinctively American, it would be the Western. Just as Shakespeare is to “merry olde” England and anime is to Japan, the Western has become America's most impactful addition to world culture, without ever losing its national identity. And like most other international contributions to popular culture, it has evolved over time, while still maintaining its genre characteristics. No matter the setting or the circumstance, a Western will always involve characters exploring an untamed frontier and will usually center on a protagonist who is the very definition of a rugged individualist; more often than not, a gun-toting cowboy. While the American West was naturally the setting for these stories over the years, the thematic elements of the genre don't necessarily need to be tied to it. The amazing thing about American Westerns is how much of an impact they've had on other forms of cultural art over the years; sometimes in unexpected places. Akira Kurosawa, for one, drew a lot of inspiration from American Westerns when he made samurai films like Seven Samurai (1954) and Yojimbo (1961), both of which were then reimagined by Hollywood, becoming The Magnificent Seven (1960) and A Fistful of Dollars (1964) respectively. But as society began to change in the later part of the 20th century, so did the genre, and a need arose to reexamine what the American West was really about. Thus we got the era of the Revisionist Western, which has defined much of the genre for the last several decades. But with the recent failures of movies like Cowboys & Aliens (2011) and The Lone Ranger (2013), as well as this year's A Million Ways to Die in the West (2014), is it possible that this era of revision is coming to an end?

The Western has become popular the world over, but what exactly is it about the genre that we like? I think it's the idea of the frontier that we find so appealing. It puts into perspective how little an impact we have individually in the world, thereby raising the tension when that same world tests you. Because of this, the Western hero came to be defined by very outsized personalities, and this is probably why so many of them are still admired today. When Westerns became a staple of the rising Hollywood market, the actors and filmmakers responsible for making them likewise became larger-than-life figures in modern culture. John Wayne, Henry Fonda, Clint Eastwood, Lee Marvin, and Jimmy Stewart; these men became the faces of the American West to the culture at large, and they still embody the ideal of the rugged individualist today. Likewise, whenever someone wants to recreate the image of the American West in a painting, a photograph, or a film, they will usually follow the visual eye of John Ford, Howard Hawks, or Sergio Leone. Orson Welles once said that he found the visual inspiration for his iconic Citizen Kane (1941) by watching John Ford's Stagecoach (1939) over and over again. Because of these men, we have a definitive idea of what a Western is, and what it can be. But even the great masters weren't tied down by the confines of their own genre. The Western is something that can be reinterpreted many times over, and filmmakers like Ford and Hawks used their movies to tackle a variety of issues in society, including racism (The Searchers), paternal abuse (Red River), and civil rights (Cheyenne Autumn). But Westerns would go through a whole new phase once people who grew up on them began to turn their own eyes on the genre.

The 1960's marked the beginning of the counter-culture movement, which changed the outlook on American culture in general. The Western was reexamined during this time, and many new filmmakers saw the glorification of the American West in these films as a bad thing. To many of them, the rugged individualist embodied by actors like John Wayne represented a view of America that looked backwards and was impervious to change. The plight of the American Indian became a popular theme in this period, and many modern filmmakers wanted to highlight that untapped perspective in their movies. One film in particular that explored this idea was Arthur Penn's Little Big Man (1970), which starred Dustin Hoffman as a white man raised by an Indian tribe in the old West. The film is a very interesting reversal of the conventions of the Western, where the native people are the main heroes and the cowboys are the villains. While the movie still focuses on a white protagonist, it nevertheless puts an emphasis on the Native American people that few films had up to that point, and it was a revision that was very welcome indeed at the time. If you haven't seen it yet, do so. It's a very interesting and surprisingly funny movie at times, and it treats the Native American characters humanely, while at the same time making them flawed and complex as individuals. What Little Big Man also represented was a shift in the genre that would go on to define the Western for many years to come. Suddenly, revision became the popular way of telling a Western story, though if you look at each film individually, you can still see the inspirations of past masters at work.

The most popular kind of revisionist Westerns at this time were also the bloodiest. Sam Peckinpah took the Western to a whole other level of brutality when he created his classic The Wild Bunch (1969). This film resonated with audiences because it seemed to reflect more accurately how the world really was. In The Wild Bunch, there are no clear winners or losers. There was no nobility in the rugged individual in this movie; everyone was a dirty, rotten scoundrel. Moral relativity defined whom we were rooting for, since all the characters were flawed in some way. And with a bloody shootout at the very end that puts all other shootouts to shame, we saw a reflection of the true brutality of violence in the old West. The fact that this movie came out during the height of the Vietnam War was no accident. Peckinpah wanted audiences to see how brutal gun-fighting is, and to show that the gung-ho attitude the American soldier picked up from watching the Westerns of the past was probably not the best thing to bring into battle. Other negative aspects of the old West were also reexamined during these heyday years of the Revisionist Western, including the awful history of racism. This was the focal point of Mel Brooks' classic Western comedy Blazing Saddles (1974). Blazing Saddles manages to effectively break down the issue of racism in a Western by exploiting the hell out of it. No other film mocks the conventions of the Western more effectively than Saddles, and it is still one of the funniest movies ever made. And while these movies attempt to break apart every traditional Western convention, they still manage to hold up as effective Westerns themselves, which shows just how resilient the genre is.

However, over time, even a revisionist view of a genre tends to lose steam. While this type of re-interpretation continued to define much of the genre's output in the last few decades, even through the hands of one of its icons (Clint Eastwood and his Oscar-winning Unforgiven (1992)), there came a point where the Revisionist Western itself became commonplace. I believe this started around the time Dances With Wolves (1990) came into theaters. While Dances With Wolves was an enormously popular movie, and a winner of multiple Oscars, it has unfortunately lost some of its luster over time, mainly because we've grown too accustomed to movies of its type. Kevin Costner's film depicts the life of an American soldier sent out West to live among the Native American tribes of the Western plains. The film, while still having its heart in the right place, today seems a little too heavy-handed in its messaging, and at times almost pandering to the Native American audience. The noble white man character has unfortunately become one of the less effective elements of the Revisionist Western, mainly because it takes away from the voices of the actual native people themselves. What started with Dances With Wolves has continued on through other films in the genre, and it has made the Western more predictable and less exciting over the years. I understand the inclination to show the plight of an oppressed people through the eyes of an outsider, but in the end, I think the decision is only made because Hollywood believes that Middle American white audiences won't accept a movie unless they see someone they can identify with in it, especially if that someone is also a bankable star. But more likely, the big problem with these movies is that they put message over story, and in the end, that's something that will hurt a film, no matter the genre.

For some time after Dances With Wolves and Unforgiven, the Western began to disappear from movie theaters. This led many filmmakers to try to revive the genre by doing different things. These attempts included mash-ups like Cowboys & Aliens, which put a sci-fi spin on the traditional Western aesthetic, as well as mega-budget flops like Wild Wild West (1999) and The Lone Ranger (2013), both of which seemed to think that a lot of eye candy would be enough to bring in audiences. Instead, they only made more people wary of the genre. Surprisingly, the reason for these movies' colossal failures has actually been audiences' desire for authentic Westerns. The traditional Western, with all its flaws, has become desirable again to many viewers. This is reflected in the fact that the only Westerns that have been a success in recent memory are remakes of past classics. These include 3:10 to Yuma (2007), headlined by Russell Crowe and Christian Bale, which was a remake of the Glenn Ford classic, as well as the Coen Brothers' 2010 remake of the 1969 John Wayne classic True Grit. Both films are traditionalist Westerns right down to their DNA, albeit with modern flourishes. But what is surprising about these films is their incredible runs at the box office, both making well over $100 million domestic. True Grit (2010) is in fact the highest grossing Coen Brothers movie ever, making more than all their previous movies combined. Could this be the beginning of another reversal in the genre, or is it just a reflection of how well made these two remakes are?

I think audiences are indeed beginning to re-embrace the traditional Western. This is reflected in the renewed popularity of older Westerns, as well as in the remakes that we see today. John Ford's The Searchers saw one of the biggest jumps ever in recognition by the industry when AFI made an updated list of their Top 100 movies; moving from #96 to #12 in ten short years. Other people are also claiming Westerns as among their favorites, and even the most successful revisionist Westerns today are ones that still honor the traditions of the older classics. I'm sure Quentin Tarantino's true intention behind the making of Django Unchained (2012) was to make an exciting Western, and less so to make a statement on slavery. That's something that all the great Westerns have done in the past: be entertaining. When a movie becomes too concerned with rewriting the conventions of the genre (Wild Wild West), or tries to mock it without understanding what the punchline should be (A Million Ways to Die in the West), it leaves audiences cold and more inclined to just return to what they liked in the first place. As Mel Brooks has said before, “We mock the things we love,” which means that to make a good revisionist Western, you have to be a fan of the genre as well. In many ways, deconstructing the Western is part of what has kept it alive all these years. Some revisionist takes are now considered defining representations of the genre, like The Wild Bunch and Butch Cassidy and the Sundance Kid (1969). But any revision that doesn't respect the past is doomed to be forgotten, and unfortunately that's what has defined most recent Westerns. If anything, my hope is that the classics will endure and still give inspiration to aspiring filmmakers, so that the Westerns of the future will keep the spirit of the West alive.

Game Over – Why So Many Video Game Movies Fail

Just as going to the cinema has become a cherished pastime for generations of people, a whole new type of media is beginning to take charge and become an even larger part of our everyday lives. For forty years now, we've seen the rise of video games, from crude blocks of color on the TV to full-blown blockbuster releases that rival what's coming out of Hollywood today. It's quite an eye-opener when you see the newest Grand Theft Auto title out-grossing every film released in the same year, but that's what's happening in our culture today. Of course, Hollywood has taken notice, and really they've been trying to figure out gaming culture since its very inception. The only problem is that there is no easy way to translate a video game experience to the big screen. When we watch a movie, we expect the story to guide us to a conclusion, but when playing a video game, we're the ones who guide the story. Sure, there are narrative-driven games, but many others are built around the randomness of our own choices, and that's what makes them unique. Now that game programming has become as sophisticated as it is, video games are starting to eclipse what film-making is capable of. With this kind of popularity, it's only natural that Hollywood would want to capitalize on it. The problem is that by doing so, they lose some of that unique experience that video games give us. Not surprisingly, most video game-inspired movies have failed over the years, either because they try too hard to be like the original game, or try too little, or are so removed from the original concept that they become something else entirely.

When looking at all the problems that video game movies have, it helps to see where things went wrong at the very beginning. In the early years of video gaming, the titles coming out were very primitive. It wasn't until titles like Donkey Kong came onto the market that you could see any semblance of narrative. Naturally, in these early years, Hollywood became more interested in gaming culture than in the games themselves. Back in those days, nobody thought that 3D graphics and online play would be a reality, so everything was more or less about getting the highest score. That lack of foresight may be why Hollywood never jumped headlong into video game culture, and as a result, they've seen game development become competitive with their own industry. The 1980's showcased some examples of this exploitation of the culture, as many films featured arcades as popular hangouts. One film in fact centered entirely on the arcade subculture of the 80's: 1989's The Wizard, starring The Wonder Years' Fred Savage. The Wizard offers an interesting window into how the world perceived video games years ago, but it's also firmly a product of its time, treating a video game contest no differently than any other overcoming-the-odds narrative of the era; the game itself was irrelevant. However, there was one film at the time that actually did explore the possibilities of the gaming world: Disney's Tron (1982). The film explored the idea that a video game could be a fully interactive world inhabited by simulated people based on our own selves. In other words, it depicted something like the MMO (massively multiplayer online) games, such as World of Warcraft, that we all know today. Though limited by what was possible in its time, Tron has proved to be a very forward-thinking film, and naturally something that groundbreaking ended up being a failure at the box office.

Hollywood began to take video games seriously at the point when games started to have characters and narratives that people gravitated towards. The early 90's brought us the beginnings of video game franchises, with the likes of Super Mario Bros. and Sonic the Hedgehog among the biggest names. And while Hollywood was smart enough to jump on board and bring some of these popular characters to the big screen, they unfortunately didn't know exactly how to do it. When you look at the original Super Mario game, you can see how difficult it was to translate. Basically, it's a red-suited man jumping into pipes and breaking blocks with his head; not much to draw from to make a 90-minute film. When 1993's Super Mario Bros. made it to the big screen, it was instantly slammed by critics and fans of the game alike. With no clear idea of how to adapt the Mario game accurately, the filmmakers just threw in every weird idea they could think of, making the finished movie an incomprehensible mess. Instead of the big-headed Goomba minions from the game, we get large-bodied, small-headed Goomba guards who look more creepy than silly. Not only that, but we also get actor Dennis Hopper looking all sorts of confused while playing King Koopa as a sort of lizard-human hybrid. The only thing the movie got right was the casting of Bob Hoskins as Mario, who does indeed look and act the part well. What Super Mario Bros. represented most was a prime example of Hollywood not understanding what a video game was and how to make it work as a film. This would be symptomatic of most of the 90's video game adaptations, and indeed of most if not all adaptations thereafter.

The big problem with video game adaptations today is that it’s impossible to make something interactive feel the same in a non-interactive form.  For the most part, a video game translation ends up feeling like a cut-scene that never ends.  For gamers, the narrative is there to move them from one level to the next, all with the goal of reaching the end and beating the game.  Of course, that doesn’t mean that the story is irrelevant.  In fact, most video games have very complex and involving story-lines, particularly those that have come out in the last decade or so.  The reason why video game movie narratives suffer is the limited run-time.  Movies are allowed on average only about two hours to tell their entire story, so if you take a video game story that takes ten times that long to unravel and condense it into a film narrative, you’re going to lose quite a bit in the process.  As a result, most video game adaptations lack character development and spend way too much time setting up their worlds.  A prime example of this would be Final Fantasy: The Spirits Within (2001).  The Final Fantasy series is heralded by both gamers and casual fans for its colorful characters and its complex story-lines.  But the reason why these games are so complex is that they take many hours to complete; sometimes 50 hours or more.  If you take 50-plus hours of development and try to condense that same kind of complexity into a two-hour film, you’re going to get something that’s crushed tighter than bedrock.  That’s the fatal flaw of The Spirits Within, a film so concerned with its world’s complexity (and with being one of the first movies ever to utilize photo-realistic CG human characters) that it leaves everything else by the wayside, making the whole film feel very hollow, particularly with the characters.
While many of these films nobly try to translate beloved story-lines to the big screen, there’s just no possible way to contain it all.

Another problem is that many video game movies try too hard to be just like the titles they’re adapting.  This is usually evident in some of the films’ characterizations.  Video games can sometimes get away with generic, archetypal characters, because by playing the game we infuse ourselves into the story, so a blank-slate protagonist is usually a good thing.  When it comes to the movies, however, a lack of character can pretty much sink a film.  This is especially painful when characters that people love in the games are translated so poorly by under-qualified actors.  The Mortal Kombat films in particular give us the right look of the characters with no other depth beyond that.  That’s the unfortunate result we get from adapting something as simple as a fighting game.  There’s little character development to begin with, so if you take that directly to the big screen and do nothing to build upon it, you’re going to get very bland characters.  The same can be said for pretty much every other video game movie out there.  Another way a video game adaptation can fail by trying too hard is in capturing the look of the video game.  This is especially true of more recent video game movies.  With the advances in CGI over the years, simulated reality is becoming ever more convincing, and the line between video game graphics and cinematic graphics grows ever thinner.  Because of this, the limitations of film-making again come into play.  A video game lets the visuals unfold organically and lets the player examine it all at their own leisure.  A movie has to cut around and limit what the viewer sees.  That’s why video game movies that try to look so much like their source material, like 2005’s Doom and 2006’s Silent Hill, suffer; that interactive element is removed, making the viewer feel less involved.

Not that every video game adaptation has been a complete failure.  There have been exceptions over the years that have managed to make a dent at the box office, even if it’s a minor accomplishment.  Paul W.S. Anderson’s Resident Evil series has been going strong for six films now, some of which people say are actually better than the more recent games in the waning series.  Also succeeding are the Tomb Raider adaptations, both of which star Angelina Jolie as the popular archaeological adventurer, Lara Croft.  What I actually find interesting about the film Lara Croft: Tomb Raider (2001) is that it represents how to do a video game movie right.  While no masterpiece, the film is nevertheless competently made, finding a way to make Lara Croft work as a character on the big screen in a stand-alone film.  The movie doesn’t try to recreate the video game experience (how could it, with those PlayStation 1 graphics?); it merely translates the character’s personality into a narrative that can be told cinematically.  It doesn’t put the cart before the horse like so many other video game movies do, and it lets the character be the star rather than the world she lives in.  Naturally, because of this, Lara Croft: Tomb Raider is one of the few films based on a video game that has been profitable.  Of course, it helps that the world it’s adapting is not particularly complex.  Sometimes a modest title is the best kind of game to adapt, which is probably why shoot-em-ups like 2007’s Hitman, or a racing game turned movie like Need for Speed (2014), are popular adaptations today.

Of course, sometimes the opposite comes true, when a filmmaker or studio takes something that could translate perfectly to the big screen and uses the completely wrong approach.  That was the case with the Disney-produced Prince of Persia: The Sands of Time (2010).  Prince of Persia was already a very cinematically infused game before this movie came out, so Disney should’ve had an easy time translating it to film.  Unfortunately, with a miscast lead (Jake Gyllenhaal), an unfocused director (Mike Newell), and an out-of-control production budget (est. $200 million), the movie failed on every level and sullied not just the reputation of its creators, but the Prince of Persia brand overall.  No game in the series has been released since, and probably never will be, at least until the memory of the movie has faded and the demand for the game returns.  At the same time, there are also people out there who adapt video games only as a way of exploiting them.  Sometimes it’s as harmless as an up-and-comer who wants to showcase what they can do by taking a little-known video game title and putting their own spin on it.  And then you have someone like Uwe Boll, whose whole career has been defined by cheap adaptations of video games like Bloodrayne (2006) and Alone in the Dark (2005).  The only reason he adapts video games, it turns out, is a tax loophole in his native Germany that lets him make more money off failed adaptations of licensed games.  So, not only is he getting rich off of bad movies, but he’s also trashing games that people have loved for many years, making it the worst kind of exploitation.  The wrong approach usually ends up being worse than a confused or bland adaptation, because in the end, it’s the original games, and the legacy they carry, that suffer.

So, even with all the failures that have come in the past, will there ever be a video game movie that actually becomes a huge hit?  I would like to see it happen, but it probably never will, because there are just too many fundamental differences between the two mediums.  For one thing, a movie can never capture the interactive experience that a video game presents.  And with video games growing ever more visually complex, it’s clear that Hollywood filmmaking is starting to face some tough competition.  But, to the industry’s credit, it has found a way to embrace video game culture over the years and make it a part of itself.  Many studios have their own software development departments, and it’s very common to see tie-in video games released alongside major Hollywood releases.  Even so, Hollywood still hasn’t given up on trying to make a big-screen translation of a video game work.  Adaptations of Assassin’s Creed, World of Warcraft, and even Angry Birds are in the works, though I highly doubt any of these will feel exactly like the original games.  In reality, I think Hollywood is better off looking at what the games mean to us, rather than taking a literal translation approach.  I strongly recommend films like Tron and Wreck-It Ralph (2012), both really fun movies that look at video games as lived-in worlds rather than as a form of diversion.  I also strongly recommend the documentary The King of Kong: A Fistful of Quarters (2007), which brilliantly presents the impact video games have on our culture.  One day, Hollywood will figure out the formula and hopefully deliver a great video game adaptation.  I can tell you this: I’ve been waiting my whole life for that Legend of Zelda movie, and that wait is still going strong.


The Gospel According to Mel – “The Passion” Ten Years Later and Bringing Scripture to the Big Screen

melpassion

Often we see a renowned filmmaker and/or a movie star step off the pedestal that the entertainment business has set them upon in order to make something that is not only risky, but could also jeopardize all the goodwill that they have earned in their career.  I put together a top ten list of these kinds of “passion projects” before, but one that has certainly left an impact over the last decade, on both the industry and on its creator, is Mel Gibson’s The Passion of the Christ (2004).  This year marks the 10th anniversary of this controversial film, which may be a milestone of celebration to some and a dark chapter for others who wish to forget.  No matter what your opinion is of the movie, you cannot deny that it is one of the most monumental films of the new century, and its legacy will probably be felt for a long time to come.  But, for the most part, that legacy centers more around the controversy surrounding it and less around how it stands as cinematic art.  No doubt Mel Gibson himself has been unable to shake the legacy of this film, and all the divisiveness surrounding it; for better or worse, it will be the movie that defines his career in Hollywood.  Looking at the ten years since The Passion’s debut, we have learned a lot about how difficult it is to take holy texts and bring them to the big screen.  Did Mel Gibson’s film prove that biblical stories can indeed work in movie form, or did it show that it’s better to keep religion out of entertainment?
In order to understand why Mel Gibson would risk his reputation over a single movie, you have to understand the conditions that led up to its production.  Long before The Passion, Mel tried to segue his acting career into directing, starting off with 1993’s The Man Without a Face.  This was a modest production that earned Mr. Gibson some good praise, but considering that Mel was mentored early in his career by visionary and ambitious Australian directors like George Miller (The Road Warrior) and Peter Weir (Gallipoli), he had something much more epic in mind.  Naturally, his follow-up was the groundbreaking Braveheart (1995), which earned Gibson Oscars for both Best Picture and Best Director.  After Braveheart, Mel returned to acting regularly until the early 2000’s, when he decided to bring a story near and dear to his heart to the big screen: the story of Christ’s crucifixion.  Raised in an ultra-traditionalist Catholic household, it was no surprise that Mel would look to scripture for inspiration, and while nobody doubted that he could pull it off cinematically, concerns about whether or not he should soon arose.  It wasn’t until the script was made public that the controversy around the film started, given that people interpreted it as anti-Semitic.  Mel’s project was dropped by all interested parties as a result, and he ended up funding the project with his own money.  The movie eventually made it to theaters, and despite all the controversy, or perhaps because of it, The Passion of the Christ became a box office phenomenon, earning $83 million on opening weekend and $370 million overall.
Despite what Mel intended for the film, its aftermath took on a life of its own.  It became a focal point in what many people call the “culture war” in America, which in turn politicized the whole controversy surrounding the film.  The “culture war” is basically a term created by the news media to frame political arguments related to pop culture, and to show a cultural divide between the left and the right in America, whether there is one or not.  Given that The Passion was released in 2004, which was also an election year, the movie became sort of a rallying point for both political camps, with Christian conservatives seeing the movie as a powerful affirmation of their beliefs, while liberals were almost universally opposed to the movie, calling it religious propaganda.  There were people who did break ranks from ideology and judged the film on its own merits; Christianity Today, a faith-based publication, was sharply critical of the movie when it premiered, while left-wing film critics Roger Ebert and Richard Roeper both gave the movie two thumbs up and stood by their reviews many years later.  Nevertheless, reactions to The Passion divided America, probably more so than they should have.  It became a political tool, which I believe is something that Mel never wanted it to be.  Though Mr. Gibson leans to the right politically, he’s never exactly been a dyed-in-the-wool Republican icon; for the most part, he’s been sharply critical of all political parties his whole career.  The movie becoming a lightning rod for this so-called “culture war” is probably the legacy that Mr. Gibson wishes the film had avoided.
But, regardless of intent, Mel Gibson had to have known that the movie was going to upset people no matter what.  This is the risk that comes with adapting scripture to film.  There are always skeptics out there who will dismiss biblical stories as nonsense, as well as others who take every word as, well, gospel.  Naturally, if you make an earnest attempt at bringing scripture to the big screen, it will be scrutinized, especially if it strays from expectations.  You see this in other modern attempts at adapting stories from the Bible.  Martin Scorsese’s The Last Temptation of Christ was sharply criticized by people of faith for its depiction of a “what if” scenario where Jesus chose life instead of sacrifice.  In the movie, Christ still dies for mankind’s sins like he does in the Bible, but Scorsese lets the film explore the idea of how Jesus might have struggled with that choice.  Opening up that dialogue proved to be too much for traditionalist Christians, who condemned the movie as blasphemous.  A similar controversy is brewing right now over Darren Aronofsky’s Noah (2014), with Christians once again attacking a film over its revisions.  But despite all of the controversies, I believe that each of these films has more in common than people think.  Again, I believe that it’s all the nonsense about a “culture war” that has shaped the divided responses to these movies.  Overall, they each represent an expression of faith on the part of their respective filmmakers, and each shows how the cinematic medium can find stories that are interesting and complex in a source as widely familiar as the Bible.
You may be wondering what I actually think of Mel Gibson’s The Passion, especially looking back on it now after over ten years.  To put it simply, it’s an easier film for me to respect than to admire.  I do think that it is a triumph of film-making, showing Mel Gibson’s unparalleled talent as a director.  I am amazed that the movie was self-funded and completed on just a $30 million budget.  It was released around the same time as big-budget epics like 2004’s Troy and Alexander, and yet feels more authentic to its time period than those two ever did, even with their $200 million-plus budgets.  The film is also gorgeously crafted, shot by one of the world’s greatest cinematographers, Caleb Deschanel.  Actor Jim Caviezel shines in the role of Jesus, bringing new meaning to the phrase “suffering for his art.”  Where the film is at fault, though, is in its story.  I know it’s odd for me to critique the “greatest story ever told,” but my problem has more to do with Mel’s interpretation.  Like Mr. Gibson, I was raised Catholic (albeit in a less traditionalist church), so I know all the important points of the story by heart.  Where the movie loses me is in how it’s all focused.  Mel just lets the events of Christ’s crucifixion play out without grounding them in a narrative.  The story pretty much just goes through the paces, indulging more in the grim details than explaining exactly why they are happening.  This leads to a lack of character development that sadly makes most of the supporting players feel less interesting.  The only standouts in terms of character are Caviezel’s Jesus, actress Maia Morgenstern’s outstanding portrayal of the Virgin Mother Mary, and a chilling interpretation of Satan by Italian actress Rosalinda Celentano, who taunts Christ by taking the form of a mother figure.
I do remember seeing the movie with family back when it first premiered, as well as the hours-long conversation we had about it afterwards.  While we were moved by the movie, I don’t think it had any kind of effect on our religious beliefs.  To be honest, I’ve moved further away from the Catholic church in the years since, but not as a result of this movie.  I still respect the risk Mel took to make it, and I’m glad the movie exists.  As for the anti-Semitic undertones that people claim the movie promotes, I have a hard time seeing them.  Sure, there are people who see the depictions of the Hebrew high priests in the movie as problematic, but to me the priests depicted in the film are so far removed from modern-day Jews that I don’t see the two as even remotely comparable.  Not only that, but the movie does go out of its way to portray the Roman guards as the true villains in Christ’s story.  If there is any criticism leveled against the film that has merit, it’s in the way Gibson indulges in the suffering of Jesus in his final hours.  The movie shows you every cut, gouge, and impaling inflicted on Jesus during his execution, and it is literally the focus of the entire movie.  It could be argued that Mel is obsessed with portraying suffering and torture on film in gruesome detail, much as he did with the ending of Braveheart, and that this misses the point of Christ’s teachings in the first place.  While I don’t think Mel intentionally misinterpreted Biblical passages in order to indulge his own cinematic passions, the film nevertheless is defined more by its gruesome elements than by its uplifting message.
In the ten years since, people have been trying to interpret exactly what was meant by Mel Gibson’s film, and what it means for the future of scriptural film-making.  Unfortunately, Mel’s personal problems have clouded the reputation of the film, and his drunken rants have given weight to the claims of antisemitism.  Because of the sharply divided responses born of the ongoing “culture war,” faith-based films have once again been marginalized into a niche market, choosing to preach to the faithful rather than appeal to all audiences.  The recent success of the Christian film God’s Not Dead (2014) is something I see as a negative result of the “culture war” division, because it portrays a straw-man argument that all Christians are morally right and that atheists are using education to corrupt people.  The same argument can be made about the other side, when Hollywood adapted The Golden Compass (2007) for the big screen, itself an atheistic fantasy story that portrayed religion as an evil force.  Religious films are best when they don’t insult the intelligence of viewers and actually challenge their beliefs, no matter what their faith is.  Back in the Golden Age of cinema, Hollywood found a way to make movies that faithfully adapted scripture while still maintaining a sense of entertainment.  Cecil B. DeMille’s The Ten Commandments (1956) has stood the test of time because people of all faiths enjoy the spectacle that DeMille put into his production, while William Wyler’s Ben-Hur (1959) is still beloved because of its universal story of adversity against hatred.  As these films have shown, Biblical stories can work in cinema if one knows how to reach the audience correctly.
So, while Mel Gibson’s The Passion of the Christ may have taken on a life of its own beyond what the filmmaker intended, it nevertheless remains one of the most monumental films in recent memory.  Bring this movie up in conversation and, even 10 years later, it will still stir up passionate feelings in people.  While Mel has his own moral issues to deal with, I don’t believe that he created this movie out of a need to condemn, but rather to explore his own feelings about his faith.  I think he felt there was a lack of worthwhile religious-themed films out there, and he sought to fill that gap in some way.  I think the movie stands up over time, especially compared to the lackluster, church-funded movies that have come in its wake.  It’s not the best faith-based movie I’ve seen, and certainly not one of Mel Gibson’s best either; I still look at Braveheart as his masterpiece, and his Passion follow-up Apocalypto (2006) is an underrated gem.  Even so, the best legacy this film could have left is that it sparked a renewed interest in making unique and personal Biblical films once again, something cinema has been severely lacking.  It took a while, but Aronofsky’s Noah seems to be the first film since The Passion to actually make good on that promise, though of course time will tell if it lasts.  As for The Passion of the Christ, as flawed as it may be, it nevertheless changed the way Biblical movies are seen in our modern culture and showed that taking a big risk has its rewards in Hollywood; a legacy that I think serves the movie well over time.

Box Office Duels – Hollywood’s Reliance on Copycat Movies

copycats

If you watch a lot of movies like I do, you’ll know that original concepts and ideas in blockbuster movies are few and far between.  And it’s easy to see why; Hollywood prefers to play things safe and cater to the same crowds over and over again.  This isn’t necessarily a bad thing.  After all, given how much money these studios pour into their big tent-pole productions, you can understand why they would prefer not to step out of line, in order to get most of their investment back.  But, at the same time, when you try too many times to repeat the same kind of business over and over again, the end products will lack any definition of their own and will look more transparently like a cash-in.  Sticking close to formula can only last as long as the end product stays fresh.  Sometimes, filmmakers even run the risk of unfortunate timing, with their movie ideas already being copied by another company before they are even able to get production up and running.  These are known as copycat films, and sometimes their reputations become defined only by how they perform against their like-minded counterparts.  While it is amusing to see how unoriginal some movies can be, it’s still apparent that the trend of mimicking other people’s movies is, and will always be, a part of Hollywood’s legacy.
So how do we know when a movie should be labeled a “copycat?”  It basically comes down to when we recognize that a movie exists only because of the presence of a near-identical film.  The movie doesn’t need to be exactly the same, but it should have all the same basic elements.  This could mean that it has the same plot structure with nearly identical characters; it could have the same visual style; or it could depict the same kinds of events, only from a different angle.  What is most interesting, however, is that sometimes these identical movies are released within months, or even days, of each other by competing studios.  This is what is commonly known in the industry as dueling, where the studios purposefully put their competing movies in theaters at the same time in order to see who will get the bigger numbers, purely for bragging rights.  This is also a contentious spot between filmmakers and the studio heads, because usually the people who make the movies don’t see their work as a competition.  The other area where you see a lot of copycat film-making is in the aftermath of a standout movie’s huge box office success, and all the wannabe movies that come out in its wake.  These are the “knock-off” movies, and like most knock-offs, they tend to be of lower quality.  But sometimes it’s the juxtaposition of each of these movies with its counterpart that actually makes them interesting to us.
Dueling movies are interesting because of how we judge them based on their likeness to another film.  It pretty much comes down to the “who did it better” argument, given that they are usually around the same level of quality.  The more cliched the genre, the more likely you’ll find a pair of dueling films in it.  Action movies are usually the resting ground for most of these kinds of flicks, and many times you’ll have action movies that are so alike that they are confused for one another and, as a result, end up losing their individuality.  Case in point, last year’s dueling set of movies about attacks on the White House: the Antoine Fuqua-directed Olympus Has Fallen and director Roland Emmerich’s White House Down.  Both movies follow the exact same premise and were coincidentally released only months apart.  Was it the studio system’s way of testing out the “White House attack” sub-genre on two fronts, or were the studios just trying to jump on a trend before their competitors could get there?  My guess is that, like most dueling movies, one film got the greenlight shortly after the other did, only because one studio had the script already archived and saw the opportunity to put it into production after seeing the other studio take the bite.  Essentially, both were “Die Hard at the White House” story-lines and were safe bets for both studios as genre pictures.  And it’s not the only time Hollywood has seen this happen.  Back in the 90’s, we saw the battle of the volcano movies with Dante’s Peak (1997) and Volcano (1997) released together, as well as the summer of “destruction from above” movies, Deep Impact (1998) and Armageddon (1998).
While most of these “dueling” movies tend to come from loud and dumb action genres, it doesn’t mean that all copycat movies are necessarily sub-par.  There are actually instances where two dueling movies are both high-quality films.  Case in point, the fall of 2006, when audiences were treated to two psychological period dramas centered around magicians: Neil Burger’s The Illusionist and Christopher Nolan’s The Prestige.  It’s unusual to see this kind of subject matter spawn two very similar yet very distinct films at the same time, but both movies have managed to stand out even after crossing paths at the box office.  I happen to like both films, and it’s unfortunate that their histories are always going to be tied together because of their close release window, but it does show that two movies can duel it out at the same time and still both be considered winners in the end.  Animation is another field of film-making where you’ll see studios purposefully trying to undermine each other’s fresh ideas, but still with genuinely good products.  In 1998, we saw the release of not one, but two computer-animated movies centered around bug-based societies: Dreamworks’ Antz and Pixar’s A Bug’s Life.  Both films are admirable productions, and pretty much equal in entertainment value, but Dreamworks wanted to be first out of the gate.  So, they sped up production in order to beat Pixar to the finish line; a decision that may have undermined the film’s potential for success in the end.  Pixar’s early edge may be attributable to the fact that Dreamworks was trying too hard to compete in those early days, which also became a problem when the dismal Shark Tale (2004) followed up Pixar’s Oscar-winning Finding Nemo (2003).
Apart from the dueling movies that we see from time to time, the much more common type of copycat film is the one that follows trends in the market.  These are the “knock-off” movies that I mentioned earlier, and they exist solely to capitalize on the enormous success of another big movie that came before them.  Of course, after the monumental success of Titanic (1997), we got Michael Bay’s insultingly cliched Pearl Harbor (2001); and the Oscar glory heaped onto Ridley Scott’s Gladiator (2000) led to the expensive busts that were Wolfgang Petersen’s Troy and Oliver Stone’s Alexander (both 2004).  More often than not, this is where you’ll find most of the copycat movies that have failed.  Perhaps the trend that has led to the most failed knock-offs in cinema is the fantasy genre.  A decade ago, we saw the enormous success of both The Lord of the Rings and the Harry Potter franchises begin, which led many other studios to believe that they could pick up any random fantasy source material out there and have a surefire hit on their hands.  Unfortunately, not every one of these book series has the same kind of fan-base that Tolkien and Rowling have earned over the years.  Over the last decade we’ve seen many one-and-done franchises fizzle at the box office, like 2007’s The Golden Compass, 2007’s The Seeker: The Dark is Rising, and 2008’s The Spiderwick Chronicles.  The Narnia and Percy Jackson series managed to survive long enough to make more than one film, but even they failed to live up to their lofty ambitions.
There is, however, a trend that does seem to be working well in Hollywood right now, and it has continued to be profitable despite the fact that most of these movies are just copying each other’s formulas: the young adult novel adaptation.  More specifically, the movies that have followed in the wake of author Stephenie Meyer’s Twilight series and author Suzanne Collins’ The Hunger Games series.  These two franchises have become huge cash cows for their respective studios and are currently defining the trend we see today.  While Twilight is far from perfect as a movie, there’s no doubt that it has left an impact on Hollywood in recent years, and you can blame the current trend of “sexy monster movies” directly on it.  Honestly, would a zombie love story (2013’s Warm Bodies) ever have existed had Twilight‘s vampire-werewolf love triangle not hit its mark with teenage audiences first?  Even bigger is the Hunger Games impact.  Now, post-apocalyptic stories centered around adolescents are in vogue in Hollywood, with adaptations of Orson Scott Card’s Ender’s Game (2013) and Veronica Roth’s Divergent (2014) getting the big-screen treatment.  While these movies may not rise to the same level as their predecessors, they are nevertheless finding their audiences, and it’s proving to Hollywood that this is still fertile ground to explore.  We are likely to see many more Twilight and Hunger Games knock-offs in the years to come, given that YA adaptations are the hot trend of the moment, but that’s only because audiences are less concerned with the quality of the adaptations themselves than with how well these movies deliver on the entertainment side of things.
Over the last decade, there has actually been an entire industry of film-making devoted not only to copying movies, but to blatantly ripping them off.  This has become known as the Mockbuster industry.  More often than not, these are cheap, direct-to-video copycats of current blockbusters that are sometimes released on the same premiere dates.  Usually, it’s the hope of these Mockbuster producers that uninformed consumers will be tricked when they see the “knock-off” on a shelf in the video store and think that it’s the same thing as the bigger movie that’s currently playing in a nearby theater.  Mockbusters, of course, are nowhere near the quality of a big budget film, and they are usually defined by shoddy production values, D-list acting, and laughably bad special effects.  One of the companies that has made its name providing these kinds of films to the market is The Asylum, whose library consists of many notable “knock-offs” like The DaVinci Treasure, Snakes on a Train, Atlantic Rim, Abraham Lincoln vs. Zombies, American Warships, and of course Transmorphers.  Now, while many can criticize The Asylum for ripping off other movies for a quick cash grab, they’ve actually been pretty upfront about their intentions and have no qualms about what they do.  They’ve even found an audience who enjoy their laughable, low-quality productions as a goof.  In fact, The Asylum hit it big last year with the surprise hit Sharknado when it premiered to a lot of fanfare on the Syfy channel.  Which just goes to show that even Mockbuster film-making can find its place in the world.
But is the trend of copycat film-making just another sign that Hollywood is out of ideas?  It all depends on whether or not the movies still work as entertainment in the end.  It is kind of fun to contrast two like-minded movies, especially when they are almost indistinguishable from each other.  I think you could create a very effective drinking game out of spotting all the cliches that a pair of dueling movies have in common; especially with films like Olympus Has Fallen and White House Down, which I swear are nearly identical in everything but tone.  And a Mockbuster can be entertaining for a laugh if you’re in the right state of mind.  The only time copycat film-making becomes problematic is when there’s no passion behind it, when a movie merely exists to piggy-back off the success of a much better film.  That’s something you see in a lot of the failed franchises of the last decade.  In the end, it’s okay to show off a little familiarity in your movie, just as long as you make the most of it.  Even Harry Potter and Lord of the Rings had their inspirations before them, and let’s not forget how many adventures have followed the “hero’s journey” template to the letter on the big screen over the years.  Audiences are smart enough to notice when a movie’s story-line feels too familiar, and that’s usually what separates the copycat movies that stay with us from the ones that don’t.

Holy Grails – The Noble Search for Cinema’s Lost Treasures

metropolis

One of the best things to happen to cinema over the last few years has been the emergence of digital archiving.  Sure, it is sad to see classic film stock disappear as the norm, but there is a reason why movies are better suited for the digital realm.  If you have a digital backup of your film, you are better able to transfer it, download it, and make multiple duplications without ever losing video or sound quality.  When a movie exists as a digital file, it is set in stone visually and aurally for as long as it is never erased.  This has become a boon for the people out there who consider film restoration a life’s passion.  For years, film restorers have had to contend with the forces of time undoing all their hard work as they try to keep some of our most beloved films looking pristine.  Now, with digital tools at their disposal, preservationists can undo years of wear and tear on most old films and make them look even better than when they were first released.  The advent of DVD and Blu-ray has given more studios a reason to go into their archives and dust off some of their long forgotten classics, and because of this, restoration has become not only a noble cause for the sake of film art, but also a necessity.  While there’s no trouble finding most movies in any studio archive, there are a few gems whose whereabouts have eluded archivists for years, and these are known to film historians as the “Holy Grails” of cinema.
It’s hard to believe that there was once a time when film prints were considered disposable.  Back when the studio system was first starting up, it was commonplace for production companies to dispose of their used film stock once a film was no longer in rotation at the movie theaters.  This was done either to make room for new releases, or to prevent accidents from happening at the studio.  The reason film prints were considered dangerous to store in a warehouse back in the 20’s and 30’s was that they were made from nitrate stock, a highly flammable material chemically akin to guncotton.  Several film vault fires have happened over the years because of nitrate film spontaneously combusting, including a 1965 incident at the MGM Studios in Culver City, CA.  Incidents like this, as well as the careless disposal of early films, are the reason why 90% of all films made before 1920 are lost to us today, according to Martin Scorsese’s Film Foundation.  It wasn’t until the mid-30’s that filmmakers like Charlie Chaplin and Cecil B. DeMille started to actively preserve older movies, and their efforts have helped to keep many of these classics alive.  One thing that helped was the fact that both Chaplin and DeMille had ownership of their work, so they could keep the original negatives preserved in their own collections and safe from studio hands.  Also, by keeping their films in good enough condition to be screened over and over again, they helped convince the studios that it was worthwhile to do the same.
Even with better efforts to keep films archived and in good condition, older film stock still wears out over time, and with many prints still made of very volatile materials, many have simply rotted away to ash in the vaults.  That is why many archivists have fully embraced the digital revolution: it has enabled them to preserve many of these disappearing classics for posterity in a definitive way.  But before a film can be preserved, the damage must be undone, and again digital tools are what save these movies in the end.  There is a whole class of digital artists out there whose entire job is to scan older films from the best sources available and touch up the scratches and marks on every single frame.  Now that High Definition has become the norm in home entertainment, the results of film restorations are held to much higher scrutiny, and that has led many studios to take better care of their whole catalog of flicks, which is nothing but a good thing for cinema as a whole.  The fact that classic films like The Wizard of Oz (1939) and Casablanca (1943) look so good after so many years is a testament to the great efforts made by restorers over the years.  It would be unthinkable to see these kinds of films all scratched up and with faded coloring, which is why film restoration has to be an essential part of the studio business.
But, while beloved classics benefit from better care, some films have not been so lucky.  Early cinematic history is unfortunately a lost age for many film historians, because so much of it is gone.  We only know that many of these movies existed because of documentation from their filmmakers, or from a piece of advertising uncovered in an archive or private collection.  Sometimes a trailer has popped up for a movie that no longer exists as a whole, like the early “lost” Frank Capra film Say It with Sables (1928).  A few have risen above the rest as films that are clearly calling out to be rediscovered and preserved.  These are the “Holy Grail” films, and some of them have become famous merely because of their elusiveness.  Like Indiana Jones searching for the Lost Ark, film preservationists have searched the world over for any evidence of the existence of these “Holy Grail” pieces of cinema.  Part of the allure of these films is the fact that they have remained unseen by the public for many years, and in some cases have never been seen at all; yet just one tantalizing glance at a press photo or a storyboard proving their existence is enough to send film nuts on a mad search.
Probably the most famous example of a lost and found “Holy Grail” film is Fritz Lang’s groundbreaking classic Metropolis (1927).  Lang’s film was made at the height of silent film-making and is considered to be the era’s crowning achievement.  Made in Germany before the rise of Hitler, Metropolis was the most expensive film of its time, and it showed the world that European cinema was on par with the film industry emerging in Hollywood at the same time.  However, when the movie made its debut in America, it was subjected to heavy cuts due to its more pro-Socialist themes, taking the run-time down from 145 minutes to just under 2 hours.  The Nazi regime also destroyed most of the film’s early prints, as well as the original negatives, making a full restoration seemingly impossible.  For years, the shorter cut of Metropolis was all that audiences could see, and while it did regain its reputation as a cinematic classic, it remained an incomplete vision.  For many years, film preservationists had to fill in the missing gaps with title cards explaining what was lost, but while a Blu-ray release was being prepped in 2008, something miraculous happened.  A print of the original uncut version of the movie was found in a private film collection in Argentina.  The Friedrich Wilhelm Murnau Foundation in Germany quickly picked up the find and made its best effort to reincorporate the lost scenes.  Even though the restoration couldn’t make the new scenes look as beautiful as the rest of the movie, due to the damage on the film stock, we are now fortunate to have a nearly complete version of this monumental film.
The saga behind the rediscovery of Metropolis‘ uncensored cut gives many people hope that these “Holy Grail” movies can someday be found, and the odds of that happening improve all the time.  There is a more concerted effort to find lost treasures tucked away in film vaults across the world, and while some “Holy Grails” have remained elusive, the film restorers’ labors are still bearing fruit.  Many of these finds have emerged from private collections and some unlikely places.  Sometimes it’s thanks to a very forward-thinking film technician or vault librarian who saved these treasures from early destruction, sometimes without even knowing it.  A 1911 short called Their First Misunderstanding, the very first film to feature legendary actress Mary Pickford, was discovered in a New Hampshire barn in 2006.  Even a simple mislabeling has been the cause of some of these classics being lost.  The first ever Best Picture winner at the Oscars, 1927’s Wings, was considered gone forever due to negligent care of the original nitrate negative at the Paramount Studio vault.  But the film was rediscovered in the Cinémathèque Française archive in Paris, found almost by accident when archivists were going through their back stock, and it was quickly given a more permanent and secure place in the Paramount vault.
Sometimes, like Metropolis, it’s not a whole film that gets lost, but rather fragments that are removed and then later discarded against the wishes of the filmmaker.  These are not what we commonly know as Deleted Scenes, which inevitably have to be trimmed by the editor to make a movie work more effectively.  What I’m talking about are pieces of the movie that are removed even after the film’s first premiere, leaving big chunks of the finished film out of the public eye for whatever reason.  Sometimes these cuts were made because of censorship, over the protests of the filmmakers.  Or they were trimmed for the purpose of time constraints.  Back in the late 50’s and early 60’s, there was a trend for big Hollywood pictures to be shown as Roadshow presentations; meaning they were special events complete with printed programs, musical overtures played while the audience took their seats, and a special intermission at the halfway point of the movie.  These programs often ran over three hours, so when the Roadshow movies made it to less grand theaters across the country, the whole show had to be trimmed to meet time constraints, including removing scenes from the actual movie.  Recently, film restorers have tried to reassemble these old Roadshow versions, and while many of these have been found intact, like Lawrence of Arabia (1962) and Spartacus (1960), a few have yet to be fully restored.  Movies like George Cukor’s A Star is Born (1954) and Stanley Kramer’s It’s a Mad, Mad, Mad, Mad World (1963) have been given partial restorations that do their best to make these films feel complete again with the best elements left available.
Sometimes, there are films that remain lost merely because they’re being withheld by a particular artist or by the production company that made them.  This is usually because the films are an embarrassing black mark on the person or studio’s reputation and they would prefer that they remain unseen.  But the downside of withholding a known property is that it inevitably raises people’s curiosity about these films, which in turn puts pressure on the filmmakers to make them available again.  The most notorious example of this would be the 1946 Disney film Song of the South, which the Disney company refuses to release to the public, due to fears that it will spark controversy over its racial themes.  Though not necessarily a “Holy Grail” film, given that it was available to the public for many decades and can still be seen by anyone who can secure a bootleg copy from Asia, we’ve still yet to see a fully restored version made by the Disney company.  One withheld film that surely would be considered a “Holy Grail” type is Jerry Lewis’ notorious film The Day the Clown Cried, which has been seen by only a small handful of people in Mr. Lewis’ inner circle.  Supposedly because of the Holocaust setting and Mr. Lewis’ less-than-genuine depiction of the tragedy, the film has been kept hidden from the public, probably to spare Jerry from the controversy that could arise from it.  Still, rare behind-the-scenes footage did emerge last year, which has raised people’s curiosity about it once again.  We may someday get a true glance at both movies, but that choice still rests with the ones who originally made them.
What I do love is the fact that film restoration is no longer looked at as just a noble cause, but rather as an essential part of cinema as a whole.  With data back-ups as common as they are now, we are far less likely to see catastrophic losses of film like we did before digital tools were made available to us.  Today we can securely preserve the works of our present as well as restore the classics of our past.  And the search for the most intriguing “Holy Grails” of cinema will undoubtedly continue to inspire both archivists and treasure hunters for years to come.  Now that we’ve managed to see Metropolis become complete, the focus shifts to the next big find, like the lost Lon Chaney thriller London After Midnight (1927), the most notable victim of the MGM fire; the lost director’s cut of Orson Welles’ The Magnificent Ambersons (1942); or the full 7 1/2 hour version of Erich von Stroheim’s legendary silent epic, Greed (1924).  Some of these films may sadly be lost forever, but the hope always remains.  The great thing about these searches, though, is that they demonstrate the importance of preserving our cinematic legacy.  Martin Scorsese illustrated this idea beautifully in his 2011 film Hugo, where a young boy helps to rediscover a long forgotten filmmaker, whose legacy has all but disappeared due to the destruction of his original film prints.  Thanks to passionate film preservationists like Mr. Scorsese and the people who work in film foundations and archives around the world, our cinematic legacy is no longer disappearing, but is instead coming back to life more and more.

http://www.filmpreservation.org/

True Romance – The Problems with Modern Hollywood Love Stories

sixteen

With Valentine’s Day just around the corner, we commonly see Hollywood try to capitalize on the romantic mood of this time of year.  Of all the genres in film-making, the one that seems to have stayed strong year after year is the Romance genre, which benefits from a very specific audience that usually makes up a good percentage of the film-going public: people looking for something to watch on a date.  But what I find interesting about this year is that there has been a significant reduction in the number of romantic movies in theaters.  In fact, this Valentine’s Day has only one wide release that you could consider a traditional romantic movie: the Colin Farrell-headlined Winter’s Tale, which has to compete in its opening weekend with Robocop.  How’s that for date night counter-programming?  The foreseeable future also looks barren for the romance genre, with not a single wide release until April 25’s The Other Woman, and that one might be considered more of a slapstick comedy.  I don’t know if this is just a fluke in the schedule, or a sign that the romance genre has suffered a backlash due to a recent string of notable failures.  I can see how the latter could be true.  Some truly horrendous movies have come from the genre recently, and I see it as a result of the genre’s current troubles with tone, character development, and just an overall lack of definition.
I should state that I have a little bias when it comes to talking about genre pictures like romantic movies.  Romance is a genre that I generally don’t understand and usually try to avoid, not because of themes or content, but because I rarely get any entertainment value out of watching characters fall in love for an entire movie.  I do, however, acknowledge that there are films that work well in the genre and can be quite uplifting as well.  I just gravitate more towards action oriented genres, although romantic subplots are certainly found in the movies I watch as well.  Some romantic plots in action movies can be even more memorable than the ones that come from the romance genre itself.  What I mean to say is that I do like a good love story; it all depends on the movie.  But when a movie is clearly boxed in by the genre restraints put on it, I inevitably end up judging a book by its cover, and in most cases, I’ll probably end up being right.  Romantic films, probably more than any other genre, suffer from too little diversification.  There is a specific audience that goes to these types of movies, and the studios make every effort they can to meet those expectations.  But the fact is, there are fewer fresh ideas coming out of this genre, and the studios are beginning to scrape the bottom of the barrel just so they can have anything that will draw the audience they want.
I think one of these problems can be attributed to an issue that is in fact affecting all aspects of film-making: the overabundance of movies.  Now, it might be unusual to think that more movies out there is a bad thing, but it’s an issue that is actually causing a decrease in the quality of films overall.  Steven Spielberg and George Lucas’ now famous op-ed from last year stressed that the studio system was going to implode on itself because of the out-of-control ways that movies are funded and distributed, and that’s something the romance genre clearly suffers from.  Originally, there would be one standout romantic film released in every quarter of the year, which would do very well.  But because of the increased flow of production, we have seen multiple genre movies all released at the same time.  This time of year has typically belonged to the romance genre, with movies like Safe Haven (2013) and Beautiful Creatures (2013) battling it out in the same weekend.  But what was usually constructive competition has turned into a rough road for the romance genre, with very few entries actually gaining a foothold at the box-office.  And when studios absolutely must have 5 or 6 new genre movies every season, it means that less care is going to be given to the choice of stories given the green-light.
This is a trend that has come about more recently in the last few years.  Hollywood of course has had a long history with the genre, dating all the way back to the silent melodramas.  But when we think about the most beloved romances out there, not all of them could be easily classified by any genre.  Sometimes the most surprising love stories are the ones that are the most beloved, and the ones with no suspense whatsoever are the ones we most revile.  Take for instance the classic film The African Queen.  The John Huston-directed movie follows a scruffy, callous boatman (Humphrey Bogart) and a stuck-up missionary (Katharine Hepburn) as they travel through the heart of Africa on a small river boat.  Throughout the film, these polar opposites grow closer together and form one of the quirkier and more charming couples in movie history, and it all happens in what is essentially an adventure film.  The reason why this romantic plot works so well is that entertainment is drawn from the friction between the two main characters, which the two stars portray perfectly.  Romances can also work with even the most perfect of couples, as long as the outcome is unexpected.  The reason why Casablanca is held up as one of the greatest romances of all time is that the two lovers don’t end up together in the end.  It’s that parting of ways that we find so romantic, because of how much each character longs for the other, and what they have to sacrifice for love.  As Bogart puts it so eloquently in the final scene, “We’ll always have Paris.”
But what I think has happened to the romance genre is that it has become complacent.  Like I mentioned before, the genre is valuing quantity over quality, and that is leading to more movies that are exactly the same.  The strongest culprit of this would be the dreaded “Wedding” picture.  If the romance genre were a sinking ship, “wedding” movies would be the anchor dragging it to the bottom.  In the last couple of years, we have seen movies like License to Wed (2007), Made of Honor (2008), Bride Wars (2009), The Proposal (2009), and last year’s The Big Wedding all make it to the big screen and predictably get trashed by critics.  I think this is primarily because this sub-genre is characterized more than any other by its own cliches.  Pretty much every movie I mentioned can be summed up with the same story-line: girl wants to get married, problems ensue, girl ends up with the guy she really wants, the end.  The less interesting the plot, the less people are going to like it, and this sub-genre has become something of a joke over the last few years because of movies like these.  Bride Wars in particular turned out to be so insultingly bad, and probably the least feminist movie ever made, that even fans of the genre had to cry foul.  It seems like filmmakers feel that the mere sight of wedding traditions is entertainment enough, which is entirely the wrong way to look at your audience.  The reason why Bridesmaids (2011) became so popular was that it subverted this despised sub-genre in hilarious ways, and that’s ultimately what people wanted in the end.
But is the genre completely helpless and without quality entertainment?  Not at all.  Usually all it takes is one inspired idea, or a filmmaker who gives a damn like Nora Ephron or John Hughes, for a romantic film to work.  Last year, we saw two examples of quality movies from the genre, made by people who have already left their mark with these kinds of films before.  One was Richard Linklater’s Before Midnight, starring Ethan Hawke and Julie Delpy.  Midnight is the continuation of a series of movies following the same couple as they reach different stages in their lives, which started with 1995’s Before Sunrise and continued with 2004’s Before Sunset.  These movies are almost universally beloved and respected, and they show that if the people involved are invested enough in what they are making, it can end up being a quality film.
Another movie that left a good mark on the genre last year was About Time, which illustrated how you can make a charming romance work by injecting a new idea into it.  The movie was written by Richard Curtis, who has become synonymous with the Romance genre over the years with his scripts for beloved movies like Love Actually (2003), Notting Hill (1999) and Four Weddings and a Funeral (1994).  What Curtis did with this film to make it stand out, however, was to include a supernatural element; in this case, time travel.  While time travel is not entirely a novel idea, Curtis nevertheless made it work with the film’s themes of awkward romance and regret, which in turn made it a more enriching film overall.
Having a unique voice helps to make a romantic movie work nowadays, and it certainly is a breath of fresh air when a good one comes along.  The reason why many of the best ones hold up is that they treat their characters with dignity.  One of the biggest mistakes a person can make when writing a love story is to value one character’s worth over the other’s.  This sometimes gets into the tricky issue of gender politics, which can be a minefield if handled improperly.  Oftentimes, when a person writes a very poor love story, it’s because the male and female characters are played as generic stereotypes.  How believable is it when you see a movie where a girl has a hard time finding an attractive man, even though she has no flaws herself?  Hollywood has a problem with portraying body image correctly in movies, largely because it puts glamour before everything else.  Would Bridget Jones’s Diary (2001) have been better if a fuller-figured actress had played the main character, instead of the more petite Renee Zellweger?  I honestly think Hollywood should give something like that a shot.  Also, as a male, I feel like men are underdeveloped in these movies.  Either they are just the object of desire, or a sexist jerk who doesn’t understand the main girl’s feelings, and that’s it.  Sometimes it’s the girl who gets the short end of the stick in this genre, by being too self-involved in her own feelings and thereby less interesting.  Overall, the best love stories are the ones where the characteristics of both individuals are given enough time to develop, because in the end, love is a two-way street.
So, is Hollywood seeing a backlash from a long string of terrible genre picks?  It might be too early to tell.  One thing that I think may have happened is that the genre has evolved into something else that can’t be clearly defined by the genre norms we’re all familiar with.  For one thing, the rise of Young Adult novel adaptations has changed the way we look at romantic plot-lines in movies.  With films like Twilight bringing romance into the supernatural realm, it’s safe to say that you can make any type of genre flick into a popular romance.  Hell, last year we even got a zombie love story with the movie Warm Bodies.  The reason this trend seems to be happening is that the audiences who would have normally gone to the movies as part of a date night are now seeing movies of all kinds, not just romantic ones.  In many ways, Hollywood has actually done a fairly good job of making movies that appeal to both genders, like The Hunger Games series.  That seems to be why the traditional romantic movie has disappeared recently.  Oh, it’s still there, only not as prevalent as it once was, and that might be to its advantage.  Less competition can help a genre film stand out and maybe even get a boost from a more discerning audience.  There will always be an audience out there that wants a good, old-fashioned love story, and this is the perfect time of year to not only indulge in the same old thing, but to also fondly remember the ones that really touched all of our hearts.

Ecstasy for Gold – The Cutthroat Campaigns of Awards Season

OscarStatue

Awards season is once again upon us and, as always, there is a lot of debate over which film is deserving of the industry’s highest honors.  What is interesting about this year, however, is how up in the air it is.  For the first time in a long while, there are no clear favorites in this year’s Oscar race.  In years past, a clear picture would have formed by now of who was leading the pack after the Golden Globes and all the industry guilds had made their choices.  But so far, every one of the top honors this year has gone to a different film, leaving no clear front runner for Best Picture at the Oscars; made all the more confusing after the Producers Guild Awards ended in a tie for the first time in its history, awarding both 12 Years a Slave and Gravity Best Film.  Sure, any accolades for these movies are well deserved and appreciated by their recipients, but it’s the Academy Awards that solidify the award season, and they are what everyone in the industry strives for in the end.  That strong desire to win the top award has become such a dominant force in the industry that it has started a troubling trend of negative campaigning in Hollywood.  In recent years, we’ve seen Oscar campaigns become so overblown and vicious that they would make even Washington insiders queasy.  And the sad result is that in the pursuit of the industry’s top honors, the movies themselves get lost in the shuffle.
This isn’t something new either, but it has developed over time into something bigger.  Oddly enough, when the Academy Awards were first handed out in 1929, the awards themselves were considered an afterthought.  Instead, they marked the conclusion of a banquet dinner held by the Hollywood elite to celebrate the end of the year.  Many of the winners at this first ceremony either discarded their Oscars or pawned them off in later years, not foreseeing the significance that those statues would carry in the years to come.  It wasn’t until about 4-5 years later that the ceremony gained significance, around the time the winners started being announced on the radio, allowing audiences to be informed about Hollywood’s awards recipients.  Once the ceremonies began to be televised in the 50’s, the awards season became a full blown cultural event, and it has been a focal point for the industry ever since.  Of course, with the whole world now interested in who was winning, it soon led to some of the studios making behind-the-scenes deals in order to get their movies to the top.  One of the earliest examples of questionable campaigning for an award came in the 1940 Oscar race, when producer David O. Selznick, hot off his awards success for Gone With the Wind (1939), pressured a lot of entertainment press agents to campaign for his next film, the Hitchcock-directed Rebecca (1940).  The aggressive campaigning helped the film win Best Picture, but it failed to win any other major award, which led many people to question whether or not it deserved it in the first place; especially considering that it beat out the more beloved The Grapes of Wrath (1940) that same year.
This illustrates the major problem I’ve observed with an overly aggressive awards campaign: the doubt that it raises over whether or not the movie deserved what it got.  We’ve seen the Academy Awards honor films that have certainly withstood the test of time (Casablanca (1943), Lawrence of Arabia (1962), and The Godfather (1972), just to name a few), but there are also choices made in other years that have left us wondering what the Academy was thinking.  But it’s not the final choices that make the Oscar campaigning problematic.  We all differ when it comes to choosing our picks for the awards, because everyone’s tastes are different.  What I find to be the problem is the increasingly nasty ways that movie companies try to get their movies an award by attacking their competition.  In recent years, I’ve noticed that this has gone beyond the usual “For Your Consideration” campaigning that we commonly get from the studios, and it has now devolved into full-fledged mudslinging.  Truth be told, I don’t even think political campaigns get this cutthroat, but then again, I’m not much of a political observer.  This year in particular, we’ve seen complaints leveled at films for inaccuracies in their historical reenactments and for mis-characterizations of their subjects.  While some accusations have merit, there remains the question of whether or not it matters.  There are some voters out there who are persuaded by the chatter and would rather let outside forces steer them towards a choice than judge a film on its own strengths, which becomes problematic when that chatter is ill-informed.
The most troubling thing about the recent trend of negative campaigning in awards season is the inclusion of outside voices brought in to give weight to the criticisms of a film.  This goes beyond just negative reviews from critics.  What we’ve seen happen recently is the involvement of the media and press more and more in Oscar campaigns.  This has included articles written by scholars and experts that call into question the authenticity of the facts in a film as a way of slamming the movie’s credibility.  Famed astrophysicist Neil deGrasse Tyson made the news weeks back when he posted a series of tweets pointing out the scientific details that the movie Gravity got wrong, which many people in the industry jumped upon to undermine Gravity’s chances for some of the top awards.  Mr. Tyson later said that he did it just for fun and added that he still enjoyed the film immensely, but this seemed to get lost in the controversy his comments stirred up.  It could be argued that film companies utilize negative campaigning just because it’s easier and more effective, which is probably true, but what it ends up doing is distracting people from what awards season should really be about: honoring the best work done by people in the industry that year.
The most dangerous kinds of negative campaigning that I’ve seen are the ones with no basis in actual fact.  One of my first articles on this blog was an editorial addressing the smear campaign leveled against Quentin Tarantino’s Django Unchained.  At the time of the film’s release, African-American director Spike Lee openly criticized the movie for its pervasive use of the “N-word,” and he denounced the film as “racist” and an insult to the history of slavery; despite the fact that he hadn’t seen the film.  Spike Lee’s comments were nevertheless used as ammo against the movie during last year’s Oscar race, which fortunately had little effect, as the film walked away with two awards: Best Original Screenplay and Best Supporting Actor for Christoph Waltz.  The same cannot be said for Kathryn Bigelow’s Zero Dark Thirty, however.
Released around the same time as Django, Zero Dark Thirty had a lot of hype built up around it, seeing as how it documented the hunt for Osama bin Laden.  The film’s hype was a case where Hollywood’s connections with political insiders became both a blessing and a curse.  Some left-wing studio heads even wanted to fast-track the film’s release so that it would premiere before the 2012 election, in the hopes that it would boost President Obama’s chances for reelection.  When the film premiered, however, its reception was not what people expected.  Bigelow’s very frank depiction of the torture the CIA used to help find bin Laden angered many people, and criticism of the film shifted from calling it left-wing propaganda to calling it right-wing propaganda.  The film’s producers rightly argued that politics had nothing to do with the movie’s overall depiction, but the damage had already been done.  The one-time Oscar front-runner was dealt a significant blow.  Kathryn Bigelow was shut out of the Best Director category, and the film ended up winning only one award, for Best Sound Editing, which it had to share in a tie with Skyfall (2012).  You could say that Zero Dark Thirty became a victim of its own pre-release hype, but I think the negative campaigning against the film rose to an almost unethical level when political leaders got involved.  Just weeks before the Oscars ceremony, Democratic Senator Dianne Feinstein, along with fellow Democrat Carl Levin and Republican John McCain, called for an investigation into the film’s development, examining how Bigelow and writer Mark Boal got their information.  When the Oscars were over, almost on cue, the investigation was dropped.  We may never know if there was some backroom deal involved, but I saw this as an example of awards-campaign interference gone too far.
It’s troubling to think that some people are so easily persuaded by hype and negative press in the film industry, but it’s a result of how valuable these awards have become.  It is true that winning an Oscar will increase a film’s overall box-office numbers, which may be good for business, but it can be bad for the film’s legacy.  What is there to gain from a short-term boost in grosses if you’re hurting the film’s chances of having a long shelf life?  There are many examples of movies gaining a negative stigma after winning the top award over more deserving films.  The most controversial example would be 1998’s Shakespeare in Love, which many people say stole the Best Picture award away from Steven Spielberg’s Saving Private Ryan; so much so that new campaign rules were drafted by the Academy when it was revealed how much money Miramax execs Bob and Harvey Weinstein had put into the film’s Oscar campaign.  Shakespeare did see a boost at the box office in the weeks before and after the awards, but the controversy has unfortunately overshadowed the film itself over the years, which has in turn destroyed its staying power.  Time is the best judge of great movies, but the Oscars have only a year-long window of perspective, so their picks usually show little foresight in the end.  1999’s winner, American Beauty, has almost faded into obscurity over time, while other films from that same year, like The Iron Giant, Fight Club, and The Matrix, have become beloved classics.
Is it right, in the end, to criticize a film over its content, or its adherence to the facts?  My argument is that a movie should be judged solely on its own strength as a movie.  The truth is that there is no absolute truth in film; it’s all make-believe, after all.  If a film needs to take some historical liberties in order to tell a more fulfilling story-line, then so be it.  What I hate is when controversies come up around a film when they really don’t matter in the end.  Some controversies this year have erupted over films like Saving Mr. Banks and Captain Phillips because of their white-washed approach to the depictions of their main characters, and the negative campaigns against them robbed actors like Tom Hanks and Emma Thompson of recognition for awards that their outstanding performances would otherwise have deserved.  So what if aspects of these people’s lives are left out of the films?  In the end, they have nothing to do with the stories the filmmakers wanted to focus on in the first place.  The Wolf of Wall Street has had its own set of controversies, some of which the movie purposely provoked, and yet they didn’t affect its chances at the Oscars, which shows that there is a selective bias in the negative campaigning against these films; it all depends on who has something to gain from knocking out the competition.
When the winners of the Oscars are announced this year, my hope is that the voters use their best judgment when they cast their ballots.  The Academy Awards will never please everybody.  Most often, whenever people say they were upset by the awards, it’s more because there were few surprises and the whole thing ended up being boring in the end.  That’s why I am excited about this year’s open race, because anybody could win.  Unfortunately, the closer the race, the more negative the attacks against each film will be.  I think that hype can be a dangerous tool for a film if it is misused, and it will ultimately end up clouding the merits of the movie itself.  In the end, Oscar gold is not always a certification of excellence.  Great films stand the test of time, while the Oscars are more or less a time capsule of public tastes from one specific year.  Sometimes they pick the right Best Picture or performance; sometimes they don’t.  But what is certain is that negative campaigning is getting uglier and more prevalent in awards season.  What I hate is the fact that it has become less about honoring great works of cinema and more about competition, seeing who’ll take home the most awards at the end of the night.  What seems to be lost in the shuffle is whether or not people actually like the films; the movies themselves are increasingly an afterthought in awards season, with hype and name recognition mattering more in the media’s eye.  But, in the end, what matters is the entertainment value of it all, and no doubt we’ll still be on the edge of our seats each time those envelopes open.

Hollywood Royalty – The Ups and Downs of a Film Acting Career

actress

A lot of work from many different departments goes into making movies, but what usually ends up defining the finished product more than anything is the quality of the actors performing in it.  Whether we like it or not, the actors and actresses are what audiences respond to the most; more than the script and the direction itself.  Sure, writers and filmmakers can leave an impression and build reputations of their own, but their work is meant to be unseen, part of the illusion of reality.  It is the actors who must be in front of the camera the whole time and make you forget that you are watching something constructed for your entertainment.  And this is mainly why we hold the acting profession in such high regard.  Sure, anybody can get in front of a camera and act, but it takes real skill to make it feel authentic and true to life.  Hollywood actors are an interesting lot because of the aura of celebrity that surrounds them.  They are simultaneously the most beloved and most reviled people in the world, and this is usually a result of the work that they do.  What I find fascinating is the way a film actor’s career evolves over time, and how this affects the way we view them in the different roles they take.  Some people come into fame unexpectedly, and then there are others who work their way up.  There are many ways to look at an actor’s career, and it offers up many lessons on how someone can actually make an impact in the business, depending on what they do as an actor.
The way we cast movies has certainly changed over the years.  When the studio system was at its height in the 30’s and 40’s, actors were mandated to be under contract, meaning that they had to work for one studio exclusively.  This became problematic whenever an actor or actress coveted a role that was being produced at a competing studio, which excluded them from consideration.  Actors also had little choice in what kinds of movies they made, mainly due to the studio bosses who made those decisions for them.  Many of these actors would end up typecast in roles that the studios believed were the most profitable for them.  It wasn’t until the establishment of the Screen Actors Guild that actors finally had the ability to dictate the parameters of their contracts, and to have more say in the direction of their careers.  Even then, the pressure of being a successful matinee idol was a difficult thing to manage in Hollywood.  In many ways, it was often better to be a character actor in these early years than a headliner.  A character actor at this time may not have gotten the name recognition or the bigger paydays, but they would have gotten more diverse roles and a steadier flow of work and screen credits.  Actors from this time like Peter Lorre, Walter Brennan, and Thelma Ritter enjoyed long-lasting careers mainly because they made the most of their supporting roles and had more leeway in the directions of their careers.
It’s the status of a matinee idol that really makes or breaks an actor.  Over the many years since the inception of cinema, we’ve seen actors rise and fall, and in some cases rise again.  Making a career out of film acting is a difficult nut to crack, and seeing how cruel the industry can be to actors who fall out of fashion, it’s a wonder that so many people still want to do it.  I believe it’s the allure of fame that drives many young up-and-comers to want to be actors, but following a dream does not an actor make.  It takes hard work, just like any other field in entertainment.  If I can give any advice to someone pursuing an acting career, it’s that you should never get into it just because you have the looks of a movie star.  Do it because you like performing and being a part of the film-making process.  Of course, it’s probably not my place to give advice to an actor, seeing as how I have not been on a stage since the eighth grade, and I am looking at this from a writer’s point of view.  But from what I’ve observed in the film community, the best actors out there are the ones who are really engaged in the process, not the ones who are in it just to build up their image.  The tricky part, however, is figuring out how to maintain this over time.
Becoming a successful actor in Hollywood has a downside that can be either a minor thing or a major negative depending on the person it happens to, and that’s the stigma of celebrity.  Whether an actor seeks it out or not, by being out in front of the camera, they have exposed themselves to a public life.  This isn’t a problem if the actor or actress manages their public and private lives well, but if they don’t, it’ll end up defining their careers more than the actual work that they do.  Case in point: actor/director Mel Gibson.  Mel’s career has been negatively impacted by his off-screen troubles, including a nasty break-up with his Russian girlfriend and an anti-Semitic rant during a drunk-driving arrest.  What’s most problematic for Mr. Gibson out of all this is the fact that no matter what he does now, no matter how good, it will always be overshadowed by his own bad behavior.  And it is a shame, because in my opinion he’s still a very solid actor.  I still love Braveheart (1995) to death, and I think a lot of people are missing out if they haven’t seen his work in The Beaver (2011) yet.  Or, for that matter, his excellent direction in Apocalypto (2006).  Unfortunately, all his hard work is for naught as he continues to alienate more of his audience with his off-screen behavior.  This is the downside of celebrity that we see, and whether an actor is deserving of the scorn or not, it will always be a part of the business.
Actors and actresses can also find themselves in a rut simply because they are unable to adapt to the changing course of the industry.  This is certainly the case with people who have created their own signature style of acting.  Comedic actors in particular fall into this trap.  I’ve noticed that some actors who break through in comedies in one decade will almost always lose their audience by the next.  Shtick is a deceptive tool in the actor’s arsenal, because it helps people achieve stardom right away, but it also anchors them down and keeps them stuck in place.  We’ve seen this happen to many comedic stars, like Eddie Murphy, Mike Myers, and Jim Carrey; and it’s starting to become apparent in Sacha Baron Cohen’s post-Borat career.  The only comedic actors who seem to build long-lasting careers are the ones who choose a dramatic role once in a while, like Bill Murray or Robin Williams.  Age also plays a factor in the downfall of people’s careers.  It usually happens with child actors who can’t shake off their youthful image, and who unfortunately diminish and disappear once they become adults.  Making that transition from child actor to adult actor is tough, and it’s what usually separates the Elijah Woods from the Macaulay Culkins.  It’s less common nowadays for older actors to lose their careers than it was many years ago, mainly because movies like Nebraska (2013) give older actors much better roles.  But in the past, the industry was incredibly cruel to older actors; something highlighted brilliantly in Billy Wilder’s classic Sunset Boulevard (1950).
What usually keeps an actor’s or actress’s career alive is their ability to grow as a performer.  There’s something to the old adage that there is “no role too small.”  Actors should relish the opportunity to diversify their choices of roles.  And usually the ones with the longest-lasting careers are the people who can play just about anything.  Meryl Streep is considered the greatest actress of her generation, and she didn’t earn that by playing the same kind of character over and over again.  She has done comedies, dramas, cartoons; she has played Austrians, Australians, teachers, mothers, daughters, grandmothers, you name it.  No one would ever consider her lazy.  She has made a living challenging herself as an actor, and while not every role of hers works (Mamma Mia!, for example), she nevertheless commands respect for her efforts.  What I respect the most is the ability of an actor or actress to move effortlessly from genre to genre and still treat each part as a role worthy of their talents.  That’s why I admire actors like Christian Bale, who can go from dark and twisted roles like The Machinist (2004) to playing Batman, or Amy Adams, who can appear in movies as diverse as Paul Thomas Anderson’s The Master (2012) and The Muppets (2011) and give each film her best effort.  It’s always refreshing to see actors who commit themselves to any role they get, which in turn helps to endear them to us as an audience.  An invested actor will almost always make a film better.
Nowadays, a bad performance is usually not measured by how misplaced an actor is or by how out of their league they may be.  The worst kinds of performances come from actors who are just lazy.  The point at which an actor works just for the paycheck and nothing more is usually where their career begins to decline.  We’ve seen this with many actors who get too comfortable doing the same role over and over again, or with people who take a job just for the pay and act like the part is beneath them.  When this happens, it’s usually driven by ego, which is another negative by-product of celebrity.  When an actor begins to dictate the terms of their comfort level in a production, rather than try to challenge themselves as a performer, it will usually mean a lackluster performance, which leads them toward becoming a one-note performer.  This sometimes happens to people who hit it big and then become afraid of alienating the new audience they’ve built.  Johnny Depp is an actor I think has reached this point, having built a wide fan-base from his Pirates of the Caribbean films.  The once ground-breaking actor has now fallen victim to his own shtick, and that has negatively impacted his recent slate of films, like The Lone Ranger (2013), which shows what happens when you try to play things too safe.
It is remarkable to see these changes in a film actor’s career, because they usually happen unexpectedly.  Overall, the actor is the one responsible for their own career path, but the market itself can be a wild-card factor in the lives of these people.  I for one value the efforts of a strong actor who’ll continue to try hard, even when the roles stop being what they are used to.  It’s something of a miracle to see actors who continue to stay relevant year after year, like Tom Hanks or Sandra Bullock.  They’ve locked into a career path that seems to have worked for them and have managed to maintain their faithful audiences even when they take on more challenging roles.  What is also interesting is how Hollywood values a redemption story when it comes to an actor’s career.  A Hollywood comeback always manages to be a positive thing in the industry, especially when it happens to the least expected people; like Robert Downey Jr. bouncing back from his drug addiction to play Iron Man, or Mickey Rourke pulling himself out of B-movie hell when he made The Wrestler (2008).  Film acting careers are probably the least predictable in the industry, and it takes someone with a lot of personal resilience to make it work.  If there is anything an up-and-coming film actor should learn, it’s that hard work pays off.  Don’t fall victim to worrying about changing trends or about acting outside your comfort zone.  In the end, the best thing you can do is to commit to the role, no matter what it is.  Like the great George Burns once said, “Acting is all about sincerity.  And if you can fake that, then you’ve got it made.”

Tis the Season – Why Some Films Become Holiday Perennials

its_a_wonderful_life_3

We’ve reached the end of another calendar year, and of course that can only mean it’s the holiday season once again.  Whether we are celebrating Christmas, Hanukkah, or something else, it’s a time of year when we all gather together and honor family, tradition, and the gift of giving.  What’s interesting about Christmastime, however, is just how much holiday tradition is actually influenced by and centered around holiday-themed movies.  A holiday film can pretty much be considered a genre all its own, since so many of them exist and are created specifically to invoke the holiday spirit.  Not only that, but they are movies we continually return to every year around this same time, as if they were part of our holiday ritual.  This doesn’t happen with every Christmas-themed movie, however, since many of them try too hard to hit their mark and fail spectacularly.  And yet we see small films that no one thought much of at first grow into perennial classics over time, and in some cases add to the overall Christmas mythos that defines the season.  But how do we as an audience discern the classics from all the rest?  What really separates a Miracle on 34th Street from a Jingle All the Way?  Quite simply, like with most other movies, it’s all determined by what we bring from our own experiences in life when we watch a movie.
The emergence of perennial holiday classics is nothing new in pop culture, and actually predates the beginning of cinema by a good many years.  Literature has contributed holiday-themed stories in both short form and novels for the last couple hundred years, helping to both shape and reinvent Christmas traditions in a very secular fashion.  Our modern-day physical interpretation of Santa Claus can in fact be attributed to his appearance in “Twas the Night Before Christmas” (formally titled “A Visit from St. Nicholas”), the 1823 poem by American author Clement Clarke Moore.  Moore’s nearly 200-year-old poem is still being recited today, which shows just how much tradition plays a role in keeping a perennial classic alive in the public’s eye.  Around the same time, acclaimed British novelist Charles Dickens wrote A Christmas Carol, chronicling the tale of Ebenezer Scrooge and his visits from three ghosts on Christmas Eve.  Since its original printing in 1843, A Christmas Carol has gone on to be one of the most re-adapted story-lines in history.  Perhaps nearly a quarter of all holiday classics can claim to have been influenced by Dickens’ classic tale, in which a dreary old cynic has his heart warmed by the holiday spirit.  Dickens meant for his novella to be a meditation on greed and class inequality, but I have no doubt that he purposefully meant for Christmas traditions to be the healing influence in Scrooge’s reawakening.  These stories continue to stand strong so many years later, and they show just how far back our culture began to value Christmas stories and songs as part of the holiday tradition.
Even from the very outset of cinematic history, we saw films carry on holiday themes.  Both Twas the Night Before Christmas and A Christmas Carol provided inspiration for movie-makers many times over, given their already beloved appeal, but some people in Hollywood also saw opportunities to add their own original holiday-themed stories into the mix.  When the studio system emerged, it was very well aware of the marketability of holiday themes.  After all, people usually visited movie theaters frequently during the cold winters, so why not play up the festive mood everyone was already in?  For the most part, movies in these early years celebrated Christmas more frequently in short segments than in full-length story-lines; whether by capitalizing on a popular new Christmas song in a lavish musical number, or by portraying a Christmas celebration as part of a larger overarching narrative.  Many people forget that one of the most popular Christmas tunes ever written, “Have Yourself a Merry Little Christmas,” wasn’t even from a Christmas-themed movie; rather, it came from the 1944 musical Meet Me in St. Louis.  But eventually the Christmas season became such an influential part of our modern cultural tradition that it would inspire films devoted entirely to the holiday spirit.
So, in the years since, holiday films have become standard practice in Hollywood.  Every year, it’s inevitable that a Christmas movie will be released in time for the holidays.  Unfortunately, Christmas movies very rarely achieve classic status.  For every one that audiences grow attached to, there will be about a dozen more that will likely be forgotten by next December.  Evidently, Hollywood’s approach to the holiday season seems less carefully planned out than for any other part of the year.  Their approach seems to be throwing whatever has Christmas in the title up against the wall and seeing what sticks.  Unfortunately, this has made Christmas more synonymous with bad movies than good ones.  Some are well-meaning films that fall short of their goal, like the Vince Vaughn film Fred Claus (2007) or the odd but charming Santa Claus: The Movie (1985).  And then there are ugly, shallow, and distasteful films like Deck the Halls (2006), the Ben Affleck disaster Surviving Christmas (2004), or the deeply disturbing Michael Keaton film Jack Frost (1998), with its creepy-as-hell CG snowman.  And the less said about the horrible 2000 remake of How the Grinch Stole Christmas, the better.  Overall, it is very hard to make an honestly cherished holiday classic in Hollywood, and that’s mainly because the business just tries too hard.  If you look closely, you’ll actually find that a beloved holiday classic may come from the unlikeliest of places.
This was definitely the case with what has become not just one of the best-loved Christmas movies, but one of the best movies, period: Frank Capra’s It’s a Wonderful Life (1946).  Capra’s movie tells the story of George Bailey (a flawless Jimmy Stewart), a man who has given so much back to his hometown and gotten so little in return, reaching the verge of suicide in his despair.  Through the intervention of a guardian angel on Christmas Eve, George is shown what the world would have been like if he had never been born; he rediscovers his value and purpose, and is finally rewarded by those he’s helped all his life.  The film is very uplifting and perfectly illustrates the true impact the Christmas season has on our lives.  With a theme like that, you would think the movie was a smash hit when it was first released, but instead it was a colossal bomb.  It bankrupted the company that made it and derailed Frank Capra’s directing career from then on.  The focus on George Bailey’s deepening depression was probably too hard for audiences to take at the time, given that many soldiers were just returning home after the end of WWII.  Despite its initial failure, It’s a Wonderful Life managed to survive through TV airings, which happened on, naturally, Christmas Eve, and the film not only found its audience but became a seasonal standard.  To this day, It’s a Wonderful Life is still aired on network TV (one of the few classic-era movies that still is), and audiences from every generation still embrace it warmly, no matter how old-fashioned it may be.  Pretty good legacy for a film that started off as a failure.
A holiday classic can come from an unlikely place, like It’s a Wonderful Life, but for many people, what counts as a classic is determined by their own tastes.  That’s why some people consider romantic comedies set around Christmastime to be holiday classics.  Case in point: Love Actually (2003) has grown into a beloved holiday classic, even though the themes in the movie are less about Christmas and more about the intertwining relationships between the characters.  By standing out as a strong romantic film with a Christmas setting, it works as an example of two genres reinforcing each other.  Cult movie fans even have holiday classics that they cherish, like the weird, campy Santa Claus Conquers the Martians (1964), which holds the distinction of being one of the worst movies ever made and incredibly entertaining at the same time.  And some people can even claim that Die Hard (1988) counts as a Christmas movie, because of its holiday setting.  Pretty much, it’s whatever we bring with us from our own experiences to the movies that determines what we consider entertaining.  Just as most people gravitate toward a movie based on their own interests, so too do we see that with holiday films.  Hollywood has in some cases picked up on this and catered to select audiences at Christmastime with genre-specific movies.  Usually, it will take the consensus of a large audience to determine which ones stand out as the undisputed classics.
I think where Hollywood hits its mark most often is when it makes a holiday film that appeals to the memories of our own Christmas experiences.  The film that I think hit a perfect bulls-eye in this regard, and stands as a true masterpiece of Christmas-themed film-making, is the 1983 classic A Christmas Story.  Directed by Bob Clark, and inspired by the semi-autobiographical stories of novelist Jean Shepherd, A Christmas Story perfectly captures the highs and lows of a young boy’s experience during the holiday season.  Ralphie (Peter Billingsley) is a character relatable to any young boy growing up in small-town America, myself included, and watching him try so hard to manipulate his parents into getting him his dream present is something every child will identify with.  Couple that with the hilarious performance of Darren McGavin as the Old Man, plus the iconic Leg Lamp, and you’ve got the very definition of a holiday classic.  But just as A Christmas Story highlights good Christmas memories, we also see classic films that center around a disastrous Christmas experience.  The best example of this would be the very funny and endlessly quotable National Lampoon’s Christmas Vacation (1989).  We’ve had just as many Christmases like the Griswold family’s as we have like the Parker family’s in A Christmas Story, and Christmas Vacation perfectly encapsulates all the bad things that happen at Christmastime without ever losing the holiday spirit underneath.  Not to mention, it’s the last time we ever saw a funny performance out of Chevy Chase.
So, despite the low success rate, we as an audience still seem to find a classic seasonal favorite in every generation.  But how does Hollywood keep making bad Christmas movies every year when the movie-going public keeps rejecting all the junk it puts out?  I think it’s because the season itself is such an overwhelming cultural force that most filmmakers don’t really care about the product they’re making, as long as it’s holiday-themed and ready to capitalize on the mood of the period.  When it comes down to it, a great holiday classic is not determined by how soaked in the holiday spirit it is, but rather by how well its story works.  We keep watching It’s a Wonderful Life every year because of how inspirational George Bailey’s life story is, not because of the Christmastime finale that has come to define it.  In fact, the movie is not really about Christmas at all; it’s about the life of one man and his value to the world.  Other Christmas movies become classics just because of a wintry setting, even where the holiday is not mentioned.  And even films that subvert Christmas traditions, like 2003’s Bad Santa, have become genuine holiday classics to some people.
I, myself, love a good Christmas movie, and because I’m such an ardent appreciator of movies in general, these films have certainly become a part of my holiday tradition.  I return to It’s a Wonderful Life and A Christmas Story every year and never get tired of them.  And not a year goes by when I don’t drop at least one quotable line from Christmas Vacation during the season.  I hope every generation gets its own perennial classic that will last for years to come.  Just please; no more remakes or sequels.  We all saw the backlash that the recent announcement of a sequel to It’s a Wonderful Life received.  I only wish The Grinch and A Christmas Story had been spared the same fate.  Like too much Christmas dinner, there can be too much of a good thing when it comes to Christmas movies.