
Game Over – Why So Many Video Game Movies Fail


Just as going to the cinema has become a cherished pastime across generations, a whole new type of media is beginning to take charge and claim an even larger part of our everyday lives.  For forty years now, we’ve seen the rise of video games, from crude blocks of color on the TV to full-blown blockbuster releases that rival what’s coming out of Hollywood today.  It’s quite an eye-opener when you see the newest Grand Theft Auto title out-grossing every film released in the same year, but that’s what’s happening in our culture today.  Of course, Hollywood has taken notice, and really they’ve been trying to figure out gaming culture since its very inception.  The only problem is that there is no easy way to translate a video game experience to the big screen.  When we watch a movie, we expect the story to guide us to a conclusion, but when playing a video game, we’re the ones who guide the story.  Sure, there are narrative-driven games, but many others are built around the randomness of our own choices, and that’s what makes them unique.  Now that game programming has become as sophisticated as it is, video games are starting to eclipse what film-making is capable of.  With this kind of popularity, it’s only natural that Hollywood would want to capitalize on it.  The problem is that by doing so, they lose some of that unique experience that video games give us.  Not surprisingly, most video game inspired movies have failed over the years, either because they try too hard to be like the original game, try too little, or end up so removed from the original concept that they become something else entirely.

When looking at all the problems that video game movies have, it helps to see where things went wrong at the very beginning.  In the early years of video gaming, the titles coming out were very primitive.  It wasn’t until titles like Donkey Kong came onto the market that you could see any semblance of narrative.  Naturally in these early years, Hollywood became more interested in gaming culture than in the games themselves.  Back in those days, nobody thought that 3D graphics and online play would ever be a reality, so everything was more or less about getting the highest score.  That lack of foresight may be a reason why Hollywood never jumped headlong into video game culture, and as a result, they’ve seen game development become competitive with their own industry.  The 1980s showcased some examples of this exploitation of the culture, as many films featured arcades as popular hangouts.  One film in fact centered entirely on the arcade subculture of the 80s: 1989’s The Wizard, starring The Wonder Years’ Fred Savage.  The Wizard offers an interesting window into how the world perceived video games years ago, but it’s also firmly a product of its time, treating a video game contest no differently than any other overcoming-the-odds narrative of the era; the game itself was irrelevant.  However, there was one film at the time that actually did explore the possibilities of the gaming world: Disney’s Tron (1982).  The film explored the idea that a video game could be a fully interactive world inhabited by simulated people based on our own selves.  In other words, it depicted something like the MMO (massively multiplayer online) games, such as World of Warcraft, that we all know today.  Though limited by what was technically possible in its day, Tron has proved to be a very forward-thinking film, and naturally something that groundbreaking ended up being a failure at the box office.

Hollywood began to take video games seriously at the point when games started to have characters and narratives that people gravitated towards.  The early 90s brought us the beginnings of video game franchises, with the likes of Super Mario Bros. and Sonic the Hedgehog among the biggest names.  And while Hollywood was smart enough to jump on board and bring some of these popular characters to the big screen, they unfortunately didn’t know exactly how to do it.  When you look at the original Super Mario game, you can see how difficult it was to translate.  Basically, it’s a red-suited man jumping into pipes and breaking blocks with his head; not much to draw from to make a 90-minute film.  When 1993’s Super Mario Bros. made it to the big screen, it was instantly slammed by critics and fans of the game alike.  With no clear idea of how to adapt the Mario game accurately, the filmmakers just threw in every weird idea they could think of, making the finished movie an incomprehensible mess.  Instead of the big-headed Goomba minions from the game, we get large-bodied, small-headed Goomba guards who look more creepy than silly.  Not only that, but we also get actor Dennis Hopper looking all sorts of confused while playing King Koopa as a sort of lizard-human hybrid.  The only thing the movie got right was the casting of Bob Hoskins as Mario, who does indeed look and act the part well.  What Super Mario Bros. represented most was a prime example of Hollywood not understanding what a video game was and how to make it work as a film.  This would be symptomatic of most of the 90s video game adaptations, and indeed of most, if not all, adaptations thereafter.

The big problem with video game adaptations today is that it’s impossible to make something interactive feel the same in a non-interactive form.  For the most part, a video game translation ends up feeling like a cut-scene that never ends.  For gamers, the narrative is there to move them from one level to the next, all with the goal of reaching the end and beating the game.  Of course, that doesn’t mean the story is irrelevant.  In fact, most video games have very complex and involving story-lines, particularly those that have come out in the last decade or so.  The reason video game movie narratives suffer is the limited run-time.  Movies are allowed on average about two hours to tell their entire story, so if you take a video game story that takes ten times that long to unravel and condense it into a film narrative, you’re going to lose quite a bit in the process.  As a result, most video game adaptations lack character development and spend way too much time setting up their world.  A prime example of this would be Final Fantasy: The Spirits Within (2001).  The Final Fantasy series is heralded by both gamers and casual fans for its colorful characters and its complex story-lines.  But the reason these games are so complex is that they take many hours to complete; sometimes 50 hours or more.  If you take 50-plus hours of development and try to condense that same kind of complexity into a two-hour film, you’re going to get something that’s crushed tighter than bedrock.  That’s the fatal flaw of The Spirits Within, a film so concerned with its world’s complexity (and with being one of the first movies ever to utilize photo-realistic CG human characters) that it leaves everything else by the wayside, making the whole film feel very hollow, particularly with the characters.  While many of these films nobly try to translate beloved story-lines to the big screen, there’s just no possible way to contain it all.

Another problem is that many video game movies try too hard to be just like the titles they’re trying to adapt.  This is usually evident in the characterizations.  Video games can sometimes get away with generic, archetypal characters, because by playing the game, we infuse ourselves into the story, so a blank-slate protagonist is usually a good thing.  When it comes to the movies, however, a lack of character can pretty much sink a film.  This is especially painful when characters that people love in the games are translated so poorly by under-qualified actors.  The Mortal Kombat films in particular give us the right look of the characters with no other depth beyond that.  That’s the unfortunate result of adapting something as simple as a fighting game.  There’s little character development to begin with, so if you take that directly to the big screen and do nothing to build upon it, you’re going to get very bland characters.  The same can be said for pretty much every other video game movie out there.  Another way that an adaptation can try too hard is in capturing the look of the video game.  This is especially true of more recent video game movies.  With the advances in CGI over the years, simulated reality is becoming ever more convincing, and the line between video game graphics and cinematic graphics is growing thinner and thinner.  Because of this, the limitations of film-making again come into play.  A video game lets the visuals unfold organically and lets the player examine it all at their own leisure.  A movie has to cut around and limit what the viewer sees.  That’s why video game movies that try to look just like their source material, like 2005’s Doom and 2006’s Silent Hill, suffer; the interactive element is removed, making the viewer feel less involved.

Not that every video game adaptation has been a complete failure.  There have been exceptions over the years that have managed to make a dent at the box office, even if it’s a minor accomplishment.  Paul W.S. Anderson’s Resident Evil series has been going strong for five films now; some of which people say are actually better than the more recent games in the waning series.  Also successful were the Tomb Raider adaptations, both of which star Angelina Jolie as the popular archaeological adventurer, Lara Croft.  What I actually find interesting about the film Lara Croft: Tomb Raider (2001) is that it represents how to do a video game movie right.  While no masterpiece, the film is nevertheless competently made, finding a way to make Lara Croft work as a character on the big screen in a stand-alone film.  The movie doesn’t try to recreate the video game experience (how could it, with those PlayStation 1 graphics?); it merely translates the character’s personality into a narrative that can be told cinematically.  It doesn’t try to put the cart before the horse like so many other video game movies do, and lets the character be the star rather than the world she lives in.  Naturally, because of this, Lara Croft: Tomb Raider is one of the few films based on a video game that has been profitable.  Of course, it helps that the world it’s adapting is not particularly complex.  Sometimes a modest title is the best kind of game to adapt, which is probably why shoot-em-ups are popular adaptations today, like 2007’s Hitman, or a racing game turned movie like Need for Speed (2014).

Of course, sometimes the opposite is true, when a filmmaker or studio takes something that could translate perfectly to the big screen and uses completely the wrong approach.  That was the case with the Disney-produced Prince of Persia: The Sands of Time (2010).  Prince of Persia was already a very cinematic game before this movie came out, so Disney should’ve had an easy time translating it to film.  Unfortunately, with a miscast lead (Jake Gyllenhaal), an unfocused director (Mike Newell), and an out-of-control production budget (est. $200 million), the movie failed on every level and sullied not just the reputation of its creators, but the Prince of Persia brand overall.  No game in the series has been released since, and probably never will be, at least until the memory of the movie has faded and demand for the game returns.  At the same time, there are also people out there who only adapt video games as a way of exploiting them.  Sometimes it’s as harmless as an up-and-comer who wants to showcase what they can do by taking a little-known video game title and putting their own spin on it.  And then you have someone like Uwe Boll, whose whole career has been defined by cheap adaptations of video games like BloodRayne (2005) and Alone in the Dark (2005).  The only reason he adapts video games, it turns out, is a tax loophole in his native Germany that lets him make more money off failed adaptations of licensed games.  So, not only is he getting rich off of bad movies, but he’s also trashing games that people have loved for many years, making it the worst kind of exploitation.  The wrong approach usually ends up being worse than a confused or bland adaptation, because in the end, it’s the original games that suffer, and the legacy that they carry.

So, even with all the failures that have come in the past, will there ever be a video game movie that actually becomes a huge hit?  I would like to see it happen, but it probably never will, because there are just too many fundamental differences standing in the way.  For one thing, a movie can never capture the interactive experience that a video game presents.  And with video games growing ever more visually complex, it’s clear that Hollywood film-making is starting to face some tough competition.  But, to the industry’s credit, they have found a way to embrace video game culture over the years and make it a part of itself.  Many studios have their own software development departments, and it’s very common to see tie-in video games released alongside major Hollywood releases.  Even so, Hollywood hasn’t given up on trying to make a big-screen translation of a video game work.  Adaptations of Assassin’s Creed, World of Warcraft, and even Angry Birds are in the works, though I highly doubt any of these will feel exactly like the original games.  In reality, I think Hollywood is better off looking at what the games mean to us, rather than taking a literal translation approach.  I strongly recommend films like Tron and Wreck-It Ralph (2012), both really fun movies that look at video games as a lived-in world rather than as a form of diversion.  Also, I strongly recommend the documentary The King of Kong: A Fistful of Quarters (2007), which brilliantly presents the impact video games have on our culture.  One day, Hollywood will figure out the formula and deliver a great video game adaptation.  I can tell you this; I’ve been waiting my whole life for that Legend of Zelda movie, and that wait is still going strong.

 

The Gospel According to Mel – “The Passion” Ten Years Later and Bringing Scripture to the Big Screen


Often we see a renowned filmmaker or movie star step off the pedestal that the entertainment business has set them upon in order to make something that is not only risky, but could also jeopardize all the goodwill they have earned in their career.  I put together a top ten list of these kinds of “passion projects” before, but one that has certainly left an impact over the last decade, on both the industry and its creator, is Mel Gibson’s The Passion of the Christ (2004).  This year marks the 10th anniversary of this controversial film, which may be a milestone worth celebrating to some and a dark chapter others would rather forget.  No matter your opinion of the movie, you cannot deny that it is one of the most monumental films of the new century, and its legacy will probably be felt for a long time to come.  But, for the most part, that legacy centers more on the controversy surrounding it and less on how it stands as cinematic art.  No doubt Mel Gibson himself has been unable to shake the legacy of this film and all the divisiveness surrounding it; for better or worse, it will be the movie that defines his career in Hollywood.  Looking at the ten years since The Passion’s debut, we have learned a lot about how difficult it is to bring holy texts to the big screen.  Did Mel Gibson’s film prove that biblical stories can indeed work in movie form, or did it show that it’s better to keep religion out of entertainment?
In order to understand why Mel Gibson would risk his reputation over a single movie, you have to understand the conditions that led up to its production.  Long before The Passion, Mel tried to segue his acting career into directing, starting off with 1993’s The Man Without a Face.  This was a modest production that earned Mr. Gibson some good praise, but considering that Mel was mentored early in his career by visionary and ambitious Australian directors like George Miller (The Road Warrior) and Peter Weir (Gallipoli), he had something much more epic in mind.  Naturally, his follow-up was the groundbreaking Braveheart (1995), which earned Gibson Oscars for both Best Picture and Best Director.  After Braveheart, Mel returned to acting regularly until the early 2000s, when he decided to bring a story near and dear to his heart to the big screen: the story of Christ’s crucifixion.  Raised in an ultra-traditionalist Catholic household, it was no surprise that Mel would look to scripture for inspiration, and while nobody doubted that he could pull it off cinematically, concerns soon arose about whether or not he should.  It wasn’t until the script was made public that the controversy around the film started, as people interpreted it as anti-Semitic.  Mel’s project was dropped by all interested parties as a result, and he ended up funding it with his own money.  The movie eventually made it to theaters, and despite all the controversy, or perhaps because of it, The Passion of the Christ became a box office phenomenon, earning $83 million on opening weekend and $370 million overall.
Despite what Mel intended for the film, its aftermath took on a life of its own.  It became a focal point in what many people call the “culture war” in America, which in turn politicized the whole controversy surrounding the film.  The “culture war” is basically a term created by news media to frame political arguments related to pop culture and to show a cultural divide between the left and the right in America, whether there is one or not.  Given that The Passion was released in 2004, an election year, the movie became a rallying point for both political camps, with Christian conservatives seeing it as a powerful affirmation of their beliefs, while liberals were almost universally opposed to it, calling it religious propaganda.  There were people who did break ranks from ideology and judged the film on its own merits; Christianity Today, a faith-based publication, was sharply critical of the movie when it premiered, while left-wing film critics Roger Ebert and Richard Roeper both gave the movie two thumbs up and stood by their reviews many years later.  Nevertheless, reactions to The Passion divided America, probably more so than they should have.  It became a political tool, which I believe is something that Mel never wanted it to be.  Though Mr. Gibson leans to the right politically, he’s never exactly been a dyed-in-the-wool Republican icon; for the most part, he’s been sharply critical of all political parties his whole career.  The movie becoming a lightning rod for this so-called “culture war” is probably the legacy that Mr. Gibson wishes the film had avoided.
But, regardless of intent, Mel Gibson had to have known that the movie was going to upset people no matter what.  This is the risk that comes with adapting scripture to film.  There will always be skeptics who dismiss biblical stories as nonsense, as well as others who take every word as, well, gospel.  Naturally, if you make an earnest attempt at bringing scripture to the big screen, it will be scrutinized, especially if it strays from expectations.  You see this in other modern attempts at adapting stories from the Bible.  Martin Scorsese’s The Last Temptation of Christ (1988) was sharply criticized by people of faith for its depiction of a “what if” scenario in which Jesus chose life instead of sacrifice.  In the movie, Christ still dies for mankind’s sins like he does in the Bible, but Scorsese lets the film explore how Jesus might have struggled with that choice.  Opening up that dialogue proved to be too much for traditionalist Christians, who condemned the movie as blasphemous.  A similar controversy is brewing right now over Darren Aronofsky’s Noah (2014), with Christians once again attacking a film over its revisions.  But despite all of the controversies, I believe that these films have more in common than people think.  Again, I believe that it’s all the nonsense about a “culture war” that has shaped the divided responses to these movies.  Overall, they each represent an expression of faith on the part of their respective filmmakers, and each shows how the cinematic medium can find stories that are interesting and complex in a source as widely familiar as the Bible.
You may be wondering what I actually think of Mel Gibson’s The Passion, especially looking back on it now after more than ten years.  To put it simply, it’s an easier film for me to respect than to admire.  I do think that it is a triumph of film-making, showing Mel Gibson’s unparalleled talent as a director.  I am amazed that the movie was self-funded and completed on just a $30 million budget.  It was released around the same time as big-budget epics like 2004’s Troy and Alexander, and yet it feels more authentic to its time period than those two ever did, even with their far larger budgets.  The film is also gorgeously crafted, shot by one of the world’s greatest cinematographers, Caleb Deschanel.  Actor Jim Caviezel shines in the role of Jesus, bringing new meaning to the phrase “suffering for his art.”  Where the film is at fault, though, is in its story.  I know it’s odd to critique the “greatest story ever told,” but my problem has more to do with Mel’s interpretation.  Like Mr. Gibson, I was raised Catholic (albeit in a less traditionalist church), so I know all the important points of the story by heart.  Where the movie loses me is in how it’s all focused.  Mel just lets the events of Christ’s crucifixion play out without grounding them in a narrative.  The story pretty much just goes through the paces, indulging more in the grim details than explaining exactly why they are happening.  This leads to a lack of character development that sadly makes most of the supporting players feel less interesting.  The only standouts in terms of character are Caviezel’s Jesus, actress Maia Morgenstern’s outstanding portrayal of the Virgin Mary, and a chilling interpretation of Satan by Italian actress Rosalinda Celentano, who taunts Christ by taking the form of a mother figure.
I do remember seeing the movie with family back when it first premiered, as well as the hours-long conversation we had about it afterwards.  While we were moved by the movie, I don’t think it had any kind of effect on our religious beliefs.  To be honest, I’ve moved further away from the Catholic church in the years since, but not as a result of this movie.  I still respect the risk Mel took to make it, and I’m glad the movie exists.  As for the anti-Semitic undertones that people claim the movie promotes, I have a hard time seeing them.  Sure, there are people who see the depictions of the Hebrew high priests in the movie as problematic, but to me the priests depicted in the film are so far removed from modern-day Jews that I don’t see the two as even remotely comparable.  Not only that, but the movie goes out of its way to portray the Roman guards as the true villains in Christ’s story.  If there is any criticism leveled against the film that has merit, it’s in the way Gibson indulges in the suffering of Jesus in his final hours.  The movie shows you every cut, gouge, and impaling inflicted on Jesus during his execution, and that suffering is the focus of the entire movie.  It could be argued that Mel is obsessed with portraying suffering and torture on film in gruesome detail, much as he did with the ending of Braveheart, and that this misses the point of Christ’s teachings in the first place.  While I don’t think Mel intentionally misinterpreted Biblical passages in order to indulge his own cinematic passions, the film nevertheless is defined more by its gruesome elements than by its uplifting message.
In the ten years since, people have been trying to interpret exactly what was meant by Mel Gibson’s film and what it means for the future of scriptural film-making.  Unfortunately, Mel’s personal problems have clouded the reputation of the film, and his drunken rants have given weight to the claims of antisemitism.  Because of the sharply divided responses brought on by the ongoing “culture war,” faith-based films have once again been marginalized into a niche market, choosing to preach to the faithful rather than appeal to all audiences.  The recent success of the Christian film God’s Not Dead (2014) is something I see as a negative result of the “culture war” division, because it portrays a straw-man argument that all Christians are morally right and that atheists are using education to corrupt people.  The same argument can be made about the other side, when Hollywood adapted The Golden Compass (2007) to the big screen, itself an atheistic fantasy story that portrayed religion as an evil force.  Religious films are best when they don’t insult the intelligence of the viewer and actually challenge their beliefs, no matter what their faith is.  Back in the Golden Age of cinema, Hollywood found a way to make movies that faithfully adapted scripture while still maintaining a sense of entertainment.  Cecil B. DeMille’s The Ten Commandments (1956) has stood the test of time because people of all faiths enjoy the spectacle that DeMille put into his production, while William Wyler’s Ben-Hur (1959) is still beloved because of its universal story of adversity against hatred.  As these films have shown, Biblical stories can work in cinema if one knows how to reach the audience correctly.
So, while Mel Gibson’s The Passion of the Christ may have taken on a life of its own beyond what the filmmaker intended, it nevertheless remains one of the most monumental films in recent memory.  Bring this movie up in conversation and even 10 years later it will still stir up passionate feelings in people.  While Mel has his own moral issues to deal with, I don’t believe that he created this movie out of a need to condemn, but rather to explore his own feelings about his faith.  I think he felt there was a lack of worthwhile religious-themed films out there, and he sought to fill that gap in some way.  I think the movie stands up over time, especially compared to the lackluster, church-funded movies that have come in its wake.  It’s not the best faith-based movie I’ve seen, and certainly not one of Mel Gibson’s best either; I still look at Braveheart as his masterpiece, and his Passion follow-up Apocalypto (2006) is an underrated gem.  Even so, the best legacy this film could have is that it sparked a renewed interest in making unique and personal Biblical films once again, something cinema has been severely lacking.  It took a while, but Aronofsky’s Noah seems to be the first film since The Passion to actually make good on that promise, though of course time will tell if it lasts.  As for The Passion of the Christ, as flawed as it may be, it nevertheless changed the way Biblical movies are seen in our modern culture and showed that taking a big risk has its rewards in Hollywood; a legacy that I think serves the movie well over time.

Box Office Duels – Hollywood’s Reliance on Copycat Movies


If you watch a lot of movies like I do, you’ll know that original concepts and ideas in blockbuster movies are few and far between.  And it’s easy to see why; Hollywood prefers to play things safe and cater to the same crowds over and over again.  This isn’t necessarily a bad thing.  After all, given how much money the studios pour into their big tent-pole productions, you can understand why they would prefer not to step out of line, in order to make back most of their investment.  But when you try too many times to repeat the same kind of business, the end products lack any definition of their own and look more transparently like a cash-in.  Sticking close to formula can only last as long as the end product stays fresh.  Sometimes, filmmakers even run the risk of unfortunate timing, as their movie ideas are copied by another company before they are even able to get production up and running.  These are known as copycat films, and sometimes their reputations become defined solely by how they perform against their like-minded counterparts.  While it is amusing to see how unoriginal some movies can be, it’s still apparent that the trend of mimicking other people’s movies is, and will always be, a part of Hollywood’s legacy.
So how do we know when a movie should be labeled a “copycat”?  It basically comes down to when we recognize that a movie exists only because of the presence of a near-identical film.  The movie doesn’t need to be exactly the same, but it should have all the same basic elements.  This could mean that it has the same plot structure with nearly identical characters; it could have the same visual style; or it could be depicting the same kinds of events, only from a different angle.  What is most interesting, however, is that sometimes these identical movies are released within months, or even days, of each other by competing studios.  This is what is commonly known in the industry as dueling, where the studios purposefully put their competing movies in theaters at the same time in order to see who will get the bigger numbers, purely for bragging rights.  This is also a point of contention between filmmakers and studio heads, because usually the people who make the movies don’t see their work as a competition.  The other area where you see a lot of copycat film-making is in the aftermath of a standout movie’s huge box office success, and all the wannabe movies that come out in its wake.  These are the “knock-off” movies, and like most knock-offs, they tend to be of lower quality.  But sometimes it’s the juxtaposition of each of these movies with its counterpart that actually makes them interesting to us.
Dueling movies are interesting because of how we judge them based on their likeness to another film.  It pretty much comes down to the “who did it better” argument, given how they are usually around the same level of quality.  The more cliched the genre, the more likely you’ll find a pair of dueling films in it.  Action movies are usually the resting ground for most of these kinds of flicks, and many times you’ll have action movies so alike that they are confused for one another and, as a result, end up losing their individuality.  Case in point, last year’s dueling pair of movies centered around attacks on the White House: the Antoine Fuqua-directed Olympus Has Fallen and director Roland Emmerich’s White House Down.  Both movies follow the exact same premise and were coincidentally released only months apart.  Was it the studio system’s way of testing out the “White House attack” sub-genre on two fronts, or were the studios just trying to jump on a trend before their competitors could get there?  My guess is that, like most dueling movies, one film got the greenlight shortly after the other did, because one studio had the script already archived and saw the opportunity to put it into production after seeing the other studio take the plunge.  Essentially both were “Die Hard at the White House” story-lines and were safe bets for both studios as genre pictures.  And it’s not the only time Hollywood has seen this happen.  Back in the 90s, we saw the battle of the volcano movies, with Dante’s Peak (1997) and Volcano (1997) released together, as well as the summer of “destruction from above” movies, Deep Impact (1998) and Armageddon (1998).
While most of these “dueling” movies tend to come from loud and dumb action genres, that doesn’t mean all copycat movies are necessarily sub-par.  There are actually instances where two dueling movies are both high-quality films.  Case in point, the fall of 2006, when audiences were treated to two psychological period dramas centered around magicians: Neil Burger’s The Illusionist and Christopher Nolan’s The Prestige.  It’s unusual to see this kind of subject matter spawn two very similar yet very distinct films at the same time, but both movies have managed to stand out even after crossing paths at the box office.  I happen to like both films, and it’s unfortunate that their histories are always going to be tied together because of their close release window, but it does show that two movies can duel it out at the same time and both be considered winners in the end.  Animation is another field of film-making where you’ll see studios purposefully trying to undermine each other’s fresh ideas, yet still with genuinely good products.  In 1998, we saw the release of not one but two computer-animated movies centered around bug-based societies: DreamWorks’ Antz and Pixar’s A Bug’s Life.  Both films are admirable productions, and pretty much equal in entertainment value, but DreamWorks wanted to be first out of the gate.  So, they sped up production in order to beat Pixar to the finish line; a decision that may have undermined the film’s potential for success in the end.  Pixar’s early edge may be attributed to the fact that DreamWorks was trying too hard to compete in those days, which also became a problem when the dismal Shark Tale (2004) followed up Pixar’s Oscar-winning Finding Nemo (2003).
Apart from the dueling movies that we see from time to time, the much more common type of copycat film is the one that follows trends in the market.  These are the “knock-off” movies I mentioned earlier, and their sole reason for existing is to capitalize on the enormous success of a big movie that came before.  Of course, after the monumental success of Titanic (1997), we got Michael Bay’s insultingly cliched Pearl Harbor (2001); and the Oscar glory heaped onto Ridley Scott’s Gladiator (2000) led to the expensive busts that were Wolfgang Petersen’s Troy and Oliver Stone’s Alexander (both 2004).  More often than not, this is where you’ll find most of the copycat movies that have failed.  Perhaps the trend that has led to the most failed knock-offs in cinema is the fantasy genre.  A decade ago, we saw the enormous success of both The Lord of the Rings and the Harry Potter franchises begin, which led many other studios to believe that they could pick up any random fantasy source material out there and have a surefire hit on their hands.  Unfortunately, not every one of these book series has the same kind of fan-base that Tolkien and Rowling have earned over the years.  Over the last decade we’ve seen many one-and-done franchises fizzle at the box office, like 2007’s The Golden Compass, 2007’s The Seeker: The Dark is Rising, and 2008’s The Spiderwick Chronicles.  The Narnia and Percy Jackson series managed to survive to make more than one film, but even they failed to live up to their lofty ambitions.
There is, however, a trend that does seem to be working well in Hollywood right now, and has continued to be profitable despite the fact that most of these movies are just copying each other’s formulas: the young adult novel adaptations.  More specifically, the movies that have followed in the wake of author Stephenie Meyer’s Twilight series and author Suzanne Collins’ The Hunger Games series.  These two franchises have become huge cash cows for their respective studios, and are currently defining the trend that we see today.  While Twilight is far from perfect as a movie, there’s no doubt that it has left an impact on Hollywood in recent years, and you can blame the current trend of “sexy monster movies” directly on it.  Honestly, would a zombie love story (2013’s Warm Bodies) ever have existed had Twilight‘s vampire-werewolf love triangle not hit its mark with teenage audiences first?  Even bigger is the Hunger Games impact.  Now, post-apocalyptic stories centered around adolescents are in vogue in Hollywood, with adaptations of Orson Scott Card’s Ender’s Game (2013) and Veronica Roth’s Divergent (2014) getting the big-screen treatment.  While these movies may not rise to the same levels as their predecessors, they are nevertheless finding their audiences, and it’s proving to Hollywood that this is still fertile ground to explore.  We are likely to see many more Twilight and Hunger Games knock-offs in the years to come, given that YA adaptations are the hot trend of the moment, but that’s only because audiences are less concerned with the quality of the adaptations themselves than with how well these movies deliver on the entertainment side of things.
Over the last decade, there has actually been an entire industry of film-making devoted not only to copying movies, but to blatantly ripping them off.  This has become known as the Mockbuster industry.  More often than not, these are cheap, direct-to-video copycats of current blockbusters that are sometimes released on the same premiere dates.  Usually, it’s the hope of these Mockbuster producers that uninformed consumers will be tricked when they see the “knock-off” on a shelf in the video store and think that it’s the same thing as the bigger movie currently playing in a nearby theater.  Mockbusters of course are nowhere near the level of quality of a big-budget film, and are usually defined by shoddy production values, D-list acting, and laughably bad special effects.  One of the companies that has made its name providing these kinds of films to the market is The Asylum, whose library consists of many notable “knock-offs” like The Da Vinci Treasure, Snakes on a Train, Atlantic Rim, Abraham Lincoln vs. Zombies, American Warships, and of course Transmorphers.  Now, while many can criticize The Asylum for ripping off other movies for a quick cash grab, they’ve actually been pretty upfront about their intentions and make no bones about what they do.  They are even finding an audience who enjoy their laughable, low-quality productions as a goof.  In fact, The Asylum hit it big last year with the surprise hit Sharknado, which premiered to a lot of fanfare on the Syfy channel.  It just goes to show that even Mockbuster film-making can find its place in the world.
But is the trend of copycat film-making just another sign that Hollywood is out of ideas?  It all depends on whether or not the movies still work as entertainment in the end.  It is kind of fun to contrast two like-minded movies, especially when they are almost indistinguishable from each other.  I think you could create a very effective drinking game out of spotting all the cliches that a pair of dueling movies have in common; especially with films like Olympus Has Fallen and White House Down, which I swear are nearly identical in everything but tone.  And a Mockbuster can be entertaining for a laugh if you’re in the right state of mind.  The only time copycat film-making becomes problematic is when there’s no passion behind it, when it merely exists to piggy-back off the success of a much better film.  That’s something you see in a lot of the failed franchises of the last decade.  In the end, it’s okay to show a little familiarity in your movie, just as long as you make the most of it.  Even Harry Potter and Lord of the Rings had their inspirations before them, and let’s not forget how many adventures have followed the “hero’s journey” template to the letter on the big screen over the years.  Audiences are smart enough to see when a movie’s story-line feels too familiar to them, and that’s usually what separates the copycat movies that stay with us from the ones that don’t.

Holy Grails – The Noble Search for Cinema’s Lost Treasures


One of the best things to happen to cinema over the last few years has been the emergence of digital archiving.  Sure, it is sad to see classic film stock disappearing as the norm, but there is a reason why movies are better suited for the digital realm.  If you have a digital backup of your film, you are better able to transfer it, download it, and make multiple duplications without ever losing video or sound quality.  When a movie exists as a digital file, it is set in stone visually and aurally, as long as it is never erased.  This has become a great benefit for the people out there who consider film restoration a passionate endeavor in life.  For years, film restorers have had to contend with the forces of time undoing all their hard work as they try to keep some of our most beloved films looking pristine.  Now, with digital tools at their disposal, preservationists can undo years of wear and tear on most old films and make them look even better than when they were first released.  The advent of DVD and Blu-ray has given more studios a reason to go into their archives and dust off some of their long-forgotten classics, and because of this, restoration has become not only a noble cause for the sake of film art, but also a necessity.  While there’s no trouble finding most movies in any studio archive, there are a few gems whose whereabouts have eluded archivists for years, and these are known to film historians as the “Holy Grails” of cinema.
It’s hard to believe that there was once a time when film prints were considered disposable.  Back when the studio system was first starting up, it was commonplace for production companies to dispose of their used film stock once a film was no longer in rotation at the movie theaters.  This was done either to make room for new releases or to prevent accidents from happening at the studio.  The reason film prints were considered dangerous to store in a warehouse back in the 20s and 30s was that they were made from cellulose nitrate, a highly unstable and flammable compound.  Several film vault fires have happened over the years because of nitrate film spontaneously combusting, including a 1965 incident at the MGM Studios in Culver City, CA.  Incidents like this, as well as the careless disposal of early films, are the reason why the vast majority of films from the silent era are lost to us today, according to Martin Scorsese’s Film Foundation.  It wasn’t until the mid-30s that filmmakers like Charlie Chaplin and Cecil B. DeMille started to actively preserve older movies, and their efforts have helped keep many of these classics alive.  One thing that helped was the fact that both Chaplin and DeMille had ownership over their work, so they could keep the original negatives preserved in their own collections, safe from studio hands.  Also, by keeping their films in good condition and preserved well enough to be screened over and over again, they helped convince the studios that it was worthwhile to do the same.
Even with better efforts to keep films archived and in good condition, older film stock still wears out over time, and with many prints made of very volatile materials, many have simply rotted away to ash in the vaults.  That is why many archivists have fully embraced the digital revolution; it has enabled them to preserve many of these disappearing classics for posterity in a definitive way.  But, before a film can be preserved, the damage must be undone, and again digital tools are what save these movies in the end.  There is a whole class of digital artists out there whose job is to scan older films from the best sources available and touch up the scratches and marks on every single frame.  Now that High Definition has become the norm in home entertainment, the results of film restorations are held to much higher scrutiny, and that has led many studios to take better care of their whole catalog of flicks, which is nothing but a good thing for cinema as a whole.  The fact that classic films like The Wizard of Oz (1939) and Casablanca (1942) look so good after so many years is a testament to the great efforts made by restorers over the years.  It would be unthinkable to see these kinds of films all scratched up and with faded coloring, which is why restoration has to be an essential part of the studio business.
But, while beloved classics benefit from better care, some films have not been so lucky.  Early cinematic history is unfortunately a lost age for many film historians, because so much of it is gone.  We only know that many of these movies existed because of documentation from their filmmakers, or from a piece of advertising uncovered in an archive or private collection.  Sometimes a trailer has popped up for a movie that no longer exists as a whole, like the early “lost” Frank Capra film Say It with Sables (1928).  A few titles have risen above the rest as films that are clearly calling out to be rediscovered and preserved.  These are the “Holy Grail” films, and some of them have become famous merely because of their elusiveness.  Like Indiana Jones searching for the Lost Ark, film preservationists have searched the world over for any evidence of the existence of these “Holy Grail” pieces of cinema.  Part of the allure of these films is that they have remained unseen by the public for many years, and in some cases have never been seen at all; yet just one tantalizing glance at a press photo or a storyboard proving their existence is enough to send film nuts on a mad search.
Probably the most famous example of a lost and found “Holy Grail” film is Fritz Lang’s groundbreaking classic Metropolis (1927).  Lang’s film was made at the height of silent film-making and is considered to be the era’s crowning achievement.  Made in Germany before the rise of Hitler, Metropolis was the most expensive film of its time, and showed the world that European cinema was on par with the film industry emerging in Hollywood.  However, when the movie made its debut in America, it was subjected to heavy cuts, due in part to its pro-socialist themes, taking the run-time down from 145 minutes to just under two hours.  Most of the film’s early prints, as well as the original negatives, were later destroyed, making a full restoration seemingly impossible.  For years, the shorter cut of Metropolis was all that audiences could see, and while the film did regain its reputation as a cinematic classic, it remained an incomplete vision.  Film preservationists had to fill in the missing gaps with title cards explaining what was lost, but while a Blu-ray release was being prepped in 2008, something miraculous happened.  A print of the original uncut version of the movie was found in a film archive in Argentina.  The Friedrich Wilhelm Murnau Foundation in Germany, which oversees the film, quickly picked up the find and made its best effort to reincorporate the lost scenes.  Even though the restoration couldn’t make the new scenes look as beautiful as the rest of the movie, due to the damage on the film stock, we are now fortunate to have a nearly complete version of this monumental film.
The saga behind the rediscovery of Metropolis’ uncut version gives many people hope that these “Holy Grail” movies can someday be found, and the odds of that happening improve all the time.  There is a more concerted effort to find lost treasures tucked away in film vaults across the world, and while some “Holy Grails” have remained elusive, the film restorers’ labors are still bearing fruit.  Many of these finds have emerged from private collections and some unlikely places.  Sometimes it’s thanks to a very forward-thinking film technician or vault librarian who saved these treasures from early destruction, sometimes without even knowing it.  A 1911 short called Their First Misunderstanding, the first film to feature legendary actress Mary Pickford in a credited role, was discovered in a New Hampshire barn in 2006.  Even a simple mislabeling has been the cause of some of these classics being lost.  The first ever Best Picture winner at the Oscars, 1927’s Wings, was considered gone forever due to negligent care of the original nitrate negative at the Paramount studio vault.  But the film was rediscovered in the Cinémathèque Française archive in Paris, found almost by accident while the archivists were going through their back stock, and it was quickly given a more permanent and secure place in the Paramount vault.
Sometimes, as with Metropolis, it’s not a whole film that gets lost, but rather fragments that are removed and later discarded against the wishes of the filmmaker.  These are not what we commonly know as deleted scenes, which inevitably have to be trimmed by the editor to make a movie work more effectively.  What I’m talking about are pieces of the movie removed even after the film’s first premiere, leaving big chunks of the finished film out of the public eye for whatever reason.  Sometimes these cuts were made because of censorship, over the protest of the filmmakers.  Or they were trimmed to meet time constraints.  Back in the late 50s and early 60s, there was a trend for big Hollywood pictures to be shown as Roadshow presentations; these were special events complete with printed programs, musical overtures played while the audience took their seats, and a special intermission at the halfway point of the movie.  These programs often ran over three hours, so when these Roadshow movies had to make it to less grand theaters across the country, the whole show had to be trimmed, including removing scenes from the actual movie.  Recently, restorers have tried to reassemble these old Roadshow versions, and while many have been found intact, like Lawrence of Arabia (1962) and Spartacus (1960), a few have still yet to be fully restored.  Movies like George Cukor’s A Star is Born (1954) and Stanley Kramer’s It’s a Mad, Mad, Mad, Mad World (1963) have been given partial restorations that do their best to make these films feel complete again with the best elements available.
Sometimes, there are films that remain lost merely because they’re being withheld by a particular artist or by the production company that made them.  This is usually because the films are an embarrassing black mark on the person’s or studio’s reputation, and they would prefer that they remain unseen.  But the downside of withholding a known property is that it will inevitably raise people’s curiosity about these films, which in turn will put pressure on the filmmakers to make them available again.  The most notorious example of this would be the 1946 Disney film Song of the South, which the Disney company refuses to release to the public, due to fears that it will spark controversy over its racial themes.  It’s not necessarily a “Holy Grail” film, given that it was available to the public for many decades and can still be seen by anyone who can secure a bootleg copy from Asia, but we’ve still yet to see a fully restored version made by the Disney company.  One withheld film that surely would be considered a “Holy Grail” is Jerry Lewis’ notorious The Day the Clown Cried, which has been seen by only a small handful of people in Mr. Lewis’ inner circle.  Supposedly because of the Holocaust setting and Mr. Lewis’ less than reverent depiction of the tragedy, the film has been kept hidden from the public, probably to spare Jerry from the controversy that could arise from it.  Still, rare behind-the-scenes footage did emerge last year, which has raised people’s curiosity about it once again.  We may someday get a true glance at both movies, but that choice still rests with the ones who originally made them.
What I do love is the fact that film restoration is no longer looked at as just a noble cause, but rather as an essential part of cinema as a whole.  With data back-ups as common as they are now, we are far less likely to see catastrophic losses of film like we did before digital tools were made available to us.  Today we can securely preserve the works of our present as well as restore the classics of our past.  And the search for the most intriguing “Holy Grails” of cinema will undoubtedly continue to inspire both archivists and treasure hunters for years to come.  Now that we’ve seen Metropolis become complete, the focus shifts to the next big find: the lost Lon Chaney thriller London After Midnight (1927), the most notable victim of the MGM vault fire; the lost director’s cut of Orson Welles’ The Magnificent Ambersons (1942); or the full 7 1/2 hour version of Erich von Stroheim’s legendary silent epic, Greed (1924).  Some of these films may sadly be lost forever, but the hope always remains.  The great thing about these searches is that they demonstrate the importance of preserving our cinematic legacy.  Martin Scorsese illustrated this idea beautifully in his 2011 film Hugo, in which a young boy helps rediscover a long-forgotten filmmaker whose legacy has all but disappeared due to the destruction of his original film prints.  Thanks to passionate film preservationists like Mr. Scorsese and the people who work in film foundations and archives around the world, our cinematic legacy is no longer disappearing, but is instead coming back to life.

http://www.filmpreservation.org/

True Romance – The Problems with Modern Hollywood Love Stories


With Valentine’s Day just around the corner, we commonly see Hollywood try to capitalize on the romantic mood of this time of year.  Of all the genres in film-making, the one that seems to have stayed strong year after year is the Romance genre, which benefits from a very specific audience that usually makes up a good percentage of the film-going public; namely, people looking for something to watch on a date.  But what I find interesting about this year is that there has been a significant reduction in the number of romantic movies in theaters.  In fact, this Valentine’s Day has only one wide release that you could consider a traditional romantic movie: the Colin Farrell-headlined Winter’s Tale, which has to compete in its opening weekend with RoboCop.  How’s that for date night counter-programming?  The foreseeable future also looks barren for the romance genre, with not a single wide release until April 25’s The Other Woman, and that one might be considered more of a slapstick comedy.  I don’t know if this is just a fluke in the schedule, or a sign that the romance genre has suffered a backlash due to a recent string of notable failures.  I can see how the latter could be true.  Some truly horrendous movies have come from the genre recently, and I see that as a result of the genre’s current troubles with tone, character development, and just an overall lack of definition.
I should state that I have a little bias when it comes to talking about genre pictures like romantic movies.  Romance is a genre that I generally don’t understand and usually try to avoid, not because of themes or content, but because I rarely get any entertainment value out of watching characters fall in love for an entire movie.  I do, however, acknowledge that there are films that work well in the genre and can be quite uplifting.  I just gravitate more towards action-oriented genres, although romantic subplots are found in the movies I watch as well.  Some romantic plots in action movies can be even more memorable than the ones that come from the romance genre itself.  What I mean to say is that I do like a good love story; it just all depends on the movie.  But when a movie is clearly boxed in by the genre restraints put on it, I inevitably end up judging a book by its cover, and in most cases, I’ll probably end up being right.  Romantic films, probably more than any other genre, suffer from too little diversification.  There is a specific audience that goes to these types of movies, and the studios make every effort they can to meet those expectations.  But the fact is, there are fewer fresh ideas coming out of this genre, and the studios are beginning to scrape the bottom of the barrel just so they can have anything that will draw the audience they want.
I think one of these problems can be attributed to an issue that is in fact affecting all aspects of film-making; and that’s the overabundance of movies. Now, it might be unusual to think that more movies out there is a bad thing, but it’s an issue that is actually causing a decrease in the quality of films overall. Steven Spielberg and George Lucas’ now famous remarks from last year stressed that the studio system was going to implode on itself because of the out-of-control ways that movies are funded and distributed, and that’s something the romance genre clearly suffers from. In the past, there would be one standout romantic film released in every quarter of the year, which would do very well. But, because of the increased flow of production, we have seen multiple genre movies all released at the same time. This time of year has typically belonged to the romance genre, with movies like Safe Haven (2013) and Beautiful Creatures (2013) battling it out in the same weekend. But, what usually has been constructive competition has ended up making it a rough road for the romance genre, with very few entries actually gaining a foothold at the box-office. And when studios absolutely must have 5 or 6 new genre movies a season, it means that less care is going to be given to the choices of stories given the green-light.
This is a trend that has come about in the last few years. Hollywood of course has had a long history with the genre, dating all the way back to the silent melodramas. But, when we think about the most beloved romances out there, not all of them can be easily classified by genre. Sometimes, the most surprising love stories are the ones that are the most beloved, and the ones that have no suspense whatsoever are the ones we most revile. Take for instance the classic film The African Queen. The John Huston-directed movie follows a scruffy, callous boatman (Humphrey Bogart) and a stuck-up missionary widow (Katharine Hepburn) as they travel through the heart of Africa on a small river boat. Throughout the film, these polar opposites grow closer together and form one of the quirkier and more charming couples in movie history, and it all happens in what is essentially an adventure film. The reason why this romantic plot works so well is because entertainment is drawn from the friction between the two main characters, which the two stars portray perfectly. Romances can also work with even the most perfect of couples, as long as the outcome is unexpected. The reason why Casablanca is held up as one of the greatest romances of all time is because the two lovers don’t end up together in the end. It’s that parting of ways that we find so romantic, because of how much each character longs for the other, and what they have to sacrifice for love. As Bogart puts it so eloquently in the final scene, “We’ll always have Paris.”
But what I think has happened to the romance genre is that it has become complacent. Like I mentioned before, the genre is valuing quantity over quality, and that is leading to more movies that are exactly the same. The strongest culprit of this would be the dreaded “Wedding” picture. If the romance genre were a sinking ship, “wedding” movies would be the anchor dragging it to the bottom. In the last several years, we have seen movies like License to Wed (2007), Made of Honor (2008), Bride Wars (2009), The Proposal (2009), and last year’s The Big Wedding all make it to the big screen and predictably get trashed by critics. I think this is primarily because this sub-genre is characterized more than any other by its own cliches. Pretty much every movie I mentioned can be summed up with the same story-line: girl wants to get married, problems ensue, girl ends up with the guy she really wants, the end. The less interesting the plot, the less people are going to like it, and this sub-genre has become something of a joke over the last few years because of movies like these. Bride Wars in particular turned out to be so insultingly bad, and so regressive in its gender politics, that even fans of the genre had to cry foul. It seems like filmmakers feel that just the sight of wedding traditions is entertainment enough, which is entirely the wrong way to look at your audience. The reason why Bridesmaids (2011) became so popular was because it subverted this despised sub-genre in hilarious ways, and that’s ultimately what people wanted in the end.
But is the genre completely helpless and without quality entertainment? Not at all. Usually all it takes is one inspired idea, or a filmmaker who gives a damn like Nora Ephron or John Hughes, for a romantic film to work. Last year, we saw two examples of quality movies from the genre, made by people who have already left their mark with these kinds of films before. One was Richard Linklater’s Before Midnight, starring Ethan Hawke and Julie Delpy. Midnight is the continuation of a series of movies following the same couple as they reach different stages in their lives, which started with 1995’s Before Sunrise and continued with 2004’s Before Sunset. These movies are almost universally beloved and respected, and they show that if the people involved are invested enough in what they are making, it can end up being a quality film.
Another movie that left a good mark on the genre last year was About Time, which illustrated how you can make a charming romance work by injecting a new idea into it. The movie was written by Richard Curtis, who has become synonymous with the Romance genre over the years through his scripts for beloved movies like Love Actually (2003), Notting Hill (1999) and Four Weddings and a Funeral (1994). What Curtis did with this film to make it stand out, however, was to include a supernatural element; in this case, time travel. While time travel is not an entirely novel idea, he nevertheless made it work with the film’s themes of awkward romance and regret, which in turn made it a more enriching film overall.
Having a unique voice helps to make a romantic movie work nowadays, and it certainly is a breath of fresh air when a good one comes along. The reason why many of the best ones hold up is because they treat their characters with dignity. One of the biggest mistakes a person can make when writing a love story is to value one character’s worth over the other’s. This sometimes gets into the tricky issue of gender politics, which can be a minefield if handled improperly. Oftentimes, when a person writes a very poor love story, it’s because the male and female characters are played as generic stereotypes. How believable is it when you see a movie where a girl has a hard time finding an attractive man, even when she has no flaws herself? Hollywood has a problem with portraying body image honestly in movies, largely because it puts glamour before everything else. Would Bridget Jones’s Diary (2001) have been better if a fuller-figured actress had played the main character, instead of the more petite Renee Zellweger? I honestly think Hollywood should give something like that a shot. Also, as a male, I feel like men are underdeveloped in these movies; either they are just the object of desire, or a sexist jerk who doesn’t understand the main girl’s feelings, and that’s it. Sometimes it’s the girl who gets the short end of the stick in this genre, by being written as too self-involved in her own feelings, and thereby less interesting. Overall, the best love stories are the ones where the characteristics of both individuals are given enough time to develop, because in the end, love is a two-way street.
So, is Hollywood seeing a backlash from a long string of terrible genre picks? It might be too early to tell. One thing that I think may have happened is that the genre has evolved into something that can’t be clearly defined by the genre norms we’re all familiar with. For one thing, the rise of Young Adult novel adaptations has changed the way we look at romantic plot-lines in movies. With films like Twilight bringing romance into the supernatural realm, it’s safe to say that you can make any type of genre flick into a popular romance. Hell, last year we even got a zombie love story with the movie Warm Bodies. The reason this trend seems to be happening is because the audiences that would have normally gone to the movies as part of a date night are now seeing movies of all kinds, not just romantic ones. In many ways, Hollywood has actually done a fairly good job of making movies that appeal to both genders, like The Hunger Games series. That seems to be why the traditional romantic movie has disappeared recently. Oh, it’s still there, only not as prevalent as it once was, and that might be to its advantage. Less competition can help a genre film stand out and maybe even get a boost from a more discerning audience. There will always be an audience out there that wants a good, old-fashioned love story, and this is the perfect time of year to not only indulge in the same old thing, but to also fondly remember the ones that really touched all of our hearts.

Ecstasy for Gold – The Cutthroat Campaigns of Awards Season


Awards season is once again upon us and as always, there is a lot of debate over which film is deserving of the industry’s highest honors. What is interesting about this year, however, is how up in the air it is. For the first time in a long while, there are no clear favorites in this year’s Oscar race. In years past, a clear picture would have formed by now of who was leading the pack, after the Golden Globes and all the industry guilds have made their choices. But so far, every one of the top honors this year has varied, leaving no clear front runner for Best Picture at the Oscars; made all the more confusing after the Producers Guild Awards ended in a tie for the first time in its history, awarding its top prize to both 12 Years a Slave and Gravity. Sure, any accolades for these movies are well deserved and appreciated by their recipients, but it’s the Academy Awards that cap the awards season, and it’s what everyone in the industry strives for in the end. That strong desire to win the top award has become such a dominant force in the industry that it has started a troubling trend of negative campaigning in Hollywood. In recent years, we’ve seen Oscar campaigns become so overblown and vicious that it would make even Washington insiders queasy. And the sad result is that in the pursuit of the industry’s top honors, the movies themselves get lost in the shuffle.
This isn’t something new either, but it has developed over time into something bigger. Oddly enough, when the first Academy Awards were handed out in 1929, the awards themselves were considered an afterthought. Instead, they marked the conclusion of a banquet dinner held by the Hollywood elite to celebrate the end of the year. Many of the winners in this first ceremony either discarded their Oscars or pawned them off in later years, not foreseeing the significance that those statues would have in the years to come. It wasn’t until a few years later that the ceremony gained significance, around the time when the winners started being announced on the radio, allowing audiences to follow Hollywood’s awards recipients. Once the ceremonies began to be televised in the 50’s, the awards season became a full blown cultural event, and it has been a focal point for the industry ever since. Of course, with the whole world now interested in who was winning, it soon led to some of the studios making behind the scenes deals in order to get their movies to the top. One of the earliest examples of questionable campaigning for an award came in the 1940 Oscar race, when producer David O. Selznick, hot off his awards success for Gone With the Wind (1939), pressured a number of entertainment press agents to campaign for his next film, the Hitchcock-directed Rebecca (1940). The aggressive campaigning helped the film win Best Picture, but it failed to win any other major award, which led many people to question whether or not it deserved the top prize in the first place; especially considering that it beat out the more beloved The Grapes of Wrath (1940) that same year.
This illustrates the major problem I’ve observed with an overly aggressive awards campaign: the doubt that it raises over whether or not the movie deserves what it got. We’ve seen the Academy Awards honor films that have certainly withstood the test of time (Casablanca (1943), Lawrence of Arabia (1962), and The Godfather (1972), just to name a few), but there are also choices made in other years that have left us wondering what the Academy was thinking. But it’s not the final choices that make the Oscar campaigning problematic. We all differ when it comes to choosing our picks for the awards, because everyone’s tastes are different. What I find to be the problem is the increasingly nasty ways that movie companies try to get their movies an award by attacking their competition. In recent years, I’ve noticed that this has gone beyond the usual “For Your Consideration” campaigning that we commonly get from the studios, and it has now devolved into full-fledged mudslinging. Truth be told, I don’t even think political campaigns get this cutthroat, but then again, I’m not much of a political observer. This year in particular, we’ve seen complaints leveled at films for inaccuracies in their historical reenactments and for mis-characterizations of their subjects. While some accusations have merit, there remains the question of whether or not it matters. There are some voters out there who are swayed by the chatter and would rather let outside forces guide them towards a choice than judge a film on its own strengths, which becomes problematic when that chatter is ill-informed.
The most troubling thing about the recent trend of negative campaigning in the awards season is the inclusion of outside forces brought in to give weight to the criticisms against a film. This goes beyond just the negative reviews from critics. What we’ve seen happen recently is the involvement of the media and press more and more in Oscar campaigns. This has included articles written by scholars and experts that call into question the authenticity of the facts in a film as a way of slamming a movie’s credibility. Famed astrophysicist Neil deGrasse Tyson made the news weeks back when he posted a series of tweets pointing out the scientific details that the movie Gravity got wrong, which many people in the industry jumped upon to undermine Gravity’s chances for some of the top awards. Mr. Tyson later said that he did it just for fun, adding that he still enjoyed the film immensely, but this seemed to get lost in the controversy that his critiques stirred up. It could be argued that film companies utilize negative campaigning just because it’s easier and more effective, which is probably true, but what it ends up doing is distracting people from what the awards season should really be about, which is honoring the best work done by people in the industry that year.
The most dangerous kinds of negative campaigning that I’ve seen have been the ones that have no basis in actual fact. One of my first articles on this blog was an editorial addressing the smear campaign leveled against Quentin Tarantino’s Django Unchained. At the time of the film’s release, African-American director Spike Lee openly criticized the movie because of its pervasive use of the “N-word,” and he denounced the film as “racist” and an insult to the history of slavery; despite the fact that he hadn’t seen the film. Spike Lee’s comments were nevertheless used as ammo against the movie during last year’s Oscar race, which fortunately had little effect, as the film walked away with two awards: one for Original Screenplay and one for Supporting Actor Christoph Waltz. The same cannot be said for Kathryn Bigelow’s Zero Dark Thirty, however.
Released around the same time as Django, Zero Dark Thirty had a lot of hype built up around it, seeing as how it was dramatizing the hunt for Osama Bin Laden. The film’s hype was a case where Hollywood’s connections with political insiders became both a blessing and a curse. Some left-wing studio heads even wanted to fast track the film’s release, so it would premiere before the 2012 election in the hopes that it would boost President Obama’s chances for reelection. When the film premiered, however, its reception was not what people expected. Bigelow’s very frank depiction of the torture used by the CIA in the search for Bin Laden angered many people, and criticism of the film shifted from it being called left-wing propaganda to right-wing propaganda. The film’s producers rightly argued that politics had nothing to do with the movie’s overall depiction, but the damage had already been done. The one-time Oscar front-runner was dealt a significant blow. Kathryn Bigelow was shut out of the Best Director category, and the film only ended up winning one award, for Best Sound Editing, which it had to share in a tie with Skyfall (2012). You could say that Zero Dark Thirty became a victim of its own pre-release hype, but I think the negative campaigning against the film rose to an almost unethical level when political leaders got involved. Just weeks before the Oscar ceremony, Democratic Senator Dianne Feinstein, along with fellow Democrat Carl Levin and Republican John McCain, called for an investigation into the film’s development, examining how Bigelow and writer Mark Boal got their information. When the Oscars were over, almost on cue, the investigation was dropped. We may never know if there was some backroom deal involved, but I saw this as an example of awards campaign interference gone too far.
It’s troubling to think that some people are so easily persuaded by hype and negative press in the film industry, but it’s a result of how valuable these awards have become. It is true that winning an Oscar will increase a film’s overall box-office numbers, which may be good for business, but it can be bad for the film’s legacy. What is there to gain from a short-term boost in grosses when you’re hurting the film’s chances of having a long shelf life? There are many examples of movies gaining a negative stigma after winning the top award over more deserving films. The most controversial example would be 1998’s Shakespeare in Love, which many people say stole the Best Picture award away from Steven Spielberg’s Saving Private Ryan; so much so, that new campaign rules were drafted by the Academy when it was revealed how much money Miramax execs Bob and Harvey Weinstein put into the film’s Oscar campaign. Shakespeare did see a boost at the box office in the weeks before and after the awards, but the controversy behind it has unfortunately overshadowed the film itself over the years, which has in turn destroyed its staying power. Time is the best judge of great movies, but the Oscars have only a year-long window for perspective, so their picks usually have little foresight in the end. 1999’s winner, American Beauty, has almost faded into obscurity over time, while other films from that same year, like The Iron Giant, Fight Club, and The Matrix, have become beloved classics.
Is it right in the end to criticize a film over its content, or its adherence to the facts? My argument is that a movie should be judged solely on its own strength as a movie. The truth is that there is no absolute truth in film; it’s all make-believe after all. If a film needs to take some historical liberties in order to tell a more fulfilling story-line, then so be it. What I hate is when controversies come up around a film when they really don’t matter in the end. Some controversies this year have erupted over films like Saving Mr. Banks and Captain Phillips, because of their white-washed approach to the depictions of their main characters, and the negative campaigns against them robbed actors like Tom Hanks and Emma Thompson of recognition that their outstanding performances would’ve otherwise deserved. So what if aspects of these people’s lives are left out of the films; in the end, they have nothing to do with the stories that the filmmakers wanted to focus on in the first place. The Wolf of Wall Street has had its own set of controversies, some of which the movie purposely provoked, and yet they didn’t affect its chances at the Oscars, which shows that there is a selective bias in the negative campaigning against these films; all depending on who has something to gain from knocking out the competition.
When the winners of the Oscars are announced this year, my hope is that the voters use their best judgment when they cast their ballots. The Academy Awards will never please everybody. Most often, whenever people say they were upset by the awards, it’s because there were few surprises and the whole thing ended up being boring in the end. That’s why I am excited about this year’s open race, because anybody could win. Unfortunately, the closer the race, the more negative the attacks against each film will be. I think that hype can be a dangerous tool for a film if it is misused, and it will ultimately end up clouding the merits of the movie itself. In the end, Oscar gold is not always a certification of excellence. Great films stand the test of time, while the Oscars are more or less a time capsule of public tastes from a specific year. Sometimes they pick the right Best Picture or performance, sometimes they don’t. But what is certain is that negative campaigning is getting uglier and more prevalent each awards season. What I hate is the fact that it’s become less about honoring great works in cinema and more about competition, seeing who’ll take home the most awards at the end of the night. What seems to be lost in the shuffle is whether or not people like the actual films; the movies are increasingly seen as an afterthought in the awards season, with hype and name recognition mattering more in the media’s eye. But, in the end, what matters is the entertainment value of it all, and no doubt we’ll still be on the edge of our seats each time those envelopes open.

Hollywood Royalty – The Ups and Downs of a Film Acting Career


A lot of work goes into making movies from many different departments, but what usually ends up defining the finished product more than anything is the quality of the actors performing in it. Whether we like it or not, the actors and actresses are what audiences respond to the most; more than the script and the direction itself. Sure, writers and filmmakers can leave an impression and build reputations of their own, but their work is meant to be unseen and part of the illusion of reality. It is the actors who must be in front of the camera the whole time and make you forget that you are watching something constructed for your entertainment. And this is mainly why we hold the acting profession in such high regard. Sure, anybody can get in front of the camera and act, but it takes real skill to make it feel authentic and true to life. Hollywood actors are an interesting lot because of the whole aura of celebrity that surrounds them. They are simultaneously the most beloved and most reviled people in the world, and this is usually a result of the work that they do. What I find fascinating is the way that a film actor’s career evolves over time, and how this affects the way we view them in the different roles they take. Some people come into fame unexpectedly, and then there are others who work their way up. There are many ways to look at an actor’s career, and each offers up lessons on how someone can actually make an impact in the business.
The way we cast movies has certainly changed over the years. When the studio system was at its height in the 30’s and 40’s, actors were kept under contract, meaning that they had to work for that studio exclusively. This became problematic whenever an actor or actress coveted a role that was being produced at a competing studio, which excluded them from consideration. Actors also had little choice in what kinds of movies they made, mainly due to the studio bosses who would make those decisions for them. Many of these actors would end up being typecast in roles that the studios believed were the most profitable for them. It wasn’t until the establishment of the Screen Actors Guild that actors finally had the ability to dictate the parameters of their contracts, and to have more say in the direction of their careers. Even still, the pressure to be a successful matinee idol was a difficult thing to manage in Hollywood. In many ways, it was often better to be a character actor in these early years than a headliner. A character actor at this time may not have gotten the name recognition or the bigger paydays, but they would’ve gotten more diverse roles and a steadier flow of work and screen credits. Actors from this time like Peter Lorre, Walter Brennan, and Thelma Ritter enjoyed long lasting careers mainly because they made the most of their supporting roles and had more leeway in the direction of their careers.
It’s the status of a matinee idol that really makes or breaks an actor. Over the many years since the inception of cinema, we’ve seen actors rise and fall, and in some cases rise again. Making a career out of film acting is a difficult nut to crack, and seeing how the industry is sometimes very cruel to outdated actors, it’s a wonder that so many people still want to do it. I believe that it’s the allure of fame that drives many young up-and-comers to want to be actors, but following a dream does not an actor make. It takes hard work, just like any other field in entertainment. If I can give any advice to someone pursuing an acting career, it’s that you should never get into it just because you have the looks of a movie star. Do it because you like performing and being a part of the film-making process. Of course, it’s probably not my place to give advice to an actor, seeing as how I have not been on a stage since the eighth grade, and I am looking at this from a writer’s point of view. But, from what I’ve observed in the film community, the best actors out there are the ones who are really engaged in the process, and not the ones who are in it just to build up their image. The tricky part, however, is figuring out how to maintain this over time.
Becoming a successful actor in Hollywood has a downside that can be either a minor thing or a major problem depending on the person it happens to, and that’s the stigma of celebrity. Whether an actor seeks it out or not, by being out in front of the camera, they have exposed themselves to a public life. This isn’t a problem if the actor or actress manages their public and private lives well, but if they don’t, it’ll end up defining their careers more than the actual work that they do. Case in point: actor/director Mel Gibson. Mel’s career has been negatively impacted by his off-screen troubles, including a nasty break-up with his Russian girlfriend and an anti-Semitic rant during a drunk driving arrest. What’s most problematic for Mr. Gibson out of all this is the fact that no matter what he does now, no matter how good, it will always be overshadowed by his own bad behavior. And it is a shame because, in my opinion, he’s still a very solid actor. I still love Braveheart (1995) to death, and I think a lot of people are missing out if they haven’t seen his work in The Beaver (2011) yet. Or for that matter, his excellent direction in Apocalypto (2006). Unfortunately, all his hard work is for naught as he continues to alienate more of his audience through his off-screen behavior. This is the downside of celebrity that we see, and whether an actor is deserving of the scorn or not, it will always be a part of the business.
Actors and actresses can also find themselves in a rut simply because they are unable to adapt to the changing course of the industry. This is certainly the case with people who have created their own signature style of acting. Comedic actors in particular fall into this trap. I’ve noticed that some actors who break through in comedies in one decade will almost always lose their audience by the next. Shtick is a deceptive tool in the actor’s arsenal, because it helps people achieve stardom right away, but it also anchors them down and keeps them stuck in place. We’ve seen this happen to many comedic stars, like Eddie Murphy, Mike Myers, and Jim Carrey; and it’s starting to become apparent in Sacha Baron Cohen’s post-Borat career. The only comedic actors who seem to make long lasting careers are the ones who choose a dramatic role once in a while, like Bill Murray or Robin Williams. Age also plays a factor in the downfall of people’s careers. It usually happens with child actors who can’t shake off their youthful image, and who unfortunately diminish and disappear once they become adults. Making that transition from child actor to adult actor is tough, and it’s what usually separates the Elijah Woods from the Macaulay Culkins. It’s less common nowadays for older actors to lose their careers than it was many years ago, mainly because movies like Nebraska (2013) give older actors much better roles. But, in the past, the industry was incredibly cruel to older actors; something highlighted brilliantly in Billy Wilder’s classic Sunset Boulevard (1950).
What usually ends up making an actor’s or actress’s career survive is their ability to grow as a performer. There’s something to the old adage of there being “no role too small.” Actors should relish the opportunity to diversify their choices of roles. And usually the ones who have the longest lasting careers are the people who can play just about anything. Meryl Streep is considered the greatest actress of her generation, and she didn’t get there by playing the same kind of character over and over again. She has done comedies, dramas, cartoons; she has played Austrians, Australians, teachers, mothers, daughters, grandmothers, you name it. No one would ever consider her lazy. She has made a living challenging herself as an actor, and while not every role of hers works (Mamma Mia!, for example), she nevertheless commands respect for her efforts. What I respect the most is the ability of an actor or actress to move effortlessly from genre to genre and still treat each part like a role worthy of their talents. That’s why I admire actors like Christian Bale, who can go from dark and twisted roles like The Machinist (2004) to playing Batman, or Amy Adams, who can appear in movies as diverse as Paul Thomas Anderson’s The Master (2012) and The Muppets (2011) and give each film her best effort. It’s always refreshing to see actors who commit themselves to any role they get, which in turn helps to endear them to us as an audience. An invested actor will almost always make a film better.
Nowadays, a bad performance is usually not measured by how misplaced an actor is or by how out of their league they may be. The worst kinds of performances come from actors who are just lazy. The point where an actor works for the paycheck and nothing more is usually where their career begins to decline. We’ve seen this with many actors who get too comfortable doing the same role over and over again, or with people who take a job just for the pay and act like the part is beneath them. When this happens, it’s usually driven by ego, which is another negative by-product of celebrity. When an actor begins to dictate the terms of their comfort level in a production, rather than try to challenge themselves as a performer, it will usually mean a lackluster performance, which leads them towards becoming a one-note performer. This sometimes happens to people who hit it big and then become afraid of alienating the new audience they’ve built. Johnny Depp is an actor that I think has reached this point, having built a wide fan-base from his Pirates of the Caribbean films. The once ground-breaking actor has now fallen victim to his own shtick, and that has negatively impacted his recent slate of films like The Lone Ranger (2013), which shows what happens when you try to play things too safe.
It is remarkable when you see these changes in a film actor’s career, because they usually happen unexpectedly. Overall, the actor is the one responsible for their own career path, but the market itself can be a wild card in the lives of these people. I for one value the efforts of a strong actor who’ll continue to try hard, even when the roles stop being what they are used to. It’s something of a miracle to see actors who continue to stay relevant year after year, like Tom Hanks or Sandra Bullock. They’ve locked into a career path that seems to have worked for them and have managed to maintain their faithful audiences even when they take on more challenging roles. What is also interesting is how Hollywood values a redemption story when it comes to an actor’s career. A Hollywood comeback always manages to be a positive thing in the industry, especially when it happens to the least expected people; like Robert Downey Jr. bouncing back from his drug addiction to play Iron Man, or Mickey Rourke pulling himself out of B-movie hell when he made The Wrestler (2008). Film acting careers are probably the least predictable in the industry, and it takes someone with a lot of personal resilience to make them work. If there is anything an up-and-coming film actor should learn, it’s that hard work pays off. Don’t fall victim to worrying over changing trends or over stepping outside of your comfort zone. In the end, the best thing you can do is to commit to the role, no matter what it is. Like the great George Burns once said, “Acting is all about sincerity. And if you can fake that, then you’ve got it made.”

Tis the Season – Why Some Films Become Holiday Perennials


We’ve reached the end of another calendar year, and of course that can only mean that it’s Holiday season once again. Whether we are celebrating Christmas, or Hanukkah, or whatever, it’s a time of the year where we all gather together and honor family, tradition, and the gift of giving. What’s interesting about Christmastime, however, is just how much the holiday tradition is actually influenced by and centered around Holiday themed movies. A Holiday film can pretty much be considered a genre all its own, since so many of them exist out there, created specifically to invoke the holiday spirit. Not only that, but they are movies that we continually return to every year around this same time, like it’s part of our holiday ritual. This doesn’t happen with every Christmas themed movie, however, since many of them try too hard to hit their mark and fail spectacularly. And yet, we see small films that no one thought much of at first grow into perennial classics over time, and in some cases add to the overall Christmas mythos that defines the season. But, how do we as an audience discern the classics from all the rest? What really separates a Miracle on 34th Street from a Jingle All the Way? Quite simply, like with most other movies, it’s all determined by what we bring from our own experiences in life when we watch a movie.
The emergence of perennial holiday classics is nothing new in pop culture, and it actually predates the beginning of cinema by a good many years. Literature has contributed holiday themed stories in both short form and novels for the last couple hundred years, helping to both shape and reinvent Christmas traditions in a very secular fashion. Our modern day physical interpretation of Santa Claus can in fact be attributed to his appearance in “Twas the Night Before Christmas,” the 1823 poem by American author Clement Clarke Moore. Moore’s nearly 200-year-old poem is still being recited today, and it shows just how much tradition plays a role in keeping a perennial classic alive in the public’s eye. Around the same time, acclaimed British novelist Charles Dickens wrote the story of A Christmas Carol, chronicling the tale of Ebenezer Scrooge and his visits by three ghosts on Christmas Eve. Since its original printing in 1843, A Christmas Carol has gone on to be one of the most re-adapted story-lines in history. Perhaps nearly a quarter of all holiday classics can claim to have been influenced by Dickens’ classic tale, where a dreary old cynic has his heart warmed by the holiday spirit. Dickens meant for his novel to be a meditation on greed and class inequality, but I have no doubt that he purposefully meant for Christmas traditions to be the healing influence in Scrooge’s reawakening. These stories continue to stand strong so many years later, and they show just how far back our culture began to value Christmas stories and songs as a part of the holiday tradition.
Even from the very outset of cinematic history, we saw films carry on holiday themes. Both Twas the Night Before Christmas and A Christmas Carol provided inspiration for movie-makers many times over, given their already beloved appeal, but some people in Hollywood also saw opportunities to add their own original holiday themed stories into the mix. When the studio system emerged, it was very well aware of the marketability of holiday themes. After all, people usually visited movie theaters frequently during the cold winters, so why not play up the festive mood that everyone was already in? For the most part, movies celebrated Christmas more frequently in short segments than in full length story-lines in these early years; whether it was capitalizing on a new popular Christmas song in a lavish musical segment, or portraying a Christmas celebration as part of a larger narrative arc. Many people forget that one of the most popular Christmas tunes ever written, “Have Yourself a Merry Little Christmas,” wasn’t even from a Christmas themed movie; rather, it came from the 1944 musical Meet Me in St. Louis. But eventually the Christmas season became such an influential part of our modern cultural tradition that it would inspire films devoted entirely to the holiday spirit.
So, in the years since, we have seen Holiday films become standard practice in Hollywood. Every year, it’s inevitable to see a Christmas movie released in time for the holidays. Unfortunately, Christmas movies very rarely achieve classic status. For every one that audiences grow an attachment to, there will be about a dozen more that will likely be forgotten by next December. Evidently, it seems like Hollywood’s approach to the holiday season is less carefully planned out than any other part of the year. Their approach seems to be throwing whatever has Christmas in the title up against the wall and seeing what sticks. Unfortunately, this has led to Christmas being more synonymous with bad movies than good. Some are well meaning films that fall short of their goal, like the Vince Vaughn film Fred Claus (2007) or the odd but charming Santa Claus: The Movie (1985). And then there are ugly, shallow and distasteful films like Deck the Halls (2006), the Ben Affleck disaster Surviving Christmas (2004), or the deeply disturbing Michael Keaton film Jack Frost (1998), with the creepy as hell CG snowman. And the less said about the horrible 2000 How the Grinch Stole Christmas remake, the better. Overall, it is very hard to make an honestly cherished holiday classic in Hollywood, and that’s mainly because the business just tries too hard. If you look closely, you’ll actually find that a beloved holiday classic may come from the unlikeliest of places.
This was definitely the case with what has become not just one of the best loved Christmas movies, but one of the best movies period; that film being Frank Capra’s It’s a Wonderful Life (1946). Capra’s movie tells the story of George Bailey (a flawless Jimmy Stewart), a man who has given so much back to his hometown and has gotten so little in return, reaching the verge of suicide in his despair. Through the intervention of a guardian angel on Christmas Eve, George is shown what the world would have been like if he had never been born, and he rediscovers his value and purpose; on the following Christmas Day, he is finally rewarded by those whom he’s helped all his life. The film is very uplifting and perfectly illustrates the true impact that the Christmas season has in our lives. With a theme like that, you would think that the movie was a smash hit when it was first released, but instead the movie was a colossal bomb. It bankrupted the company that made it and damaged Frank Capra’s directing career from then on. The focus on George Bailey’s increasing depression was probably too hard for audiences to take at the time, given that many soldiers were returning home after the end of WWII. Despite its initial failure, It’s a Wonderful Life managed to survive through TV airings, which happened, naturally, on Christmas Eve, and the film not only found its audience but became a seasonal standard. To this day, It’s a Wonderful Life is still aired on network TV (the only classic era movie that still is), and audiences from every generation still embrace it warmly, no matter how old fashioned it may be. Pretty good legacy for a film that started off as a failure.
A holiday classic can come from an unlikely place like It’s a Wonderful Life, but for many, what counts as a classic is usually determined by their own tastes. That’s why some people consider romantic comedies set around Christmastime to be holiday classics. Case in point, the movie Love Actually (2003) has grown into a beloved holiday classic, even though the themes in the movie are less about Christmas and more about the intertwining relationships between the characters. By standing out as a strong romantic film with a Christmas setting, it serves as an example of two genres working together. Cult movie fans even have holiday classics that they cherish, like the weird, campy Santa Claus Conquers the Martians (1964), which can hold the distinction of being one of the worst movies ever made and incredibly entertaining at the same time. And some people can even claim that Die Hard (1988) counts as a Christmas movie, because of its holiday setting. Pretty much, it’s whatever we bring with us from our own experiences to the movies that determines what we consider to be entertaining. Just as most people gravitate towards a movie based on their own interests, so too do we see that with Holiday films. Hollywood has in some cases picked up on this and has catered to select audiences at Christmastime with genre specific movies. Usually, it will take the consensus of a large audience to determine which ones will stand out as the undisputed classics.
I think where Hollywood hits its mark most often is when it makes a holiday film that appeals to the memories of our own experiences of Christmas. The film that I think hit a perfect bulls-eye in this regard, and stands as a true masterpiece of Christmas themed film-making, is the 1983 classic A Christmas Story. Directed by Bob Clark, and inspired by the autobiographical stories of novelist Jean Shepherd, A Christmas Story perfectly captures the highs and lows of a young boy’s experience during the holiday season. Ralphie (Peter Billingsley) is a character relatable to any young boy growing up in small town America, myself included, and seeing how he tries so hard to manipulate his parents into getting him his dream present is something every child can identify with. Couple that with the hilarious performance of Darren McGavin as the Old Man and the iconic Leg Lamp, and you’ve got the very definition of a holiday classic. But, just as A Christmas Story highlights good Christmas memories, we see classic films that center around a disastrous Christmas experience as well. The best example of this would be the very funny and endlessly quotable National Lampoon’s Christmas Vacation (1989). We’ve had just as many Christmases like the Griswold family’s as we have like the Parker family’s from A Christmas Story, and Christmas Vacation perfectly encapsulates all the bad things that happen at Christmas time, without ever losing the holiday spirit underneath. Not to mention, it’s the last time we ever saw a funny performance out of Chevy Chase.
So, despite the low success rate, we as an audience still seem to find a classic seasonal favorite in every generation. But how does Hollywood keep making bad Christmas movies every year, despite the movie-going public rejecting all the junk they put out? I think it’s because the season itself is such an overwhelming cultural force that most filmmakers don’t really care about the product they’re making, as long as it’s holiday themed and ready to capitalize on the mood of the period. When it comes down to it, a great holiday classic is not determined by how soaked in the holiday spirit it is, but rather by how strong its story is. We keep watching It’s a Wonderful Life every year because of how inspirational George Bailey’s life story is, and not because of the Christmastime finale that has come to define it. In fact, the movie is not really about Christmas at all; it’s about the life of one man and his value to the world. Other movies become holiday classics on the strength of a wintry setting alone, where the holiday is not even mentioned. And even films that subvert Christmas traditions, like 2003’s Bad Santa, have become genuine holiday classics to some people.
I, myself, love a good Christmas movie, and because I’m such an ardent appreciator of movies in general, these films have certainly become a part of my holiday tradition. I return to It’s a Wonderful Life and A Christmas Story every year and never get tired of them. And not a year will go by when I don’t drop at least one quotable line from Christmas Vacation during the season. I hope every generation gets its own perennial classic that will last for years to come. Just please; no more remakes or sequels. We all saw the backlash that the announcement of a sequel to It’s a Wonderful Life received recently. I only wish The Grinch and A Christmas Story had been spared the same fate. Like Christmas dinner itself, there can be too much of a good thing when it comes to Christmas movies.

Flicks and Picks – The End of the Blockbuster Video Era


Though it was long seen coming, it finally became official this last week: Blockbuster Video is no more. While this is a sign of how things have progressed in home entertainment, mostly for the better, with on-demand and streaming video making it easier for the consumer to watch whatever they want, it also brings an end to an institution that has been at the center of many cinephiles’ lives. Apart from some independent holdovers here and there, you will rarely find a video store in your local neighborhood today. But back in the day, finding a store devoted to video rentals was as easy as finding a McDonald’s. The decline of video stores over the years certainly has had to do with the advancements in streaming video, but the dominance of Blockbuster Video as a company played a role as well. In a way, by working so hard to become the top dog of the video rental market, Blockbuster also facilitated its own downfall when the market changed once again. Though the end of Blockbuster was inevitable, and needed to happen, it does leave a gap for those of us who built our love for film through renting from our local video stores. The video rental experience, while not exactly life-changing, is something that most film lovers have been through at some point in their lives, and this week it became a thing of the past. In this article, I will look back on the era that Blockbuster Video defined, and what its end means for the future of home entertainment.
In the late 70’s and early 80’s, we saw the emergence of VHS, which gave studios and filmmakers the ability to make films available for purchase after their theatrical release for the very first time. Before, audiences had to wait for airings on television before they could see their favorite films again, and that also meant having to put up with commercial breaks. When VHS tapes started to be produced by the studios directly, it led to the creation of a niche market, with stores opening up across the country directly geared toward filling that public appetite. Being able to own a movie as part of a collection is a commonplace thing nowadays, but when home video sales began, it was an exciting new frontier, and it had an influence on the film industry almost instantly. Not only did the rise of home video affect the number of theatrical runs that a movie would have, but it also drove the movie studios towards film preservation and restoration, because of course, presentation matters for home viewing.
But, like with most new technology, VCR players were very expensive, and buying a movie to play in them was also not cheap at the time. Some retailers even had to pay prices as high as $100 per movie in order to have it available in their stores. So, in order to get more out of their product, and to let audiences have better access to the movies they wanted, video rental services came into being. Like checking a book out from a library, consumers could rent a movie for a certain number of days at a low price. This business model worked extremely well and led to a boom in VCR sales. Video stores popped up all across the country, both locally owned and franchise operated, and home video very quickly became a major part of the film industry as a whole. But, it wasn’t just studio films that benefited from this new market. Independent producers saw an open opportunity in this new industry, and before long a whole Direct to Video market opened up, thanks to video stores being willing to indiscriminately stock and rent out a whole variety of films as a way to fill their shelves with more product. In these early days, it was very common to see a diverse collection of independent stores in your hometown, as it was in mine. The stores that I grew up with in Eugene, Oregon went by such varied names as Silver Screen Video and Flix & Picks, and choosing a rental from these places certainly had an effect on my growing interest in movies at a young age.
But that changed in the mid 90’s, when the video rental industry became more standardized. Out of this period came a chain of stores known as Blockbuster Video. Blockbuster was founded in 1985 in Dallas, Texas, and started off as just another local retailer like most other stores, before it began to expand rapidly. In the late 90’s, it was common to find at least one Blockbuster in your area, and by the end of the decade, Blockbuster was unrivaled in the home video market. Their rise had the negative effect of forcing much of the other competition out of business, which benefited them for the time being, but it would come back to bite them in the years ahead. Blockbuster may have been ruthless to the competition, but on the way to becoming the best in the industry, they did manage to do many beneficial things that revolutionized the market. For one thing, they were the first national retailer to begin video game rentals. Their standardization of rental pick-ups and drop-offs also changed the way we rent movies, making the drop-off slots at your local store a life-saver late at night. Blockbuster was also the first chain to begin working within the film industry to create exclusive promotions and deals on upcoming releases. Though Blockbuster’s dominance left fewer choices in rental stores, I don’t believe consumers cared much, as long as Blockbuster still operated efficiently.
Most film lovers will attest that they’ve probably spent a good amount of their time in a Blockbuster store. While many of us could find exactly what we wanted at any time, there was another side effect that changed how we grew up watching movies after spending time in a Blockbuster, and that would be the impulse rental. I’m sure most of you out there have come out of a Blockbuster Video at one time with a movie you’d never even heard of instead of the one you wanted, simply out of curiosity. Having a variety of choices seems normal now, but not until video rental came about did consumers have that level of control over what they were able to choose. Before, you would have been limited to whatever was playing on TV or in your local cinema, but stores like Blockbuster made consumer choice as simple as a quick scan through their shelves. For cinephiles, I’m sure that part of their growing love for film started with a surprise choice in the local video store, and with stores as big and as well stocked as Blockbuster, those surprises could come from even the most obscure of titles. Blockbuster was also handy for film students like me whenever we had to watch a film as part of an assignment. Whether it was a film we knew or not, at least we had the comfort of knowing that there was a place we could look for it in a hurry.
In the later years, however, the market began to change again. The internet revolutionized video streaming in the latter part of the 2000’s, and our reliance on VHS and DVD for home entertainment soon became a thing of the past too. Even though Blockbuster cleared out all comparable competition, they were ill-equipped to take on the likes of Netflix. What Netflix did was eliminate the middle man in video rentals and have movies sent directly to the home through the mail, which made it unnecessary for anyone to go out to a store and rent a movie anymore. Blockbuster tried its own rent-by-mail service in response, but by then the damage had already been done. Netflix had surpassed Blockbuster as the number one rental service, and the former giant had to begin downsizing in order to survive. Soon, Redbox emerged and took away even more business from Blockbuster, appearing as convenient vending machines in grocery stores for anyone looking for an impulse rental. Like most other forms of retail, the trend has moved towards online shopping, and Blockbuster is one of the biggest to have fallen, mainly because their business model was one that couldn’t adapt in the digital age. All that’s left of Blockbuster is its still recognizable name, and even that is owned by someone else now (it was purchased by Dish Network in 2011 for the branding of its on-demand service).
Because Blockbuster eliminated much of the competition beforehand, it actually made the transition to on-demand video renting faster and less rocky. There was no large grouping of various retailers resisting the changes in the market; only Blockbuster. And now that they are gone, the era of land-based video rental shops has ended with them. Sure, there are independent stores in certain areas that still serve nostalgic purposes, but their clientele is limited. Now it is more commonplace to hear that people have a Netflix account rather than a Blockbuster card. But Blockbuster still left a legacy that will not be quickly forgotten, especially among longtime movie aficionados. Many of us can still remember moments when being close to a Blockbuster came in handy; whether it was for a late night impulse rental, or for a quick bit of research, or for merely wanting to see a movie that we missed the first time around. For many people, the first time they watched a particular movie, it was probably not in a movie theater but through a rental from big blue. I can certainly say that I credit my local Blockbuster for helping me experience so many different types of movies. One of my favorite films of all time (Seven Samurai) came to me out of an impulse rental from Blockbuster, and I will always be grateful for that.
So it’s a bittersweet end for the onetime giant. Their closure spells the end of an institution that has been a big part of all of our cinematic experiences, but it’s a closure that was necessary. Netflix and Redbox are simply better and more convenient services, and Blockbuster was a relic that was standing in the way. But, as we move forward, will those two also fall prey to the same fate as Blockbuster? My guess is probably not. Blockbuster had the unfortunate circumstance of being the top force in a market that was destined to fall. Netflix and Redbox, however, have benefited from the fact that they stand in direct competition with each other, and that has led to new and creative avenues for both companies. Unlike Blockbuster, Netflix has branched out and generated its own exclusive content, including comedy specials and original shows like House of Cards, which makes it not only a great rental service, but also a competitor to broadcast TV. And Redbox is able to make itself available in locations all across the country without having to set up the infrastructure of an entire store chain. And with Amazon and Walmart entering the market with their own video streaming services like Amazon Prime and VUDU, it’s clear that the rental market is going to continue growing in this new direction. Blockbuster is certainly done as an independent company, but had it never been there in the first place, the rental business would never have gotten to where it is now, and that’s the legacy it ultimately leaves behind.

Apocalyptic Cinema – Making Disasters Entertaining in Movies

ID4

One thing that we often see in human nature is a destructive impulse; or to be more specific, we all like to see something get destroyed.  Whether it is something benign like blowing down a house of cards or something more extreme like the implosion of a building, we just enjoy watching something that was built up be taken down.  Hell, we even do it to each other through schadenfreude, whether it's in politics, like the Anthony Weiner scandal, or the rise and fall of a Hollywood star, like Lindsay Lohan.  Our culture seems to relish destruction as a part of entertainment.  I don't necessarily find this to be a bad thing, as long as it doesn't get out of hand.  And that's usually what we find in a lot of movies as well.  Disaster films have been a staple of movie-making for generations, but in recent years, visual effects work has become sophisticated enough that destruction looks authentic enough to be believable.  But when movies become ever more comfortable showing widespread destruction as a part of their storytelling, there starts to be a question about where the line must be drawn.  Is it right for us to feel entertained when we see things like the White House or the Capitol Building being destroyed?  How about the entire world?  In this article, I will look at the highs and lows of disaster film-making and how audiences' reactions to them reveal the extremes to which people want to be entertained.
A lot of the reason why disaster films exist is because they are a great showcase for special effects.  Going all the way back to the silent era, we've seen filmmakers use primitive but successful effects work to create larger-than-life destruction.  You could even look at some of the early Biblical epics, like Cecil B. DeMille's 1923 version of The Ten Commandments, as early examples of the disaster movie.  The film had a moral message, yes, but I'm sure there were many audience members who saw the film just because they wanted to see the grandiose destruction caused by the ten plagues and the parting of the Red Sea.  As special effects have become more sophisticated, disaster movies have grown in number.  Soon films were crafted around some of the most famous disasters in history, like In Old Chicago (1937), which depicted the Great Chicago Fire of 1871, or San Francisco (1936), dramatizing the famous 1906 earthquake.  It wasn't until the 1970's, however, that disaster films could be declared a genre unto themselves.  In that period, we saw a glut of disaster-related movies made specifically for the purpose of being epic, star-studded extravaganzas, with the latest in special effects work on display.  These films included Earthquake (1974), starring Charlton Heston; The Poseidon Adventure (1972), with Gene Hackman and Ernest Borgnine; and The Towering Inferno (1974), with Paul Newman and Steve McQueen, just to name a few.
The rise of the disaster movie genre in the 70's began to die down in the 80's, mainly due to the rise of science fiction and fantasy films as showcases for effects work, but the genre lived on as it began to evolve.  In the 1990's, we saw the emergence of a filmmaker who would go on to not only redefine the genre, but make it all his own.  This filmmaker was German-born director Roland Emmerich, and over the course of his career, you can see that the large majority of his filmography is made up of disaster movies.  The movie that put him on the map in the film industry actually redefined two genres in one, and that was 1996's Independence Day.  The movie was essentially an alien invasion narrative, but what Roland Emmerich did was use the techniques of popular disaster films to make the destruction caused by the aliens look and feel as real as possible.  In the movie, we see catastrophic explosions engulf entire cities, destroying landmarks before our very eyes, including the White House itself.  This was a film that not only drew upon our greatest fears of total annihilation, but also made them feel completely real.  Independence Day was a phenomenal success when it premiered, and it made the disaster genre a force to be reckoned with.  As for Emmerich, he has stuck mostly with the genre that made him a player in Hollywood, with mixed results; successful but ludicrous films like Godzilla (1998), The Day After Tomorrow (2004), and 2012 (2009) all fall into that same mold as Independence Day.
But what was interesting about the success of Independence Day was that it revealed something about how we react to seeing destruction on film.  In the movie, famous landmarks like the Empire State Building are blown to pieces and thousands of people are killed in seconds before our very eyes.  And this is what we consider entertaining?  Maybe entertaining isn't the right word.  I think movies like Independence Day do well because they allow us to face our fears and indulge that sinking feeling of helplessness.  It's not so much the scenes of destruction themselves that we find so entertaining, but the framework around them.  While watching a disaster movie, we need to feel the impact of the destruction, and that's why so many disaster films have to finish with a happy ending.  In Independence Day, the colossal destruction closes the first act of the film.  The rest of the movie details how humankind copes with the aftermath, and how they fight off the invaders despite the odds against them.  You have to go through a lot of darkness before you can appreciate the light at the end of the tunnel, and that's what has defined the best films in the genre.  If a film takes a bleak outlook and doesn't give the audience a satisfying resolution, then it's going to fail.  This has been the case with other disaster films, like 2009's Knowing, which leaves everyone dead and the Earth uninhabitable at the end; sorry to spoil it for you.  Even the laughable 2012 left room for some hope for humanity, and not surprisingly, it did much better.
Disaster films have to thrive on that feeling of hope.  We become enthralled when we see something grand get destroyed, but it's what rises from the ashes that makes us feel grateful in the end.  That's why we enjoy watching controlled demolitions; old buildings must come down in order to make way for something better.  That helps us understand why we accept destruction as entertainment.  Many films skirt that line, but the way a disaster film gets the audience on its side is through the characters.  Characters in disaster movies must be likable and easy to identify with.  It also helps if they are fully defined people rather than thinly drawn stereotypes.  Emmerich's films have tended to have lackluster characters, which is why casting makes a difference in his movies, and in others like them.  Independence Day worked well because you had charismatic performances from actors like Jeff Goldblum and Will Smith, who helped balance the film out by creating characters you wanted to root for.  Other disaster films tend to miscast their roles, making their characters' story-lines a little harder to swallow.  Case in point: John Cusack in 2012.  Cusack is a fine actor when a movie calls for it, but when your character is a mild-mannered author who somehow manages to outrun the eruption of a supervolcano, that I have a hard time buying.  Now, it's difficult to say that a character needs to be believable in a movie centered around a fictional disaster, but sometimes it does matter.  Likability of the characters is what separates the good disaster films from the bad ones, and unfortunately that's something you rarely see done effectively.
For the most part, disaster films exist because they are showcases for the newest techniques in special effects.  The human element in these films is crucial, but it plays a lesser part in the creation of the movies as a whole.  But when the balance of these elements isn't struck in the right way, the films run the risk of seeming lackluster or, worse, exploitative.  This was an issue in Hollywood in the aftermath of the September 11th attacks in New York City, when we saw a level of destruction in real life that we could only have comprehended in movies before.  Soon after, the Independence Day-style destruction of cityscapes in movies stopped for a while, because that imagery had become all too real for us, and seeing it on the big screen afterwards would've been seen as insensitive.  Now that time has passed, we are seeing that kind of destruction depicted again, but it took a while for us to get there.  What I think makes audiences understand the level of acceptability in disaster imagery is the balance between the level of destruction in the movie and how it functions within the narrative.
Even though it came out months before 9/11, I think that the Michael Bay film Pearl Harbor (2001) fell into that unacceptable exploitation category because it didn't find that right balance.  In the movie, the famous attack is depicted in gruesome detail, but it lacks any resonance because it is just the backdrop to a rather lackluster love triangle plot.  A lot more respect could have been paid to the real men and women who died on that day instead of having everything hinge on fictional characters that we care so little about.  Pearl Harbor felt more like a shallow Hollywood attempt to exploit a tragedy for the purpose of creating a film that showcased impressive production values and matinee idol stars.  In other words, it was a movie driven more by marketing than by actually informing audiences about the real event.  If you don't find that right balance in a disaster movie, then your film will not be believable, as was the case here.  Pearl Harbor failed as a movie mainly because it knew what it wanted to be, but the filmmakers didn't know how to make it work.  They were trying to follow in the footsteps of what has ultimately been the only disaster film to date to ever win the Academy Award for Best Picture: director James Cameron's Titanic (1997).  The reason why Titanic worked and Pearl Harbor didn't was because it had a balance to it.  The love story at the center of Titanic, while not the most engrossing, did keep the narrative moving, and it did endear the characters involved to the audience before the pivotal event happens.  Also, James Cameron put so much detail into the recreation of the ship's sinking, and every moment of it is well executed on screen.  No shaky cam or needless destruction is present in the climactic moments of the movie.  It works because the film was, dare I say, respectful to the actual disaster and to the victims of the event as well.
Making disaster movies thoughtful turns out to have been a secret to the genre's success.  Going back to my example film once again, Independence Day, we see that the film works despite its more ludicrous moments by actually having characters work out logical answers to their dilemmas.  It's not enough to have the characters just move from one disaster to another without explanation, like in 2012, or to have them stand by helplessly as the world crumbles around them while injecting stale philosophical points about why it all has happened, like in The Day After Tomorrow.  We want to see our characters be problem solvers and actually deal with the apocalypse like it's something they can come back from.  That's why, despite its many flaws, Independence Day succeeds.  Mankind coming together to help "take those sons of bitches down" is an ultimately inspiring thing.  Whether it's against nature, or the extraterrestrial, or against our own selves, we enjoy watching characters pull themselves out of a struggle.  That's why I think World War Z succeeded this year, despite all the naysayers who predicted it would fail (myself included).  The movie looked like another exploitative take on the zombie sub-genre, but the finished film was a more thoughtful examination of how the survivors of the catastrophe try to deal with the problem and learn to survive.  Sometimes it helps to treat your audience to a more thoughtful story about survival, rather than just destruction.
Disaster films will always be around as long as there is an audience for them.  And as long as filmmakers treat their audiences' intelligence with more respect, we'll also see the disaster genre gain more respectability in the film community.  I like the fact that disaster films have become such an accepted part of cinematic history that it's now commonplace to spoof them as well.  This summer, we got not one, but two comedies centered around apocalyptic events: Seth Rogen's This is the End and Edgar Wright's The World's End.  Both films are hilarious takes on the genre, but they both know what makes a good disaster film work in the end, and they exploit those elements perfectly.  It comes down to characters you want to root for and wanting to see them overcome even the complete destruction of society as we know it.  Even though the films are played for laughs, the same basic elements hold true, and the filmmakers who made them know that.  Overall, destruction becomes entertainment because we look forward to the process of renewal.  Disaster movies fail if they indulge too heavily in the destructive parts or leave the audience with no satisfying resolution.  It's human nature to enjoy seeing something blow up, but we also enjoy seeing something good rise out of the rubble, and in the end, that's why we enjoy a good disaster movie.