
The Long Game – How Great Movies Gain Their Audience Over Time


When we look back at many of our favorite movies over the years, it’s natural to assume that all of them were viewed as beloved classics from the day they premiered. Some of them were, no doubt, but many others didn’t find their way into our hearts until years later. Often it’s just a matter of timing: some movies were overlooked upon their first release, or they fell victim to poor marketing that never allowed them to find their target audience. For whatever reason, Hollywood often has a hard time predicting how movies will perform, both in the short run and the long run. The business of the industry is centered around profitability, and the faster a film can recoup its costs in its initial release, the better. That’s why there’s such a reliance on franchise building and sequel baiting in the film industry, especially when a film’s costs run into the $100 million range. But there are films stuck in the middle: ambitious but hard to market, and unfortunately held to the same standard as the blockbusters. It may seem unfair, but Hollywood is a commercial business, and the only way money gets spent is if those providing the funding can see the potential for big returns. Thankfully, many filmmakers have become good at pitching projects that push boundaries and try something different while still appealing to a large audience. And these ambitious experiments often turn into some of the greatest cinematic wonders that we love today. Unfortunately, they are also the films that make Hollywood wary of failure.

This is common around Awards season, and this year in particular offers a strong example of ambitious projects under-performing against the high standards of Hollywood. Over the last month, we saw a strong collection of releases from some of Hollywood’s most acclaimed talent, including Guillermo del Toro’s Crimson Peak, Robert Zemeckis’ The Walk, Danny Boyle’s Steve Jobs, and the Sandra Bullock starrer Our Brand is Crisis. All were heavily marketed as potential Awards season champions and quality entertainment, sure to give the season a more sophisticated identity after the bombastic dumb fun of the summer. Unfortunately, apart from Ridley Scott’s The Martian and Steven Spielberg’s Bridge of Spies, every other ambitious film from the last month failed at the box office. Entertainment Weekly recently ran an article discussing this very thing in their November 13 issue, in which they dubbed the string of recent disappointments “SHOCKTOBER.” While Hollywood should fret about a pattern of underwhelming returns at the box office, I don’t think it’s fair to say that the movies themselves were to blame. Even though the recent box office has been sluggish, it’s not a reflection of the quality of the films, and many of them are still worth seeing. I already reviewed The Walk and Crimson Peak favorably, and I believe Steve Jobs is one of the best films of the year so far. But because none of these movies made a profit, Hollywood will be less inclined to invest in projects like them in the future, and that’s the sad reality of the business. Though immediate box office can boost a movie’s esteem, some films take their time and develop their audiences over a long period. And in some cases, this is actually better for the lifespan of a movie.

It’s the staying power of a movie that ultimately reveals its greatness. When we look at the best movies of all time, they all share a popularity with audiences that transcends their time and place. But dig deeper and you’ll notice that many lists of the greatest movies ever made include a mix of both successes and failures from box offices past. For every Star Wars (1977), Some Like it Hot (1959), and Casablanca (1942), there’s a Blade Runner (1982), a Groundhog Day (1993), and a Touch of Evil (1958). All are considered masterpieces now, but the latter group didn’t achieve success immediately and in fact weren’t fully appreciated until many years later. In some cases, a spectacular failure can even turn into a beloved classic completely out of nowhere. I’m sure nobody thought that director Frank Capra’s biggest box office failure would turn into his most beloved feature decades later: the Christmas perennial It’s a Wonderful Life (1946). That movie performed so badly that it shut down the company that made it, and yet today it is almost a sin for it not to air on network television during the holidays. These are clear signs that great movies always find their audiences eventually; it’s just that not all of them do it in the same way. Though the stigma of failure can plague a movie for a while, we’ve seen that quality does get appreciated in the end and that time can refresh a film’s perception in interesting ways. Why, we’re even seeing that now with notorious flops like Heaven’s Gate (1980), which was recently deemed worthy of a Criterion release despite its reputation.

But for these movies to exist at all, there has to be credibility in their value, and Hollywood, as much as it tries, can’t always predict how movies will perform in the long run. This ultimately affects which films end up getting made, and the need for immediate satisfaction is the prevailing desire on the part of those financing the projects. When a movie fails to make money, the studios become less likely to invest in something different, and that’s when we see fewer chances being taken. I would only ask Hollywood to consider the fact that movies, if they are good enough, can be more profitable in the long run, and that immediate box office won’t always be the last word on a film’s success. Take the case of Ridley Scott’s Blade Runner: it was a box office failure in its time, and people viewed it as a sign of Scott’s decline in stature in the industry. But with subsequent home video releases and airings on cable, the movie found an audience and became a cult hit. That cult status later hit the mainstream, and now Blade Runner is not only one of Scott’s most beloved films but is considered by some to be his masterpiece, over successes like Alien (1979) and Gladiator (2000). Most importantly, it has become a moneymaker for its studio, Warner Brothers; maybe not on a Star Wars level, but you’ll still see a fair share of memorabilia and special edition releases devoted to the film to this day, all of which generate plenty of money. This is a perfect example of a movie that has aged beautifully, like fine wine. It shows that you can’t dismiss a movie right away just because it didn’t give you what you wanted up front. That being said, nobody can predict how audiences’ tastes will change over time.

A large part of how a movie performs at the box office has to do with how well it answers the hype that surrounds it. Marketing does the work of generating attention for movies, and in many cases hype can be helpful and deserved. But there’s also the risk of putting too much hype on a film, because it can generate the wrong kind of attention. This was the case with many of the releases that failed at the box office this October. A lot of attention was drawn to the quality filmmakers and star power these movies had, and to the fact that they were about something important and/or artistically daring. In most cases they were, but the marketing failed to make that case to audiences. What I saw in the advertisements for these films was a desperate desire to make them appear important, which ended up making them appear indistinct. That’s the danger of Awards season marketing: studios want to make these movies look like contenders, like those that have succeeded before them, but in doing so they diminish what could have made them different from the rest. Steve Jobs, for example, is one of the most interesting cinematic experiments I’ve seen this year, telling the story of a major cultural figure in only three scenes, helped by masterful direction from Danny Boyle and a killer screenplay by Aaron Sorkin. Unfortunately, that daring artistic choice was not highlighted in the marketing, which made the movie look like just another biopic, which it is not. The same can be said about the downplaying of the artistic achievements in Crimson Peak and The Walk. As the Entertainment Weekly article states, this is a case where Hollywood fell victim to making “too many films for a similar audience.” But when you look at the films themselves, there’s nothing similar about them at all. It was only the marketing that made them look like they were of a similar vein, lumping them all into one category when each should have had its own spotlight.

And being the big winner of Awards season doesn’t always give a movie a long life span either. Anybody else remember Ordinary People, the Best Picture Oscar winner of 1980? Didn’t think so. There are other years where you can find many of the greatest classics Hollywood ever made all losing to a movie that few today even remotely remember. One of the more recent examples is 1999. That year, American Beauty walked away with the big awards, beating out movies like The Green Mile and The Sixth Sense. It probably made sense at the time, but sixteen years later, the 1999 movies that have aged the best are ones that weren’t even nominated, and American Beauty is not among them. This includes my own favorite film from that year, David Fincher’s Fight Club. The movie reached theaters amid mixed reviews from critics and a disappointing box office run, especially given that A-lister Brad Pitt was its star. But despite not clicking with the Hollywood elite initially, Fight Club did find success in the underground market, especially among college-aged youth at the time, and like Blade Runner it developed a cult following that eventually hit the mainstream. Now Fight Club is rightfully considered a classic, to the point where award-winning theses are written on college campuses across America discussing the philosophical questions raised by the film and its significance to cinematic art. Other 1999 films like The Matrix and The Iron Giant have likewise developed devoted followings and left a remarkable impact in the years since their release. The Iron Giant in fact recently received a special anniversary re-release, which is pretty remarkable for a movie that bombed when it first came out. All the while, American Beauty isn’t mentioned much today, much less seen as worthy of an anniversary re-release. Director Sam Mendes is far more heralded today for his James Bond movies than for the film that earned him an Oscar. It just shows that vying for end-of-the-year gold doesn’t always guarantee a long life span for your film, and that sometimes it’s much better to make a movie that builds an audience over time.

The other thing that determines a movie’s ability to find its audience is how it deals with the circumstances of its release. Like I stated earlier, failure in the beginning doesn’t mean failure for eternity in the whole of cinematic history. If a movie is worthy of it, it will eventually find an audience. Sometimes this is helped by viewing the film through the prism of nostalgia. This often happens with movies that are emblematic of the time they were made and feel unique when contrasted with the movies of today. Just look at any of the movies mocked on Mystery Science Theater. What seemed bland and sub-par in its own time can come off as charmingly ridiculous when taken out of its original context. The same goes for some of Hollywood’s more undiscovered classics. Fans of different genres can often find a hidden gem deep in the studio vaults, if Hollywood gives it a chance to be seen. That’s why the Film Noir, Western, and Sci-fi genres benefit from the passage of time: audiences who seek out unseen classics will almost always find what they’re looking for, simply because tastes are so diverse. Time ultimately makes us forget how a movie performed and instead lets us see the movie on its own merits, as a great story worth telling, and that’s what makes a classic in the end. Sometimes a great film was overlooked in its time just because the studio didn’t see any value in it and decided to bury it for years. Thankfully, with the resources we have now, nothing stays buried anymore, and even the forgotten are given a chance to shine. Blade Runner and Fight Club managed it on home video, and It’s a Wonderful Life did it on television. The more avenues a movie is given, the better chance it has to find its audience in the end, and all the great ones do eventually.

So, despite Entertainment Weekly’s worries that one bad month is an omen of ill tidings for the industry, it should not be a reflection on the movies themselves. A great film eventually finds a way to make money in the long run. Sadly, Hollywood is an impatient beast, and waiting a decade for returns is not a good way to run a business. So movies like Steve Jobs, Crimson Peak, and The Walk are going to carry the stigma of being disappointments for a while, and it will probably hurt their chances during Awards season, which is a little unfair. But Hollywood should understand that box office numbers are not always a sign of a film’s actual overall value. Sometimes a box office failure is discovered by an aspiring filmmaker, who is inspired by it and one day makes a game-changing film that does produce an immediate box office success. Overall, I’m saying that Hollywood execs shouldn’t be discouraged from taking chances once in a while. Yes, it’s good for business to travel down the safe route with predictable, name-brand fare that’s guaranteed to give you a big opening weekend. But if you have the opportunity to reach for greatness by making something different and challenging, it may give you decades’ worth of positive returns. Basically, you’re left with the choice between producing an opera or a fireworks show. Both have the potential to entertain, but one will stick with people far longer, despite costing you more initially. Hopefully the October releases this year can stick it out; Awards season has been known to pull movies out of the abyss of disappointment by giving them the spotlight through a deserved nomination. In that regard, it shows that playing the long game can be tricky, but at the same time, oh so rewarding.

Location, Location – The Silent But Crucial Supporting Character in Movies


The magic of cinema is the power to transport the viewer to another time and place. We can sit back in our seats at a local cinema or lounge in front of the TV in our living room and have the world around us slip away once we settle in and let the movie grab hold of us. To audiences, the movies are alive. A lot of work goes into pulling off that magic trick, whether it’s the effectiveness of the production and costume design or the authenticity of the actors’ performances. But if there is one aspect of filmmaking that sometimes goes unheralded, it’s the effectiveness of the setting itself. Yes, a lot of artificiality is involved in staging a scene in a particular place, especially when shooting entirely indoors on a manufactured set. But quite a few movies use the natural world as their setting, and just as much consideration goes into finding the right location for a film as it does finding the right actor for a role. There are many movies where the setting plays a crucial role in the story and, in many cases, is a character unto itself. It may not be an active player, but you will often find movies where the setting is either a threat to our main characters, a safe haven, or a place of endearment valued by many. A place can also have its own personality, based on the collective characteristics of its inhabitants. But when the importance of location is not taken into consideration, it can reflect poorly on the identity of the story itself. Over the years, we’ve seen many amazing locations presented in movies, but not enough has been said about the work that goes into making those locations an integral part of a movie’s success.

Producing a film often starts with the process of location scouting. Often supervised by the directing team itself, finding the right locations for a movie is crucial to realizing the vision of a storyline. It’s one thing for filmmakers to have an idea in their minds of what the setting will look like and how the story will progress within it, but it’s another thing to actually see it in person. Blocking a shot takes on different challenges when done in the real world. A director must deal with details and obstacles that wouldn’t normally occur on a controlled set, and this often leads to some interesting directorial choices. Sometimes a story can even change drastically in development when a location is found that presents a whole bunch of new possibilities to the filmmakers. Much of this comes down to the different ways a location lends itself to the camera. It can be the embodiment of one particular place in your story, or it can act as nowhere in particular and simply serve your needs. Sometimes you want a setting that looks unlike anything you’ve ever seen, but that also has a chameleon-like ability to be any number of places. It’s all at the discretion of the filmmakers. Because of a location’s impact on a story, locations often have to be chosen more carefully than the actors who inhabit them. And, as we’ve seen in many important and monumental films, locations and settings often make these movies stand out and retain their own identity.

Some filmmakers choose to use their movies not just to tell the story of their characters, but of specific places themselves. Most of the time, these are love letters to a filmmaker’s hometown or place of origin, most often a major city or a cultural region. Directors do this intentionally for the most part, but sometimes it just comes as part of the filmmaker’s own style. New York City is presented as a crucial part of more film narratives than probably any other place in the world. One particular filmmaker, Woody Allen, created an identity as a director by using the Big Apple in so many of his early films, identifying himself with New York while presenting a loving image of the city through his own cinematic eye. Films like Annie Hall (1977), Hannah and Her Sisters (1986), and Bullets Over Broadway (1994) probably wouldn’t have the same impact if they weren’t set within Woody Allen’s own idealized version of New York, which is often as quirky and unpredictable as the man himself. His Manhattan (1979) in particular is almost the very definition of a love letter to a single location. But as much as Allen celebrates the city’s wondrous aspects, other filmmakers celebrate New York in less glamorous ways. Spike Lee presented New York as a grittier place in his 1989 masterpiece Do the Right Thing, which depicted the racial tensions between law enforcement and the poorer black neighborhoods that undercut much of daily life in the city. Though far from the idealized New York of Woody Allen’s movies, Spike Lee’s NYC is no less potent a character; Lee celebrates the vibrancy of the people who inhabit it, and likewise the indomitable spirit of the city’s often forgotten poor. It shows how much character a single place can carry in a movie, even through different kinds of perspectives.

It’s another thing altogether to take a location and make it, believably, someplace that exists nowhere else in the world. What I’m talking about is recreating a place from a work of fiction by using real locations in different areas and stitching them together to create the illusion that it’s all one place. This is a trick that’s been used in Hollywood for many years, but it has grown in complexity as filmmaking tools have improved. Through the magic of editing, you can make real-world settings become anywhere you want them to be. This is often used to great effect in comic book movies, where New York City has on more than one occasion played the role of Metropolis in the Superman franchise. It helps to give extraordinary stories like those a more grounded reality, which in turn helps to transport the viewer more effectively into these fictional worlds. One filmmaker who has done this to spectacular effect is Christopher Nolan, who is renowned for his insistence on real-world authenticity in his epic-scale movies. He showed his expertise when he chose real-world locations for his Dark Knight trilogy. Sometimes his choices were easy to pin down (downtown Chicago acting as downtown Gotham City in The Dark Knight’s spectacular chase scene), but other scenes displayed quite a bit of ingenuity in making the fictional Gotham feel real. In The Dark Knight Rises (2012), Nolan managed to combine three different cities into one chase scene and make the audience feel like they were authentically touring a real Gotham City. It comes when Batman chases the villainous Bane on motorcycles, with the on-location shooting starting on Wall Street in New York, heading through the underground tunnels of Chicago, and ultimately ending up in downtown Los Angeles. That’s a spectacular use of multiple locations to make a fictional one feel as real as possible, and as a result, it gives the story a more authentic impact.

But this technique isn’t just limited to giving a fictional place authenticity; it can also allow a filmmaker to create any world they want, no matter how otherworldly, and still make it feel real. Inspiration often comes from the natural world here, as the camera can transport the viewer anywhere, with the story providing the context rather than the location. Natural wonders across our planet, especially obscure ones, often play the part of different worlds, and these are locations given special consideration during the scouting phase. In the fantasy and science fiction realms, a location has even more influence on the shaping of a story than anything else, so the better you can present it on film, the better. This is a case where locations must have a transformative effect, looking unlike anything we’ve ever seen while still coming off as believable, and this often leads to some very complex planning on the filmmaker’s part. Peter Jackson managed to do this spectacularly well with his Lord of the Rings and Hobbit trilogies, finding the ideal locations to make J.R.R. Tolkien’s Middle Earth come to life through the natural beauty of his native New Zealand. New Zealand was a mostly untapped source for location shooting before these movies came out, but Peter Jackson’s vision showcased it in a spectacular way while authentically visualizing the wonders of Tolkien’s world. It’s much better to see the Fellowship of the Ring climbing real mountains than to recreate it on a stage with visual effects. As a result, a natural-looking Middle Earth became as much a part of that series’ success as anything else, and that same devotion to detail is influencing many more movie projects today, not to mention boosting New Zealand’s tourism industry significantly.

But it’s not just the expanse or the many layers of a location that make it a significant factor in a story. Sometimes a single iconic look can drive the story along as well. Some movies can even be identified by a single iconic structure or a scene that utilizes the most unbelievable of settings. The Bradbury Building, for example, is a real place in Los Angeles known for the amazing ironwork within its atrium. The location has been used in many movies, but never more memorably than in Ridley Scott’s Blade Runner (1982), where it served as the location for the climactic showdown of that dystopian classic. It’s a great example of using an iconic location in a nontraditional way, giving it a whole other identity in the story beyond its actual purpose. But no filmmaker made better use of iconic locations than Alfred Hitchcock. Locations always played a crucial role in his stories, even during his early years in Britain, when he used the Scottish moors so effectively in The 39 Steps (1935). After coming to America, Hitchcock became enamored with the many different types of iconic Americana in our society, and his later movies would highlight many of them in spectacular and sometimes even subversive ways. Some are showstoppers, like Kim Novak’s attempted suicide by the Golden Gate Bridge in Vertigo (1958), or the thrilling chase across Mount Rushmore in North by Northwest (1959). But Hitchcock could also make minor locations take on identities of their own, including the most frightening roadside motel ever in 1960’s Psycho. As is often the case with these movies, it’s the singular location that stands out, and more often than not, it defines the movie as a whole. In the case of Psycho, it’s the Gothic mansion that becomes the selling point of the movie, not the actors, which tells you a lot about the power an iconic location can have on its audience.

Though a lot of movies take the importance of a location into consideration, a movie can also run the risk of feeling disjointed from one. Filmmakers working on a much smaller scale sometimes don’t see the importance of a location, but they run the risk of limiting their storytelling options that way. A setting can reveal many different things about the characters, sometimes in unexpected and unplanned ways. It’s part of piecing together a character’s life outside of the narrative, revealing to us how they live day to day within their larger world. Showing that a character lives in the city may hint at a more cosmopolitan side to their personality; if they come from the country, perhaps they have a more laid-back and simple outlook on life. If your character is from Genericsburg, U.S.A., then it’s more likely they will have no defining characteristics at all. In the movies, a character is defined by their surroundings more than anything else, and that’s why a setting is often the most important supporting factor in their story. Authenticity is also a huge factor, because audiences can tell whether or not a movie is accurately reflecting a real location. One of the worst examples I’ve ever seen of using a location in a movie is Battle: Los Angeles (2011). Speaking as someone who lives and works in LA, I can tell you that this particular film in no way, shape, or form looks or feels like the real city. And that’s because not a single frame of it was shot there. The whole thing was filmed in Louisiana, with sets constructed to look like streets in Los Angeles and nearby Santa Monica. Unfortunately, that robs the film of its character by making it feel so fabricated. The result is a generic action flick that will tell you nothing about the city of LA, showing the downside of not treating your location with the respect it deserves. If you want an authentic portrait of Los Angeles, watch some of Michael Mann’s films, like Heat (1995) or Collateral (2004).

It may not be apparent the first time you watch a movie, but the setting of the story plays perhaps the most crucial role in its overall effectiveness. It can come through specific intention on the filmmaker’s part, wishing to highlight a specific place, or by finding a setting that perfectly supports the action and characters that exist within it. It is the silent supporting player in a movie’s plot, and it can surprisingly be the thing a movie’s success hinges on. Any filmmaker who values capturing a sense of reality in their movie will tell you how much they appreciate the variety of cinematic choices an ideal location gives them. If the location is interesting enough, anywhere you point your camera will reveal new things for the audience, helping to enrich their experience. When I was working on sets back in film school, I often enjoyed the location shoots much more than the ones in a studio. A real, authentic location just has a lot more variety, even when it isn’t meant to represent any particular place. One of my favorite location shoots was on a film set called Four Aces, located out in the Mojave Desert, about a two-and-a-half hour drive from Hollywood. It’s been used for films like Identity (2003) and Kill Bill Vol. 1 (2003), plus a dozen or more music videos and commercials, as well as the student movie that I crewed on. What struck me is how this fabricated set, made to look like a gas station with an attached diner and motel out in the middle of nowhere, could be so many different things and yet almost always stand out with its own identity no matter what project it was in. That’s the power of having a great location in a film. Locations are sometimes the most important supporting character a movie can have, and they become so without saying a single word of dialogue.

The Big Twist – The Rise and Fall of M. Night Shyamalan

One of the most valuable things to have in the film industry is a unique voice. Whether it’s through the lens of a camera or the mastery of the written word, distinguishing yourself among all the other artists in film is something everyone aspires to. Many try, but few actually achieve the status of true originality. Oftentimes, in order to make a living in the film industry, filmmakers will sacrifice originality and adopt a standardized style that gets them work more readily. Other artists will toil for years to create something that appeals to their senses, and possibly alienate their audiences with too much artistic self-indulgence. But there comes a time when some artists are struck with inspiration and create something unexpected that propels them to the next level, which can carry its own consequences. This particular career trajectory happened to Indian-American filmmaker M. Night Shyamalan. Shyamalan, perhaps more than any other filmmaker in recent memory, has had one of the most tumultuous careers in film. At one time a struggling wannabe filmmaker, Shyamalan managed to break out with an unexpected hit called The Sixth Sense (1999), which led to high demand for his next projects. The Sixth Sense‘s unbelievable success was both a blessing and a curse for the director, because even though it propelled his career and made him a household name overnight, it also laid unrealistic expectations on him, something that has plagued him ever since and ultimately turned Shyamalan’s career into something of a cautionary tale.

M. Night Shyamalan can be considered either a unique visionary or a pretentious hack, depending on who you talk to. But one thing is certain: his career has taken a tumble over the years. He did follow up The Sixth Sense with another critical hit, Unbreakable (2000), and a box office smash, Signs (2002), showing that he’s more than just a one-hit wonder. But the movies he’s made since then have either been panned by critics, flopped at the box office, or both. And the strange thing is that the most common reason people give for hating his movies is the director himself. It has become a bizarre reversal of fortune for Shyamalan. At one time, his career was so hot that putting his name above the title was a mark of quality. Now, film studios actively hide his involvement in film projects so as not to incur the wrath of hostile audiences. But why has Shyamalan’s brand fallen so far? Audiences hold his movies up to more scrutiny than those of any other active director, and it doesn’t have anything to do with the man himself. As a person, Shyamalan seems like a nice guy with nothing in the way of negative baggage. So why the hate? Simply put, his rise and fall as a director has to do with the way we value the quality of storytelling, and how much effort a filmmaker puts into his own work. In the case of M. Night Shyamalan, we experienced the arrival of a unique voice in Hollywood who unfortunately couldn’t shake off the shadow of his own meteoric rise. By sticking to the formula for success that he pioneered, Shyamalan became a parody of himself.

When you look at the career of M. Night Shyamalan, the one thing that instantly defines the whole of it is the term “plot twist.” Shyamalan proved to be a master of pulling the rug out from under his audiences and presenting them with plot swerves that no one saw coming. In fact, if one were to compile a list of the greatest plot twists in movie history, I’m sure you’d find two of his there. The first, and really the thing that put Shyamalan on the map, was the big twist at the end of The Sixth Sense. I won’t spoil it here (though honestly, who doesn’t already know it by now?), but it hit audiences so hard that it sparked a word-of-mouth campaign that boosted the film’s box office numbers, based on the notion that everyone had to experience it fresh to get the true impact. The twist took on a legendary life of its own, and people were anxious to see if Shyamalan could one-up himself the next time around. What he made next proved that he indeed had more tricks up his sleeve, but in a wholly unexpected way. Moving from ghost stories to superhero origins, Shyamalan crafted an equally compelling film called Unbreakable, re-teaming with Sixth Sense star Bruce Willis. Unbreakable didn’t have quite the box office success that The Sixth Sense did, but it was well received by the audiences who saw it, including myself (I named it my favorite film of 2000, and I still stand by the pick). What was remarkable about Unbreakable was that Shyamalan managed to work in another unexpected plot twist, one that rivaled his last. The twist would soon become a Shyamalan trademark, an expected part of his later projects, including his follow-ups Signs and The Village (2004).

But when your career trademark is something that is supposed to be unexpected, it loses its power once it stops being a surprise. This is largely what caused the downturn in Shyamalan’s later career. His style no longer had the power to surprise. When he was just starting to make a name for himself, he could blindside his audiences with his twists, because they were hidden within a less familiar style. After seeing The Sixth Sense and Unbreakable, people anticipated the twists, which made it harder for Shyamalan to be creative with them. As a result, his twist endings became more convoluted and pretentious over time, losing their intended impact and leaving the audience underwhelmed. A perfect example of how poorly his trademark twists could be handled is The Village. The premise is intriguing: an isolated turn-of-the-century village is menaced every night by cloaked monsters, and the only thing that saves its residents is strict adherence to traditional customs and complicated rituals meant to ward off the intruders. The movie at times has some chilling tension, as well as good performances from actors like Joaquin Phoenix and Bryce Dallas Howard. But all the momentum of the story is undermined once the truth behind the monsters is revealed. Spoiler: they’re not real. Not only that, but the big twist at the end (that all of this is really set in modern times) is telegraphed way in advance by the way some of the adult characters speak to one another. Overall, The Village proved that M. Night Shyamalan’s formula wasn’t infallible, and that by forcing it into a story that would’ve been better served without it, the twist became a negative rather than a positive.

Another factor that alienated Shyamalan from his audience was his insistence on showing off his style in every movie. When Shyamalan was unknown, his flashy style was more effective, because it helped him stand out. People saw the clever use of color symbolism in The Sixth Sense and the cold, washed-out cinematography of Unbreakable as bold choices made by a man who knew exactly how cinema should work. But those two movies were perfectly suited to the Shyamalan style. Once the director moved out of his comfort zone into other genres, his style became distracting, forced into movies where it wasn’t needed. Lady in the Water (2006) should have been an uplifting fantasy tale, but it ended up bogged down by Shyamalan’s deliberate pacing. The Happening (2008) takes itself way too seriously given its ludicrous premise, and ends up unintentionally hilarious for its ineptness. Both should have been interesting experiments for him, but he chose neither to take advantage of the opportunities nor to challenge himself. Shyamalan’s major fault was his inability to adapt his style over time. Many directors take on a variety of projects in different genres, but they make the jumps most effectively when they conform to what is best for the project, rather than forcing their own style into places where it shouldn’t be. That’s how Steven Spielberg can be the director of both E.T. the Extra-Terrestrial (1982) and Schindler’s List (1993), and Martin Scorsese the director of both Goodfellas (1990) and Hugo (2011). It’s the secret to longevity in the business: challenging yourself with variety rather than staying in your comfort zone.

At the same time, M. Night Shyamalan had to deal with the cost of fame and the unrealistic expectations laid out in front of him. In that sense, those of us who considered ourselves fans of Shyamalan are also to blame for his downfall. We expected far too much of the man and unfairly blamed him for not meeting our demands. There is a peculiar thing in our collective culture where we like to build up icons only to tear them down later if they show us even the slightest hint of impurity. Shyamalan certainly has his faults as a filmmaker, but he at least earned the spotlight that would put everything he made under harsher scrutiny. For him, the rise in public image was too fast and too overwhelming, probably because Hollywood was all too eager to crown a fresh new face in the business. Newsweek prematurely declared Shyamalan “the next Spielberg,” which was a little unfair to a director who had only a couple of movies to his name at that point. Though, truth be told, Shyamalan bought into some of the hype himself by creating a brand to distinguish his works from everyone else’s. Not only did he put his name above the title for a while, but he injected himself into all of his movies as well. And this was more than just Hitchcockian cameos. In Lady in the Water, there is a writer character who is prophesied to one day change the world with his work, so naturally Shyamalan cast himself in the role. Not very subtle there, M. Night. Still, he’s not alone as a fallen idol in Hollywood. Whether you’re Orson Welles or Michael Cimino, Hollywood seems to enjoy tearing down its wunderkinds whenever they fly too close to the sun, perhaps as a way to curb unchecked ambition. But even though Shyamalan contributed to his own fall from grace, the pedestal on which he stood shouldn’t have been so high to begin with.

In addition to the unfair expectations, there were the unfortunate circumstances of M. Night Shyamalan becoming involved in projects that were never a good fit for him in the first place. As a way to keep working while his own ambitious projects failed, Shyamalan took on directing duties for projects developed from outside sources. These included the big-budget adaptation The Last Airbender (2010), based on the popular Avatar animated series on Nickelodeon, as well as the Will Smith sci-fi vehicle After Earth (2013). Both films were slammed by critics and audiences, and a lot of the blame was laid at Shyamalan’s feet, which I find a little unfair. For one thing, adapting Avatar: The Last Airbender into a two-hour movie was doomed from the start. I’ve never seen the series, so after watching the movie, I didn’t have the visceral hatred for it that other people did. It’s certainly flawed, but only from a storytelling standpoint, which I attribute to the adaptation, as well as to some very poor casting choices. What I do admire in the movie is seeing Shyamalan branch out and try new things. The Last Airbender is the least Shyamalan-esque movie he has made, and it’s interesting to see him work with bigger scale and scope. Then again, if the purpose was to adapt the series faithfully, I can understand the view of it as a massive failure. But was that Shyamalan’s fault, or Nickelodeon’s for believing he was right for the material? With regards to After Earth, I lay more blame on Will Smith than on Shyamalan, because Smith was clearly the driving force behind the movie, using it as a platform to spotlight his less talented son, Jaden Smith. Sadly, Shyamalan was just along for the ride, and it dropped his stock even more, even though his mark on the movie was minimal. It just shows that circumstance was also a part of the director’s downward spiral, and that disappointments aren’t always of the artist’s own doing.

So, is Shyamalan forever doomed to be a shadow of his former self? That’s entirely up to him. What I have seen in recent years from Shyamalan is a drive to reinvent himself as a filmmaker and try new things. Sadly, those new things have failed him, but not for lack of effort on his part. What I do like is that he’s abandoning the things in his career that were clearly holding him down, like the trademarks that had lost their effectiveness and the self-indulgent directing choices. He’s also injecting himself less and less into his own movies: fewer cameos and no name above the title anymore. But can he make it work? This week marks the premiere of his new film, The Visit, his return to the suspense thriller genre after a long absence. Already he is receiving far more favorable reviews from critics for this one (it currently has a fresh rating on Rottentomatoes.com), something he hasn’t achieved in over a decade. The probable reason is that it’s a smaller-budgeted, more intimate story, relying on far less hype and instead allowing M. Night Shyamalan to craft something closer to his own interests. The Visit appears to be a perfect recharge for the director: free of the Shyamalan brand and able to stand on its own merits. And if that’s what Shyamalan needs to regain his status as a respectable filmmaker, then it will be just the right thing for him. Overall, the rise and fall of M. Night Shyamalan is a Hollywood cautionary tale, showing us the risks of becoming too big too fast, and also showing how we can fall victim to too much hype and/or our own lack of restraint. M. Night Shyamalan learned all of this the hard way, and only now is he starting to steer his career back in the right direction. Who knows? Perhaps the greatest twist Shyamalan has ever made is the one that may yet decide which way the rest of his career will go.

Scraping the Bottom – Has Hollywood Truly Run Out of New Ideas?


Check your local theater listings and see if you can spot any movie that sounds wholly original and unlike anything you’ve ever seen before. That’s becoming a rarer sight nowadays. Sure, you’ll find something at your local art house cinema that’s different and groundbreaking, but independent cinema runs by a different and less risky set of standards than the big studios. Hollywood seems caught in an endless loop of recycling everything that has worked in the past. Sometimes feeding upon audiences’ sense of nostalgia has worked out; just look how well it worked for Jurassic World (2015) this year. But for every sequel, remake, and reboot that does hit its mark, there are a hundred or more that don’t. A lot of factors can play into that: either Hollywood is just cashing in on a name brand and nothing else, or an experimental re-imagining goes horribly wrong and stains a franchise, or the property being remade just doesn’t have any relevance left to ever be taken seriously again. And yet, with all the failed attempts to capitalize on old ideas, Hollywood is still eager to invest in them anyway. All this has led many to speculate whether Hollywood has truly, unequivocally run out of ideas. This complaint has been made about Hollywood for years, even through some really transformative and groundbreaking periods, but it feels more and more like the case. We are now in a period where Hollywood has become, for better or worse, nostalgia-crazy, with remakes and reboots favored for production over new and bold ideas. As a result, Hollywood capitalizes on the reliability of a built-in audience, while at the same time stalling any chance that its yearly products will have any impact outside of their era.

One thing that Hollywood is missing out on right now is the chance to make movies that can define an era and redefine the rules of cinema. Every decade or so, we’ve seen trends and cultural movements reflected back in the movies made within the same time period. This has helped every decade feel unique, whether it’s the classiness of the 50’s, the psychedelia of the 60’s, the grittiness of the 70’s, the excess of the 80’s, or the digital revolution of the 90’s. But with the advent of the internet age in the 2000’s, and the increased accessibility of media from all eras, entertainment suddenly became less grounded in its own era and instead began to focus more on the nostalgia of past trends. With online social networking becoming an increasingly reliable way to gauge the likes and dislikes of an audience, Hollywood picked up on the fact that nostalgia plays a significant part in determining what people choose to watch in the theater or on TV. As a result, long-dormant franchises were suddenly revitalized in order to capitalize on audiences’ awareness of and long-held attachments to them. Sometimes a revitalized franchise is welcome, especially if there is new territory waiting to be uncovered in its cinematic world (Star Wars Episode 7 being a prime example). But when Hollywood decides to capitalize on a brand name without exploring new ground, the result is rejected by fans of the old while losing any chance of gaining new audiences. This has unfortunately happened to too many beloved franchises and singular films that have succumbed to the “reboot” bug in Hollywood, and this over-reliance on nostalgia has made the last decade or so a characterless era in filmmaking.

This year in particular has been flooded with remakes, reboots, and sequels. In fact, the three highest grossing movies of the year are from already established franchises (Jurassic World, Avengers: Age of Ultron, and Furious 7). Sequels at least have the advantage of continuing an ongoing story, which makes their presence far more expected. But even with these successful films, I don’t think anyone would consider them groundbreaking, especially when compared to their predecessors. Of all the big studio tentpoles released this summer, only one could be considered an original idea (Inside Out). In most cases, animated films seem to be the only venue open to new ideas in Hollywood, and even there we find an increasing trend of sequel-itis. Overall, the danger of relying too heavily on established brands is that it creates a less diverse output. If you are only in the business of marketing around a single intellectual property, you will also be subject to the pitfalls of that same property once its relevance has run out. Hollywood needs to continually replenish itself with new ideas in order to keep audiences interested long term, but sadly, new ideas in Hollywood are often viewed as not worth the risk. That’s why we see more original ideas develop in the independent market: Hollywood would rather work with what it knows than what it doesn’t. And thus, if you’re a filmmaker with a vision, you’d better find an investor outside of the studio system, because Hollywood is looking for more Transformers and fewer Ex Machinas.

Very little of this reliance on nostalgia has actually helped Hollywood either. Most of the time, audiences whole-heartedly reject remakes. I think there’s a misconception in Hollywood that remaking a past film and updating it to our time period will make it relevant once again. But, as is almost always the case, updating a beloved classic strips away part of its original charm. A dated film has its own kind of entertainment value, and the reason we love some of these movies is because they are so steeped in their time period. A perfect example is the 1990 Paul Verhoeven sci-fi classic Total Recall. Despite being set in the future, Recall is an undeniably late-80’s, early-90’s film, based on the styles of the era and the limitations of the visual effects. And you know what? That’s what audiences embrace about the movie. In fact, Total Recall has aged quite well over the years as an entertaining time capsule of its era, even while looking absurdly out of date. The reputation of the movie remained strong, leading its distributor (Sony/Columbia/TriStar) to believe that there was potential in the name itself that could be exploited with improved technology and revised visions of the future. Thus, we got the 2012 remake starring Colin Farrell in place of Arnold Schwarzenegger. The remake sticks more closely to Philip K. Dick’s source story and features the latest in CGI visual effects, but you know what it lacked? Entertainment value. Gone was the goofy charm of Verhoeven’s original, in favor of a sluggish, more serious tone that completely drains the story of any fun whatsoever. As is the case with many pointless remakes, newer doesn’t always mean better, and some classic movies are better left untouched, even if they look cheesy and dated.

Remakes are one thing, since they can be easily dismissed and forgotten in favor of the original. Reboots, on the other hand, run the more dangerous road of ruining the legacy of a beloved franchise. Now, if done well, some reboots are welcome. The recent resurgence of Planet of the Apes, for example, has proved successful, because it honors the roots of where the series began while doing something new and different with the franchise. But there are other examples where Hollywood tries to squeeze every last ounce out of a series that should have been laid to rest years ago, with a pointless reboot meant to restart a chapter that doesn’t need to be explored. A perfect example is the recently released Vacation (2015). The Vacation series started with the 1983 original from National Lampoon, starring Chevy Chase and Beverly D’Angelo, which followed the Griswold family on a road trip across America to a popular California amusement park named Walley World, complete with hilarious mishaps along the way. An excellent standalone comedy, National Lampoon’s Vacation spawned three sequels: one a genuine classic (Christmas Vacation) and two that were bland and forgettable (European Vacation and Vegas Vacation). Now, long after the series has run out of steam, the Vacation franchise is being rebooted as a starring vehicle for Ed Helms, here in the role of the grown-up Griswold son, Rusty. While two of the three Vacation sequels were not very good, they at least tried to take the series in different directions. This reboot just rehashes the plot of the original, minus the originality and the charm. I just know this reboot will fail, because you can’t replace the brilliant writing of John Hughes and the peerless direction of Harold Ramis with gross-out humor we’ve seen a million times before. But Hollywood still seems to believe that name recognition alone is worthy of investment, and that’s why it wants reboots to take hold and extend franchises out longer than they need to go. I really hope that doesn’t happen here, because a Christmas Vacation remake would absolutely destroy me.

But the primary reason why sequels, reboots, and remakes continue to dominate the Hollywood landscape today comes down to one simple fact: movies are expensive to make, and established brands are more reliable investments. Any studio can put its money behind a huge, epic-scale production based on an original idea, but whether or not it makes them any money is determined solely by us, the audience. Sometimes we forget that Hollywood is a profit-based industry that must continually produce hits in order to survive, not an artistically driven enterprise; so it’s not all that strange to see so many studios turn away from newer ideas. Movies are multi-million dollar investments, and the safest bet will usually be seen as the best bet. But Hollywood’s reliance on safe bets must also contend with changing trends in the markets. Sometimes what proved to be a profitable franchise one year will suddenly be old news the next. Not only that, but production turnaround is notoriously sluggish, especially on big tentpoles, so if audiences have lost interest in your film by the time it’s released, you’re completely out of luck. Movie audiences have a much more diverse and evolving taste than many might realize, and those unpredictable swings in audience preference can have unexpected effects on the industry. Disney capitalized on its audience’s sense of nostalgia when it turned one of its theme park rides into a profitable franchise with Pirates of the Caribbean (2003), but several sequels later the novelty wore off, and audiences were no longer interested in theme park-inspired movies, shown clearly by the box-office failure of the ambitious Tomorrowland (2015) this year. And it’s that wearing off of novelty that really shows the negative effects of continually trying to recycle ideas over time.

What really worries me about Hollywood’s play-it-safe attitude and absence of originality is that it’s leaving this millennial era we’re living in devoid of character. Say what you will about how dated some of the movies made in the 60’s, 70’s, and 80’s look today; at least their time period helps to define them long after they were made. When you look at movies like Back to the Future (1985) or Saturday Night Fever (1977), you can’t help but see the markings of their era on full display, and that’s what has helped them endure all these years later. Today’s movies don’t reflect our contemporary society as much anymore, because it seems many of them are trying too hard to avoid the stigma of becoming dated. The unfortunate by-product is that those movies will neither age well nor leave an impact on the era they exist in. Honestly, the only movies today that I can imagine being fondly remembered decades from now are the ones that touch upon contemporary issues or trends, like a movie that addresses the social ramifications of online networking or the advancement of LGBT rights here in America. Sure, years from now we may look at the social and political movements of today as quaint compared to the issues of the future, but those movies would provide a great cultural touchstone for what this era was like for us, as movies have in so many decades before. Rehashing old ideas just wipes away any defining cultural touchstone we might have. Of course, the other major way to bring originality back to Hollywood is to have producers willing to stick their necks out for something bold and new. Every now and then we get a visionary director who manages to build up enough good will in the industry to make their dream project a reality, with a budget substantial enough to make it work, like when Christopher Nolan was allowed to make Inception (2010) for Warner Brothers. Hollywood needs to be in a groundbreaking mindset, much like it was in the 70’s with the rise of New Hollywood. But of course that involves taking risks, and in that era too, you had to go through the occasional Heaven’s Gate (1980) before an Apocalypse Now (1979) could emerge.

So, is Hollywood completely out of ideas?  If its current trend of appealing to audiences' nostalgia continues, then it might actually be the case.  There are only so many ideas that can be done over and over again before audiences grow bored with them, and new ideas are absolutely necessary to keep the business alive.  Unfortunately, studios aren't looking toward the long-lasting impacts that their movies could hold; they just want to maximize what they already have because it's the less risky option.  And sadly, the upcoming slate of movies looks more and more like everything we've seen before.  Superheroes are less likely to be reborn so much as recast; franchises will continue to rehash the same plot points as opposed to extending off in a new direction; and beloved movies of the past will be given watered-down updates that remove all the charm that the originals had.  I am seriously dreading that Point Break remake, as I'm sure many of you are as well.  It's really up to us, the audience, in the end to determine the direction of this trend in Hollywood.  Ideas are out there; they are just not getting championed loudly enough to get the attention of the people at the top of the industry.  If audiences reject half-assed attempts to appeal to our sense of nostalgia by exploiting established brands, then the industry will start looking for other properties to base movie productions around, and that may even lead some studios to take risks once in a while.  Sometimes that can result in failure at first, but even failures can turn into successes in the long run; look at Blade Runner (1982), or The Iron Giant (1999), or Fight Club (1999), all box office failures that are now considered masterpieces.  So, for your own survival, Hollywood, you need to procure those rising visionary filmmakers, skim through that list of "black list" screenplays, and find the next great big idea that could extend your reach in the industry and leave a cultural impact for generations to come.

Breaking the Illusion – The Uses and Misuses of Visual Effects

jurassic park t-rex

Though visual effects have been a part of cinema since the very beginning, it's only been in the last three decades that we've seen huge leaps and bounds made thanks to Computer Generated Imagery (CGI), up to the point where anything is possible on film.  It has been an undeniable driving force of change, both in how movies look compared to several years ago and in what kinds of movies can be made.  We have visual effects to thank for making worlds like Middle Earth and Narnia feel like they actually exist, and for making extraordinary events here on Earth seem all the more tangible on the big screen.  But, even with all the great things that computers can do for the art of cinema, there is also the risk of having too much of a good thing.  While CGI can still impress from time to time, some of the novelty has worn off over the years as techniques have become more or less standardized.  Hollywood sadly seems to value CGI a bit too much, and its over-reliance on the medium has unfortunately had the effect of making too many movies look artificial.  The curious result is that movies using practical effects and subtle CGI now look far more epic and visually impressive than the films that use it in abundance.  Part of this is because more of the audience can tell the difference between what's digital and what's real today than ever before, and part is because we admire the work put into something hand-crafted.  Using CGI for filmmaking is not a bad thing at all; it's just that there has to be a purpose and necessity for it to work.

The sad reality of the last decade or so is that filmmakers have seen CGI as a shortcut in storytelling rather than as an aid.  Back in the early days of CGI, filmmakers were limited by what computers were capable of rendering at the time, so if they had to use them, it needed to be perfect and absolutely crucial.  Now, anything can be rendered realistically, whether it be an animal, a place, or even a person, and it comes very close to looking 100% authentic.  But, even with all these advancements in technology, filmmakers are still learning the best ways to use them, and quantity trumps quality in many cases.  Usually it's a decision dictated by studios and producers who want to save a buck by shooting scenes in front of a green screen instead of on location, but there are also filmmakers who have indulged too much in CGI effects on their own.  Thankfully, there are filmmakers out there who insist on using tried and true practical effects, but their influence doesn't extend to the whole community.  As with all of filmmaking, it's all about story in the end, and whether or not the tools that you have are able to serve it in an effective way.  Would you rather watch CGI bring to life a talking raccoon and his tree monster friend, or watch two hours of CGI-animated robots fighting?  It really comes down to what impresses us the most, and usually the quality of the movie itself factors into that.  But, regardless of the quality of the flick and its effects, there seems to be a lot of bingeing going on with regard to CGI, and it makes you wonder if Hollywood is doing itself a disservice by not diversifying.

It helps to look back at a time when CGI was still a novelty to see where its value lies.  Developed in the late 70's and early 80's, CGI saw some of its earliest and briefest uses in movies like Star Wars (1977).  A few years later, Disney created the movie Tron (1982), which made use of CGI environments for the first time in film, albeit on a very primitive level.  But, even with Tron's limitations, it still showed the promise of what was to come, and it stood out strongly in an industry that still valued practical effects like matte paintings and models.  Soon after, the movie Young Sherlock Holmes (1985) introduced the first CGI effect integrated into a live action film (the stained-glass knight scene), which paved the way for more digital additions in movies.  And then, in 1993, we got the mega-hit Jurassic Park.  Directed by Steven Spielberg, Jurassic Park was the biggest leap forward in CGI that the industry had seen to date, because more than any other film before it, it showed the true potential of what CGI could create.  Naturally it helped to have someone like Steven Spielberg at the helm, given his comfortable history of using special effects in his movies, but this was on a level unseen before.  Originally, the plan was to use stop motion animation to bring the dinosaurs to life, since that had been the standard in Hollywood ever since the brilliant Ray Harryhausen made it popular.  Thankfully, engineers at ILM (Industrial Light & Magic) convinced Spielberg to take the risk, and the result brought us dinosaurs that both looked and moved realistically.  Only CGI could've made those creatures move as smoothly as they did, and since then, it has been the go-to tool for bringing to life characters and creatures that otherwise could never exist.

But, what is even more remarkable about Jurassic Park's legacy is not just the fact that it was a great movie with amazing effects; it's also a film that has held up remarkably well over time.  It's unbelievable to think that the movie was made over 20 years ago, at a time when CGI was still maturing.  You would think that time would make the movie look dated by now, but no; the CGI still holds up.  This is partly due to the filmmakers who busted their butts to make the CGI look perfect, but another reason is that the CGI animation is not overdone.  In fact, there are actually not that many computer-enhanced shots in the entire movie.  Whatever CGI moments there were had help from practical effects that blurred the lines between the different shots.  The only times the movie uses CGI is when the dinosaurs are shown full-body.  When close-ups were needed, the filmmakers would use an animatronic puppet, or sometimes just a movable limb.  It was a way of keeping old tricks useful while still leaving room for the new enhancements, and the result works spectacularly well.  Filmmaker Walt Disney had a philosophy about using special effects in his movies: never use the same effects trick twice in a row between shots, because it would spoil the illusion.  You can see this idea play out in many of the amazing moments found in Mary Poppins (1964), a groundbreaking film in its own right.  Like Jurassic Park, Mary Poppins mixes up the effects, helping to trick the eye from shot to shot.  By doing this with the dinosaurs in Park, the filmmakers made the CGI feel all the more real, because it would match perfectly with the real on-set characters.  It's a balance that redefined visual effects, and sadly it has not been replicated much in the years since.

Jurassic Park has seen it’s share of sequels over the years, with the fourth and latest one, Jurassic World (2015) making it to theaters this week.  And interestingly enough, each one features more and more CGI in them, and fewer practical effects.  Some of them look nice, but why is it that none of these sequels have performed as well as their predecessor?  It’s probably because none of them are as novel as the first one was, but another reason could be that the illusion is less impressive nowadays in a world inundated with CGI.   Somehow a fully rendered CGI T-Rex attacking humans in a digitally shot, color-enhanced image in Jurassic World doesn’t have the same grittiness of a giant puppeteered T-Rex jaw smashing through a glass sun roof on a climate controlled sound-stage in Jurassic Park.  Sometimes it helps to look old-fashioned.  Some of the action may be impressive in Jurassic World, but you won’t get the same visceral reaction from the actors on screen that you got in the original.  The reason why you believed actors like Sam Neill and Jeff Goldblum were in real danger is because they were reacting to full-sized recreations of the dinosaurs on stage.   Chris Pratt may be a charming actor, but you feel less concern for his character in the movie when you know that all he’s acting opposite of is probably just a tennis ball on a stick.  The original Jurassic Park made it’s CGI the glue that stitched together all the other effects, and that made the movie feel more complete.  World has the benefit of having the best effects tools available to it, which is better than what The Lost World (1997) and  Jurassic Park III (2001) had, but it still won’t have the same visceral power of the original, and that’s purely because it’s moved so far in one direction from where it started.

Hollywood in general has abandoned many of the old, traditional effects in favor of more CGI.  Some of this has been for the better (does anyone really miss rear projection?).  But too much can sometimes hurt a movie.  This is especially true when filmmakers, even very good ones, become too comfortable with the technique and use it as a shortcut in storytelling.  George Lucas unfortunately became too enamored with the limitless potential of CGI, and used it to an almost absurd level in his Star Wars prequel trilogy.  Yes, it looked pretty, but nearly every shot in those movies was digitally enhanced, which only highlighted the artificiality of every scene and distracted from the story.  It gets annoying in certain parts where you can obviously tell that the actors are standing in front of a green screen in scenes that could have easily been shot on location.  For Lucas, I'm sure part of the allure of making his movies this way was that he didn't have to deal with location shooting problems like climate and extras.  But what I think he failed to recognize is that part of the appeal of the original Star Wars was the fact that it was imperfect in spots, which made the special effects stand out that much more.  By trying to make everything more glossy, he unfortunately made his world look fake, showing that CGI is not a fix-all for everything in cinema.  And Lucas wasn't alone in making this misjudgment.  The Lord of the Rings was another groundbreaking film series in terms of effects, largely because director Peter Jackson applied an all-of-the-above approach to making Middle Earth appear real, including extensive use of models and location shooting.  When he set out to bring The Hobbit to the big screen, Jackson shifted to rely more heavily on CGI.  While it doesn't ruin the experience as a whole, one can't help but miss the practical and intricate model work that was passed over this time in favor of fully CGI-rendered locations.  In both cases, more didn't exactly equal better.

In recent years, it’s actually become more ground-breaking to avoid using CGI in the crafting of a movie.  Some filmmakers like Christopher Nolan are making it part of their style to do as much as they can on set before having to use CGI for the final film.  When you see something like the hallway fight scene in Inception (2010), you’re initial impression will probably be that CGI had to have been used for that moment at some point.  That is until you’re blown away by the fact that nothing had been altered in that shot at all.  It was accomplished with mounted cameras, a hydraulic gimble machine, and some well-trained actors; a low tech feat pioneered years back by Stanley Kubrick in 2001: A Space Odyssey (1968) and implemented to the next level by Nolan.  That helps to make the scene feel all the more real on screen because it uses the camera and the set itself to create the illusion.  Doing more on set has really become the way to make something big and epic once again in movies.  We are more impressed nowadays by things that took their time to execute, and if the finished result is big enough, it will hold up against even the most complex of CGI effects.  That’s why we’re seeing a come-back of sorts in recent years with regards to practical effects.  It’s manifesting itself in little, predictable ways like using real stunt cars and pyro explosions in Furious 7 (2015) or in big ways like having Tom Cruise really hang off the side of a plane in midair in the upcoming Mission Impossible: Rogue Nation (2015).  And J.J. Abrams is bringing practical effects back to the Star Wars franchise, which is step in the right direction as well.

Overall, a movie’s special effects are more or less tied to how well they work in service to everything else.  Too much or too little CGI effects can spoil a picture, but not using it at all would also leave a movie in a bad position.  Today, CGI is a necessary tool for practically every movie that makes it to the silver screen, even the smaller ones.  A small indie film like Whiplash (2014) even needed the assistance of CGI when it had to visualize a car accident halfway through the movie.   It all comes down to what the story needs, and nothing really more than that.  Of course, there are boundless things that CGI can bring to life out of someone’s imagination, but sometimes a film is better served by taking the practical way when creating a special effect.  Watch some of the behind the scenes material on the Lord of the Rings DVDs and tell me if it wasn’t better in some cases to use practical effects like models and forced perspective to enhance a scene instead of CGI.  Sure, some creatures like Gollum and Smaug can only be brought to life through a computer, but it’s only after the animators had the guide of on set performances given by actors as talented as Andy Serkis and Benedict Cumberbatch.  Plus, physically transformed actors in make-up come across more believably than their equivalents in CGI form, with exceptions (Davy Jones in the Pirates of the Caribbean series).  I’d say restraint is the best practice in using CGI overall.  As Dr. Ian Malcolm (Jeff Goldblum) says about technology run amok in Jurassic Park, “you were so preoccupied with whether or not they could that you never stopped to think if they should,” and the same truth can apply to how CGI has been used in Hollywood.  It solves some problems, but it can also reduce the effectiveness of a story if mishandled.  We’ve seen a lot of mediocre movies come out with wall to wall CGI effects recently, and much of the wonder that the technology once had has unfortunately worn off.  Hopefully, good judgement on the filmmakers part will help to make visual effects an effective tool in the films of the future.  The best illusions are always from those magicians who have something you never expected or seen up their sleeve.

Filming the Insanity – When Unusual Directors Make the Strangest of Movies

the room tommy

Films roughly fall into four separate categories.  There are the ones that we love, the ones that leave us conflicted or indifferent, and the ones that just plain suck.  Ninety percent of all movies fall into these three categories, but there's that last ten percent that makes up a whole different class.  These films defy all categorization and mainly exist to leave us wondering how in the world they were made at all.  And this is by far the most interesting class of film out there.  We all know what they are: movies so bizarre that they defy all logical explanation.  You would be led to believe that movies such as these are usually terrible, and some of them certainly are, but their awfulness is also what makes them entertaining to audiences.  Sometimes, this fourth class of cinema can even carry more of a devoted following than something that was competently made.  How and why this happens is almost impossible to predict.  Sometimes it's the discovery of the unusual that drives their popularity and helps to carry their legacy long after their initial release.  Most cult movies start out as low budget failures and end up finding their audiences through word of mouth, and usually the odder the movie, the more appeal it will have with audiences who are attracted to that sort of entertainment.  Such is the case with something like The Rocky Horror Picture Show (1975), which opened small, but has continually been running now for 40 years as a staple of midnight screenings across the world.  But Rocky Horror is just one of many movies that have left their mark on cinema not by being the best made or the most moving, but by being the most inexplicable.  And if there is anything that most of these movies have in common, it's that they came from the minds of ambitious yet eccentric people.

Probably the most notorious creation in recent years from a truly unusual mind would be 2003's The Room, one of the most bizarre movies ever made.  The Room is a movie that really defies all reasonable explanation.  On the surface, it doesn't seem like much; just a small budget drama about selfish people falling in and out of relationships against the backdrop of San Francisco.  But what makes the movie notorious is the fact that every filmmaking choice, whether it's the editing, the music, or the staging, is absolutely wrong.  Scenes exist for no purpose (like the famous flower shop scene), characters pop up out of nowhere with no context or set-up, and the dialogue is completely tone deaf ("I got the results of the test back.  I definitely have breast cancer.").  Normally you would think that a mess like this would make the film forgettable, and yet the sum of all these bizarre choices makes The Room one of the most unintentionally funny movies ever made.  It's especially entertaining to watch this movie with a full audience, just to hear people's reactions to what they're watching.  This is why The Room has now become one of the most popular cult hits of the last decade.  Anybody can make a bad movie, but it took a very special, twisted mind to make something as hilariously inept as The Room, and that special person was writer, director, and star Tommy Wiseau.  Tommy was a struggling actor with no filmmaking expertise.  Yet, he had endless ambition, and The Room was his pet project from start to finish.  I think that part of the reason The Room succeeds where other bad movies fail is that no one stood in Tommy Wiseau's way and made him second-guess himself, leading to a finished product that thinks it's a movie but really ends up becoming a fascinating enigma.

With films that come from unchecked egos like Tommy Wiseau's, we get a fairly good insight into the mind of a director and how they view the world and tell its story.  Filmmaking is a trade based mostly around compromise, but the more you strip away outside influences, like studio heads and focus groups, the more you begin to see stories told on a personal level, especially when the director has full control.  There are many different kinds of directors, but the most celebrated are the ones who have complete control over their voice and their style.  Usually, these types of directors write their own scripts and do their own editing, and in some more extreme cases (like Steven Soderbergh or Robert Rodriguez) do their own camera work and scoring.  For many of these celebrated directors, it usually takes many years to refine their craft, whether by apprenticing on film sets or attending film school.  But, with the democratization of media and the wider availability of filmmaking tools, we are seeing more people today taking it upon themselves to create their own movies, whether they have the know-how to do so or not.  Sometimes you have people with natural talent as filmmakers, and then you have people like Tommy Wiseau, who are ambitious amateurs in the most extreme sense.  The charm of Wiseau's folly is the fact that he put so much money and effort into the movie, spending his own fortune buying camera equipment and renting studio space, without ever trying to learn the language of film.  People might see that as lazy, but there is something endearing about Wiseau's desire to create, even if he's not the most qualified to do so.

Tommy Wiseau isn’t alone in this field, because he does come from a long line of failed filmmakers with vision but no skill.  The B-movie craze of the 50’s and 60’s was an especially strong time for amateur filmmakers.  Many would prove to be forgettable, but some had such unusual visions, that even their failures have withstood the test of time.  One such director was Edward D. Wood Jr., who’s responsible for making what many consider to be among the worst movies ever made.  His notorious filmmography includes bizarre movies like the cross-dressing comedy Glen or Glenda (1953), the schlocky Bride of the Monster (1955), or his magnum opus Plan 9 From Outer Space (1959), each one crazier than the last.  Critics of course decried the amateurishness of Wood’s films, but audiences who rediscovered the movies years later found his flicks to be peculiar little oddities that carried their own charm.  They are horrible, of course, but almost to the level that they stand out from all other movies of their kind.  Like Tommy Wiseau, Ed Wood was driven by the sensation of creating a movie, and his ambition was unchecked by the limitations of his skills as a filmmaker.  In a sense, the reason why we enjoy Ed Wood movies is because there’s no pretentiousness to them.  They are what they are and in the end we only see the zaniness of Wood’s vision.  Director Tim Burton perfectly captures this creative drive from Ed Wood in his 1994 biographical film about the director.  That movie helped to craft the image of Ed Wood as a hero for any aspiring filmmaker hoping to fulfill their dreams, while also warning them to refine their skills and avoid Wood’s mistakes.  But, at the same time, you also needed a filmmaker with a twisted mind like Burton to tell the story of someone like Ed Wood and stay true to his spirit.

A directors’ vision ultimately determines whether or not a film can connect with it’s audience.  The reason why Wiseau and Wood are celebrated even despite their faults is because their movies are unforgettable.  Yes, part of why they leave an impression is because we marvel at just how bad they are, but even still that’s something that can ultimately keep a movie alive for generations.  The movies that usually fail in the long run are the ones that are bad and forgettable.  Even with all the cinematic tools at their disposal, Hollywood filmmakers can still create some truly horrible films that burn out just as quickly as they are made.  Case in point, Roland Emmerich; a man who’s become synonymous with big budget disaster flicks like The Day After Tomorrow (2004) and 2012 (2009).  His movies are almost always mired in ham-fisted sermonizing about politics and the environment along with high-quality visual effects that ultimately all just looks the same in the end.  They lose their punch quickly and end up being more laboured than fun.  Contrast this with the movie Birdemic: Shock and Terror (2010), directed by former tech salesman James Nguyen, which has the same sermonizing but with far more amateurish visual effects and storytelling.  Emmerich’s films fall by the wayside while Birdemic has since achieved cult status.  But, why is that?  It seems to be the fact that Emmerich’s movies try too hard to connect with it’s audience, while Birdemic just doesn’t even try.  It’s the movie that James Nguyen wanted to make, regardless if he was qualified to do so.  Like Wood and Wiseau, Nguyen’s vision is on display and it’s charmingly naive.  And that’s what ultimately makes Emmerich’s movies fall short; the lack of charm and the insistence that it be taken seriously.

But, even though most strange movies come from amateurish filmmakers, that doesn't mean it's always the case.  Even some of cinema's greatest minds have found themselves in the middle of creating some truly bizarre movies, and with the backing of the studios as well.  Who would've thought that someone as skilled as Stanley Kubrick could have made something as out of this world and undefinable as 2001: A Space Odyssey (1968), or taken a radioactive lightning rod of a novel like A Clockwork Orange and made it into a work of art; but he did just that.  A director's vision can ultimately take on any form, even in the realm of the bizarre.  The 1970's was a particularly big period for oddball films, and many button-pushing filmmakers made a name for themselves in those years, like Ken Russell with his gonzo condemnation of religion in The Devils (1971), John Boorman with his oddball sci-fantasy Zardoz (1974), or the many art house, free-form pictures made by Andy Warhol.  But interesting cinematic experiments can also come from great artists who cross over into insanity while creating a high-profile movie.  Such a thing happened to Francis Ford Coppola, who had to be dragged away from the set of Apocalypse Now (1979) because he wouldn't stop filming, desperately trying to make sense of a project that was driving him crazy.  Martin Scorsese likewise made Taxi Driver (1976) in a haze of delirium, fueled by his then-addiction to cocaine.  Luckily for both men, those crazed states helped them create long-lasting works of cinematic art, but it was not without consequences.  Hollywood let these big projects go too far too many times and ultimately had to pull the plug when directors' egos got out of hand, which is what happened to Michael Cimino and his mess of a movie known as Heaven's Gate (1980).

Whether strange and twisted movies happen by mistake or by design, it is ultimately up to the audience to decide.  Usually it isn't just the movie itself that builds the legend around it, but rather the story behind its making.  We are fascinated just as much by the creators as we are by the creations, and by watching the movies themselves, we get to see just a little bit of the madness that drives them.  That madness is what we ultimately find fascinating.  Tim Burton's Ed Wood found in its subject the story of a dreamer who wanted to tell stories in unusual ways, and that helps us modern viewers see movies like Plan 9 in a whole new way.  The same holds true for The Room, which is gearing up to have its creation chronicled in an upcoming comedy starring James Franco as Tommy Wiseau.  That's a true indication of how fascinated we've become with these strange, oddball films.  Even great filmmakers with complete and competent visions can develop a cult status by taking some very unorthodox routes and become legendary as a result.  There are stories within stories in the origins of these movies, especially when you study the strange and sometimes dangerous filmmaking techniques of directors like Stanley Kubrick or Werner Herzog.

But, the thing that really sets these movies apart is that they defy explanation.  Their existence is so baffling that it makes each of them a minor miracle, which in turn leads them to be celebrated.  They are even more special if they are rediscovered many years later and completely out of the blue, like Manos: The Hands of Fate (1966) or Santa Claus Conquers the Martians (1964).  These are the outliers of cinema, and they ultimately reinforce the power that the medium can have.  By embracing the bizarre, we can see the limitless potential of cinema, and how no vision is too strange.  Even if most filmmakers' talent behind the camera exceeds that of Tommy Wiseau, he still managed to do what they have always dreamed of doing: have his movie seen by audiences around the world.  Of course, part of The Room's success is that we watch just to laugh at it, but that still makes it more entertaining than most of the junk that Hollywood usually puts out.  In the end, it helps to be the oddball in an artistic world.  That's why we hang a painting of a Campbell's soup can in the same gallery as a Rembrandt, or play a Beatles song in the Royal Albert Hall in London.  We gravitate toward what entertains us, and it's just fine if it comes from a spirited yet not completely normal creative mind.  And sometimes the crazier the mind, the more unique the end result will be.

Bigger Than Any Movie – The Making and Unmaking of Cinematic Universes

Thanos Gauntlet

As discussed in my review of Avengers: Age of Ultron, Marvel has enjoyed unprecedented success with their cinematic universe.  However, it took many years and a lot of conviction to make it happen.  Up until the start of Marvel's master plan, there were no connected universes when it came to superhero movies.  Every comic book adaptation stayed within its own cinematic world, with the main hero being the sole focus.  But that all changed with the post-credits scene at the conclusion of Iron Man (2008).  Those who waited patiently through the end credits of the film were treated to a groundbreaking moment, when Tony Stark (Robert Downey Jr.) walked into his palatial mansion and found Nick Fury (Samuel L. Jackson) waiting in his living room.  This was a surprise to audiences, because not only did it introduce another major character from the Marvel comics, but it also established the idea that Iron Man's story was going to continue not only in his own movies, but in something much bigger.  As Nick Fury puts it in the movie, "you've become a part of a bigger universe, you just don't know it yet."  And with that promise, Marvel did expand on that bigger universe, establishing even more new characters and having every storyline come to a head in the monumental The Avengers (2012).  The momentum continued in each superhero story thereafter, creating a cinematic universe that has become the envy of all of Hollywood.  Because of Marvel's success, many other studios are now trying their hand at building cinematic universes of their own, making it the newest trend in the filmmaking industry.  Unfortunately, not all the best-laid plans of these other studios have worked as well as Marvel's.

What Marvel has at it’s disposal are decades worth of story-lines on which to draw from within the comics themselves.  Thankfully most of them have shown how to cross over characters many times, giving the current film adaptations a workable blueprint to adapt from.  But, even still, part of Marvel Studio’s success has been in the casting of the right actors, and keeping them committed to the process over multiple movies.  This is especially difficult for some of the side characters in these Marvel movies, like Hawkeye (Jeremy Renner) and Black Widow (Scarlett Johansson), both of whom have not headlined their own movie, and yet have big parts to play in the larger cinematic universe.  Getting commitments from these actors and having them see the larger picture is important to the process and so far, Marvel has kept it’s team mostly intact (only The Hulk and War Machine needed to be recast, and Marvel handled those transitions splendidly).  Being able to know the ultimate destination is also important.  For Marvel, the plan has always been to lead up to the Infinity War, which is currently where Phase 3 of their cinematic universe is meant to conclude.  The Infinity War story-line involves supervillain Thanos (voiced by Josh Brolin) collecting the all powerful Infinity Stones and placing them on his gauntlet, which ultimately grants him God-like powers and makes him a threat to the whole of Marvel’s cinematic universe.  So far, this has been built up in the movies by establishing each Infinity Stone within the story-lines of select Marvel films, showing that the studio is clearly keeping the endgame in focus and using the stones as a way to tie everything together.

But, commitment has it’s own risks too.  For one thing, with all this build up, Marvel’s two-part Infinity War had better live up to the hype.  Otherwise, all this build-up would seem pointless in the end.  As grandiose as it might be, Marvel could easily fall under the weight of it’s own mythology, and alienate much of the audience by choosing to stick to the larger plan in detriment to the entertainment value of the whole enterprise.  A little bit of that hampered Age of Ultron, but the movie itself managed to survive thanks to clever writing and charismatic performances, which helped guide audiences through the convoluted plotting.  But, in less capable hands, the overwhelming weight of the cinematic universe could prove overwhelming.  Not only must each Avengers movie carry it’s own story, but all the continuing narratives of each individual character as well.  And you have to fit that all in a tight 2 1/2 hour package.  In that sense, it’s amazing that Age of Ultron didn’t turn into an incomprehensible mess as a result.  It’s not as neatly plotted as the first Avengers, but it still got the job done and was entertaining in the end.  In fact, it’s amazing how long Marvel has kept this train going without loosing momentum.  I think that the big reason for this is because Marvel puts just as much emphasis on the individual story-lines as it does on the big picture.  The standalone movies are just as good as the crossovers, and maybe even better.  You could even do a standalone universe with just the Guardians of the Galaxy  (2014) setting alone.  And that really has been the key to Marvel’s success.  Every team member has their own story to tell, and Marvel is committed to telling them all.

Though Marvel has established new ground with their Cinematic Universe, it's not exactly the first time that Hollywood has tried something like this.  In fact, crossovers have been common in filmmaking for many decades, albeit on a much smaller scale.  Beginning with the matinee serials of Hollywood's early days, which were themselves inspired by the comics of the day, the idea of having some of literature's and cinema's most famous characters interact has always been an appealing concept.  Sometimes crossovers could happen in the unlikeliest places.  Weirdly enough, Abbott and Costello Meet Frankenstein (1948) could be seen as an early precursor to modern crossover movies, because despite being a screwball comedy, it featured horror icons Lon Chaney Jr. (The Wolf Man) and Bela Lugosi (Dracula) playing their individual characters once again, knowingly referencing their past films.  Crossovers were also a popular concept on television for many years (remember when The Jetsons crossed paths with The Flintstones?).  But it wouldn't be until the rise of New Hollywood in the sixties and seventies that we saw the idea of telling stories set in an interconnected universe emerge.  Planet of the Apes discovered this idea almost by accident, as parent studio Fox tried to stretch the original premise of the franchise in order to get more sequels made.  With the movie Escape from the Planet of the Apes (1971), the story reveals that the apes Cornelius and Zira (Roddy McDowall and Kim Hunter) escaped their planet before its destruction in the previous film and landed on present-day Earth.  There, Zira gives birth to a son named Caesar (also McDowall), who leads the ape rebellion and creates the titular Planet of the Apes.  By stretching this premise, Fox managed to change the franchise and showed that you could break from the main storyline and film any plot you wanted in this established cinematic universe.  This concept has continued in the current Planet of the Apes movies, which are far removed from the films that started the franchise.

But, if there was a franchise that really began to define the idea of larger cinematic universes, both literally and figuratively, it would be Star Wars.  When Star Wars was released in 1977, it was a phenomenon unlike anything Hollywood had ever seen.  But it was just the beginning of the story that creator George Lucas wanted to tell.  The story continued with two equally popular sequels, The Empire Strikes Back (1980) and Return of the Jedi (1983), but even with the trilogy complete, it still didn't scratch the surface of the story that Lucas had envisioned.  The reason Star Wars has become such a massive universe of its own is that Lucas plotted out the whole mythology before ever writing his first script.  His back-stories for the characters and the worlds of Star Wars are incredibly detailed (almost Tolkien-like in their intricacy) and could fill the narratives of their own movies, which is in fact what he eventually did: he took the notes that he made for the original trilogy and crafted what would become the prequel trilogy.  Unfortunately, his grand vision far exceeded his talents as a writer and director, and the prequels didn't succeed nearly as well as the original trilogy.  But the vision of this galaxy far, far away still has inspired fans across the world, and Star Wars lives on even beyond what George Lucas planned for it.  With the acquisition by Disney a couple years ago, Star Wars is now entering a new phase where it will actually be able to live up to the promise that the original trilogy held for audiences.  The main storyline will continue this winter with the release of Episode VII: The Force Awakens, but standalone movies are also planned, which will give audiences the chance to see more of what the Star Wars cinematic universe can be, untethered from what George Lucas had imagined.  What Lucas created was other-worldly, but the potential of a Star Wars cinematic universe that could be set anywhere and be about anyone is boundlessly exciting.

For the most part, building cinematic universes can be a costly but rewarding enterprise for most filmmakers.  That's why so many studios are trying to follow Marvel's lead and do the same with some of their prized characters and franchises.  But, while some seem like natural bases for larger cinematic universes, others are a bit more puzzling.  For one thing, does anyone think that a cinematic universe could work for Ghostbusters?  Sony Pictures plans to take their Ghostbusters brand and build it into an interconnected universe inhabited by different teams of Ghostbuster troops.  This started with the announcement of an all-female Ghostbusters remake last year, which drew an unhappy reaction from the fan base, who felt it was changing too much of their beloved franchise in order to appeal to a whole different demographic.  To appease those fears, Sony revealed that this planned movie was not a reboot or a remake, but rather one entry in a new cinematic universe they were planning, which itself became a controversial position.  It's not a question of whether it should be done, but rather whether it can be done.  Sure, you could take the Ghostbusters brand and build a whole mythology and universe around it, but is it something that Sony has enough ideas for?  Sony is going to have to build it all from scratch, which could prove challenging.  Marvel rival DC Comics has the benefit of having all their comics to draw from as they begin their own push for a cinematic universe.  Unfortunately, they do this in the shadow of what Marvel has accomplished, and the danger for them is that their plans are all based around playing catch-up, which could hurt the momentum of what they're ultimately building toward.  Building on a shaky foundation could also hurt DC.  Man of Steel was not a widely beloved film (though I didn't mind it as much), and that negativity could cloud the rest of the universe as a whole.  Time will tell if DC can compete with Marvel's cinematic universe.  My fears are less with the quality of the films (which to me look just fine) and more with whether or not DC has the right master plan in place.

DC’s Cinematic Universe could certainly learn a thing or two from the disastrous attempt at a Spiderman universe from Sony.  When Disney acquired Marvel, they made an effort to gather all the properties and characters they could in order to make the plan for a cinematic universe work.  Unfortunately, for years Marvel had signed licenses over to other studios, all of whom were content to keep making their own features outside of Disney and Marvel’s control.  Sony held onto the rights for one of Marvel’s biggest names, Spider-Man, and defiantly refused to let him play a part in Marvel’s planned Cinematic Universe.  This became an issue once they made the decision to abandon the story-line of the original, Sam Raimi directed Spider-Man’s and instead reboot the series as a whole from scratch with a new cast and new story-line, just as an excuse to keep the character away from Disney.  After seeing the success that Marvel Studios had with their Cinematic Universe, Sony thought they could do the same with Spider-Man, and plans were set out not just for more sequels to their re-titled The Amazing Spiderman series, but also spin-offs planned around the popular Venom character and a team-up film based around Spider-Man villains called Sinister Six.  However, The Amazing Spiderman (2012) opened to modest box-office and mixed reviews.  The even more ambitious Amazing Spiderman 2 (2014) fared even worse.  The big problem with the planned Spiderman universe was that it tried too hard to match Marvel.  There was more thought put into planting the seeds of a larger universe than actually crafting a compelling story, with way too many characters introduced that have no impact in the film, but were meant for bigger things to come.  That’s ultimately what sunk the Spiderman universe at Sony and now the character has finally returned back to Marvel, where he will again be rebooted, only now connected to the Cinematic Universe.  What Sony’s failed experiment proved was that you shouldn’t craft a cinematic universe just for the sake of having it.  Sony served up an under-prepared buffet platter, while Marvel has given us a three course meal with everything cooked to perfection.

Planning out a cinematic universe is the key in the end.  Marvel knew that from the first meeting between Iron Man and Nick Fury, and we are now seeing that potential realized to incredible lengths with each progression in the larger narrative.  But what makes Marvel's Cinematic Universe stand out so well is not the high points where the different characters team up in The Avengers.  It's the individual stories in each phase that really make the Cinematic Universe so special.  Audiences are enjoying the progression of Iron Man, Captain America, and Thor's storylines just as much as they do the Avengers narratives; perhaps even more so now.  That's where Marvel's success has come from, and where other like-minded cinematic universes have failed.  It's not where the universe is going that audiences find so intriguing; it's where it is now, and how these moments all tie together.  Yeah, of course audiences get excited when they catch a brief glimpse of Thanos on his throne near a film's end, or a passing mention of a future member of the team (like the brief mention of Doctor Strange in Captain America: The Winter Soldier), but these are only treats presented to us after watching a satisfying, standalone story.  That's often why these teases appear at the end of the movie, and not within the plots themselves; so that each movie can stand on its own.  That's why Sony failed with Spider-Man: you can't fill your entire movie with teases for the future and characters with no purpose (like the horrible shoehorning of The Rhino).  My hope is that the potential we're seeing in the emerging cinematic universes from other studios pays off, and that they understand the key factors of how to build these universes the right way.  Because, when you've established a cinematic universe that lives up to its potential, then there's no limit to the stories that can be told.

The Hills Are Alive – The Sound of Music at 50 and Movie Musicals Today

Sound of Music

Tastes in movies and music can often interconnect, but at other times they very much diverge.  For many people, like myself, a love of music can even stem from a love of movies.  And though there are many films that put the music front and center in a musical format, most of my favorite pieces of music actually originate from non-musical films, as evidenced in my recent top ten list.  But there are some commendable movie musicals out there as well, and one that particularly stands out in my mind is the 1965 Best Picture winner, The Sound of Music.  Though it originated on the stage, I'm sure that for most people the first thing they think of when they hear that title will be this film, and the image above is probably what pops into their minds immediately.  When it was first released in theaters, it became an instant phenomenon at the box office, and it is still one of the highest-grossing movies of all time when adjusted for inflation.  It helped to save the troubled 20th Century Fox studio after the financial ruin brought on by the Cleopatra (1963) production, and it has gone on to perform well for many decades thereafter.  Now in 2015, it has hit a major milestone by celebrating its 50th anniversary, and once again the movie has been given new attention, highlighting it for a new generation of filmgoers.  And all this lavish attention is justly deserved.  Though there could be an argument made for the brilliance of 1952's Singin' in the Rain or 1961's West Side Story (both great on their own), in my opinion The Sound of Music is the greatest movie musical of all time.  No other musical uses the film medium to greater advantage, and it also highlights what's wrong with most movie musicals made today.

What makes The Sound of Music stand out so much from other musicals is its grandiosity.  When director Robert Wise set out to adapt this story from the stage to the screen, he made sure to remove all connections to the theater and bring the production outdoors.  This was an unusual move at the time, because most productions of musicals stayed indoors within the studio soundstages, where all the elements such as lighting could be tightly controlled.  In fact, some of the most famous musicals of the time, like 1964's My Fair Lady and Mary Poppins, were filmed entirely indoors on soundstages in Hollywood, despite their turn-of-the-century English settings.  Wise, however, chose instead to film The Sound of Music on location in Salzburg, Austria; the authentic setting of the real-life story of the Von Trapp family.  By doing so, he made this movie look and feel bigger than any other musical adaptation up to that point.  The movie has a free and open feel to it, and any notion that this story originated on the stage is quickly forgotten.  Indeed, this was a story that needed the epic treatment.  It just makes sense for actress Julie Andrews to be out in the backdrop of the majestic Alps when she sings the title song and declares that "the hills are alive with the sound of music."  Robert Wise probably saw the value in shooting on location when he shot the opening number for West Side Story on the streets of New York City.  Though it was just for that opening sequence, I'm sure that Wise realized then that to make a movie musical stand out, you need to shoot it like an epic and less like a stage production, a lesson put into brilliant practice in The Sound of Music.  Because of this, Music is both groundbreaking and entertaining, and it is a benchmark in the whole history of movie musicals.

Musicals have been a part of cinema ever since the introduction of sound to the medium.  In fact, it could be said that the very first "talkie," 1927's The Jazz Singer, is a musical, considering all the sound portions are the musical numbers sung by star Al Jolson.  When talking pictures became the norm, musicals were often the most popular genre with audiences.  No other genre showed off the new technology better, so it was only natural for the studios to exploit it as much as possible.  The musical was such a popular medium at the time that even the Oscars took notice, naming 1929's The Broadway Melody their second-ever choice for Best Picture.  During the Depression years, musicals became an escape for a disenfranchised populace, with stars like Fred Astaire, Ginger Rogers, and Shirley Temple as the highlights of the period.  The war years saw a downturn for the movie musical, as the medium became more of a propaganda tool, with movies like Yankee Doodle Dandy (1942).  The golden age of the epics in the 50's and 60's helped to lay the groundwork for the grand reemergence of the movie musical, and that era reached its zenith with The Sound of Music, though many other widescreen productions like Oklahoma! (1955), The King and I (1956) and of course My Fair Lady were also standouts.  That era, however, came to an end after high-profile flops like Doctor Dolittle (1967) and Hello, Dolly! (1969) crashed hard due to changing tastes in the market.  The 70's brought us more revisionist takes on the musical format with movies like Cabaret (1972) and Grease (1978).  And then came a twenty-year period in the 80's and 90's when the movie musical all but disappeared, relegated mostly to animated films.

It wasn’t until 2001’s Moulin Rouge, directed by Baz Luhrrman, that the movie musical came back in a big way.  Now, it’s not only common to see musicals on the big screen today, but most of them actually are profitable.  The downside of this however is that even though the genre has seen a resurgence, most of the newer adaptations are not quite up to the standards of their predecessors. There have been a couple standouts, but their success usually is bookended by a lot of copycats and wannabes. Case in point, the success of 2002’s Chicago.  The movie adaptation of the long running Broadway musical cemented the return of the musical genre to the big screen and became the first musical since 1968’s Oliver to win the Oscar for Best Picture.  But, the success of that movie led to the start of many other likeminded productions that aspire to be like Chicago, but fall well short.  This is most evident in splashy productions of Broadway musicals that try to recapture Chicago’s disjointed and gritty atmosphere in contrast to what the musical actually requires in order to shine on the screen and fails; seen clearly in awful adaptations like 2005’s Rent or 2009’s Nine. Although it is nice to see the technique of location shooting take hold in the musical genre since The Sound of Music, it has not matched the grandiosity and visual flair that that classic managed to capture.  Stylistically, something has been lost over the years, and the foundation we have right now is built less around the wow factor that the big screen can give and more around how well the movie plays on the TV screen in the confines of home entertainment, of which Chicago managed to fulfill well enough.

Though some musicals do alright on a smaller scale, I do think that something is lost in this new trend when translating a musical to the cinema.  In particular, I think that some of the epic grandeur has faded over the years, and that's especially evident in musical adaptations that call for epic visuals.  Take, for example, the big screen adaptation of the hit Broadway show Les Miserables (2012).  Adapted from the Victor Hugo novel, Les Miz (as it is most often called) is widely considered to be the grandest and most epic musical ever put on the stage, and it has been one of the most popular stage musicals since its 1987 premiere.  Given that reputation, you would expect this musical to be given the lavish Sound of Music treatment, shot on location in France with grand, sweeping widescreen visuals.  But when Universal Studios put the movie into production, they chose to give it to director Tom Hooper, a man who is capable at directing period films (like his Oscar-winning The King's Speech), but on a much smaller scale.  This unfortunately led to the exact opposite of the approach the musical should have received.  Instead of using epic-scale shots in eye-catching locations, Hooper shot the film mostly in tight and constrained close-ups of the actors, without drawing attention to the period details that are important to the story.  It in turn minimizes a story that should otherwise have been grand in scale.  While not entirely a disaster, I do see Les Miz as a missed opportunity, one where the visual presentation is a letdown and where it was the director who was ultimately miscast.  It makes me wonder what would have happened if the production had been given over to a more visual director on the level of, say, Ridley Scott.  At least he would've gotten a more interesting performance out of Russell Crowe.

But aside from diminishing returns in the visual department, there is also a change in how movie musicals are staged, one that has unfortunately distanced the genre from some of the cinematic magic of the Sound of Music days.  In particular, the influence of MTV and its music videos has had a negative impact on the genre.  While most musical numbers flowed naturally as part of the storyline in the past, today those same numbers contrast sharply with the rest of the film because they are staged and edited in music video fashion.  This may be a result of how filmmakers have been shaped by the music video era; indeed, many filmmakers today got their start directing music videos.  But most of them should understand that what works in a seven-minute video format won't translate as well into a two-hour narrative.  This is most jarring in what has become known as the "jukebox" musical, where pop songs are forced into a narrative in place of original content.  When pop songs are combined with music video filmmaking, you get movie musicals that don't stand well enough on their own as narratives, and more or less just become pretentious exercises in editing to music, as with 2007's Across the Universe or 2008's Mamma Mia!.  But what can be even worse is when a production takes an already established musical and completely changes its purpose and meaning, cherry-picking only the songs it wants in order to appeal to what it thinks modern audiences like.  This happened last year with the disastrous remake of Annie, where most of the songs and the original book were jettisoned in favor of a "modern" rewrite that reuses only the popular songs without their context.  Essentially, they took an already well-established musical and turned it into a "jukebox" musical for their own underwhelming narrative.  This is a particularly negative aspect of this new, music video-infused era of movie musicals.

That's not to say that the genre is devoid of any good examples from recent years.  Sometimes it all comes down to having the right team and vision in place.  I for one saw the 2007 adaptation of Sweeney Todd as a great success.  For a film adaptation of a musical about the Demon Barber of Fleet Street, you needed a filmmaker who could capture the Gothic nature of the story while still maintaining the musical's macabre sense of humor, and no one was better suited for the job than Tim Burton.  Burton not only gave the film the Gothic look that it needed, but he also did a good job of restraining himself in the production.  It doesn't go too over the top, but still feels cinematic enough to lift the material and make it work on the big screen.  Another great film adaptation of a Broadway musical in recent years was 2006's Dreamgirls.  While that musical is very pop music infused, it's meant to be that way by design, chronicling the rise of Motown-style music in American culture during the 60's and 70's.  In adapting the musical for the big screen, director Bill Condon took exactly the right approach, shooting it the same way you would a biopic; an approach that complements the story and the music perfectly and never feels unnatural.  In addition, both of these musicals benefited from casting actors who could actually hold a tune, with Johnny Depp and Helena Bonham Carter doing justice to Sondheim's complicated melodies, and Beyoncé and Jamie Foxx bringing a lot of Motown soul to their respective songs.  A well-matched vision and a capable cast make all the difference in the end.  Other times you'll end up with movies like The Phantom of the Opera (2004), which has the right director but the wrong cast, or Into the Woods (2014), with the right cast but the wrong director.  Or Les Miserables, where everything is wrong.

Despite all the problems that have plagued movie musicals in recent years, they have thankfully not diminished the power that The Sound of Music still holds.  Amazingly, 50 years later the movie remains timeless.  Julie Andrews' singing voice is still out of this world, and her performance is perfectly balanced by Christopher Plummer's exceptionally grounded work as Captain Von Trapp.  But the real star is Robert Wise's direction, which takes most of the production out into the real world and shows off the stunning Salzburg locations in all their widescreen glory.  I may not be a musical fan, but I am a fan of epic movies, and The Sound of Music fits the definition of the word "epic" in every single frame.  My hope is that this movie continues to remain influential in the musical genre.  For one thing, I'd like to see a return to this kind of epic filmmaking in musicals, and a departure from the MTV influence that dominates today.  The phrase "they don't make them like they used to" could easily apply to the musicals of The Sound of Music's era, and I think it's about time the movie musical got a refresher.  A lot can be improved upon, but when the musical genre works on the big screen, it can become the highest form of cinematic art, and The Sound of Music will always stand as one of its absolute masterpieces.

The Bigger Screen – Game of Thrones in IMAX and Cinematic TV Coming of Age

game of thrones tyrion

Ever since its beginnings in the early 50's, television has been locked in battle with cinema for supremacy in the viewer market.  Movie attendance dropped significantly once the first TV screens made their way into living rooms across the world, but that only inspired Hollywood to invest in new technologies that pushed the medium into new and exciting territory, like the introduction of widescreen and surround sound, and in turn audiences came back in large numbers.  Over the course of this new cinematic revolution, TV more or less became standardized, and in some ways diminished, because it was still restricted by the primitive technologies that made television broadcasts possible.  There was just no way to compare the experience of seeing Gilligan's Island on the small screen with seeing Ben-Hur (1959) on the giant wide screen.  Cinema was the mature art form, and television was just light entertainment.  But with recent advances in digital photography and high definition home presentation, television has now reached a maturity point where it can once again compete with the cinematic experience.  Noticeably in the last decade or so, we've seen television redefine the rules of storytelling and deliver some of the most groundbreaking and buzzworthy narratives ever seen in the medium.  Sitcoms have dropped the obviously fake canned laughter and instead gotten their laughs through the single camera format.  Hour-long dramas have likewise moved outside the soundstages and taken us on journeys to the far reaches of the world, and even beyond.  And now it's common to see a network show with a substantial CGI effects budget.  All of this, combined with some of the recent implosions in tent pole filmmaking, has led to an era where the best minds and talent in the industry are looking to television as their desired destination.  And no TV series right now has challenged the cinematic experience more than HBO's megahit, Game of Thrones.

Adapted from the novels by George R. R. Martin, and produced by creators David Benioff and D.B. Weiss, Game of Thrones is perhaps the most cinematic TV series made to date.  It has built a following of fans around the world, including yours truly, and has generated an enormous amount of revenue for the cable network.  And now in its fifth year, the show has taken the unprecedented step of promoting itself on the largest screen possible, courtesy of the IMAX format.  Airing a television series on a movie screen is not unheard of at this point.  Fathom Events has done the same thing for several years now, playing classic episodes from popular TV series in order to mark an anniversary or special date.  There was also a recent special simulcast of the 50th anniversary event for the BBC series Doctor Who.  But HBO's choice to put Game of Thrones in IMAX theaters is a major step towards showing how far television has come, and it makes a statement about the regard the network has for the show.  By doing this, HBO is effectively saying that their show can compete with the big dogs on the largest format possible, and may even be more worthy of it than some of the films that have been given the IMAX treatment.  In that statement, we see a true testament to the maturity that television has reached, showing that it is no longer just a simple form of entertainment, but a place where real art can be created.

But the real question is, did the presentation really do what it set out to accomplish?  I was fortunate enough to take in the show at my local IMAX theater during its one week only engagement, which I saw as worth the ticket price even though I had already watched the show itself when it first aired.  The showing was made up of the final two episodes of Season 4 (the most recently aired), "The Watchers on the Wall" and the season finale, "The Children," and it included a teaser trailer at the end for the upcoming Season 5.  Both are excellent episodes, and each features some of my favorite moments from last season.  But what struck me most about the IMAX presentation is how it benefited one more than the other.  While "The Children" has its epic moments, it was obviously the more subdued of the two, relying on quieter character moments over spectacle.  "The Watchers on the Wall," however, was a bit of a revelation on the IMAX screen.  When I first saw this episode on TV, I was a tiny bit underwhelmed by it.  It never felt big enough to match the moment it was depicting.  Now I understand why; my 42-inch TV screen wasn't large enough to convey the moment effectively.  Projected on a nearly 100 foot IMAX screen, the episode really came to life, and it gave me a much better appreciation for it.  I immediately felt the difference at that first wide angle shot over the show's monumental Wall, as well as the first time you see a giant riding a wooly mammoth.  It was clear from that point exactly why HBO made the choice to bring Game of Thrones to IMAX, a first for any TV series: no other TV series could do the format justice.

Overall, the screening was an absolute success, and my hope is that it's just the beginning.  While I'm still a believer in the special experience of watching a movie in a theater, I am also aware that some of the best filmmaking happening right now is on TV.  And Game of Thrones is just one of the many examples of TV shows that have raised the bar in recent years.  Indeed, I believe that this is just the beginning for TV screenings in IMAX theaters.  While I doubt you'll see the likes of Mad Men in IMAX, I do see other epic scale productions like AMC's The Walking Dead or the BBC's Doctor Who making their way to larger formats.  But it takes a certain kind of show to make that transition.  Most TV series of the past are unfortunately restrained by the limitations they had to work with.  Even epic productions like Star Trek are still bound by their broadcast-standard look.  Really, only TV series crafted under the more cinematic standards of recent years could hold up on an IMAX screen, and even then, it takes the right kind of vision behind it.  TV series that can hold up visually against their big screen counterparts are a very recent phenomenon.  HBO led the way for that transition with visually spectacular shows like The Sopranos, Deadwood and Rome, and even those had to mature their looks over time.  The TV shows of today are built on the shoulders of those groundbreakers, and they continue to refine the look of modern TV, even pushing production quality into unexpected areas like video streaming.  And with tent pole productions becoming increasingly less reliable as investments, it seems logical that theater chains would look to popular TV shows as a way to draw in the crowds.

For many years, it would have been seen as impossible for a TV show to match up against a big screen movie.  But the tide has changed considerably as TV shows have become the safe haven for creative freedom and experimentation, while movie studios have largely tried to play it safe.  Up until this time, the separation was very different.  TV shows just didn't have the budgets to compete.  They were filmed on cheaper film stock, and the idea of preserving them for posterity was seen as laughable.  Even TV shows that gained enough notoriety to spawn a movie adaptation were cast aside into a niche category.  For many years, if there was a movie adaptation of a TV series, it was usually a comedy meant to mock the outdated conventions of the original show, like The Brady Bunch Movie (1995) or Starsky and Hutch (2004).  The only films in that time that made the trip to the big screen with any sliver of dignity and faithfulness to their original form were the films in the Star Trek franchise (some better than others) and 1993's The Fugitive, which was actually honored with a Best Picture nomination.  Nowadays, more and more TV shows are given respectful big screen translations, sometimes even shepherded there by their original creators, like 1998's The X-Files or 2005's Serenity.  Television has also benefited from translations in the other direction, from film to TV, showing that some stories can even prosper in a longer format.  Big screen classics like Fargo and From Dusk Till Dawn, and even characters like Hannibal Lecter, have made it successfully to the small screen and shown that the medium can support stories and characters that have already proven themselves on the big screen.  All this shows the increasingly blurred line that separates film quality from TV quality.

Another sign of this change is the fact that more and more talented people are choosing to steer their careers into television.  Before, television work was looked down upon by A-listers in the industry.  In the early days, you moved up from television work into movies and never looked back, unless your career was in a downturn and it was the last option left to you.  And indeed, the early years of television production were a great incubator of the great filmmakers of tomorrow.  Directors like Stanley Kramer, Arthur Penn and Steven Spielberg all got their start working on TV shows, as did many future groundbreaking writers like Paddy Chayefsky or Charlie Kaufman.  Now, while many people who start in television still move on to movies today, there is also a growing trend of filmmakers going from the big screen to the little screen in order to satisfy their creative tastes.  Case in point: David Fincher's recent forays into TV production.  Thanks to his clout as a director, he managed to get the hit series House of Cards onto the small screen, with award-winning stars like Kevin Spacey and Robin Wright on board as well.  Actors are also going back and forth between film and television, with none of the disdain they would have shown for the medium in the past.  Game of Thrones in particular has an especially large number of cast members who balance their film careers with their work on the show.  The stigma that television once carried is gone for most people in the industry, and some will even look to it as a more desirable avenue to pursue than a film career.  Imagine if the same had been true in the early years of television.  Can you imagine seeing the "Duke," John Wayne, headlining a weekly drama, or Jerry Lewis producing a sitcom?  This is certainly one of the biggest signs of television's maturity as an entertainment art form.

But the one big thing that still separates cinema and television is the level of production each receives.  Even with all the ambition that TV producers put into their shows, their budgets will still fall short of the big productions, and it sometimes shows in the visual effects.  Even a show as highly budgeted as Game of Thrones has to make do with what HBO is able to allocate.  Sometimes they pull it off, while other times you can't help but feel that a moment falls short.  The battle at the Wall seen in the IMAX showing of Thrones is a perfect example of the compromises the show's producers had to deal with.  You can tell that they were trying their hardest to make it feel like a big moment, but even they had to cut corners and scale the moment down from how it originally appeared in the source novel.  But what helps the show in the end is not how sharp or big it looks, but rather how it utilizes its effects.  And honestly, the makers of Game of Thrones use their visual effects with more care and effectiveness than most blockbusters.  It gives the production the illusion of grandeur, making it feel even stronger and more epic than its budget would have you believe.  They do this by putting the emphasis on the performances and the storytelling, which in turn makes the imaginary world of Game of Thrones feel even more real to the viewer.  And given that the show presents a continuing narrative in a serial format, it actually feels all the more epic.  I was stunned by how small The Lord of the Rings trilogy felt the last time I saw it, because its 12-hour plotline now feels dwarfed by the nearly 40 hours we've spent in the world of Game of Thrones; and we're not even at the halfway point yet.  That is a testament to how well a TV series can overcome its budget limitations and even surpass its big screen competitors.

So, my hope for the future is that we see more of this mingling between television and cinema.  Yes, some of the allure of cinematic filmmaking is being lost in the process, but that's only because television has upped its game and met the challenge.  In some odd way, this is something that could save a struggling theatrical market, at least from the vantage point of theater owners.  Watching something epic in your own living room has its advantages, but there is nothing quite like watching the same show on the biggest screen possible.  And there's nothing bigger than IMAX.  Game of Thrones was the perfect test subject for this experiment, and I feel the experiment was well worth it in the end.  My hope is that HBO does the same thing next year, and that other networks with epic scale shows, like AMC, FX, and Showtime, follow suit.  Of course, it has to be the right kind of show as well; just like with the movies, you need spectacle on screen to justify the larger format.  Another good idea would be to not wait for a season's whole run to finish, and instead simulcast episodes on the big screen as they air.  The complications of that could be troublesome for both theaters and the studios, but you never know, especially if demand is high enough.  I for one welcome the competition between cinema and television.  Competition between the two mediums allows for a more diverse set of choices for the viewer, and it allows production companies and producers to take chances.  As of now, HBO and Game of Thrones are setting the standard high for Hollywood, and it's already leading the market to reevaluate how to present certain projects.  But no matter how you watch the show in the end, quality comes from a great story, and from having the gumption to make it work.  And as a result, any size screen will do.

Hot Buttons – Controversial Movies and Whether or Not to Watch Them

clockwork eye torture

For as long as there have been movies, there has been the desire to tell stories that depart from the norm and venture into sometimes dangerous new territory.  And whenever a movie intentionally means to provoke a response, that response will usually manifest itself as a backlash from those who don't like its message or content.  But whether the response a movie gets is positive or negative, the one thing that's for sure when a film courts controversy is that it gets everyone talking about it.  What is amazing today, in our social media driven culture, is that controversy can now become a viable marketing tool for a movie to run on.  Even if a movie is drawing scorn from a large number of people, the scorn still boosts the exposure of the product, especially if it becomes controversial to the point of being headline-worthy.  But the question remains whether a movie demands to be seen once it becomes a hot button issue.  Are we compelled to see what all the fuss is about, or should we ignore the hype and stop feeding the beast?  Like most things, it really comes down to the product itself, and whether it can stand on its own amid all the noise.  Still, it's interesting that controversy not only puts the spotlight on a film, but can actually help it find much more success than it would have normally.  One other interesting outcome is finding out whether the controversy was warranted in the first place.  Sometimes a movie is just ahead of its time, and past controversies can look ridiculous in hindsight.  It's a cycle played out all the time in Hollywood, and even though it happens often, we can still be surprised by the extent of a movie's impact on our larger culture, or at least by the established order it can shake up.

This is something we are currently seeing played out in our cinemas with Clint Eastwood's new film American Sniper.  In my review of this movie a couple weeks ago, I highlighted the fact that the film took a rather polarizing figure in our recent history of war and used him as a focal point for a larger examination of the life of a modern American soldier.  Understandably, basing a movie on the life of a controversial figure like Chris Kyle was going to ruffle a few feathers in both our pop cultural and political worlds, and sure enough, the last month has been a firestorm of everyone putting in their own two cents about the movie.  Interestingly enough, the critiques of the movie have run pretty much down party lines, albeit with a couple of open-minded voices breaking from the predictable opinions.  Since the movie's release, we've seen critics on one side attack the movie as right-wing propaganda that is racist toward Muslims, while others on the opposite side view it as a strong endorsement of American military might.  My own opinion is that neither extreme is true, and that the movie is an intriguing character study of a flawed but talented soldier in combat, and of how his experience is indicative of many more like him trying to come back to a normal life at home.  But what I find most interesting about the movie is how much the controversy has fueled its box office numbers.  It broke all sorts of box office records for the month of January and is continuing to dominate headlines in both the entertainment and political worlds.  In this case, I think the movie accomplished something good by being controversial, because it's gotten everyone talking about important issues like the responsibilities of war and how we treat our wounded soldiers.  And it also offers up a picture of war that can't be so easily defined by conservative or liberal talking points.

But while American Sniper is the hot button issue of the moment, it's not the first movie to stir up controversy, nor the first to benefit from the added exposure.  Cinematic history is full of movies that pushed the boundaries of taste and socially acceptable behavior.  In the early days of Hollywood, the standards for depicting violence and sex had yet to be established, so for many years filmmakers were free to push boundaries all the time.  Even studio-made movies were openly frank about violence (1931's Scarface) and sex (1933's Baby Face), but that all came to an end with the establishment of the Hays Code, a restrictive set of guidelines set up by religious figures in conjunction with studio heads in order to "clean up" the loose morals of the movies.  For the decades that followed, Hollywood would play it safe, and movies became a lot more sanitized and morally righteous.  Restrictions on depicting violence loosened during the war years, but even then, approval had to be given by committee.  In the 50's, the McCarthy hearings also made the studio system wary of political messages in its films, which led to an unprecedented era of censorship in the realm of filmmaking.  And naturally, when restrictions become too much of a burden, a backlash inevitably follows; for many years Hollywood fell behind as the necessary provocations happened outside the industry rather than within it.  It wasn't until the late 60's, after the Hays Code and the Blacklist were abolished, that we saw an era where free expression and new ideas became a big part of the cinematic experience again, at least for a time.

During the tumultuous 60's and 70's, when Vietnam and Watergate dominated the headlines, Hollywood finally embraced filmmakers who had ideas and stories that could shake up the world and even change it for the better.  One thing that came about in this time was the creation of the ratings system, administered by the MPAA (Motion Picture Association of America), a body set up for the specific purpose of determining the appropriate audience for any given film based on its content.  A sort of self-governing organization, the MPAA took the power to decide whether a movie should be seen out of the hands of the government and the studios, and instead left that determination up to the customers, informing them through the rating the film carried upon its release.  In the beginning of this new system, Hollywood didn't prejudge movies by their ratings, and instead celebrated the changing attitudes that were becoming popular at the time.  Even movies slapped with the highly restrictive X rating (which barred any audience members under the age of 17) were celebrated; director John Schlesinger's X-rated Midnight Cowboy (1969) even won the Oscar for Best Picture that year.  But even though hot button movies were embraced in this period, an inevitable backlash followed as well.  Religious groups organized to the point where they could put pressure on the studios once again, and the MPAA's ratings system became less of a suggestion for audiences and more of a standardized label that segregated movies from one another.  In the 80's and 90's, it began to matter a lot more whether you were saddled with a PG or an R rating.  And in this time, controversial movies began to stand out that much more.

But do filmmakers set out to make their movies controversial, and do they really use controversy as a way to boost their production's exposure?  For the most part, I don't think filmmakers want their movies to be seen as controversial.  Provocative, yes; but I don't think they intend to draw criticism onto themselves.  For a lot of filmmakers, the choice of project is more about the story they want to tell and their belief that people will want to hear it.  Controversy arises when the filmmaker runs into a wall of rejection, when their tastes run contrary to those of a whole segment of people, usually people with different worldviews than the filmmaker's.  What seems rational to one will seem radical to another, and the storm raised between the two is what fuels the controversy around any given movie.  It usually comes down to three areas of contention: politics, religion, and standards of violence.  Probably the most effective way to push buttons in Hollywood is to tackle religious doctrine.  The film industry is primarily secular, with a few religious camps present as a sort of niche market, but if a major filmmaker chooses to take on a religious subject head-on, it usually draws the ire of some.  I'm of the belief that religious themed movies can be worthwhile if they have something interesting to say; which unfortunately very few of them do.  But when experimental filmmakers like Martin Scorsese and Darren Aronofsky tackle biblical tales in unconventional ways, their projects suddenly become much more explosive, because of how they're breaking from an established order in Hollywood, both spiritual and secular.  Mel Gibson's The Passion of the Christ (2004) stands as one of the most controversial movies of all time, but that's less because of its religious stance and more because of how it affected different groups politically.  Even so, controversy got it exposure it definitely wasn't expecting.

Whether expected or not, once a movie becomes a topic of discussion, it drives more people to want to see it.  Sometimes it happens because of the movie's content, and sometimes it just happens out of nowhere.  I'm sure that Seth Rogen and James Franco never thought their silly buddy comedy The Interview would become so controversial that it would get pulled from cinemas before release, but once you draw the ire of a foreign government intent on publicly shaming you for using them as your point of ridicule, you suddenly have one of the most controversial movies of all time.  And usually it's the timing of the controversy that matters most.  Your movie may intend to push a few buttons, but unless it's relevant to the times we live in, no one will really care.  Some controversial movies of the past now feel tame, usually because of the loosening of social standards on violence and sex in the years since.  Stanley Kubrick's masterpiece A Clockwork Orange (1971) was so controversial in its time that it was pulled from release in the United Kingdom for decades after being blamed for inspiring copycat gang violence in that country.  Seen today, after all the violent movies that Hollywood has put out since, A Clockwork Orange doesn't have the same kind of shock factor, unless you count the unsettling eye torture that Malcolm McDowell's Alex goes through later in the film.  But its notoriety was definitely fueled by its early explosive reputation.  And it's button-pushing movies like this that have moved society in a direction that makes their content more acceptable.

Other controversial movies, however, lose their luster after their time has come and gone, especially if all that made them stand out in the first place was their button-pushing content.  Documentaries are especially notorious for brewing controversy and gaining exposure right away.  Michael Moore's Fahrenheit 9/11 (2004) became noteworthy as the highest grossing documentary of all time, mainly due to being a scathing critique of a sitting President during an election year.  It elevated Moore in the worlds of film and politics, and he believed that his movie would lead to the ousting of then President George W. Bush and would make history by becoming the first documentary to be nominated for Best Picture.  When neither happened, the aura around the movie diminished, and Michael Moore has been increasingly marginalized as his kind of firebrand filmmaking has become less popular during the Obama administration years.  And indeed, political documentaries are the ones that can gobble up awards and gain exposure quickly, but they are usually forgotten just as quickly.  The documentaries that last longer are the ones with a compelling story, like Shoah (1985) or Hoop Dreams (1994).  But sometimes controversy drives movies not by design but by circumstance.  For instance, the production of Joseph L. Mankiewicz's Cleopatra (1963) ran into controversy when its two leading stars, Elizabeth Taylor and Richard Burton, began a highly publicized affair on the set.  In order to capitalize on the public's awareness of the scandalous couple, Fox quickly streamlined the already troubled production and turned what was going to be a two-part series into one four-hour epic.  It was Fox seizing the moment to save its over-budget monster, even though it was never planned that way.  Controversy is a double-edged sword.  Either it can give your film production a lot of headaches, or it can be that lightning in a bottle that boosts your movie into places unseen before.

American Sniper is likewise caught up in a flurry of controversy that is driving a wedge between different political factions, but at the same time is boosting its box office to unprecedented levels.  I for one believe that it's a movie deserving of exposure in any way it can get it.  I just hope that people also go into the movie with an open mind.  The most interesting thing about controversial movies is that they are unpredictable.  We can usually tell a lot about ourselves, and where we stand as a culture, by which movies we end up making the most controversial.  Sometimes they force us into confronting issues that need to be addressed, or they force us to reevaluate our own standards of decency and acceptance.  Many controversial movies have benefited us in the long run, while others diminish when they prove to be just a product of their time.  But is it right to give in to the hype surrounding a movie?  Sometimes these movies only become controversial because we are the ones fueling the fire behind them.  Even when a movie becomes a button-pushing issue, it should be stated that it's also just a movie.  The only power you give it is the effect that it has on you.  You can choose to ignore it, or you can participate in the frenzy and see what all the fuss is about.  Sometimes your opinion may surprise you once you've actually witnessed the movie yourself.  But there's no mistaking the fact that these movies have a power over the industry, and they are usually the driving force behind what we see in the future.  Hot button movies are just as important to the industry as the tried and true formula pictures.  In the end, it's not a big deal if controversy and hype fuel the performance of a movie.  Time will tell if it was all worth it, and whether the movie can stand on its own.