
Strength and Honor – 20 Years of Gladiator and the Last of the Sword and Sandal Epics

The late 1990s was an interesting transitional period for Hollywood, as the advancement of computer-aided technology had opened many new doors for the industry. Special effects extravaganzas like Twister (1996), Independence Day (1996) and Armageddon (1998) were dominant at the box office, but were usually seen as little more than popcorn flicks, rarely celebrated for anything other than their effects. But there was another interesting occurrence happening during this time: a surprising resurgence of what was once the dominant genre in Hollywood filmmaking, the historical epic. Reaching a peak in the late '50s and early '60s, the historical epic became the defining example of what Hollywood could accomplish, taking full advantage of the new widescreen processes of the time and delivering larger-than-life recreations of legendary moments of the past. Starting off with the biblical epics The Ten Commandments (1956) and Ben-Hur (1959), the genre would go on to be defined by other history-based dramas like Spartacus (1960), Lawrence of Arabia (1962), Doctor Zhivago (1965) and Patton (1970). They were all grand in scale, with epic lengths to match their ambitious scope (a three-hour run-time being the norm). But with out-of-control costs also defining their productions, like the near-cataclysmic Cleopatra (1963), the genre that would become known as the "sword and sandal" epic could not sustain itself for long. Once the CGI revolution of the '90s began, however, filmmakers raised on those epics of the past found a new, cost-effective way to bring movies of that scale back to grandiose life. Movies like Kevin Costner's Dances With Wolves (1990) and Mel Gibson's Braveheart (1995) began to reinvigorate the long-dormant genre, and then James Cameron's Titanic (1997) pushed the door wide open. But a true return to an actual "sword and sandal" epic wouldn't happen until the turn of the century with Ridley Scott's Gladiator (2000), a movie that not only revived the epic in a big way, but also began its inevitable second decline as a result.

Gladiator, now reaching its 20th anniversary, is a rather intriguing oddity in the history of Hollywood. Feeling both modern and nostalgic, it's a movie that really feels out of place in its time. It is very much a throwback to the "sword and sandal" epics of Hollywood's "Golden Age," and yet every aspect of its cinematic style is thoroughly contemporary. Even more surprising is how much it was embraced by audiences and critics alike upon its release. Despite the fact that a movie set in ancient Rome had not been a box office hit in nearly five decades, Gladiator somehow managed to be one of 2000's highest grossing films, bested only by Mission: Impossible 2 (2000) and How the Grinch Stole Christmas (2000). And it did so with no A-list stars. Australian actor Russell Crowe had appeared in only a handful of Hollywood films at that point, most notably The Insider (1999), which garnered him a Best Actor nomination a year prior, but he was unproven as an action star. Joaquin Phoenix's career had barely begun, as he had only recently moved beyond the child actor roles of his youth into more adult ones. Plus, director Ridley Scott had been known more for his work in science fiction with movies like Alien (1979) and Blade Runner (1982), and a lavish costume drama was something he had tried and failed at before with 1492: Conquest of Paradise (1992). But despite these limitations, the movie caught on in a big way and would go on to become one of the most influential movies of the decade. It even rode its success to a Best Picture win at the Oscars, something that even most "sword and sandal" epics couldn't do, coupled with a Best Actor win for Crowe. It really leads you to wonder why this movie, above all the others, became the one to reach the top in such an unexpected way.

Part of its creation no doubt came from a group of filmmakers wanting to indulge their nostalgia for the epics of the past, while at the same time finding a way to make good use of the new tools that could make a movie like it work. The original concept came from screenwriter David Franzoni, who had been working on the story since the 1970s, having been a fan of old-fashioned epic movies and having carried an interest in Roman history his whole life. Finding his way eventually to DreamWorks, and working alongside Steven Spielberg on the movie Amistad (1997), Franzoni pitched the project to Spielberg, who himself had a long-standing interest in the genre. Spielberg, too tied up at the time to direct himself, eventually gave the project over to his longtime friend Ridley Scott, who was more than eager to jump on board. Scott certainly was also influenced by old-fashioned epics, but he was far more interested in looking at this era through a more authentic lens, creating a Rome that felt less polished and more lived-in. Nevertheless, the movie would give him the chance to imagine Ancient Rome on a scale that old Hollywood could have only dreamed of, and on a more modest budget. Aided by cutting-edge CGI technology, Ridley could not only rebuild the mighty Colosseum, but even take you inside it and pan around it from all angles, like in the now iconic 360-degree shot from its ground level. After bringing playwright John Logan and screenwriter William Nicholson in for rewrites to further match Scott's vision, production began in earnest in the spring of 1999. Shooting in places as diverse as Morocco, Malta, and England, Gladiator was certainly one of the most ambitious projects of its time. But even with a professional assembly of cast and crew on board, the movie still endured some unexpected hurdles. Most notably, actor Oliver Reed, who was playing the role of gladiatorial trainer Proximo, suffered a fatal heart attack in the middle of shooting. With the actor playing a crucial role in the final film, and without his last scenes completed, Scott needed to find a way to bring his story-line closure without having to recast the role and reshoot half the movie. He made the controversial choice to digitally resurrect Reed by placing his face on another actor's body in a groundbreaking visual effect. Yes, it's a little ghoulish to do this after the actor has died and has no say over how his likeness is used, but in doing so, Ridley Scott managed to preserve the rest of his performance within the film, which is a wonderful final bow for the legendary character actor.

Though the movie is a magnificent showcase for Ridley Scott's talents as a film director, the thing that I think really turned the movie into an iconic hit with audiences was the character at the center of it all: General Maximus Decimus Meridius. From the moment we first see Russell Crowe's world-worn but resilient face on screen as the character, we know that this is someone who will command our attention for the next two and a half hours. Crowe absolutely owns the character, being both physically imposing and vulnerable in spirit. And his journey through the film is no doubt what has captured the imagination of audiences around the world. The film's tagline describes Maximus as "the general who became a slave; a slave who became a gladiator; a gladiator who defied an empire." And it's that undaunted drive to push beyond your limits and fight your way back to the place where you belong that makes his story so captivating. After losing everything, his rank, his home and his family, at the whims of a power-hungry dictator, Maximus uses what power he has left (his fighting skills and knowledge of combat) to battle his way back and seek justice. Russell Crowe is commanding in the moments that call for him to be larger than life, but I feel that his best work in the movie comes in the quiet moments in between, where Maximus becomes introspective and questions whether or not his path should lead him down this road towards vengeance. Of course, a hero is also only as good as the villain he faces, and Gladiator has a memorable one as well. Joaquin Phoenix's Commodus is one of the most depraved heavies put on screen in recent memory, and Phoenix does a wonderful job of capturing the slimy nature at his very core. It's a performance that hints at the talent Phoenix would display over the following 20 years, and more than anything else it's the element of the movie that has aged the best. Whereas most historical epics portray their villainous figures as pillars of insurmountable strength, Commodus is defined by his insecurities; a sniveling man-child prone to abusing his power to compensate for his lack of strength. In many ways, Commodus is a truer depiction of how a tyrant actually behaves than anything else we'd seen in movies before, and it's a real credit to the power of Joaquin's manic portrayal of the character. It also plays off of Crowe's Maximus perfectly, centering the movie around a truly dynamic contest between idealistic good and loathsome evil.

In the 20 years since the release of Gladiator, it's very interesting to see what kind of legacy it has left behind. One thing that may be surprising is the fact that while it did revive the long-dormant "sword and sandal" epic, it only did so for a short time, at least in the traditional sense. The movie's box office success and subsequent Oscar wins signaled to Hollywood that this type of movie was easy money once again, which only in retrospect looks like a foolish assumption. As stated, Gladiator stood out because of its stellar story and unforgettable characters, and not just because of its spectacle. But what Hollywood believed was the key to its success was the ground-breaking visual effects it used to tell its story, which made people believe that they could transplant the formula onto any old historical event and make it work just as well. This led to some rather misguided attempts at Gladiator clones. Much like how Pearl Harbor (2001) tried and failed to capitalize on the Titanic craze from a couple years prior, these wannabe Gladiators also paled in comparison. Two of the most notable failures came from one studio in particular, Warner Brothers, who put their money behind a couple of bloated epics in the same calendar year. First was Troy (2004) from Wolfgang Petersen, which mistakenly believed that the vain, egocentric Achilles (played by a miscast Brad Pitt) could become the next Maximus; and then there was Alexander (2004) from Oliver Stone, with an even more grossly miscast Colin Farrell as the boy conqueror. I already went into detail comparing the two movies in an article here, but suffice to say, both missed the mark completely in recapturing the spark of Gladiator. Ridley Scott himself would also find it hard to repeat within the genre he helped to revitalize. His Crusades-era epic, Kingdom of Heaven (2005), was butchered by 20th Century Fox upon release, with nearly an hour of run-time removed from his original director's cut, creating a hollow shell of what it could have been. Even a re-teaming of Russell Crowe and Ridley Scott on a new version of Robin Hood (2010) couldn't come close to the effectiveness of Gladiator. All of which really made people wonder whether Gladiator did in fact revive the "sword and sandal" epic at all, or was just an anomaly.

In one way, I do think that Gladiator did in fact bring the "sword and sandal" epic back in a big way; just not in the way you would expect. The aesthetics of the genre were revitalized by Gladiator, but they managed to thrive beyond it in an entirely different genre. Only a year after Gladiator's release, Peter Jackson and New Line Cinema released the first film of their ambitious The Lord of the Rings trilogy. The Fellowship of the Ring (2001) in many ways took up the mantle of what Gladiator brought to cinema, making its fantasy world feel authentic in the same way that Scott's film tried to recreate ancient Rome. Peter Jackson wanted to take J.R.R. Tolkien's fantastic world of Middle-earth and film it in a way that felt like a historical epic. Great attention to detail was called for, making the world the characters inhabited feel like it had existed for centuries, rather than looking like an obvious set on a soundstage. Removing the staginess of the production is one thing the movie has in common with Ridley Scott's epic, and it wouldn't surprise me if Peter Jackson was in some way inspired by the success of his predecessor while making his own epic. He's certainly not copying Scott's style, but seeing Gladiator succeed with its own level of authenticity no doubt must have emboldened Jackson's determination to do the same. So, though the historical epic genre floundered in the 20 years since Gladiator, the same kind of epic grandeur still lived on in the fantasy genre, which saw its own renaissance in the new century. We are even seeing the superhero genre beginning to pick up the mantle, with moments in Marvel's Avengers: Endgame (2019) and DC's Wonder Woman (2017) and Aquaman (2018) also emulating the epic style of Gladiator and Rings. So, even if Gladiator didn't make a lasting impact on the historical epic genre, its style still lived on in other genres that have only increased their epic grandeur. Look at something like the arena fight between Thor and the Hulk in Thor: Ragnarok (2017) and tell me that Gladiator isn't one of the first movies you recall. Sure, it may look different, but there are certainly echoes that ring familiar at the same time.

One has to wonder, though: is Gladiator the last of its kind as well? It certainly is the last historical epic to have ridden a wave of success to a Best Picture win at the Oscars. In the 20 years since, the Oscars have favored contemporary dramas more than any other, with 2003's The Lord of the Rings: The Return of the King and 2013's 12 Years a Slave being the rare exceptions. The "sword and sandal" epic revival was pretty much over by the end of the 2000s, with even Ridley Scott leaving it behind for a return to sci-fi in more recent years. And it would seem that the onetime pinnacle of Hollywood filmmaking is no longer a viable art form in the vastly different market we see today. Nobody wants to invest anymore in a lavish reproduction of what is essentially a history lesson. You do see modest attempts at it every now and then, but even so, they don't have the majesty of something like Gladiator. Netflix has recently made a couple tries at creating lavish period epics, like Outlaw King (2018), starring Chris Pine as Robert the Bruce, and The King (2019), starring Timothée Chalamet as Henry V. Though both films are more ambitious than most, you can almost feel them being restrained by a sense of not trying too hard. I think that's what made Gladiator hit so well with audiences. It was the last epic of its kind to not only be lavish in its execution, but unapologetically so. It was a movie that was not afraid to embrace the over-the-top nature of its genre while at the same time trying to modernize it with a gritty aesthetic. It's not just in its epic scale that it worked, but also in its characterizations. Maximus and Commodus are iconic characters that capture the imagination. You can give them the most absurd, old-fashioned dialogue possible and they will still command the screen, because they are that interesting. In many ways, it's not that the style of the genre went out of fashion, but that Hollywood forgot to add substance to these epics, leaving production values to do most of the heavy lifting. And that's why the style that Scott's Gladiator repopularized moved on into genres that had far more interesting characters.

So, what we take away from the legacy of Gladiator is that it had a profound impact on the movies of the new century, but in a way that I think Hollywood and even the filmmakers involved probably didn't expect. Most of the industry expected the "sword and sandal" epic to just pick up right where it left off thanks to the movie, but that just wasn't the case. We ended up seeing a short-lived revival, while other genres would prosper, picking up the aesthetic that had served the historical epic so well in the past. But though the movie may not have given its own genre the kick that it needed, it still holds its own as a fine epic that stands the test of time. It did a lot of things for Hollywood as well; it reinvented Ridley Scott as an epic filmmaker, broke new ground in visual effects, made Russell Crowe a star and shone a spotlight on a young Joaquin Phoenix. Legendary character actors like Richard Harris, David Hemmings, and Oliver Reed were also well served by this movie, with Reed's final performance thankfully preserved even though it was left incomplete. But I think what works best for the movie is how it inspires the audiences who see it, reminding them of a time when historical epics set the high bar in filmmaking. Sure, Gladiator is far from historically accurate, as most films like it are, but it shows just how powerfully a window into the past can capture our imaginations on the big screen. During a rousing pre-battle speech early in the movie, Maximus tells his soldiers that "what we do in life echoes in eternity." It's a profound statement that not only strikes a personal chord, but also reflects well on the legacy the movie leaves behind. By being the bold artistic vision that it is, and striving to uphold the high standard of the grand epics of the past, Gladiator continues to inspire a whole new generation of filmmakers to keep that tradition going, and hopefully stokes a desire to see more movies like it made in the future. Strength and honor indeed.

What’s the Rush? – Are Deadlines Creative Roadblocks to Movies and Is It Worth It to Delay?

Something peculiar and unprecedented happened in the film industry last year, and it came from the unlikeliest of places. When Paramount Pictures delivered a first-look trailer for their big screen adaptation of the video game Sonic the Hedgehog, it was received with a fair bit of outrage. Long-time fans of the character were quick to point out how terrible his new "enhanced" character model looked, and they flooded social media with their complaints. But what shocked many people afterwards was the fact that Paramount quickly pulled the movie off their schedule, stating that they were going to "fix" the animation and change the model of Sonic to better reflect the demands of the fans. This is something that is pretty much unheard of in Hollywood: a movie studio halting the release of a movie over the backlash to its trailer. Originally slated for a holiday season release, the Sonic the Hedgehog movie is now being released in theaters with its new "refreshed" animation, and, as of this writing, it is looking like it's going to have a better than expected opening weekend. Which raises the question: did delaying the movie actually improve its chances? I haven't seen it myself yet, but judging by the progression the movie went through from that original trailer to the final product, it looks like Paramount might have indeed salvaged what could have been an embarrassing train wreck. I'll definitely say that the new and improved model is a step up from the grotesque version we saw in the original trailer. He at least looks like the character from the video game now. But what does this tell us about the film-making process in general? Does it actually benefit a movie to have a delay in production in order to fix supposed problems? Is meeting a deadline actually counterproductive to making a film better? Is it right to take the response to a trailer as motivation to re-work a movie? In the blockbuster-driven market we are living through now in Hollywood, the questions raised by Sonic the Hedgehog's troubled production provide some clues to problems that stem far and wide throughout the business as a whole.

So, what was the issue with Sonic the Hedgehog being delayed a few more months? The answer is that it's something that just does not happen in Hollywood; at least not this late in the game. One of the longest running mechanics of the studio system in Hollywood is the planting of a flag within a release schedule. It's a way for the studios to tell the industry that they plan to have a movie ready for release on a specific date many years in advance, even if they don't quite know what that movie is yet. It's mainly done to assure business with the theater chains, who want to know what to expect over the next several years so they can make their long term plans. Normally, these tent-pole dates occur during important periods of the year when both the studios and the theaters expect bigger than average audiences, like Memorial Day weekend or Thanksgiving or the Holiday Break. And by planting their flag on these busy weekends, the studios can assure themselves that they'll stand out among the competition. After that, the pre-production planning begins, where studios figure out what they'll actually fill those dates with. Some studios know exactly what they'll put there; Marvel, for example, has laid claim to the first weekend of the summer season every year for more than a decade. Other times, the date exists just so the studio has a claim to a lucrative time frame, and they fill it with a movie that they hope will benefit from it. Regardless of what that movie is, the studios have that date set, and the focus from there on out is making sure that a movie is ready on that date. Once it's determined what movie will fill the slot, then it's up to the production team to meet that goal. And, as we've seen with Sonic, sometimes a movie just isn't ready for prime time.

There are different schools of thought within the industry with regard to how deadlines are viewed. Some people, mainly writers, view deadlines as a positive, because they help to sharpen their focus and allow them to get things done without too much second guessing. On the other hand, there are other people in Hollywood, mainly on the production side, who view deadlines as cumbersome to the creative process. Film directors in particular like to have as much time to work on a project as they possibly can, because it allows them to be experimental and shoot a scene from every possible vantage point. Without the constraint of a time limit, they can also accommodate delays that occur during production, such as with the unpredictability of the weather. The same goes for the editing process, where more time allows for a more thorough search for mistakes and oversights missed on the days of production. But a movie studio can't allow film crews all the free time in the world, because then you end up with runaway productions. This is one of the things that ultimately ended the era of New Hollywood in the 1970s. When Francis Ford Coppola's Apocalypse Now (1979) and Michael Cimino's Heaven's Gate (1980) both went over budget and over schedule, the studios intervened and put an end to projects that seemed like they would go on without end. Ambitious visions are a valuable thing for a filmmaker, but at the same time, there has to be accountability with the budgets, as well as an end point in place, and that's why studios have become more reliant on those tent-pole dates to ensure things don't spiral out of control.

But even when you have a well-oiled machine that is able to fire on all cylinders towards meeting those deadlines, it can cause friction along the way. Some filmmakers find that meeting those demands from the studio ends up diminishing the finished product. In a tweet delivered shortly after he parted ways with Marvel Studios, Doctor Strange (2016) director Scott Derrickson described studio release dates as "the enemy of art." The tweet came in the wake of what he and Marvel mutually described as creative differences, which led to his exit from the upcoming Doctor Strange sequel. And it's a trend that we are seeing happen more and more in Hollywood as blockbusters become more like assembly line products in service of ongoing franchises. Some filmmakers are able to work under those conditions, while others feel stifled by them. Marvel has benefited from a long line of stable productions throughout its run, but the same can't be said about its sister company under the big Disney umbrella: Lucasfilm. Multiple productions on that side have faced upheavals in recent years, with several filmmakers like Colin Trevorrow, Benioff & Weiss and Lord & Miller either being let go from a project or exiting by their own choice. Creative differences likely played a part in these shake-ups, but so did the fact that many of them recognized that delivering under a tight schedule would've negatively affected their projects. This seems especially to have manifested with Star Wars: The Rise of Skywalker (2019). The movie was started almost completely from scratch once J.J. Abrams stepped in to take over from Trevorrow's previous direction, and yet it still needed to make that December 2019 deadline. It became clear, with an unfocused narrative and far too many plot holes left open, that the movie needed a few more months to iron out its story problems. But parent company Disney was insistent that it be done that specific year, because the release was going to coincide with the opening of Galaxy's Edge in the parks as well as the launch of the Disney+ series that tied into the universe. And thus, we got an unfinished movie in theaters that not only was the least popular of the new batch of films, but in some ways also tarnished the brand.

And from that, you see why Scott Derrickson views deadlines as the "enemy of art." But at the same time, a project run amok has its downside too. What Star Wars, and for that matter Sonic the Hedgehog, shows us is that there should be more assessment of how much time a movie actually needs to be ready. This can usually be examined early on in a film's development, and oftentimes you do see film studios halt production well beforehand in order to keep a movie from going off the rails. There are many movies that get announced but are never made. Some people wonder why these movies never get off the ground, and it's because the studios assess the risks involved if they continue down the same road, and sometimes those risks are not worth the investment. There's the example of Warner Brothers' Superman Lives, which was going to be Tim Burton's own spin on the famed comic book character, after he had already famously brought Batman to the big screen. The movie went through a fair number of steps in pre-production, including casting (with Nicolas Cage playing the man from Krypton) and location scouting, before the studio ultimately pulled the plug. Such decisions usually come either from executives balking at the budget or from a lack of enthusiasm from the public in general. There are probably more examples of movies that died in development than there are ones that made it into theaters, and that includes projects from some of our greatest filmmakers, like Stanley Kubrick's Napoleon, Francis Ford Coppola's Megalopolis, and Ridley Scott's Tripoli. But these projects never end up hurting the reputations of their creators, because they stalled long before things got way out of hand. Perhaps what makes a case like Sonic the Hedgehog so unique is the fact that it changed course so far into production.

The unfortunate result of doing that kind of course correction so late is that it puts extra pressure on the people working on the film. In Sonic's case, the visual effects team pretty much had to throw out months' worth of work and effectively start over again. This is especially problematic because the visual effects industry is notorious for over-working its artists, as well as adding a substantial amount to the budgets of the movies. When a film is reaching that crunch time before a release date, it's the post-production crews that feel the crunch the most. And in the case of Sonic the Hedgehog, they were saddled with having to work overtime on a project that they thought was nearly in the can already. Going back and re-doing their work meant extra time away from their families in order to make the new deadline, and over the holidays no less. To add more salt to those wounds, the visual effects company responsible for the Sonic redesign, the Moving Picture Company (MPC), shortly afterwards closed its Vancouver location, where all the Sonic work was done. So, not only did the effects artists have to work through the holidays, but they were left without a job right after. This speaks more to the volatility of the visual effects industry, which is a whole other story, but it's indicative of the growing problem of movie productions falling victim to their own inability to plan things out effectively. Usually, movie studios haven't taken the responses to movie trailers this seriously, but in this case, the response had become so severe that Paramount had to intervene. But was it worth putting artists through such a tough time? Many people get into the business for the love of creating movie magic, but when it becomes an arduous task to reverse a problem that should've been caught long beforehand, that allure of creativity doesn't seem so bright anymore.

The question remains: should movies be so beholden to set timelines? In many ways, cinema is the only art form that has to conform to such a demanding schedule. Books, for example, sometimes take years to make it to the shelves between their announcement and their ultimate publication. How long have we been waiting now for George R.R. Martin to finish his next book in the "A Song of Ice and Fire" series? Music also premieres in a different way, with singles often pre-releasing before a complete album, usually as a way to drive up hype. The video game industry, like film, also uses release dates to gain attention for its products, but as games fall prey to delays almost across the board, the gaming consumer base has become more lenient about receiving a game far later than expected. So, why is it that movies are held to such a constraining time limit? No doubt the history of out-of-control productions has influenced it, but does holding onto it actually diminish a final product that ultimately needs more time? The need for more time should become apparent when a major disruption happens, like a complete overhaul of the script or of the team in charge of the production. Otherwise, you are left with a movie project that either becomes a problem far too late into production, like Sonic the Hedgehog, or a movie that lacks an identity because it had no time to evolve into something different, like Star Wars: The Rise of Skywalker. The worst thing a movie can do is waste its potential, and sometimes it needs just that little extra time to finally meet it. Otherwise, you run into the embarrassment of having a movie that fails at what it was supposed to do, because there wasn't enough time to fix it.

Ultimately, there are many films that are far beyond fixing. I don't think that a post-release clean-up of the movie Cats was ever going to save that film from embarrassment. But at the same time, we may have seen another film in Sonic the Hedgehog possibly turn its fortunes around. Sure, it's not going to become an all-time great, but it may have saved itself from the train wreck it looked like it was heading towards. A last-minute delay like its was still not without its costs, though, as the poor digital effects artists will tell you. But a better box office performance for the film may teach Paramount and other studios in Hollywood that rushing towards a release date is not always a good thing. Some movies need to incubate a little longer, and the studios need to recognize exactly when the right moment is to change course. It certainly shouldn't happen as late as post-production, where you have to completely redesign a character because of the immediate backlash to the trailer. At the same time, a deadline also keeps a project in check, so it shouldn't so much be a removal of all boundaries as a re-positioning of the goal posts. If Star Wars hadn't been so strict with its unmanageable release schedule, it wouldn't have been forced into a hiatus like it is now, with so much personnel being shifted around. At some point, a movie will let you know how much more time it needs, or even whether it's going to happen at all. Overall, I don't think Scott Derrickson is right when he says that deadlines are the "enemy of art," because I see a lot of people become more driven when they know they've got an end point they need to get to. It's probably just the writer part of me that thinks that, but having a deadline in front of me allows me to keep my mind focused on a goal and eliminates all distractions. But there should be precautions in place for any case where a project gets de-railed by unforeseen forces. I don't blame movies like Spider-Man (2002) and Zoolander (2001) for delaying their releases to make last-minute edits removing the World Trade Center immediately after the events of 9/11, for example. Sometimes, deadlines are a necessary evil, but it's one that should be flexible enough to allow movies to become the best that they ultimately can be.

Tarnished Gold – Are the Oscars Losing their Importance in Hollywood?

Hollywood is a city built on glamour and prestige. Though movies are made for the masses, the heart of the community itself is invested in presenting this golden gleam of high class. It's the place where you either have to be somebody important or at least be able to pretend to be somebody important. Much of it is a facade, but there's no doubt that when you do visit Hollywood, there is an air of luxury and decadence all around you. It's the kind of Hollywood that you see outside of the tourist haven of Hollywood Boulevard; the one that exists where the stars and power players live and play. The real Hollywood actually exists across the hills behind the famous sign in the less glamorous San Fernando Valley (where I actually live), because that's the home of the biggest studios. But the Hollywood that we picture in our minds is the one found in places like Beverly Hills and Malibu. There is a stark class difference in these kinds of places, because of the way the communities cater to their famous residents, and it's the kind of luxury that definitely lends an aura of desirability to the lifestyle of the movie star. But there is a downside to this way of life, in that it also causes the people living in these communities to live in a bubble; one that unfortunately may cloud their perception of what is really valued within their industry. One of the biggest complaints leveled at Hollywood over the last few years is that it's becoming more and more out of touch with the filmgoers and show watchers it relies on to stay in business. This can be seen in the way that some within the industry are resistant to changes in the market (like the expanding influence of streaming platforms), and also in the way they sometimes alienate a fan-base by demanding too much of its loyalty. But if there was ever a place where the disconnect between people in the industry and the audiences across the country appears most prominently, it's with what should be the biggest night in entertainment every year: the Academy Awards.

The Oscars, as they are more commonly known, have for nearly a century been the pinnacle of achievement within the movie industry. Not only that, they're a driving force as well. Countless movies have been made with one purpose in mind, and that's to secure that golden statue at the end of the year. We may not have seen some of the most memorable films and performances on the big screen had it not been for the allure of the Oscar. But when something is that highly valued, you can almost always count on dishonest ways of securing it occurring behind the scenes. The Academy of Motion Picture Arts and Sciences has changed its rules countless times in order to make sure that its system remains pure and without corrupt influence. Even still, it's a highly competitive race, and much like in the world of politics, it gets uglier every year. Now when a movie gets nominated, you'll almost always read opinion pieces and news reports about how problematic its content is, how a star's off-set behavior reflects badly on the movie itself, and why voting for this film or performance would be morally wrong. It's all studio-driven smear campaigning meant to influence the very easily persuadable voting bloc that is the Academy, and these campaigns can cost millions of dollars on their own. And why all this effort? Because, in years past, an Oscar win meant a boost in box office ticket sales for any given movie. For the movie studios, Oscar campaigns were worth the cost, because the box office would justify it in the end. But with streaming taking the factor of box office grosses out of the equation, the game is changing a bit more, and now the studios are starting to find that the influence of Oscar gold is not as important as it used to be.

The rise of streamers like Netflix and Amazon has put the Academy in an awkward position, because the effect it has on the box office is now somewhat lessened. Before, it was a big deal to have a movie proclaim itself Best Picture of the year. These days, a Best Picture win is almost forgotten and barely even mentioned a year after the fact. Can anyone, other than serious Oscar history buffs (like myself), name the Best Picture winners of the last five years? I'd be surprised if anyone can remember who won it last year; Green Book, for those interested (and good for you for already forgetting it existed). The Oscars have always struggled to keep up with the changing times, but their status as an institution in our culture had never been challenged until recently. Now, the Oscars are hitting a crucial point where they are teetering on the brink of irrelevance, and there doesn't seem to be an easy way for them to get back to the top. The awards have been in gradual decline ever since their peak in the late '90s, when the movie Titanic (1997) swept through with record-breaking viewership of the broadcast, but the fall has been precipitous in the last couple years. Despite the expansion that the Oscars have made to the Best Picture category since 2010 (up to 10 nominees each year), it seems like the winners increasingly end up pleasing nobody. And with so much doubt cast over what should be the biggest award of the night, along with the lessened influence it has on the box office itself, the Oscars are in desperate need of reinvention. But how do they do that, when tradition is so ingrained in their DNA, and the Academy itself is so resistant to changing its ways?

For one thing, the Academy needs a fundamentally different way of presenting itself to the public. The Oscars have always been a stuffy, affluent affair, and in many ways that's been a part of their appeal. But in the last few years, the Oscars broadcast has tried way too hard to appeal to all audiences, and in doing so has lost its identity. Gone are the musical numbers and the montages; instead we are treated to just an endless roll-out of the awards with little pomp and circumstance to surround it. The show has even dispensed with the host of the ceremonies, who usually would end up being the only one to give the show some much needed levity. This has been an unfortunate result of the Academy trying way too hard to comply with the demands of the medium on which it is presented. The Academy Awards have been broadcast on television since 1953, and have been a fixture ever since. But as stricter FCC rules have come down hard on live shows like the Academy Awards, the opportunities for spontaneity have also dwindled. The Oscars producers have tried more and more to stamp out any moment that might get them in trouble at these shows, like a political rant or a publicity stunt gone awry. But unfortunately for them, these are the moments that have made the Oscars the fascinating institution that they are, and trying to suppress these moments only makes the show feel more boring and unremarkable. Not only that, the show has to limit itself in order to hit those necessary commercial breaks that the network demands. That's why the orchestra always plays music in the middle of a winner's speech; it's the show producers' way of telling the person to wrap it up. Even with all that, the Oscars always receive the complaint that they are too long. And in response to the network's complaints about the show's length, the Academy made the fundamentally ill-planned decision to pull some of the categories out of the broadcast altogether; a decision that was thankfully reversed after the backlash it received from rightfully indignant members of the industry.

Though it may be a controversial proposition, I would suggest that broadcast television may no longer be the best place for the Oscars at this point. Much like how the industry is already moving in this direction, perhaps the Academy should embrace streaming as an alternative form of presentation. This way, they can avoid the pitfalls of having to comply with broadcast standards and commercial breaks, and instead present the ceremony in all its glitz and glamour like it used to be. There is the issue of covering the cost of the ceremony, which the commercial breaks from the live broadcast would have taken care of, but there could be an alternative to this as well. The studios could use the ceremony to premiere exclusive first looks at their upcoming movies, paying the Academy for the privilege. Yes, it makes the show more commercial in itself, but honestly, isn't it that way already? The Oscars can't pretend that their ceremony isn't all about building hype and earning money for the movies winning the awards. There is the argument that it's about honoring the art, which is valid, but Hollywood is still a business, and I would rather see the Academy take the ceremony back into its own hands than have it comply with the standards of another branch of the entertainment business. Other awards shows, like the Game Awards, are already starting to embrace the streaming model, so this might be a possible avenue in the Academy's future. If anything, it will free them up to be the kind of show it honestly should be, which is unapologetically showbiz at its most spectacular. Hosts should be free of constraints, winners should be able to say whatever they want after they win, musical numbers should dazzle and amaze. Yes, it could all be messy, but that would still make it memorable.

There is also the issue of how the Academy votes for its winners. The downside of the industry living within a bubble can be especially felt here. More and more we are seeing a disconnect between what the audiences value and what the Academy values. At a time when audiences, critics, and industry elites can't agree on what deserves the year-end accolades, it becomes increasingly unclear whether the Academy is still the supreme authority in the end. This is especially clear when it comes to movies that are deemed "popular." A couple years ago, the Academy got into hot water when it put forward the idea of creating a Popular Film category for the Oscars. This caused a huge backlash, and the idea was quickly reversed, but it was also telling of just how insulated the Academy voters are as an audience themselves. To them, throwing a bone like that to blockbuster favorites seemed like a positive step forward, but what it actually did was expose the elitism that the Academy seems to be unaware it has. When a big budget blockbuster crosses over into becoming highly influential for the culture at large, as Black Panther (2018) did in breaking down so many barriers for African-American filmmakers, it stands to reason that a movie like it should get due recognition from the Academy. But to ghettoize it by pushing it into a "Popular Film" category just undermines its impact, and is kind of an insult to the people who made it and the fandom that embraced it. This has increasingly become an issue with the Academy, which seems to be making more and more "safe" choices at the ceremony, like what happened with the Green Book debacle last year. In one of the Academy's least popular choices for Best Picture in many years, the Oscars looked like they were beginning to lose touch with the audience, because it was ever so clear this time that the Academy just went for the least offensive pick in a field of otherwise challenging films.

There's also the unfortunate factor of what appears to be a far less engaged pool of voters within the Academy. The demographics of the Academy, as we've come to find out, show that it is disproportionately white, male, and over the age of 50. There have been efforts to boost the diversity of the Academy voting bloc, especially in the last decade, but even still, the movies that end up winning Best Picture seem to be the ones that appeal only to that narrow demographic described above. Not only that, it's a demographic with its own biases when it comes to what qualifies as a movie deserving of the award. As we've learned over the years, many Academy voters tend not to watch movies in a theater, instead choosing to base their votes on the screeners that they can watch from the comfort of their own homes. And those who do watch in theaters are passionate about that standard of presentation, and are skeptical of new models like streaming. There are even those who don't watch the movies at all and just vote based on their gut feeling. This apathy towards the experience of actually watching the movies really raises the question of whether the votes the Academy makes are valid at all. Sure, no one should pressure the Academy to vote one way or another, but at the same time, you really wish they would go in informed before they cast their votes. My feeling is that a vote should be cast only after the voter has viewed all the nominees eligible for the award. Preferably they should see them in a theater, as many movies are best viewed that way, but I do know that it's not possible for some of the oldest voters in the Academy. They just need to show those of us outside of their closed, elite organization that they are ensuring that every movie is given its fair exposure to the voting bloc as a whole, and that the ingrained biases that the voters might hold will not go unchallenged. Like any important institution, there needs to be trust between the industry and the consumer; otherwise, it'll appear that the Academy is purely catering to a select group of elites and nobody else.

Are the Academy Awards destined to become an irrelevant relic of the past? Hardly. They still hold an importance every year in Hollywood that will likely never go away. At the same time, with shifting demographics, newer platforms for presentation, and changing attitudes both within the industry and in the public at large, the Academy really needs to wake up and try experimenting a little in order to not look like it's stuck in the past. For one thing, it should embrace the glory days of its past, and not be so eager to conform to a strict standard that robs it of any spontaneity. It should also reconsider what it deems worthy of Oscar gold, because as we've seen in recent years, some of the best films are the ones that don't even get a passing glance from the Academy, because they are too unconventional. The Academy is not compromising its integrity if it suddenly embraces a movie that's deemed "popular." Popular movies can be works of art too. Also, there should be more effort to broaden the spectrum of voices within the Academy itself. Part of why the demographics of the Academy have shifted so far one way is because that's what the industry valued many decades prior, but now the industry has taken on a much more diverse character, and the Academy itself should reflect that more closely. Otherwise, the divide between what the Academy values and what the movie-going public values is only going to widen further, leading to even further irrelevance in the future. It would also behoove the Oscars to embrace new forms of presentation, to allow greater access for viewers to see the ceremony in the same way that the attendees do. Instead of the broadcast model, allow an uncut live feed to be available online; that way you don't have to cut out categories, and the ceremony can move along at its own pace. At the same time, I understand that I'm making these suggestions as an outsider who will probably never move the Academy to change its ways. But I do speak as someone who has been a fan of the Academy Awards and what they represent. I want to see the Oscars gain back some of their glory, and that requires a bit of change to make it happen. Hopefully, the Academy learns to embrace some of the changes made to its organization over the years, and welcomes a wider swath of deserving movies into its pantheon of winners. We want the Oscars to mean something, and that requires them to make the most informed choices in who they honor. Like the statues they give out, all they need is a little polish in order to shine once again.

Where to Now? – How Cinema Transitions from One Era to Another

When we enter a new decade, the first thought that we often ponder is what the last 10 years were all about. This can cover a variety of things: politics, music, culture, and really just the lives we lived during that time. Essentially, we like to mark this transition in years as an era of time, as if these 10 calendar years themselves had their own defining characteristics. The truth is that eras are not so easily defined; the time period we know as the '70s probably didn't define itself until the latter half of the decade, and spilled a little bit over into the early '80s. But we still define these decades as such because of all those above factors: the culture, the politics, the music, and of course, the movies. If anything, it's really the movies that have come to define the transitioning of our culture from decade to decade, as you can definitely see a progression that not only was shaped by the culture that made them, but also would go on to influence the culture itself. We all like to determine what the defining '80s movie was, or the defining '70s movie, and so on, and there are always some worthwhile candidates throughout. But, as indicated earlier, the movies that come to define an era don't always arrive right at the turn of a new decade. With some rare exceptions, few movies actually mark that transition right at the turn of the decade; they are often found somewhere in the middle, or even at the very end. But even still, it is interesting to see how much the eras of cinema coincide with the character of the decades that they exist within. And as we go into a new one this year, it makes us wonder where the next ten years are going to take us, and if those markers even matter anymore, given how much change cinema seems to be going through even year to year now.

In many ways, we really didn't take into account how much a decade left its mark on the movies until after the culture itself shifted. Once the counterculture movement started to move into full swing in the late 1960s, film criticism and analysis started to look back on the years prior as a way of defining the past against the culturally shifting present. That's when people started to look at the eras that were apparent, less defined by the decades they existed in and more by the advancements made within the art form. Specifically, early films were defined as the Silent Era, which encompassed decades' worth of movies, extending from the first Edison Kinetoscope shorts at the turn of the 20th century, to the grand expressionist masterpieces of the German masters, to the very beginnings of Hollywood itself. This celebrated era finds its end with the release of Al Jolson's The Jazz Singer (1927), the first "talkie," which would revolutionize the industry overnight and bring synchronized sound to the art form. But even after The Jazz Singer, silent films didn't just end; Chaplin, for instance, would continue making silent movies for several more years. It did, however, mark the end of their dominance in the medium, as sound film would quickly take over as the norm. This as a result becomes the narrative of the history of cinema: with one fell swoop, one era of movies comes to an end and another begins, ignoring the blurrier line that really exists between them. Even still, cinema aficionados really want to classify a time period within these parameters and pinpoint exactly where an era ends and begins. This is why the Silent Era feels so fittingly concluded by The Jazz Singer, because it works like a cinematic exclamation point. It also marked a point where new advancements in technology would play the defining role in marking a transition for cinema in general.

As such, the years that followed would see new eras defined by various advancements in the medium. The introduction of Technicolor, the invention of anamorphic widescreen, even 3D and Smell-O-Vision would characterize the changing times of cinema in the years ahead. Real world issues would play a factor too. The 1940s would absolutely be characterized as one thing in particular within cinema, because it was the thing on everyone's mind at the time: the War Years. With World War II raging across the globe from 1939 to 1945, it's easy to see how such a worldwide event would dominate every aspect of the culture, including the movies. Indeed, every movie made in those years was in one way or another affected by the War, with some addressing it more overtly than others. Even if you watch a sweet little romantic movie from that era, you'll notice in the movie's credits that there's a reminder to buy war bonds in the lobby, which shows that even escapist entertainment needed to do its part for the war effort. But even with the war hanging over the culture and the industry like it did, that doesn't mean there was a disruption in the advancement of film-making during that time. Some of the greatest movies ever made deal with the War head-on and still hold up long after the conflict's end; Casablanca (1943) being one of the shiniest examples. But the War Years as they are known in cinema also extended beyond the War itself, as the aftermath left its mark on the years after. Soldiers coming home from the war became not just a different audience for the movies, but an interesting subject as well. The Oscar-winning drama The Best Years of Our Lives (1946) tackled the lingering trauma of the post-war experience head-on, including having a real life wounded vet, Harold Russell, playing a key role. There was also a movie like It's a Wonderful Life (1946) that, while not about the War itself, was still thematically linked to it; especially considering that both director Frank Capra and star Jimmy Stewart were returning vets themselves. Culture and technological advancements alike would shape the different perceived eras of cinema, and though brief in comparison, the War Years would leave the most profound change on the industry.

But, once the counterculture began to really push society in a different direction, the statement cinema left behind became more relevant to how it would define an era.  For the most part, the years immediately following the War probably defined cinema the most, as they have been affectionately dubbed the Golden Years.  During this time, to give the rapidly growing families of the "Baby Boom" the kind of escapist entertainment they desired, Hollywood began investing in bigger, more lavish productions.  This was the era of the Roadshow picture, with massive scope and production values meant to envelop the audience in an experience they could only find on the big screen.  This was also spurred on by the beginning of television as a direct competitor.  Movies became grandiose spectacle, and with that came the inevitable downfall.  These movies often became financially unsound, with budgets ballooning to unfathomable heights.  20th Century Fox's Cleopatra (1963) nearly bankrupted the studio, and they weren't the only ones feeling the crunch.  At the same time, people were growing frustrated with the Hollywood machine, and were more attracted to the international output of bold new artists coming out of the French New Wave or the Italian Neo-realist movement.  Thus, we began to see push-back from the counterculture, who saw big "Hollywood" as a relic of the past, and who wanted to carve out a "New Hollywood" in its place.  And in this period of time, you will find the most definitive year of stark transition ever in Hollywood.  Though the psychedelic 60's had a major influence on cinema throughout the decade, Old Hollywood was still a lingering presence.  And then came 1969, when the real schism finally split the two apart.  It was the year that produced both Hello, Dolly! (1969), an old-fashioned, and expensive, throwback musical, and Easy Rider (1969), a micro-budget celebration of hippie culture in America.  Dolly crashed and burned at the box office, while Easy Rider became a smash hit, and the writing was finally on the wall.  1969 was the year that New Hollywood had finally come into its own.  This was even more apparent come Oscar time, when Best Picture was given to the first X-rated winner, Midnight Cowboy; on the same night that Old Hollywood legend John Wayne won his Oscar for True Grit, no less.  You won't find a year that said more about the change in cinema than that one right there.

From that point on, it became less the advancements in the medium and more the culture itself that defined the movies.  And as such, the decades themselves became the benchmarks for the movies that premiered within them.  The 1970's, in retrospect, took the counterculture ideal more seriously, and as a result we saw the studios significantly reduced as the driving force behind the movies, with the directors instead being the ones pushing cinema to the next level.  It was the era of the director, a time period some would call that of the "easy riders and the raging bulls," as the 2003 documentary of the same name details.  Coincidentally enough, those were exactly the movies that would bookend the era, as the creative freedom granted after Easy Rider would dissipate soon after Martin Scorsese's Raging Bull (1980).  Much like the studios before them, the ambitions of the maverick directors of the era would soon become unmanageable, and their projects would in turn go over-budget and under-seen as well.  Promising careers from amazing directors like William Friedkin, Michael Cimino, and even Francis Ford Coppola were cut short because they lost the trust of the studios financing them, and were left to work under tighter constraints for the rest of their careers.  Only Steven Spielberg and Martin Scorsese would manage to keep working at consistently high levels in the years ahead, years which would easily be defined as the era of the blockbuster.  The 1980's evolved in the wake of the downfall of the director era, and became more about escapist entertainment.  Every studio thereafter wanted their own Star Wars (1977) or Raiders of the Lost Ark (1981), and it became a fruitful time for fantasy, sci-fi, and adventure.  Though we see this as a defining aspect of the 1980's, it would also extend far into the 1990's, as digital technology began to redefine special effects.  And in this time period, the box office became a race like never before.  Back in the early days of cinema, there would always be untouchable kings, like Gone With the Wind (1939), but starting in the 90's, records would fall with regular consistency, and it was not always an indication of the quality of the film, but more about how well a movie could do on its opening weekend.  Thus we got into a time when intellectual properties became the most prized commodity in Hollywood; not the stars, nor the directors, but the brand, and that in a way has led to an era that more or less hasn't changed in the last 20 years.

Right now, if you were to define the 2000's and the 2010's in cinema, I'd say that you'd have a much harder time than with previous eras.  That's because the traditional markers that we've used before in defining the different eras of cinema have kind of lost their value over time.  What I think is the most defining change over the last 20 years of film is the advancement of digital cinema.  Since the year 2000, digital film-making has gone from a novelty to a norm in a very short amount of time.  And in that same period, movie theaters have also quickly converted to digital presentations as well.  This has reduced the necessity of physical film in making and presenting movies, which movie studios and theaters see as more cost effective and efficient.  But it also leads to something that I don't think many people have realized.  The reason why so many movies from different eras have a different look and texture to them is because film stock itself changed so much over the years.  There are very big differences between how a movie looks in 70mm, 35mm, and 16mm, and even the brand differences between suppliers like Kodak and Fujifilm, and processors like Technicolor and Deluxe, would make a big difference in how a finished movie would appear.  But now, with many movies today not even using film, the result is that all movies look more or less the same, at least in terms of texture.  Everything now has that digital sheen to it, all the way down to the way they are presented.  Even television shows are beginning to look more like movies today, and that's because they are using pretty much the same types of cameras.  There are holdovers that still shoot and even present on film, but for the most part, movies have been going in this decidedly digital direction, and that has defined most of what we've seen in the last several years.  Combine this with an even more homogenized studio system that favors brands over original ideas, and you've got an era of Hollywood that seems to be more driven by repetition and standardization than ever before.

The only truly disruptive thing that we've seen in the last 20 years has been the way we watch movies now.  If there was ever something that defined the 2010's in cinema, it would be the rise of Netflix and streaming cinema; as well as superhero movies.  Netflix didn't start in the last decade (it's actually, surprisingly, a 20-plus-year-old company), but it certainly came into its own in the last 10 years, and that is mainly due to their decision to invest in their own content.  Probably seeing the writing on the wall early on, knowing that eventually the other studios would want to take their model and use it for their own distribution, Netflix spent billions on exclusive movies and shows that could only be viewed on their platform, and as a result became a studio in their own right with a reach in viewership rivaling that of the big six.  Even with Disney, Fox, Warner Brothers, and Universal all jumping into streaming now, Netflix still has itself positioned well, because of the quality of content they've acquired, including movies now from giants like Martin Scorsese and the Coen Brothers.  More than anything, what Netflix has disrupted the most is the viewing habits of the movie going public.  Their streaming model has offered the most direct competition to the theatrical experience since the advent of television, and that in itself is defining the last decade of cinema more than any movie has.  Movie theaters are desperately trying to hold onto the patronage that has sustained them for decades, and because of Netflix and the like, we're going to see a new era for the presentation side of cinema the likes of which we haven't seen in many decades.  So, if it's not the movies that are defining the eras of cinema at this point, it's the way we are watching them.  For the last ten years, it was Netflix that reigned unchallenged; perhaps the next ten will be defined by how all the new platforms challenge each other in this new competitive market.

There are many different ways to look at cinema as blocks of easily defined eras, but the truth is far more complex than that. The truth is that cinema has been flowing fluidly from one decade into another, and only in retrospect do we take a look back and try to form a pattern in it all. The movies that we say defined the decade may, in fact, not have been recognized as such in their day, and were instead more likely just seen as the great movies that they were. Defining an era comes more out of how we want to look back at the years that have passed us by, and find a way to explain why attitudes and personal tastes change over time. At the same time, our perceptions of cultural touchstones, like the movies, can also be influenced by the eras they come from, which helps to shape their reception for newer audiences. Terms like the Silent Era, the Golden Era, and the Psychedelic Era are easily marketable and can help to draw attention to older movies based on what someone is looking for. In many ways, Hollywood enjoys defining its different eras, even if it doesn't exactly know how to shape them to begin with. In the end, it is the things that we find the most fascinating about the movies in each era that determine how they will shape their place in time. Whether it's through the technology that pushes the medium forward, the stars that capture our imagination, the artists that drive the art-form, or, as we are seeing right now, the way we watch the movies, cinema will more or less tell its own story, which it does through its own evolution. An era in cinema is an easy-to-grasp definition, one that isn't tied down to a set number of years. So, as we look back at the last ten years, and forward to the next ten, it helps to understand that a new era of cinema is just another chapter in an ongoing story that flows in its own way. Great movies can come at any time from anywhere, and the great part of history is that it is constantly being written. For now, feel happy that you are experiencing a time in cinema that will itself be seen under different eyes in the years ahead, and that hopefully you'll have been part of something exciting, historical, and important to the culture at large.


Home Alone for the Holidays – How a Home Invasion Comedy Became a Holiday Classic

Every generation seems to have a holiday movie that resonates with them more than others.  For a lot of baby boomers, it was How the Grinch Stole Christmas! (1966), and for the generation before that, it was Miracle on 34th Street (1947).  Those of us Generation X'ers who grew up in the 80's and 90's have a whole wealth of holiday specials that mean a lot to our nostalgia for the holidays.  But if there was one that stood taller than the rest in our collective memories, it was the Chris Columbus directed blockbuster Home Alone (1990).  Now approaching its 30th anniversary, Home Alone was a phenomenon upon its initial release.  It rode its timely holiday season release to record-breaking success, and even to this day, it still has the highest box office gross for a comedy when adjusted for inflation.  But it wasn't just the holidays and the humor that carried the movie; the real factor was surprising to most.  The key to Home Alone's success remarkably came in the form of its then 10-year-old star, Macaulay Culkin.  Culkin, who had only appeared in a handful of films beforehand, was suddenly the most famous child star in the world thanks to this movie, achieving a level of fame in Hollywood for a child actor unseen since the days of Shirley Temple.  He represented a new generation of film goers who were going to make a big impact on cinema in the decades ahead, and the fact that many of us who were children at the time saw one of our own commanding the screen as well as he did in Home Alone really solidifies why we hold this movie up so much as a part of our holiday tradition.  But it is interesting to see how the movie continues to resonate as a holiday film, given the fact that the movie isn't necessarily about the holiday itself.

Don't get me wrong, the movie is unmistakably a Christmas movie.  In fact, it is almost drenched in the holidays.  You'd have to look at something like It's a Wonderful Life (1946) or the aforementioned Miracle on 34th Street to find another film with so much of Christmas infused into its DNA.  But that's an aesthetic part of the movie.  The basic premise of the film itself honestly didn't need Christmas to work.  The story of a child accidentally left home alone by his family, having to fend off home invaders, could have easily been set at any time of the year.  A summer vacation setting would have made just as much sense in this case.  But no doubt director Chris Columbus and writer John Hughes picked the holiday season because it provided a more atmospheric tone to the movie.  It's one thing for a child to be left home alone; it's another for it to happen around Christmastime.  Christmas is a holiday all about getting together with one's family and enjoying the festivities together.  What happens when that's all taken away?  The isolation of having no one around to enjoy Christmas with weighs heavy over the film, and gives it a poignancy that it might not have otherwise had in any other setting.  That being said, the movie probably could have worked well enough even without the holiday itself.  It's far more about how Macaulay Culkin's character, Kevin McCallister, deals with the dilemma of keeping his home safe when he has no one else around to rely upon for safety.  As a result, we see the character's ingenuity, which is the real reason audiences continue to be entertained by the movie after so many years.  The movie shifts suddenly in its final act into a screwball comedy on the level of something we'd see from the Three Stooges, and the results are pretty wild.

But it should be noted that the movie was never meant solely to invoke a holiday spirit or to elicit laughter from its audience.  Though on the surface it may seem like a farcical comedy, underneath there is something deeper.  Home Alone is in essence a coming of age story, showing the growth and maturity of Kevin McCallister over the course of the few days he's left by himself.  John Hughes, who had spent much of the 80's exploring the highs and lows of average American teenage life in films like Sixteen Candles (1984) and The Breakfast Club (1985), went even further back into pre-adolescence when exploring the character of Kevin McCallister.  It's interesting to note that when we first meet Kevin in the movie, he's kind of a rotten kid.  He's disrespectful, bratty, and unsympathetic.  Combine this with the fact that he's from an upper class household, and Kevin represents every spoiled bourgeois American kid you've no doubt seen throwing a tantrum every time they receive even the slightest rejection from their mother or father.  There's even a point when he calls his mother an idiot to her face, something that I would have been severely reprimanded for if I had said it to my mom.  And at first, when he finds that his whole family has left for their Paris-bound Christmas vacation without him, he initially finds it liberating; immediately wreaking havoc throughout the house and, as he puts it, "eating junk and watching rubbish."  But as the movie rolls on, Kevin finds that isolation is not exactly as fun as he hoped it would be, and even begins to realize that a part of his loneliness is of his own making.  Through this, John Hughes gives Kevin a redemptive arc that helps to carry the film's message of compassion.  Kevin, who started the movie as a selfish brat, by the end has become more self-reliant as well as more considerate of the feelings of other people.

This message becomes even clearer beyond his character arc, as Kevin's dilemma begins to affect those around him.  In particular, there is a beautifully told parallel story-line involving Kevin's mother Kate (played by an unforgettable Catherine O'Hara).  Kate's trek back to her son is just as harrowing as what's going on with Kevin, because we really feel the pain she is going through not knowing what's happening with Kevin back home.  I find it funny looking back on this movie now, in an era when everybody has a cell phone, and how so much of this would be solved today in an instant with a phone call or text message.  Still, even watching this movie almost 30 years later, Kate's story-line still resonates, and I honestly think that Catherine O'Hara doesn't get enough credit for her performance here.  The normally comedic actress does have her wacky moments here and there (yelling at the incompetent flight desk representatives for one), but her moments of desperation and hopelessness feel genuine as well.  There's a wonderful scene late in the movie where she wonders if she is a terrible mother for leaving her child alone, while hitching a ride with a polka band in a U-Haul truck (led by another comedy legend, John Candy), and it's an honestly portrayed moment that shows the despair of a character who believes she has failed in her duty as a mother, not realizing that her desperate situation proves exactly the opposite.  Kate indeed becomes the movie's beating heart, and it's pleasing to see so much time devoted to her character as well.  Likewise, there is another wonderful arc explored with the character of Old Man Marley (played by Roberts Blossom).  Kevin's fearsome looking next door neighbor turns out to be a decent, caring person by the end, giving Kevin another opportunity to open up to others as a part of his character development.  In Marley, Kevin recognizes someone who, like him, has pushed people away and been left isolated as well, and by recognizing this and encouraging the old man to reconnect with his own family, Kevin likewise recognizes what he must do for himself.  So, while there are a lot of shenanigans that go on throughout the course of the movie, it never forgets that the characters involved are real people who evolve with their story.

Of course, the slapstick is a big part of the movie's continued entertainment value, and it particularly works because of how on board the actors are with making it as funny as possible.  Working very much against type, we find Joe Pesci cast as one of the cat burglars hoping to rob the McCallister home while Kevin is still present.  It should be noted that Pesci appeared in the Scorsese flick Goodfellas (1990) in the same year that he appeared in this movie, a role that would ultimately earn him an Academy Award.  To see him go from that to something as screwball as Home Alone really shows how much range he has as an actor.  Daniel Stern's performance as the other cat burglar, Marv, is more logically placed, and Stern does indeed play up the Stooge-like aspect of the character very well.  One of the biggest laughs in the movie comes from the scream that Marv belts out once a tarantula is placed on his face.  Another reason why the comedy works is because Pesci and Stern have excellent chemistry, and their characters work so well in conflict with Culkin's smart-alecky Kevin.  Indeed, I think the reason so many fans of the film from my generation love it so much is because we saw a child like us making buffoons out of these adults.  Of course, a real life scenario like this would have a much darker outcome, but the movie never makes the mistake of taking itself too seriously.  Indeed, we will always enjoy seeing two incompetent criminals get pelted in the face with paint cans.  Some of the traps that Kevin sets up are so wildly ridiculous that they defy logic, like Pesci's Harry getting the top of his head blasted with a blowtorch.  At the same time, it's not like this slapstick comes out of nowhere in the final act.  There are sprinkles of what's to come throughout the movie, like the family's mad scramble to get ready for their trip after sleeping in, or Kevin's ridiculous indoor sledding down a staircase.  My favorite piece of comedy, though, is the film noir parody that Kevin watches while eating ice cream.  Doing a hilarious send-up of James Cagney gangster flicks in the middle of this family oriented Christmas flick is something that I've grown to appreciate more as I've expanded my knowledge of film history, and it's something that helps to make this movie a delight to watch still.

It is also interesting how the movie not only acts as a quintessential holiday film, but has also gone on to leave its mark as a part of people's holiday traditions.  For one thing, I think that more than any other movie of its generation, it has brought awareness of all these old Christmas standards from past generations to younger audiences.  The movie is full of many songs that otherwise might not have resonated with Generation X or millennials beyond their initial years.  These are songs that are now standards, like Brenda Lee's "Rockin' Around the Christmas Tree," Mel Tormé's "Have Yourself a Merry Little Christmas," or The Drifters' rendition of "White Christmas."  The movie's soundtrack is basically a greatest hits album of all the Christmas songs that our parents grew up listening to, which is another great way the movie manages to bring multiple generations of audience members together as a part of the experience.  But the movie isn't just blessed with a varied playlist of holiday standards.  Somehow, Chris Columbus managed to land the legendary John Williams to write an original score.  And for a movie as simple and small in scope as Home Alone is, it is amazing how much bigger it feels with a Williams score behind it.  Infusing more of a Christmas tone than anything else he has ever written, Williams is probably the one most responsible for making this an unmistakable holiday film.  This includes tons of memorable original pieces, like the mad-cap, sleigh bell infused melody that plays during the McCallister family's rush to the airport, or the quiet grace of the original tune "Somewhere in My Memory" that plays during the more heartwarming moments.  I don't think the final shot of the movie, with Old Man Marley reunited with his family, would have had the same resonance without Williams' amazing score in that moment.  Honestly, we have Home Alone to thank for the many different melodies that flood our airwaves during the month of December, both good and bad, and it all helps to elevate the atmosphere of the movie itself.  As a result, you can see why the filmmakers could not choose any other time but Christmas to set their movie in.

Home Alone is one of those movies that so perfectly contains its concept within its storyline that it feels like there is no other way to improve upon it. Sadly, the filmmakers were saddled with the responsibility of having to make a sequel to Home Alone only a couple of short years later, due to how much money it made for studio 20th Century Fox. Long before The Hangover movies set a new standard for uninspired sequelizing of a hit comedy, Home Alone tried desperately to recapture the same lightning in a bottle with another movie, only this time in a new location; New York City in this case. Home Alone 2: Lost in New York (1992) does try, and is not without its moments, but it's clear that Columbus and Hughes were really stretching the premise thin. And the main reason why the sequel doesn't work as well is because it's missing that crucial element that made the original so memorable; Kevin's character arc. He's already grown as a character, so by the time we see him again, he's already gained his maturity. How do you resolve this in order to make a sequel? You regress the character and make him fall back into his bad habits, thereby undoing all the work of the original movie. It's an unfortunate choice that removes the emotional heart of the story, resulting in a half-hearted "here we go again" feel to the movie. The relationship between Kevin and his mom is unfortunately reduced as well. Even still, the movie has its fans, and I do enjoy some of the best parts of it, like another film noir parody, as well as the addition of Tim Curry to the cast as a diabolical hotel concierge. But what the sequel illustrates more than anything else is just how important that underlying heart was to making the original movie work as well as it did.

The legacy that Home Alone has left behind is one that is now inexorably linked to the holidays. Children who first experienced the movie in its initial release are all adults now with children of their own, and I'm sure that they'll no doubt be sharing the movie with them this time of year. Disney is even now reviving the property as a possible reboot for their Disney+ service, on which the original films are already available. It's easy to see why the movie became an instant hit, but I think the magnitude may have been the most unexpected part of all. It may have been too much for Macaulay Culkin in those hectic few years after Home Alone hit theaters, putting him at the center of the Hollywood spotlight for most of his formative years. After being hounded by the industry for some time, Culkin retreated into a quieter life, but has more recently emerged on social media carrying a sense of humor about the role that made him famous. He even jokingly pondered what a grown up Kevin McCallister would be like in a charming commercial for Google. Sure, time changes perception, and Home Alone is not without its quaintness due to the passage of time. But over the years, it has also gained something for its audience that all the best holiday classics have managed to do, which is to present a warm sense of nostalgia. My generation looks back fondly on Home Alone, and we have grown to appreciate it more now that we have become grown-ups ourselves. Sure, we all liked being smart-ass kids like Kevin McCallister, but over time we find ourselves also wanting to do whatever we can to be there for our loved ones for the holidays. In the end, the movie shows us that the holiday season is all about the importance of family, and that being alone for Christmas is not the ideal situation. Togetherness is key, and Home Alone, in its own silly way, delivers that message beautifully. So, Merry Christmas, ya filthy animal.

Dawn of Disney+ – First Impressions and What it Means for the Future of Streaming

One day in the whole history of the Walt Disney Company holds very special significance.  It was a day that came after several years of planning, building, and long arduous hours for everyone at the company.  And the outcome was far from certain.  Everything was put on the line, and all that was left was to premiere their product before the public and hope that all that hard work was worth it in the end.  That pivotal day was December 21, 1937.  The monumental moment was the premiere of Snow White and the Seven Dwarfs, the first full length animated feature ever made.  It may seem hard to think of it today, but the making of Snow White was the biggest gamble in Hollywood history up to that time.  Walt Disney staked his reputation and the future of his company on the success of this one feature.  Had it not succeeded, Disney as we know it would have ceased to be.  But, to Walt's eternal relief, Snow White was a runaway hit, and it not only made all its money back, but was profitable enough to allow the Disney company to grow.  You would think that after such an ordeal the Disney company would back away from such gambles in the future, but no.  Another pivotal day came on July 17, 1955, when Walt Disney risked his reputation and future in Hollywood yet again on a new ambitious project: Disneyland.  Though it took some time to make up its cost, Disneyland too became a smashing success for Disney, and again July 17 became a day of triumph that the media giant celebrates annually.  The post-Walt years have seen many rises and falls, but despite growing exponentially larger over the years, the company hasn't risked so much in a long time.  But this year, after much development and hype, the Walt Disney Company has introduced its first major project in years that could very well determine the direction the company takes, in the same way that Snow White and Disneyland did.  And because of it, we might be looking at November 12, 2019 as another one of those monumental days in Disney history.

That project of course is the new streaming platform known as Disney+.  Disney+ is only the first of several new direct-to-consumer streaming channels hitting the market over the next few years that are intended to challenge the supremacy of Netflix.  After a multi-year partnership with Netflix, Disney decided to strike out on their own with a streaming platform based on the Netflix model.  Taking advantage of their valuable library of hits, Disney believed that this could give them a better chance of broadening their audience base, while at the same time taking bolder risks without having to worry about box office performance.  This is, of course, dependent on how well they can develop that subscriber base right off the bat, and therein lies the risk in creating such a platform.  Netflix already has a decade-long history of building up its subscriber base, to the point where they now reach over 150 million households worldwide.  And with the capital that they make off of those monthly subscribers, they are able to reinvest in exclusive content that rivals anything shown in theaters.  Disney can no doubt bring on board its loyal base of fans, but expanding that audience to compete with Netflix's subscriber numbers is what they need to work on.  And considering the scale and scope of what Netflix is putting on their channel, Disney likewise has to put on exclusives that match and even surpass those of its competitor, and that is likely going to be costly.  Needless to say, Disney needs this new streaming channel to do well right out of the gate in order for it to justify its cost.  Let's not forget that a lot of investment has to go into all the infrastructure and programming, which will likely be tested by a large user base.  Streaming platforms don't just program themselves; it takes a lot of pre-planning and engineering to make them work, and for any studio unused to such an enterprise, it could prove daunting.  But then again, Disney has been here before.

Even with all their beloved classics, the Disney library alone wouldn't have been able to stand up to the sheer magnitude of what Netflix has on their platform.  That's why I believe Disney pursued that Fox merger so aggressively last year.  As a singular movie studio, it may not have carried enough properties to challenge the Netflix juggernaut on Day 1, but with two studios' worth of properties, Disney might have a shot at it.  I'm not saying that it's solely why Disney purchased 21st Century Fox, but it probably played a major factor in the process.  I'm sure Fox looked at it as a beneficial factor too, because it freed them up from having to invest in their own streaming platform, with Disney doing most of the work for them.  Disney also has the benefit of having all their acquired properties of the last decade turn into major successes, including the Pixar, Star Wars, and Marvel brands.  Making them all exclusive to their platform ensures that they'll carry those red hot franchises with them and translate those fanbases into a loyal subscriber base.  Even still, there is the risk of what it will cost to keep people subscribing, and that's where the exclusives come in.  Disney is not resting on the laurels of its theatrical hits arriving on the platform, and has invested heavily in new properties that will debut only on Disney+ over the next few years, including new films and series based on their Star Wars and Marvel properties.  All of this marks a monumental shift in the way the Disney company operates, and it is proving to be both an exciting and nerve-wracking time for the company.  The platform has especially been a labor of love for Bob Iger, Disney's CEO over the last decade.  Like his legendary predecessor, Uncle Walt, Iger has staked his own legacy and reputation on a project that he strongly believes in.  Whether or not his gamble pays off the same way that it did for Walt remains to be seen, but it is a testament to Iger's boldness as the figurehead of the company that he would put so much personal stake into something that will change the company forever.

With November 12 having already passed us by, we can now judge for ourselves how Disney+ performs and whether it is worth the plunge.  The starting cost for a membership is $6.99 monthly, or $69.99 annually, almost half of the current cost of a Netflix membership.  For a starting point, this is a fair price to pay for this much access to the Disney library.  It will likely rise over the next few years, but so will the number of available titles to watch, so Disney is wisely matching their price with the quantity of things to stream on the platform.  I managed to take advantage of an exclusive discount price available only to D23 Expo attendees this year, which gives me three years for the price of two, so I've paid all the way through to 2022, which should give me plenty of opportunity to venture through everything available on Disney+.  Like most other people, I'm sure, I am coming to this new platform as a long time Netflix subscriber, so I'm definitely looking at this with some preconceived expectations.  So, after a couple days of finally using Disney+, what do I think?  Well, first of all, I have to praise Disney for an A+ effort in its presentation.  The look of the platform is incredible.  It shares similarities with the layout of Netflix, but there are subtle little things that really make it shine.  The home page for instance features tabs for the different brands that make up the Disney Company; notably Disney, Pixar, Star Wars, Marvel, and National Geographic.  Interestingly, no Fox tab is available, despite there being Fox Studio films on the platform, which I hope is just due to Fox still being fairly new as a part of the company.  What I like is the fact that every title you click on leads to its own home page, featuring beautiful background art.  Also, thank you, Disney, for not having an Auto-Play feature when arriving at these home page screens, which is one of my pet peeves about Netflix.  Disney+ as an interface is thankfully very easy to navigate, making it simple to select what we want to watch.  It shows that they studied the Netflix model well and learned how to best utilize it for themselves.

What is also interesting is that Disney+ is the first ever streaming channel to offer bonus features for their films.  These mostly fall into the range of theatrical trailers and deleted scenes, but on some films and shows you even get more substantial things like director's commentary and making-of docs. That in particular really shows how well Disney is serving its audience.  Disney has always delivered very well on home video bonuses with their numerous DVD and Blu-ray special editions, so to see them also available here on Disney+ is a pleasant surprise.  Even more amazing is the fact that Disney has also made bonus features available here that are found nowhere else.  One noteworthy example comes from Avengers: Endgame (2019), which shows a deleted scene involving Tony Stark meeting a teenage version of his daughter in a spirit realm after he uses the Infinity Stones, in a scene reminiscent of the one at the end of Avengers: Infinity War (2018) with Thanos and Gamora. That deleted scene was not available in the Blu-ray edition, so it's surprising that Disney made it available here, complete with commentary from the Russo Brothers.  For cinephiles like me, having exclusive bonus features is another major plus to justify our subscriptions to the service.  Not only that, but the presentations of the movies and shows are also top notch.  Apart from a few problems, which I'll get to soon, the movies have all been given a polished HD remaster that gives them a beautiful, pristine look.  You'd expect the newer films to look amazing, but what really struck me was how good all the older stuff appeared.  Disney not only put their theatrical films on Disney+, but also a large amount of the many animated shorts from the heyday of the animation studio.  And they all look the best they ever have.

Though there is a lot to be happy about with the platform, I do have a number of nitpicks to talk about.  First off, there are some bugs to work out, which is not really too much of a problem, because those are pretty much expected for a newly launched service like this.  For the most part, I have not encountered any login or access problems, as some other people have complained about online.  I have been able to log in and click on whatever I wanted to watch without incident or experiencing the site crashing on me, which is pretty good for a first week.  I am using a direct Ethernet line connected to my PlayStation 4, so that may have helped out somewhat.  Even so, some bugs still manifest.  For one thing, every time I have watched something, the picture will freeze while the audio continues to run, which causes me to rewind a bit to put it back in sync.  I believe this has to do with the buffering capabilities of the video, as the movie plays while it still loads, just like on any platform.  But the thing is, I don't encounter the same problem on Netflix or any other platform of the same ilk, so it's got to be something on their end.  Hopefully Disney discovers this issue and patches it over time.  You've got to remember this is only week number one; bugs are inevitable.  It's kind of miraculous that we haven't heard of a complete service meltdown considering the volume of activity that they had to deal with in the first week.  There are other problems though, and they have to do with the actual content itself.  Some people have noticed that episodes of The Simpsons, all seasons of which were made available on Disney+ on day one, have been cropped to fit widescreen TVs, as opposed to their original 4:3 aspect ratio.  This has upset some fans, as some gags need the full picture to be fully appreciated.  I think it's a major problem, because artistic intent is crucial, and cropping a picture to fit a format does hurt the product as a whole.  Luckily, word got out and Disney has publicly stated that the true aspect ratio will be restored.  Another controversy came about with the realization that Disney was withholding problematic shows and movies from the channel as well. One such case is the Michael Jackson episode of The Simpsons, which presumably was pulled because of recent allegations made about the pop star.  Leaving the real world issues aside, it feels self-serving on their part to not air the episode, despite the fact that it's nearly 30 years old.  Withholding it only draws more attention to the controversy, which would have been lessened if they had just let the episode be.  It's a similar situation to the one they've placed themselves in with Song of the South (1946), which is also notably missing from Disney+.  I'm on the side of hiding nothing from the public, and Disney is doing a disservice to both themselves and the audience by trying to sweep these controversial elements in their library under the rug.

Controversy aside, what do I think about the exclusive content available.  Well, or one thing, Disney made the smart choice of turning to Star Wars to deliver a Day One exclusive.  This comes in the form of the hotly anticipated series, The Mandalorian.  This ambitious new show is from the minds of director Jon Favreau and producer Dave Feloni (who previously created the Clone Wars animated series).  They wanted to create a Western style show within the Star Wars universe centered on a Mandalorian bounty hunter in the same mold as the iconic Boba Fett.  Though Disney+ had a lot of projects that were buzz-worthy leading up to it’s premiere, The Mandalorian was no doubt the one at the top of everyone’s list, and Disney was smart to make this one of it’s figurehead shows.  Having seen the only two episodes available so far, I can say that The Mandalorian is everything you want out of a Star Wars series.  It’s epic in scope, features incredible gritty performances from it’s cast which includes Pedro Pascal, Carl Weathers, Taika Waititi, and Werner Herzog of all people.  And it offers up an intriguing mystery that will likely open up a new chapter of Star Wars lore.  If there was ever a winning horse to bet on in Disney+’s early days, this was the right one to pick.  There are other shows available too, like the Kristen Bell produced Encore as well as a High School Musical series.  One show that I have found to be a delightful surprise is a National Geographic produced docu-series called The World According to Jeff Goldblum, which of course stars it’s titular host.  Goldblum is a delightful oddball and the show is tailor made for him, as he takes his unique perspective and investigates various small industries across the country with infectious fascination.  I have yet to look at the exclusive feature films debuting on Disney+, which includes a live action remake of Lady and the Tramp and the Christmas themed Noelle, starring Anna Kendrick and Bill Hader.  Those films no doubt show what’s in store for the future for Disney, as they begin to make more films that will be made exclusively for the platform and not for theatrical distribution.  And there is still many more on the horizon as well, including the very anticipated Marvel limited series, which are going to play a key role in the MCU Phase 4.  The only question remains is how bold will these exclusives be?

And what does this mean for streaming in the long run?  Will this begin to chip away at Netflix's dominance in the streaming market?  While I do think Netflix will be affected in the short run, I don't see Disney+ being a Netflix killer either.  Disney+ is just the competition, and if anything, competition will help to make Netflix even better.  Competition pushes studios towards making bolder choices, and that is always a good thing for entertainment.  You are already seeing Netflix investing heavily in new talent and acquiring exclusive streaming rights to various properties, like their recent deal made with Nickelodeon.  And as more platforms hit the market in the coming years, like HBO Max and Peacock, both Netflix and Disney+ will only continue to raise the bar higher, hoping to gain the edge in the ever expanding market.  And that's good news for creators out there, because now there is more demand for their ideas and talent.  Also, without the pressure of box office performance, these platforms can put together more films and shows based on outside-the-box concepts and perspectives.  It will give representation a boost, as people who were not normally given the chance before can now put something on screen that speaks to their community.  Up to this point, Netflix had been the king of online streaming, and because of that, they were the ones who dictated the direction of the market.  Now, with competition from Disney, they are in the position of trying to find the fresh new thing that will keep them on top, and likewise, Disney will find fresh new ideas of their own to meet that challenge.  Like the big gambles Disney has made in the past, Disney+ could be the one that determines what kind of company they will be in the years and possibly decades ahead.  In my opinion, they are off to a solid start, albeit with a bit of room for improvement, which they no doubt will take care of as time goes along.  It's honestly one of the most exciting moments in Disney history and could indeed stand alongside Snow White and Disneyland as one of their greatest triumphs.  One can only hope that they'll be able to sustain this outburst of creative fervor for a long time.  As for now, sit back in the comfort of your own home and enjoy all those Disney classics that you grew up loving, now just a simple click away.

Part of Our World – How a Little Mermaid Helped a Studio Find its Legs 30 Years Ago

When I was seven years old in 1989, I had a surprisingly acute sense of the different styles of animation out there.  That is to say, I could tell when something was a Disney production and when something was made by, say, Don Bluth or the like.  This was mainly due to the fact that my little film buff mind in the making had seen quite a few films already in the mid to late 80's, as my mom had taken me and my siblings to the movies often.  And this was also a time when animation was beginning to see a bit of a rebound.  The previously mentioned Don Bluth had struck out on his own as a force in animation and created a string of hits during the decade, including The Secret of NIMH (1982), An American Tail (1986), and The Land Before Time (1988).  But curiously enough, the studio that had revolutionized the medium in the first place was notably quiet during the 1980's.  Disney Animation was still a big deal to me as a kid, but unbeknownst to me at the time, most of what I was seeing during those formative years were movies far older than I realized.  Disney, in the days before home video, kept their library of classics in regular rotation with movie theater re-releases.  I can recall that the first movies I ever saw in a theater, when I was about 2 or 3, were 101 Dalmatians (1961) and Sleeping Beauty (1959).  It's a testament to how well those movies hold up that I never caught on as a child to how old they were, but it is interesting how reliant Disney was on their classics to see them through what were surprisingly turbulent times.  As I grew up and became more informed about the history of Hollywood and the medium of animation, I would learn that the 1980's was a transitional time for Walt Disney Animation, and one film in particular would change the course of its future forever.

That movie of course would be The Little Mermaid (1989).  Mermaid came at a crucial time for Disney, when it seemed like the future of animation at the studio was in serious doubt.  Since the untimely passing of Walt Disney in 1966, the studio had been in a constant state of flux.  Walt's brother Roy would hold the studio together for a while, but his passing in 1971, mere months after the opening of Walt Disney World in Florida, left a significant power vacuum at the studio.  Ultimately, Ron Miller, Walt's son-in-law, would take up the position of CEO of the company, and he oversaw the continuation of the animation division that had been the backbone of the studio since its inception.  During the 1970's, Disney had modest success with films like Robin Hood (1973) and The Rescuers (1977), but some of the spark that had been present in the animated films of Walt's era felt noticeably absent in these newer films.  The core group of animators, affectionately called the Nine Old Men, were all aging and about to retire, so animation at Disney was facing an uncertain future.  Re-releases of the classic features became much more frequent as the studio tried to milk them for more revenue in those cash-strapped days, leading Walt's nephew Roy E. Disney to complain that it felt like they were running a museum instead of a studio.  Eventually, Ron Miller greenlit a new film that would hopefully turn the tide, hoping to capitalize on the fantasy film resurgence of the 1980's, spurred by the popularity of movies like The NeverEnding Story (1984).  That movie, The Black Cauldron (1985), was a financial disaster, going way over budget and falling well short at the box office, even losing to The Care Bears Movie (1985).  As a result, Miller's time as the head of the Disney company came to a disastrous end.

The failure of The Black Cauldron nearly wiped out the credibility of Disney animation forever, and perhaps more than at any other time, it seemed like the house that Mickey Mouse built might actually turn its back on animation for good.  Ron Miller's exit from the company came out of a shocking turn of events, as Roy E. Disney helped lead a shareholder revolt that resulted in his ouster.  In his place, Roy convinced a trio of outside executives to come to the Burbank studio to help revitalize the company.  These included Michael Eisner, who would become the new CEO, Frank Wells, the new COO, and Jeffrey Katzenberg, who would become the new head of the movie division, which included the animation department.  Initially, the shake-up of the company put animation at a lower priority, as Eisner and Katzenberg were more intent on turning Disney into a more productive studio for live action films, which was their forte.  But Roy, who was now Chairman of the company, convinced them to retain the animation department.  However, to appease the new executives' wishes, animation was moved out of the Studio Lot offices in Burbank and relocated to a temporary facility in nearby Glendale; another sign of animation's precarious position at the studio.  Already greenlit features like The Great Mouse Detective (1986) and Oliver & Company (1988) were allowed to continue production, but Katzenberg and Eisner needed convincing for what would come after.  In what Disney animators at the time now refer to as their "Gong Show," members of the animation department were allowed to present pitches for potential new features that would receive the green-light.  One of the teams that made their pitch were young animation directors Ron Clements and John Musker, who came in with two ideas.  One was Treasure Island (but in Space), and the other was an adaptation of Hans Christian Andersen's fairy tale classic, The Little Mermaid.  Of course one got the go-ahead, while the other went on the shelf for another 15 years.

Though The Little Mermaid had been given the okay from Jeffrey Katzenberg, its production was still not without its risks.  The studio had fewer resources at their disposal, and creating an animated film with an undersea setting was going to require a significant level of ambition.  Musker and Clements also had to deal with the fact that their whole team would have to work off site at the new Glendale offices, which were less than ideal for animation production.  Yet a couple of factors helped to give them the boost they needed to not only see this production through, but to also go above and beyond what others expected of them.  First of all, the new studio heads saw greater potential in the marketability of animation, as they saw surprising success with a 50th Anniversary re-release of Snow White and the Seven Dwarfs (1937), as well as a box office hit with the hybrid film Who Framed Roger Rabbit (1988), which combined animation and live action to incredible effect.  Also, Musker and Clements looked to Broadway to give this Mermaid a whole different kind of voice.  Up until this point, Disney films had turned to Tin Pan Alley-style songwriters to fill their ever expanding songbook, with the celebrated Sherman Brothers being among the most influential.  But for The Little Mermaid, it was felt that a more Broadway-sounding score would help to elevate the story even more, so the directors reached out to the pair of composer Alan Menken and lyricist Howard Ashman.  The duo had just come off the success of their musical Little Shop of Horrors, and they were eager to lend their talents to a Disney animated film.  Ashman in particular became very involved, taking on a role as producer and giving significant input into the script as well, particularly when it came to the character development of the mermaid herself, Ariel.  With a confident team now in place, the movie went full steam ahead, and what happened after was surprising to a lot of people, and a wake up call for Hollywood in general.

The Little Mermaid took Hollywood by storm.  It outperformed expectations at the box office, and earned Menken and Ashman their very first Oscar wins, for Original Score and for Original Song ("Under the Sea"), a feat an animated Disney film hadn't pulled off since Pinocchio back in 1940.  More importantly, it put Disney back on the map in animation.  After so much doubt about its future viability as a part of the Disney Studio during the post-Ron Miller years, it became clear: animation was there to stay.  The Glendale offices were closed and the animators triumphantly returned to their old offices on the Burbank lot.  And with a hit now under their belt, Eisner and Katzenberg were eager to loosen the purse strings and green-light a whole new batch of animated features, all with the same ambitious scale as The Little Mermaid.  In the years after, Disney kept building on each success, with Beauty and the Beast (1991), Aladdin (1992), and The Lion King (1994) breaking box office record after box office record and racking up award after award.  This era would become known as the Disney Renaissance, and The Little Mermaid is often cited as its catalyst.  As a result, The Little Mermaid holds a special place in the hearts of Disney and animation fans across the world.  It's hard to imagine a world where this movie did not exist.  How different would animation be had The Little Mermaid not come out at that pivotal time?  I for one am grateful for its existence, because it ushered in a whole new era for Disney animation, rising to the same level as those classics made in Walt's time.  But it's also interesting to reflect on exactly why this movie in particular was able to make such a significant change in the medium.

I think a large part of why the movie connected was because it fulfilled a need that both the industry and audiences were looking for.  It should be noted that animation is a very costly form of film-making, and a large reason why the medium suffered is because it became too expensive to make movies like that for a while.  Walt Disney's Sleeping Beauty, while a celebrated masterpiece, was also a financial burden that nearly caused the studio to fall into the red.  That's why Walt resorted to cheaper methods in the years after; he couldn't confidently pull something as ambitious as Sleeping Beauty off again.  Sadly, in the years after Walt's passing, the studio became complacent in this cheaper mode of animation, and it made people less interested in the medium for a while.  Don Bluth notably quit Disney to set out on his own because he was tired of the studio taking fewer risks and playing it safe.  By taking on something as ambitious as The Little Mermaid, Disney was bucking the trend that they had found themselves in, and finally embracing the fact that they could do a whole lot better.  It's clear that Musker and Clements were looking to reach the higher standard that was set during the Walt era, and their team of animators were hungry to prove their worth and show just how great animation could be once again.  From the lush backgrounds, to the vibrant colors, to the expressive animation of the characters, The Little Mermaid just shines in every frame, and it shows that this team of young artists was determined to bring animation back in a big way.  It may not break new ground in the same way Snow White or Sleeping Beauty had, but the confidence behind it helps to overcome its artistic shortcomings and earn its place alongside those beloved classics.

I believe that a large part of why The Little Mermaid works so well is because the characters are so vividly portrayed.  In particular, Ariel is a real breakthrough of a character who, more than anything, has helped to make this movie the classic that it is.  Though Snow White, Cinderella and Aurora are iconic Disney princesses in their own right, Ariel was very different.  She was not a damsel in distress waiting for her prince to come; she was very much in charge of her own destiny.  Sure, her story still involves falling for a handsome prince, but her strong will made her very different from her predecessors.  She was willing to stand up for herself, speak her own mind, and do whatever it took to find her happiness.  She would lay the groundwork for so many free-thinking Disney heroines in the years ahead, including Belle and Jasmine, and in many ways was the thing that really helped to bring about a Renaissance for Disney animation.  For the first time, a princess was the driving force of her story, and not a passive player in a grander narrative (though I would argue that Cinderella is often underappreciated in that regard).  A large part of Ariel's character was no doubt influenced by the casting of a young Broadway ingenue named Jodi Benson, who was brought on board thanks to her close friendship and association with Howard Ashman.  In Ariel, you see the care and attention that Ashman instilled into the character; it was important to him that her powerful voice come through, which Benson absolutely delivers on, both in voice and song.  But the strength of a heroine is only measured by how well she reflects against a great villain, and The Little Mermaid has one of the all-time greats.  Ursula, the Sea Witch, is an incredibly well designed and performed character, voiced unforgettably by Pat Carroll.  Everything we love about Disney villainesses is found in this character, and she stands as one of Disney's best alongside Maleficent, the Evil Queen, and Lady Tremaine.  Interestingly enough, and showing just how risk-taking Disney had become, the visual inspiration for Ursula came from the drag queen Divine, an acquaintance of both Menken and Ashman, who no doubt modeled the character as a tribute.  In both Ariel and Ursula, we see how the Disney animated film came roaring back, because these were characters that weren't just following in other characters' footsteps, but were instead meant to raise the bar for all those who would come after.

You can imagine how thrilled I was as a seven-year-old kid to see something like The Little Mermaid.  This movie just had everything I already loved about animation, but was entirely fresh and new.  Learning about how much it had to overcome in order to become the classic that it is only makes me appreciate it all the more now as an adult.  There's a wonderful documentary about the turning-point years of the Disney Renaissance called Waking Sleeping Beauty (2009) that I strongly recommend.  It chronicles the years leading up to and after the making of The Little Mermaid, and it shows you just how important that movie was in changing the culture at the studio.  Had The Little Mermaid never become the success that it did, Disney may have abandoned its animation wing altogether, and animation in general may have been lost to the fringes of the industry, relegated to a niche market.  Who knows how much the fortunes of the company may have changed.  Would Disney have continued to grow like it has over the years?  Would it have been bold enough to make critical moves like purchasing Marvel and Star Wars?  Would it have been put on the market and sold to some conglomerate, instead of retaining its independence like it always has?  So many uncertain futures, all of which never happened because one little mermaid helped this struggling company find its footing again.  Musker and Clements would go on to become two of the most prolific animation directors of all time, even finally making their Treasure Island in space project with 2002's Treasure Planet.  Alan Menken would end up winning so many Academy Awards with Disney, even after the tragic passing of his partner Howard Ashman, who succumbed to AIDS-related illness in 1991, that the Academy had to change their own rules as a result.  And the animation team, once exiled off the studio lot, are now celebrated legends within the industry.  The animation department at Disney continues to be a crucial part of the company, cemented forever because of The Little Mermaid, and they now enjoy their home in the lavish Roy E. Disney Animation Building adjacent to the Burbank lot.  If there was ever a movie in the Disney Animation canon that made the most difference, it was The Little Mermaid, because it was the one that ensured the department's survival.  This little mermaid gave Disney back its voice, and allowed it to sing strong for my generation in particular, and for all those thereafter.  We got no troubles, life is the bubbles, under the sea.

Monsters Among Us – Why Movies Don’t Have to Scare to Be Terrifying

Horror can easily be described as a genre defined by blood and gore, a bastion of monsters and murderers.  But that reputation is mostly the result of more recent entries in the genre that have leaned heavily in the direction of graphic violence.  In reality, the horror genre has gone through a significant evolution across the whole history of cinema, dating all the way back to the silent era.  And what you'll discover about the horror genre by looking back on its history is that it didn't always need to spook its audience in order to make them terrified.  For most of that history, horror filmmakers weren't allowed to go as far as they are now in depicting blood and gore on screen, so they often relied on cinematic language to suggest the terrifying elements within their movies.  Looking back on some of the first horror films ever made, like The Cabinet of Dr. Caligari (1920) and Nosferatu (1922), it is quite amazing to see how well they are able to convey a feeling of absolute terror with only light and shadow, as well as some truly Gothic imagery.  They remain some of the most terrifying films today, almost 100 years later, and that shows just how powerful the visual image is at conveying terror.  Though the filmmakers were certainly working under limitations, those limitations enabled them to be creative and allow imagination to fill in the gaps.  Movies that give the audience the opportunity to imagine the unseen horror often stand out as the most terrifying kind of horror movies, because nothing on screen could ever match up to the worst things that we can think of.  Our imagination can go into surprisingly dark territories when put to the test by these kinds of horror movies, and one thing that I've noticed in more recent horror films is a return to that kind of interplay between the filmmakers and the audience.

At the same time, we aren't seeing horror films with graphic violence and supernatural monsters going away either.  IT: Chapter Two is still performing well at the box office, and that movie is what you'd expect of the archetypal Hollywood horror flick.  It's got scary monsters, jump scares, and a whole lot of blood and gore.  At the same time, it's also apparent to the viewer that it's not an entirely scary movie.  In fact, I'd say that half of the movie works as a comedy.  Is the film entertaining?  Yes; but it's not all that scary.  Sure, there are some genuinely terrifying parts, but I don't think you'll find anybody who'll describe it as the most terrifying movie ever made.  It's interesting to note how this contrasts with another Stephen King adaptation, The Shining (1980), which is described by far more people as the most terrifying movie ever made.  The Shining, though it works with the same standards of gore and violence as IT, comes through as far more consistently terrifying.  Why is that?  I believe it has to do with consistency of tone.  IT bounces back and forth between the goofy and the horrific, while The Shining builds its feeling of dread towards its ultimately horrific ending.  Most filmmakers tend not to like that slow-burn style of storytelling, preferring to grab a hold of their audience right from the outset.  But what Stanley Kubrick revealed through his own telling of Stephen King's classic novel is that by allowing the audience to absorb the movie before pulling the rug out from under them, you intensify their sense of terror as the movie goes along.  It's that slow march towards the horrific that feels all the more rewarding, because as the movie goes along, the audience grows more and more anxious, knowing that something right around the corner will come out to shock them.

There are two schools of thinking that have developed around how you approach a horror movie, and they follow the divide we've seen between the aforementioned Stephen King adaptations.  There are some filmmakers who choose to withhold moments of terror in favor of building up the atmosphere, while there are others who don't waste a single moment in showing you every horrific thing they can.  The latter is usually what you'll find coming from the major Hollywood studios, because it's the safe and predictable choice.  The former approach is less ideal for studios to invest in, because it requires far more faith that the audience will jump on board and accept the unpredictable.  But playing it safe when it comes to horror has its pitfalls too, because if there is one thing that a horror movie fan hates, it's complacency.  You scare someone once, they become guarded for what comes next, so if you just repeat the same kind of scares over and over again, the audience just grows numb to it.  You could see this play out very clearly in the decade-long glut of slasher flicks that we got during the 2000's, with movies like Final Destination (2000), Jeepers Creepers (2001), Valentine (2001) and many others trying perhaps way too hard to follow in the footsteps of Scream (1996).  Eventually the box office returns for these kinds of movies dried up and the studios began abandoning them.  It wasn't until Blumhouse Productions stepped in during the 2010's that we saw a revitalization of the genre, thanks to the indie producer's more manageable production budgets.  And by setting the genre on more grounded footing, it allows more filmmakers to experiment with the pacing of their horror, which itself yields some interesting results.

One of the most interesting things about horror as a genre is how much it's driven by cinematic vision.  Indeed, the success rate of a horror film is determined by how well the filmmaker uses the medium to convey terror on screen.  This is where the groundwork of the pioneers of early cinema becomes so important, because they are the ones who wrote the language of visual horror in the first place.  We have visionaries like F.W. Murnau, Fritz Lang, James Whale and Tod Browning to thank for making us afraid of what lurks in the shadows at night.  Even when the genre shifted to more graphic violence thanks to the slasher flicks of the 1970's and 80's, the influence of those early films could still be felt.  Just look at how John Carpenter lights Michael Myers in Halloween (1978): almost completely in shadow, with his mask being the one illuminated element.  Whether we know it or not, we are conscious of the rules of horror film-making, and adhering to those rules is what can make or break the effectiveness of a horror movie.  It's especially interesting to see this play out in horror franchises, when the cinematic vision shifts dramatically between films.  For instance, there is such a dramatic shift in tone between William Friedkin's The Exorcist (1973) and the John Boorman-directed sequel, Exorcist II: The Heretic (1977).  The original Exorcist is a deliberately paced, almost procedural drama that smacks its audience hard when it arrives at its most horrifying moments, while Exorcist II is overblown, showy and decidedly less terrifying.  One director wanted to take his time building tension while the other wanted to show off, and that difference shows just how important it can be to give your audience the chance to absorb the film before being terrified.  That is one example of a franchise derailed by losing its grip on subtlety, but more recently, we've seen an example where the opposite was true.  In 2014, Universal and Hasbro made the very cynical move of turning the Ouija board game brand into a horror film franchise.  The result was a standard, cliche-ridden mess that did nothing to help promote the game nor make a splash within the genre.  However, when they planned a sequel, they turned to an actual horror filmmaker, Mike Flanagan, who had previously won raves for his breakout film Oculus (2013), and he crafted a far more subtle, well thought out, and more importantly, scary follow-up with Ouija: Origin of Evil, which was far better received by critics and audiences.

More recently we've been seeing movies that follow that pattern, straying away from fright-by-the-minute tactics and choosing to use atmosphere and tension to terrify their audiences.  One company in particular that seems to exemplify this is the independent outfit A24.  Their catalog of films spans a variety of genres, but what is particularly interesting is how they are delivering in the horror genre.  They seem to favor very artistic horror films with a deliberate directorial stamp on them, helping them to stand out among others in the genre.  Indeed, it really is hard to compare an A24 film with anything else in the genre.  They were the ones who put out Kevin Smith's foray into body horror film-making with Tusk (2014); they released Robert Eggers' period piece The Witch (2015); and most recently they made a splash by putting out Ari Aster's controversial cult movie Midsommar (2019).  Midsommar in particular stands out within the genre, because stylistically it goes against so many horror film-making rules.  Nearly the entire movie is bathed in sunlight, eliminating any use of shadows to hide terrors lurking within view.  It's also a movie that doesn't rely on jump scares or significant moments of graphic violence.  It instead plays by the same principle that movies like The Shining and The Exorcist built their moments of horror on, which is to build a sense of growing terror over time, allowing the audience to grow comfortable with the movie before the terror begins to envelop them.  By the end of the movie, the audience has reached a level of unease that may not have shaken them to the core, but has nevertheless left them emotionally drained and petrified.  It's that kind of horror that really appeals to filmmakers, because it makes the film stick longer in the audience's memory.  Like I said before, audiences grow numb to constant scares thrown at them, but slowly pulling them into a state of unease leaves a lasting impact, and that's something that Midsommar relishes doing.

However, it may surprise you that a movie doesn't even need to be about something supernatural or horrific to be terrifying.  Sometimes, a real-life moment can create a sense of terror equal to what we see in any horror movie.  I can tell you that one of the most tense experiences I had watching a movie this year was the documentary Free Solo on an IMAX screen.  The movie is simply about a free solo rock climber named Alex Honnold who tries to be the first person ever to scale the face of El Capitan in Yosemite National Park without the assistance of safety ropes.  You know already, just by the fact that the movie exists, that he made it through alive, but even while watching you feel this sense of dread that he could fall to his death at any moment; a feeling helped greatly by the "you are there with him" placement of the cameras.  And that's just a movie that feels like a horror movie while being something else entirely.  There is another movie I saw this year that on the surface wouldn't typically be looked at as a horror movie, but to me was absolutely the most terrifying thing I've seen all year.  That movie would be Gaspar Noe's new film Climax.  The French auteur is notorious for breaking cinematic conventions and assaulting his audience with sometimes overwhelming imagery.  With Climax, he presents a very unconventional horror movie by mixing it into the world of dance.  Imagine if Step Up (2006) had a drug trip in it, and that is basically what Climax turns into.  Members of a dance troupe discover that their after-party punch has been spiked with LSD, and the remainder of the film becomes something of a bad trip turned into a nightmare, and Noe never holds back.  You feel the overwhelming dread that spreads throughout the movie as the characters are trapped in their inescapable drugged state, and you are right there in the middle of it too.  That to me was more horrifying to watch play out than anything in IT: Chapter Two, and that's because it's rooted in a very human horror: a waking nightmare that you can't escape until it's run its course.

Sometimes I've found that the most terrifying movies ever made are the ones grounded in reality, which is probably why those kinds of movies endure longer than others.  Some of the greatest examples of the genre in fact ignore the cliches of the slasher killer or the supernatural monster, and instead remind us that the worst monsters of all are the ones around us.  To this day, only one movie from the horror genre has won Best Picture at the Oscars, and that's The Silence of the Lambs (1991).  Lambs on the surface isn't exactly a horror movie by traditional standards.  It's more of a police procedural, with FBI agent Clarice Starling hunting down a serial killer.  Though there are graphic elements in the movie, there are also very few onscreen deaths.  Most of the gore is shown after the fact, with only one scene in particular that you would describe as traditionally horrific: Hannibal Lecter's escape scene.  This is a prime example of not playing tricks with the audience, but instead allowing them to absorb the movie and take in the growing tension before it's released.  When we think of The Silence of the Lambs, the most terrifying moments to us are not the scary moments, but rather the quiet dialogue scenes, where the camera is uncomfortably close to Anthony Hopkins' face as he stares directly at us.  Again, it's using atmosphere to deliver the best effect for the moment.  A similar approach also resonated in David Fincher's Seven (1995), where much of the terrifying material is suggested to us and never shown.  We never do see what's inside the box, but we can paint a terrifying picture in our mind, and that's just as effective.  It's all about knowing the right balance to get the audience to feel the dread even when they are not seeing all of it.  And in turn, it shows how we are still using the shadows to deliver the most terrifying of frights on the movie screen.

Horror goes through its many different phases over the years, but in the end, several principles still endure to help keep it in line with its roots.  That's the reason why even the silent-era movies still manage to scare all these years later.  The loosening of standards has helped filmmakers get away with a lot more, but as we've seen, sometimes it helps to show restraint when making a horror movie.  Indeed, one thing that has proven true over the years is that trusting your audience to fill in the gaps benefits most horror movies, and that trying to force scare after scare will end up dulling their senses over time.  That's why movies like The Shining, The Exorcist, The Silence of the Lambs and Seven are still terrifying today, no matter how many times people have seen them.  They give their audiences a full experience, and reward them for their patience.  I am encouraged to see movies like Midsommar try to follow that example.  Sure, there are the standard Hollywood horror films that serve their purpose, but the real force driving the horror genre into the future are the ones being produced on the fringes.  They show that a horror movie can come from any style and can be just about anything, as the movie Climax has shown.  Just chasing after scares is not the way to succeed in horror film-making.  It's finding that right balance between terror and atmosphere, and also just having a story worth telling in the end.  And most of all, it helps to have a genuine human connection, because as real life has shown us, horror is all too real in our lives, and sometimes the worst kinds of nightmares are the ones that we dream up ourselves.

The Gathering Storm – What a Streaming Content War Might Look Like

As we stand now, three months away from the end of another decade, it's interesting to see how far the last ten years have advanced the way we consume content.  Looking back at the year 2010, when you heard of somebody watching a video online, most likely they would have been referring to short 10-minute clips posted on YouTube.  The fact that the internet would soon be the primary source of our daily media intake was an almost inconceivable idea back in those days.  Back then, online videos were short little diversions, movies were still primarily destined for the big screen, and Netflix was still operating largely through its by-mail service.  But as the decade moved ahead, and newer advances in communication made it possible to download and process large files of content faster and cheaper than before, what was possible in the realm of entertainment suddenly changed.  Now, people are able to watch what they want wherever they want, whether at home or on the go.  And through these advances, a new market in entertainment has opened up that has not only grabbed the attention of all the major studios in the business, but has become their primary focus in recent years.  This is the beginning of the Streaming Era in Hollywood, where the new cachet is having a platform where people will pay a monthly fee to watch movies and shows that can only be seen on that specific channel.  This is a new frontier for the industry, as it enables them to bring content direct to the consumer in ways that have been impossible up to now.  Before, movie studios needed to work with the movie theater chains or the cable providers in order to have their catalog of content presented to the public.  But with streaming, the power is put into the studios' hands, as they now have a platform where they can bypass all other channels and bring the content directly into the home through an internet connection.

Netflix was the first to dip their toes into this new form of distribution.  First, they added a streaming option to their monthly subscription plan as an alternative for people who felt the through-the-mail service was not fast enough.  As the streaming option became more popular over time, Netflix added more and more content to their streaming package.  What became particularly popular on the platform was television, as people were watching shows like Friends and many other classic series that they had missed out on during their initial runs.  This created a new behavior for audiences which we now know as "binge watching," as it was discovered that most people were watching multiple episodes all at once; sometimes even entire seasons in one sitting.  As Netflix saw their subscriptions rise because of this new streaming option, they came upon a bold idea: why not make our own shows, exclusive to our platform?  In a short amount of time, they put together a plan to stream original programming, which in turn would change the industry in a dramatic way.  Their first show, the short-lived Lilyhammer (starring Steven Van Zandt), was not much to talk about, but their second exclusive series, the critically acclaimed House of Cards, became the talk of the town.  Suddenly, people took notice and Netflix emerged as a real contender in Hollywood.  With several more shows like Orange is the New Black and Stranger Things having an almost perennial presence in the awards races, not to mention garnering devoted fan-bases, it was only a matter of time before Netflix would look to conquer the original film market as well.  This is where the platform has seen the most push-back from the industry, as movie theater chains have aggressively fought Netflix's day-and-date release model, barring their movies from screening in most chains across the country.  But as Netflix has welcomed more and more prestigious talent to their burgeoning studio, the pressure has risen on the industry to consider what place streaming should take as a part of the legacy of the business.

Though Netflix was the first to put their assets into the streaming market, they certainly weren't the last.  In the years since, online retailer Amazon has launched their own platform through their Prime membership, giving exclusive content to their customers as well.  Though they have been competitive with Netflix on the television end, with their own award-winning shows like Transparent and The Marvelous Mrs. Maisel, they have not followed their rival into the original film race as much, instead opting to form an Amazon Studios division to produce movies for theatrical distribution.  Hulu, which had already been streaming broadcast television shows before Netflix, decided to follow their lead and make exclusive content as well, both for television and film.  In an interesting confluence of events, Netflix and Hulu even had competing films covering the same subject released days apart, namely the Fyre Festival documentaries, which I covered here.  Though all these channels have done a lot to innovate and move the industry towards a streaming market, we still refer to the format itself as the "Netflix" model.  And this is because, out of all the competition in the streaming world, Netflix has positioned themselves as the industry leader.  Amazon and Hulu are massive platforms in themselves, but Netflix has become a part of the culture in a way that the others haven't quite reached.  As of right now, much of the maneuvering within the industry is in direct response to what Netflix has been doing.  Netflix has already conquered the video rental business, putting Blockbuster out of business in the process.  They are dominant in the television market, which has forced cable companies to change their way of doing business as many people have opted out of their cable plans in favor of the more affordable Netflix.  What's to stop them from taking on the movie studios as well?

That is why this year is a big turning point for the streaming era.  In one month, two new streaming platforms from two of the most formidable media companies in the world are launching: Disney+ and AppleTV+.  For the first time ever, Netflix will see competition that may actually affect their dominance in the streaming market.  Disney+ for one takes advantage of Disney's extensive library of ever-popular content, and they are also bringing a lot of exclusive material to the table with new shows based on their Marvel and Star Wars properties.  And then you have AppleTV+, which will benefit from the world's largest corporation's seemingly bottomless supply of funds for exclusive content.  They are also launching with subscription plans significantly less costly than Netflix's current rates.  Netflix has certainly been planning for stiffer competition, signing exclusive contracts with some of the industry's hottest producers right now, like American Horror Story's Ryan Murphy, Game of Thrones' David Benioff and D.B. Weiss, and many more.  Not to mention, they've been pushing to get due recognition from the film community by campaigning hard for year-end awards.  They made their best case yet with Alfonso Cuaron's Roma last year, which took home several Oscars and nearly won the top award as well (and probably should've).  They are again making their run at awards season this year, with movies like Martin Scorsese's The Irishman being heavily pushed, to state once and for all that Netflix, and by turn streaming itself, is an essential part of this industry and should be recognized as such.  However this year turns out, the streaming market is only going to keep expanding, with platforms like HBO Max, Peacock, and the short-form channel Quibi all premiering in the next few years; so indeed, Netflix's drive to change the industry is actually happening.

But what is that going to mean for the industry itself?  We have already seen a big shift happen with theatrical distribution as a result.  There has been a noticeable decline in recent years in mid-range productions making it to the big screen.  By mid-range, I mean movies that aren't tent-poles but at the same time are not indies either.  These are typically comedies, romantic dramas, and the modestly budgeted action flick.  They are disappearing from the multiplexes, where only a decade ago they once dominated.  This is mainly because most of them have moved over to streaming, where these kinds of movies are much less of a financial risk if they don't perform well.  What we're left with at the local box office is big tent-pole films on one end and small-budget indies on the other; mainly the only movies that the industry deems worthy of the big screen, because they are the only ones that can turn a profit in the end.  In order to compete, movie theater chains have in a way adopted a model not unlike their streaming competitors, with many chains now offering their own subscription services to their patrons.  But even still, the options at the local multiplex are not as diverse as they once were, because streaming has soaked up so many of the mid-range films that used to drive a wider variety of people to the box office.  With more and more studios beginning to set up their own exclusive platforms, you may see even more of those types of movies taken out of the theatrical market and made available exclusively online.  It's all going to depend on whether movie theaters are able to retain their appeal as an alternative to the home viewing experience.  The subscription model, first brought to the industry by the now defunct MoviePass, may be the key, but up until now, it was only Netflix they had to contend with, and not a whole industry of streaming studios.

Another thing that may drastically change in the years ahead is the value placed on available content.  For one thing, Netflix has recently been on a shopping spree in order to lock up exclusive streaming rights to content that they know will drive people to their platform.  Seinfeld, now a 20-plus-year-old show, just had its streaming rights sold to Netflix by parent company Sony for a mind-boggling $500 million.  This was in direct response to WarnerMedia opting to take Friends off of Netflix so that they could stream it exclusively on their own platform, HBO Max, with NBCUniversal doing the same with The Office for Peacock.  Decades prior, you could probably expect studios to sell broadcasting privileges to cable outlets for a few million here and there.  But now, for exclusive streaming rights, a single show carries a half-billion-dollar price tag, and that's because it's such a big part of the way audiences consume content now.  Whoever has the sturdiest library will be the one who makes the most money, and though Netflix has a respectable collection of content to their name already, they don't have the decades' worth of movies and TV shows that the big studios do.  So, before Sony gets an idea to start their own exclusive streaming channel, Netflix is making sure that Sony's prized properties find a home with them beforehand.  With Disney set to launch their platform in November, they carry the already built-in library of not one, but two major studios (themselves and Fox), so exclusive content with name recognition is pretty much the most valuable thing to have right now in the industry, and the competition is only going to get fiercer in the years ahead; especially if some of these studios have to spend a lot more in order to compete.

One thing that hasn't really factored into the discussion of the upcoming streaming wars is how much of an impact it will have on the customers themselves.  Right now, there is a lot of excitement for what's coming down the line, especially when it comes to all the properties that have been announced in the past few months.  But as more and more channels are announced as being in the works, people are going to start wondering whether they will be able to afford all of it.  Right now, cable TV packages can cost anywhere from $100 to $500 a month, depending on how many channels they offer.  Streaming platforms are not part of those packages, and operate separately; unless they make a deal with cable companies for a tie-in promotion, like Amazon and Netflix sometimes do.  Some people have opted to leave their cable subscriptions behind and just do streaming instead, which has hurt cable providers a little bit, but not much, as streaming still needs broadband internet to function.  But when you look at all the streaming channels coming over the horizon, the cost of leaving cable for streaming may not look like the bargain it once was.  Netflix has already raised their basic streaming price over time, from a modest $8 a month to a now $12 fee, with possibly more increases down the road.  Disney and Apple are starting out with relatively low rates of $7 and $4 a month respectively, but again, the prices might change over time.  And with HBO Max, Universal's Peacock, and multiple other platforms having yet to announce their introductory monthly rates, you may in the end have a streaming bill in the $100-a-month range all by itself, in addition to your internet fee.  Thankfully, cancelling a streaming subscription is easier than exiting a cable package, but as content becomes more and more exclusive over time, people are going to be hard-pressed to make tough choices about which platforms they choose, and that might mean that some of these platforms may not survive as well as hoped.  Even Netflix may lose part of their subscriber base if they are unable to compete.

What is fundamentally interesting about this point in time for the entertainment industry is all the uncertainty.  We don't know how this upcoming streaming war will play out.  We just know that it's about to get a whole lot bigger in the years ahead.  Already we are witnessing a change in the industry that may or may not benefit the quality of the content we watch.  I for one think that competition is a good thing for the industry, as it allows for more creative risks to be taken, but even this comes at a price.  I certainly want the movie theater experience to survive the competition from streaming, and hopefully their own subscription-based systems might be the key to that survival.  Also, the added cost that streaming will put on the consumer might have a negative effect over time, as exclusive content might remain out of reach for those who can't afford to add on that one extra channel.  We may end up living in a world where people only watch movies and shows from one studio and nowhere else, which is kind of already happening at the worldwide box office, with Disney taking an almost 40% share of total theatrical ticket sales.  But even with all that, we are also witnessing a flourishing of creativity in Hollywood that we haven't seen in a long time.  The Golden Age of television that people have been proclaiming recently has been largely fueled by the risk-taking new shows that have come from streaming, and that same flourishing is beginning to extend into films as well.  I love the fact that Netflix has invested in prestigious talent and given them free rein to do whatever they want.  It's a refreshing change from the more budget-conscious, focus-group-driven model of the last few decades.  Despite what it may do to my wallet over time, I have already signed up for Disney+ and am considering adding more streaming platforms into my daily options rather than fewer.  There's a reason why so many film studios are jumping on board: streaming allows them a direct connection to the audience that they've never had before, and with a solid stream of revenue coming from monthly payments, many of them can return to making movies and shows the way they want to make them, with renewed confidence.

Rule Number One – Talking About 20 Years of Fight Club

When people discuss the years considered among the best ever for movies, probably the most recent one to come into that conversation would be 1999.  Closing out the 20th century, as well as the last millennium, with quite a bang, it was a year that delivered one classic after another, many of which have gone on to be highly influential 20 years later.  But the interesting thing about 1999 is that so many of the best-loved movies from that year were not major hits upon release.  I already spotlighted one of those movies a couple weeks ago with my retrospective on Brad Bird's The Iron Giant here.  I have found the legacy of these rising classics particularly interesting, because they go all the way back to when I started paying closer attention to the movie industry in general.  In 1999, I was a sophomore in high school who had just seen Lawrence of Arabia on the big screen and knew that my life's path was going to be in the pursuit of film-making.  Because of this renewed interest, I began expanding my range of films to look out for, trying to open my perspective to include a wider spectrum of what the industry had to offer.  And in the early fall of 1999, I managed to catch a movie that not only hit the right spot, but would go on to become the first movie that I ever proclaimed to be the best movie of the year, since this was also the first year that I began to keep track of that distinction.  That movie would be the shocking, "in-your-face" spectacle that was David Fincher's Fight Club.  Fight Club knocked me off my feet the first time I watched it, and even 20 years later it still packs a punch.  But the interesting thing I discovered while revisiting it was how differently it plays today than it did back then.  The movie still holds up, don't get me wrong, but the message takes on a whole different meaning in today's cultural climate.

Much like The Iron Giant in the weeks prior to its release, Fight Club was not financially successful right away.  It didn't bomb as heavily as Giant did, since it was not a terribly expensive movie to make, but it still underwhelmed, given that it had an A-list star like Brad Pitt on the marquee, as well as two rising, Oscar-nominated names in Edward Norton and Helena Bonham Carter.  Critics were divided at first, too.  Roger Ebert notably gave the movie a thumbs-down review, stating that he found it an unfocused mess.  But, as with most movies that become classics long after their initial theatrical release, Fight Club found its audience on home video.  Fight Club came out at just the right time to take advantage of a new technology called DVD, and it was the first cult hit to rise on that format.  In time, the movie became a best-seller and critics started to change their tune.  For a lot of people, the movie was a revelation, as well as a breath of fresh air after a decade of polished studio fare.  Like its fellow 1999 alum The Matrix, Fight Club brought a notably punk style to the medium of film, changing the way movies looked, sounded, and were even edited.  But while The Matrix had a sleek cyber-punk aesthetic to it, Fight Club was grungy, dank, and even rotting at the edges in some places.  It was a movie that held nothing back and presented a decidedly anti-Hollywood aesthetic on the big screen.  And a lot of people started to ask themselves: is this movie the future of film-making?  Is this the start of an American New Wave?  Did Meat Loaf actually give an awards-worthy performance in a movie?  No matter what anybody said then or in the years after, Fight Club had left a mark on the film industry.

Perhaps one of the things that really started pulling new people into the movie was the unexpected way it played a trick on its audience.  Again, the year 1999 had many noteworthy things that left an impact on audiences, but one that really stood out was the renewed popularity of the plot twist.  Earlier that same year, M. Night Shyamalan had stunned people across the world with his now famous twist ending to The Sixth Sense, helping to propel that film to record-breaking success.  But Fight Club had its own shocking plot twist, one that in some ways is even better executed than Shyamalan's.  Had The Sixth Sense not taken so much of the thunder away in the weeks prior to Fight Club's release, I wonder if the revelation in Club might have hit even harder than it did.  For those of you who haven't seen the film and don't know what I'm talking about, fair warning: there are SPOILERS ahead.  The plot centers around a nameless protagonist known only in the credits as the Narrator (played by Edward Norton).  After a rough couple of weeks of working a thankless job and going to therapy, he runs into a quirky gentleman on a plane ride home named Tyler Durden (Brad Pitt).  After the Narrator's apartment burns down, he crashes at Durden's run-down home in the bad part of town, and the two come up with a crazy way of letting out some of their aggression: they begin to fight each other.  In time, other people want to join in, and the Narrator and Tyler start what becomes known as the "Fight Club."  Over time, the Club grows bigger, with Durden making more demands of its members to declare their loyalty to the group, creating in a way a cult-like organization.  This troubles the Narrator, who desperately tries to reverse the zealot-like direction the club is going in.  But in his efforts, he finds that he can no longer reach Tyler, and the Club has transformed into a full-on terrorist organization, all of whom look to him as their leader.  And then we get the bombshell.  The Narrator cannot locate Tyler Durden, because he is Tyler Durden.  The Tyler we've seen is just an imaginary friend manifested by the Narrator's fractured psyche, and with that, the movie reaches its legendary peak.

The plot twist stems from the original novel of the same name by author (and fellow Oregonian) Chuck Palahniuk, but its execution is expertly handled by director David Fincher and screenwriter Jim Uhls.  For nearly two-thirds of the movie, you have to buy into the belief that Tyler Durden and the Narrator are two different people, and not two personalities in one body.  The excellent chemistry between Edward Norton and Brad Pitt sells that dynamic perfectly, and you just have to admire the fact that the movie keeps you in the dark until the bombshell drops.  But that's not the only thing that makes the movie a classic.  This movie, probably more than any other, cemented the Fincher style.  David Fincher had won acclaim four years prior with the unforgettable thriller Seven (1995), and had delivered a compelling drama with The Game (1997).  But with Fight Club, we saw Fincher really begin to play around with the camera, utilizing CGI in ways that pushed the limits of what was possible in 1999.  This was the first movie where Fincher started using his high-speed camera moves, which take the camera across the plane of view at incredible speeds and through barriers that could never be crossed with a standard camera (like concrete walls, or even something as tiny as the ring of a coffee mug).  Such a technique has since become David Fincher's signature, and it put him on the map as a filmmaker.  The movie also defies many cinematic conventions, like the several points where it breaks the fourth wall and reminds you that you are indeed watching a movie.  There's a great moment where we see Pitt's Tyler working as a movie projectionist, and while the Narrator speaks directly to the audience about what a projectionist does, Tyler points to the corner of the screen where the reel-change marker appears, just as it flashes by, referring to it as a "cigarette burn."  When I became a projectionist myself in college, you can bet that I called those markers cigarette burns as well.  And it's that free-wheeling sense of anything-goes that made Fight Club so appealing to a young, blossoming film buff like me back in 1999.  Indeed, watching it gain in popularity over the years has been very satisfying for someone who took to it so strongly from the very beginning.

But 20 years after its initial release, it's interesting to see how differently the movie plays now than it did then.  In 1999, the world was a much different place than it is today, and it's amazing to re-watch the movie in 2019 and see how far ahead of its time it was.  There are some things that neither the novel nor the movie could have foreseen, like the rise of social media and the radicalization of fringe political movements, but given how many elements of the story feel so relevant today, it's really amazing to see how prophetic Fight Club actually is.  Some would argue that Fight Club is actually partially to blame for some of the discord that we see in the world today.  The movie has been described as a textbook for how to start a radical terrorist organization, showing step by step how such groups form: preying on angry, disenfranchised civilians to join their ranks in order to spread chaos as a means of pushing forward a demented agenda.  While the movie never intends to endorse this kind of behavior, it nevertheless gives a detailed look into the formation of a powerful political movement formed around a dangerous ideology.  And some have argued that Fight Club glamorizes it as well.  It's hard not to notice that some of the tactics that the followers of Tyler Durden enact within the movie bear many resemblances to actions taken by fringe groups today.  The mayhem of the Fight Club cult is in many ways akin to the pranks pulled by internet trolls today, though many of those are much more malicious than anything that Tyler Durden came up with.  There's also the unfortunate way that elements of Fight Club have been co-opted into some hate groups' talking points.  You know the term "snowflake" that the alt-right likes to use dismissively against their political rivals?  That came from Fight Club.  Fight Club certainly, as a movie, wanted to stir the pot a bit and shake the conformities within society, but it's unfortunate that it has in some ways emboldened some of the worst in our society, who have taken the absolute wrong message from the movie as a whole.

The thing that gets ignored the most with regard to Fight Club as a narrative is that it is first and foremost a satire.  In particular, it's a scathing indictment of the societal definition of masculinity.  One thing that gets forgotten about the movie is how the character of the Narrator is defined.  The movie is about him trying to discover for himself what it means to be a man.  First, the story attacks how corporations define masculinity, as the Narrator spends his money on things that society says should define his worth as a man, but it all instead makes him feel empty.  He only finds true happiness after he does the least masculine thing that society has defined: weeping openly in the arms of another man, or in this case the voluptuous "bitch tits" of his friend Robert 'Bob' Paulson (an unforgettable Meat Loaf).  Later, he creates the persona of Tyler Durden out of a need to have an ideal, masculine friend to rely upon, and that relationship in turn leads the Narrator down the road to promoting an ideology where the ideal of masculinity is defined by how hard you can hit the man you consider your friend.  This later evolves into the desire to destroy beautiful things, because they are a threat to the masculine ideal, coming as they do from a decadent, corporatized place.  This leads him to nearly kill pretty boy Angel Face (a young Jared Leto), out of that desire to destroy something he found beautiful, which gets at another deep psychological underpinning of the story.  It should be noted that this movie, called by many a fascist, testosterone-fueled, pro-male fantasy, was authored by a politically progressive, openly gay novelist who lives in the hipster capital of America: Portland, Oregon.  This story was never intended to celebrate the ideal of the heterosexual, macho, hyper-masculine American male; it was meant to be a scathing indictment of that kind of person, and for some reason or another, that seems to have been ignored by both some of the movie's fans and its critics.

Much like other misunderstood satires such as Blazing Saddles (1974) and Tropic Thunder (2008), the message seems to get lost in all the noise, so that it appears to some that the movie is participating in the very practice it's trying to mock.  The way that I look at Fight Club today is not as a step-by-step breakdown of how terrorist groups are formed, like some have described it, but as an extremely effective dissection of the root causes of toxic masculinity.  The brilliance of Chuck Palahniuk's story is in the characterization of his Narrator.  The fact that he is never named in the novel just illustrates the blank canvas that he is supposed to be in the narrative, and how that emptiness leads him down ever more dangerous roads in order to fill the void.  Tyler Durden is a fiction made reality through the Narrator's deranged desire to find a better version of himself, and as the story shows us, the desire to be more manly in this way drives one to become a little more monstrous each day.  Palahniuk spotlights a very interesting point here, in how the pursuit of a perceived ideal of masculinity sometimes drives men, even rational-thinking men, into acting against their own self-interest.  This sometimes manifests in actions like men taking unnecessary risks in order to prove their manhood (such as the deliberate car wreck halfway through the movie), or threatening violence against those who threaten their masculinity.  The Narrator is on a dangerous journey of self-discovery, which ultimately leads him towards reconciling his perceived inadequacies with what he ultimately wants to be as a man, which is someone more compassionate.  In order to get there, he literally has to destroy a part of himself to excise Tyler Durden completely from his mind, bringing the self-harm metaphor right to the forefront.  You can see the same kind of toxic masculinity represented in those same politically extreme groups that tout Fight Club as an inspiration, despite missing the whole point of the movie.  How many men have unnecessarily harmed themselves trying to prove their masculine worth?  How many of those men also attack others for not being masculine enough to their liking?  The same hatred can be found in all those closeted males who take their frustration over their own sexuality and channel it into the persecution of openly queer people.  Palahniuk, as a queer person himself, must have wanted to examine where that line of thinking comes from, and the narrative he found in the process did a brilliant job of shedding light on this issue in a meaningful way.

That's why Fight Club feels just as relevant today, because we are currently in a climate where definitions of toxic and ideal masculinity are again beginning to stir heated debate.  In my opinion, Fight Club subverts what you might expect of it.  It presents this hyper-masculine pastiche on its surface, but underneath, it's a biting satire of the destructive paths men take in order to reach an unrealistic ideal.  It's just too bad that much of its message has been lost over the last 20 years, and in some ways unfairly misinterpreted, by both some toxic fans and some ill-informed critics.  Its critique of toxic masculinity may be the most profound we have ever seen put on film.  It's a shame that in the same year, the Academy Awards celebrated a different kind of examination of masculinity with the very overrated American Beauty (1999).  American Beauty also examines the psychological journey of a disgruntled American male, but instead of critiquing this kind of character, it almost lionizes his sudden transformation into a self-absorbed rebel, even framing his lust for a teenage girl as part of his awakening enlightenment.  It doesn't help that he's also played by an equally toxic human being named Kevin Spacey.  Suffice to say, American Beauty has aged terribly over the years and feels very much like a relic of its era, while Fight Club has become more relevant than ever.  No other movie digs this deep into the root causes of toxic masculinity and shows them reaching such dire extremes.  What Fight Club shows is that much of the discord that arises from toxic masculinity comes from a dangerous sense of fear: fear of not being seen as manly enough by society, and how that leads to destructive ends in order to compensate for those perceived weaknesses.  Tyler Durden is not a poster boy for the best of masculinity; he is a villain bent on destructive ends.  And the scary thing is, there's a little Tyler Durden in every man, young or old, who feels like he is lacking something in his life.  Fight Club, even with the veneer of its revolutionary punk aesthetic, ultimately wants us the viewers to choose compassion over destruction, and that is why it continues to remain a beloved classic to this day.  Even 20 years later, I am still confident in my choice to have it top my list for the year 1999.  And by examining how its message resonates today, I am not just confident it's the best movie of 1999; I think it stands as one of my favorite movies, period.  I am James' complete lack of surprise.