Ignatiy Vishnevetsky is one of the more thoughtful critics working today, but his latest piece for MUBI’s Notebook, “What is the 21st Century?: Revising the Dictionary,” throws me for a loop. In it he attempts to introduce “workflow” into the critical lexicon, suggesting that if critics and cinephiles knew more about the filmmaking process they would be better equipped to pass judgment on movies.
I agree with this assertion to a point, but Vishnevetsky goes much further:
One of the biggest problems facing film criticism and film culture is that there is often very little relationship between how movies are written about and how they’re actually made. Film is a medium that is inextricably linked to technology, but the language we use to talk about and evaluate films is by-and-large the language of antique or dying technologies or of environments (such as the old studio system, with its clear divisions of filmmaking labor) that no longer exist. While much of the old critical / cinephilic vocabulary—mise en scène, montage, etc.—still works, it’s often not enough.
This is one of the biggest problems facing film culture? I don’t see it, least of all because I don’t think it’s an actual deficiency.
For example, how much ink was spilled, how many bits were deployed discussing Tom Hooper’s decision to record the actors’ singing live on set in his adaptation of Les Misérables? How many reviews mentioned this little nugget of production workflow detail? The entire conversation surrounding the film, I would argue, was predicated on this creative decision.
Or Peter Jackson’s use of 48 frames per second in The Hobbit: An Unexpected Journey. The critical responses that ignore that technological conceit are few and far between. So I would first argue that Vishnevetsky’s thesis is flawed; he’s saying that critics don’t do something they clearly already do.
Still, the workflows he mentions, those of David Fincher and Steven Soderbergh, do often go overlooked in reviews:
Modern film styles are the products of workflows. David Fincher’s The Girl with the Dragon Tattoo, for instance, was widely noted in the film tech sector for its innovative workflow. Footage was shot in 5K with a 2.1 aspect ratio but finished in 4K with a 2.4 aspect ratio. Only 70% of each shot frame was used in the finished film; this meant that Fincher could revise every shot—reframing, altering the speed of camera movements, adding zooms—during editing without any loss of image quality.
Professionally I am intrigued by these sorts of workflows. By day I work in post-production, and my curiosity is always piqued by any sort of advancement. I am smack in the middle of the film tech sector and I wish more directors would improve their workflows to Fincher/Soderbergh levels. It would make my job easier. (Or, as Vishnevetsky also points out, redundant.)
But I take that hat off when I go to watch or review a movie. Why? Because it’s utterly boring. All that’s described above is that Fincher did repositions on his footage, a decades-old technique that predates digital technology. Even the Academy Standard aspect ratio of 1.85:1 is achieved by lopping off the top and bottom of the picture. Kubrick, famously, shot a number of his films utilizing the entire 35mm frame, allowing for multiple projected aspect ratios.
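To put rough numbers on that kind of reposition headroom, here is a toy sketch. The frame sizes are illustrative assumptions on my part (a nominal 5K frame at roughly 2.1:1 and a nominal 4K frame at roughly 2.4:1), not the actual specs of Fincher’s pipeline:

```python
# Toy crop-headroom calculator. The resolutions below are illustrative
# assumptions, not the actual numbers from the Dragon Tattoo workflow.

def crop_headroom(shot, finish):
    """Return (x_slack, y_slack): how many pixels a finish-sized window
    can be slid inside the larger shot frame without upscaling."""
    shot_w, shot_h = shot
    finish_w, finish_h = finish
    if finish_w > shot_w or finish_h > shot_h:
        raise ValueError("finish frame must fit inside the shot frame")
    return shot_w - finish_w, shot_h - finish_h

shot = (5120, 2438)    # assumed ~5K capture at ~2.1:1
finish = (4096, 1707)  # assumed ~4K delivery at ~2.4:1
print(crop_headroom(shot, finish))  # (1024, 731)
```

Any reframe, push-in, or stabilized move that stays within that slack costs nothing in image quality — which is exactly the decades-old reposition logic, just with more pixels.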
So yes, Fincher took advantage of modern tools, but I don’t see how that should affect how I watch a film. Vishnevetsky:
What does this mean for critics, cinephiles, and other people who want to talk about movies? It’s still possible to note The Girl with the Dragon Tattoo’s mise en scène or [Steven Soderbergh’s] Side Effects’ narrative structure—that hasn’t changed. What has changed is the notion of environment and intent.
Has it, though? Vishnevetsky echoes this again in his conclusion:
Workflow-related processes like HDRx upset old orders; what happens to the idea of the shot when the shot no longer has integrity—that is, when an image doesn’t represent a record of something but is instead a seamless composite of two sets of digital data? The only way to build a critical framework for workflow, it seems, is to exercise doubt while also assuming that everything has the potential for authorship and intent.
What is the difference between new HDRx technologies and matte paintings of old which accomplish nearly the same effect? In a world where characters are created on a CPU, and have been for two decades, what difference does a workflow make, or rather how can it actually influence the way you watch a film? I do not understand Vishnevetsky’s call to “exercise doubt.” Doubt what? To what end?
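For what it’s worth, the “seamless composite of two sets of digital data” is conceptually simple. Here is a toy per-pixel blend of a normal and a short exposure — a pure illustration of the HDR-merge idea, not RED’s actual, proprietary HDRx algorithm; the threshold and weighting are my own assumptions:

```python
# Toy two-exposure merge in the spirit of HDR compositing.
# This is an illustration only, not RED's actual HDRx blend.

def merge_exposures(normal, short, threshold=0.8):
    """Blend two same-length lists of pixel values in [0, 1].
    As the normal exposure nears clipping (1.0), lean on the
    shorter exposure, which retains highlight detail."""
    merged = []
    for n, s in zip(normal, short):
        # Weight ramps from 0 to 1 as n moves from threshold to 1.0.
        w = min(max((n - threshold) / (1.0 - threshold), 0.0), 1.0)
        merged.append((1.0 - w) * n + w * s)
    return merged

normal = [0.2, 0.5, 0.95, 1.0]  # highlights clipped at 1.0
short  = [0.1, 0.25, 0.6, 0.8]  # same scene, less exposure
print(merge_exposures(normal, short))
```

Squint and it is a matte painting: two recorded sources resolved into one authored image.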
The final bit though, “assuming that everything has the potential for authorship and intent,” is dead right and crucial to modern criticism. In fact, I’d counter that a lack of understanding of the concept of intentionality is “one of the biggest problems facing film criticism and film culture.”
It is my strong belief that critics must assume that what they are seeing on screen is the intended work of the author. Anything less, any doubt cast on the simple truth of intentionality, sells film culture short.
A quick story.
After the premiere of his film, Girl Walks Into a Bar, at SXSW 2011, director Sebastián Gutiérrez took the stage for the usual post-screening Q&A. First, though, he had something to tell the audience. He extolled the Alamo Drafthouse as one of his favorite movie theater chains and then complained that the picture we had just seen was a bit too dark. It looked better while he was editing it, and he wished we could have seen it that way.
Here’s the thing: the film we saw, as we saw it, was the truth. The images he saw at his editing bay were the fantasy. Why? Because as viewers we can assume nothing less than intentionality when images flicker on screen. If the filmmaker weren’t in the house, would anyone have known the images were “wrong” according to his vision?
Moreover, putting my post hat back on, one color corrects images differently for the Web than one does for theatrical projection. Gutiérrez said the film had “leaked” on YouTube that night. It was never his intention for this to be the canonical screening of the film. Too bad for him.
One more quick story.
At SXSW last year I met a young woman waiting in line for a film. She asked what brought me to the festival and I told her I was a film critic. She bristled. “Oh, one of the bad guys.” I laughed and asked her what she meant. She went on and on to me (a stranger!) about how critics over-intellectualize movies and have no concept of how hard it is to actually make them.
This is the Kevin Smith argument, and it is legion. “You could either sit around and be entertained, or you could go out and try to be entertaining.” Film critics are the enemy because they never worked a day in their life, or some such nonsense.
It feels like Vishnevetsky’s argument that critics be more mindful of workflows stems from the same gland that causes people to dismiss critics as “bad guys.” If only critics knew the process, if only they knew how hard it is to make a movie, their responses to cinema would be more interesting, more useful.
Perhaps. Or perhaps they will start undermining intentionality by cutting filmmakers a break for the nastiest bits of their workflows.
I’d be remiss if I didn’t mention Lars von Trier’s The Boss of It All. Here’s what I said about it back in 2009:
For 2006’s The Boss of It All, the acclaimed director utilized a system dubbed “Automavision”. Basically, he shot the film only from wider angles, or all establishing shots, and allowed a computer to randomly tilt, pan, or zoom. Watching the film, you would never know this. It feels wholly organic, intentional. In fact, for Mr. von Trier, it seems almost tame. […]
Giving control over to a machine, he is simultaneously removing himself as creative controller and, amazingly, further pressing his status as provocateur by drawing attention to his shrunken role.
Von Trier wanted the audience to know of Automavision’s presence, to know the methodology behind the haphazard visuals accompanying his workplace farce. It certainly colors my view of the film (I’m talking about it years later, aren’t I?), but when watching the film it was the furthest thing from my mind. I was wrapped up in the story and, yes, grappling with the oddball camera placements. But did it matter that a computer was the one to blame, or laud? I don’t think so.
How filmmakers create their art is something that will always be of interest to me both personally and professionally, but that information does not influence how I judge the work itself. If it did, I’d love everything. I’d be less critical.