Notes on Single-Camera Comedy

Judy Greer in Miss Guided

For a paper I'm giving next month at Console-ing Passions in Santa Barbara, I have been doing research on single-camera television comedies and on the rise of this format as a rival to the traditional multi-camera sitcom. I thought I would share some of my ideas here as I develop them. (Previously I have blogged about 30 Rock and similar shows as "anti-sitcoms" and about Jezebel James and its much-reviled laugh track, and have also been part of a discussion of shifting sitcom aesthetics at In Media Res with Jeremy Butler, Tim Anderson, and Jonathan Gray.)

I will not rehearse the difference between multi- and single-cam shows here, as I have done that already in the posts linked above. (Even so, this post repeats a few things I have already said, which might be in the nature of blogging.) What I want to get at here is more of a sense of the valuation of these styles, the idea that one is superior to the other, and the implications of this hierarchy for the television industry and media culture.

The most important thing to know about sitcoms today is probably that they are not as popular as they used to be. The top-rated show in America has not been a half-hour comedy since 2002 (Friends). This is in contrast to the ratings in previous decades, when sitcoms like I Love Lucy, All in the Family and The Cosby Show were perennial favorites (see the yearly winners at Wikipedia).

Last season, the top-rated sitcom was Two and a Half Men at #19. (This ranking is a bit confusing, since different iterations of reality programs--American Idol's Tuesday and Wednesday nights, the third and fourth installments of Dancing With the Stars--each counts separately. All of the following ratings data come from the Hollywood Reporter.) The next sitcoms on the season-end chart are King of Queens at #33 and New Adventures of Old Christine at #40. 2.5 Doodz, KoQ, and Christine are all made in the old-fashioned multi-cam style. The highest rated single-cam show is My Name is Earl at #58. It is followed by How I Met Your Mother (multi) at #61 and the short-lived Help Me Help You (single) and The Class (multi) tied at #65. Just to name a few more rankings of single-cam shows you probably like more than the proverbial average viewer: The Office was #68, Scrubs was #87, 30 Rock was #102, and Everybody Hates Chris was #137 (Chris is on the CW but still they can't be happy with that). Meanwhile, single-cam shows can also be found on premium cable channels, which don't worry in the same way about ratings as the ad-supported channels (Weeds is on Showtime and Curb Your Enthusiasm and Flight of the Conchords are on HBO).

So what we have here are a bunch of shows in the new style that critics and upscale audiences tend to like, but which have not caught on in a big way with the network audience previously known as mass. The single-cam style is supposed to mark an improvement on the traditional sitcom, but the traditional sitcom was a crowd-pleaser by comparison. Single-cam shows are often described as smart and edgy, but a lot of people don't get their humor. This is probably one of the things these shows' audiences appreciate. Arrested Development requires a high degree of getting it, and there is abundant pleasure in that.

One way that these shows appeal as "quality" is by seeming less like television and more like cinema. Everyone who tries to explain the difference between multi and single styles resorts to the cinematic analogy: shooting single cam is like shooting a movie. Each shot gets its own lighting setup. You can do more visual humor and use editing to insert jokes. The absence of a laugh track is likewise supposed to be more cinematic, and is offered as a gesture of respect to the audience: we know you're smart enough to know when to laugh. Never mind that the typical single camera comedy makes just as obvious its comic intentions with goofy musical cues and absurdist breaks in the narrational flow.

Why have television shows turned to this new format in the 2000s? I can think of a number of explanations. First is the perennial truism of the TV biz: nothing succeeds like success. Comedies were thought to be a dying breed around the year 2000, but the show that broke out that year as a new hit sitcom was Malcolm in the Middle, a wacky single-cam show that broke the fourth wall and just seemed different from everything else on television. Following this success, the networks were happy to consider single-camera shows.

Other influences would include Ally McBeal, which won an Emmy for best comedy in 1999 and seemed cutting edge with its fantasy and musical sequences and visual imaginativeness. Sex and the City, another non-traditional sitcom, won the Emmy in 2001, offering another model for an alternative to the multi-camera show, with voice-over narration and real (or realistic) locations. The cartoonish qualities of many sitcoms owe a debt to The Simpsons and other animated comedies, and perhaps as well to Raising Arizona, a movie that anticipates many of the single-cam conventions including ironic narration, aggressively noticeable comedic scoring, bright, surreal visuals, larger-than-life characters, and using editing for humor.

Economically, the declining mass popularity of sitcoms might help explain the networks' willingness to allow more experimental and untested ideas. When nothing is working, it can be helpful to try new things. More significant, though, the single-camera shows might appeal to a small but desirable audience, one with its share of affluent young consumers, and perhaps especially those elusive young male viewers. Attracting the biggest possible audience is not always a network's strategy. The whole idea of "quality TV" is that shows with low ratings can still succeed by appealing to a "quality" audience.

Within the film and TV industry, the multi-cam aesthetic has often been thought of as inartistic in comparison with single-cam productions, in part because multi-cam shows were shot on video, and in part because of the need to light a multi-cam show very evenly and brightly, which cinematographers consider totally uninteresting. For actors as well, single-cam work often seems more appealing because it's more like working on a movie in terms of schedule and technique. For instance, John C. McGinley has said he would not have taken his part on Scrubs if it had been a multi-cam show.

It is hardly surprising, then, that TV critics and scholars and upscale audiences like the single-camera shows. They have been made for people like us. But we should be wary of condemning the multi-cam style as passé when the largest number of sitcom viewers are still tuning in to multi-cam shows (if we include reruns, of course, this point becomes even stronger). At In Media Res, Butler makes the standard valuation clear when he asks: "Is the sitcom truly dead, or is it just evolving into something more interesting?" I doubt the sitcom is dead (these things tend to be cyclical), and I see no reason to assume that the single-cam style is more interesting or further along in a natural progression, either. When I watch old episodes of brilliant multi-cam shows like All in the Family and Will & Grace, I realize how much I have missed these shows' wit and timing, the physical talents of their stars. The visual limitations of the three-wall set can seem quaint and unfortunate, but these very limitations allowed for a mode of comedy that required verbal cleverness and bravura acting with the whole body to make a show a success. Being "cinematic" isn't necessarily better than that. Like the contemporary Hollywood style David Bordwell has called "intensified continuity," the single-cam style emphasizes energy and speed by using frenetic editing and camerawork; but gains in one area can mean losses in others. The presentist notion that culture progresses toward better and better forms can bias us, making us liable to ignore the losses and see only the gains.

Elsewhere in the tubes:
-Wikipedia: List of Single-Camera Sitcoms.
-Ask Metafilter: Why use a single-camera mode when shooting television?

PS I do admit, Miss Guided is better than Jezebel James :(


Indie TV?

Is there such a thing as indie TV? Yes and no. Producers shoot pilots and series independently, i.e., without a studio or network deal. They might call themselves independent. Marshall Herskovitz likes to position his quarterlife experiment in indie terms. But culturally, no one thinks of television comedy and drama, whether on the networks or the cable channels, in terms of Hollywood vs. indie. This isn't the logic of TV culture.

And yet, in the post-network era, the boutique cable channels are functioning in relation to the networks as art house theaters function in relation to megaplexes. Small audience vs. big audience, class vs. mass, art vs. trash. And I have noticed that in form, some of the shows on HBO go for an indie sensibility, especially Tell Me You Love Me, which could easily work as an upscale film drama in the vein of Friends With Money. TMYLM is low key, stylish in its art direction, not very tightly plotted, and addressed to a sensitive viewer who is expected to appreciate subtle thematic connections among the characters. Its explicit sex scenes fit the "adult" construction of the art house audience too.

Another point of comparison would be in conventions of production and expectations about artistic autonomy. The indie sector of the film biz prides itself on being less commercial and more authentically artistic, allowing the indie auteur room to express him or herself. The fact that cable shows like The Wire give creators more autonomy--by ordering a whole season rather than a pilot and a few episodes and by letting the whole season run rather than waiting for a pick up depending on the ratings--would help position boutique television as independent. This post at the TV writer Kay Reindl's blog Seriocity is a rare instance I have seen in which this comparison is made explicit. She quotes John Slattery saying that the production of Mad Men (which you may recall was my favorite thing of 2007 in any medium) is like an independent film. And Reindl makes clear that she sees working on an AMC show like Mad Men as an alternative to "the system" of network production. It's a "passion project." She even calls it "quirkier." (Reindl doesn't mention that the show is produced by Lionsgate, one of the few powerful indie film/TV companies that is not a mini-major.)

It's such a cliché to say that TV is in another golden age. But there are significant differences between today and yesterday. The biggest one might be that technology enables the fragmentation of the audience into niches, and that some of these are upscale. It's only by being able to appeal to a small, affluent audience that HBO, AMC, and the other outlets pursuing a more "artistic" conception of television have been able to promote the kind of distinction that television now offers, as art house cinema has since the years after WWII.



The Vultures have apparently been getting slammed by others in addition to yrs truly over their profligacy in spoiling shows, including in the headlines of their posts which are impossible to miss if you are scanning a list of items in a feedreader. Thus their rant today encouraging viewers to watch their shows when they're on, which would make for a world where spoilers won't matter (via).

Actually the Vultures say what they do is different. Spoiling is when you have knowledge of what's going to happen in a show (movie, book, whatevs) before its official release. What they do is just reporting on what happened. They say that television programs should be treated like sports events: once they've aired the first time, they're fair game. According to this logic, the determination of spoiler-ness is a matter of time rather than of the nature of information revealed. It's not spoiling once the info is out there officially, only before. I disagree and I'll say why in a minute. But first I want to reiterate what I said the last time these folks had me in a blogworthy lather. As ever, these people are really, really wrong. Moreover, their position is embarrassingly self-serving. But the point of writing about this again is greater than just to mark my outrage and warn you off their website. There are larger implications about contemporary popular culture and its modes of consumption.

Spoilers have no doubt been around as long as stories have, but it is the discourse of media fandom that has popularized the idea of the spoiler as a token of knowledge-power. The one with the spoiler has the potential to influence someone else's experience of a narrative. Thus the warning of a spoiler to come is a courtesy, a gesture of respect. The expectation of spoiler warnings in popular discourse is a matter of etiquette. It would only exist in a scenario in which knowledge is unevenly distributed, and it mitigates the effects of this distribution. In particular, those of us who prefer not to be spoiled tend to extend the same respect to others, whatever their preferences.

This same etiquette extends into discussions of media that are not conventionally considered in terms of spoilers, such as literature and theater. By convention, book and play reviews are--like film and television reviews--careful about what they will reveal. It is a convention of criticism that discourse aimed at those who have not yet seen or read the text may rehearse expository passages, but that the details of the plot once a situation has been set up will be treated vaguely, by reference to their emotional dimensions (harrowing, poignant) or their thematic significance (ultimately life-affirming, a paean to love). This etiquette of vagueness and indirection marks a distinction some might make between reviewing and criticism, with the former functioning as evaluative consumer guidance and the latter being more in-depth and serious. I don't know that I would avoid calling reviewers critics (some would). But there is clearly a difference between the expectations of a newspaper or magazine review and those of, say, a scholarly article, one of which is that a scholarly article isn't careful not to spoil. Its reader is assumed to have read/seen the whole thing. Warning the reader of a spoiler ahead basically says, treat this as criticism, not as a review.

The question in relation to Vulture, though, has to do with television, which has until very recently been different from movies, plays, and books in its flow, its ephemerality. Now some kinds of TV are moving away from flow toward something more like publishing.* The Vultures want to change the etiquette so that television will be different from movies and books and plays even as television texts in some ways are becoming more similar to those forms experientially. The novelty of television spoilers is of course a product of technological convergence. Only once we have TV on DVD, On Demand, DVRs, iTunes, etc., is it necessary to assume that television viewing is like book reading or cinemagoing: basically asynchronous, different people doing it at different times. Technology has freed the viewer (well, some of us viewers, anyway) from the shackles of the broadcast schedule. But this isn't good for a culture blog because it traffics in talking up the latest thing. The reason to return to Vulture hour by hour, day by day, is to read about something fresh and new. Television is an ideal topic to blog about for this very reason: it's constantly offering up fresh meat, and on a regular basis (daily and weekly episodes). Vulture doesn't want to be constrained in its ability to cover this material.

The problem is that Vulture wants TV to stay the way it was, not the way it now is and the ways it is becoming. It would probably be better for commercial blogging--blogging that is trying to sell the largest desirable audience to advertisers--if TV would stay the way it was. This would also please the media industries, from whose ads culture blogs like Vulture would love to profit. The networks and cable outlets would prefer if everyone would stop TiVoing their shows, stop downloading them, stop streaming them, stop waiting for them to come out on video, and just fucking watch them when they're on, with all the ads, like everyone used to in the good old days. They're terrified of their analog dollars becoming digital cents. Nothing would please them more than a cultural shift back to watching TV when it's on. If it becomes ok to spoil shows as soon as they have aired, it might promote this anachronistic idea of the schedule regularizing our experience. (The commonality of interest with the TV biz also explains why Vulture opposes spoiling a show before it has aired.)

Ultimately the issue of what's ok to spoil will be decided by the community of media consumers, not by any particular party, and certainly not by the media industries (including the hype industry of which Vulture is part). It is a matter of etiquette, and no one can mandate etiquette. But I'm grateful to Vulture for making clear their position, and I think everyone should have a clear position so that people can know where to pay their attention. Those of us who hate spoilers will unsubscribe from blogs that don't respect us just as we avoid obnoxious people. And those who don't care may continue not to care, though they may suffer an exceptionally brutal hell in the great beyond--or at least find themselves losing the attention they crave.

*On this point, see Derek Kompare, "Publishing Flow: DVD Box Sets and the Reconception of Television," in Television & New Media.

Update 3/14: Vulture responds.

Update 3/15: Jason blogs my encounter with Vulture, including some reflections on differences between writers for the popular press and media scholars.

And: Chuck thinks Jason might be onto something re power dynamics between journalists and profs.

Update 3/17: Fimoculous sez "most people who work in online media have found themselves embracing at one point or another" the argument for spoilers offered by Vulture. He calls for more public debate of the sort we see in the comments thread of Vulture's response to me.



I promised more blogging on the 2008 Society for Cinema and Media Studies conference in Philadelphia, so here goes. The conference was an opportunity to see many old friends and colleagues and to meet a number of new ones. The hotel, a Loews, was swanky, with boutique touches and classic modernist design and architecture. The restaurants I ate in were often excellent (if you're in Philadelphia here are my recs: Lolita for upscale Mexican, Penang for Malaysian, Vic Sushi Bar for excellent takeout unless you can snag seats at the tiny bar, which we did...and for finding good food in a new place, at least in the USA, it doesn't hurt to have the crowd-wisdom of yelp.com). The sessions were fine and I learned some new things from them, but the social dimension of the conference is always as significant as the intellectual experience. I was also excited during my slow circuit of the tables in the bookroom to see so many beautiful new books by people I know.

The panels and workshops I attended were on the contemporary film and TV industries, American independent cinema, Canadian media, character and emotion, the future of television studies, and my own on the Coen brothers. My paper, on the Coens and pastiche, went fine, though there was little time afterwards for questions. In addition to being at 8 am, our panel came the morning after advancing the clock an hour, so it felt like 7. Maybe this is unavoidable. But I do wish the hotel had put the fitness center somewhere other than above a conference meeting room, as the pounding of treadmill feet over our sleep-deprived heads seemed almost violent.

As Chris notes in his conference summary, the SCMS 08 theme of "architectures of the moving image" motivated many panels and papers to put "architecture" in their titles. I heard no one speak about architecture in a literal sense, and in no instance did the use of architecture in a figurative sense lend anything extra to the discussion. It seemed to be just tacked on, or used as a synonym for "structure," which itself is often a vague term taking up space.

My sense is that one trend in the society is strongly centrifugal, away from traditional film studies in terms of its topics and approaches. This isn't to devalue those things. There were more than 1300 attendees, and such a large meeting makes room for many kinds of scholarship. The complaint, however, that the conference offers too little to scholars with an interest in television, videogames, and other "new" media and the approaches that go along with them would now be anachronistic. This is to the society's benefit.

I wasn't taking notes during the sessions and don't really care to recap them, but I will link to other blogs that have been tracking SCMS, many of which do offer some summaries and reflections more specific than mine:

-Digital Sextant has a comprehensive list of the papers the blogger heard on topics like The Legend of Zelda and Peter Gallagher's eyebrows.
-Category D offers a number of criticisms of conference formats and proposals for the future.
-Dr. Mabuse's Kaleido-Scope has an open thread for SCMSy comments.
-Ephemeral Traces recaps two panels, one on fandom and the other on paratexts.
-Jamais Vu offers a list of reactions and reflections.
-Film Snob, whose thought-provoking paper on Mumblecore and cinephilia I heard on the indie panel, revives her blog to tell us what SCMS is like when it's your first time.
-Jonathan Gray at The Extratextuals blogs about panels and workshops at the conference about television studies, convergence culture, extratexts and paratexts.
-Sam Ford blogs his paper on vast narratives and immersive story worlds at the C3 blog, with a promise of more to come.
-Steven Shaviro links to a pdf of his SCMS paper entitled "Untimely Bodies."
-The Institute for Multimedia Literacy at USC blog offers the abstract for a panel on Web 3.0, with a promise of more to come.
-Scope has a lengthy conference report.
-And this is not exactly tracking SCMS, but I wanted to link here to Media Praxis, the blog of Alex Juhasz, who I really enjoyed meeting briefly in a hallway before a panel, where we talked about--what else?--blogging.

I'll update this list if I come across more SCMS blogging.


The Internet is for Nostalgia; SCMS

Could there be a better justification for the interwebs than the preservation in an accessible form of what might otherwise become cultural detritus? I don't mean just any old stuff, mind you. I have in mind the items from one's own experience that had been thought lost or perhaps were just forgotten. One day they turn up when you are searching for something entirely unrelated and, kapow, a wallop of your past, your whole youth and innocence dropped on your brain, and for a long moment the world outside your memory ceases to exist.

Lately I have been satisfied with early 80s music videos, Sesame Street Old School DVDs, and Genesis's Trick of the Tail for my nostalgia. Then I discovered the hundreds of videos posted to YouTube by the user WNED17, clips recorded off of cable TV in Toronto, mostly in the 1980s. What I love most about WNED17's videos is how they seize on the interstitial bits we often think of as inconsequential: commercials, idents, intros for news shows and movies, and promos. These once marginal forms now command my full attention in a way they probably did not then. My most intense reactions are for the children's programs that used to air on the Canadian public television channel TV Ontario, which was my only source of age-appropriate programming in the late afternoons in those days (PBS, which we received over the air from Buffalo, had Sesame Street in the mornings). Elsewhere on YouTube one can find bits of TVO programs like Fables of the Green Forest. And there is also a site where you can listen to MP3s of theme songs for shows that aired on TVO, including Barbapapa, Jeremy, and Simon in the Land of Chalk Drawings. I wasn't aware of how many of the shows I watched then were part of the global circulation of children's media. Barbapapa was French, Fables was Japanese, and Simon was English. Simon was parodied on SNL by Mike Myers. Here is the original.

This is the French intro segment for Barbapapa, adapted from books about creatures who change shape (barbe à papa, literally "daddy's beard," is the French word for cotton candy). I never appreciated as a child how trippy kids' culture could be, though I have been attentive to this idea ever since I caught my first glimpse of Teletubbies:

And the intro theme to The Polka Dot Door is a tune I heard every day for years as a child. If I am to grow old and demented, even if I forget my own name, I doubt I will ever lose my memory of this melody.


I am leaving tomorrow for the Society for Cinema and Media Studies conference in Philadelphia. My panel is on Sunday morning at 8 a.m. The topic is the Coen brothers. Hope to see you at SCMS, if not at 8 a.m. Sunday. My paper is called "The Coen Brothers and Pastiche" and is about the idea that their films play with their influences and sources. More SCMS blogging to come.