Monday, June 5, 2017

Talking Out of Turn: Remembrances

Love As An Open Argument: Remembering Kate McGarrigle

When I heard the news yesterday about the death of Kate McGarrigle from cancer, it really cut to the quick. “It’s like getting kicked in the gut, you know,” Sylvia Tyson remarked upon hearing the news. I know what she meant. Over the years, Kate and Anna McGarrigle, the folk-singing siblings from Quebec, had written and performed a distinct body of beautiful songs that had the capacity to share wounds and open sores that their soaring voices could quickly heal. Their temperaments were radically different – Anna the hopeful romantic; Kate the skeptic – but the blending of their differences often created a harmony unique to popular song. Calling the McGarrigle sisters’ material “tart and tuneful,” J.D. Considine in Rolling Stone said their work married “the resonance of folk to the emotional immediacy of everyday life.” What Kate and Anna McGarrigle did was give the genre a different lease on everyday life. They wrote personal songs without resorting to solipsism. They wrote romantic songs without bending to sentimentality. They could be funny and sarcastic without being smug.

Kate McGarrigle

I first discovered the duo (like most people) through Linda Ronstadt’s 1974 cover of “Heart Like a Wheel,” a heartbreakingly beautiful tale of romantic remorse. But since Ronstadt, with her prevailing masochism, tended to enshrine a spirit of victimhood in her musical interpretations, the song came across as a lovely, yet self-pitying lament about loss. The Canadian poet bp Nichol, though, told me to listen to the McGarrigles’ original take on their first album for WEA. “You’ll hear how pain can cut both ways,” he remarked to me. I wasn’t sure what he meant until I heard the song. What I found in their version was a sense of what gets lost in a relationship. If Ronstadt played out the role of the heartbroken girl who never wins at love, the McGarrigles brought out the song’s core of bereavement. They addressed what it is that gets broken in the heart. In their hands, “Heart Like a Wheel” was an inconsolably beautiful song about the pain of loss, the cost of love.

But they didn’t just sing about loss. Their songs often had the carnal, yet sophisticated, air of Cole Porter – as on “Kiss and Say Goodbye” (“I don’t know where it’s coming from/But I want to kiss you until my mouth gets numb”). They could distill pure longing out of simple nostalgia on a track like “(Talk to Me of) Mendocino.” Their French-language take on Bob Seger’s “You’ll Accompany Me” outclasses his original by turning the track from a guy’s lustful boast into a woman’s desire for companionship.

My favourite song however was not released on a major label. Back in the late 80s, guitarist and CBC producer Danny Greenspoon did some sessions with the sisters. He drew my attention to one song (which was released, along with a few others, on a CBC Variety collection) called “My Mother is the Ocean Sea.” Written by Wade Hemsworth, the tune is one of the most exquisite laments about the desire for ecological harmony ever composed. (It was commissioned for an NFB documentary with the unlikely title of Sea Weed: An Approach to Marine Agriculture.) In the song, Kate and Anna McGarrigle layer their wistful poetic harmonies over Danny’s weeping guitar to invoke the spirit of an underwater world lost and mysterious, but fully realized by their soft cries:

Oh, my mother is the ocean sea.
My father is the sun.
Long time ago they made me
Before time had begun.

Away, haul away children.
One sun, one ocean.

What I loved most in their music was a generosity of spirit that was always perfumed in a mischievous, sardonic style (something they obviously passed on to their kids). But when these ladies rendered themselves vulnerable, there were none better at cutting to the naked heart of the matter. In Kate’s lyrics for “I Cried For Us” on their album Love Over and Over, she spares neither side in the romantic wars. Yet, as always, Kate still left you believing that – despite heartache and defeat – romance in its purest essence always remained an open argument:

I’ve tried and tried to put aside
A time to talk but without luck
So I’ll just pin this note within your coat
And leave the garden gate unlocked.

-- January 20/10

The Lonely Passion of Jean Simmons


Jean Simmons

The sad thing about British actress Jean Simmons’s death this past weekend is that she never truly had the career she should have had. While some today often complain about the diminishing roles for older women in motion pictures, many of those aging actresses – Christine Lahti, Michelle Pfeiffer, and Anjelica Huston – at least have had careers worthy of their talent. Simmons was trashed and mostly forgotten before she could even fulfill her ambitions.

Jean Simmons was the most graceful, quiet beauty who ever gently commanded the screen. She began acting at the age of 14, but she was quickly relegated to appearing in largely terrible and forgettable pictures. (When she went to Hollywood with her actor husband Stewart Granger in 1950, Howard Hughes bought out her contract from J. Arthur Rank. Hughes then stuck her in rank pictures like She Couldn’t Say No and Angel Face.) She started out as a sprightly adolescent Estella in David Lean’s Great Expectations (1946), but as she grew into adulthood, she developed into a more ethereal screen icon. Besides Great Expectations, and her heartbreaking Ophelia in Olivier’s Hamlet (1948), her best roles came towards the end of the fifties in Richard Brooks’ potboiler Elmer Gantry and Stanley Kubrick’s exciting and moving sand-and-sandal epic Spartacus.

In Elmer Gantry (1960), adapted from Sinclair Lewis’s novel, a con artist (Burt Lancaster) manipulates a tent show evangelist (Simmons) who sincerely believes in her calling. Simmons’s gentle charisma complements and contrasts with Lancaster’s more physical acrobatics. Her evangelist is an angel with a lusting heart, but that lust doesn’t diminish her service to the Lord; on the contrary, it enhances her. Simmons’s muted sexuality tempers the bravado of Elmer Gantry’s theatrical barn-burning. Her serene acceptance of her faith makes it possible for Gantry to acknowledge his deceit, but it also draws him to her as passionately as she draws the congregation so devoted to her. Unlike Barbara Stanwyck’s equally compelling evangelist in Frank Capra’s The Miracle Woman (1931), Simmons isn’t a fiery orator; she’s a graceful presence with a spiritual hunger almost inseparable from carnal passion.

Jean Simmons in Elmer Gantry

In Kubrick’s Spartacus (1960), Simmons plays the slave woman Varinia who falls in love with the gladiator warrior (Kirk Douglas) who tries to bring down Rome. Simmons, once again, finds the character’s strength in calm defiance. There’s a perfect example early on at the gladiator school when the fighters are given women to satisfy them before they go out to possibly die. When Spartacus lays eyes on Varinia, he is so taken by her beauty and quiet integrity that he declines to have sex with her. When the masters, who have been watching this display, ridicule Spartacus, he yells, “I’m not an animal.” But Varinia also replies, “Neither am I,” while looking at Spartacus. Simmons makes you feel in that scene that sex with Varinia is only the beginning of what could be possible – if one were to show the dignity to accept her on human terms. Her loveliest scene comes later in a duet with Douglas, beautifully underscored by Alex North’s music, after their escape from the school, where they finally pronounce their devotion to each other. They make promises that we know can’t totally be kept because circumstances will tear them apart. But we still believe in their union because Simmons and Douglas make love itself feel like an evolutionary step towards perfection.

Jean Simmons made a lot of movies and appeared on many television shows like In the Heat of the Night and Murder, She Wrote, but she was merely ornamental in those lazy genre pieces. Simmons’s screen persona had an elegance matched by an innate intelligence that transcended being simply a supporting character. She brought an air of refined passion to the screen without a whiff of drawing room stuffiness. But she was also someone beyond our grasp, to hold in awe, in respect, and maybe, too, in regret that we didn’t love her as fully as we might have.

-- January 25/10


Mahalia Jackson

Back in the late fifties, my parents frequently took me to the Drive-In. My formative movie experiences were forged there. I saw a lot of bad films on those excursions (Butterfield 8, anyone?), but one of them found a way of staying with me for years. As critic David Churchill always attests, there are sometimes scenes within terrible movies that are mini masterpieces. For me, a perfect case in point is Douglas Sirk’s Imitation of Life (1959), the second Hollywood adaptation of Fannie Hurst’s famous weeper.

Sirk made a number of glossy melodramas for producer Ross Hunter in the fifties, including All That Heaven Allows and Written on the Wind. These soaps would later be acclaimed as subversive by auteur film critics in the ‘80s who were basically enamoured with the rather turgid work of the German director Rainer Werner Fassbinder (who used Sirk's style as his model, particularly in The Marriage of Maria Braun). Imitation of Life is ostensibly about an aspiring actress (Lana Turner) who is also a single mother. While trying to develop a career and raise her young daughter, she gets help when she befriends a jobless black maid (Juanita Moore) and the maid’s daughter (played as a young woman by Susan Kohner). Once they come to live with the actress, however, the story shifts to the problems of two generations of black Americans, and Lana Turner’s issues become comparatively minor.

Despite its progressive civil rights angle, Imitation of Life still remains shamelessly sentimental and manipulative – even when it deals with core issues like integration and miscegenation. But towards the end, just when the story has finally torn at all your heartstrings, an unexpected epiphany takes place. In a funeral scene held in a grand old church, the great gospel singer Mahalia Jackson comes to the pulpit and delivers one of the most majestic and powerful performances in the history of movies. The effect pretty much blows the previous two hours of tear-jerking right off the screen. Jackson sings the traditional gospel tune “Trouble of the World,” a song that proclaims the end of life’s worst trials and a final peace found in God. The track is a testament of faith, but Jackson takes the composition even further. In her version, the song has the power to fully release both the anguish and the triumph of the Afro-American experience – from slavery to desegregation. There is a final acceptance that in God's mercy all unrighteousness will be undone. Jackson dramatically restores all that was kitsch in Imitation of Life to art. Her rendering of the song is a benediction. At five, I was spellbound and filled with tears watching Mahalia Jackson on the massive outdoor screen.

Imitation of Life

Many years later, in 1991, I was taking a trip through the U.S. during Operation Desert Storm, preparing material for a radio documentary that never came to pass. While I was in Dallas, Texas, I was buying some postcards to mail home. As I stood in the check-out line, the cashier had a VCR playing a movie behind her. It was just my luck that it had to be Imitation of Life. Mahalia’s great scene was only minutes away, and I could feel the tears starting to well up. I wanted to get out quickly, but there was a person ahead of me. I put on my sunglasses, but I still managed to attract the concern of the cashier when it came time for me to pay. As she cashed me out, Mahalia’s performance was still ringing loud over her shoulder – and I dashed out.

On that day in Dallas, though, I heard many new things in Jackson’s voice as I waited in line. I heard a testimony that seemed to reach beyond the time in which she sang it. “Trouble of the World” suddenly took in events and circumstances beyond the moment it was performed. I could hear her voice in Martin Luther King’s final speech in Memphis in 1968, the night before he was murdered, especially in his declaration of reaching the mountaintop and seeing a future that he knew he wouldn't be part of. I could also hear her in Muddy Waters at The Band’s 1976 Last Waltz concert, filmed by Martin Scorsese, singing “Mannish Boy” in complete acceptance that, as a man, he could command the world, and in recognition that he was an indelible part of its making. One single camera captured every vowel and every gesture of his proud swagger.

As I quickly left the convenience store, I hopped a cab to go back to my hotel. The black cabbie was listening to a local gospel station. Suddenly, as if by divine intervention, Mahalia came on the radio. She wasn't singing “Trouble of the World” this time. But it didn't matter. I smiled as a few tears started to roll down my cheek. The cab driver caught my expression in his rear-view mirror. He looked back at me and said, “Ain’t no man a real man who don’t cry when Mahalia sings.”

-- February 5/10

Of Schulberg and Hughes


Budd Schulberg

Within 24 hours last summer, two very different Hollywood legends passed away. Novelist and screenwriter Budd Schulberg (What Makes Sammy Run, The Disenchanted, On the Waterfront), who died peacefully at the age of 95, was one. The other was John Hughes, writer and director of The Breakfast Club and Ferris Bueller’s Day Off, who had created a veritable industry of teen comedies during the Eighties. Both men became part of very different Hollywood zeitgeists.

Schulberg invoked an age and political climate still not fully resolved: the era of the Hollywood Blacklist and McCarthyism, when both he and Elia Kazan (who would direct Schulberg’s script of On the Waterfront) testified before the House Un-American Activities Committee. The notion of ratting on your friends, the theme of Waterfront, was key to understanding a time when McCarthy’s virulent paranoia was destroying lives and turning the Red Menace into a careerist’s game of watching some backs while knifing others. Yet there were those, like Kazan and Schulberg (as well as many others), who had earlier been part of the Communist movement with its hopes of fighting injustice at home and fascism abroad. But they came to see its true face in the ideology of Stalinism. In that moment, some of these same idealists quickly became realists. But the question arose, to paraphrase Victor Navasky (Naming Names), of which was the greater sin: Stalinism or McCarthyism? Some chose Stalinism as the bigger evil because they watched writers, composers and artists whose work they loved testifying at show trials and often later being executed. The American version of show trials, they felt, was certainly destructive, but ultimately not as lethal. In any event, choices were made and the era was challenged by men like Budd Schulberg.

Back in 1941, Schulberg had already given his employers in Hollywood massive headaches when he wrote his scathing novel, What Makes Sammy Run. He created a main character, a hustler, who tore the benign mask off the studio moguls and exposed their charade. The story of Sammy Glick, a Jewish kid from New York’s Lower East Side, who becomes determined to escape the dead-end of his own life and find fame and fortune at any price, echoed their own story of escaping Europe and rising anti-Semitism to establish a safer kingdom in Hollywood. Not surprisingly, his book (and its character) was accused of promoting anti-Semitic stereotypes. (Years later, the same accusations would be leveled at Mordecai Richler after he wrote The Apprenticeship of Duddy Kravitz, a novel that featured its own Sammy Glick.)

In the Fifties, when conformity was the norm, Schulberg was the fly in the ointment, an uncompromising presence. He made you consider the choices before you. His voice could sometimes be shrill, as in Kazan’s A Face in the Crowd (1957), where television was identified as a tool for populist fascism. But Schulberg always refused to capitulate to the fashions of the time. Author Anthony Burgess (A Clockwork Orange) might have put it best when he once wrote of Schulberg that “[t]he final test of a novelist’s achievement is how far he is able to modify the sensibility of his readers.”

John Hughes

As for John Hughes, a former ad man, he entered Hollywood at the right time for what he set out to accomplish. Unlike Schulberg, Hughes was not a contentious figure; rather, he devised perfectly adaptable concept-driven comedies. After a decade in which great diversity and individual temperament were dominant, concepts were now starting to dominate American movies. The studios quickly reasoned that greater sums of money could be made drawing middle-class teenagers into the theatre, so Hughes complied. His first film, Sixteen Candles (1984), was an offhandedly charming and funny romantic comedy which actually made social class its subject. The picture also brought the talented Molly Ringwald and Anthony Michael Hall (as Hollywood’s first techno-geek) to prominence. But just as soon as Hughes had his first genuine hit, he began to think more like the studio marketing division. The Breakfast Club (1985) was superficially a critique of how students get trapped in high-school cliques, but the movie did more to enshrine their adolescent misery than to examine it. The true culprit turned out to be adult authority. (In castigating another picture with similar goals, critic Pauline Kael once wrote that someone should tell these directors where adults come from.)

When The Breakfast Club became a huge hit, John Hughes started to make films that combined the sly subversion of skit comedy with homespun American sentimentality. It was a treacly mixture that would lead to Ferris Bueller’s Day Off (where the only cause this rebel had was truancy) and Home Alone (where child abandonment and home invasion became window-dressing for crude, slapstick pratfalls). Yet, by the end of the Nineties, with many hits to his name, John Hughes disappeared into seclusion. Wes Anderson (Rushmore) and Judd Apatow (The 40-Year-Old Virgin) would, to some degree, carry on his legacy. His was a puzzling departure because, unlike Schulberg, he had no apparent adversaries to determine his fate.

When Schulberg and Hughes died, both had become faint figures on the cultural landscape – but for entirely different reasons. Schulberg was maybe too much of a reminder of the cost of compromise, where you could, as Terry Malloy attested, end up with a “one-way ticket to Palookaville.” Who today would wish to consider banishment to Palookaville? Schulberg wrestled with what had become fashionable because he knew that being in fashion was a transient enterprise. John Hughes, on the other hand, knew how to be fashionable. He lived in a moment when you could actually become that moment. He also seemed to recognize that when that moment was over, you could quietly abandon the spotlight. As a result, both men became specters in the culture of Hollywood. But where Budd Schulberg fought against the political and cultural imperatives of his times, John Hughes marked his time – until his time had finally run out.

-- February 11/10

One Night of Sin


Elvis Presley

Although Elvis died over thirty years ago, he’s still very much alive in many parts of the popular culture and always turning up in the most interesting ways. Whether he's brought up in comparison to Michael Jackson’s recent death, or alluded to in television dramas, novels, songs and visual art, Elvis continues to mutate into every kind of wish fulfillment. He never really left the building.

The list of pop culture references is too long to indulge here, but one notion clearly fascinates me. I always think of Elvis in relation to Marlon Brando. While both emerged out of the rebellion of the Fifties, they also embodied America’s noblest democratic principle, the idea that a man can make of himself anything he desires. With that goal, of course, comes the eventual failure to live up to that quest for pure freedom that both men created in their work. Their failure to do so came, in part, because of our need to tame in them what we loved most in their distinctiveness; that is, we sought to make them ordinary, to be more like us. But their failure is also due to their own inability to maintain their distinctiveness, the very quality that attracted us to them in the first place. Although both became iconic figures in mass culture, they each ended up horribly isolated and trapped. They even at times became parodies of themselves, as if to deny that iconic status, as if to mock it and make it less real. Perhaps that’s why it’s no accident that these charismatically handsome men would, in succumbing to those pressures, eventually become bloated.

Both Elvis and Brando played with a sense of sin. Brando’s best roles (A Streetcar Named Desire, On the Waterfront, Reflections in a Golden Eye, The Godfather, Last Tango in Paris) mischievously broke down our expectations of what constituted entertainment. He took on parts that had the potential to upset us and he overturned any notion that we knew who he was. On the other hand, Elvis was never the star in his movies that he was on stage, in particular before the army. The sense of sin he initially set loose, beyond the wiggle of his hips, was the playful idea of getting a complacent generation shaking. But then, what? Like Brando, in the Sixties, Elvis stopped shaking things up and was relegated to bland genre parts in films that emasculated him. Elvis's sin, to paraphrase Dylan, was his lifelessness. But as pop critic Greil Marcus once said about Elvis, it was sin that brought him back to life.

That was never more true than in his 1968 television comeback special. Having watched The Beatles and The Rolling Stones capture the ground that he’d abandoned for a career in Hollywood, he was now faced with the challenge of making new demands on his audience. By 1968, that audience was older and had grown nostalgic. Elvis was the Fifties, after all, so how relevant could he be next to The Doors? To find out, Presley slimmed down, got decked in leather, and gathered together the men with whom he began making his career. And there, performing in a concert in the round with an electric guitar, he finally reclaimed himself (and his audience) by making some of the best music of his life. Where else would he find the desired elixir but in the blues? The restorative power of the blues was the very source that had partly powered Elvis's ascension in the beginning.

Marlon Brando

One of the songs, “Trying to Get to You,” he had recorded during his first sessions at Sun Records in the mid-Fifties. In this new version, though, his delivery wasn’t as reverential. He seems to be singing to himself, to some lost possibility that's now been newly found. There’s anger and desire present, yet there's also great humour in this performance, a sense that he’d been away too long. You can hear the need to test himself against the neutered icon he’d become.

Of course, sin also becomes the acknowledged subject in his cover of Smiley Lewis’s “One Night (of Sin).” This song, about a trip to the whorehouse and the night that unfolds, isn’t played here for any prurient humour. In Elvis’s performance, he could be singing to his lost years in Hollywood. There is a hungry bite in his voice, an unrequited passion, but also an expression of pure relief. A sense of sin has indeed brought him back to life. In this song, he appears to have acknowledged both what he has and what he lost.

While some icons are comfortable in their established images, and others (like James Dean) didn’t live long enough to break their mold, both Elvis and Brando grew old before our eyes. They continued to embody the restless American artist’s strongest desire to re-make the culture as they simultaneously remade themselves. But with that goal in mind, by mapping out America’s greatest virtues, they also bravely courted its greatest failings.

-- February 13/10

Memories of Miriam


Miriam Makeba

Back when I was about nine, my family lived in the small suburb of West Rouge, Ontario (which used to be part of Pickering). Our house was located right down at the bottom of Island Road which overlooked the sprawling Rouge Valley. Our backyard was quite long and it connected to the backyard of the Fisher household on Rouge Hills Drive. One day, while I was playing out there, I saw this young black girl in the adjacent property. Since there weren't any black people that I recall living in West Rouge in 1963, I knew she was from somewhere else. When I went over to talk to her, she spoke only a little English and in an accent that I didn't recognize. But one thing I did understand perfectly was when she told me that her mother was a singer named Miriam Makeba.

My parents owned one of Miriam Makeba's albums, having recently gone to see Harry Belafonte perform at the (then) O'Keefe Centre in downtown Toronto. (Makeba was one of the guest singers in his concert troupe.) She had one tune on her record that was referred to as the "click song." Coming from South Africa, Makeba sang in her native tongue, which contained a sound that resembled a clicking in the back of the throat. I remember once even putting my ear against my parents' stereo speaker so I could try and figure out what it was and how she did it. Now I saw myself having a chance to solve this mystery and see her do it in person. The girl invited me in to meet her mother, but I wanted to tell my folks first. So I went home and immediately informed them that Miriam Makeba was visiting just around the corner. Even though I wasn't prone to lying, my parents didn't believe a word I was saying. But I wouldn't back down. I insisted they phone the Fisher home and find out. When they did, we were told that she was indeed staying there during Belafonte's concert run, and they invited us over.

When I met Miriam Makeba, she clasped my small hands in her long fingers and I was immediately overwhelmed by this majestically beautiful woman. She thanked me for playing with her daughter who she said had few friends to turn to when her mother was touring. After the introductions, she gave an impromptu performance in the neighbours' living room. I requested the "click" song because I wanted to know how she did it. Within moments, and only a few feet in front of me, she gave her performance of the song. I was still baffled. What I remember most was being stirred by the sheer power of her voice which rang in my ears. After a dinner of traditional South African food, she signed my parents' album and we went home.

Miriam Makeba would die of heart failure on stage in Italy in November 2008. Shortly after her death, I was telling this story to my friend and co-worker Tony Faure. As I finished, I wondered what had happened to the little girl I had played with that afternoon. Tony said we could probably find out on Wikipedia. While he searched, his face suddenly turned grim. He looked over at me and told me that she had died giving birth. A writer from an online magazine, African Music Safari, informed me not long after that her daughter's name was Sibongile, which means 'thank you.' "Miriam usually just called her Bongi and they even recorded songs together," he explained to me in an e-mail. "Bongi actually had four kids. The third died suddenly at just a few years old and the fourth while still unborn." Losing Bongi and her grandchildren became a very sad chapter in Miriam's biography. But two of her descendants survive. In memory of Miriam, who effortlessly turned a young white suburban kid on to black African music, you can find a performance of that "click" song on YouTube, from a concert held a few years after our fateful meeting.

-- February 22/10

Soul Deep: In Consideration of Alex Chilton

Alex Chilton

“Sometimes failure is more rewarding,” wrote Bill Reynolds in The Globe and Mail yesterday, concluding his obituary remembering Alex Chilton, who died last Wednesday of a heart attack in New Orleans at the age of 59. Chilton, the former lead singer of the ‘60s pop group The Box Tops ("The Letter," "Soul Deep") and co-founder of the influential ‘70s power pop band Big Star ("September Gurls," "In the Street"), was certainly no stranger to both failure and reward in his life. In a sense, he took both extremes and created what critic Jason Ankeny called “a poignantly beautiful sound which recaptured the spirit of pop’s past even as it pointed the way toward the music’s future.” Chilton belonged to pop’s past and future in ways as paradoxical as they were influential on later groups like The Replacements and (especially) R.E.M.

Howard Hampton once described Kurt Cobain of Nirvana as Mark Chapman and John Lennon rolled into one: the first self-assassinating pop star. That observation comes from an understanding of the way pop shifted from the desire for success in bands of the Sixties to the distrust of it in bands of the Nineties and beyond. Of course, Alex Chilton was once part of that flowering of optimism when he fronted The Box Tops at the age of 16. Yet their first hit record, “The Letter,” released in the summer of 1967, held hints of the discontent to come. The song may be about how the singer receives a letter from his girlfriend far away telling him that she can’t live without him. But his eagerness to reunite with her is tempered by the sheer exhaustion in his voice. Chilton sings “The Letter” as if he’s winded from trying to make it down the runway. (It came as no surprise to me when Joe Cocker later soaked the song in booze and turned it into another hit record.) When Stevie Winwood was a young teen fronting The Spencer Davis Group, he sounded as bold as a baby Ray Charles, whereas Chilton sounded like he was fifty. The gravelly depth of his voice, which brought an appealingly wistful buoyancy to the commercial pop of “Soul Deep” and “Cry Like a Baby,” already held a future peppered with regret and loss. The Box Tops disbanded in the winter of 1970.

Coming from Memphis, Chilton was no stranger to the deep well of pining at the root of Southern soul music. But he was also largely influenced by the chiming transcendence heard in the glorious pop of both The Beatles and The Byrds. So in 1971, he teamed with guitarist/songwriter (and Anglophile) Chris Bell, bass player Andy Hummel and drummer Jody Stephens to summon the spirit of The Beatles on their ironically titled debut album, #1 Record (1972). The record, despite such appetizing tracks as “Feel” and “When My Baby’s Beside Me,” didn’t come close to being Number One. Stax Records, the great Memphis soul label, was haphazard in its distribution of the record. But perhaps the marriage of irony and earnest desire in a lovely acoustic ballad like “Thirteen” didn’t sit well with pop audiences either, especially listeners who wanted their tunes dished up straight. In the song, a couple of kids are in love with rock and they celebrate its spirit of rebellion ("Won't you tell your dad to get off my back/Tell him what we said about 'Paint It Black.'"), but “Thirteen” also harbors rebellion’s defeat. (Adding more irony, the track “In the Street” would become the theme song for TV’s That ‘70s Show.)

While Big Star would follow up #1 Record with the driven desperation of Radio City (1974), it became clear that the band had also invoked the tumult of the Fab Four. Chris Bell wanted to spend more time in the studio crafting their music, whereas Chilton was eager to perform live. Their creative tensions, plus the poor handling of their records and audience indifference, led to Bell abandoning the group for a solo career that was tragically halted when he was killed in a car crash in 1978. Alex Chilton and Big Star carried on, however, through 1978’s Third/Sister Lovers, which (like The Beatles’ Let It Be) captures the sound of a group coming apart. But unlike The Beatles, who erected a stage to dream on while inviting others to dream with them, Chilton always distrusted the dream. And why not? The music business by the Seventies had grown bloated and cynical. Anger and disappointment were for Chilton the affirmations of a love that gets torn asunder. (That spirit of Chilton’s I can always hear in The Replacements’ masterpiece “I Will Dare.”)

Contemplating Chilton’s death yesterday, I was drawn to one of Big Star’s later tracks, “Thank You Friends,” a song that looks ruefully back over a career. While the sentiment in the lyrics suggests pop sincerity when Chilton sings “Thank you friends/Wouldn’t be here if it wasn’t for you,” he’s also singing to those who couldn’t give a shit. There’s an underlying tone of rage in the beauty of the melody and the R&B-inspired backing harmonies that makes the song less an ironic stunt (like Sid Vicious’s “My Way”) and more Chilton’s claim that he stayed true to what he loved in pop despite the defeats he suffered. “Without my friends, I got chaos,” he sings later in the track, “I’m off in a bead of light.” Alex Chilton was always a bead of light, a reminder of pop’s grandest aspirations and a recognition of its deepest failings. To re-phrase Bill Reynolds’s summation of Alex Chilton, I’d say, like Bob Dylan, that there’s no success like failure. And failure is no success at all.

-- March 20/10

Wake of the Flood: Louisiana, 1927


Louisiana, 1927

Before the tragic events of Hurricane Katrina in 2005, the worst flood of the Mississippi River took place on Good Friday, April 15, 1927. When the levee broke just above Clarksdale, water inundated the state. New Orleans in particular was hit with 14 inches of rain in 24 hours, although the city itself survived. What was most significant about the calamity was the transformative effect it had on the culture of Louisiana.

Racial strife was exacerbated when the flood forced over 300,000 blacks to live in refugee camps for many months. Ultimately, hundreds of thousands headed North. It also didn’t help matters that in the late twenties, Louisiana was a compromised state mortgaged to Standard Oil and under the boot of the Old Bourbons, the wealthy city fathers who controlled the political fortune of the Big Easy. Singer/songwriter Randy Newman, who spent much of his early childhood in New Orleans, believed that the Bourbons had a hand in bringing about the catastrophe. “The Bosses in New Orleans probably were behind the decision to let it flood up there, diverting the water away from their city,” he explained. “Anyway, the cotton fields were wiped out, changing America forever, disemploying hundreds of thousands of black field workers, most of whom held executive positions in the cotton industry, meaning that they were permitted to wear gloves while picking.” It also meant they were forced out of the South. “They all moved North and were greeted with open arms right across America,” Newman added sarcastically.

The flood aroused such popular interest that there was no shortage of songs depicting the tragedy, and not surprisingly, most of them were rooted in the blues. They included Bessie Smith’s “Muddy Water (A Mississippi Moan),” Blind Lemon Jefferson’s “Rising High Water Blues,” Memphis Minnie’s “When the Levee Breaks” (which Led Zeppelin would cover years later, amplified to the pitch of an apocalypse) and Vernon Dalhart’s “The Mississippi Flood.” One of the strongest recordings, “High Water Everywhere,” was by Charley Patton. Described at one time as the father of the Mississippi Delta blues, Patton generally straddled the fence between sacred and blues music. He performed blues under the pseudonym “The Masked Marvel”; when he sang church music, he became Elder J.J. Hadley. In his blues, Patton developed a pivotal style of whooping and hollering, as if he were gleefully sharing episodes of lewd abandon and drunken revelry. But in “High Water Everywhere,” recorded in two parts in 1929, Patton isn’t in much of a party mood. He performs the song like an urgent dispatch of breaking news, shouting over his guitar lines as if afraid that he could be cut short any second.

“High Water Everywhere” had such lasting power that Bob Dylan would answer Patton’s urgent appeal, some seventy-two years later, in the song "High Water (For Charley Patton)" from the album "Love and Theft" (which, ironically, hit the streets on 9/11, inadvertently invoking the calamity of that day). In typical fashion, Dylan presents a masked ball of American musical history featuring archetypal blues figures like Big Joe Turner, as he casually drops quotes from Elmore James’ “Dust My Broom.” We soon hear the faint echo of Clarence Ashley’s forbidding “The Coo Coo Bird,” recorded two years after the flood and carrying within it a symbol of homelessness – the cuckoo, which lays its eggs in the nests of others. Somehow, in the middle of this revelatory jamboree, in which Dylan captures a number of American artists and their work, he creates his own formidable image that seizes forcefully on the calamitous event (“High water risin’ six inches ‘bove my head/Coffins droppin’ in the street like balloons made out of lead”).

However memorable these other recorded accounts are, Randy Newman’s “Louisiana, 1927” transcends them all. Released as part of the album Good Old Boys (1974), it's a record that takes full stock of the contemporary South while rooting itself in the era of Huey Long (and it was issued just as President Richard Nixon was experiencing his downfall). “Louisiana, 1927” has a majestic emotional sweep wherein Newman digs deep beneath the tragic consequences of the flood and plunges us into the wounded soul of the South. His performance expresses both outrage and empathy, yet it also gives us a sense of why the roots of the South are so durable. “Louisiana, 1927” may, on one level, be simply revealing the details of that horrible day, but those details are charged with a deeper metaphorical meaning – that the transgressions of the South, and the unresolved issues emanating from the Civil War, have invited this calamity. The winds (significantly from the North) dramatically change and transform the countryside until the town of Evangeline has six feet of water flowing in its streets. “There’s a feeling down there, definitely, of anti-Yankee animus toward the North, toward government, toward people trying to tell them what to do,” Newman once told the late Timothy White. “And that’s what it’s about to me.”

“Louisiana, 1927” opens with an elegiac string section worthy of one of Randy’s uncle Alfred Newman’s finer film scores. Throughout the song, we are never distanced from the pain in the material; Newman wants us to feel the tidal pull of the song’s power. He would again remind us of that tidal pull when he performed the song in 2005, after Katrina, giving the song yet another mournful dimension.

-- April 2/10

The Mozart of Mayhem: Spike Jones and His City Slickers



While re-watching The Marx Brothers in Monkey Business (1931) last week on TCM, I was trying to think of who might possibly be their musical equivalent. (Of course, The Marx Brothers had their own fair share of musical absurdity in their comedies.) Ultimately, I didn’t have to look too much farther than Spike Jones and His City Slickers. From the early Forties to the mid-Fifties, Spike Jones and his group tore into the pomposity of high culture with a savage intent. Jones implemented a storehouse of rude sounds that made composer Erik Satie’s experiments in Parade (1916) seem polite.

Jones turned musical history into a broadly satirical farce. He made a mockery of honoured classics like Rossini’s William Tell Overture, which he recast as a ridiculously hysterical horse race. The unbearably dippy standard “Love in Bloom” was torn to shreds in much the same manner that The Marx Brothers laid waste to Il Trovatore in A Night at the Opera (1935). Johann Strauss’ delicate Blue Danube waltz was transformed into a drunken brawl (in contrast to the reverence shown by Stanley Kubrick in 2001: A Space Odyssey). In their assault on Bizet’s Carmen, the group’s “messy-soprano” Eileen Gallagher is heard frightening off three bulls with the mere shriek of her opening aria. Michael “Cub” Coda, of Brownsville Station (who once sang about "Smokin' in the Boys Room"), remembers his father seeing Spike Jones at the Michigan Theatre in Detroit back in 1945. “They were crazy,” he recalled his dad telling him. “The stage went black and all these sirens and gunshots started going off. Then the stage lit up and it was Spike Jones and His City Slickers…They had a guy playing a toilet seat with strings on it, people onstage wearing wigs and crazy outfits – oh geez, they were nuts.” This nutty group spent their career blowing raspberries at High Art. And they did it with an all-American gusto.

Hailing from Long Beach, California, Jones first put together a college dance band that he patterned after Red Nichols’ Five Pennies. But performing standards bored him to pieces. When he met the multi-talented Del Porter, who played xylophone, violin, sax, and clarinet, and had hung out with the great impressionist Mel Blanc, Jones saw the possibility for integrating a little mayhem into the mix. They began as the Feather Merchants, but by the end of the thirties, they became Spike Jones and His City Slickers and their distinct lunacy caught the attention of RCA Records. By this time, the group included violinist Carl Greyson, whose tenor vocal style displayed a twisted grin in songs like “Cocktails For Two”; Red Ingle, who mutilated the sentimental weepie “Chloe,” turning it into a beautiful act of desperate desecration; Winstead “Doodles” Weaver, a former comedian who created the droll voice of the race-track announcer on the William Tell Overture; Freddie Morgan, a goofy-faced, mad banjo player; “Babyface” George David Rock, a trumpet player with the sweet kid voice in “All I Want For Christmas”; and Dr. Horatio Q. Birdbath (a.k.a. Purves Pullen) who contributed mightily to the massacre of “Love in Bloom.” Sir Frederick Gas (a.k.a. Earl Bennett) provided ample demonstrations of his peculiar emissions on songs like “Happy New Year” and “Knock Knock.” The City Slickers dressed like inebriated renegades out of a Preston Sturges comedy, with loud clothes and goofy hats. Their stage presence was a dada explosion from Dogpatch.

Spike Jones, a hayseed Harry Partch, unleashed a vast assortment of homemade musical weapons on the public, including the latrinophone – a toilet seat strung with catgut – as well as bathroom plungers and bicycle horns. These appliances put across a multitude of rude noises. Their brand of musical nonsense would continue until about the time Elvis entered the building. While the King stole all the pop thunder at RCA, the City Slickers slid into oblivion. They parted company with Jones and signed with Liberty Records in 1959, where they became resigned to doing straightforward standards. Spike Jones would die from emphysema in 1965 at the age of 53. For one brief period in American pop music, though, Spike Jones and His City Slickers created a legacy of disrepute that, like The Marx Brothers’, was genuinely American in spirit. Their shared mission was to make outrageous noises in the church of good taste.

-- April 7/10

Beg, Scream & Shout!: The Flirtations' Nothing But a Heartache



When I was a kid, I treasured my transistor radio. Tuning into the local rock stations, I always found myself eagerly waiting to hear some song I had fallen in love with days earlier. Unlike today's MP3 and iPad generation, which can pick its tunes from a vast library programmed on a player, there was an element of surprise to radio listening. You never knew when that favourite song would turn up. Sometimes you left your transistor on – with the earphone close by – just in case the DJ slipped in the tune, a track that changed the way you walked that day.

In 1969, one song that barely got any airplay (but certainly changed the way I walked) was The Flirtations' pop masterpiece "Nothing But a Heartache." The Flirtations were a black American all-female R&B band from South Carolina, but they actually made it big in England. They did it with a sound, too, that echoed the exuberance of the very best of Motown. "Nothing But a Heartache" was an indelible part of what was termed England's Northern Soul genre. In the late Sixties, Northern Soul had emerged out of the British Mod scene. The music consisted of a particular style of black American soul based on a heavy backbeat combined with the quick tempo of Detroit's pop sound. But one of the key ingredients of Northern Soul was the manner in which the singers would convey heartbreak. It wasn't by expressing despair, but instead, by providing leaps of pure exhilaration.

"Nothing But a Heartache" kicks off with a dynamic horn riff as the singer boldly comes in announcing that she has nothing but heartaches everyday. But who knew heartaches could sound so thrilling? While the backing group backs up her claim, she soars above their shouts propelled by the horns and strings. With some of the verve of The Supremes' "You Keep Me Hangin' On," "Nothing But a Heartache" also has some of the emotional weight of Martha Reeves' performance on the Vandellas' "Nowhere to Run," where Reeves expresses both the terror of being hounded and the excitement of being on the loose. What often characterized Northern Soul was that incongruity of joyful despair, a paradoxical sound that had first attracted The Beatles to Motown. They, too, would often write and sing optimistic songs of heartache (a prime example being "All My Loving").

Although The Flirtations, with numerous line-up changes, continued to make singles for the next three decades, "Nothing But a Heartache" was their most potent track. Yet it never cracked the Top Ten. Years later, due to its obscurity, I couldn't remember who recorded it. It took me many moons to find out who they were and what the title of the song was because DJs never back-announced the track. Believe it or not, I didn't end up finding out until the mid-nineties when film director Ron Mann's documentary Dreamtower (about the Toronto hippie enclave Rochdale) featured the tune. Even then, Mann didn't list it in the credits. I had to find out from Marc Glassman, who was the music supervisor on the project. But I was even happier in 2000 when I finally acquired the song on the substantial and invaluable 6-CD Rhino R&B box set, Beg, Scream & Shout!: The Big Ol' Box of '60s Soul.

-- April 29/10

A Masterpiece and its Spiritual Cousins: Rubber Soul, Pet Sounds and Aftermath



Is it possible that The Shirelles best embodied the idealistic spirit of JFK's New Frontier? Perhaps. Especially with one 1960 pop song, "Will You Still Love Me Tomorrow?," that delicately captured both the assurance of the decade and its secret fears. Written by Carole King and her first husband, Gerry Goffin, "Will You Still Love Me Tomorrow?" had an awareness that within every hope lay the possibility of failure, defeat, and maybe betrayal. The singer accepts the devotion of her lover, the light she sees in his eyes, but she's also worried about the future, when that light may refuse to shine. In this enduringly complex tune, the stakes of love get raised so high that the fear of it all falling apart weighs pretty heavy. As Bob Dylan said in 1965, right at the cusp of his greatest glory, "when you ain't got nothin', you got nothin' to lose." The Shirelles had, in a certain sense, laid the groundwork for the romantic dream The Beatles (who would cover their songs) were about to create. But The Beatles also inherited that possibility of failure that The Shirelles saw coming. When the hopes of the New Frontier were so cruelly dashed in Dallas in 1963, The Beatles reached into that despair, two months later, to hold our hand. But it was coming up to two years since The Beatles rekindled those hopes, and the question of whether we'd still love them tomorrow was still up for grabs.

Their electrifying early records had sought us out, demanding that we share in the pleasures those songs offered. When John Lennon said in "Please Please Me" that he'd continue pleasing us, if only we'd agree to please him, we were offered a definite stake in the relationship. Each song they wrote was designed to be a two-way street, the creation of a romantic bond, which required the participation of the listener in every way. The utopianism heard in "There's a Place" was only viable when we first believed that the place actually existed. But by 1965, The Beatles were starting to grow weary and suspicious of their audience. There's a place, alright, and maybe it's now far away from you. No longer trusting the screams of adoration, and weary of enduring the isolation of hotel rooms and ducking into limos, the group began retreating into the safety of the studio. Within those walls, the sounds they began to create outclassed the sounds from the stage. The songs they wrote and covered, in the beginning, had taken the world by force, by the affection expressed in them. Now their music was more elusive, the pleasures tucked beneath the dense melodies. At this point, though, their retreat did not diminish their work. Instead, detachment took it deeper, farther into the exigencies of love and loyalty.


Over 45 years ago, The Beatles released Rubber Soul, a record which showed that the group, now seeking solace from the madness of Beatlemania, was creating a new music that sought out the more discerning listener. Its songs reached out to those who dared step outside the din of the screaming throng. With this record, they asked us to lean forward, listen carefully, and take the doubts along with the hopes and the desires along with the fears. Rubber Soul had all the yearnings and qualms of Goffin/King's "Will You Still Love Me Tomorrow?" but it didn't stop with the question of the title. Rubber Soul went much further to ask: If you don't love me tomorrow, then what? While taking over 113 hours to record, compared to the one day they took putting together their debut Please Please Me (1963), Rubber Soul was startlingly innovative, taking the R&B genre beyond its purist roots. Unlike many other white pop artists, especially the ones who merely paid reverence to the style and attitude of black blues and R&B, or channeled the essence of the form (as did Peter Green’s Fleetwood Mac), The Beatles sublimated rhythm and blues into their continually expanding musical fabric. And the record would irrevocably change the direction and sound of pop music. With a densely intelligent collection of love songs, Rubber Soul confronted a variety of issues: the cost of romantic desire (“I’m Looking Through You”), the power of love to heal (“The Word”), as well as to hurt (“Girl”); contemplation (“In My Life”); and the deep despair of estrangement (“Nowhere Man”). On the record, The Beatles broadened their musical identity, too, by introducing an original interpretation of classic R&B (specifically the Memphis Stax soul sound) while resisting being defined by black music, as many other British blues bands were; they defined their own interpretation of American black music instead.

What made the record such a radical departure from their previous work was that, earlier, The Beatles had reached out to listeners with a more supple enthusiasm, dramatically grabbing you in the process. But this music was sly, subtler, even crooking its finger to inch you nearer. “This music was seduction, not assault,” critic Greil Marcus once wrote about Rubber Soul. “[T]he force was all beneath the surface.” Seduction was something of a key part of their new musical direction in late 1965. Since they first got your attention with the dynamic power of “She Loves You,” they could now introduce you to love’s transgressions in the quiet wistfulness of “Girl.” Rubber Soul demonstrated fully that The Beatles were discovering shadows within the chimerical spirit of their music. Those shadows, too, would be cast over other contemporaries who were also inspired to do their most innovative work.

The Beach Boys

Brian Wilson of The Beach Boys was so astounded by Rubber Soul that he decided to change the entire musical direction of his own band. Before The Beatles created mayhem in America, The Beach Boys had already established themselves as a legendary pop group from Southern California. From their first song, “Surfin’,” in 1961, The Beach Boys had initiated their own artificial paradise that quickly defined their appeal. Early on, at the height of their popularity, they portrayed in their music an adolescent life filled with the hedonistic pleasure of beaches (“Catch a Wave”), an endless summer of chasing girls (“Fun, Fun, Fun,” “I Get Around”), and new unimagined freedoms offered by access to the automobile (“Little Deuce Coupe”). Unlike The Beatles, who provided a new world vision through their music, The Beach Boys simply heightened something of a world in which they – and their audience – were already a part. They caught the sunny part of the California pop enigma with its paradoxical pleasures; or what critic Jim Miller described as “a paradise of escape into private as often as shared pleasures.”

That element of escape was in large part a reflection of Brian Wilson’s unease with the world around him. While he adeptly discovered the delight held by the pop elements in the culture surrounding him, he didn’t truly live out any of it. He wasn’t a surfer (like his brother Dennis), nor did he exude the confident swagger of the characters in some of his songs. The Beach Boys were a daydream of the adolescent life Brian Wilson never had, and perhaps the one he wished he’d had. His 1962 song “In My Room” gave hints of the troubled kid within the genius, but by 1964, you could sense that Wilson was trying to break through the mythical wall he’d erected around the band. His songs started to reflect aspects of Southern California youth culture that were less assured, where he could even detect hollowness in the rituals being acted out. The Beach Boys explored all this without once sacrificing the enjoyment offered under those California palm trees. “When I Grow Up (To Be a Man),” for instance, was not a braggart’s dream, but a contemplation on everything Wilson once held to be true. He candidly asked himself – and his audience – if the things he dug as a teenager would sustain him in adulthood. In “Don’t Worry Baby,” The Beach Boys’ finest song, he takes the freewheeling driver from “I Get Around” and situates him within the mundane concerns of adulthood. In doing so, Wilson didn’t sacrifice the joys of teenage freedom, even though the singer now recognized that the innocence in those joys had ended. So when Brian Wilson heard Rubber Soul, he knew exactly where he wanted The Beach Boys’ music to go.


Pet Sounds, like Rubber Soul, set out to alter The Beach Boys’ identity while changing the audience’s relationship to the group. On this record, Wilson wanted to find ways to take the characteristics of a Beach Boys song and infuse them with thematic ambiguity and a sonic lushness. The result was almost unbearably beautiful, quite poignant, even haunting. Pet Sounds began with the gorgeous yearning of “Wouldn’t It Be Nice,” which took the young stud of “Fun, Fun, Fun” and brought him face to face with the possibility of romantic commitment. But, from there, Pet Sounds became a densely orchestrated catalog of Brian Wilson’s doubts and insecurities. In a forsaken voice dipped in sweetness, Wilson sought reassurance in “You Still Believe in Me.” “That’s Not Me” took stock of who he had become and questioned how he got there. “Don’t Talk (Put Your Head on My Shoulder),” almost as achingly pretty as “Don’t Worry Baby,” looked for the kind of comfort that went beyond what words could offer. “God Only Knows” was as sublime a love song as anything on Rubber Soul. It was rare (not to mention daring) to open a song about devotion with the singer doubting whether he’ll always love the woman he’s with. “I Just Wasn’t Made For These Times” was as exquisite a tune about alienation as any written. (For one thing, there’s not a snide note in it.) Pet Sounds was the spiritual cousin to Rubber Soul and it would have a lasting effect on Lennon and McCartney – so much so that they answered it in 1967 with Sgt. Pepper’s Lonely Hearts Club Band.

If Pet Sounds was the convivial companion to Rubber Soul, The Rolling Stones’ Aftermath, released in April 1966, was its evil twin. Having matched The Beatles – album-for-album, single-for-single – The Stones dug in here with a quietly menacing record, the first to feature all original Jagger/Richards compositions. It’s an epic set that, in its U.K. version, ran close to an hour in length. Aftermath took the romantic skepticism of Rubber Soul and unleashed a tale of underclass revolt. The songs told tales of bohemians roaming London and flashing their contempt for anything that reeked of bourgeois contentment. You didn’t have to look too far to find their scorn. It was heard in the coarse put-downs of “Stupid Girl,” the sadistic cat-and-mouse games of “Under My Thumb,” the patronizing contempt expressed in “Out of Time,” and the brooding self-explanatory “Doncha Bother Me.” The dark humour erupted right off the top when The Stones turned society’s condemnation of the youth drug culture back on its accusers in “Mother’s Little Helper.” The song parodied the daily anxieties of the middle-class housewife who grows dependent on pills to get her through the day.


Despite their disdain, however, The Stones’ arrangements, as softly intricate as Brian Wilson’s on Pet Sounds, share the seductive ambiance of Rubber Soul. The hushed marimba takes the edge right off of both “Under My Thumb” and “Out of Time.” The harpsichord on the baroque “Lady Jane” lends it a lovely quaintness, as it does also on the lamenting “I Am Waiting.” The Rolling Stones didn’t abandon their blues roots on Aftermath either. With the astonishing 11-minute reverie “Goin’ Home,” Mick Jagger considers getting back to his girl and then takes his sweet time arriving there. Just as Rubber Soul had dramatically altered the pop landscape, by introducing the record album as a conceptual art statement, Pet Sounds and Aftermath did nothing less than change the very texture of popular music.



Was Maury Chaykin not the best character actor on the screen? I certainly thought so. But then some actors get prickly when you describe them as a character actor, as if you’d just demoted them to the role of gaffer. But Chaykin, like J.T. Walsh, brought out compelling intricacies in the characters he portrayed, idiosyncratic aspects that couldn’t be conveyed in the script. He took those roles out of the shadows and lit up parts of the movie that just weren’t getting enough light.

I first encountered Chaykin under those circumstances. During a press screening of John Badham’s WarGames (1983), an anti-nuclear-war civics lesson disguised as a movie, Chaykin shows up in a hilarious cameo. In the film, Matthew Broderick plays a Seattle high school student who’s obsessed with computers. He inadvertently hacks into a military supercomputer to play its simulated war games and unwittingly sets in motion the steps toward a nuclear strike. Early in the picture, while he’s still trying to figure out what he’s done, Broderick goes to visit two other computer pals who might be able to help him out. One of them, Malvin (Eddie Deezen), is the personification of the computer nerd, a wind-up toy who looks like he’s spent a short lifetime ingesting Jerry Lewis pictures. Deezen is a complete chatterbox. But Malvin’s co-worker Jim (Chaykin) creates a quiet space around himself, as if he’d never been invited to take part in a conversation. Bearded, with a smiling face on his T-shirt, Jim contemplates Broderick’s problem and slowly – with some pride – begins to figure out what Broderick’s hacked into. But Deezen keeps jumping in with his own answers until Chaykin explodes, calling him “Mr. Potato Head” over and over until he quiets down. Once he does, Jim calmly reminds Malvin that he had asked to be told whenever he became irritating. I didn’t know who Chaykin was then, but given the tired and preachy anti-war plot to follow, I wanted to know more about that guy. Unfortunately, he disappeared from the movie. But not from my memory.

But Chaykin was different from most character actors because he could also play lead roles with equal verve. I was once astonished seeing him portray the part of Hal C. Banks, the strong-arm American labour union leader who came to Canada in 1949 to lead a war between rival shipping unions, in Donald Brittain’s superb TV drama, Canada’s Sweetheart: The Saga of Hal C. Banks (1985). Chaykin used his hulking frame to lay claim to space that others thought was theirs. Like a corrupt and contented pasha, his Banks found a comfort zone in the recognition of just how dangerous he really was. It was a commanding performance.

On the other hand, he could also convey the eccentric outsider without needless embellishment, as he did in Richard J. Lewis’s otherwise negligible 1994 adaptation of the late Paul Quarrington’s novel Whale Music. Playing a character inspired by Beach Boy recluse Brian Wilson, Chaykin is Desmond Howl, a former rock legend who lives in seclusion in a seaside mansion until he’s invaded by a young squatter (Cyndy Preston) who breaks into his home. Howl has been spending his years composing a symphonic piece for whales that congregate near his property. But the music is also supposed to be in memory of his late brother who died in a car accident. The stuff between Preston and Chaykin is predictable, broad comedy, but when Chaykin enters the world of his music, he’s like a deaf man who’s been restored to hearing again. There’s a touching quality to Howl’s melancholy which Chaykin doesn’t play for depressive behaviour. Instead we see a man who has lost the means to connect to everyday life because of the grief it will invoke in him.

For years after, whenever I went to review a film and I saw Chaykin was in it, I perked up, eager to see what he might do. My favourite Chaykin role was in the wonderful, but sadly neglected, Diane Keaton film Unstrung Heroes (1995). Based on a memoir by journalist Franz Lidz, the story is about his growing up Jewish in the Fifties with his rationalist inventor father (John Turturro) and his life-loving mother (Andie MacDowell). When she gets diagnosed with ovarian cancer, Franz goes to live with his eccentric uncles; one of them the paranoid Danny (Michael Richards), and the other, the quietly observant Arthur (Maury Chaykin). Unstrung Heroes, in one sense, is a moving story about the restorative power of memories. Chaykin’s Arthur collects knick-knacks which he places in a box and gives to Franz with the hope that he will treasure personal mementos because they gather all the memories of who we are.

For me, Maury Chaykin’s career is a lot like that collection box: inside it are memories of sharply defined roles that don’t fade into obscurity, or gallantly chew up the scenery, but instead awaken cherished moments when experienced again. Chaykin approached his work with a sly attention to detail and a love for the oddball charm of the outsider who is often silently hidden in the shadows. Chaykin’s art, though, wasn’t simply to cast light on those oddballs; he always illuminated his characters from the inside. By having us accept those characters on terms that only he could find agreeable, Maury Chaykin became my idea of the perfect unstrung hero.

-- July 28/10

Patricia Neal and Michael Rennie in The Day the Earth Stood Still

There are three reasons why I fell in love with the science-fiction classic The Day the Earth Stood Still (1951) when I was nine years old. One was the quiet, thoughtful presence of Michael Rennie as Klaatu, the man from space who came to Earth to warn us of our destructive habits (i.e. nuclear warfare). The second reason was the eerie score by Bernard Herrmann, which introduced me to the wondrous and evocative world of electronic music. The third reason was actress Patricia Neal, who became Klaatu's soul mate in his effort to get proper attention paid to the consequences of ignoring his warning. Her most famous scene, of course, would be preventing his robot Gort from destroying the planet with those immortal words: Klaatu barada nikto. But I was taken by something else in Patricia Neal, who died a few days ago from lung cancer at the age of 84.

Given that I first saw the film in 1960, I was already getting used to the predominant image of the suburban mother through TV sitcoms like Father Knows Best, Leave It to Beaver and Ozzie and Harriet. Besides being maternal to the point of being docile, these mother figures came across as blandly nurturing caregivers who were happily becoming part of the wallpaper in their environment. But Patricia Neal in The Day the Earth Stood Still was a different story. For one thing, she was a single mother raising a son. And although she looked the Fifties suburban parent, there was a bold undercurrent that set her apart from the typical mom. She wore her independence without being strident about it. For example, she accommodated her boyfriend, but when he tried to coerce her, she balked. Patricia Neal broke the stereotypical image of the Betty Crocker ideal in half and made her character flesh and blood, with a spirit that was soulful in its irreverence. In her scene with Gort, lying helpless before him on the grass, she commands his attention by paying honest respect to her fear. She isn't simply a damsel in distress; she knows that if she fails, all is destroyed. Her fear fuels her determination to survive – so the Earth will survive, too. Without Patricia Neal, The Day the Earth Stood Still might have been simply a competent cautionary tale instead of an impassioned appeal to ideals and reason.

Patricia Neal and Gary Cooper in The Fountainhead

Over the years, I came to adore Patricia Neal's work, but sporadically, and from radically different points in her career. No matter which period I encountered, though, there was always a continuity of soul from role to role. In the lunacy of King Vidor's adaptation of Ayn Rand's megalomaniacal The Fountainhead (1949), for instance, Neal made Dominique Francon's passion for the uncompromising architect Howard Roark (Gary Cooper) into something tangible and electric. (She even survived the hilarious camp of swooning lustfully at the lust-less Cooper while he digs symbolically into the open earth with his electric drill.) In Martin Ritt's Hud (1963), she brought a sublimated eroticism to the middle-aged housekeeper Alma, who is attracted to the lone wolf Hud (Paul Newman), but not to the point of becoming one of his conquests. Neal rarely wore her sexuality on her sleeve; instead she demonstrated the art of thoughtful suggestion. She implied that she knew more than her characters would be willing to admit. But we always knew that these women, almost world-weary, understood the world all too well and had sacrificed some part of themselves without becoming defeated in the process. This is one reason why Alma's departure from the family farm at the end of Hud isn't an admission of shame over Hud's attempted sexual assault; she'd simply rather not acquiesce to his desires. (Hud apologizes to Alma for his drunken assault, too, but not for his attraction to her.)

Over the years, Patricia Neal's movies would turn up on television, where you would find memorable glimpses of her in unmemorable pictures: her wounded desire for Curt Jurgens in the awful Psyche 59 (1964), or the sexual hunger of her spinster in The Night Digger (1971), which was also made notable by Bernard Herrmann's typically dirge-like score. But I might have forgotten my secret passion for Patricia Neal, assuming she'd simply vanished into movie lore, had Robert Altman not cast her in Cookie's Fortune (1999). In the picture, Neal played Jewel-Mae "Cookie" Orcutt, a wealthy dowager in a small Mississippi town. Having grown tired of being alone, she takes a pistol from her late husband's cabinet and kills herself. When she is discovered by her niece (Glenn Close) and her younger sister (Julianne Moore), they plot to make the suicide look like a murder in order to preserve the family reputation – and, of course, inherit the spoils.

Cookie's Fortune

Neal's part in Cookie's Fortune is small, but not negligible. She casts a looming shadow over the rest of the movie. For Cookie's suicide is not portrayed as an act of self-loathing, but rather a release from a life that had grown less meaningful. Looking more fragile, Neal brought a certain dignity to the character which contrasted sharply with the lack of dignity expressed by her living survivors. She may have actually drawn part of that experience from her life. In the early Sixties, she and her husband, author Roald Dahl (who adapted The Night Digger), had suffered through grievous injuries. Their son Theo, four months old, suffered brain damage when his baby carriage was struck by a taxicab in New York City. In 1962, their daughter, Olivia, died at age 7 from measles encephalitis. While pregnant in 1965, Neal suffered three burst cerebral aneurysms and was in a coma for three weeks. It was Dahl who directed her rehabilitation, helping her relearn to walk and talk. Patricia Neal certainly knew suffering, but in her roles, she brought forth a dignified hunger to survive. In Cookie's Fortune, even her suicide failed to kill her memory and her lasting legacy. Her absence continued to dominate the picture. Patricia Neal's best films bring forth the same lasting riches and they yield the same fortune.

-- August 12/10

Shining On: Celebrating John Lennon's 70th Birthday



Perhaps more than his songs, I've always been intrigued by the sound of John Lennon's voice. I can't think of any singer who had such an ineradicable way of intertwining pleasure and pain. When he sang "Eight Days a Week," you could feel the ache in his voice as much as you could experience that soaring, shining quality that helped you transcend the pain. It didn't matter whether he was cutting loose with Chuck Berry's "Rock and Roll Music," expressing vulnerability in the ballad "If I Fell," or conveying the surreal sensation of pure memory in "Strawberry Fields Forever," you couldn't resist giving yourself over to the timbre of his voice. In his best songs, he made the song's sentiments seem real and true; not only his, but ours to share.

Even in weaker material, as on his 1973 Mind Games album, his voice lifted many of the tracks beyond the shabby writing. "Out the Blue," one of his many songs for Yoko Ono, may be yet another track that expresses his gratitude for having her in his life, but it also reveals more of his desperation than many of the others. (They were breaking up as it was being recorded.) The naked longing snaking through lines like, "All my life has been a long, slow knife/I was born just to get to you/Anyway, I survived/Long enough to make you my wife," is charged with the electricity of a man who knows he can only reach out when he reveals the full depth of his need. The song's power isn't simply in its words, but in how those words are delivered.

Listening to the newly released stripped-down version of his and Yoko's 1980 album Double Fantasy, where she eliminated all the echo from his voice, the purity of its sound is startling to encounter. Lennon seems to be whispering in your ear with the heartfelt need of having you share in what he has to say. All that earlier reverb had masked the doubt and uncertainty of a man who had been away too long. Now we can hear in his voice an artist who is not sure what he has to say to an audience that has been waiting so long to hear him again.

It was his voice that also hooked me a few nights after his murder during a memorial tribute at Toronto City Hall. Lennon's death had already created some uncanny parallels. The Beatles had arrived in New York in 1964 to rekindle the utopian spirit of a nation grieving the assassination of their idealistic President. Now, in 1980, in New York City, the assassination of one of The Beatles sent Sixties' idealists into total despair. At our candlelight vigil, people sang songs and listened to Beatles stories recounted by local DJ John Donabie and British pop and blues singer Long John Baldry. We shivered in the December cold and shared our grief.

While most of the evening is a blur now, I can still clearly remember when the shock of the event turned to acknowledging the sorrow. As the public address system began to play Lennon's "A Day in the Life," the crowd began to sing along. But it wasn't the words of the song, so choice for the occasion, that put me in touch with the genuine sense of loss that I felt. For me, it was the moment toward the end, after Paul McCartney describes himself falling into a dream, when Lennon returns with his familiar wailing voice. Once again, that instrument of great anguish and great joy pierced the night air. As I tried to hum along, the joy I felt hearing that part of the song – always my favourite section – gave way to tears.

I found myself weeping not just for Lennon, but for all the friendships I'd formed because of the Fab Four's music. Pleasure and pain, the essence of John Lennon's voice, was something I heard in this powerful song about a man standing alone in the society he felt estranged from. No doubt, as people celebrate Lennon's life today, the sense of loss will run as deep as the joy expressed in his music. It's perhaps just as John Lennon had always intended it.

-- October 9/10

Therafields farm in the Seventies

Grant Goodbrand's Therafields: The Rise and Fall of Lea Hindley-Smith’s Psychoanalytic Commune (ECW Press, 2010), the story of one of the largest and most influential therapeutic communes of the Sixties and Seventies, is an absorbing, insightful and contemplative study of the failure of good intentions. Therafields, an experimental psychotherapeutic collective, was formed by the British-born lay therapist Lea Hindley-Smith in the mid-Sixties. The commune was part of that period’s utopian spirit to create an alternate society, one which, by the end of the Seventies, came apart in division, death and suicide. “The experiment had ended in tragedies and bitter animosity, traumatically turning friend against friend in ruptures that never healed,” Goodbrand writes. Therafields might have been sparked by an egalitarian impulse, but it was one that was undone by false expectations, fantasies, idolatry and promises that couldn’t be kept. In Therafields, though, Grant Goodbrand keeps his own promise by trying to heal the breach in that history. “At the age of fourteen I read that only the self-examined life was worth living,” he writes in the introduction. You could say that Therafields is something of a testament to that view. Goodbrand combines memoir, social history and the psychotherapist’s favourite tool of analysis in order to tell the story of how Lea Hindley-Smith’s attempt to use the healing powers of psychoanalysis to create the foundation for social change came apart at the seams. Goodbrand, who was one of the first therapists trained by Lea, begins his story with Lea Hindley-Smith's early years in England. Right away, we see how the seeds of Therafields’ undoing truly began with her damaged childhood and a loveless marriage that overshadowed her ambitions.

By the time she began doing therapy in Toronto in the early Sixties, Lea had already created an eclectic fusion of Freud, the radical body-work of Wilhelm Reich, and the incisive style of Robert Lindner (The Fifty-Minute Hour) – who “held that the potential for evolutionary change within human beings was dependent on a rebellious instinct” – to provide a therapeutic model that was less theoretical than it was flexible to the needs of the individual. She soon started attracting a number of nuns and priests who were influenced by reform at the Vatican in 1962. “Pope John XXIII set an agenda in which the 2,500 bishops could debate, disagree and then eventually make decisions,” Goodbrand explains. “In short, he effectively relinquished some of his absolute power to the collective.” This enabled Lea to treat and eventually train these clergy into a group of psychotherapists who could then treat others. But since the Sixties counter-culture was also on the horizon, other idealists were drawn to Lea with the hope of creating new models of living – one of those idealists was Canadian poet bp Nichol, who, along with Lea's son Rob, shored up the communal aspirations. By the time houses were being purchased in Toronto’s Annex, along with a farm in Mono Mills, office buildings and a beach house in Florida, Therafields had grown into a living experiment where, at its height, 900 people were involved in creating a therapeutic community.

In the book, Goodbrand traces the rival streams of thought that would ultimately clash: the religious group, who simply wanted to do individual therapy, and the younger secular folks, who wanted to start a new social revolution. Grant suggests that the ultimate failure of Therafields was a combination of the unresolved tension between these two groups and the economic changes of the Seventies, which inspired more self-interest than collective dreaming. But the story of Lea also suggests another aspect. Due to her illness (Lea suffered from diabetes for many years) and a grandiloquence that led to unchecked veneration, Therafields always suffered from a false perception of utopia. Instead of creating a community that integrated itself into the world, Therafields separated itself from the world by creating an umbrella that protected people from the neurosis of everyday life. The only trouble was: the neurosis of everyday life quickly bled into this insulated community and dramatically contributed to its demise. That isolation grew out of Lea's personality, as we come to learn towards the end, when we discover that she wasn't entirely who she claimed to be.

Therafields: The Rise and Fall of Lea Hindley-Smith’s Psychoanalytic Commune is a pretty sobering read that presents all sides fairly and gives room to all voices. But that is also part of the book’s sole, yet honest, flaw. By becoming largely an observer who creates room for the divisive and inspiring voices to have their say, Grant becomes more of a witness than a participant. Outside of a funny and revealing anecdote about when he first enters a therapy house in crisis, I missed hearing more of how he was affected by the Therafields experience and what it did to his life. The subjective voice of Therafields appears to belong to bp Nichol, whose epic poem The Martyrology lovingly and wistfully traces the arc of the book’s narrative.

Back in 1974, I was a nineteen-year-old kid fresh out of high school in Oshawa, Ontario when I first heard of Therafields. Although I’d been living on my own for the previous two years, supporting myself with part-time jobs and student welfare, I felt lost without any sense of what I wanted to do with my life. With the help of one of my high school teachers, I came to meet Grant Goodbrand, who brought me into Therafields. For the next five years, I took part in talk psychotherapy with a terrific therapist who Grant had recommended. I also did dynamic psychodrama in group therapy, engaged in work therapy by helping build the properties that would house Therafields, and I lived in one of those properties in what was called House Group Therapy. All told, the experience quite literally saved my life and I gained many friends who remain so to this day.

The Therafields experience was perhaps more enriching for me than it was for some others because I never really bought into the alternate community dream. (I did, though, enjoy many of the people I met who shared it.) But I wanted to find my own way to live in the world, not create an alternate reality that would help me feel safer from it. For me, therapy provided only a mirror for self-reflection. The society outside was where I did the real therapeutic work as I learned to live with its joys and frustrations. Therafields: The Rise and Fall of Lea Hindley-Smith’s Psychoanalytic Commune is an important and honest history of a promise that couldn't be kept. It not only does justice to the true spirit of that promise, Grant Goodbrand also bravely honors its failings.

-- October 17/10

The Ronettes

The other morning while having breakfast, I put my MP3 player on shuffle because I always enjoy the element of surprise. As I was preparing my coffee and cereal, I was first treated to an excerpt from Anton Webern's beautifully spacious Symphony, Op. 21, which was followed by The Channels' elegiac 1956 doo-wop song, "The Closer You Are," and then the LA punk band X with their propulsive 1982 track "Blue Spark." While it's always enjoyable to create a virtual time machine out of music, where you can be dropped any place in time, these three tracks didn't pull me out of the moment of making my breakfast. They instead added something new to the daily routine, an incongruous and appealing soundtrack which roused me from slumber. Once the brittle harmonies of John Doe and Exene Cervenka stopped their song cold, though, the next track to follow was The Ronettes' "Be My Baby." At which point, I forgot what I was doing and breakfast went into suspended animation for a little over two minutes.

Produced by Phil Spector, “Be My Baby” was a huge hit single in 1963 for The Ronettes. But more than the previous pieces of music, "Be My Baby" has the ability to jolt you out of the moment. With one of the most seductive drum openings in pop (setting aside Bob Dylan's "Like a Rolling Stone" and The Four Seasons' "Walk Like a Man"), "Be My Baby" begins with a hook that takes you immediately to dreamland, an imagined world that makes definitive claims on your idea of desire. At the core of pop music, usually, is a quest for a sound which touches a nerve, something that strikes a pleasurable chord in the listener. The best pop tends to unify the incompatible world around it, even answer a subliminal calling. For instance, when Elvis cut loose in the Fifties, he shook up a generation clearly ready to be shook. He uncorked a bottle filled with a frustrated generation's desire to stand apart from the herd. But Elvis not only transcended what came before him, he validated everything good to come later. In general, pop music is basically about the celebration and sharing of good times. When The Ronettes sang "Be My Baby," you shared the intense joy in their voices. It was overwhelming to immerse yourself in such pleasure and still not lose yourself. You could melt into their sound and still be set apart from the herd.

The Ronettes offered a kinship, a spiritual bond so rich, so generous, that they quenched a longing, a craving for something impenetrably beautiful to experience. Be my baby, NOW! they demanded with a desire that made you feel a fool to resist it. In Mean Streets (1973), Martin Scorsese used “Be My Baby” over the opening credits of his fevered thriller about a petty hoodlum who gets seduced by both the Church and the Mob. The Ronettes here get turned into sirens on the shore luring the protagonist (Harvey Keitel) into sin and guilt. What other song could have the potency to make moral rot look so seductive?

Of course, The Beatles also scaled those peaks – even elaborated on them – by building greater expectations on each song they left behind. With "Eight Days a Week," John Lennon easily convinced you that his love had the power to extend the calendar beyond the expected seven days. He did it, too, in a voice that asked – no, demanded – that those deeply expressed sentiments, be shared and requited. But all of this was a long time ago, when pop music was in its infancy and artists dreamed of romantic ideals and possibilities.

Nirvana

By the Nineties, to invert John Sebastian's idyllic plea, no one believed in magic anymore. Pop artists still reached for a sound that could bond them with the listener and sum up an epoch, but the epoch no longer held the promises it once did. It was long after The Beatles' hurricane of love subsided, and the punk storm of the Seventies blew over, that Kurt Cobain of Nirvana created his own pop tempest in "Smells Like Teen Spirit." And it came right out of the dissipation of an era. For Cobain, honest feeling was being replaced by vague cynicism and glib hipness. You could hear the recoil in his voice under the rage of the clanging guitars and Dave Grohl's cannon-shot drumbeat. His aside of "Oh well...whatever...nevermind" (which echoed Peter Green of Fleetwood Mac in his "Oh Well" resignation from the vagaries of the pop world two decades earlier) was the sound of defiance being bled dry in an emotional vacuum. Yet the urgency of the music still ripped through the radio with the force of The Who. Cobain's voice was a drone of impacted rage, exploding only on the chorus. That explosion, though, brought listeners together as one.

However, Cobain didn't stand in front of the song, as Elvis did in "Hound Dog," or the way John Lennon could in "Eight Days a Week." He also didn't have the dynamism of Ronnie Spector in "Be My Baby." Cobain chased the song instead of riding it out. His sound was the exigency of emotional exhaustion, like a man in dire straits chasing a runaway bus. "Smells Like Teen Spirit" collected the ennui of its time and blasted its contents with such vigour that it may be the most joyful song ever written about joylessness. Paradoxically, the song was often misunderstood as an expression of lethargic apathy, when it was actually a wince in the face of feelings that were too painful to consider.

Whatever mood these varied pop songs (from The Ronettes to Nirvana) conveyed of their time, the largeness of their vision encompassed something already rumbling in the culture, if not already desired in the audience they reached out to. In their songs, these artists built foundations for people to dream on. Which is why when The Ronettes' "Be My Baby" finally faded into Morrissey's "The Never Played Symphonies," I immediately woke up to finally drink my coffee.

-- October 20/10

Songs We Refuse to Sing


Mayor Rob Ford

While walking home from dinner with a friend last evening, I had the Toronto civic election on my mind. This year's mayoralty race had been a bitterly fought battle between Rob Ford, a right-wing demagogue from the suburbs, and George Smitherman, a provincial Liberal Party politician, who entered the race to bring fiscal responsibility and social awareness to a metropolis whose suburban citizens were angry with outgoing Mayor David Miller. Many were enraged over high taxes, political entitlement, waste and an ill-functioning transit system. During his campaign, where he vowed to "stop the gravy train," Ford marshalled that fury into a frightening populist froth. He resembled the late comic Chris Farley on Saturday Night Live, acting out his character of the suburban Ralph Kramden, a big lug always in a state of continuous fulmination.

As I approached my apartment, the streets seemed ominously quiet. One person looking lost quickly passed me by as though she didn't actually live here, or perhaps anywhere. She seemed to be simply going from Point A to Point B. It reminded me of that scene in Invasion of the Body Snatchers (1978) when Brooke Adams wakes up to find her home city of San Francisco suddenly feeling foreign to her. Since I live downtown, where the support was largely for Smitherman, I had a sick feeling that he hadn't won. When I got home, sure enough, the news confirmed that Ford had won handily. While Smitherman gave a thoughtful, quietly gracious concession speech, revealing the heart of a man who was largely missing during the campaign, Ford came thundering into his victory party to the strains of "Eye of the Tiger." This 1982 song, performed by the American rock band Survivor, was written at the request of Sylvester Stallone for his film Rocky III. With its steroid-inflated beat and pumped-up formula lyrics ("It's the eye of the tiger/it's the thrill of the fight/rising up to the challenge of our rival"), "Eye of the Tiger" (like the movie) pummels the audience into submission. Its sole aim is to get you roaring your defiant support for the underdog fighter. Significantly, "Eye of the Tiger" didn't invite you to feel the struggle of the fighter – it told you to blindly join the herd. The song perfectly fit the arrival of Ford as he faced his delirious supporters. Here was their underdog who fought all those folks who (as Chris Farley's comic motivational speaker once said) "wouldn't amount to Jack Squat" and now he was the victorious one. But what had he won? And who wanted to sing along?

"Eye of the Tiger" isn't really a song about anything. It's pure aerobics. Yet the man who called himself a "friend to the taxpayer" (a friend who, let's be frank, would rather not have you pay any taxes at all) took to the song as if it were a sweeping soundtrack for his heroic quest to "stop the gravy train" at City Hall. In his victory speech, he called Toronto "open for business" (when were we closed?) as if cities are simply buildings for commerce and not people who choose to live there as part of a growing community. "Eye of the Tiger," with its formulaic punch and swagger, is as industrial and impersonal as the definition Ford gives Toronto. Co-writer Jim Peterik once told Songfacts, "At first, we wondered if calling it 'Eye Of The Tiger' was too obvious. We were going to call the song 'Survival.' In the rhyme scheme, you can tell we had set up 'rival' to rhyme with 'survival.' At the end of the day, we said, 'Are we nuts?' That hook is so strong, and 'rival' doesn't have to be a perfect rhyme with the word 'tiger.' We made the right choice and went with 'Eye Of The Tiger.'" The right choice also became the wrong sound. And the popularity of the song came to anticipate the quest for success that soon dominated shows like American (and Canadian) Idol, where contestants don't achieve fulfillment from the art of performance, but instead seek to be liked and accepted by the judges, as if the shows were inflated versions of Tiny Talent Time.

Yet as relentlessly exhausting as "Eye of the Tiger" is, it topped the Billboard Hot 100 singles chart for six weeks beginning July 24, 1982. The single was also number two on the Billboard Top Pop Singles of 1982 year-end chart and number one on the Cash Box Top 100 Pop Singles year-end chart. "Eye of the Tiger" was a huge international hit as well, charting in the UK, Ireland and Australia. Survivor went on to win the Grammy Award for Best Rock Performance by a Duo or Group with Vocal for "Eye of the Tiger." The song received an Academy Award nomination for Best Original Song and was voted "Favorite Song" by the People's Choice Awards (in a tie with "Truly" by Lionel Richie). Sometimes bad songs (like bad candidates) do finish first.

As the election evening wore on, however, my mind went back over a number of songs and their place in politics. David Duke, the former Grand Wizard of the Ku Klux Klan who transformed himself into a Republican in 1992, once used Tom Petty's "I Won't Back Down" as his campaign theme, until Petty (who is a Southerner from Florida) went to war to win his song back. Of course, there's also the famous story of Ronald Reagan taking on Bruce Springsteen's "Born in the USA," his angry anthem for war veterans, which Reagan turned into an affirmation of his new Morning in America. Songs often fit and don't fit the times they were made in. They've also been given a life they were sometimes never intended to have.

Monty Python's Life of Brian

Earlier this month, at Toronto's Nuit Blanche, our all-night, city-wide arts festival which we patterned after the one in Paris, I went to the TIFF Bell Lightbox Theatre to see two installations. One was called Grindbox, named after the Grindhouse exploitation movies that once played in our low-rent theatres. Curator Colin Geddes ran over three hours of trailers from a variety of exploitation films from the Seventies and Eighties and it was indeed the trashy fun that was intended. But the other installation was called Singing in the Night, which played off the popular trend of running movies like The Sound of Music (1965) for people to sing along to (an idea about as horrifying as Rob Ford being mayor). Because of this, I didn't really want to go, but it turned out to be more subversive than I anticipated.

While it began predictably with a sing-along to The Rocky Horror Picture Show (1975), very soon the mood began to change. Our host, like Joel Grey's devil-doll from Cabaret (but reborn as Richard Simmons), danced up and down the aisles with his microphone, singing enthusiastically and encouraging others to join in with him. But the songs began to change with the movies. We were now being asked to sing to Stealers Wheel's "Stuck in the Middle With You" from Reservoir Dogs (1992) along with Michael Madsen's dancing psychopath as he hacks off his hostage's ear. By the time Malcolm McDowell was raping and stomping to Gene Kelly's "Singin' in the Rain" in A Clockwork Orange (1971), the only happy traveler was our dancing and singing host. The crowd was dead silent. Singing in the Night turned the songs on the very audience singing them, leaving us to question what we were singing and why we were singing it.

The final song he chose to end his film clips was "Always Look On the Bright Side of Life" from Life of Brian (1979), where Eric Idle cheerfully hangs from the cross alongside his fellow crucifixion victims. It was a perfect ending with which to send us out to the loo. We lined up along the latrines happily inventing new lyrics while we pissed away our soft drinks. I thought about "Always Look On the Bright Side of Life" as I woke up today to consider our new Mayor. But I still can't bring myself to sing it.

-- October 26/10

The Invisible Artist: Irvin Kershner 1923-2010



When George Lucas tapped director Irvin Kershner, who died last Saturday at 87 after a three-year battle with lung cancer, to direct The Empire Strikes Back (the sequel to Star Wars), Kershner asked him, "Of all the younger guys around, all the hot-shots, why me?" Lucas replied, "Well, because you know everything a Hollywood director is supposed to know, but you're not Hollywood." Lucas wasn't being facile. Nor was he simply pandering to the veteran director. Although Irvin Kershner had been making movies in Hollywood since the late Fifties, he certainly wasn't typical Hollywood. He didn't make the most obvious commercial entertainments, but rather he examined with thoughtful consideration what constitutes commercial entertainment. Which is one reason why The Empire Strikes Back was a significant improvement over its predecessor.

If Lucas created spectacle out of the pop treadmill of space action serials, Kershner gave his own film a sumptuousness that linked it to classic fairy tale. Star Wars was content being an entertainment machine that kept the audience peaked, but The Empire Strikes Back dug deeper into the underpinnings of the story while giving the characters flesh-and-blood emotion. It was the most dramatically charged and enchanting picture of the entire series. "I like to fill up the frame with the characters' faces," Kershner once said. "There's nothing more interesting than the landscape of the human face." That landscape was often filled with the temperament of a filmmaker who seldom settled for the outlines of what the stories gave him.

In one of his first films, The Luck of Ginger Coffey (1964), an unrelenting adaptation of Brian Moore's novel about an unfulfilled Irishman in Montreal, Kershner didn't turn Moore's tale of despair into the inspirational story of a proud man who can't find his place in the world. Like Moore, he delved into James 'Ginger' Coffey's darkest frustrations, where his self-destruction and selfless pride resided. Both Robert Shaw in the lead and Mary Ure as his wife, Veronica, turned dramatic gloom into the spectacle of two sparring marital partners reaching revelation too late. Kershner would return to this theme in Loving (1970), where George Segal and Eva Marie Saint play a middle-class married couple blinded by the routine of their roles as parents and partners. Where Segal's husband hides his insecurities in a series of affairs, Eva Marie Saint becomes consumed by domesticity. What separates Loving from most other movies about romantic discord is that the couple's eventual confrontation with the reality of their married life isn't met with condescending scorn (as the couple in American Beauty would be years later). Kershner's compassion for human frailty was coupled with his strong curiosity about how human nature worked.


In The Return of a Man Called Horse (1976), the sequel to A Man Called Horse (1970), Kershner expands on the theme of the original western. A Man Called Horse told the story of a British aristocrat, John Morgan (Richard Harris), who gets captured on an American hunting trip by the Yellow Hand tribe of the Lakota Sioux. Over time, he becomes one of them until he decides at the end to return home. Kershner picks up the story after Morgan departs, when the Sioux are driven off their sacred burial ground by trappers and their Indian cohorts, leaving them spiritually lost and defeated. Their exile is powerfully contrasted with Morgan's estrangement from his own culture. Sitting in his castle estate surrounded by the artifacts of his former life as a Sioux, he realizes that he now has no true home among his own class. So he returns to America to find that many of the tribe have been massacred and the survivors put into lives of slavery. When asked by the exiled Sioux why he returned, Morgan tells them that there was "an empty place in my soul that I couldn't forget." The Return of a Man Called Horse addresses, in the context of an action western, that empty place in the soul and the complex task of filling it. The movie is about the spiritual rebirth of both the Sioux and John Morgan. While Kevin Costner would trade on this same idea in his later Dances With Wolves (1990), Kershner's film is in every way the superior one. What Costner does is merely reverse the stereotype of the savage Indian versus the noble white settler. In his picture, we are the savage race while the indigenous culture gets infantilized into nobility. Kershner's picture is about spiritual renewal, where Morgan and the Sioux both make tribal sacrifices in order to rediscover their roots.

Richard Harris in The Return of a Man Called Horse

Irvin Kershner didn't have a prolific career. (His last film was Robocop 2.) But it was a varied one. His work was never typical of any given genre. A Fine Madness (1966) was an uneven, yet spirited, tribute to bohemianism. Eyes of Laura Mars (1978) was an intelligent thriller about a serial killer who stalks the friends of a fashion photographer (Faye Dunaway) who traffics in violent fetishism. He goes for his victims' eyes (the very things we watch movies with), mirroring the prurient voyeurism of the photographer. Up the Sandbox (1972), based on Anne Roiphe's novel, examined with an eccentric comic zeal a young mother (Barbra Streisand) who is ignored by her husband, which leaves her fantasizing about a more exciting life than the one she ordinarily lives (one that includes her playing tribal fertility music, planting explosives inside the Statue of Liberty, and doing the samba with Fidel Castro). Kershner even directed a James Bond remake of Thunderball, Never Say Never Again (1983), featuring Sean Connery in his last reprise of the role (and made at Warner Brothers, outside the official Bond canon at MGM/UA), a sly and comical look at an aging 007 who's unsure that he still has the chops to do his job.

Irvin Kershner, though, did have the chops to do his job, whether he was fully recognized for it or not. Outside of The Empire Strikes Back, most of his pictures went largely unnoticed by critics and moviegoers, making him something of an invisible artist. When he died, it drew scant notice, largely because his films never inspired anything close to a cult of cinephiles studying his body of work. Perhaps because he let content determine both the style and tone of his pictures, Kershner's key strength was allowing the process of discovery to define his sensibility, leaving him outside the current trends and fashions of audiences and critics alike. But he was one of our most unassuming talents, a peerless craftsman who left us pondering "the landscape of the human face" rather than the face of the man behind the camera.

-- December 2/10

Music From the Other Side of the Fence: Remembering Don Van Vliet (Captain Beefheart) 1941-2010



Avant-garde artist and musician Don Van Vliet, otherwise known as Captain Beefheart, died early Friday at the age of 69 after a long battle with multiple sclerosis. The news was broken by the Michael Werner Art Gallery, which exhibited his abstract paintings after he left the music business in the early 1980s. With a gravel voice and a style that blended jazz, blues and abstract expressionist rock into a surreal whole, Captain Beefheart was hardly popular, but he was one of the most original voices in popular music. I first discovered him in 1969 through my love of the music of Frank Zappa. Zappa would produce Beefheart's atonal masterpiece Trout Mask Replica. Although quite a contentious album, this 1969 two-record set had far-ranging influence on both punk and alternative rock. Back in 2007, I was fortunate enough to have written a chapbook on Trout Mask Replica for the 33 1/3 music series. An excerpt appears below:

Trout Mask Replica is an album so assured in its isolated world-view that no matter how much it might alienate potential listeners, it still demands to be heard on its own terms. Yet unlike most commercial pop, Beefheart doesn't write songs to seduce an audience. We're not asked to identify with him in this music, for his songs don't represent a conventional baring of the artist's soul. Beefheart instead invites us to experience Trout Mask Replica, rather than telling us what to experience. So whoever you choose to share this strident and peculiar record with, you're always going to be on your own with it. Which is why Trout Mask Replica embodied the punk aesthetic eight years before it happened in the UK with The Sex Pistols. If the Sixties hippie culture was clannish, punks were solitary. In time, though, Trout Mask would quite naturally inspire countless other artists – from The Clash to P.J. Harvey – to find their own sound and their own voice, and show them how to walk comfortably alone in the world. Yet the record doesn't provide a map to guide you in finding your way in this world, the way other great pop records can. This album was about discovering yourself as an alien. If pop music provides a utopian spirit for audiences to share, Beefheart's utopia is the true definition of the word – nowhere – a desert island of the mind. Or as he once described it, music from the other side of the fence.


Trout Mask Replica has a way of spurning simple or easy categorization. Throughout its twenty-eight tracks, the album mixes various genres of music, including Delta blues, free jazz and expressionist lyricism. It's a scrapbook collection of songs and poems, impishly acted out with Dadaist abandon and jack-in-the-box hijinks, performed with jagged rhythms and sharp, conflicting atonal melodies. Ultimately, the record comes to raise important questions about just what constitutes musical entertainment and what an audience's relationship might be to it. "People like to hear music in tune because they hear it in tune all the time," Beefheart once told Robert Carey of the New York Reader. "I tried to break that all down on Trout Mask Replica. I made it all out of focus." It may be out of focus, but the music is never blurry. Whether it's the pure erotic sensuality of the passionate wet sex in "Neon Meate Dream of a Octafish"; or the abstract a cappella recitation of "The Dust Blows Forward 'n the Dust Blows Back," which seems to conjure up a Walt Whitman poem after it has been soaked in hillbilly booze; or "Dachau Blues," where the horror of the Holocaust gets dipped in an abstract rendering of apocalyptic gospel, Beefheart openly welcomes listeners to hear him rail against a world that is often at odds with his own distinct brand of humanism.

However, Beefheart's most radical move, with the firm assistance of drummer John French, who transcribed and taught the music to the band, was to remove from his compositions the security of harmony ("the mother's heartbeat," according to Beefheart), where we traditionally seek a warm spot in the songs we come to love. There may be no lulling melodies to draw us into the musical canvas of Trout Mask, but that doesn't mean that melodies don't exist. It's just that these spiky and jagged themes are quickly gone before we can catch them on first listen. The fleeting, let's-try-it-on inventiveness of the compositions, in fact, comes across with a shocking ebullience. As a listening experience, Trout Mask Replica is the story of an artist who finds himself at his most liberated. It is a tale of one who refuses the comforts of security, yet still continues to dream of a world where man and beast can commingle in harmony. In staking that territory, from a musical standpoint, Beefheart doesn't rely on the lovely pop hooks that we ache to hear as listeners. The freedom Trout Mask offers is freedom from the familiar – the very element that often makes an album a hit, or at least, a mass audience favourite.

Although Trout Mask Replica is generally considered a landmark avant-garde rock record, it's essential to note that Beefheart and The Magic Band didn't set out to make an art statement like the Dadaists. Declarations always have a clearly defined purpose, a political intent that fixes them in time. It then makes for easy explanations and pigeonholing, too. For example, when Lou Reed made Metal Machine Music in 1975, a two-record assault featuring nothing but sonic feedback, he clearly intended to outrage fans and annoy his record company. Trout Mask doesn't set out to deliberately anger anyone, even if it ultimately does, because Beefheart sincerely wants to entertain us. The record is also not in the adventurous cast of filmmaker Stan Brakhage, who decorated the film frame in Mothlight (1963) by pasting moths' wings onto film stock and then running it through an optical printer, thereby making us aware of cinema's tactile qualities. Nor is Beefheart's record in the same world as Andy Warhol, who extended the epic form of film-making in the somnambulistic Empire (1965), where we bear witness to a static shot of the Empire State Building for some eight hours. Beefheart's effort is the exact opposite of minimalist art: it's as maximalist as music can get. Yet what ultimately makes Trout Mask a bigger artistic challenge than any of those other departures from convention is that, while it effortlessly tears apart the conventions of songwriting, it attempts it within the commercial world of pop.

Captain Beefheart and Frank Zappa

Most great albums create myths around them, and Trout Mask Replica is no different. Many rock critics (including Beefheart himself) tried to diminish Frank Zappa's role as producer on this record. But those claims, specious as they are, seem to come out of a pathological dislike of Zappa in favor of a romantic idealization of Beefheart as the hermit genius. Anyone who cares to truly listen to Trout Mask can feel the abiding spirit of both men on it. Those particularly familiar with Zappa's music, especially Uncle Meat, will hear the conceptual shape that Zappa, as producer, gave to the music on Trout Mask Replica. In the end, Trout Mask Replica is a full expression of one American artist's quest for total freedom. But it is also an expression of the tyranny of freedom. When you find yourself becoming the person you want to be, doing exactly what you want to do, sometimes freedom can't be sustained. For Beefheart, his earlier records (Safe as Milk, Strictly Personal) designed an intricate blueprint that tilted him towards Trout Mask, where he acquired the autonomy to remake rock & roll by breaking every rule in the genre. Yet even as the record caught his yearning for a new world, it was delivered with a foreboding force that stripped the ground out from under him. Whether the subsequent records were good or bad, Beefheart really had nowhere to turn after Trout Mask Replica. He could either refine its sound (Lick My Decals Off, Baby), define it for commercial consumption (Clear Spot), attempt to recreate it (Bat Chain Puller), or escape it (Bluejeans and Moonbeams). Once you find freedom, you often have to confront the fact that you can never really keep it. "Men are freest when they are most unconscious of freedom," D.H. Lawrence once wrote of Americans. "The shout is a rattling of chains, always was." Beefheart's rattling of chains became the living drama of Trout Mask Replica. His brand of freedom, in fact, raised the stakes of personal liberty for the man who envisioned it, the band who created it, and the stunned audience that would soon discover it.

-- December 18/10


                                                                   2011



The Wrong Men: Innocents in Noir Nightshade


Edward G. Robinson in The Woman in the Window

One of the cornerstones of film noir is the inevitability of fate. The deeper fear is that despite your best intentions, or your honest nature, bad things will happen to you – for no reason at all. That is, for no reason that is consciously intended. In Fritz Lang's spiraling nightmare The Woman in the Window (1944), Edward G. Robinson's meekly self-effacing Professor Richard Wanley entertains his erotic fantasies gazing at an oil portrait of Alice Reed (Joan Bennett) in a storefront window. But when he suddenly meets Reed, in the flesh, on the street, his fantasies begin to have real consequences. After killing Alice's lover in self-defense, Wanley finds himself being pursued by Heidt (Dan Duryea), an ex-cop with blackmail on his mind. It also doesn't help that Heidt was the dead man's bodyguard. Suddenly, the milquetoast professor is stewing in primal juices he'd only dabbled in with his imagination.

In John Farrow's shrewdly plotted The Big Clock (1948), magazine editor George Stroud (Ray Milland) wishes to have a nice vacation with his enduringly patient wife (Maureen O'Sullivan). But before he can pack his bags, he finds himself fired by his malevolent boss, Earl Janoth (Charles Laughton), and eventually framed for the murder of Janoth's mistress (Rita Johnson). In a moment of human weakness, Stroud had engaged Janoth's lover in a conversation about possibly blackmailing his overbearing superior. Although Stroud had no interest in carrying it out (despite how appealing it was to consider), it sets him up to be the patsy. Stroud spends most of the movie running through the publishing house escaping capture instead of drinking cocktails with his wife on the beach. (The Big Clock would be remade as a political thriller, No Way Out, with Kevin Costner as the pursued innocent in 1987.)

In Alfred Hitchcock's ingenious and perversely entertaining adaptation of Patricia Highsmith's Strangers on a Train (1951), tennis star Guy Haines (Farley Granger) is tired of his unfaithful wife, Miriam (Laura Elliott). He'd rather be married to Anne Morton (Ruth Roman), a senator's daughter, whom he truly loves. In a chance meeting on a train, he encounters Bruno Anthony (Robert Walker), a charming psychopath who knows all about Guy's problems. (He reads the gossip pages.) Bruno offers Guy a solution: As a way to commit the perfect murder, he will take care of Miriam in exchange for Guy taking out Bruno's hated father. Since nobody could connect these two strangers, who meet coincidentally on a train, it would be the ideal "criss-cross" with no recognizable motive to suspect them. Guy thinks Bruno is kidding. When Bruno fulfills his side of the bargain, however, Guy realizes it's no joke. Now Bruno insists that Guy live up to his end. Before long, he has more than tennis tournaments to consider.

Farley Granger and Robert Walker in Strangers on a Train

What makes these three movies such invaluable noirs is how each one finds the least likely man to trap in a snare of murder and deceit. But rather than treat them as innocent victims of circumstance, each story builds on a plausible recognition of guilt due to their inherent weaknesses. In Strangers on a Train, Guy simply wants out of a bad marriage so he can be with someone he loves. But his repressed hatred of Miriam leaves him susceptible to Bruno's amoral solution. George's passivity in standing up to his tyrannical boss in The Big Clock leaves him open to being suspected as a killer. Professor Wanley's decision to waltz with his libido in The Woman in the Window becomes undermined by the hidden guilt of believing that having those impulses makes him both a killer and a cheating husband.

Henry Fonda and Vera Miles in The Wrong Man

But probably the most unsettling (and least acknowledged) noir is Hitchcock's subdued 1956 thriller The Wrong Man. Based on Maxwell Anderson's The True Story of Christopher Emmanuel Balestrero, with additional influence from Herbert Brean's "A Case of Identity," a 1953 article written for Life magazine about a falsely accused man, The Wrong Man concerns the capture and incarceration of a man who doesn't even have a visible id, or any intent (conscious or not) to transgress. If the previous victims had inner shadows that were exploited by fate, Manny Balestrero (Henry Fonda) is your calmly serene family man without a devious bone in his body. Working nights as a musician at the Stork Club, Manny and his wife, Rose (Vera Miles), struggle to make ends meet, but they are a happy couple who effortlessly tune into each other's thoughts. When Rose needs some emergency dental work done, Manny tries to borrow cash on her insurance plan, making the trip to the insurance office himself to collect. The trouble is: He resembles an armed robber who once held up the office. So they call the police. From there, Manny is arrested, charged and incarcerated for the crime.

What makes The Wrong Man such an unnerving film noir is that Manny calmly co-operates with the authorities, totally confident in his innocence. He goes through witness line-ups (sometimes at other places the robber had held up before) and answers all their questions. But the more he does his duty, the deeper the insanity of legal bureaucracy and faulty perception digs him into a hole. The sequence of events ultimately costs his wife her sanity, as she begins to distrust even her own grip on reality. Perfectly cast, Fonda conveys the common decency of a man who believes in truth and justice, but those sentiments are inadequate to human and legal fallibility. Owing no small debt to Kafka, The Wrong Man isn't so much about the unacknowledged darker wishes and desires of a good man, but about a good man tainted by the unacknowledged darker perceptions of the culture he is part of. For no malevolent reason at all.

-- February 10/11

Electric Ladyland: Elizabeth Taylor 1932-2011



A few months ago, I was beginning a lecture series on the evolving role of women in Hollywood cinema from the pre-Code era of the early Thirties to the present. To begin the talk, as a way to introduce my overall theme of the enduring power of the female image, I quoted from an essay on Elizabeth Taylor by feminist scholar Camille Paglia from her book, Sex, Art, and American Culture (1992). In that piece, Paglia called Taylor "the pre-feminist woman," whom she described as a more substantial screen actress than Meryl Streep. "Cerebral Streep was the ideal high-WASP actress for the fast track yuppie era, bright, slick and self-conscious," she wrote. Elizabeth Taylor, by contrast, "instinctively understands the camera and its nonverbal intimacies." Paglia went on to write that Taylor "takes us into the liquid realms of emotion" where "economy and understatement are essential." She says that "an electric, erotic charge vibrates the space between her face and the lens."

While I couldn't claim, in technical terms, that Taylor was the better actress, I understood why Paglia preferred her to Meryl Streep. With Streep, every acting movement is highlighted in the same manner that an operatic diva's high C's are designed to get massive applause. She transforms the fluidity of human emotion into a catalogue of mannerisms: the flick of the hair, an accent, a hand gesture all become tics and inflections that call attention to her acting abilities rather than revealing more about the character she is playing. (This is why I usually prefer Streep in comedy, where she relaxes her steely control.) Elizabeth Taylor, on the other hand, is so sexually charged that she becomes (to borrow the Jimi Hendrix title) electric ladyland.

Elizabeth Taylor in National Velvet

Not everyone at the lecture was willing to concede the throne to Elizabeth Taylor. That may perhaps be due to Taylor being perceived as a movie star, while Streep is seen as an actress. But Taylor was actually both star and actress; one who had, what Paglia called, "the hyper-reality of a dream vision." When she died this week at the age of 79, Elizabeth Taylor (as that dream vision) represented the passing of an era: the classic movie star of the studio period of old Hollywood who then morphed into the new Hollywood. Over her long career, she transformed dramatically from a luminous child actress in the Forties (National Velvet) to the regal beauty of the Fifties (A Place in the Sun, Giant, Cat on a Hot Tin Roof), to both a self-conscious icon (Cleopatra) and an overbearing broad in the Sixties (Who's Afraid of Virginia Woolf?). Later in that decade, too, she harnessed her star power to play a sexually rapacious southern belle in John Huston's unjustly ignored Reflections in a Golden Eye, just as she dove beneath her glamourous mask in X, Y and Zee (1972), where she grew into, what critic Pauline Kael called, "the raucous-demanding-woman role she faked in Who's Afraid of Virginia Woolf?" She had dramatically evolved with the times. Kael described the changes she'd undergone as moving from "the fragile child with a woman's face to the fabled beauty to this great bawd."

No star, though, became a pop Queen quite like Elizabeth Taylor. While many actresses, like Garbo, retained their power through the image they portrayed on the screen, Taylor came with plenty of outside baggage from her personal life. If Garbo wished to be alone, Taylor never was. Her many marriages included their share of tragic circumstances. Just consider her short time with producer Michael Todd (who would die in the chartered plane called The Lucky Liz). Not so lucky. That featherweight actor Eddie Fisher (whom Kael once called "a vacuum on the screen" in his pairing with Taylor in Butterfield 8) couldn't last. Then came the tempest of her war of the roses with Richard Burton (where their worst romantic instincts created even worse dramatic instincts in The V.I.P.'s and Boom!). Her life became as much of a compelling melodrama as the diverse roles she sought and occupied over her career.

Elizabeth Taylor and Montgomery Clift

But despite the tabloid culture that she helped feed and nourish, Taylor righteously commanded the attention of a diverse crowd. Director John Waters (Female Trouble, Hairspray) was a huge fan. "She was a real movie star," he told critic Michael Sragow. "Right up to the end she had a great sense of humour." (She would even appear on The Simpsons.) Taylor was also the model for Divine, who was Waters' cross-dressing Grand Dame. "Look at some of the movies we made like Multiple Maniacs, and you can see we were paying tribute to her. Divine wanted to be Elizabeth Taylor."

That made perfect sense. With her intuitive empathy for gay men, from her friendship with actor Montgomery Clift, whom she played opposite in A Place in the Sun, to her raising awareness of the AIDS epidemic in the Eighties, Taylor made herself real flesh and blood beyond the iconic status she held among many gay men. (Her love for Rock Hudson, with whom she starred in Giant, led to her becoming the national chairwoman of the American Foundation for AIDS Research and, later, a founder of the Elizabeth Taylor AIDS Foundation.)

Taylor on The Simpsons

Where some stars become dwarfed (even destroyed) by the turmoil of their personal lives (Marilyn Monroe certainly comes to mind), Elizabeth Taylor was by contrast emboldened by them. She didn't shrink from the tabloid press and she didn't wilt from the excesses of her personal life. Taylor was all of a piece, whether regal or debauched, or lounging comfortably in her sensual prowess delivering a line (as she did to Marlon Brando in Reflections in a Golden Eye) like, "Have you ever been collared and dragged out into the street and thrashed by a naked woman?"

It's rare that a big movie star can also command the screen as an actor. Often the two parts are held in conflict. Brando, for example, was never comfortable being a star. But like Paul Newman, Taylor was able to integrate both parts into one whole. Newman, though, whose personal life didn't have the turbulent weather of Taylor's, was allowed the freedom to be both great actor and huge star; Taylor will always be associated with the bad marriages and tragedies. But where Meryl Streep achieved acclaim playing a parade of victims, Elizabeth Taylor barrelled through life unbowed. Nobody before (or since) will likely ever find as many ways to captivate and electrify an audience with a wink, a sneer, or a solid kick in the ass.

-- March 27/11

Dreaming: Songs of Woodstock



Back in the summer of 2009, as some of you close to my age may recall, the 40th anniversary of the Woodstock Festival was being celebrated. Looking back, it's probably clear to most of us now that it was hardly the beginnings of an idyllic community, or the heralding of a new society. But as a cultural event, no question, it was certainly something significant to note. A number of artists also wrote interesting songs about that legendary swoon in the mud: two who performed there, and another who didn’t quite make it. Creedence Clearwater Revival's "Who'll Stop the Rain" begins simply by describing the torrential rain and the crowd's determination to outlast it. But then songwriter John Fogerty, quite movingly, leaps into larger concerns at work in the country. Those concerns took in the real storms to follow in the years ahead, the sense that the freedoms sought at Woodstock were not only illusory, but that a bigger price would soon be exacted for all the frolicking. "Who'll Stop the Rain" also went on to provide both the title and emotional leitmotif of Karel Reisz's film adaptation of Robert Stone's Dog Soldiers, a story about dashed ideals, the cost of loyalty, Vietnam, and the darker implications of the drug culture in the Seventies.

Joni Mitchell

The most recognizable song about the Festival, ironically, came from the one artist who did not make it there. Joni Mitchell composed “Woodstock” even though she got trapped in the massive traffic jam. (She ended up talking about Woodstock's significance instead on The Dick Cavett Show.) But in her composition, she tried to imagine the aspirations of the Woodstock generation while simultaneously providing a cautionary tone (a tone that unfortunately didn’t make it into Crosby, Stills, Nash & Young’s celebratory cover version). Mitchell’s version left a haunting reverberation. You can hear in it an intense longing for salvation along with a fear that it may not come to pass. In her eerie dissonant gospel rendering, doves and jet bomber planes become interchangeable; peace and possible apocalypse become discomfiting bedfellows.

But one of the forgotten songs about the Festival was also rendered in a gospel style, though it doesn't warn of dark days ahead. "Lay Down (Candles in the Rain)" was written and performed by Melanie Safka (otherwise known as Melanie), an artist who is barely remembered today. Like "Who'll Stop the Rain," her tune also focused on that massive audience caught in the downpour, who lit candles during her set. But the implicit meaning of her tune reveals something quite different from Fogerty's and Mitchell's. In Melanie's song, there is a yearning for something greater than lighting wicks. She reaches for the power of divinity, the idea that a candle's light can snuff out the darkness. Despite all skepticism, the emotional power of this performance makes that spiritual release almost possible.

In her day, Melanie was the ultimate flower-child who wrote precious songs like “Brand New Key,” “Leftover Wine” and “Look What They’ve Done to My Song, Ma.” She sang in an intensely scratchy voice that was sometimes inseparable from a whining shrillness. At that time, she was also a follower of the guru Meher Baba (as was Pete Townshend of The Who, which only goes to prove that it isn't just people with acoustic guitars who are susceptible to gurus). So Melanie approached the studio recording of "Lay Down (Candles in the Rain)" as if she had encountered divine revelation at Woodstock. (And to drive that sentiment home, she was backed up by the majestic Edwin Hawkins Singers.) While her lyrics had little of the ambiguity of Joni Mitchell’s "Woodstock," the sheer force of Melanie’s performance turns "Lay Down" into an ecstatic experience, one of unbridled and defiant joy at being unleashed. The idealism expressed here was fueled by the desperate need to have it all come true.


Of course, it didn’t all come true – certainly not the aspirations of Woodstock. So in 1991, Melanie was on the road again. She performed the song that year on a Nashville television program. A lot had changed since Woodstock. She had renounced Meher Baba and had become a libertarian. Melanie still did the occasional tour, but she was now a devoted mother. Her performance of “Lay Down (Candles in the Rain)” that evening revealed a truly different side of the song: a beautifully melancholic acceptance of those ideals that pass with youth. She didn’t cynically renounce the track’s sentiments, exactly, or turn it into a blatant act of nostalgia. Rather, Melanie made her signature song a wistful lament for the youth nation that didn’t find its own divine revelation. She did this without a hint of disillusionment in her rendering; in fact, she seemed at peace, even happy. Melanie sang as if she fondly remembered the moment when this tune said all it could say about what a young woman once dreamed in front of a thunderous crowd in the rain. More than two decades after the fact, in times more jaded than those she sang about, Melanie turned "Lay Down" into a lament, a song whose passions couldn't find the lasting fulfillment they promised. But she didn't lie about those hopes either.

Melanie is about to make an appearance at Koerner Hall in Toronto in May. The question remains: Is there anything more that she can possibly wring from this track? Who knows? "You can't compete with the power of gospel music," singer/songwriter Randy Newman once told writer Paul Zollo. "It almost makes you wonder. Not quite, but almost. You can combat it with reason, which is dry. It's not musical. Music's not reasonable." Which is why, despite the sometimes cloying nature of Melanie's other material, "Lay Down (Candles in the Rain)" does indeed defy reason.

-- April 6/11

Dropping Out of Time: Marco Tullio Giordana's The Best of Youth


Jasmine Trinca as Georgia in The Best of Youth

Back in 2003, I was in the midst of attending early press screenings of the Toronto International Film Festival for Boxoffice Magazine. Although the Festival officially begins in early September, the work starts for most film journalists in mid-August. Since Boxoffice is also a trade publication (like Variety), we often had to see a fair number of movies at each Festival. One of the films I was assigned that year was an Italian picture called The Best of Youth (La meglio gioventù). Little did I realize that the movie was over six hours long. Little did I realize that it would also become one of the most satisfying movie experiences I would have in over thirty years of reviewing films.

It wasn't until I was planning my daily schedule that I realized The Best of Youth was an epic. I thought that maybe its running time was a typo. So I called my editor in Los Angeles to ask her if it was true. She was also surprised at the length. "So why are we reviewing this?" I asked. "Apparently, Miramax has it and is planning to open it," she informed me. "What American distributor opens six-hour movies anymore? If this picture is bad, I'll be tapped out for the rest of the Festival," I explained with the hope that I could get out of this. When you are reviewing so many movies over a month of intensive film-going and writing, you need to pace yourself. A bad movie can quickly turn you into a walking corpse. My editor immediately arrived at a solution. "I'll tell you what," she perked up. "Go see the movie. If it's really awful, just bail on it and we'll review it when it opens in L.A. later in the year. But if it's good, you've got your review." It seemed a fair deal. So I looked at the write-up in the Festival book and the story sounded intriguing. But I didn't know the director, or any of the actors, except one whose name I seemed to recall from Bernardo Bertolucci's 1964 debut Before the Revolution. I went to the screening with absolutely no expectations.

Jasmine Trinca, Luigi Lo Cascio and Alessio Boni

From the movie's opening moments, I was so completely drawn into the director's vision of how youthful idealism is both tested and sustained that, by its conclusion, I could barely move from my seat because I was so emotionally overcome. I could remember only two other occasions when that had ever happened: once towards the end of Satyajit Ray's The World of Apu (1959), which was also, curiously, about the testing and sustaining of ideals; and again, with tears streaming down my face, in the final moments of John Huston's 1987 adaptation of James Joyce's The Dead. The Best of Youth is a finely textured story that traces the lives of two distinctly different brothers from a middle-class Italian family. The film is divided into two parts. It begins with their graduation from university in the hopeful period of 1966, follows them through the politically turbulent Seventies, and then into their middle age in the early Eighties. Part Two begins with Italy's 1982 World Cup victory and follows their lives through the Nineties to the present day, as Italy tries to rebuild itself into a more modern nation.

While it is shaped episodically like a sweeping family saga, The Best of Youth is also a powerfully stirring coming-of-age story. Nicola (Luigi Lo Cascio) is a hopeful university graduate who discovers that he is open to the adventures and surprises that life offers him. His brother, Matteo (Alessio Boni), however, is a deeply unhappy family prodigy who ultimately seeks refuge in security and order. During the summer of their graduation, an encounter with Giorgia (Jasmine Trinca), a disturbed and vibrant young woman who has been institutionalized by her father, changes the course of all their lives. The Best of Youth contrasts, over time, the way historical events intersect with (and alter) the individual fates of the characters.


Although The Best of Youth was originally financed by – and for – Italian television, it has an intimacy and sensibility unlike most television movies. The story may be episodic, taking in the famous flood in Florence in the Sixties, Sicily's struggle against the Mafia, and the terrorism of the Red Brigades, but the characters add depth and meaning to those troubling periods. You wouldn't call The Best of Youth radical in terms of its technique, but with an informal realism, director Marco Tullio Giordana paints a luxurious and loving portrait of people coming to terms with their history. There are such beautifully modulated performances from this vast cast of compelling characters that The Best of Youth, like the best of Jean Renoir, Satyajit Ray, Luchino Visconti and François Truffaut (parts of Georges Delerue's poignant score for Jules and Jim are used), becomes a profoundly humanistic and affirmative experience.

I think maybe critic Roger Ebert put it best when he said, "Every review of The Best of Youth begins with the information that it is six hours long. No good movie is too long, just as no bad movie is short enough. I dropped out of time and was carried along by the narrative flow; when the film was over, I had no particular desire to leave the theatre and would have happily stayed another three hours." He's right. I could have also stayed another three hours. We sometimes forget that there are many great long epics, from Greed to the first two Godfather films; movies that transcend their genre and find a place where, as Ebert says, you can drop out of time.

-- May 26/11

Walking into History: Jennifer Stoddart & Paul Fusco's 1000 Pictures: RFK's Last Journey


Robert Kennedy's Presidential campaign in 1968

When Robert Kennedy was assassinated in June 1968, while campaigning to be the Democratic Party's choice for President, you could feel the air going out of the culture. At the time, I was a Grade 8 student about to write my final exams. But when I woke the morning after the California primary to find that he had won it and been mortally wounded by an assassin, I walked to school and promptly failed every one of them. Getting into high school just didn't seem to matter anymore. JFK's assassination might have been a seismic shock to the system in 1963, but this was a murder that curdled and darkened the nation.

Even though I was a Canadian, I was fervently following RFK's campaign as if caught up in the passion of a political dream, the quixotic idea that one could remake a country. Since Martin Luther King Jr. had just been murdered that spring, it seemed even more urgent that those ideals be realized through Robert Kennedy. Kennedy seemed to galvanize the nation by imagining a country built on the inclusion of its citizens, where rich and poor, black, white and Mexican-American alike could share in its possibilities. People would line the streets daily during that campaign, clamoring to shake his hand, stepping forward as if they were walking into history, wanting to be a part of its making. There was a true sense, even with the horrible war going on in Vietnam, that the country could still be truly remade into something resembling the ideals set forth in its founding documents. In the absolute worst of times, you felt a keen sense of anticipation. But RFK's death seemed to kill any desire to hope for anything better.

I realized that I was living in a different country with a different dream in the early glow of the Pierre Trudeau years. But still something in my political spirit changed that day, as it did for many others who either turned to violent revolution, or turned inward instead. In the documentary 1000 Pictures: RFK's Last Journey (currently on HBO), director Jennifer Stoddart revisits that hope and uncovers the painful despair that followed that tragic event. After Kennedy's funeral in New York on June 8, his body was taken back to Washington by train to be buried in Arlington Cemetery next to JFK. During the journey, people who had once lined railway stations and sidewalks to commune with Robert Kennedy now lined the tracks again to bid him farewell. On that trip, photographer Paul Fusco took photos from inside the train of all those he passed and caught them fleetingly, as if they were a twinkle in his eye.

(Photo by Paul Fusco)

While incorporating Fusco's photos, Stoddart traces that sojourn from New York to Washington by also talking to some of those people today who were part of that walk into history on that sad day. Their grief is not only still palpable, but we also get a glimpse of how lives were either further shaped by the events of that moment, or how the events in a number of lives intersected with the moment that train went by. One photo captures Joe Fausti, of Trenton, New Jersey, who was 18 years old on that day. While gathering with friends to pay his respects, he raised his hand to wave at a helicopter over his head and blindly touched a high-tension wire, which shot 35,000 volts through his body. "I burst into flames," he tells Stoddart. "The electricity entered my hand, went through my body, and exited my left ankle." As he ponders Fusco's photo of him lying on the ground, wrapped in the various clothes people used to put out the flames consuming him, he describes how it gives him "goosebumps" now, recognizing not only how he might have died that day, but also recalling the painful skin grafts needed to heal the damage.

Charles Maurone, who was a Democratic party chairman in Pennsylvania in 1968, seems caught in a perpetual time warp as he ponders the photo taken of him that day. “It was the end of something that hardly even began,” he tells Stoddart, as if looking for a period to a sentence that has no conclusion. When the train got to Philadelphia, Fusco took a shot of Sedrick Robinson, who was a young black teen at the time. Speaking to Stoddart in the film, he describes the city where both he and his country were born. “[It was a] highly segregated city at the time,” he explains while we see him tucked into the clusters of very distinctly separated groups of blacks and whites. “These memories brought back no hope, no food, going to school with no lunch.” While taking in the photo, Robinson also looks ahead in time, considering those in the picture who wouldn't get out of the world alive due to violence and drugs. “I was just one who indulged in it. It was just a blessing from God how I got out of it.”


Perhaps the most moving moment comes from Vanessa Chambers, who spots her boyfriend Tootie not far from her. “He was a very sweet guy, very quiet,” she recalls sadly. Although they soon experienced the joy of having a child together, Tootie would later be killed. “I’m not really sure of all the circumstances, but I know he was shot in the chest and he died on the way to the hospital.” The poignancy of the moment comes when she adds, “This is the only picture with both of us in it, just like you’re frozen in time.” The experience of watching 1000 Pictures: RFK's Last Journey is like the thawing of a deep freeze. Some people pin their memories of that day to a high school reunion; another recalls the pain of losing relatives who were struck and killed by a passing train on the other track while they waited for the late Senator to arrive. Throughout the film, Stoddart also includes Kennedy's voice from some of his campaign speeches, as if to underline how the yearning in his voice invoked a pining in those who came to wish him farewell. In doing so, she also reminds us of a time when the desire for social change was a calling.

Who knows why Kennedy's body was taken back to Washington by slow train. Nobody in the film – especially his former press secretary Frank Mankiewicz – seems to know. “I had no idea people would gather or stand by the side of the train as it went by,” he recalls. But the image of the train has huge symbolic value. The train carries a wistful, even regretful quality in American popular songs like Johnny Cash's "Folsom Prison Blues"; it can also inspire a yelp of defiance in Elvis Presley's voice as he transforms Junior Parker's mournful "Mystery Train," about a train that takes his girl away from him for good. In 1991, I travelled by train throughout the United States during the first Gulf War to trace the substance of the country in a time of war. This mode of travel not only linked me to disparate sections of the nation, but also to different interpretations of its history.

FDR Funeral Train at Clemson Station

But Kennedy's journey also evoked another image from a moment when a train seemed to link a country in grief. After President Franklin Delano Roosevelt died in April 1945, his body left by train from Warm Springs, Georgia, where he had been visiting a health spa to ease his ailing and paralyzed body. It took him to his home in Hyde Park, New York, as people then lined the tracks to bid him goodbye. But while Roosevelt's journey reminded people of the cost exacted by a World War about to end (not to mention how its longest-serving President had taken the nation through the horrors of the Great Depression), Kennedy's train had a different impact. It reminds us now of a time that never happened, a dream that was never realized, a hope that was cruelly dashed. Roosevelt's train culminated an era of social change while Kennedy's forfeited the probability of one.

Jennifer Stoddart ends 1000 Pictures quite eloquently with the conclusion of Edward Kennedy's funeral eulogy for his dead brother. As we watch the still photos of the Arlington burial, held in the dark of night, Edward Kennedy's voice reminds us of what RFK once believed: "Some men see things as they are, and say why. I dream things that never were and say why not." 1000 Pictures: RFK's Last Journey ponders the fathomless depths of those words.

-- June 18/11

Voyeurism has always been an integral part of the appeal of motion pictures. However, over the years, the taboo of watching and staring into the lives of others was made largely acceptable by movies that didn't implicate us in our peeping. But Alfred Hitchcock and Brian De Palma changed all that. They turned that taboo of staring and watching into a dramatic strategy whereby both directors forced us to face our own perverse fantasies and forbidden desires. Where Hitchcock set out to become a master entertainer of exciting spy thrillers and dramas, De Palma questioned with ironic humour the very nature of what makes exciting drama. If Hitchcock desired (and won) a mass audience that made him one of the most highly regarded and respected commercial directors, De Palma became the opposite. He would often alienate audiences with his irreverent treatment of movie conventions and storytelling. In doing so, he deliberately (and cheerfully) undermined our desire for a happy resolution to the picture. Hitchcock may have been a genius at manipulating our responses by pulling the rug out from under our expectations in his dramas; but De Palma, in borrowing some of Hitchcock’s cinematic language (as well as the language of Buñuel, Polanski and Godard), used conventional drama to take us deeper and further into more contemporary issues of sexual fear and political unrest.

For instance, in both Hitchcock's Psycho (1960) and De Palma's Dressed to Kill (1980), the allure of voyeurism permeates the films. But they also go in different directions. In Psycho, Marion Crane (Janet Leigh) is an unhappy bank employee who is having a frustrating affair with Sam Loomis (John Gavin). One day, she impulsively steals some money she's supposed to deposit and hits the road, looking to make a new life. Who knows where that road is going, but the audience immediately feels caught up in a suspenseful heist drama where the only anxiety is whether Marion will be caught. A sudden rain storm takes her off that road into the presumed quiet shelter of the Bates Motel. While engaging in conversation with the shy, unassuming Norman Bates (Anthony Perkins), the proprietor, she begins to feel the need to cleanse herself of her earlier transgression. To do so, she will go back and face the consequences of her reckless act. First she takes a shower to wash away the guilt. But what she doesn't know (and we do) is that Norman has been watching her strip through a hole in his office wall. Before we can resolve our unease at watching him indulge his masturbation fantasy, we find ourselves watching her, too. When she steps into the shower, we invade her most vulnerable and private space. Yet we continue to watch, getting away with our unseen spying, as Norman Bates had moments before. Shortly after, when she ends up brutally murdered, stabbed to death in the shower, we are implicated in that act. The act of watching.

Janet Leigh in Psycho

Besides breaking a tradition where the lead actress usually survives a picture, Hitchcock made us party to the crime, even slyly shifting our sympathies to Norman Bates. Hitchcock also shattered our safety zone. Janet Leigh was gone before the movie was even half over. What lead actress ever gets bumped off that early, if at all? More to the point, we felt partly responsible for her death not just because we were witnesses, but also because we couldn't reach into the screen and save her. We felt like passive accomplices to both the crime and our desire to watch. In his book, The Moment of Psycho: How Alfred Hitchcock Taught America to Love Murder, critic David Thomson elaborates on this. "Right from the start, Psycho played with...darker prospects," he wrote. "[N]ow the subversive secret was out – truly this medium was prepared for an outrage in which sex and violence were no longer games but were in fact everything." Before Psycho, I don't think sex and violence were games exactly, but they were often treated moralistically so that audiences could easily separate good from evil and feel inviolate. But Psycho ended all that. "The title warned that the central character was a bit of a nut, but the deeper lesson was that the audience in its self-inflicted experiment with danger might be crazy, too," Thomson continues. "Sex and violence were ready to break out, and censorship crumpled like an old lady's parasol."

Angie Dickinson in Dressed to Kill

By 1980, though, sex and violence were everywhere in movies. There was no longer a censorship code to contain them as there was in Hitchcock's time. Sex and violence had become so pervasive, sometimes even lacking in shock, that a hack director could employ them whenever he felt that the audience was getting bored. When Brian De Palma made Dressed to Kill, he took the basic framework of Psycho and reframed it for an age where the fantasies of sex and violence – which movies now openly embraced and encouraged – could once again regain the power to shock and instill fear. Dressed to Kill begins with a shower scene (much like the famous one in Psycho), but this time, there was more going on than just our role of watching. The main character, Kate Miller (Angie Dickinson), is inside her own fantasies, soaping herself and drawing us into her most private sexual thoughts. But we don't know that in the beginning. Instead our unease is stirred because De Palma opens his film with a scene that could have come out of dozens of soft-core pornos. We soon get snapped into reality when we discover that her shower fantasy is distracting her from the miserable round of sex she's having with her husband that morning. Frustrated and tired of the mechanical humping with her mate, she speaks to her therapist (Michael Caine), who tells her to confront her frustrations with her spouse. But rather than deal with the inevitable, she seeks out a man in a museum and has a quick fling. At first what appears to be a happy coincidence turns into something horrifying – and that isn't even the murder scene (which also doesn't take place in a shower). Furthermore, the killer who stalks her has issues of his own with sexual identity.

Critic Pauline Kael called Dressed to Kill a horror comedy that cleverly dealt with our fear of sex and also described it as a movie where everyone ended up spying on everyone else (and, of course, the movie came just before the decade would fulfill all those fears). With all the Internet spying going on today, where people post videos on just about anything, Dressed to Kill might be even more prescient than when it first appeared. De Palma, though, made an essential link in 1980 between our movie fantasies and the sex fantasies of his female character. Those fantasies would ultimately lead to the reality of her murder, which led feminist groups to protest outside theatres, claiming that Brian De Palma was endorsing violence against women. "I thought that was a very naive reading of the movie because the Angie Dickinson character is darling," Kael told me in an interview in 1983. "You feel so sorry for her. Here is this sweet woman who isn't harming anyone and this hideous irony happens where she steps out and tries to have some sexual pleasure and she gets killed after it. It's so subtly funny in the way that it's handled. Somehow the feminist critics have treated it as if she's being punished for her sexual transgression. I don't think that's remotely what's going on in the movie." Film critic Steve Vineberg agreed, perhaps nailing what it was that incensed people at the time. In No Surprises, Please: Movies in the Reagan Decade, he wrote that "the link between fantasy and reality in this movie is unconscious. What makes Dressed to Kill both funny and frightening is that when the killer strikes, it's as if he were tuned into his victims' fantasies – which obviously he couldn't be; their fate is a perverted playing-out of their most sordid private thoughts." The playing-out of sordid private thoughts is the true common link between Psycho and Dressed to Kill.
But where Psycho operates within the boundaries of genre convention, Dressed to Kill reaches out to society at large; a society where movies have a way of shaping a whole way of seeing, feeling and fantasizing.

-- July 24/11

 

                                                          You’ll never get my mind right
                                                          Like two ships passing in the night
                                                         Want the same thing when we lay
                                                         Otherwise, mine’s a different way.

                                                         – Amy Winehouse, “In My Bed”


While the news that Amy Winehouse had been found dead in her apartment could hardly have seemed surprising, given her continuous struggle with substance abuse and self-destruction, not to mention her disastrous recent concert tour (which seemed to invoke any number of Hollywood melodramas you cared to call up), it still seemed unreal. As I turned to more television coverage, some writers I watched trotted out the usual clichés about “the good dying young” and the eerie coincidence of her joining “The 27 Club” (which contains other performers who died at 27, like Jimi Hendrix, Janis Joplin and Kurt Cobain), while others grappled for words to describe their grief. As I thought about Amy Winehouse, one question kept running through my mind: Is there any other pop singer who has been so visually scrutinized and has drawn such voyeuristic attention? Unlike other rock legends who met an early demise (but more in tune with Hollywood starlets like Lindsay Lohan), Winehouse’s problems became daily fodder on YouTube. We watched with both horror and fascination as she put on a lurid display of self-immolation, done with a willful defiance that trashed the aching artful touches she brought to the masochism in songs like “Love is a Losing Game” and “Wake Up Alone” (on her 2006 masterpiece Back to Black).

Whether it was Amy getting drunk with Pete Doherty (gee, there’s news), passing out in an alley, making up racist nursery rhymes, or losing touch with both her work and audience in her meltdown in Belgrade, we saw it all. And, like peeping toms, we continued to watch but could do nothing to stop it. It was as if we were wondering which of her best songs, on both her debut CD Frank (2003) and Back to Black, might fulfill themselves in her final flame-out. In fact, you could swear that she seemed to know the dynamic being played out in those online videos, where followers and fans just seemed to be waiting for the moth to be consumed by the fire. Sometimes you could watch Amy Winehouse simply taunt the camera by staring into the eyes she sensed were spying on her. Her music may have initially attracted listeners who longed to touch the depths of the despair she plunged into in “Take the Box” (“I just don’t know you, but you make me cry, where’s my kiss goodbye?”), but in the past few years the episodes of total collapse became her true follow-up album to Back to Black.

Janis Joplin

The daily chronicling of Amy Winehouse’s road to perdition was far different from the self-destruction of other pop heroes. When Jim Morrison died in his bathtub in Paris, his demise was viewed almost worshipfully. After all, people only remembered him taunting his audiences and whipping his dick out onstage in Miami. Fans saw those indulgences as performance art rather than a love affair with death. But YouTube, digital cameras and cell-phone cameras weren’t even on the horizon to capture Jim falling down drunk and not able to perform while his band, The Doors, scrambled helplessly behind him. Janis Joplin’s neurosis might have been gleaned best from her powerful reading of Big Mama Thornton’s “Ball and Chain” at the Monterey Pop Festival in 1967. But what if we also had daily video updates of Janis falling over due to her large consumption of Southern Comfort? How would her work measure up against that? The performers of the past could have some semblance of personal privacy while their music gave us windows into their troubled souls. Even Kurt Cobain, whose dark future could be read not only in Nirvana’s “Smells Like Teen Spirit,” but also in his harrowing performance of Leadbelly’s equally nightmarish “Where Did You Sleep Last Night?” from their Unplugged Concert, was spared the equivalent of a snuff film circulating on the Internet.

My friend Donald Brackett wrote a fascinating book a few years back called Dark Mirror: The Pathology of the Singer-Songwriter (Greenwood-Praeger, 2008), which examined with quick insight the dynamics between conflicted musical partners (Lennon & McCartney, Jagger & Richards and Simon & Garfunkel) as well as the solo artists who are divided within themselves (Bob Dylan, Joni Mitchell and Brian Wilson). One of his chosen performers was Amy Winehouse. Unlike those books that clinically dissect the inner torment of the artist, in Dark Mirror, Brackett wrestles with their work like a sailor caught up in a windstorm at sea. He doesn't provide easy answers, or obvious critical assessments either. He often talks rhetorically to the artists; sometimes, in a running spree, he moves through the grooves in their tracks, wringing from them the tantalizing process that drew him to their music. He does occasionally get his facts wrong, but the nuances are always dead on. He uses a fan’s zeal and curiosity not for prurient fascination, but to distill the impact of the artist’s work while sharply examining their discontent. Although Brackett is certainly a critic, he simultaneously rehabilitates the notion of what it is to be a fan. In describing Amy Winehouse, he actually asks a pretty significant question: “If the pop song evolved into the soundtrack for the last century, as it so clearly seems to have done, what does that tell us about the emotional movie we all live in?”


What indeed? His pertinent question is perched right on the cusp of the continued fascination we’ve had for the many emotional movies of Amy Winehouse that have hit the Internet in the years that followed those words he wrote in 2008. “When it comes to making your life into your material and your material into a breathing emblem of your living sorrows, Winehouse seems to have achieved a new high watermark for such rare creative transformations,” Brackett continued. “[There’s] also a new low depth for such deep and deranged personal indulgences in a private and public hell.” In essence, he’s saying that her torch songs might soon become torch movies; in fact, become the videos we’d soon witness. “Her whole frail and bony being seems to be but one sharp No,” Brackett explains. “Something tells me this particular swimming pool seems to be filled with tears, and there is a rather garish sign preventing potential casual swimmers from even considering the arc of a dive from its lyrical depths.” With the endless parade of Amy Winehouse-in-distress videos soon to come, that garish sign seemed only to entice those casual swimmers.

Perhaps our sense of regret (even guilt) that followed Amy Winehouse’s death might have been the recognition that, as viewers of this travesty, we became implicated in the artist’s most disturbingly vivid plans. In Dark Mirror, Donald Brackett quotes John Updike’s famous line about celebrity as a mask that eats into the face. But now, in this new technological world, we have become party to the munching. This is why, I suspect, there is a look of unease on some of the faces now seen mourning her. If in those unsettling videos we kept her misery alive in our mind’s eye, what will we use now that she’s gone? As her nakedly exposed songs come back on the radio and her CDs once again begin to sell, will their power ever overcome the travesty of the endless peep show that followed them?

-- August 3/11

Mirrors: 9/11 and the New Media



One night, about four years ago, I was sitting in front of my computer at work, just killing time and finishing some e-mails. As I was about to head home, a Chinese employee in her mid-thirties just happened by my office to chat. In no particular hurry to leave, I asked her to sit, and soon we began talking about her short time in Canada as well as the journey that brought her here from Beijing. Our conversation quickly got around to world affairs and some of the historical events that touched, perhaps even changed, our lives. After I rhymed off some of the key ones for me – from JFK's assassination to 9/11 – I suggested that for her the massacre of the students at Tiananmen Square in 1989 had to be a seminal event. Instead of nodding in full recognition of the terrible slaughter of that June, she looked at me with the puzzled expression of someone left out of the loop of a conversation.

Certainly I must be confusing this horror with some other place, some other country, some other time, her face told me. While I insisted on what I knew to be historical fact, she was adamant that Tiananmen Square never saw such a calamity. For her, not only had Tiananmen Square never happened, student leader Chai Ling never existed, nor did the iconic sight of the sole protester standing in front of the tank; an image that, for many, stood for both the resiliency of human defiance as well as its futility when it's up against enormous odds. In her mind, there never were such odds at stake. Her denial seemed to prove wrong the hopeful young student who, in the middle of the protests, told a BBC reporter, "What can they do to us? We have our whole future ahead of us, and we've seen it." That student obviously didn't foresee a future in which one of her fellow citizens would have no knowledge, or even recognition, of the events that prompted her to imagine a better future, a period of democratic freedom that China has yet to attain.

My only means to recover the facts of that day lay right on the computer I hadn't yet shut down. Immediately, I brought up YouTube and quickly collected dozens of videos taken by reporters, bystanders and survivors depicting the bloodshed. As she watched the screen, her face seemed to be struggling with what her eyes kept telling her. Could this be fake? Is it a movie? No. She knew somewhere within herself that despite what she had learned, what she had been told, the images on the screen were confirming something true. When the tears started flowing down her cheeks, I quickly turned off the video. My intent was not to traumatize her. On that night, my goal was to marshal, with all my powers of discrimination, what I knew to be true; or at least, to present some form of historical fact to counter something worse than cultural amnesia. Thankfully it was 2007 and there was YouTube. But when 9/11 happened, YouTube was still a few years away from being created by three former PayPal employees, so the only visual record that day was what we saw on the television news. But what we witnessed also created a hall of mirrors effect.

JFK in Dallas in 1963

Back in 1963, when JFK was assassinated in Dallas, television news was still in its infancy. The political events the medium had already covered, from debates to congressional hearings, had always been mapped out with planned shots, limited time and commercial breaks. But when those gunshots rang out in Dallas, television news scrambled to keep up, creating an improvised running narrative. In the style of later Robert Altman movies, news anchors talked over each other instead of at each other; reporters raced to get film footage (footage still wet from the chemicals in the developing lab) and television news created an on-the-spot story of that grey Friday in November. That quest for a narrative to make sense of the event continued all through the weekend. On Sunday, we could watch while a state funeral unfolded in Washington, then suddenly witness Jack Ruby snuffing out JFK's alleged assassin in the basement garage of Dallas police headquarters. The images we caught on the fly that weekend would, in short time, influence the scenarios of a number of American movies over the next four decades.

But 9/11 was different. The images we saw on television, although providing shock waves comparable to the JFK assassination, didn't create a new dramatic narrative. They seemed more like a product of the previous decade of Hollywood action films. The horrific sight of planes slamming into towering buildings and erupting into fireballs, of people scrambling for their lives from the collapsing structures, we could recall from movies like the first three Die Hard pictures, The Siege and True Lies. People were talking about the event that way even as it happened. Over and over you heard citizens saying that it was just like a movie. In a Paris news conference shortly after the terrorist attacks, director Robert Altman was promoting his new movie Gosford Park. When a reporter asked him his thoughts on 9/11, he answered that any number of movies Hollywood had made taught the terrorists exactly how to do what they did. What he didn't add was that the only difference was that this time there was no Bruce Willis or Arnold Schwarzenegger to pull us from the brink.

Today on YouTube, you'll find plenty of 9/11 stories that create narratives of that horrible day. But unlike the clear revelation that the images of Tiananmen Square provide, images which make some sense of what took place, the 9/11 videos create a fractured reality, one where anyone can invent any version of history they desire. You can find clear first-person accounts documenting what we saw on the news, or sometimes images we didn't witness. There are the 911 calls – chilling in their immediacy – of trapped people in the towers desperate for the help that never came. (One trapped New Yorker, in his last desperate moments before the building collapses on him, cries out for help even if it has to come from New Jersey.) The site is also filled with conspiracy theorists and extremists offering images that neatly eliminate the planes; or, like the hero in Antonioni's Blow-Up (1966), they zoom in on little clouds of smoke that "prove" once and for all that it was explosives that did it. There are people in these videos crying out that 9/11 was an inside job while others – with clear explanations for the collapse of the towers and Building 7 – counter them.


Television news gave some coherent and credible shape to the JFK assassination (where conspiracy theorists grew only in the wake of the official investigation), but the 9/11 imagery on the web is a virtual soapbox that offers any number of alternative versions of history by self-appointed preachers. Journalist Jonathan Kay has written perceptively about the conspiracists in his new book Among the Truthers (HarperCollins, 2011). "Conspiracy theories may be nonsense, but the disturbing habits of mind underlying them – a nihilistic distrust in government, total alienation from conventional politics, a need to reduce the world's complexity to good-versus-evil fables, the melding of secular politics and End-Is-Nigh religiosity, and a rejection of the basic tools of logic and rational discourse – have become threats across our intellectual landscape," Kay explains. Conspiracy theorists also make pronouncements rather than seek clarity. In this world of images, you hardly require governments to censor information, as they continue to do in China or Iran. People (in the name of freedom of expression) will provide the same function by creating a video that tells you that 9/11 wasn't what the official news agencies told you it was. The conspiracy historian is here to tell you that there probably were no planes, no Mohamed Atta, no real casualties on that day. "For the first time in history, ordinary people now can spread their opinions, no matter how hateful or eccentric, without them first gaining the approval of editors, publishers, broadcasters, or paying consumers," Kay continues. "Rather than bring different groups into common discussion, they instead propelled radicals into their own paranoid echo chambers."

What made it easy for me to provide the reality of Tiananmen Square – footage shot by news reporters covering an international event – was much harder to come by for 9/11. Surfing online with full access to all the information that YouTube provides also lays a trap: a visual scrapbook of multiple subjective views not open to the scrutiny of critical and skeptical minds – or informed editors and credible historians. 9/11 is a historical fact just as Tiananmen Square was. But the new media climate also teaches us, as Frank Zappa once said, that information is not knowledge.

-- September 11/11

How Much History? Paul Simon's “The Sound of Silence” at Ground Zero



In the conclusion of his 1981 book Deep Blues, his musical and cultural exploration of the Mississippi Delta blues, the late music critic Robert Palmer wrote, "How much thought...can be hidden in a few short lines of poetry? How much history can be transmitted by pressure on a guitar string?" You could spend an inordinate amount of time contemplating the depth in those very fine lines. You might even say that Palmer spent his whole book in quest of that riddle. In the new paperback edition of Blues & Chaos (Scribner, 2011) – a collection of Palmer's essays first published in hardcover two years ago – that quest is outlined in a much more literal manner, one suited to a fine music historian. The editor, Anthony DeCurtis, has thematically designed the book as a journey into the vast mystery of music itself, which includes blues, jazz, rock and world music. But he begins the book with Robert Palmer's 1975 Downbeat magazine essay, "What is American Music?" In it, Palmer claims that "American music is non-proprietary ... in that American composers (and performers) innovate and move on."

That spirit of being non-proprietary made me think of many American artists, but mostly of Woody Guthrie, who once said that he didn't write songs, but pulled them out of the air. When performing artists create a work by reaching into the air, rather than simply claiming ownership of it, they tap into the essence of exactly how much history will be transmitted from the moment they begin to perform. The artist who innovates discovers a work's meaning rather than imposing meaning on it. As an audience, we can then discover how much history is transmitted when the song begins to change the artist who created it. That's what struck me most when I heard Paul Simon begin his classic song, "The Sound of Silence," during the events at Ground Zero this past Sunday.

Most of the coverage that day, from the readings of the victims' names to the various speeches, was thoughtful and moving, as people tried to sum up the impact of a ten-year history that is still too overwhelming to fully comprehend. It was midway through the afternoon that Paul Simon, one of New York's own, stood at the running water where tall buildings were once located and began to play one of his earliest and most famous songs. Besides being the song that propelled the duo, Simon & Garfunkel, to stardom, "The Sound of Silence" also has a curious evolution that contains much of the history Robert Palmer speculated about when he first fell in love with the blues.

"The Sound of Silence" (sometimes called "Sounds of Silence") was written by Paul Simon in February 1964, shortly after the assassination of President Kennedy. The first version, heard on Simon & Garfunkel's 1964 debut album, Wednesday Morning, 3 A.M., was a quietly mournful folk song. Nothing dynamic or earth-shaking is heard here. "The main thing about playing the guitar, though, was that I was able to sit by myself and play and dream," he told Playboy years later, in 1984. The opening lines, "Hello darkness, my old friend, I've come to talk with you again," came to him, he told the magazine, as he sat in the darkness of his bathroom playing to a dripping faucet. But the song wouldn't find its place in the canon of American music until record producer Tom Wilson, who had just helped electrify the pop world with his production of Bob Dylan's 1965 epic "Like a Rolling Stone," decided to re-release "The Sound of Silence" that same year with overdubbed drums (Bobby Gregg), electric bass (Bob Bushnell) and electric guitar (Al Gorgoni). The song suddenly became an anthem that spoke to those seeking solace in a world they felt alienated from. "The Sound of Silence" would reach Number One on New Year's Day 1966. But, by the next year, nobody thought of the Kennedy assassination when they heard it played.


In 1967, director Mike Nichols was making The Graduate, his comedy about a recent university graduate, Benjamin Braddock (Dustin Hoffman), who comes home with no purpose or plans in life. He gets seduced by an older woman, Mrs. Robinson (Anne Bancroft), but falls in love with her daughter, Elaine (Katharine Ross). The movie became a counter-culture milestone that began as a funny parody of an innocent who receives carnal knowledge thanks to an unhappily married friend of his family. But the movie scored with younger audiences because it ultimately turned on Mrs. Robinson, portraying her as corrupted by her age and wealth, and held up the young graduate as morally superior for seeing through her shallowness. Of course, in the end, Benjamin gets the girl. During the shooting, Nichols became obsessed with Paul Simon's music. But since Simon was touring, he couldn't write any new material ("Mrs. Robinson" was originally about Eleanor Roosevelt and Joe DiMaggio, a whole different era, but Nichols had him change the words and title), so Nichols included "The Sound of Silence" and "Scarborough Fair." In the picture, "The Sound of Silence" is used to reinforce Benjamin's feeling of moral superiority over the rich suburban life he grew up in. The song, as observant as it was, became a statement for our times and lost its connection to the tragedy that spawned it. Through no fault of Paul Simon, "The Sound of Silence" became a condescending comment on the spiritual waste of our material world (brought on by the older generation) and an endorsement of Benjamin's detached daydreaming. Listening to the song, we could feel above it all. "The Sound of Silence" became part of another history.

But last Sunday, Paul Simon rediscovered the song and its original intent. Dressed in a suit, with a 9/11 Memorial baseball cap on, Simon began the song with that strum of the guitar that indeed transmits history. At first, he played tentatively, as if to consider what the song might now mean, certainly far removed from the smug certainties of The Graduate. As he played, he listened to the melody begin to find itself, striking a familiar chord, but with it came a whole new purpose. Paul Simon suddenly looked like he was a thousand years old. As the familiar tune emerged, Simon seemed to be searching not so much for the words, but for what this 47-year-old song could still give to this mourning nation. Simon didn't play it for the obvious reasons. Nor did he want the song to be a statement; he wanted to see how much history he could transmit by strumming that guitar again. So he began to sing, "Hello darkness, my old friend," but that darkness no longer belonged to the pampered world of Benjamin Braddock; instead it encompassed the infinite pain of a country's loss, a wound unrelieved, while offering a quiet prayer, a respite. For the first time, Simon sang the song as if he were just discovering the depths of what he wrote so long ago, a song that maybe was waiting until this very day to finally reveal its true nature. Returning "The Sound of Silence" to its original folk arrangement, Simon gave its solitary air a piercing ring of poignancy, a quality it never had before. While we watched people holding hands quietly, or being embraced as they wept, or others trying to sing along through their tears, he quietly sang:

In the naked light I saw ten thousand people, maybe more
People talking without speaking, people hearing without listening
People writing songs that voices never share, and no one dared
Disturb the sound of silence.


Simon's pained voice seemed to fill that quiet with unmistakable reverberations of regret and remembrance. As the song concluded, with its too-clever line, "the words of the prophets are written on the subway walls and tenement halls," Simon buried the cleverness in the subtle echoes of the song's coda, which reverberated in all its delicate familiarity. And then he produced two large guitar strums, like bells pealing, before he quietly walked away. For once, the song did disturb the sound of silence. There endeth the history lesson.

-- September 14/11

The Macho Imperative: The Enigma of Straw Dogs


Dustin Hoffman in Straw Dogs (1971)

In Sam Peckinpah's beautifully spacious and thematically rich western Ride the High Country (1962), two aging former lawmen and old friends, Steve Judd (Joel McCrea) and Gil Westrum (Randolph Scott), are hired to guard a gold shipment as it is delivered down from a mountain mining camp to a town below. During the trip, the men reminisce about their many years together as friends and contemplate how the times are changing (and not for the better). While Gil considers stealing the gold as one last stab at glory, he looks to Steve and inquires, "Is that what you want, Steve?" Without a moment to reflect, Steve replies, "All I want is to enter my house justified." That moral conflict, with its Biblical sense of justice and retribution, would come to define much of Peckinpah's work in the coming years, such as in The Wild Bunch (1969) and The Ballad of Cable Hogue (1970), where he continually sought that elusive house to feel justified in. By the time he made Straw Dogs in 1971, however, that home became much more literal and the conflict much less complex.

Mathematician David Sumner (Dustin Hoffman) is an American pacifist who decides to abandon his country during the anti-Vietnam War college demonstrations with his British-born wife Amy (Susan George). They return to her hometown of Wakely (a fictional village in Cornwall). From the moment they arrive, tensions mount between the couple as David takes refuge in his intellectual pursuits while his wife becomes more sexually polymorphous and draws the attention of the locals – including her former boyfriend, Charlie (Del Henney), and his buddies. As they repair David and Amy’s roof, they continually sneak looks at Amy in her braless sweaters until they concoct a way to get at her. Knowing that David is no match for their macho prowess, they take him out hunting while Charlie steals back to the house. When he tries to start up with Amy, she balks at his advances, leading him to eventually rape her. (A large part of the controversy was caused by Peckinpah's treating the rape ambiguously, as if somewhere deep down she desired Charlie’s assault because she was angry with her asexual husband. But the larger controversy came right after, when one of Charlie’s buddies showed up, sodomized Amy and terrorized her. American censors shortened the scene, thus clouding Peckinpah’s intent not to ‘glamourize’ the assault.)

When David returns home, Amy refuses to tell him about the rape, but she can’t get the horrific images out of her head. One night, as they drive home from a town social, their car strikes Niles (David Warner), the local simpleton, who has accidentally killed a young girl who was flirting with him. David and Amy provide shelter for Niles until they can get help, but Charlie and his clan want Niles delivered to their style of justice. At which point, David declares that this is his home and proceeds to defend it violently, dispatching each malcontent more viciously than the last. When Amy finally (at the urging of her struggling husband) fires the final bullets into the last home invader, David discovers his manhood and wins the new respect of his wife. The basic story in Straw Dogs (based on Gordon Williams’ novel, The Siege of Trencher’s Farm) is pure revenge fantasy, where the worm turns and the audience can cheer on the bloodlust invoked in our need for David to dispense with the human trash. (In the Vietnam years, it was common in movies like Dirty Harry and Death Wish to exploit moral indignation as a means to justify vigilante justice.) At the time, Peckinpah claimed to be after more than simple bloodlust, or exploiting moral indignation. And given the years of heated debate over the picture, it’s clear that despite the territorial imperative invoked in the story, Straw Dogs has more conflicting motives driving it.

Susan George in Straw Dogs (1971)

The late film critic Robin Wood once smartly described Straw Dogs as being about a man who was determined to “defend a home that doesn’t really exist.” (When David is driving Niles back to the village after the carnage, Niles tells him, “I don’t know my way home.” David answers, as the movie concludes, “That’s okay, I don’t either.”) Wood went on to say that “the film is a reminder that the violence is not in the action but in them.” By ‘them,’ Wood was referring to those he called the “moralistic critics” who attacked the film, perhaps like Pauline Kael who famously called Straw Dogs “the first American film that is a fascist work of art.” While fascist is a word that continues to get thrown around rather sloppily, especially today, Kael was referring specifically to the sexual fascism inherent in the material. What she saw was the macho imperative. Kael abhorred the idea of violence making a man out of the pacifist; she hated that Peckinpah held David up to ridicule until he finally proved that – deep down – his animal cunning made him once again sexually appealing to his wife. She also despised the fact that a major artist (whose work she generally loved) had done nothing more than present a view of violence and rape no more sophisticated than what was commonly voiced in bars by male drunks.

The fascism she spoke of later became the actual subject of Elvis Costello’s third album, Armed Forces in 1979 (an album he originally wanted to name Emotional Fascism), a record that critic Greil Marcus described as being about “[t]he secret, unspeakable realities of political life, realities we seem to successfully deflect or ignore, [that] rise up to force a redefinition of relationships between men and women, the essential stuff of ordinary life, on these unspeakable terms.” Perhaps that redefinition was what Peckinpah was also after in Straw Dogs but the purview was too narrow. If we had been allowed to accept the peaceful David and then get taken into the horror of what he was forced to do, Straw Dogs might have gotten at those ambiguous elements Peckinpah claimed were there. If Amy had not been this Lolita-like tease continually taunting the yokels, the rape scene might have had the intended terror of intimate invasion, of unspeakable violation. The violent conclusion might not have had such an inevitability had we understood exactly why this was a home David was dedicated to defending. Although Dustin Hoffman once said that he made the movie “because I was interested in [the] repressed violence in liberals,” that would have been clearer if the game hadn’t been rigged to make David’s violence the only response possible to the onslaught in Straw Dogs.

And yet, Peckinpah’s Straw Dogs remains a powerfully irresolvable work because you can feel the director reaching for a complexity that the story can’t possibly contain. Even the assault is not exploited for its prurient fascination like rape often is in action films, to tease the audience’s excitement and then to incite their lust for vengeance. (It instead had the dramatic tension of debased eroticism.) But if Sam Peckinpah’s Straw Dogs still has the power to incite debate and heated discussion, Rod Lurie’s new remake is pretty much forgotten upon first viewing. As critic Martin Morrow wrote recently in Toronto’s weekly paper, The Grid, by way of paraphrasing Pauline Kael, “Rod Lurie’s pointless remake retains that fascist outlook, but it’s no work of art.” It’s not even a work of substance.

James Marsden and Kate Bosworth in Rod Lurie's Straw Dogs (2011)

Although Lurie borrows the bare bones of the original movie, his version carries none of the meat of the conflicting obsessions that drove Peckinpah’s. Lurie is far too rational and sane, so he distances himself from the teeming violence in the material. As a result, the movie isn’t driven by obsession, or even fear. This Straw Dogs could have been directed by the pacifist David. Lurie relocates the story to the American South and turns David (James Marsden) into a Hollywood screenwriter and Amy (Kate Bosworth) into an actress who worked on some television shows he wrote. But from the moment David starts playing classical music and competing with the Southern country rock that the local boys are playing, we end up laughing at the picture’s obvious snobbery. Amy’s former beau Charlie (well played by Alexander Skarsgard from True Blood) turns out to be a quietly patronizing version of Southern noblesse oblige, where chivalry becomes a mask for misogyny. However, the rest of the locals come off like a Northern liberal’s paranoid cartoon of rednecks gone wild. (James Woods, as ‘the coach,’ hoots and hollers and hee-haws so much he must think he’s back in Ghosts of Mississippi.)

Although the couple’s marriage carries many of the same tensions as in the first picture, Lurie doesn’t give us a clue as to what drew them together. Kate Bosworth also can’t get a bead on Amy, because Lurie keeps altering her character. One minute she’s teasing the rednecks, the next she’s invoking positions out of Gloria Steinem. James Marsden also looks embarrassingly uncomfortable playing possum to the good ol’ boys. Unlike Dustin Hoffman, who could effectively draw on his nasal harmlessness until the trap gets sprung, Marsden’s performance becomes horribly mannered as if we’re watching a grown man trying to be a helpless boy. When he eventually resorts to violence, it comes across as unintentionally funny because he doesn’t seem overtaken by forces he didn’t know he possessed. He seems rather to be fulfilling what the screenplay had been denying him – a backbone – since the beginning of the movie.

But what’s truly perplexing is why Rod Lurie even wanted to remake Straw Dogs. The material is still controversial, and yet Lurie skirts the controversy (even in the rape scene, which lacks the primal terror of the original because, being politically correct, he denies the characters the chemistry to make it shocking). Without the emotional force of the original, Lurie’s Straw Dogs is simply a cut-and-paste revenge picture made by a man who conceives it as if he never felt vengeance in his very bones. He falls back instead on clumsy and obvious metaphors (e.g. David is writing about the battle of Stalingrad, so naturally his defense of the house mirrors his project) and the final stroke of violence is so badly staged that the audience collapses into laughter. (It’s the funniest exit for a villain in a bad movie since Billy Zane got shot in the mouth with a flare gun to conclude Dead Calm.) If Peckinpah’s Straw Dogs was steeped in the ambivalence over the Vietnam War, Lurie invokes Iraq as a passing reference to give his picture political cachet.

The true tragedy of Sam Peckinpah is that he never did get to enter his home justified. His battles with studios and producers, plus the war with his own demons, prevented him from having the kind of career his talent deserved. Straw Dogs became a distillation of the demons that gnawed at him. Rod Lurie, on the other hand, is so out of his element he wouldn’t know a demon if he met one.

-- November 27/11

The Buried Face of an Age: Hugo Ball's Flight Out of Time (1916)



Written while sitting on my couch this morning watching TV as the Occupy protesters are being evacuated from various parks around the world.

On February 5, 1916, while a world war was raging around them, a group of artists had just landed in Zurich, Switzerland, to perform in a club called the Cabaret Voltaire. Hugo Ball was a twenty-nine-year-old German poet and Catholic mystic. With him were his lover, cabaret singer Emmy Hennings; Tristan Tzara, a poet from Romania; painter Marcel Janco, Tzara's countryman; Alsatian artist Jean Arp; and a medical student named Richard Huelsenbeck, who just happened to have a thing for the drums.

Among the group, who would soon be reborn as Dadaists, Ball was devoted to Richard Wagner's concept of Gesamtkunstwerk ("total work of art"), a radical philosophy whereby a one-dimensional society could be regenerated through a totality that combined all the arts. "Our debates are a burning speech, more blatant every day, for the specific rhythm and the buried face of this age," Ball would write in his 1916 diary Flight Out of Time, which would be published by Viking in 1974. This search for a specific rhythm took form as a totality of political theater. While the owners of the cabaret looked for pleasing poems that could be read, music that could be performed and songs that could be sung – all to boost what had become a sagging clientele at the cafe – Ball and his clan had other ideas. He was looking instead for something more tantalizing: a new expressive art form that could put the shock into entertainment.

"Dada is a new tendency in art," Ball would write as a means of describing the beginnings of a movement, a movement that denied the ideological shackles of becoming one. "One can tell this from the fact that until now nobody knew anything about it, and tomorrow everyone in Zurich will be talking about it." Ball also provided a loose and playful definition of Dada. "Dada comes from the dictionary. It is terribly simple. In French, it means 'hobby horse.' In German, it means 'Goodbye,' 'Get off my back,' 'Be seeing you sometime.' In Romanian: 'Yes, indeed, you are right, that's it. But, of course, yes definitely right.' And so forth." Yet Dada was also designed to shake an audience's neatly held assumptions about the state of the world. Ball was witnessing a globe raging in chaos and blood, while a cabaret audience, seeking comfort from that discord, came to the club expecting a form of entertainment that would help them forget it all.

As he made clear in Flight Out of Time, Ball wasn't about to contribute to this style of amusement: "How does one achieve eternal bliss? By saying dada. How does one become famous? By saying dada. With a noble gesture and delicate propriety. Till one goes crazy. Till one loses consciousness. How can one get rid of everything that smacks of journalism, worms, everything nice and right, blinkered, moralistic, Europeanized, enervated? By saying dada. Dada is the world soul, dada is the pawnshop." What Ball and his group performed nightly at the Cabaret Voltaire formed the basis of an explosive act of pure absurdity. The German painter Hans Richter described the scene quite simply: "The Cabaret Voltaire was a six-piece band. Each played his own instrument, i.e. himself." Arp went even further, providing more graphic detail. "Total pandemonium. The people around us are shouting, laughing, and gesticulating. Our replies are sighs of love, volleys of hiccups, poems, moos, and miaowing of medieval Bruitists."

The audience was shocked as well as outraged. A piece of classical piano music could be rudely interrupted by a gunshot into the air, or by a snap of Huelsenbeck's snare drum. Unpredictability and lunacy quickly ripened before the audience's eyes. There was nowhere to hide, no way for them to discern the difference between the outside world and the swirling upheaval surrounding them. "Tzara is wiggling his behind like the belly of an Oriental dancer," Arp continued. "Janco is playing an invisible violin and bowing and scraping. Madame Hennings, with a Madonna face, is doing the splits. Huelsenbeck is banging away non-stop on the great drum, with Ball accompanying him on the piano, pale as a chalky ghost."

Hugo Ball in costume

But Ball still wasn't satisfied. He had something even more dramatic planned for the Cabaret. He'd been working on some phonetic verses that he called "Lautgedichte"; he wanted to shed the common dialect and communicate in pure vowels and syllables. "I shall be reading poems that are meant to dispense with conventional language, no less, and have done with it," Ball would write. "I don't want words that other people have invented... I want my own stuff, my own rhythm, and vowels and consonants too, matching the rhythm and all my own. If this pulsation is seven yards long, I want words that are seven yards long." To premiere his first 'sound poem,' Ball at first stood offstage, his legs covered in blue cardboard. He wore a collar that was gold on the outside and scarlet on the inside. Within a Cubist mask that covered his face was a pale, distressed expression that soon fell into a deathly calm. As the lights went down, Ball's voice took on the age-old measure of a requiem:

gadji beri bimba

glandridi lauli lonni cadori

gadjama bim beri glasssala

glandridi glassala tuffin i zimbrabim

blassa galassasa tuffin i zimbrabim
...

The audience was spellbound, yet puzzled, deprived of a common reaction to this hedonistic embrace of the absurd. This wasn't a language that could yet be shared; it was merely heard, still in need of being made sense of. Ball had gone fearlessly into the very source of language, the first guttural cries at birth, and the pain of waking into a new world. But given the human cost of the war surrounding them, clearly the audience didn't wish to wake up with him – it was better to sleep, and to dream and forget.

But disruptive moments of cultural upheaval are not so easily forgotten. While Dada went on to become a recognized art movement through Tristan Tzara, Ball abandoned the world stage within two years, retreated into Catholicism in 1920, and died a poor, religious man in Switzerland in 1927. In his book, Ball describes a Dadaist as someone "still so convinced of the unity of all beings, of the totality of all things, that he suffers from the dissonances." While he may have suffered those dissonances, others embraced them. The Canadian sound-poetry quartet The Four Horsemen would claim Ball's spirit (along with another Canadian sound-poetry group, Owen Sound, as well as other international poets and artists).

Laurie Anderson

In the pop world, you could hear Ball in 1973, in the electronica played by the British band Cabaret Voltaire, named after the club where it all began. You could hear him literally, of course, in the Talking Heads' song "I Zimbra" from their 1979 album Fear of Music, where they took Ball's cabaret poem and set it to an African rhythm. A few years later, in 1981, you could also hear Ball more figuratively in Laurie Anderson's "O Superman (For Massenet)." In specific terms, Anderson constructed her song as a cover of the aria "Ô Souverain, ô juge, ô père" (O Sovereign, O Judge, O Father) from Jules Massenet's 1885 opera Le Cid. But the concept of creating a language unheard came directly from Ball. Using an electronic voice decoder, Anderson overlays a phonetic loop of her repeating the spoken syllable "Ha" with an Eventide Harmonizer, while she reads her text through a vocoder. Inspired by the Tao Te Ching, Anderson says, "Cause when love is gone, there's always justice/And when justice is gone, there is always force/And when force is gone, there's always Mom." Or Dada.

"It can probably be said for us that art is not an end in itself," Ball wrote further in Flight Out of Time. "[I]t is an opportunity for true perception and criticism of the times we live in." But it's not just the obvious forms of protest that become so easily defined and devoured by TV news cameras. "What can a beautiful, harmonious poem say if nobody reads it because it has nothing to do with the feelings of the times?" Ball also went on to write. "And what can a novel have to say when it is read for culture but is really a long way from even touching on culture?" For a brief moment in time, a group of artists avoided the comfortably didactic and illuminated the lines of demarcation, the lines that still do divide the culture.

-- November 15/11



                                                                       2012


Pod Culture: Phil Kaufman's Invasion of the Body Snatchers (1979) & The Reagan Era


Brooke Adams and Donald Sutherland encounter a different Flower Power

Since the early Sixties, you could turn to almost any American film and recognize the political period that spawned it. A rousing epic like Spartacus (1960) signaled the rising hopes of the Kennedy years, just as The Manchurian Candidate (1962) foreshadowed the tragedy in Dallas the next year. In the Heat of the Night (1967) reflected the racially troubled Johnson era as much as the police thriller Bullitt (1968) indirectly brought out our ambivalence about Vietnam and the growing culture of violence at home. The subject of violence was articulated eloquently, but with a marked uncertainty, in various genre films, from the Depression-era gangster film Bonnie and Clyde (1967) to Sam Peckinpah's bloody western The Wild Bunch (1969).

The Seventies' paranoia of the Nixon years gave us both an anti-hero for the Silent Majority in Clint Eastwood's vigilante detective Dirty Harry (1971) and one for frustrated liberals in its counter-culture counterpart, Billy Jack (1971). The holistic Carter period tried to salve the country's wounds over Vietnam with the homespun nostalgia of Bound for Glory (1976) and Coming Home (1978). Yet it was the ground-breaking blockbuster Star Wars (1977) that took viewers back to a presumed innocent age. Drawing on the gloried past of Hollywood studio movie-making, Star Wars asked Americans to forget about the demons of Vietnam and Watergate. In providing a comforting creed for seeking salvation, which meant believing in the Force, Star Wars planted the seeds for the arrival of Ronald Reagan. Speaking of seeds, though, if there was one film that more presciently defined the chief characteristic of the Reagan years, and what that era would represent in the decade ahead, it was Philip Kaufman’s witty and sumptuously scary 1979 re-make of Invasion of the Body Snatchers. If the Reagan years were about refuting the Sixties by having us take refuge in the cozy nostalgia of an earlier time, where one could “sleepwalk through history” (as historian Haynes Johnson described it), Invasion of the Body Snatchers is a thrilling science-fiction classic about the dangers of nodding off.

Using as his source the 1956 black-and-white B-movie (which itself was an adaptation of Jack Finney’s fine 1955 novel The Body Snatchers), Kaufman recognized that the core idea of the story played into some very primitive fears while also creating some startling political metaphors. In the original book, which was serialized in Collier's magazine, the story centered on the small, quaint California town of Santa Mira, where one day people started hysterically complaining to Miles Bennell, the town doctor, that their loved ones were suddenly not themselves. As the doctor investigated, he discovered that extraterrestrials had invaded their quiet town, bringing giant seed pods that replicated people and took possession of their souls while they slept. In a short time, the entire town became filled with people who had lost their personalities, the qualities making them distinctly human, in favour of a bland conformity in which citizens were freed of anxiety, pain and the ability to love. The core anxiety the material tapped into, of course, was our dread that when we fall asleep we may not wake up again, or if we did, we might not be ourselves. But the novel – and the subsequent film – also came out during the Cold War era, even echoing the McCarthy period, when the fear of Communism (and the conformity that lay at the heart of that ideology) was rampant.


But like The Manchurian Candidate, which arrived a few years later, the material could be read in many different ways – it was either a fear of a communist takeover, or the bland encroachment of the homegrown banality of the Eisenhower Fifties, a blandness that was turning America into a collection of Norman Rockwell paintings. The shrewd beauty of the book's satire told us essentially that no one, from any political persuasion, could draw any self-righteous comfort from the story. Nor could Hollywood, when Don Siegel made his low-budget feature adaptation. Allied Artists ultimately cut out much of the story’s humour and tacked on a prologue and a more hopeful conclusion. In Siegel’s original version, Miles Bennell – played by Kevin McCarthy – is stranded on a highway trying to warn people who are indifferent to his ravings. The ridiculous new ending had a re-assuring ‘get-the-FBI’ endorsement. But it didn't matter. Invasion of the Body Snatchers still became a B-movie classic, a deliciously paranoid vision of losing your individuality as you go sleepwalking through history.

In Phil Kaufman’s highly original re-make, he updated the material and turned his Invasion into a hangover from the turbulent Sixties. Setting the story in his home city of San Francisco at the end of the Seventies, Kaufman (along with screenwriter W.D. Richter) built some profoundly witty variations on the themes of the original. Considering the picture’s relationship to the Sixties, the first truly funny (and ironic) part of the story is just setting it in San Francisco. Besides being resident to some of the most non-traditional and eccentric artists and personalities of the previous two decades (a city that gave rise to both the Beat Generation and the hippie movement), San Francisco has a history of protecting its individuality. So it’s a huge joke on the audience to choose the place that gave birth to the non-conformity of Flower Power and suddenly make it home to the new Flower Power of conformity. From the opening scenes, where we watch the web-like spores travelling from their home planet to Earth, and falling with the gentle rain on a normal day in the Bay area, Kaufman provides the kind of spooky nuances that invoke the cautionary SF films of the past while anticipating a new spin on the genre.

The jagged streets and distinctly poised houses also provide the perfect backdrop for a story about the dread of banality. If the original movie made us fear for the life of the picturesque small town, in Kaufman’s tale we fear the growing plants and their spidery tentacles enclosing and transforming San Francisco’s richly textured landscape. The parochial small-town attitudes of the original, too, are transcended here by the counter-culture attitudes of the city folk. In Kaufman’s version, Matthew Bennell (Donald Sutherland) is not a simple town doctor but instead works for the Department of Health. He takes personal glee in fighting the injustice of restaurants overpricing their food while tolerating rat infestations. His colleague Elizabeth (Brooke Adams) meanwhile curiously picks one of the flowers that descended on the city and investigates its origins. Elizabeth is involved with Geoffrey (Art Hindle), a rather docile dentist who can’t be bothered to take his headphones off while watching a basketball game on television in order to greet his homecoming partner. (He only rouses from his passivity when his team scores a basket.) She leaves the flower in a glass of water by the bed before they go to sleep. When she awakes in the morning, Geoffrey is now a duty-bound spore meeting strangers across the city and trading packages. (One of the great early jokes is making a dentist – those physicians often accused of dullness – the first victim of the pods.)

Leonard Nimoy, Donald Sutherland and Jeff Goldblum

That day, Elizabeth tells Matthew of the strange transformation of her lover, that Geoffrey is no longer Geoffrey, but Matthew is more inclined to see her fears in terms totally in tune with the times. By the late Seventies, with the demise of the Sixties counter-culture from both drug burn-out and psychopathic criminality (which echoed the criminality of the Nixon years right into Watergate), the social rebellion had found a new home in narcissistic New Age self-help psychology: I'm OK – You're OK. So Matthew recommends that she see his friend, the self-help psychiatrist Dr. Kibner (Leonard Nimoy), for advice rather than confront the strangeness of the fact that suddenly everywhere people are complaining that those they know are not the people they once were. The cerebral Kibner, who is hosting a book launch and quelling the anxieties of guests terrified that their partners have changed, immediately picks up that there are romantic feelings between Elizabeth and her boss. He suggests that maybe her fears about Geoffrey are rationalizations for the attraction she feels towards Matthew. In a short time, the city becomes engulfed by pod people, and Matthew and Elizabeth are running for their lives, trying to stay awake as well as wake up anyone who might respond to their warning cries.

Most science-fiction films, especially those in the Fifties, were cautionary fables about our destructive ways, and many of them (even good ones like the 1951 The Day the Earth Stood Still) became terribly obvious civics lessons. The human characters in these pictures, too, often lacked any dimension that we could possibly care about. (The actors and their dialogue were usually stiff and unconvincing.) With Invasion of the Body Snatchers, however, Kaufman catches the idiosyncratic rhythms of the city's inhabitants so that we become concerned about their fate. Besides Sutherland, who brings an off-centre warmth to Bennell, Brooke Adams provides a sunny skepticism that complements his. You believe that these two could fall in love. Jeff Goldblum and Veronica Cartwright also turn up as their eccentric counterparts, Jack and Nancy Bellicec, who run a Turkish mud bath. They have the loony non-conformist tensions of the city wrapped up in their affection for each other. Goldblum, with his tall, lanky frame, is note-perfect as a frustrated writer who hates the more successful Kibner because Jack cares about writing while Kibner cares about being a success. Cartwright matches up perfectly with her wrinkled nose, tickled by the more unorthodox life they lead. These are amazingly vivid, funny personalities who bounce verbal jokes off each other while their city turns to mediocrity. (“Where’s Kazantzakis, where’s Homer, where’s Jack London?” Jack complains at Kibner’s book launch, while Matthew, scanning the room for his beleaguered partner, asks, “Where’s Elizabeth?”) It’s perhaps a stroke of genius to cast Leonard Nimoy, who brings all the associations of both SF and his role as the logical Spock from Star Trek, as the smug Kibner. Nimoy is so measured projecting Kibner’s self-righteousness that you’re never sure if he isn’t already a pod when we first meet him.

Invasion of the Body Snatchers contains a number of sophisticated spooky gags, from a transformed dog to a series of inside jokes. Robert Duvall (who once starred as Jesse James in Kaufman’s earlier western, The Great Northfield Minnesota Raid) shows up uncredited as a priest on a schoolyard swing, but it amounts to nothing. When Kevin McCarthy from the original Invasion turns up, however, the joke has real bite. It's two decades later, and there he is, still in the streets, still warning us that they’re here. (This time the pods finally bump him off.) In this hipper version, there is also no FBI to counter the pods. These Bay Area rebels already perceive the FBI as pods.


The political metaphors in Invasion of the Body Snatchers continue to resonate even years after the movie came out. Catchphrase slogans of the Reagan years like 'Morning in America' (in other words, waking up from a dark night) eerily echo the phrase uttered by the pods in Invasion when they encourage citizens not to be trapped by “old concepts.” The movie shrewdly recognizes – even then – that the coming conservative revolution would be, in fact, a reaction to the Sixties. “[Ronald Reagan and Margaret Thatcher] did better than run against the sixties,” writes critic Greil Marcus in his recent book about the L.A. rock group The Doors. “They kept the time and the idea alive by co-opting its rhetoric, by so brilliantly taking its watchwords, or its slogans, as their own. ‘Adventure,’ ‘risk,’ ‘a new world’ – those were emblems no conservative movements have claimed since the 1930s, when the movements that did trumpet such words named themselves fascist.”

Besides the manner in which the culture turned acclimatizing and sedate in the Reagan Eighties, there was also the darker shadow of Sixties utopianism, which Invasion doesn't let off the hook either. Many may not remember that when Invasion of the Body Snatchers was playing in theatres in late December of 1978, the rotting corpses of hundreds of San Franciscan followers of local cult leader and counter-culture political activist Jim Jones lay dead in Guyana after a mass suicide. What was once a counter-culture movement attacking social injustice ended up as a terrifying resignation to an afterlife led by a zealot. How could one not view the horror of that event through the lens of Invasion? Furthermore, a month earlier, San Francisco supervisor Harvey Milk, the first openly gay elected official in California, had been assassinated, along with Mayor George Moscone, by fellow supervisor Dan White, an openly homophobic man, who would soon after his vicious crime face a prosecution that failed to put those facts before the jury. The trial therefore ended with White’s conviction on the lesser charge of voluntary manslaughter. Were the pods already walking the San Francisco streets in 1979 as well as occupying the movie screen?


Of course, Invasion of the Body Snatchers is an entertaining horror-comedy not designed self-consciously as a serious warning cry. Yet that’s what makes it all the more effective as both satire and political commentary – just like the equally audacious The Manchurian Candidate was in anticipating the events of Dallas, Texas a year before JFK was shot. But consider the on-going relevance of the movie's themes. For instance, when the first Invasion was on movie screens in the Fifties, writer and critic Philip Wylie was drawing the attention of his readers to a book of essays by Baltimore psychoanalyst Robert Lindner called Must You Conform? "The grimmest demon of our day – [is] the demand for conformity set up by the frightened men, the unfree men, the men George Orwell said would triumph by '1984,'" Wylie wrote. "Dr. Lindner draws his material from his immense experience with American men and women who have grown 'sick-minded' trying to be true to themselves in an era of rigid attitudes and senseless pressures." Lindner goes even further in his book to describe this malaise. "Our schools have become vast factories for the manufacture of robots," he says in the title essay. "We no longer send our young to them primarily to be informed and to acquire knowledge; but to be 'socialized' – which in the current semantic means to be regimented and made to conform...grades are given for the 'ability' of a child to 'adjust' to group activities, for whether he is 'liked' by others, for whether he 'enjoys' the subjects taught, for whether he 'gets along' with his schoolmates...[ultimately] revealing a cynical kind of anti-intellectualism."

Of course, the Sixties would become a decade devoted to dissolving the anti-intellectualism that Lindner refers to, by rejecting the kind of social order that turns critical thought into quiet acquiescence. The prosperity of the time also helped bring a wealth of cultural alternatives to the mainstream. But when the affluence of the Sixties turned into the severely pinched Seventies, our culture moved backward to that cynical kind of anti-intellectualism that Lindner warned of in the Fifties. Before long, our reality wasn't shaped by the outside world, a world we wished to understand and experience. The only reality we wanted to recognize was ourselves. Our self-worth was no longer determined by our ideas and thoughts, but by our status, or by whatever platform we could mount to help define and express ourselves. Worthiness was earned by emulating whoever was considered hot – and, of course, ignoring whoever wasn't. A social amnesia (where one could be popular one minute and forgotten the next) evolved out of all this navel-gazing. By the Eighties, as Ronald Reagan took power, solipsism became the yardstick by which politics, art and popular culture were measured. And he accomplished it a few years before Orwell's prediction. But Reagan had to erase the Sixties from both memory and history to accomplish this, just as the pods had to erase every shred of the individuality from each citizen.

Ronald Reagan in the Eighties

The other underlying theme of Phil Kaufman's Invasion of the Body Snatchers accounts for the critical shifts that took place in the Sixties counter-culture. As the Seventies turned into the Reagan Eighties, the desperate need in the middle class to survive (at any cost) created a new species of Yuppie careerists who had morphed right out of the counter-culture itself. And, like the folks who flocked to Dr. Kibner's book launch for answers, they also looked to the self-serving banalities of pop psychologists for answers. The spoken goal of this quest was self-knowledge, but the hidden desire was to find 'enlightened' ways to regain lost prosperity. What was sacrificed in this process was the more complex understanding of behaviour, provided by hundreds of years of literature, philosophy and psychology.

A generosity of spirit that comprised a worldview wider than the contours of our own navels was deemed expendable. Diversity of opinion began to dissolve as well in favour of a more adaptable homogeneity that could shape the public taste into something that could make dollars rather than sense. When the latest fad stopped proving lucrative, it was forgotten. We would soon witness a crippling recession, with a financial meltdown in the foreseeable future, that narrowed not only our pocketbooks, but also our tolerance for free and open debate about important issues. (The Reagan Eighties were often defined by the entrepreneurial phrase "Go For It!" This particular kind of junk-bond philosophy, which ate away at our economic infrastructure, extended itself, as well, to how we thought.) Out of a need to put the turmoil of the Vietnam era in the past, Americans began to connect to a mythology from their Puritan history. And Ronald Reagan fed into this desire for nostalgia by creating a public aura that differed greatly from the reality of the times. This is the world that Invasion of the Body Snatchers, an entertaining populist satire, anticipated and depicted at its genesis. The picture's irreverent yet poignant observations lifted Phil Kaufman's masterpiece into the ranks of the most remarkable and timely remakes in the history of movies.

-- January 8/12

A Whole Wide World Within the Grooves



By the time I was four, I had developed a promiscuous interest in music. Without understanding the meaning of the first songs I discovered, such as Frankie Laine's romantic confession "Moonlight Gambler," or Marty Robbins' fateful ballad "The Hanging Tree," I was drawn by the unusual texture of the sound in those tunes. Laine, a hyperbolic performer, used a number of strange effects in his song. A high-pitched whistle, drenched in reverb, opened the track. To my young ears that whistle seemed to be signalling forlornly to some distant train arriving into a lonely, abandoned station. It was soon followed by another voice making click-clop noises, as if a majestic horse were coming over the hill to intercept that oncoming train. And all of this was taking place before Frankie Laine opened his mouth to sing. It was clear that I was responding to more than just a song – but instead to a whole other world of sound reverberating around me, creating a spot in my imagination, and inviting me to share in the music's distinctive peculiarities. But these were my parents' and my relatives' records. I didn't really discover rock 'n' roll until my mother's cousin, Jimmy Mahon, came to live with us in 1959.

Jimmy had a huge collection of 45s, by such performers as Buddy Holly ("It Doesn't Matter Anymore"), Ronnie Hawkins & the Hawks ("Southern Love"), rockabilly artist Jack Scott ("Patsy"), the Mills Brothers ("Till Then"), some Elvis ("Don't Be Cruel"), and Little Anthony & the Imperials (their wonderfully eerie voodoo hit "Shimmy, Shimmy, Ko-Ko-Bop"). He also owned a portable 45 rpm record player with a built-in cone to stack each record on top of the others. When one finished, the next song would drop down and begin to play. I used to sit in the middle of my room, stack the singles, and while leaning against my bed, I'd listen to the songs for hours. And there it was: my first love affair with music. Of course, even though I fell hard for Jimmy's rock collection, the records and the songs were his, not mine. (All I owned was "Popeye, the Sailor Man" and "Blow the Man Down," sowing the seeds for my future love of sea shanties.) At five, I even asked myself when I would find my own music. It came four years later.


Word of The Beatles reached Canada before it reached America. In 1963, some of their songs were getting radio airplay and many of my school friends were starting to take notice. When I saw a photo of the group, four guys in matching suits and cereal-bowl haircuts, they looked too precious for words. My parents complained about their long hair and, adorned with my own razor-shorn brushcut, I found no reason to argue. But a sharp guy in my grade three class seemed to know a thing or two about music. Brian Potts was older than his years. When The Beatles were about to make their historic appearance on The Ed Sullivan Show in February 1964, Brian was telling all within earshot to watch. He already owned their new album and he promised us that their TV performance would be worth it. A contrarian even at the age of nine, I scoffed and refused to watch the show. So the next day, Brian invited me over to his house to hear With The Beatles. It was their second UK album (called Beatlemania! With The Beatles in Canada) and it had a black-and-white photo on the front cover with the group in half-profile. The picture was startling – half in light, half in shadow – and their faces revealed no desire to please anyone. The music, on the other hand, was immediately arresting. The opening track, the boldly appealing "It Won't Be Long," featured John Lennon declaring his desire to be by his lover's side, as Paul McCartney and George Harrison backed him up with affirmative "Yeahs!" I was instantly hooked.

There was a quality of feeling in this music that told you immediately that it was yours to possess. And as joyful as it was, it had a bottom end, a certain sadness at its core. The Beatles' songs seized me dramatically because the pleasure in their sound tugged at some unarticulated, buried sorrow. The seductive "All My Loving," for example, was the happiest song on the record, but it was about the singer going away, leaving his girl behind. Most popular artists, like Bobby Vee in "Take Good Care of My Baby," made it clear that there was nothing about the absence of a lover to feel happy about. Paul McCartney's song, on the other hand, made a candid promise. He'll write every day he's away, so although he's gone, don't worry, he'll be back, just as Lennon would be in "It Won't Be Long." When The Beatles covered Motown, as in their version of The Marvelettes' lovely 1961 hit "Please Mr. Postman," they act out an emotional tug-of-war. The Marvelettes coyly beg the postman for that letter, perfectly confident that the boy will come through; "Please Mr. Postman" anticipates the joy the singer will feel when that letter arrives. In The Beatles' version, John Lennon sounds like a man on death row waiting for a reprieve. The elation in his voice at the thought of this letter arriving is also paired with anguish that it may never come. This equivocal characteristic of their music wasn't a simple divide between pleasure and pain, right and wrong, or an unassuming claim asserting that pain would lead to pleasure (as the woman in "Girl" believes). The Beatles' records had transcendence in them, a belief that even in the most despairing moment, hope was possible; that even at the most painful time, enjoyment could be around the corner. With The Beatles was the first Beatles record I bought with my saved allowance in March 1964 – though my parents warned me not to get any ideas about growing my hair long.


In September, my mother bought me a ticket to The Beatles' first concert at Maple Leaf Gardens in Toronto. I was somewhat surprised since she had to put up with Brian Potts and me at the drive-in earlier that summer watching A Hard Day's Night, The Beatles' first film. Bored out of her mind, clueless to the Liverpudlian humour in the script, she also had to listen to Brian pontificate on the type of acoustic guitar Lennon was playing during "If I Fell." Nonetheless, she roamed the downtown streets seeking out a scalper so I could attend the concert. She could only get one ticket so I had to go it alone. I walked gingerly past scores of young girls screaming, tearing at their hair, their clothes, some dropping to the ground in a fainting fit in front of me. One girl I recall was hitting her head – hard – against the Gardens' brick wall, screaming out for Paul, who, of course, wasn't there to answer. He didn't yet know that the generous promise he offered in "All My Loving" would generate such desperate responses.

Inside the hockey arena, the sports palace that housed the storied Toronto Maple Leafs, my grandfather and I had watched many games together, but I was now entering it by myself for the first time to see The Beatles. It was a rite of passage. My first rock concert. Many performers took the stage before The Beatles, but in my fervent anticipation, I didn't register any of them. I was in the cheapest seats in the building, up in the grey area near the roof, so I could barely see the stage. When DJ "Jungle" Jay Nelson of local CHUM radio introduced The Beatles, the din was frighteningly intense. I knew the tunes, but I could barely hear the melodies for the screams from the crowd. An older gentleman beside me lent me his binoculars from time to time through the group's brief half-hour set. Dressed in their matching suits, like elegant bachelors at a ball, they withstood the barrage from the crowd. Singing "She Loves You," they dug their black heels into the floor, as if fighting back hurricane winds, yet smiling happily, knowing that the vivacity of that song could match, perhaps even surpass, the devoted shouts the song earned. At times, it was as much a battle of wits between the band and the audience as a concert, and I particularly remember the excitement during The Beatles' performance of "Please Please Me." The plea from the stage was to come on, share this love, an urgent appeal that was greeted with a desire in the crowd to become one with this young and exciting group. Nothing could compare to the emotional force in the stadium that afternoon. As The Beatles left the stage, once they had completed their mandatory mannerly bows, the air was thick with exhilaration. My ears rang for days.

The Beatles at Maple Leaf Gardens in 1964

When I saw them again at Maple Leaf Gardens in 1965, the music was becoming a little more sophisticated and I was sitting a lot closer, on the floor, and mere feet from the stage. With the majority of the screaming behind me, the intensity of the music was now before me. Lennon played a Hammond organ at this show, and as he sang Larry Williams' exuberant 1958 bar boogie "Dizzy Miss Lizzy," he'd alternate his fingers and his elbows, madly stroking the keyboard. Yet as exciting as the concert was, it lacked the impact of the 1964 performance because it was clear to me – even at my callow age – that The Beatles' show was starting to become routine. Certain of what to expect from the crowd, they knew now what to feed us. Yet despite the predictability of concerts like this one, they were never content to repeat their success when it came to their albums. As they scrambled through their second film, the James Bond parody Help!, they also released the introspective Rubber Soul, and later offered us the dazzling eclecticism of Revolver. On these records, The Beatles challenged us to hear music in new and exciting ways. The artistic risks they were taking in the studio were replacing the excitement once heard in their concerts. In 1966, their most dangerous risk, however, took the form of John Lennon's remark about The Beatles' being more popular than Jesus Christ. Despite the outcry over his comment, what Lennon argued was nothing more than a simple truth: popular culture, the paganism of the Sixties, was becoming the new religion.


In 1966, the year of their final tour, the violence that had been lurking under the surface of the happily screaming throngs fully materialized. Facing death threats in Japan for daring to play at the Budokan, assaulted in the Philippines for snubbing Imelda Marcos's state dinner, their albums burned in the American Bible Belt as a protest against Lennon's Jesus remark, The Beatles were now targets of hatred rather than just entertainers of adoring crowds. When I saw the group for the last time at Maple Leaf Gardens in 1966, I sensed a different dynamic in the audience. The spontaneity of earlier shows was no longer present in either the crowd or the group. For that show, I decided to bring a tape recorder my parents had recently bought me. Nobody bothered checking my machine since bootleg albums were a concern for the future, and copyright infringement was an issue for publishers and libraries only. My seat wasn't quite as good as it was in 1965, but I was close enough to get a reasonable recording of the show. The screaming was also nowhere near the pitch it had been during the heights of Beatlemania. If there was frenzy in the audience, it seemed rehearsed, as if people were acting out roles in a movie reeling in their minds, and no longer responding to the performance on stage.

I took my friend Doug Smith to the show and afterward, as we were exiting, the crowd turned towards us, thinking they saw The Beatles, and began stampeding. I quickly pulled myself to the wall, clutching my tape recorder for fear it would get broken. Unfortunately, I wasn't as cautious about my feet, and someone stomped over my ankle, spraining it slightly. Doug wasn't as fortunate: he got mowed down so fast I didn't see where he went. When the paramedics asked me for a description of Doug, I told them what he looked like before he was clobbered by the crowd. But I was afraid to consider what he might resemble now. I remembered the scary pitch of intensity in the audience in 1964; the mood this time was potentially more dangerous. The crowd was less responsive to The Beatles, or to their music; they were now becoming conscious of their own power. The harmony between the group and the audience was no longer synergistic.

As I sat on the stairs waiting for the paramedics to find Doug, with my ankle gently throbbing, I stretched out my legs. As I did, I heard voices and footsteps coming down the steps toward me. Most of the crowd had left the building by now, so I found myself wondering who it was. As I looked up, I was stunned to see that it was them. It was The Beatles. Here at their final Toronto show, they were right before me. I was so startled that I couldn't decide whether to pull my legs in so that they could walk around me, or leave my legs stretched out, so they could just jump over me. Confused, I moved them up and down like an out-of-control drawbridge. Paul laughed as he jumped over me, while George briskly walked around my feet towards the wall. I brought my knees up and Ringo gingerly stepped around my feet, but John had already decided to leap over me, and his foot caught the edge of my knee. He started tumbling down the stairs, but quickly caught his balance towards the end, with George's help.
Horrified, I tried to say I was sorry but the words wouldn't come out. I was caught in one of those paralyzing moments that resemble a dream where you try to run, but you can't because your legs won't move. Lennon quickly shot me his trademark look: an angry pose replaced by a quick smile that caught you off guard and told you it was alright. It was the same disarming smile you could clearly hear in his voice on "Eight Days a Week" or "I Feel Fine." I stumbled down the stairs to watch them enter the press lounge for what would be their final Toronto press conference.

The Beatles Toronto press conference in 1966 at Maple Leaf Gardens

When Doug was finally found, he certainly wasn't smiling. Although he was physically fine, he was emotionally shaken up. He was even less pleased when he heard that I got to "meet" The Beatles. But I was the one who soon had little reason to be smiling. Assuming The Beatles would come back the next year, I decided to erase my tape of the concert because buying reel-to-reel audio tape was pretty expensive for an 11-year-old. And, of course, The Beatles never returned. Years later I combed The Beatles' fan sites for some documented audio record of that Toronto show and discovered that I was evidently the only one to have recorded it. That held true until a few years back, when someone auctioned off his father's original tape of the entire show. As for my tape that had once contained The Beatles' final Toronto concert ... it was soon filled with the voices of my grandmothers taunting each other as they played cards.

For me, music has always created the possibilities for friendship, a shared passion that can also lead to a sense of community. Which is perhaps why American music always held out a particular fascination for me, as it had for The Beatles who, despite being British, heard their own lives hidden within it. The American character always has idealism at its core and The Beatles' idealism took the form of American rock and rhythm and blues music. And why not? "[They resurrected] music we had ignored, forgotten or discarded, recycling it in a shinier, more feckless and yet more raucous form," once wrote music critic Lester Bangs. And they chose the most appropriate music in which to lift our spirits: "In retrospect, it seems obvious that this elevation of our mood had to come from outside the parameters of America's own musical culture, if only because the folk music which then dominated American pop was so tied to the crushed dreams of the New Frontier," Bangs went on to write.

This endless struggle to define community pops up almost everywhere in American culture. In 1928, on the eve of a horrible depression, folk singer Harry McClintock proposed an alternate world in "Big Rock Candy Mountain" where one's worst trepidations could happily vanish. On Bruce Springsteen's Magic (2007), the narrator on "Radio Nowhere" desperately scans the radio dial looking for a song that will pull it all together, make sense of the turbulent tenor of contemporary American life, but he can't find it. He's not just clamouring for some current hit to tap his toes to; he's searching through time to find some meaning that's lost to him, a song that reminds him that he's part of something bigger and not at the mercy of transient tastes, the whims of the moment. His goal, as the song states, is to be delivered from nowhere. "[T]he covenant between Springsteen and his audience remains strong, in part because he gives them permission to go on believing in trust, even when the world seems to offer so few things to deserve it," Robert Everett-Green once wrote in The Globe and Mail after a 2007 Springsteen concert in Ottawa, Canada.

Daniel Day-Lewis in There Will Be Blood

You can see the cost of that pursuit of a covenant to trust in Tommy Lee Jones's Sheriff Ed Tom Bell, as he walks through the indifferent murderous American landscape in the Coen Brothers' laconic thriller adapted from Cormac McCarthy's novel No Country For Old Men (2007). "[His] last speech is a contemplation of hope, a dream, about however dark and cold the world might be, however long the ride through it might be, that at the end you know that you will go to your father's house and it will be warm, or to a fire that your father has carried and built for you," Jones told a journalist in 2008. "The last sentence of the movie is, 'And then I woke up.' It's a contemplation of the idea of hope, is it an illusion? Is it just a dream? And if it is, is the dream real?" The question of whether it is all real or an illusion, a question John Lennon also posed explicitly in "Strawberry Fields Forever," always remained at the heart of The Beatles' vision. Those of us seeking the covenant they offered were searching for something outside the world we were fated to live in.

But the America that The Beatles bonded with in the Sixties, despite the Vietnam War and racial inequity, still had a covenant worth believing in. In the wake of the recent Iraq War, profoundly hysterical anti-Americanism replaced a critical distinction between what's rich and true in the culture and what's empty and false. You can see that lack of distinction, too, in the fatalistic world of Paul Thomas Anderson's There Will Be Blood (2007). The picture is a pose, a preordained polemical statement offering little insight and no surprises. Anderson's epic tale of American betrayal has nothing at stake because it implies that there was no American Dream to betray since it was already a nightmare to begin with. So his movie provides no tragic dimension to the teeming avarice of the oil man played by Daniel Day-Lewis. If Walt Whitman had once distinguished between an art that decided presidential elections and an art that made those elections irrelevant, There Will Be Blood is art that is too busy counting ballots. There is no grandeur to the deceived dream to even care about its loss. But by marrying themselves to the most vital and exciting aspects of American culture, the kind that outstrips partisan polemics, The Beatles gave the lie to the kind of narrow assertions Anderson deals in. They built a dream world based on America's conflicting temptations – its promises and its failings – which offer us a wider definition of community, one that also attracts many diverse citizens. That would include a young Canadian spinning 45s on a tiny record player and hearing a whole world called up between the grooves.

-- February 26/12

Dusty Sings Newman: "I've Been Wrong Before"



Music critic Rob Hoerburger, in his illuminating liner notes for The Dusty Springfield Anthology (1997), described the husky-voiced artist's impact on popular music in this way: "Dionne Warwick was more polished and Diana Ross sexier and Martha Reeves tougher and Aretha, well, Aretha. But Dusty Springfield, the beehived Brit, was always the smartest, the most literate, the wisest." In a career that spanned more than 35 years, she was also one of the finest white soul singers to emerge in the Sixties. Springfield covered, in the most significant and delicate ways, the gamut of soul, yet she also extended herself to perform lushly orchestrated pop and disco. Dusty Springfield not only could mine the emotions buried within a song, she would sometimes find emotions that weren't even planted there. Like Jackie DeShannon's, Springfield's roots were actually in folk music. The trio she began with her brother, called The Springfields, made an early success of "Silver Threads and Golden Needles" in 1962. But Dusty left the band early in 1963 to pursue a solo career. She quickly climbed the charts with the exquisite "I Only Want to Be With You" in 1964, followed by the majestic "Wishin' and Hopin'."

When she was recording her album Everything's Coming Up Dusty (retitled You Don't Have to Say You Love Me in the U.S. after the title song became a huge hit) in 1966, a sumptuous record steeped in R&B, Springfield decided to include tracks by the great pop practitioners Burt Bacharach ("Long After Tonight is Over"), Goffin/King ("I Can't Hear You," "Oh No! Not My Baby"), and a new song by a West Coast songwriter named Randy Newman. (We were still a few years away from hearing the sly and subversive satirist of "Sail Away" and "Rednecks.") Since the mid-Sixties, Newman had been a songwriter-for-hire at Metric Music (the West Coast equivalent of the Brill Building in New York), churning out conventional pop tunes for just about anyone who'd sing them. But the Newman song Dusty Springfield chose to perform, "I've Been Wrong Before," turned out to be arguably the strongest and most memorable of those numbers.

Although Randy Newman never thought much of it, "I've Been Wrong Before" is an elusive track that gets at the delicate areas of romantic desperation and ambivalence. It opens with a mournful melody played on the piano, a repetitive motif that invokes a sense of loss before Springfield has even opened her mouth to sing. By the time she does, the melody has already encircled her the way Bernard Herrmann's dizzyingly repeated chords ensnare James Stewart's detective as he runs across the rooftops of San Francisco in the opening of Hitchcock's Vertigo. "I've Been Wrong Before" opens with the singer anticipating a new love affair ("The night we met/The night I won't forget/You seem to be what I've been waiting for/But, baby, I've been wrong before"). Under the lyrics, the piano chords proceed cautiously, as if the pianist were afraid to find the next note, while the melody floods the song with sadness.

Having made a horrible mistake once, Springfield sings, what's stopping her from repeating it? When it comes to love, she seems to be saying, we sometimes fear what our instincts are telling us. Springfield plays off that fear, too, with a voice that fills us with both the tingling anticipation of what love can bring, and the dread of its possible consequences. By the time she hits the bridge, with the string section supporting her like a satin net, Springfield is wrestling with the fragile hope this new love holds out for her. She performs the song with the conviction that exorcising the pain of the past will free her for the future ("He used to smile at me/And hold my hand/Like you do/Then he left me/And broke my heart in two"). When she reaches the end of the bridge, Springfield painfully pulls the song back into the present ("I see your face/And feel your warm embrace/You're all I adore/But, baby, I've been wrong before/I've been wrong before"). The last word dissolves on a whisper which leaves no sense of resolution.


"I've Been Wrong Before" is a love song that defies the conventions of most romantic tunes. Popular love songs about breaking hearts are usually told from the point of view of a romantic victim of loss often pleading for the loved one to come back. (The secret to pop success is largely a song that perfectly reflects the romantic anguish of the listener.) In the case of "I've Been Wrong Before," rather than presenting a lover's world divided into victims and victimizers, Newman suggests that victim and victimizer can co-exist in the same person. It's a powerfully uneasy piece of pop mastery, and Dusty Springfield's beautifully tentative reading helps it achieve greatness.

I've Been Wrong Before" has been covered by a variety of artists, but oddly enough, Springfield's version has gone unnoticed. Most people, including Newman speaking in the March issue of Uncut magazine, point to Cilla Black's interpretation as the definitive one. But Black, a loud and overdramatic singer in the Shirley Bassey ("Goldfinger") mold, squashes the ambiguity in the song. You don't believe for a second that this woman has any doubts that she's been wrong before. There is little pain or vulnerability in her reading. Black's brassy style is far better suited to boldly optimistic material like "You're My World" and Paul McCartney's "Step Inside Love."

But who would have ever thought "I've Been Wrong Before" would appeal to the Chicago psychedelic band H.P. Lovecraft, who in 1967 recorded perhaps the most bizarre version of the song? Taking their name from the famous horror writer, H.P. Lovecraft mixed originals with covers on their self-titled debut album. Their rendition of "I've Been Wrong Before" comes complete with a darkly exotic arrangement that includes a flute calling up the Arabian Nights and a creaky TV late-show organ signalling haunted houses and agitated bats circling the towers. Vocalists Dave Michaels and George Edwards also perform the track as if they had just risen from the crypt to tell us about their romantic woes through the centuries. Their affected droning turns "I've Been Wrong Before" into a tale of Count Dracula scorned.

It wasn't until 1995, on Kojak Variety, Elvis Costello's album of cover songs, that "I've Been Wrong Before" made another triumphant return. Not surprisingly, Costello's singing drew on Dusty Springfield's delicately drawn-out phrasing (as it also did on his sublime All This Useless Beauty CD). He further added an element of dramatic discord to the musical arrangement, returning the song to the uncertainty of the longings it expresses. Whatever Randy Newman thinks of the song, Dusty Springfield's "I've Been Wrong Before" is a piercingly fragile examination of romantic fear, where love becomes an entire house of cards that can ultimately come tumbling down.

-- April 4/12

Igor's Boogie: The Rites of Stravinsky


It should come as no surprise that if any one composer could cause a riot, it would be Igor Stravinsky. Unpredictable in nature, and comparable in stature to painter Pablo Picasso, Stravinsky was an enigmatic figure who moved like a chameleon through the cultural world. He made his reputation with his erotically charged masterpieces The Firebird (1910), Petrushka (1911), and The Rite of Spring (1913). Throughout these works, you could hear Stravinsky gradually forsaking the world of romanticism, which would ultimately lead him to forge a new style of neoclassicism in 1920 with Pulcinella. Yet late in his career, he would embrace the methods of his serialist adversaries, Anton Webern and Arnold Schoenberg, who had abandoned classicism altogether. "People always expect the wrong thing of me," Stravinsky once said. "They think they have pinned me down and then all of a sudden – au revoir!"

Born in St. Petersburg in 1882, Stravinsky had such a great aptitude for music that the colourful Russian composer Nikolai Rimsky-Korsakov took him on as a pupil. In 1909, Russia's top impresario, Serge Diaghilev, heard two of Stravinsky's first compositions, Scherzo fantastique and Feu d'artifice, at a concert in St. Petersburg. He was so impressed that he commissioned Stravinsky to write a couple of numbers for a ballet he was producing. Out of that encounter came The Firebird, which was an overnight success. While not as daring or innovative as his later ballet scores, The Firebird still had something more foreboding than the exotic colours of Rimsky-Korsakov. Diaghilev could hear immediately that Stravinsky's work had what author Joan Peyser in To Boulez and Beyond called "a latent barbarism." This "latent barbarism" would, of course, become even more explicit in his next work for Diaghilev, titled Petrushka. This piece, with its polytonality and sharper rhythms, caused something of a small commotion.

Petrushka, the story of a puppet who is bestowed with life, premiered in 1911, with the legendary dancer Nijinsky in the title role. At the time, Nijinsky was revolutionizing ballet in much the same way that Stravinsky was revolutionizing music. They both were taking the formal decorum out of their respective art forms and releasing the inherent primal impulses in their pieces. The ballet featured parodic elements, repetitive rhythms, and passages where Stravinsky echoed the mechanical and soulless world in which Petrushka found himself. (For those with a keen ear, American composer Frank Zappa, who was influenced significantly by Stravinsky, once wrote a hilarious pop satire called "Status Back Baby." In the song, a young football star fears he's losing his status at his high school so he looks for affirmation from his peers by painting posters and joining De Molay. In the bridge of the song, Zappa plays a guitar solo that quotes the opening melody of Petrushka, driving the point home that if Stravinsky's ballet score is about a puppet that longs to be human, Zappa reverses the process by writing a song about a human who longs to be a puppet.) The composer also illuminated the dual elements in Petrushka's character – both his mechanical and human sides. "I had conceived the music in two keys in the second tableau as Petrushka's insult to the public," Stravinsky remarked. "I wanted the dialogue for trumpets in two keys at the end to show that his ghost is still insulting the public."

Northwest Ballet's performance of Petrushka

Though Petrushka caused some commotion, it was nothing compared to his next score, The Rite of Spring. This startling new piece was a culmination of what Stravinsky was working toward in The Firebird and Petrushka. "One day, when I was finishing the last pages of The Firebird in St. Petersburg, I had a fleeting vision," he recalled. "I saw in my imagination a solemn pagan rite: sage elders, seated in a circle, watching a young girl dance herself to death. They were sacrificing her to propitiate the god of spring." The score called for the largest orchestra Stravinsky had ever assembled (and with plenty of percussion). This was no romantic rendering of the genial spirit within nature, or the renewing elements of the seasons; The Rite of Spring was about the scourge of dehumanization. Russian and Hungarian folk tunes were integrated into the score, but even if the themes were familiar to the ear, the instruments played them in unfamiliar registers. Stravinsky had the time signatures change rapidly after each bar. The bassoon sounded like it had a bad cold. Arpeggios blurted from woodwinds. Meanwhile, the pizzicato of the first violins set the pace, with running sequences filled with squawks, trills, and shrieks. The music prodded with an erotic force.

On the night of the famous premiere at the Théâtre des Champs-Élysées in Paris on May 29, 1913, Stravinsky could sense trouble in the audience right from the opening notes. It was an audience that historian and author Modris Eksteins describes in his book, Rites of Spring: The Great War and the Birth of the Modern Age (1989), as one "to be scandalized, of course, but equally to scandalize. The brouhaha...was to be as much in audience reactions to their fellows as in the work itself. The dancers on-stage must have wondered at times who was performing and who was in the audience." Stravinsky recalled the first stirrings of dissension: "I heard Florent Schmitt shout, 'Taisez-vous, garces du seizième'; the 'garces' of the 16th arrondissement were, of course, the most elegant ladies in Paris. The uproar continued, however, and a few minutes later I left the hall in a rage; I was sitting on the right near the orchestra and I remember slamming the door. I have never been that angry." The piece was also called "monstrous," a "massacre," and the choreography compared to "epileptic seizures."

Stravinsky & Nijinsky

These seizures and massacres, which represented the violence of both birth and death, were prescient warnings of what lay ahead, too. The Rite of Spring laid waste to the idealized past and opened the door to the modernist sensibility to come, with James Joyce lurking around the corner. As choreographer, Nijinsky demanded a physicality from the dancers that was brutal and harsh, set to complex rhythms. The conventions of beauty were undermined, and the serenity of those conventions was rendered obsolete. The Rite of Spring broke through all quaint considerations of beauty into something newly evolved and startling in its depictions of nature's uncompromising power. Within the piece, you could hear the primitive forces that anticipated the savagery of the World Wars, which would kill millions; the brutality of the Russian Revolution, which would turn the world upside down and force Stravinsky from his homeland; the Holocaust; the carnage of famines; and the inflamed passions of nationalism that would unleash massacres around the world after the fall of communism.

Katsushika Hokusai's The Dream of the Fisherman's Wife (1814)

It's likely no small irony that Stravinsky and The Rite of Spring would over time gain acceptance, though the composer would continue to experiment and challenge musical conventions. (He would never again, however, incite another riot.) Stravinsky broke with the Russian orchestral school during World War I and started working with smaller ensembles. He settled in Paris after the Russian Revolution in 1917 and composed his first foray into neoclassicism with Pulcinella before ultimately turning to the United States after the death of his wife and child from tuberculosis. While he would in his late career emulate the beautifully sparse serialism of Webern, The Rite of Spring would become a continuous leitmotif in the waking dreams and nightmares of movies. Composer John Williams, in his score for Steven Spielberg's 1975 Jaws, would comically make the link to The Rite of Spring, attributing one of the ballet's themes (duh-duh-duh-duh duh-duh) to the shark, the movie's force of nature. As Anaïs Nin reveals her piquant discovery of lithographic copies of Katsushika Hokusai's erotic woodblock prints (The Dream of the Fisherman's Wife) to a publisher in the opening moments of Philip Kaufman's Henry and June (1990), the faint melody of the opening notes of the Rite can be heard, as if it had been stored with the lithographs in the Pandora's box that Nin opens. Since Henry and June was the first film to earn an NC-17 rating, replacing the X rating often assigned to pornography, Stravinsky's daring score continued to have its finger on the pulse of the culture. With all its barbaric beauty, The Rite of Spring can also still give the finger to the tired tropes of proper decorum as well.

-- May 26/12

Atlas Pumped – Muscle: Confessions of an Unlikely Bodybuilder (1992)


When I was a young lad reading my Superman and Batman comic books, I was always fascinated by an ad that appeared in every issue. It featured a skinny young guy (not unlike myself) sitting on the beach with his fetching girlfriend and having sand kicked in his face by some jock who was built like an express train. Of course, the lean kid was humiliated, and the girl ran off with the beefcake. This was the selling point for Charles Atlas, a popular bodybuilder who could turn your beanpole frame into a brickhouse and you'd never have to have sand kicked in your face again.

The very idea of crushing bullies with a quickly acquired set of brutal biceps had a certain appeal (especially for a guy who for years to follow would have to grow used to losing girlfriends to intimidating guys with Ferraris), but it wasn't alluring enough for me to send away for barbells and catch what Samuel Wilson Fussell, in his autobiographical exposé Muscle, calls "the disease." The "disease" he describes is the obsession with transforming yourself into the fearsome giant you once dreaded. In Muscle: Confessions of an Unlikely Bodybuilder (William Morrow, 1992), Fussell takes us pretty far into the secret world of the sissy who hides inside his hulking flesh. "The beauty of it all," he confesses, "lay in the probable fact that I would never be called upon to actually use these muscles. I could remain a coward and no one would know." What makes Muscle such a compelling read is that Fussell brings a frighteningly precise awareness of what he did to himself and why.

At times, delving into Muscle is like plunging into a memoir of a drug addict who finds compelling ways to describe what he finds so pleasurable about the addiction even though he knows that he's killing himself. "If it meant feeling safe and protected, I was willing to give up everything...my life pre-iron no longer existed for me. It happened to someone else, someone smaller, frailer, less substantial than this new-and-improved packaged version." Fussell's addiction to bodybuilding began in 1985 while living in New York City. Having just graduated from Oxford, the son of very literate parents – his father was historian Paul Fussell (who just died last week) – he was plagued by illness. At six-foot-four and 170 pounds, he found himself "looking cadaverous," and discovered that the cause of the sickness was the violence in the city.


One day while escaping one of New York's more unstable citizens (who just happened to be chasing him with a crowbar), Fussell fled into a bookstore, and found himself face to face with the book, Arnold: Education of a Bodybuilder. The image of a robust Arnold Schwarzenegger, the John the Baptist of the barbell set, sealed Fussell's fate. For four years he built himself up to 257 pounds, and competed in three bodybuilding contests until he woke up to the horror that he'd transformed himself into a caricature of manhood. "The physical palisades and escarpments of my own body," he writes, "served as a rock boundary that permitted no passage, no hint of a deeper self – a self I couldn't bear."

Ultimately, his desire for muscles melted away and he gave up bodybuilding to become a writer. In Muscle, Fussell takes us beyond the narcissism of the body image into the self-hatred that lies beneath it. It's a riveting journey through a subculture he describes compellingly as "part puritan, and part P.T. Barnum," with characters called Sweetpea, Mousie, Nimrod, and Vinnie going through the purifying rituals of pumping iron, developing attitude, and walking with a swagger. It's about steroid use, the vitamins and proteins that need to be ingested daily. It's about the nascent fascism of men seeking "perfect" bodies while sporting T-shirts that say "Don't growl if you can bite" and "I'd rather be killing communists in Central America."

One of the more revealing aspects of Fussell's memoir is how the author arrives at the source of his pathology. While the bodybuilder thinks he is intimidating the world, he's really insulating himself from a world he finds terrifying. In the highly compelling Muscle, Fussell demonstrates that the bully on the beach was a coward all along.

-- May 31/12

Magical Retreat: Sgt. Pepper After 45 Years



The Beatles' Sgt. Pepper's Lonely Hearts Club Band, which was released in June 1967, is a lovely confection, a beautifully self-conscious neon sign that celebrated with ample imagination the romantic ideal, where the possibility of true love could transcend all of our problems. (If only.) And in that summer, which came to be termed 'The Summer of Love,' Sgt. Pepper's seamless and mellifluous tone made it appear as if that possibility was indeed well within our grasp. However, the idea for the record, following up on their concept single “Penny Lane/Strawberry Fields Forever” (two radically different renderings of childhood, both originally destined for the album), came out of the opposite sentiment, a 'Summer of Hate' that took place the previous year.

That particular summer, American cities (as they had almost every summer in the mid-Sixties) were burning in reaction to continued racial unrest. The escalation of the war in Vietnam had also all but diminished President Johnson's War on Poverty. In short, the tenor of violence was becoming exactly as black activist H. Rap Brown had described it then – as American as apple pie. Amidst this chaos, with the mounting frustration over the dashed ideals of the New Frontier of the early Sixties, The Beatles became easy targets for the angry and the disillusioned. You could even say they stood at the apex of those very ideals being dashed. So their 1966 tour, filled with torpor and turmoil, hit bottom with record burnings in the Deep South after John Lennon had remarked earlier that year that The Beatles were more popular than Jesus. In that summer, The Beatles found themselves no longer in control of their meteoric success. When they first chose to engage their audience in 1962, with their first single “Love Me Do,” the goal wasn't simply to become entertainers, but to put new demands on the pop audience. They set out to take popular music and their fans to another place. And in the coming years they did just that – and more.

Of course, The Beatles couldn't change the course of the world around them, to make it a better place. Nor did they intend to. Rather they sought to change our perspective on this world. But Lennon's comments about Jesus demonstrated the limits of the audience's willingness to have their perspective changed. The group learned that any provocative outspokenness would have a price tag attached. So the rules of engagement had clearly changed by 1966. At which point, The Beatles were becoming fixed in the sights of devoted fans, ready to strike if displeased. For self-preservation, the band realized that it was time to stop being Beatles. “The Beatles were hated, as much as anything, for representing the principle that freedom was worse than available,” critic Dave Marsh once wrote while touching on the paradox buried within the pleasure principle of the group. As they grew restless and vulnerable, The Beatles were becoming deeply dissatisfied, feeling wasted and used up by being so available. “No one would have hated The Beatles in '66 if they hadn't been so loved in '64,” critic Devin McKinney reminded us in Magic Circles, his highly readable and perceptive book on the group. “Their mania always held the potential for a different, darker brand of madness.” So before the madness could consume them, The Beatles decided to quit the road.

Burning Beatles Records in the Deep South in 1966

If The Beatles by that time had become what literary critic Leslie Fiedler once called “imaginary Americans,” perhaps they could now imagine themselves as anything but Beatles. In that world, they could even create an imaginary audience to hear their new radical work. To begin, Sgt. Pepper was the brainchild of Paul McCartney, travelling incognito during a holiday in Paris following The Beatles' final tour, when he began to entertain the notion of a disguise for the group. Within these new masks, they could maybe find a new freedom, a freedom they had lost by 1966 being the Fab Four. McCartney also figured that the new album could demonstrate that The Beatles were no longer these mop-top performers; they would become pop artists rather than pop stars. Since many American rock bands, especially on the west coast, were developing exotic names (i.e. Strawberry Alarm Clock), McCartney thought of creating a concept LP where The Beatles could become an old-time combo from the late Forties, Sgt. Pepper's Lonely Hearts Club Band; why, they wouldn't even look like themselves. He wasn't kidding: decked out in old-fashioned military marching outfits, the band now appeared burdened by age, by the weight of the dreams that carried them from Liverpool to Hamburg to the world at large. They even became more distinct from each other, with different cuts of hair, and now aged by mustaches and beards.

Sgt. Pepper's Lonely Hearts Club Band might have been the official sound track for the heady optimism of that psychedelic summer, but the album's cover was something less than gleeful. It resembled a rather mournful graveside shot of the burial of the old Beatles. With a motley collection of funeral guests, including boxer Sonny Liston, Mae West and British occultist Aleister Crowley, Sgt. Pepper unveiled the reborn Beatles from the ashes of the Fab Four. (Wax figures of the early band in their matching suits even appear over the graveside grieving their own demise.) Where once The Beatles were part of a collective identity, one that encompassed a larger community of fans, they sought now to abandon it as well as their connection to that community. They set out to take refuge from a pop world they helped create, to retreat from the violence they had unwittingly stirred in their fans; in fact, to use their time to deliver a more sophisticated music. But it would be done from a safe distance.


From the first sounds of the opening number on the album, with an orchestra tuning and an audience getting settled, it's clear The Beatles had invented an ideal crowd to hear them, not the potentially dangerous one that they had endured on the road all those years. The Beatles, on the song “Sgt. Pepper's Lonely Hearts Club Band,” fashioned something of a mirage, providing for themselves the illusion of performing the kind of live music they couldn't in reality do in concert. They conjured a dynamic between themselves and this imagined audience that didn't reflect the reality of the listeners they had just abandoned. For instance, McCartney shows so much happiness for this crowd he even expresses in the song the desire to take them home. (Can you imagine that kind of consideration being extended to the angry hordes they faced in the American South in 1966?)

All through the colourful landscape of the songs on the album – including the convivial “With a Little Help From My Friends,” the Lewis Carroll imagery of “Lucy in the Sky with Diamonds,” the wistful tale about the cost of asserting your independence in “She's Leaving Home,” the kaleidoscopic soundscape of “For the Benefit of Mr. Kite,” the richness of the Eastern flavoured “Within You, Without You,” the sudden confrontation with mortality in “A Day in the Life” – you can hear a band going musically forward while simultaneously going backwards. On the record, they're trying to reinvent their past, a past they never actually had, which would then anticipate a future that, we know now, they never got to fulfil. When the utopianism of the pop audience turned sour by 1966, The Beatles began to imagine a utopia that ironically turned out to be truly nowhere.

It's rarely addressed that despite the dazzling avant-garde innovations on Pepper, there's also an air of caution that threads its way through much of the record. The smooth craftsmanship, which gives the album its glamourous sheen, also masks a fear in the band of spontaneity. (During the sessions, George Harrison and Ringo Starr, in particular, complained that the group didn't play together as a band anymore. Their parts were worked out ahead of time rather than learned during band rehearsals.) Most listeners failed to notice, though, or even to care. The album's originality and celebratory spirit intoxicated almost everyone. Not surprisingly, Sgt. Pepper would sell 2,000,000 copies in the first three months of its release. Listeners heard the album as a bliss-out, a musical bath to wash away the enmity of the previous year. Yippie activist Abbie Hoffman even compared the record, as a happier memory, to the horror of the assassination of JFK. The reference to JFK is significant because (like that tragic event) everyone knew where they were when they first heard Sgt. Pepper. DJ Red Robinson also celebrated Sgt. Pepper as heralding a new cultural revolution (ironically, just as China was having its own horrific version of one). But there were others who heard nothing of the sort. "Sgt. Pepper was the sound of The Beatles in hiding, avoiding danger – avoiding freedom," wrote Devin McKinney. After all, the group's freedom was found in the chaos of Beatlemania, in the testing of their artistic worth, and in their proven ability to innovate in the pressure cooker of relentless demand.

For all of its brilliance, Sgt. Pepper created a falsely optimistic front (except perhaps on "A Day in the Life"), where the creators conveniently hid behind their innovations rather than (as they had in the past) using their innovations to cultivate an audience. Without the crowd as their adversary, though, The Beatles had no one to measure their utopian ideals against. In the past, the group found greatness on A Hard Day's Night, Rubber Soul and Revolver with little time to spare and under the insistent strain of constant touring; the band now found themselves with all the time in the world, but no urgency to use it. It was rock critic Richard Goldstein, though, who truly spoiled the party atmosphere created by Sgt. Pepper shortly after its release. Writing in The New York Times, Goldstein went so far as to call Pepper "an artistic failure." While he did assent to the notion that the record was a hippie talisman for the season, he didn't think it provided a very deep perspective on the times, only a shallow reflection of them. "In substituting the studio conservatory for an audience, The Beatles have lost crucial support, and that emptiness at the root is what makes their new album a monologue," he wrote. Then, quoting from John Lennon's "Strawberry Fields Forever," he went even further. "Nothing is real therein, and nothing to get hung about. Too bad; I have a sweet tooth for reality. I like my art drenched in it, and even from fantasy I expect authenticity. What I worship about The Beatles is their forging of rock into what is real. It made them artists; it made us fans; and it made me think like a critic when I turned on my radio."

Richard Goldstein's review of Sgt Pepper

For Goldstein, the best music turned its listeners into good critics, and given the era, it wasn't the time for monologues either. By the time his review was published, though, he was part of a different kind of dialogue. Goldstein became embroiled in a hailstorm of protest over his review. Besides an avalanche of hate mail from readers of The Times, The Village Voice published an angry rebuttal by Tom Phillips (who was, of all people, another New York Times regular and a colleague of Goldstein's). Paul Williams, in Crawdaddy magazine, thought that Goldstein got so hung up on his own sense of integrity that he couldn't "humble" himself before the album. Imagine that: humble himself. What critic should ever humble himself before any work of art? How could they then be decent critics? (We've come to a slippery slope. It's bracing, then, to see a bold stand taken by a critic over such a hugely anticipated album. Try finding similar bravery today. Humbling oneself is comparatively more prevalent now, given the atmosphere of terror that seems to exist among some critics who prefer the safety of consumer reporting at the expense of their critical voice.)

As beautifully conceived as Sgt. Pepper is, in retrospect it actually provided a false front for the band to hide behind. Critic Mikal Gilmore elaborated on what this front hid in his excellent collection of essays, Night Beat. "By the time Sgt. Pepper was on the streets, San Francisco’s Haight-Ashbury was already turning into a scary and ugly place, riddled with corruption and hard drugs, and overpopulated with bikers, rapists, thieves, and foolish shamans." Contrast that sordid portrait with the quaint image of The Beatles on Sgt. Pepper. Instead of facing the angry Beatlemaniacs they had left behind, they were now avatars of a very different Shangri-la, a communal wasteland of lost idealists. Those disconsolate, lost faces of Haight-Ashbury also represented the other side of the celebratory Sgt. Pepper visage. This hippie enclave, filling the emotional void created by The Beatles' withdrawal, cast an unsettling shadow over the bright sounds of Sgt. Pepper. These fans, like zombies out of Dawn of the Dead, crawled out from the kinds of holes The Beatles – and this album of the Summer of Love – just couldn't fix. Before long, The Beatles couldn't fix those holes for themselves either.

-- June 19/12

Edgard Varèse & The Bomb That Would Explode the Musical World



On the night of May 29, 1913, Edgard Varèse sat in attendance at the Théâtre des Champs-Élysées in Paris watching the infamous premiere of Stravinsky's The Rite of Spring. This was an evening considered by some to mark the spiritual birth of the 20th Century. While abuse was being hurled at the stage, and indignation toward this "barbaric" music raged around him, Varèse calmly wondered what all the fuss was about. After all, many composers were already growing tired of tonality. They had begun resenting the adherence to a single key as the only accepted foundation for musical composition. Arnold Schoenberg, the Austro-Hungarian composer born in 1874, even developed his own solution: the twelve-tone system, an approach requiring all twelve notes of the chromatic scale to be played before the first note is played again. This open-ended arrangement offered composers the opportunity to write disciplined atonal music that could rival the rigor of the most traditional tonal system.

Following up on Schoenberg's daring, Anton Webern, an Austrian composer who studied with Schoenberg, reinterpreted the twelve-tone form by writing with an abstract sparseness that provided space for the ear to gradually discover the melody. Igor Stravinsky, always the contrarian, headed in a different direction. He was less interested in harmony and more dedicated to what music critic Frank Rossiter once described as "a return to the artistic ideals (and often to the specific musical forms) of the pre-romantic era." Varèse, though, took an even more radical route. He decided to question the very principles of Western music altogether. Varèse embarked on a search for what he thought could be a new music, one filled with the sounds of sirens, woodblocks, and eventually electronic tape. His impact on the culture may have appeared subterranean, but it was acutely felt among a diverse group of musicians. Most significantly, Frank Zappa would carry his legacy forward. But he also attracted the ear of be-bop legend Charlie Parker (who wanted Varèse to take him on as a student). His reach extended as well to some of Zappa's contemporaries, including progressive rock groups like King Crimson and Pink Floyd (who utilized Varèse's tape experiments). And John Lennon and Yoko Ono's "Revolution 9" from The Beatles' White Album is just about inconceivable without him. Even the pop rock group Chicago chipped in with their own little ditty, "A Hit by Varèse," in 1971.


Of course, Varèse never had a hit; he barely even had a hearing. Very few seemed to take seriously a man for whom music was a scientific construct of sounds, where the score sheets themselves would serve as his own simulated laboratory. As bold and bizarre as his music would be, he even had the wild and frizzy hair of those mad scientists from cheesy horror movies. But there was nothing cheesy about his artistic intent. Varèse wanted instruments that offered what he called "a whole new world of unsuspected sounds." You could say much of his career would indeed be taken up by the quest for a pure sound. That quixotic pursuit led him to pianist Ferruccio Busoni, who wrote in his book Sketch of a New Aesthetic of Music (1911) that "music was born free and to win freedom is its destiny." With that in mind, Varèse set out for what he called "the bomb that would explode the musical world," where the resulting sounds would rush in through the molecular breach he'd induced. To lay the foundation for that imminent explosion, he first organized the International Composers Guild in July 1921, dedicated to the presentation of works by Stravinsky, Schoenberg and Webern, and he wrote a manifesto that laid out the goals of the Guild. (Nearly half a century later, these aspirations were invoked as a clarion call by Frank Zappa on the cover of his first few albums.) "Dying is the privilege of the weary," it reads in part. "The present-day composer refuses to die. They have realized the necessity of banding together and fighting for the right of the individual to secure a fair and free presentation of the work...It is out of such collective will that the International Composers Guild was born."

Despite sufficient financial support, however, Varèse could not find a mass audience with its own collective will for this difficult music. (He couldn't even find musicians eager to play the highly technical scores.) The Guild's first concert at the Greenwich Village Theatre drew only three hundred people, and the organization soon splintered over which compositions to program, not to mention Varèse's tendency to ruthlessly dominate the group. Eventually, he started composing new material himself. After embarking on his first work, Amériques (written between 1918 and 1921), he declared, "I refuse to limit myself to sounds that have already been heard." As if to prove the point, Amériques included a solo flute intermittently interrupted by loud blasts from the orchestra, setting up a significant tension between the solitary sound of the wind instrument and the larger instrumental projections. Another piece, Ionisation (1929-31), required a group of thirteen musicians playing a total of thirty-seven percussion instruments, including a gong, Chinese blocks, tam-tams, snare drums, and Cuban claves. What Varèse sought with this bevy of percussive toys was timbre – sound in its purest form – rather than pitch.


He moved to Greenwich Village in 1925 and eventually began work on his boldest score yet. Arcana required an orchestra of 120 musicians, including over seventy strings and eight percussionists playing close to forty percussion instruments. After the riveting cacophony of Arcana, Varèse began exploring numerous electro-mechanical devices by the Forties, incorporating exotic instruments such as the theremin and the Ondes Martenot, both of which predated the creation of the Moog synthesizer. Once magnetic tape became the standard in 1955, Varèse composed his audio tape masterpiece Déserts, inventing an entirely new means to articulate the sounds he kept hearing in his head. But no matter what was echoing in his head, he couldn't get anything to echo back from grant foundations like the Guggenheim to help finance his search. American music circles were also becoming progressively more conventional and conservative, driving the avant-garde underground. Although he continued to believe that new instruments were necessary to free the composer, Varèse grew depressed from the lack of such liberation. He would compose new scores, then destroy them in despair.

Frank Zappa once wrote in Downbeat that it was significant that Varèse never received the acclaim in America that he found in Europe. "Even the critics [there] that didn't like his music didn't dismiss him as a buffoon...He was written about in the United States [however] like he was some kind of quack who didn't know what he was doing." But it was clear, even by 1917, that Varèse knew exactly what he was doing. "I dream of instruments obedient to my thought and which with their contribution of a whole new world of unsuspected sounds will lend themselves to the exigencies of my inner rhythm," he wrote then. In declaring this paradoxical balance created out of strict control and unbridled freedom, Varèse became trapped by what the present couldn't offer him. Yet within his dramatic and turbulent scores, the fury of the musical bomb he set off still occasionally sends out reverberations, timely reminders that further his claim that the present-day composer does indeed refuse to die.

-- August 14/12

The Pennultimate Challenge: Five Reasons Why Sean Penn Wanted to Give Up Acting and Become a Director (1996)



When Sean Penn announced in 1990, after starring in the woefully feral drama State of Grace, that he was giving up acting to turn to directing, the first film he said he was going to make was a story based on a Bruce Springsteen song, "Highway Patrolman," from the singer's equally grim, yet arresting 1982 album Nebraska. The song and the subsequent movie dealt with the troubles of two siblings – one who becomes a lawman, and the other who spends his time breaking the law. It was called The Indian Runner and when it was released in 1991, despite the merits of the story, it didn't run away with anything. But Sean Penn had plenty to say at the time about his decision to – briefly – abandon the trade.


1) "Answers are by definition wrong. I always wanted to make a movie that ended with a question."

Four years after his directing debut, Penn remains content to provide answers rather than pose questions about his reasons for moving behind the camera. The questions instead come from a group of us gathered around a table in the early morning while the actor nurses his coffee and defies the "No Smoking" signs by lighting up. Once again, he's defending his decision to make films that he finds more complex than the roles he's performed in other artists' work. "The barometer on whether or not I'm going to act is that I go down to my pool," he explains in a measured tone while dropping his arm towards the floor. "And if I put my hand in there and it's not heated, I'm going to act." When Penn, one of his generation's most exciting performers, tells you again that he's giving the profession which has been his greatest asset the brush-off, you can't help but wonder why. Unless you're Laurence Olivier (Hamlet, Richard III), Charles Laughton (The Night of the Hunter), Kenneth Branagh (Much Ado About Nothing), or Diane Keaton (Unstrung Heroes), the track record for actors – even great ones – directing good movies is pretty dim.

Marlon Brando gave us the fascinating but muddled western One-Eyed Jacks (1961). Paul Newman became horribly earnest in Rachel, Rachel (1968). Robert De Niro turned into a shockingly pale imitator of Martin Scorsese with the derivative A Bronx Tale (1993). Even Morgan Freeman, one of our most commanding screen actors, brought none of his electricity to bear on directing his dull polemic, Bopha! (1993). But Penn is unrepentant about the failings of The Indian Runner. His second film, The Crossing Guard (1995), also features two men in conflict with what life has dealt them. And when it opened later that year, it stiffed just like his first effort.


2) "My strength as a director is that I'm convinced I don't know anything about directing."

But would you call that a strength or a deficiency? Penn obviously believes it's an asset because it leaves him with no pretensions about being a recognized auteur. His chief concern, whether he directs or acts, is the examination of conflict rather than the finding of its resolution. You could even say that this goal has been the driving force in all his work (as well as in his personal life). At one time, it was the personal side that got all the press: the drunk driving, the moodiness and his failed marriages. But his acting side, brave and imaginative, deserved more reckoning than it often got. Sean Penn has been an exciting presence in movies since he first burst on the screen as a voice of reason in the military drama Taps (1981). Subsequent roles in Fast Times at Ridgemont High (1982), where he played the amiable stoner Spicoli, and in Bad Boys (1983), as a juvenile offender who becomes filled with regret, established him as not only a highly skilled actor, but one who could convincingly slip into a wide range of roles. Sometimes he would choose parts that demanded as much from the audience as he did from himself.

This was especially true in one of his strongest performances as Sgt. Meserve, a platoon sergeant during the Vietnam War who turns vindictive and instigates the kidnapping and gang rape of a Vietnamese girl in Brian De Palma's harrowing drama Casualties of War (1989). What makes his acting continually so arresting might be what Pauline Kael suggested when she wrote about Penn that "there's no residuum that carries from role to role." The most natural inclination for an actor known for leaving no residue behind might be to begin making movies that can do likewise. But there's a big difference between desire and execution.


Sean Penn directs David Morse in The Indian Runner

3) "[The Crossing Guard], to me, is about how two guys have crippled themselves. And, in the end, it's just two cripples. I'm interested in whether or not that can change."

If The Indian Runner posed questions about what an upstanding officer of the law is supposed to do when his own brother runs afoul of it, The Crossing Guard examines both sides of a hit-and-run tragedy in which a young girl is killed, and the anguish of the perpetrator and the victim's father is equated. The idea of comparing the pain of John Booth (David Morse), who ran over the girl and exits prison after serving six years, with that of the girl's father (Jack Nicholson), who hasn't felt alive since, strikes some people as perverse; they don't believe the two can be compared. But Penn emphatically disagrees.

"I do equate that. And I do care as much about one as the other," he insists. "If someone has a problem with that, then they have a problem. That's emotional politics. I care about both sides." Part of that dual empathy might come from Penn's own temperament, but it also may be informed by being a father as well. "Besides the nightmares that come with having a kid, something outside my own experience happened that inspired the film," Penn says soberly. "Just when I was getting an idea that I could run with, Eric Clapton's son Conor fell out of a window. And, I thought, how do you deal with that?"


4) "This is for better or worse a personal movie. You go through times where you wonder if you are totally alone in your thinking. I've talked to people who've hated the movie and I've talked to people who loved it. But they got it. So I figured my job was done."

As for what people actually got, it really depended on who you asked. Owen Gleiberman in Entertainment Weekly described The Crossing Guard as "Death Wish directed by Antonioni." But others, like Janet Maslin in The New York Times, thought the film was "risky and heartfelt." There is no question of the sincerity of the work here. But I still don't believe that Penn got onto the screen all that he'd intended. Penn has no problem creating the atmosphere of psychological drama, but he can't translate the psychological dimensions of his acting talent into his directing. When Penn directs other actors, he actually leaves out the crucial steps of good acting – motivation, the shaping of a character, and intuitive judgement – steps that he brings to most of his roles. Just consider what he did in Tim Robbins' problematic polemic Dead Man Walking. He plays Matthew Poncelet, a condemned killer of two teenagers. What's extraordinary about his performance is how he gets at, very simply, what he strains for as a director in The Crossing Guard. In Dead Man Walking, Penn takes us beneath the bravado and the façade of a vicious killer. We are then allowed to perceive the damaged soul of a man who no longer lays claim to having one. Penn wrestles here with the complex motivations of a psychopath (complexities not present in the theme of the film itself, which blatantly spells out its sympathies on the issue of capital punishment). Within Penn's troubled interpretation, we empathize with Matthew Poncelet without losing sight of what makes him a mean, cold-blooded murderer. Dead Man Walking gives its largely liberal audience the safe comfort that it's on that audience's side of the issue, while Penn's performance offers us no such comfort, no chance for the character's redemption, and no apologies, thus giving the lie to the pieties the picture lays on the audience.


Jack Nicholson in The Crossing Guard

5) "For me personally, directing is the more healthier experience because in acting you have to lock into things you've experienced or observed, and ride on it for the period of time the movie goes. Directing The Crossing Guard, I get to pick the subject. It's more peaceful and fun."

That may be true simply because it's easier to pick a good subject, especially if you're as smart as Penn is. But the intuition required by a great actor to provide clues to a character he plays is also hard work. It means that you have to risk touching on mysterious parts in yourself – something Penn might on occasion be reluctant to do, even though it's something he continues to do with such startling brilliance. As no more questions are forthcoming, Penn walks out of the room to his next interview. A trail of cigarette smoke lingers behind him, snaking through the air much like the dark, unresolved moods he divulges on the screen in The Crossing Guard. But, in a moment's time, like the movie itself, the smoke simply vanishes into thin air.

That's something you can say his acting never does.

-- September 13/12

Jimi Hendrix Drifting



When Jimi Hendrix died in 1970, over forty years ago this month, I was in high school. It was a time when a number of key pop figures – all in their twenties – never got to see thirty. A year earlier, it was Brian Jones of The Stones, and Janis Joplin and Jim Morrison would soon follow Hendrix to the grave. Besides sobering you with a taste of death's final victory (right at that moment when you saw nothing but life straight ahead), you also realized that a person's genius, their gifts, even their youth, could do nothing to protect them.

Hendrix's death hit me harder than the others because I had come to truly love the paradoxical nature of his music. (In a song that fundamentally came out of the blues like "Burning of the Midnight Lamp," he combined a harpsichord with a wah-wah electric guitar and a chorale section to create a powerfully intense emotional soundscape.) Although Jimi Hendrix was always fully recognized as a virtuoso and theatrical guitar stylist, he was rarely discussed in any great depth in terms of his gifts as a poet, singer and music innovator. (For those insights, it's best to read David Henderson's 1978 biography 'Scuse Me While I Kiss the Sky, which still hasn't been equalled.) But John Morthland, writing in The Rolling Stone Illustrated History of Rock & Roll, captured key aspects of those many gifts that Henderson elaborates on. "As a guitarist, Hendrix quite simply redefined the instrument, in the same way that Cecil Taylor redefined the piano or John Coltrane the tenor sax," he wrote. "As a songwriter, Hendrix was capable of startling, mystical imagery as well as the down-to-earth sexual allusions of the bluesman." Those sexual allusions, though, also led to a particular kind of theatricality that the artist himself was growing tired of indulging. Joni Mitchell, who met Hendrix in Ottawa towards the end of his life, immediately recognized his frustration with the public and critical perception of him based on those sexual allusions. "He made his reputation by setting his guitar on fire, but that eventually became repugnant to him," Mitchell told The Guardian in 1970. "'I can't stand to do that anymore,' he said, 'but they've come to expect it. I'd like to just stand still'."

The last album he was preparing when he died, first released posthumously in 1971 as The Cry of Love, features plenty of songs where he is indeed 'standing still.' The material on it draws essentially from tracks he had been recording between March 1968 and August 1970. While he was then preparing a visionary double album to be titled First Rays of the New Rising Sun (which would eventually come out on CD in 1997 with additional tracks not included on The Cry of Love), his death and various contractual issues prevented its release at that time. John McDermott, in his liner notes for the First Rays CD, acknowledging the unfinished state of the album, clearly outlines Hendrix's intent. "With full faith in his music, Hendrix was primed to introduce his audience to a new frontier, where the triumphs of his past would merge freely with his unique blending of rock with rhythm and blues." As McDermott states, the work from these sessions was split up among three albums: The Cry of Love, Rainbow Bridge (1971) and War Heroes (1972). Of the three, The Cry of Love is the most sustained and satisfying.


Although The Cry of Love remains a somewhat uneven record, the working out of his own personal isolation resonates in many of its best tracks, like the ripping "Ezy Ryder," the lilting "Angel," the gospel fury of "In From the Storm," the Dylanesque "My Friend," and his tip of the hat to Skip James in the country blues demo of "Belly Button Window." "Ezy Ryder," which was recorded in December 1969, was obviously inspired by the hit counter-culture film (Easy Rider) from earlier that summer. But the song, the only one on the record cut by Hendrix's funk trio known as the Band of Gypsys (with Billy Cox on bass and Buddy Miles on drums), strips away the underlying masochism and paranoia that inspired the picture's theme (and made it such a hit) to arrive at something far more poignant. "There goes ezy ryder," Hendrix cries out, while Buddy Miles attacks his drum kit as if firing heavy nails into it with a machine gun, riding wave after wave of Billy Cox's pulsing bass lines. "Riding down the highway of desire/He says the free wind takes him higher/Searchin' for his Heaven above/But he's dyin' to be loved." While "Ezy Ryder" has all the full-out propulsion of earlier songs like "Manic Depression," "Spanish Castle Magic," or "Crosstown Traffic," the recognition of death isn't brought on by resignation, or the failure of values (as in Peter Fonda's fatalistic proclamation of "We blew it" in the movie), but by the desire to transcend earthly chains. Just as Martin Luther King had proclaimed his recognition of the Promised Land in his final speech a year earlier, a vision that allowed him to face the death he saw coming, so Jimi Hendrix reaches for the sky in "Ezy Ryder." "He's gonna be livin' so magic," Hendrix sings. "Today is forever so he claims/He's talkin' about dyin' it's so tragic baby/But don't worry about it today/We've got freedom comin' our way."

"Angel" with its more blatant recognition of death's final victory ("Angel came down from Heaven yesterday/She stayed with me just long enough to rescue me") became the biggest hit from the album, especially when Rod Stewart covered it in 1972. But the track, as lovely as it is, is too obvious in its meaning, the metaphors too easy to read: the pop song as obit. The tune that left me wondering if he indeed saw it all coming was the exquisite "Drifting." He'd written beautiful ballads before like "May This Be Love" and "Little Wing," but "Drifting" was essentially a spiritual, a contemporary interpretation of one that offered a poignant reckoning of the fact that he knew he was moving on. "Drifting on a sea of forgotten tear-drops," he sings with a delicate lilt, a soft crooning that anchors the watery texture of the various guitar melodies keeping him afloat, "On a lifeboat sailing for your love." No doubt the wistful qualities within this spiritual were borne out of its tonal resemblance to Curtis Mayfield's "People Get Ready." In the song's last moments, where his guitar loops sound like seagulls taking flight over the water, they echo the cries of liberation those same loops once called out for in the conclusion of "If Six Was Nine." But to a different effect. In "Drifting," you can practically see him waving goodbye as he flies away. Liberated.

John Morthland concludes his piece on Jimi Hendrix contemplating not only his continuing influence, but also the endless albums and repackages of both finished and unfinished tunes. "[A]s the years go by, it also becomes increasingly apparent that Hendrix created a branch on the pop tree that nobody else has ventured too far out on. None has actually extended the directions he pursued, but perhaps that is because he took them, in his painfully short time on earth, as far as they could go." It also may be true that he took those innovations with him to the Promised Land.

-- September 26/12

Time's Mysterious Passage: Penny & The Quarters' "You and Me"


Penny & The Quarters (minus Penny)

In Derek Cianfrance's Blue Valentine (2010), when Dean (Ryan Gosling) selects a song that he clearly hopes sustains the love between himself and Cindy (Michelle Williams), he's in a very familiar and popular movie tradition. That tradition includes Dooley Wilson calling up the lost romantic days in Paris for Rick (Humphrey Bogart) and Ilsa (Ingrid Bergman) when he performs "As Time Goes By" in Casablanca (1942), or Lloyd Dobler (John Cusack) holding his boom-box high above his head outside the bedroom window of Diane Court (Ione Skye), the girl he's smitten with, so she can hear Peter Gabriel's touching "In Your Eyes" in Say Anything (1989). The romanticism in these songs not only serves to enhance the desires of the characters on the screen, what you could call an urgent yearning in them to make their love work, but it also speaks to the way pop music so often captures the fleeting innocence of what it feels like to fall in love. While in real life relationships change, the songs always remain the same, fixed in the time in which they were first recorded. And if the relationship falls apart, or especially if it ends tragically, these tunes can quickly bring us heartache. When they suddenly come on the radio – often in a flash – they remind us of who we once were, but are no more.

Along with his co-writers Cami Delavigne and Joey Curtis, Cianfrance in Blue Valentine seems to understand the ways in which pop songs can define the way we love. He underlines it by introducing the couple's song in a completely new way. And the song he's chosen has a fascinating history of its own which hauntingly mirrors the dashed hopes of the couple on the screen. Blue Valentine is a devastating and accomplished work, a heartbreaking film about the dissolution of the relationship between Dean and Cindy, but it's not told to us in linear time. Throughout the film, we jump back and forth through the various moments in their love affair and marriage, those moments that are both poignant and ultimately wounding. Cianfrance nimbly contrasts those changes, too, even in the body language of the characters. As they both come to know each other, we can see in their bodies the eager and giddy anticipation of the sparks they hope to set off in each other. (It's there in the musical sway of their courtship.) But that eagerness is then boldly juxtaposed with the present, where the music is suddenly gone and a revulsion at being physically touched dramatically mirrors the ways in which their marriage is coming apart. In most romantic pictures, we usually hear the couple's song the first time they choose it, when it clearly signals the love they begin to feel for each other. And we come to believe in that song, just as Rick and Ilsa believe in "As Time Goes By." In Blue Valentine, we encounter their song early in the picture, but it comes late in their marriage when it no longer has any meaning left for them. It arrives, in fact, at a moment when Dean is desperately trying to get it back. Then we hear it again, later in the film. But this time, it's right at that moment when it first became their song, a moment that becomes unbearably wounding because we also now know where their marriage is heading.

Michelle Williams and Ryan Gosling in Blue Valentine

In an e-mail exchange last spring with my friend, Amanda Shubert, she correctly pointed out to me that Blue Valentine "[uses] the poetic evocation of time's mysterious passage to express the exigencies of love and companionship...the end of love is a kind of death...through the damning sense that we can't escape the way time saps the spirit out of us, wears us down, as we age and take our parents' place." That notion of time's mysterious passage is also present in the couple's choice of song: Penny & The Quarters' "You and Me." You'd be forgiven if you had no clue as to where this song came from because it's very likely you've never heard it before. But chances are, when you do hear it, you'll never forget it. Stylistically, "You and Me" doesn't tie itself to the contemporary story being told in this picture (which is scored to music by Grizzly Bear); its sound, the gentle doo-wop of the Fifties, doesn't even come out of the era it was recorded in, which was the Seventies. But the tale of how it found its way into a picture made in 2010 is as enigmatic as the drama it became part of.

According to The Other Paper, a young woman named Jayma Sharpe, who was studying in Italy, was having dinner with some acquaintances there. At this meal, one of the guests turned out to be a huge record collector who specialized in jazz and r&b. As they ate, he told a fascinating story about how an old demo tape that had been recorded years earlier in Columbus, Ohio, was gaining notoriety after being discovered in Blue Valentine. Recorded sometime between 1970 and 1975, the song "You and Me" by Penny & The Quarters, a group of Ohio teenage hopefuls who had been invited to do some demo recordings at Harmonic Sounds Studio in Columbus, was now getting hundreds of thousands of hits on YouTube even though no one knew who the group was. But Jayma recalled that her mother, Nannie Sharpe, along with her uncles, had once been involved in Columbus's soul music community in the late Sixties. So Jayma decided to check out what she could find on the Internet. She discovered plenty. Nannie's phone soon began to fill with texts from her daughter. The mystery song that found its way into Blue Valentine had been sung by her mother.

When interviewed by The Other Paper, Nannie Sharpe recalled that "You and Me" was done as a one-take rehearsal demo and then put in a box for possible release in the future. (At the beginning of the song, as a guitar gently strums, you can even hear someone adjusting the mike in front of one of the singers.) But when the studio manager, Clem Price, passed away in 2006, the tape ended up being sold along with other recordings in an estate auction. The buyers were Numero Group from Chicago, a label that reissued rare r&b records done at small, independent labels. The owner, Rob Sevier, and his associate, Dante Carfagna, had no trouble identifying all the songs they had bought except for one box where, written in pencil, were notes that simply identified tracks one and two as "Penny & The Quarters." Numero Group released a CD, Eccentric Soul: The Prix Label, in 2007, featuring all the songs recorded at Harmonic Sounds – including "You and Me." But given that the material was an unknown quantity, the CD's sales tanked.

Eccentric Soul: The Prix Label

Two years later, though, Ryan Gosling just happened to be visiting his PR agent's home in Chicago. Kathryn Frazier, of Biz 3 Publicity, also represented the Numero Group, and so she started playing some of the soul CDs from their releases. The conversation abruptly stopped when Gosling began to hear "You and Me" billowing from the CD player. It stayed with him for many months. Later that year, while in rehearsals with Michelle Williams for Blue Valentine, Gosling still couldn't get the song out of his head, and he finally came to the conclusion that this was the song he needed to express the love Dean felt for Cindy. It would take only days after the movie's release before audiences themselves began to express their love: Numero Group's website was quickly deluged by downloads of "You and Me" even though the band itself was still a mystery.

The Other Paper, based in Ohio, first ran a cover story on the song in January 2011, at which time, Glodean Robinson came forward. She was the widow of Jay Robinson, the man who apparently wrote this mysterious track. "I was totally shocked. I knew the song," Glodean would tell The Guardian, "but I didn't know that version, with a lady singing. When I heard my husband's voice in the background, it just blew me away. I started crying. It brought back the loss – and all the memories." She had met Jay in 1975 and felt quite keenly his disappointment with the music business. "He wasn't successful with music," she remembered. "He never made any royalties. But it was something he gave himself freely to all his life." Given the dramatic trajectory of the couple in Blue Valentine, where jealousy and resentment partly determined their fate, it's a little unnerving how much of that texture is part of this song. "I felt very, very angry when I heard the song," Glodean continued in The Guardian. "I didn't know who Penny was, and something in her voice arose some jealousy in me." It also stirred some indelible recollections. "I remembered my husband told me he had went in the studio and redid a song he made – with a young girl. The studio was hot that day, everybody was cooking, and he only had a penny and a quarter in his pocket, so that was how he named the group."


But Jay didn't live to discover that the song he wrote was being used in Blue Valentine, and he also wasn't around to tell the world that the artists were Penny & The Quarters. That task was taken on by the daughter of the song's singer some forty years after it was recorded. Nannie Sharpe, who was nicknamed 'Penny' by her father, came from North Carolina and always had aspirations to sing. "I probably started singing with my brothers and sisters when I was five," she told The Other Paper. "We'd sing all the time, in church, in the house. We'd stand around, helping whoever's turn it was to wash dishes that week, singing together." (You can hear the close camaraderie Nannie describes in the tight but generously affirming harmonies that surround her in the opening bars of "You and Me.") In 1970, Nannie and her brothers Preston, Donald and Johnny, barely out of high school, were simply answering an ad in a newspaper. Harmonic Sounds was looking for singers to launch a label. "We would go over there every Saturday morning and stay all day, from 7am to 4pm. I remember thinking, 'Do we have to stay all day?,'" she recalls. Jay Robinson taught the group his song and helped them enunciate the doo-wop harmonies. "We were just trying to get ourselves on record," Nannie remarked. "It was exciting. It hadn't really set with us as to what we could be yet. We were just trying to get on board." But after "You and Me" never saw the light of day, the family just continued singing gospel at the Bethany Presbyterian Church. (Apparently, they tried to do a CD as DC and the Gospel Quest, but it also went nowhere.) Instead of a recording career, Nannie Sharpe became a mail sorter for some thirty years until moving to Woodbridge, Va., in 2006. Now 62, she's back singing gospel at the Harvest Life Changes Church.

When they first performed "You and Me," Penny and The Quarters had no idea they were being taped. They thought it was a run-through for some better take they'd record down the road, not recognizing that there would be no second act. Nor did they know that they were heading down a road that led, for a time, to obscurity. But it took a poetic evocation of time's mysterious passage, what Amanda Shubert described Blue Valentine as employing to tell its story, for Penny & The Quarters to actually exist as a real popular group within the movie, real enough for Dean to actually know their song (perhaps in that world it was even a hit) and to introduce his love Cindy to its sweetly hopeful cadences. "You and Me" is alive within the bittersweet world of Blue Valentine; it's also vibrant enough to define a marriage, a relationship, even the hopes of what two people thought they could uphold. In "You and Me," Nannie Sharpe sings of her own dreams in a pining voice that invokes the youthful aspirations of Frankie Lymon from some fifteen years earlier, aspirations that would ultimately end for him in a heroin overdose in 1968; but maybe even more closely, she borrows some of the yearning confidence of the young Michael Jackson, who was then still a couple of years away from his solo debut, the beginning of a lucrative and successful career that would also end tragically. There's indeed a scent of death that wafts all through Blue Valentine, but it is not just the death of a marriage, or of people like the man in the nursing home, or of the couple's favourite family pet, but also the innocence we must let go of in order to face the reality of our adult lives. Yet it's in the affirmative grooves of "You and Me," this quiet, once lost, unassuming masterpiece, that we finally get to recover the adolescent hopes and longings of what we painfully left behind.

-- September 29/12

Old Souls Made New: Brian De Palma's Phantom of the Paradise (1974)


William Finley as The Phantom in Phantom of the Paradise

Director Brian De Palma has accumulated a long list of neglected gems (The Fury, Blow Out, Casualties of War, Redacted), but the one whose neglect makes the least sense is his ingenious satirical rock musical, Phantom of the Paradise (1974). Fiendishly clever and percolating with film-making fever, the picture offers sly allusions to Phantom of the Opera, The Cabinet of Dr. Caligari and The Picture of Dorian Gray. (Last year, while teaching a class on Alfred Hitchcock and Brian De Palma, I had more angry responses to this picture than to some of De Palma's more inflammatory work.) But this pulsing musical comedy is an exhilarating modern retelling of the Faust myth (with roots in Dante's Divine Comedy) wherein a man becomes so consumed by his thirst for divine knowledge that he sells his soul to the Devil. In Phantom of the Paradise, though, the thirst is for something perhaps a little less lofty: rock immortality.

As a parable, the Faust myth has fascinated a long list of artists from all fields (for maybe the obvious reason that the hunger for immortal acclaim is at its root). The allure of the story inspired Christopher Marlowe's The Tragical History of the Life and Death of Doctor Faustus, written around 1592, a year before the author was killed in a brawl. Mozart also caught the bug in 1787 when he composed his opera Don Giovanni, a Don Juan story that the composer was inspired to turn into a Faustian one. Hector Berlioz composed a colourful dramatic cantata, The Damnation of Faust, but (like De Palma's Phantom) it was greeted with little enthusiasm when it premièred in Paris in 1846. On the other hand, Charles Gounod, whose previous work had gone unnoticed, had his first major success with his opera Faust in 1859. The Italian poet and composer Arrigo Boito, who would later find fame writing the librettos for Verdi's Otello (1887) and Falstaff (1893), turned to Goethe's Faust for higher glory in his opera, Mefistofele (1868). Even modernist composers couldn't resist the seduction of the tale. In Igor Stravinsky's 1918 chamber work, L'Histoire du Soldat (The Soldier's Tale), the Devil (in disguise) offers a soldier an old book filled with wisdom in exchange for his violin. The American composer Frank Zappa, who fell in love with Stravinsky's work as a teenager, reworked L'Histoire du Soldat in 1976 into a wickedly profane and funny oratorio, "Titties 'n' Beer," in which the Devil devours a motorcycle outlaw's girlfriend, plus his case of beer, which he says he'll return in exchange for the biker's soul.

Paul Williams as Swan

Variations on the Faustian deal inspired many writers, too, including Thomas Mann, who wrote his version, Dr. Faustus (1947), during the horrors of World War II; Stephen Vincent Benét, whose 1937 short story "The Devil and Daniel Webster" became a hit 1941 film; and Hans Christian Andersen, whose The Little Mermaid (1837) told the Faust story from a female perspective. Faust has also been the source material for musicals (Damn Yankees, The Band Wagon) and other movies (The Devil's Advocate). The only pop musician to tackle the tale head-on was Randy Newman with his Faust musical in 1995. While Faust should have been apt material for a songwriter who could turn God into a malevolent comic laughing at the follies of those who pray to him in "God's Song (That's Why I Love Mankind)," Newman's Faust was a huge disappointment. Instead of producing the defining work of his entire recording career, Newman made his musical interpretation of the myth too literal, where the targets of his usually sharp satire were made of straw. His Faust was a lethargic piece of craftsmanship with ultimately no soul to sell.

De Palma's Phantom of the Paradise, by contrast, has an imaginative power that links our associations with the legend of Faust to what we've already stored up from popular culture. The film concerns Winslow Leach (William Finley), a composer writing a cantata based on Faust, who is robbed of his music by Swan (Paul Williams), an entrepreneur looking for the right music to open his rock palace, the Paradise. We soon discover that Swan is under contract to the Devil in a deal to attain eternal youth, which is why he strongly identifies with Winslow's piece. Swan wants it performed by artists who he thinks have commercial potential. The comically varied bands, all played by the same trio of performers (Harold Oblong, Archie Hahn and Jeffrey Comanor), include The Juicy Fruits, a doo-wop ensemble; The Beach Bums, a surf band; and finally, The Undead, a goth-glam group (complete with KISS-inspired make-up). When Winslow is maimed (hilariously by a record-pressing machine rather than acid), he stalks the Paradise as a masked phantom, killing anyone who performs Faust, except for the singer Phoenix (Jessica Harper), whom he loves.

Jessica Harper as Phoenix

De Palma adapts and integrates Goethe's themes with gutsy imagination and flair into his own Grand Guignol design, daringly setting Faust in the corruptible world of rock, which has had a long history of recording artists selling their souls for a hit. The songs in Phantom (all written by the usually smarmy Paul Williams) are equally smart and witty. The opening track, "Goodbye, Eddie, Goodbye," sung by the Juicy Fruits, begins by parodying the melodramatic clichés of West Side Story. But the track also echoes the tragic suicide of Johnny Ace, who killed himself in a game of Russian roulette on Christmas Day in 1954, sending his greatest song, "Pledging My Love," to the top of the charts. "Upholstery," sung by The Beach Bums, is a cheeky re-write of Brian Wilson's "Don't Worry Baby," in which a game of chicken between two guys and their cars now turns into a Faustian bargain over pride. "Somebody Super Like You (Beef Construction Song)," sung by the Undead, is an uproarious parody of Nietzsche's Superman who is reborn as an androgynous Glam God named Beef (neatly invoking David Bowie in his Ziggy Stardust phase, and played by the peerless Gerrit Graham). In this instance, De Palma depicts with shrewd awareness how the audience, cheering on the Undead, willfully become sacrificial victims to the creation of Beef. But instead of trading their souls, they offer up limbs to their idol in order to give him life. Faustian themes wind like a river through the numerous musical genres De Palma weaves into the narrative.

Gerrit Graham as Beef

While William Finley is not the most charismatic of performers as Winslow, the actor's angular physical stature wittily matches the highly stylized expressionistic sets. And if Paul Williams' singing in the song "Faust" can't quite match the poignancy of his lyrics (with their brief allusion to John Lennon's utopian anthem "Come Together"), the urgency in his performance manages to put the song across. Jessica Harper's Phoenix, combining her sweetly innocent eyes with a vixen's curled lips just aching for corruption, is a worthy object of Winslow's (and Swan's) desire, but she also reaches surprising depths of both desire and sorrow in "Old Souls." ("Our paths have crossed and parted/This love affair was started long ago/This love survives the ages/In its story lives are pages/Fill them up/May ours turn slow.") Paul Williams' Swan represents an ambitious bit of casting given the singer/songwriter's role writing sentimental hits for The Carpenters ("We've Only Just Begun") during the Seventies. Throughout the picture, Williams remains mysterious and creepy, a discomfiting personality – drawn from his pop songs and past screen roles – who blends elusively into the part he's playing. Swan's final song, "The Hell of It," is an incriminating piece of work that also invokes our own unresolved feelings about the composer ("Born defeated, died in vain/Super destruction, you were hooked on pain/Tho' your music lingers on/All of us are glad you're gone").

Phantom of the Paradise, which was released by Twentieth Century Fox, never became the cult classic it deserved to be (it was eclipsed instead by Fox's inferior The Rocky Horror Picture Show, which managed to turn high camp into an audience fetish). But Phantom remains a great contemporary Faust musical, one which integrates with bold ambition Goethe's eternal themes of damnation and salvation with our pop obsessions with idols and success.

-- October 6/12

True Blood: Margaret & The Experience of Violence


Anna Paquin as Lisa in Margaret

"I am seriously thinking of writing a play for the screen. I have a subject for it. It is a terrible and bloody theme. I am not afraid of bloody themes. Take Homer or the Bible, for instance. How many bloodthirsty passages there are in them – murders, wars. And yet these are the sacred books, and they ennoble and uplift the people. It is not the subject itself that is so terrible. It is the propagation of bloodshed, and the justification for it, that is really terrible! Some friends of mine returned from Kursk recently and told me a shocking incident. It is a story for the films. You couldn't write it in fiction or for the stage. But on the screen it would be good. Listen – it may turn out to be a powerful thing!"

– "A Conversation on Film With Leo Tolstoy" quoted in the appendix of film historian Jay Leyda's Kino: A History Of The Russian And Soviet Film (Princeton University Press,1960); and later reprinted in Roger Ebert's Book of Film (W.W. Norton, 1997).


September 2001 marked my twentieth year as a film critic covering the Toronto International Film Festival. It was also the year of the terrorist attacks on New York and Washington. Before the carnage took place, I'd already been seeing a number of pictures that dealt with the subject of violence. But my response to the violence was as varied as the films themselves. South Korean director Kim Ki-duk's drama Address Unknown, for instance, attempted to tackle the cultural stigma of Korean women who had had children out of wedlock with American soldiers stationed in Seoul. But the director quickly lost sight of the more ambiguous ramifications of the story. Kim's unbridled rage instead got the better of him. There were so many florid scenes of mutilation and brutality that they overshadowed any compassion we might have had for the characters.

Then there was Patricio Guzman's documentary El Caso Pinochet (The Pinochet Case). The director meticulously put together a stinging indictment of the former Chilean dictator, who was arrested in 1998 and extradited for trial to England on charges of torture and murder. Guzman, a former Chilean exile, had been doggedly chronicling his country's turbulent history for over three decades. Ever since he filmed the coup of General Pinochet, which toppled the socialist Salvador Allende government in 1973, in his stunning three-part epic documentary, The Battle of Chile (1975, 1976, 1979), Guzman had made himself the caretaker of his homeland's national memory. While it lacked the cumulative power of The Battle of Chile, where we watched in horror as a cameraman captured his own death, The Pinochet Case was still a vividly personal and painful examination of the fallout from a nation's descent into totalitarian horror.

All through the press screenings of films from different countries, the scent of blood was in the air. The Austrian director Ulrich Seidl, in his first dramatic feature, Dog Days, had just captured the Grand Jury Prize at the Venice Film Festival with a soporific and toxic attack on suburban bourgeois living. Seidl's celebrated debut, an exercise in degradation so enervating that it made similar studies in suburban ennui (Welcome to the Dollhouse, American Beauty) appear steeped in empathy, set out to cruelly punish people for their moral turpitude. (I recall in particular one scene in which a sexually starved middle-aged woman, who passively harbours a sadistic boyfriend, gets her head flushed in a toilet for her troubles.) Seidl didn't express much sympathy for the victims, or even provide much in the way of an examination of the motives of the perpetrators. (Dramatic motivation isn't a calling card for directors like Seidl.) By the time one character remarked that "people can be cruel," right after discovering a poisoned animal, cruelty seemed a better epigram for the director. The violence in Dog Days also didn't shock you, or even disturb you, because the director's indifference to human suffering cancelled out the horror. Revulsion was perhaps the more appropriate response.

The night before the planes tore into the World Trade Center, I had been at a screening of Fred Schepisi's adaptation of English novelist Graham Swift's Last Orders. While not filled with the "terrible and bloody themes" that preoccupied Tolstoy shortly before his death, the sting of mortality was sharper here than in some of the more explicit bloodshed I'd been encountering elsewhere. Last Orders didn't have any moments of true violence, except for a brief fistfight during the journey, but the picture still had an unnerving impact that couldn't be talked away, or shaken off. When one of a circle of long-time buddies, Jack (Michael Caine), dies of cancer, the remaining friends and relatives (played by Bob Hoskins, Tom Courtenay, David Hemmings and Ray Winstone) carry out one of his "last orders": to take his ashes and scatter them off the pier at the British seaside town of Margate. As the group makes the trip, they take stock of their friend's death. But rather than provide nostalgic relief, or till the ground with sentimental remembrances, the journey uncovers unresolved wounds, lost opportunities, and once buried secrets. But could the tumultuous experience of that film possibly be sustained after the horrors the next day on September 11th? For the remainder of the Festival, I sat through film after film – good and bad – numb to their effect because the violence I had just witnessed in America had eclipsed any ability for drama to help me come to terms with true blood. And by the time I began writing my reviews, later in the week, I was on autopilot finishing a job, trying to connect to what I liked and what I didn't, yet unable to capture the experience of the movies themselves because I couldn't feel anything. At that moment, I was truly terrified. I didn't know whether a movie could ever reach me again, or perhaps ever begin to matter as it once did.

Last Orders (2001)

When JFK was assassinated in 1963, which was an earlier seismic occurrence in the culture, many of the films that followed actually tried to come to terms with the impact of that event. The murder of Kennedy itself was experienced as if it were an act of parricide, a family crime where guilt was as much present as shock, and the movies of the next couple of decades were often expressions of both sentiments. Which is why, sitting in the theatre for the opening night of Bonnie and Clyde in 1967, barely into my teenage years, I felt a new relationship developing between the violence on the screen and the audience, one I'll never forget. Rather than denying the power of violence and our capacity for it, the picture built our identification with the romantic outlaws. But we weren't given the satisfaction of watching the bad guys get it, as we so often were in the past, even in many gangster films. We were instead implicated in their crimes. The ultimate demise of Bonnie and Clyde didn't take the pain of death away; it heightened it. The concluding slow-motion barrage of gunfire, a twisting ballet of bullets and bodies, seemed to tear the screen apart even as it shredded their bodies. The effect left us feeling emotionally strafed rather than numb and indifferent.

Whether the movies themselves were good or bad, the violence in those post-assassination years wasn't inconsequential. But by the time 9/11 happened, screen violence had become, generally speaking, just another special effect, a new drug to get high on. If Kennedy's death served to influence the style and content of the many American films that followed, the events of 9/11 seemed to come out of contemporary American movies themselves. How often did you hear someone describe what they saw that day as looking like any number of action spectacles, either Die Hard, or maybe The Siege, or perhaps True Lies? The only difference was that there was no Bruce Willis to save lives and defeat the terrorists; no Arnold Schwarzenegger to terminate anybody. After 9/11, dramatic violence, rather than becoming the powerful and cathartic tool Tolstoy envisioned, instead became an expression of impotence. We may have more explicit violence in the movies now, even more than what Sam Peckinpah was criticized for in The Wild Bunch, but rather than put us in touch with what Pauline Kael once called "the sting of death," the brutality now seemed to be about creating a denial of death.

Even though, this past summer, The Dark Knight Rises initially inspired some of its fans to send death threats to film critics who panned it, the ultimate irony would be the horrifying massacre that took place inside a movie theatre where people were watching it. But for all the film's violence, its dealing with mortality, or its claims to social relevance in reflecting the financial crisis and the Occupy movement, The Dark Knight Rises went limp as quickly as it dominated the theatre screens. The Hunger Games arrived earlier in the spring amid buzz from fans of the best-selling novel, a book which many insisted said something profound about our fascination with death and spectacle. But rather than confront its own subject, the movie backed away and became a spectacle itself (and a lame one at that). The female hero is even denied an id – she's never tempted by the violence of the very pageantry she's taking part in. The only violence is committed either against those we're supposed to like (whom we will later see avenged), or against those we don't, so we are spared having to identify with those who actually become addicted to the Hunger Games. Even at the end, when sacrificial death seems inevitable, even as a form of protest against the inhumanity of the games, the audience is let off the hook so we can swoon at the two love-bird heroes whose mutual admiration comes at the expense of all the corpses piled up during the movie. For all the violence in both of these pictures, the propagation of bloodshed did little to make us come to terms with it.


But ironically, if there was one film this year that could wake us up to the true experience of violence, where it even became incumbent on us to respond, it came from a movie most people didn't see and many don't know exists. Kenneth Lonergan's Margaret (which arrived a few months ago unheralded on DVD) seems in many ways like a throwback to an earlier era when a picture could feel its way through a subject and find its meaning through understatement rather than trying to make a statement. Margaret, which started filming in 2005, didn't even reach movie theatres until six years later, and then not at its original length, and in barely enough theatres for people to take notice. The long tale of Margaret's painful evolution is maybe best explained elsewhere, but the film itself is a marvel of dramatic invention and catharsis. The basic story involves Lisa Cohen (Anna Paquin), a fervent 17-year-old New Yorker, who unwittingly participates in the accidental death of a pedestrian (Allison Janney) when she distracts a bus driver (Mark Ruffalo) who ends up running a red light. At first, she decides to protect the driver from criminal charges when giving her statement to the police, but she also holds herself partly responsible for the pedestrian's death. (She felt a personal connection, too, having held the woman's hand and comforted her until she passed away.) So Lisa ultimately decides to recant her statement and tell the truth. As she becomes more adamant about being responsible, about taking responsibility, a decision that moves her painfully out of adolescence and into adulthood, Lisa's coming-of-age incites a growing resentment towards her mother, Joan (J. Smith-Cameron). Joan is at a different stage in her life than her daughter. She's a stage actress bitterly divorced from her husband (played by Kenneth Lonergan) and has long lost touch with the pangs of newly felt experience that are burning her daughter up.
So Lisa finds a kindred soul in Emily (Jeannie Berlin), the dear friend of the deceased, and they both try to launch a lawsuit to achieve some justice, as well as get the driver (who has had other driving offences) fired. The lawsuit, of course, doesn't go as planned, but Lisa comes to understand how in growing older the world continues to test our very perception of it.

J. Smith-Cameron and Anna Paquin

Margaret is essentially about coming out of innocence (its title comes from the Gerard Manley Hopkins poem "Spring and Fall: To a Young Child," where he writes, "Margaret, are you grieving/Over Goldengrove unleaving?/Leaves, like the things of man, you/With your fresh thoughts care for, can you?"), but the deeper themes examined here touch on the numbness that followed the 9/11 attacks. In many ways, Margaret mirrors this spiritual malaise as well as the emotionally detached culture that has produced movies which cheapen death and violence. Lonergan makes the consequences of that kind of detachment the very subject of his picture. Throughout the director's cut of the movie (only available in the U.S.), there are continuous cutaway shots of planes cutting through the New York skies which seem as divorced from the city as the people are from themselves. But Margaret is about the necessity to make sense of experience and to claim its power to transform you. Kenneth Lonergan had already touched on this subject in his 2000 debut, You Can Count On Me, where a brother (Mark Ruffalo) and sister (Laura Linney), whose lives are torn apart as kids when their parents are killed in a car accident, have to come to terms with the legacy it leaves on their adult lives. Margaret is far more ambitious, though, a flawed, sprawling canvas of unresolved plots, undeveloped characters and loose ends. Yet its messiness still adds up to something extraordinary – an epiphany. The structure itself is operatic, with a soundtrack that (in the director's cut) samples Wagner's Tristan and Isolde and Bellini's Norma. Those operatic motifs not only illustrate how Lisa dramatizes her troubles, but also remind us of how operatic plots become inconsequential to the strong emotions stirred by the music within them.

The cast is almost uniformly perfect. Anna Paquin's Lisa is a tremulous current, a prickly presence, whose emotional daring continually sparks reactions from those around her. She can be almost cruel in her honesty to Darren (John Gallagher, Jr.), a sensitive young man who worships her, when she rejects him. She can also be as reckless in her pursuit of truth as she is in her desperation to lose her virginity (in a scene with Kieran Culkin that's as comical as it is discomfiting). But Lisa is also, like the film, trying to cut through the emotional stasis that surrounds her – particularly in her attempt to get through to Joan, who no longer trusts either her instincts (perhaps blunted by her failed marriage) or her stage career (which she feels is on autopilot). J. Smith-Cameron provides remarkable contrast to Anna Paquin. If Paquin's desperate intensity is all on the surface of her skin, Smith-Cameron's desperation runs so deep under her skin that it tightens around her like a strait-jacket hugging her bones. Some of the other actors, though, are less lucky. Matt Damon, as Lisa's teacher, who has no sense of boundaries and acquiesces passively to her sexual advances, is quite good playing a boy in a man's body. But the role seems sketched rather than developed. (Matthew Broderick, on the other hand, works small wonders with his brief role as an English teacher who's grown too comfortable and set in his ways.) Mark Ruffalo also doesn't get to fully develop the part of the bus driver, which appears underwritten. Jean Reno, as Ramon, a patron of the theatre who falls for Joan, initially brings a tentative sweetness to the part, but he never appears fully comfortable with the role. (He seems to be figuring it out as he plays it.) But the one other performance that equals Anna Paquin's, for its sheer force and skill, is Jeannie Berlin's as Emily. 
If Lisa's mother has lost the capacity to feel genuine emotions, Berlin's Emily is all too aware of the ways getting older can blunt your ability to respond with fresh eyes to matters of life and death. She fully recognizes that age has a way of making tragedy seem less dramatic because you have likely faced more of it than you did as a teenager. Emily's criticism of Lisa for turning her life into "an opera" is also her own rage towards a lost innocence in herself.

Jeannie Berlin and Anna Paquin

Unlike Atom Egoyan's The Sweet Hereafter (1997), which was also about a tragic bus accident and its impact on families, Margaret is about facing up to the truth of our worst experiences in order to act with integrity. (The Sweet Hereafter, by contrast, chooses to cop out by taking the side of the young witness who dishonestly implicates the innocent bus driver in the accident. Her accusation, made purely as a means to punish her sexually abusive father, vanquishes the town's hopes of holding the bus company liable and of collecting the huge financial settlement to come.) Margaret is hardly a perfect movie, but its imperfections seem to make it matter more. Like the messiness in life, the unresolved, sometimes unexplained episodes that make up our narratives, Margaret does full justice to the power of art to help us discover the kind of connections that make life bearable and also fulfilling.

-- October 11/12

The Secrets of Subversion: Steely Dan's "Reelin' in the Years"



One night in the late Seventies, on their network television show, Donny and Marie Osmond decided to perform a nostalgic tribute to the glorious days of our youth. Decked out in spangles and bell-bottoms, the duo picked a contemporary pop standard they believed caught the mood of nostalgia. As they began, they traded lines of the song as if exchanging precious memories:

You're everlasting summer
You can see it fadin' fast
So you grab a piece of somethin'
That you think is gonna last
Well, you wouldn't know a diamond
If you held it in your hand
The things you think are precious
I can't understand.

Despite the bitter and acrimonious tone of the lyrics, the siblings' performance was upbeat and grossly energetic. They exchanged smiles, tossed individual lines to each other and reached out their hands as if eagerly anticipating their high school reunion. By the time they reached the chorus and were singing in harmony, their mood turned curiously exuberant:

Are you reelin' in the years
Stowin' away the time
Are you gatherin' up the tears
Have you had enough of mine?

You probably recognize the song they selected as their tribute to the past: Steely Dan's 1972 hit "Reelin' in the Years." Hardly conceived as a bouquet of roses to the good ol' days, "Reelin' in the Years" was in fact a viciously satiric attack on those who do get misty over a walk down memory lane. But Donny and Marie responded only to the effervescent bounce in the melody. It was a common mistake people made with this band. When I worked on the CBC radio program Prime Time, in the early Nineties, our executive producer Dave Downey was a huge Steely Dan fan. And he got no end of grief for being one. While most of the other producers on the show found it easy to revel in the post-punk sounds of Radiohead, they missed the covert rebellion lurking within the smooth jazz arrangements of these Jewish songwriters, composers as sardonically ironic as the Coen brothers are in the world of film.

In 1972, Donald Fagen and Walter Becker were two pretty frustrated tunesmiths from New York who tempered their frustrations by forming a band. It helped that the pair had a kindred history. They originally met in 1967 at Bard College, discovering a mutual love for both black jazz and black humour while the rest of the college was grooving to Vanilla Fudge. Once Fagen graduated (and Becker got booted out), they decided to enter a partnership as songwriters. They first sought out the Brill Building, where they sold a few songs, including "I Mean to Shine" (covered by Barbra Streisand). But they soon found their sardonic sense of humour made it impossible to continue writing pop ballads designed to be hit records. In 1970, the duo abandoned songwriting and acquired employment as roadies for Jay & the Americans. But life on the road soon bored them, and they began writing songs at ABC Records, a label that took an interest in them and encouraged Fagen and Becker to record their own stuff.

Walter Becker and Donald Fagen

They quickly assembled a band from studio musicians they liked and called themselves Steely Dan, after a dildo that appears in William Burroughs' novel Naked Lunch (1959). Nobody seemed to get that little joke, so they took the prank further by composing abstruse, acerbic songs adorned by crisp and commercially friendly melodies. Underneath Steely Dan's smooth jazz arrangements though lay some pretty deranged comic stories. "We think of our records as comedy records to some degree," Fagen told The New York Times in 2003. "There wasn't really any model for that sort of thing, with the possible exception of Frank Zappa. But when we first started, people thought our style belied the actual content of the lyrics. So they thought we were just a sort of sincere California band. I guess that's the secret of subversion." In Avant Rock: Experimental Music from The Beatles to Björk (2002), Bill Martin writes: "What really makes the Steely Dan vision...is a synthesis of jazz-rock with a sound from the first decades of the 20th Century, a sound that I associate with Cole Porter, the Gershwin Brothers and Duke Ellington – I would call this sound 'music deco.' As with the art deco movement in design and architecture, music deco is innovation developed from popular materials. And, as with art deco, there is a definite Jewish side to music deco, or a synthesis of Jewish and African-American influences."

While incorporating such influences, Steely Dan also became masters of disguise and believers in the untrustworthy narrator (a trait they shared with songwriter Randy Newman). Besides the neatly veiled but anti-nostalgic "Reelin' in the Years," from Can't Buy a Thrill (1972), the band produced a number of deceptively perverse songs that miraculously found their way onto the radio. "Show Biz Kids," with its funky, catchy melody, took a well-aimed shot at the Hollywood rich and poor – not to mention their own fans. Look closely, and "Rikki Don't Lose That Number," which borrowed its seductive jazz melody from Horace Silver's "Song for My Father," is really about a transvestite. Likewise "The Fez," with its exotic dance rhythms, is actually a light-hearted advisory about wearing condoms. "Pretzel Logic," which accurately describes the swastika, takes on (at least in part) the subject of Adolf Hitler. "Any World (That I'm Welcome to)" is a sly critique of social alienation with a melody so beguiling you can sometimes hear the song playing, as I once heard it, in the most conventional places – like a supermarket.

Subversion and its secrets sometimes come in the form of a Trojan Horse. But we have grown so used to seeing and hearing rebellion in its loudest, most demonstrative forms that we tend to miss the kind that sneaks in through our door. As for Steely Dan's curmudgeonly view of human nature, it comes disguised as popularly accepted music. In "Reelin' in the Years," there is also a wicked devilry in their populist daring.

-- November 29/12

The Monkees: The Revenge and Resurrection of Tin Pan Alley



There was a time when it was seen as cool, and definitely hip, to disparage The Monkees. Perceived by some as the Justin Biebers of their time, they were even called "The Pre-Fab Four," cheap imitations of The Beatles, and dismissed as teeny-bopper fodder. Yet despite the crass commercial packaging and their faux A Hard Day's Night-style TV show, The Monkees (who early on had seasoned session men playing their instruments) were more than just a marketing executive's idea of a wet dream. They were essentially a return volley, a cannon blast that reached back to the American Revolution, aimed at the British Invasion bands led by The Beatles and The Rolling Stones. Were they simply a fad? Maybe they were conceived that way. But The Monkees turned out to be the revenge and resurrection of Tin Pan Alley.

Tin Pan Alley was the name given to the cluster of music publishers located on West 28th Street between Broadway and Sixth Avenue. From 1880 to 1953, this block became something of an epicenter for both songwriting and music publishing in America, and it provided the foundation for the standards of American song penned by composers like Rodgers and Hart, Irving Berlin, George and Ira Gershwin, Harold Arlen, Frank Loesser and Yip Harburg. Composers and lyricists were hired on a permanent basis, turning popular songwriting into an industry. Until the emergence of Tin Pan Alley, European operettas had been the predominant norm and influence on American songs.

When Elvis Presley essentially broke the pop colour barrier in 1956 with "Hound Dog," a cover of Willie Mae "Big Mama" Thornton's stinging R&B hit (written by the white songwriting team of Jerry Leiber and Mike Stoller), Tin Pan Alley found new life in the rock & roll of the early Sixties. It also found a new address in the Brill Building, located at 1619 Broadway in Manhattan, where a whole new generation of popular songwriters, most of whom were young Jewish kids from Brooklyn, found their artistic salvation. They included Carole King and Gerry Goffin ("Will You Still Love Me Tomorrow?"); Neil Sedaka and Howard Greenfield ("Breaking Up is Hard to Do"); Jeff Barry and Ellie Greenwich ("Be My Baby"); Barry Mann and Cynthia Weil ("You've Lost That Lovin' Feeling"); and Mort Shuman and Doc Pomus ("This Magic Moment").

Tin Pan Alley

As a pop music phenomenon, the Brill Building was the brainchild of Don Kirshner, an ambitious 21-year-old music publisher and native New Yorker who spent his summers as a bellboy in the Catskills listening to acts like Frankie Laine ("Moonlight Gambler"), who inspired him to try his hand at songwriting. In 1958, Kirshner met Al Nevins, a very successful composer with a pop group called The Three Suns. Kirshner convinced Nevins that by publishing songs, they could cater to the booming teenage market and make a fortune. Aldon Music was born and they set up shop in the Brill Building with an ambition to make more than a financial killing. "No larger gap could be imagined than that between the sophisticated cocktail music of Tin Pan Alley and the rude street music of rock & roll," wrote critic Greg Shaw in The Rolling Stone Illustrated History of Rock & Roll (1976). "Yet it was this very gap that Nevins and Kirshner set out to bridge."

For a brief but exciting period in American popular music, Aldon Music provided such a bridge. What helped lay the groundwork for the success of the Brill Building was that the music publishers – as they had earlier in the century – held significant power. By contracting work to songwriters, then shopping them to a number of record companies, they could pair off songs with performers in the stables of various labels. The record companies, having quick access to Top Forty radio, could make a bundle off releasing 45rpm singles. From 1958 through 1963, they had astonishing and lucrative success, until The Beatles and The Rolling Stones arrived on shore to spoil the party. "The British Invasion introduced us to the concept of the self-contained singing band," wrote Frank Zappa in his memoir The Real Frank Zappa Book (1989). "The success of British groups forced a change in the way new American groups were put together. They now had to be self-contained because every bar band that hired live music wanted its own little U.S. version of The Beatles or The Rolling Stones." Once The Beatles transformed American pop songwriting by instilling the idea that performers could write their own material, the young songwriters of the Brill Building in New York found fewer and fewer outlets for their work. In order to survive (as well as to follow The Beatles' lead), some, like Neil Diamond and Carole King, began recording and singing their own compositions.

Don Kirshner

But when The Beatles disappeared for a short while after their last tumultuous world tour in 1966, Don Kirshner saw his chance for a comeback and went back to the drawing board. With the help of two burgeoning Hollywood producers, Bert Schneider and Bob Rafelson, he came up with the concept of creating The Monkees, a replica of the Fab Four for a new generation still pining for the lads from Liverpool. In doing so, they not only satisfied the bottomless nostalgia in the TV audience, they also had a band to perform material produced by the Brill Building songwriters. Many notable Los Angeles musicians were auditioned for parts in The Monkees, including the eccentric Van Dyke Parks, who would ultimately collaborate with Brian Wilson on The Beach Boys' doomed Smile project; Stephen Stills, who was rejected because his teeth and hair were not considered TV friendly, so he went on to create the Buffalo Springfield with Neil Young; Bobby "Boris" Pickett, who did the novelty song "Monster Mash"; and Danny Hutton, who went on to fame with Three Dog Night. In the end, the producers went with British actor Davy Jones, American musicians Mike Nesmith and Peter Tork, plus American TV actor Micky Dolenz.

While The Monkees would appear to be performing as a pop band on the show, it was session musicians who were providing the music. The series would kick off on September 12, 1966, with an episode called "Royal Flush," where Dolenz tries to save a princess from her evil uncle. But The Monkees had only one single, "Last Train to Clarksville" (which composers Tommy Boyce and Bobby Hart based on the fade-out harmony of The Beatles' "Paperback Writer"), on the radio at the time. Their other songs ranged from the Three-Blind-Mice melody of "The Monkees Theme," to the cloying ballad "I Want to be Free." Their attempt at straight-ahead rock was the rather tepid Freddie Cannon imitation "Let's Dance On." Since Schneider and Rafelson knew that the band needed to fill at least six or seven minutes of the show with music because the scripts were (to put it charitably) pretty thin, they made a phone call to Kirshner, who was now the head of the Columbia/Screen Gems music division. Kirshner put his stable – Gerry Goffin and Carole King ("Take a Giant Step"), Neil Diamond ("Look Out (Here Comes Tomorrow)"), Neil Sedaka ("When Love Comes Knocking at Your Door"), Barry Mann and Cynthia Weil ("Love is Only Sleeping") – back into the spotlight. Within the week, Kirshner sent a dozen pre-recorded music tracks for the group to dub their voices onto, plus a number of new songs. There was now enough material to fill out the season, plus some extras to fit a debut album. The fall of 1966 saw "Last Train to Clarksville" reaching #1, along with the TV show.


The band's relationship with Kirshner over the next few years, though, was hardly one of mutual generosity. In particular, Mike Nesmith, the gifted Texas musician and songwriter, was pissed off, feeling more like a trained chimpanzee. He wanted the group to actually be a group and play their own instruments. In time, leading their own revolution, The Monkees would squeeze Kirshner out for $35 million in compensation thanks to Nesmith's rants (and threats). By their third album, Headquarters (1967), they finally became more of an autonomous group. But without a Kirshner to hate, The Monkees began to fragment over the years. Before the end of the decade, their show was taken off the air. They rallied to make one counter-culture cult film, the inchoate Head (1968), which had an improbable cast that included boxer Sonny Liston, Victor Mature, Annette Funicello and composer Frank Zappa. Though it was easy to see The Monkees as an inauthentic rip-off of The Beatles, merely hired hands playing trivial pop, the group did show some substance beneath its plastic cover. In fact, Frank Zappa, who had already been snidely satirizing the values of American plastic culture, thought The Monkees sounded better than the love-and-beads bands that were sprouting up in the wake of The Beatles' retirement from touring. He would even make an appearance on their television show, where he and Mike Nesmith switched identities to do a mock interview.

Besides the little joke of having The Jimi Hendrix Experience open for them on their 1967 American tour, The Monkees would go on to inspire a number of surprising acts in the years to come. Rappers Run-D.M.C. would record Nesmith's "Mary, Mary" in 1988; Smash Mouth took on the Neil Diamond-penned "I'm a Believer" in 2001; and Cassandra Wilson would give new life to "Last Train to Clarksville" in 1995. Besides being a huge influence on Paul Westerberg of The Replacements (who performed the John Stewart composition "Daydream Believer" and Nesmith's "You May Just Be the One"), R.E.M.'s Michael Stipe once stated that they would not accept induction into the Rock and Roll Hall of Fame until The Monkees were inducted.

The ascension of The Monkees made it clear that, in The Beatles' absence, pop fans continued to hunger for a spark of magic, a sense that what they believed back in 1964 wasn't a false promise. Maybe The Monkees were something of a false promise; but they were also possibly one of the first clone bands that ultimately made some good pop records. And for a short period of time, American songwriters brought back to life their dream factory of pop standards.

-- December 2/12



                                                                   2013

When We're Older Things May Change: Janis Ian's "Society's Child" (1966)



In 1963, Martin Luther King Jr. delivered his "I Have a Dream" speech, declaring that children wouldn't "be judged by the colour of their skin but by the content of their character." The Freedom Movement, which fought the early battles for desegregation in the South and voter registration for black Americans, was extending a call for a shared vision of interracial harmony. King, the political and spiritual leader of the civil rights struggle in the United States, called for the country to abandon the bitter legacy of slavery. King's speech, that hot day in August, hit like a bolt of lightning, and suddenly a vision of hope and possibility spread throughout the country. Critic Craig Werner persuasively describes that promise in his book A Change is Gonna Come: Music, Race & the Soul of America. "For people of all colours committed to racial justice, the Sixties were a time of hope," he writes. "You could hear it in the music: in the freedom songs that soared above and sunk within the hearts of marchers at Selma and Montgomery; in the gospel inflections of Sam Cooke's teenage love songs; in Motown's self-proclaimed soundtrack for 'young America'; in blue-eyed soul and English remakes of the Chicago blues; in Aretha Franklin's resounding call for respect; in Sly Stone's celebration of the everyday people and Jimi Hendrix's vision of an interracial tribe; in John Coltrane's celebration of a love supreme. For brief moments during the decade surrounding King's speech, many of us harboured real hopes that the racial nightmare might be coming to an end."

Janis Ian, a white Jewish girl who was born Janis Eddy Fink in New York City in 1951, was probably one of those touched by the hope that the racial nightmare would end. However, by 1966, the war in Vietnam had taken priority over Lyndon Johnson's war on poverty, and violent riots in black inner-city ghettoes every summer were reducing King's aspirations to ashes. Still, some American liberals wanted to see, in their films, television shows, and music, more racial harmony – despite evidence to the contrary. By the mid-Sixties, more black actors were appearing on TV shows, such as Bill Cosby on I Spy, or in solemn exercises in maudlin melodrama, like Guess Who's Coming to Dinner, where Sidney Poitier, who had electrified audiences a decade earlier in Blackboard Jungle, was reduced to a benevolent token. Under that haze of white liberal denial about some ugly facts, fourteen-year-old Ian wrote a song in 1966 about interracial dating while she was waiting to see the guidance counselor. When "Society's Child" was recorded a year later, the tune stirred up a storm of reaction.

Janis Ian on The Tonight Show in 1967.

"Society's Child" begins innocently enough with a baroque melody played on a harpsichord before the full orchestra joins in. Ian's voice hovers over the arrangement, with a longing that feels beyond her adolescent years, yet still has the buoyant anticipation of first love: "Come to my door, baby/Face is clean and shining black as night." Within moments, though, those eager yearnings start to crumble:

My mother went to answer, you know that you looked so fine
Now I could understand your tears and your shame
She called you 'boy' instead of your name
When she wouldn't let you inside
When she turned and said, 'But, honey, he's not our kind.'

While Ian sings her mother's words to her black suitor, you can hear his response, not with words, but in the gospel organ crying out from behind her voice. Then the song turns into what we believe will be a story of defiant young lovers turning against the mendacity of their elders:

My teachers all laugh, their smirking stares
Cutting deep down in our affairs
Preachers of equality
If they can't believe it, then why can't they just let us be?

Those were strong words for 1967. Ian was saying that all the talk of racial harmony was just platitudes – she was the one walking the talk. In the next verse, she stands up against the injustice:

One of these days, I'm gonna stop my listening
Gonna raise my head up high
One of these days, I'm gonna raise my glistening wings and fly.

But then, after she affirms her own values, the simple facts come crashing down:

But that day will have to wait for awhile
Baby, I'm only society's child
When we're older things may change
But for now this is the way
They must remain.

When it was released, "Society's Child" was banned on radio stations in both the North and the South. Unlike the writers of most topical folk songs, who locked themselves into a clear opposition against the status quo, Ian spelled out that racism hadn't gone away – even within the character in the song. She went against the highly romantic idea that the goal of a protest song was to change the way people think and act. What I'm saying is that Janis Ian didn't make herself the proud spokesperson for the ideals of Martin Luther King's dream. Her thoughtful and intelligent single is instead an accurate account of liberal condescension, and of one young girl's honourable – yet failed – attempt to rise above it. You could say Janis Ian's song, which played hide-and-seek on the airwaves that year, cut right to the bone of the disease that culminated in Martin Luther King Jr.'s assassination a year later.

-- February 6/13

The American Absurdism of Carl Stalling



When it came to writing music for animated cartoons, Carl Stalling wrote some of the most outrageously impudent material heard this side of Spike Jones. Thanks to Stalling, it wasn't unusual in a Looney Tunes or Merrie Melodies cartoon to hear a happy collision of bassoons, trombone slides, mysterioso strings, violin glissandos and his memorable "boinnngg!" sound created on the electric guitar. Together, these instruments created a bold, anarchic sound for some of the wittiest and purest examples of American absurdism.

In his Memoirs of a Useless Man, the Venetian dramatist Carlo Gozzi said that "dramatic fables" should contain "the great magic of seduction that creates an enchanted illusion of making the impossible appear as truth to the mind and spirit of the spectators." This idea probably best describes the ultimate goal of animation. Even more than dramatic realism, the cartoon demands a suspension of disbelief. And if music is essential to movie drama, it is no less a significant component in animation. As Roy Prendergast accurately pointed out in Film Music: A Neglected Art, the element of exaggeration in cartoons already had its antecedent in the 18th Century comic operas of Carlo Gozzi and others (like Mozart). The rapid, almost frantic rhythm of opera buffa demanded that the music keep pace with the action. This is no less true of animation. Most North American animators looked to the 20th Century neoclassic style already heard in contemporary artists like Igor Stravinsky rather than the 19th Century romanticism preferred by most Hollywood composers. "In dramatic films of the 1930s and '40s the chromaticism of the nineteenth century was appropriate because of the music's tendency to de-emphasize small-scale musical events, thereby drawing the listener's attention to a large sense of movement," Prendergast writes. "Cartoons, on the other hand, are usually nothing less than frantic movement consisting of a series of small-scale events, and the music in cartoons plays at least an equal role with the animation and story in establishing the humourous success of events."

In the Thirties, the Walt Disney Studios, with their emphasis on the smooth, cultivated style of drawing, had little interest in 20th Century neoclassicism, instead cribbing their soundtracks mostly from the standard repertoire. For example, The Opry House featured a Rachmaninoff prelude, while Grieg's "March of the Dwarfs" from Peer Gynt turned up in one of their later Silly Symphonies. The feature film Fantasia (1940) was a desperately commercial attempt by Disney to, as Pauline Kael put it, "combine high art and mass culture." Here, animators merely designed visuals to accompany famous pieces of music. Conducted by Leopold Stokowski, Fantasia featured diverse compositions, from Bach's Toccata and Fugue in D Minor to Tchaikovsky's Nutcracker Suite, accompanying such images as volcanoes erupting and dinosaurs battling. Beethoven's Pastoral was even used as the backdrop for female centaurs frolicking like happy peasants. Stravinsky's The Rite of Spring was the film's one concession to 20th Century music. As Kael observed, Disney became the precursor "of the musical processing in [Kubrick's] 2001." In Disney animated features like Snow White and the Seven Dwarfs (1937), Pinocchio (1940), Dumbo (1941), and Bambi (1942), right up to the present day's Beauty and the Beast (1991), Aladdin (1992) and The Lion King (1994), the studio turned away from the practice of pilfering classical scores, and looked more to the tradition of the Broadway musical.

Carl Stalling

With Carl Stalling, Warner Brothers brought on board a composer with a sophisticated sense of the ridiculous. Stalling was born in Lexington, Missouri, in 1898, and his earliest introduction to music was, not surprisingly, improvising tunes on a toy piano. He fell in love with the movies as a five-year-old, after he saw a screening of The Great Train Robbery. By the time he was twelve, Stalling conducted his own orchestra in the pit at Kansas City's Isis Theatre. It was there that he met Walt Disney, who offered him a couple of assignments – starring Mickey Mouse. Stalling was so successful at musically enhancing these shorts, Disney brought him to his newly formed studio in Hollywood where Stalling created one of his most famous pieces, the highly imaginative "skeleton dance," used in the first of Disney's many Silly Symphonies.

By 1930, Stalling grew restless at Disney and moved to other small animation studios. In 1936, he arrived at Warner Brothers, where a madly inspired team of animators, Bob Clampett, Chuck Jones and Friz Freleng, were putting together a collection of wise-ass animated characters including Bugs Bunny, Daffy Duck, Elmer Fudd and Porky Pig. For twenty-two years, until his retirement in 1958, Stalling was a dissonant jukebox that could play just about every American musical idiom ever invented. Rather than simply raid the classical repertoire, as Disney did to cut costs, Stalling sought out popular songs that Warners could purchase. His idea of popular, though, included ridiculously odd ditties like Raymond Scott's "Dinner Music for a Pack of Hungry Cannibals," or "How Dry I Am," which he forever linked in our imagination to states of inebriation.

"Set against the historical happenings in American music in the Thirties and Forties, Stalling's achievements become even more impressive," composer John Zorn wrote in a 1990 appraisal. "Copland's pantonality; [John] Cage beginning to explore the sonic possibilities of the prepared piano with quiet, Satie-inspired music; [Harry] Partch freaking out and building his own instruments based on his own forty-six-tone tuning theories; [Duke] Ellington balancing improvisation and composition with his swinging, harmonically lush big band...it was a period of basically conservative American impressionism invaded by the search for new sonic resources." Curiously, when Stalling retired, he despaired over the state of the art of scoring for animated pictures. "One trouble with cartoons today," he remarked shortly before his death in 1972, "is that they have so much dialogue the music doesn't mean much." For Stalling, music was dialogue – especially in cartoons that were about music, such as What's Opera, Doc? and One Froggy Evening. Stalling's variations on Rossini's Barber of Seville, for instance, while Bugs Bunny and Elmer Fudd duke it out on the opera stage in Rabbit of Seville, are as madly inspired as the satirical shots at high culture in The Marx Brothers' A Night at the Opera (1935). When Hal Willner released the first CD collection, The Carl Stalling Project: Music from Warner Brothers Cartoons, 1936-1958 (1990), not only could we clearly hear the playful surrealism at work in Stalling's orchestrations, but we could also happily re-imagine the inspired lunacy in those loony images he enhanced.

-- April 12/13

Legacies: Peter Collier and David Horowitz's Destructive Generation (1997) & Paul Berman's A Tale of Two Utopias (1997)


In a 1994 episode of Law & Order called "White Rabbit," assistant D.A. Jack McCoy (Sam Waterston) is prosecuting a political fugitive from the Sixties who is found guilty of the murder of a policeman years earlier. Reflecting shortly afterward on the sentencing deal he offers her, he becomes rather wistful. "She'll be in jail until 2003," he comments to his younger assistant Claire Kincaid (Jill Hennessy). "I think the Sixties should be over by then." Now, ten years after that former fugitive presumably won her freedom, it's doubtful that Jack McCoy got his wish. The decade turns out not to be so easily put to rest. Most of what we experience today politically, socially and culturally is still being measured by the turbulence of that decade. I'm not suggesting this in any paternalistic way to those in the present, as if it's just too bad that you weren't there. It's simply that it's hard to think of any other decade (aside from perhaps the Thirties) that has divided as many people as the Sixties did. (A new film from Robert Redford, The Company You Keep, proves the point by continuing to stir up the pot with a story about a contemporary journalist who is on the trail of a Sixties anti-war fugitive.) Nobody ever argues with any passion about the Eighties or Nineties, just as nobody really argued about the Forties and Fifties. (Although many said they were glad to have survived them.) And though history has wrought numerous contentious periods, no other decade of the last century seems as alive with prickly debate as the Sixties. The decade may be a half century behind us, but it isn't dead and buried as Jack McCoy had hoped.

The continued life of the Sixties is not just a matter of ongoing baby boom nostalgia for oldies tunes, or the occasional sight of John Sebastian on television in a cardigan encouraging us to remember Woodstock; there are real political issues that haven't gone away. Every decade since then has seemed more like a reaction to it. Consider the current agenda of the Tea Party and its right-wing constituents: they have gained their momentum by attacking any issue that had its roots in the Sixties. Their idea of progress is the opposite of the Sixties': they choose to slash, rather than build on what came before. What I suspect also makes the Sixties so volatile a subject, even today, is that it was the last decade in which people felt the urgent promise of possibility. They had a feeling of boundaries being stretched, history being made, wrongs being addressed, and alternatives being tried. And these possibilities were being shared by diverse groups who also shared a utopian vision. It was a time, as critic Greil Marcus once said (in writing about The Beatles), when you could join a group and find your individuality.


But this period also had its shadow side. The utopian promise of The Beatles was soon blighted by the Manson Family. The militant non-violence of Martin Luther King Jr. was transformed into the violent revolt of The Black Panther Party. The Students for a Democratic Society ultimately abandoned democracy, and embraced bombs, as the Weathermen. And Woodstock's peace and love would be shattered by the violence and death months later at Altamont. Promises were broken and hopes were dashed. Two books from the late Nineties, Paul Berman's A Tale of Two Utopias (1997) and Peter Collier and David Horowitz's Destructive Generation (1997), are impassioned attempts to come to terms with those broken promises. Both books, in their radically different ways, are important works in understanding why the decade lingered.

Destructive Generation, which was first published in 1989, appraises the Sixties as a time in which "American mischief fermented into American mayhem." For Collier and Horowitz, who once manned the barricades as editors of Ramparts magazine, only to become in the Eighties "Lefties for Reagan," the Sixties was an acting out of the political psychopathology of Dostoyevsky's The Possessed. Political terror became inevitable, according to Collier and Horowitz, in an era in which they saw the political ends justifying the means. That view is explored in a number of powerful, vivid essays on some of the key political activists of the period. These include a chilling portrait of the Jekyll/Hyde persona of the late Black Panther co-founder, Huey P. Newton, and a compassionate one of radical lawyer Fay Stender, who defended black revolutionary George Jackson, only to reap a horrific conclusion for her efforts. Collier and Horowitz also demonstrate acute psychological deftness in their reading of the evolution of the Weathermen movement, which sought to bring down the American government through violence, but only succeeded in destroying itself.

At its best, Destructive Generation shows how student radicalism turned into a crude form of anti-Americanism (just as Stalinism a generation earlier corrupted the American democratic old left). The Vietnam War divided the country and ushered in a period of self-hatred in which America (in its own eyes) became the personification of evil – and Ho Chi Minh became a savior. But Collier and Horowitz fail to show how the assassinations of the Kennedys and Martin Luther King also darkened the hopes of a benign social reform movement which, as a result, turned sour as radicals began to embrace criminality. The other central flaw of Destructive Generation is that, while the authors present the political struggle of the Sixties as a litany of horrors, they exclude much of the government corruption that inspired it. (Once leftist ideologues, they have now become ideologues of the right, as attested by Horowitz's Front Page magazine, which has become the doppelgänger of Ramparts.) By embracing the policies of Ronald Reagan when writing their book because he "acknowledged the fragility of American democracy," they neatly avoid an honest appraisal of how Reagan exploited that fragility with deregulation and Iran-Contra. Their selective conclusions deny Destructive Generation the larger meaning it might have had.


Paul Berman is a Sixties radical who hasn't recanted – although since 9/11 he has certainly grown more skeptical of the left's continued romance with totalitarianism. His A Tale of Two Utopias also tries to make sense of what went wrong in the Sixties after 1968. Like Destructive Generation, A Tale of Two Utopias is made up of a collection of essays, but because Berman is still trying to sort out the confusions of what he calls the moral history of the baby boom generation, the book lacks the focus of Collier and Horowitz's. Berman sees a link between the revolutionary activity of the Sixties, the struggles of the later Velvet Revolution in Czechoslovakia, and some of the liberal philosophies espoused by Francis Fukuyama. But Berman, while acknowledging the communist corruption of the democratic left, doesn't bring the psychological nuances required to show how the left actually matured through that painful process of corruption into a new form of democratic liberalism. At times his arguments read like he's still shuffling position papers, rather than clarifying the dramatic shape of a movement.

Berman's writing is at its smartest in A Tale of Two Utopias when he illustrates how the American left grew to admire the communists of China and Vietnam (over the Soviet Union) not just because of politics, but because – being non-white – they represented the oppressed like American blacks. He also writes a very perceptive account of the awakening of gay activism, and how, after the Stonewall protest of 1969, it survived divisive ideological wars to embrace a wider diversity. And in a witty account of how the simultaneous arrival of two very different American figures – Shirley Temple Black and Frank Zappa – in Czechoslovakia in 1990 was greeted by the Czechs as a common happening, Berman shows how America is not viewed abroad as a nation at war with itself. But it has always been at war with itself. It is a country whose politics were born in a revolution that freed it from colonialism, but whose psychological roots were built on the whims of Puritans who escaped religious persecution in England and came to America to carry out "God's Will."

From its very beginning, America has been torn by that legacy. And this lineage (which includes a costly and divisive Civil War) has been played out on the world stage for many years. If both Destructive Generation and A Tale of Two Utopias, in their very different ways, show a country in the Sixties looking for transcendence while continuing to relive its formative conflicts, these two books' strengths and failings are perfectly in keeping with America's own divided heritage.

-- May 5/13

Getting Game: Bring it On (2000)


Kirsten Dunst and Gabrielle Union

Who would have thought that a film about competing cheerleading squads could be so much fun? Certainly not me. But Bring it On springs plenty of surprises. For one thing, it's not just another hormonal teen comedy about sex. It's also not another self-congratulatory jolt of testosterone about winning the big game. And even if the romantic parts of the story follow in the footsteps of an already familiar formula, the picture has a tickling spirit that tweaks you on the nose. Director Peyton Reed and screenwriter Jessica Bendinger have put together an affectionate and cheerful look at what is often a catty and competitive sport without turning snide about it. The cheerleaders aren't bubble-headed conformists who fear losing status at the high school. Reed and Bendinger create instead a comic tapestry that dispenses with pom-pom-waving clichés. Bring it On shows the relationship the sport has to interpretive dance, swing, martial arts and even Busby Berkeley choreography. We can see how the standard cheerleading routines at school football and basketball games are only warm-ups for national competitions that are every bit as difficult as gymnastic events.

In the story, Torrance Shipman (Kirsten Dunst) has just been elected captain of her Toro cheerleading squad from Rancho Carne High School in San Diego. She's carrying quite a tradition on her shoulders, too, because they have won six straight championships at the nationals. But when one of the veterans breaks her leg during practice, they elect a new girl, Missy Pantone (Eliza Dushku), to take her place. While working out their standard winning routine, Missy informs Torrance that it was stolen from the Clovers, a black hip-hop squad from East Compton in Los Angeles, by the Toros' former captain. While Torrance wrestles with finding a new routine for the Toros and develops romantic feelings for Missy's brother Cliff (Jesse Bradford), the Clovers, led by Isis (Gabrielle Union), are having financial problems that might keep them out of the championships altogether. Bring it On is about how both captains try to save the reputations of their squads so they can finally square off to see who is the best.


There is certainly no escaping the not-so-hidden irony that having a very white-bread cheerleading squad taking their best routines from a black one echoes how black culture has always been appropriated by whites. Yet in this case, Bring it On is about more than Little Richard being ripped off by Pat Boone. The filmmakers don't make the Toros easy targets for that kind of derision. The film instead shows how mutual respect in the profession helps to transcend the racial divide. Watching both of these groups compete is like trying to compare Van Morrison and John Lee Hooker: They both have soul.

Speaking of soul, Kirsten Dunst brings poise and a rapt intelligence to the part. Her earlier work in Little Women, and comedies like Drop Dead Gorgeous and Dick, showed Dunst to be a wily and unpredictable presence whose face can also buoyantly light up a screen. In Bring it On she plays a young woman grappling with both her talent and her integrity, and she doesn't once turn it into a solemn exercise in personal redemption. The sinewy Eliza Dushku (Buffy the Vampire Slayer, Dollhouse), who seems to bend the camera around her limber frame, starts out as a moody outsider, but gradually transforms herself into a shimmering jewel. And Jesse Bradford grounds the film with the pleasantly droll humour of a junior Don McKellar. As dazzling as Gabrielle Union is (she seems to move to the rhythms of her speech), her scenes are far too brief. Bring it On doesn't spend enough time contrasting the two teams (and their captains) so that we can comprehend how different the process of training is for both of them. The film also errs in having a stock villain in Aaron (Richard Hillman), a male cheerleader who starts out as Torrance's boyfriend. He's nothing more than a wind-up Ken doll on steroids.

Bring it On both satirizes and revels in the sexual banter and the songs that bait and celebrate, and it doesn't put these people down for their lifestyle choices. Where a picture like American Beauty (1999) took the easy road by turning suburbanites into soulless consumers and materialists (so we could feel morally superior laughing at them), Bring it On takes the air out of smug self-righteousness. The picture has us beaming at a profession that has drawn its fair share of put-downs. Given the jaded times we live in, this could be seen as downright heroic.

-- July 21/13

Shadow and Light: The Fiftieth Anniversary of With The Beatles (1963)



When The Beatles' second album, With The Beatles, was released almost fifty years ago in the UK, it stayed at the top of the pop charts for a startling 21 weeks. If you consider that it was released on November 22, 1963 (the day President Kennedy was assassinated), and was ignored in the United States by Capitol Records, the American subsidiary of their British label, the feat was extraordinary. Yet despite the circumstances, or perhaps, in part, because of them, the sounds within those grooves caught the times like few other albums ever did – and changed them. With The Beatles arrived on that cold late fall day amidst a national tragedy, and yet it became a tonic. The songs mixed joy seamlessly with sorrow, their brightness overshadowing darkness, as four white boys exuberantly celebrated their love of black music.

The stark black-and-white cover photo of the group had them posed in moody half-shadow. (Greil Marcus once called it a "take it or leave it" shot.) Taken by Robert Freeman, it was inspired by images of the band created by Astrid Kirchherr, an existentialist photographer in Hamburg, Germany. She shot The Beatles while they were dressed in their tough leather outfits on days between their battles with the drunken patrons in the seedy bars of the Reeperbahn's red light district. Freeman's cover photo tells us more in retrospect than it did at the time of the record's release. When The Beatles first entered Hamburg in 1960 it was a prosperous commercial city that had attracted ships with cargo and people since the nineteenth century, just like the group's hometown of Liverpool, England. But Hamburg, for them, was like stepping into a dark mirror of their own upbringing. Hamburg offered the group a reverse image of the repressive postwar Britain they'd come to know and reject. Coming from a world where they grew up on food rations, The Beatles had now entered a world of free sex, prostitution, drugs and alcohol. But the German audience they faced in Hamburg was also living in its own dark mirror. They existed in the silhouette of a painful history that they shared with The Beatles – the Second World War and the Holocaust.

Those horrors were still present for Germany, but now the German audience wished to distance itself further from the violence it had perpetrated a couple of decades earlier. Meanwhile, The Beatles sought through the force of their stage presence to act out retribution for the violence that had been perpetrated against them during the blitz. "They traded curses and outrages with their crowds," wrote critic Devin McKinney in his book Magic Circles: The Beatles In Dream And History. "Lurched about like cartoon cripples, hollering holy hell into the sunrise, [John] Lennon in particular must have found it a kind of heaven: demented mind theatre, Goon Show without censor or good taste, as he goose-stepped and played Der Fuhrer to the crowds." Playing off the Nazi horrors of the recent past, The Beatles turned the audience into their adversary as well as their muse. In treating this drunken mass as their foe, they discovered a way to mould their distinctive differences as four individuals into one soul: they found their identity as a group. From there, their ultimate goal was to become a musical force that would conquer the world. And soon, they did.

The Beatles in Hamburg in 1960

But by the summer of 1963 they had already shed those leather outfits in Hamburg for suits. Instead of drunks screaming obscenities and hurling objects at them in dingy bars, there were now young girls screaming in adulation and fainting in their presence in theatres all across Britain. In Hamburg, The Beatles operated in the shadows of night and out of the trauma of a darker history to become a powerful affirmative force. From the rubble of the blitz in Britain and into the dolorous subculture of Hamburg, The Beatles had cause to say "yes." "Existentialism was our way of expressing our difference from the old Germany," Astrid Kirchherr recalled. "Our major influence was France. America was too far away, and it couldn't be England for they were our enemies." The Beatles, returning from that enemy country, dramatically altered the perceptions of these young Germans. Now on home turf, they would step into the light with an affirmative sound that also had its shadow side. With The Beatles was a musical cyclone next to their 1963 debut record, Please Please Me, because the emotional drive of the album was pitched about as high as the decibels in the songs. With The Beatles took love songs and tinged them with doubt, desperation and regret, and then turned them into pure ecstasy.

On the opening track, "It Won't Be Long," Lennon's cry jumps out of the speakers before the band seems to realize the song has started. The song revels in its triumphant "yeah, yeah, yeah" refrain (the same refrain that decorated their previous hit single, "She Loves You"), but you can hear the underlying torment in Lennon's tone. "Lonely and rejected, he sits at home waiting for the girl who has walked out on him to come back and make him happy," wrote Steve Turner in A Hard Day's Write. "As in so many later songs, he contrasts the carefree life he imagines everyone else is having with his own anguish, believing that once he's reunited with his loved one all his problems will be solved." Lennon's "All I've Got to Do" follows like a quiet exhale of breath after the urgent exuberance of "It Won't Be Long," but the yearning is equally intense. Drawing its inspiration from Smokey Robinson & The Miracles and the seductive moodiness of "You Can Depend On Me," the track's optimism is tinged with a fear and uncertainty that force us to imagine Lennon's horrible fate if the loved one he calls out to doesn't reply. "All My Loving" is probably the most joyful song Paul McCartney has ever written about abandoning a lover. Conceived rightly as a country and western song, with a rhythm section nicked by Lennon from The Crystals' "Da Doo Ron Ron," the song aches with longing. The sweet desire in McCartney's voice is answered by George Harrison's supple guitar solo which is as beautifully economical as James Burton's deft touch in Ricky Nelson's equally joyful "Hello Mary Lou."

If "All My Loving" is optimistic about the future, George Harrison's first composition for the group, "Don't Bother Me," speaks for its title. A rock rhumba motored along by the hypnotic rhythm of African percussion, the track brings to the surface the brooding underbelly of the album. "Don't Bother Me" is not only a plea for privacy from a composer who didn't trust the utopian dream of The Beatles; it anticipated the madness of Beatlemania, which would vindicate the fears he expressed. While it's easy to assume that only McCartney could have been attracted to a show ballad like "Till There Was You," both he and Lennon were steeped in the history of Broadway tunes. For McCartney, this famous track from Meredith Willson's 1957 Broadway musical The Music Man came from his father, who played many traditional standards in his jazz band. Since "Till There Was You" was written as a duet between the con artist Harold Hill and the scrupulous librarian Marian, McCartney duplicated the conception by having Harrison's flamenco guitar answer his pleas. While many pejoratively cite "Till There Was You" as nothing more than Paul McCartney's taste for kitsch, the track is actually more a romantic reverie, a nod to the Tin Pan Alley balladry that inspired the group, than a concession to sentimentality. No doubt after the relentless momentum created by the previous tracks, the band saw the need for a breather. There's a delicacy in their performance here that's matched by the beautiful precision of the playing.

The Beatles recording With the Beatles 

Out of the soft classicism of "Till There Was You" jumps "Please Mr. Postman" with a hungry cry of "Wait!" heard over Ringo Starr's urgent slap of his drum cymbal. Sung by Lennon, this cover of The Marvelettes' first Number One, from December 1961, has a primal urgency that changes the original intent of the song. In The Marvelettes' version of "Please Mr. Postman," lead singer Gladys Horton goes for charm in conveying her despair. If her boyfriend's letter doesn't arrive in the postman's hands, she lets you know that she'll probably get by until it does. As far as she's concerned, it's the boyfriend's loss if he doesn't write her. John Lennon, on the other hand, approaches the song with impetuous abandon, his voice a gale of unrestrained craving bubbling out of an electrical current of anxiety. He seems to be saying that if he doesn't get the letter, he'd just as soon die. There's so much anguish in his voice that by the time he reaches "you didn't stop to make me feel better," you can hear the primordial echoes out of Lennon's past, when a 17-year-old boy experienced the loss of his adored mother, who was killed by a reckless driver.

Out of the cyclone of "Please Mr. Postman," With The Beatles cuts loose with George Harrison's take on Chuck Berry's "Roll Over Beethoven." If there is one songwriter in rock 'n' roll who has an endless gift for memorable (and enjoyably clever) anthems, it's Chuck Berry. Whether it's his wry pledge of allegiance in "Rock and Roll Music," his testament to roots in "Back in the USA," or the happily defiant "Roll Over Beethoven," Berry is the supreme storyteller, rock's Johnny Appleseed, a smooth talker and a smooth walker. "Roll Over Beethoven" is a personal manifesto: by the time The Beatles covered it, the song was a climactic cry announcing a new music designed to knock down doors, and put all those middle-of-the-road icons in their grave. The fact that it's George Harrison, in his self-deprecating voice, who's chosen to take the boots to those doors only makes the ironies richer and the song more pleasurable. While "Hold Me Tight," a Lennon/McCartney composition that tries to emulate the galloping rhythms of The Shirelles (who they covered with "Baby It's You" and "Boys" on Please Please Me), is the record's only clunker (it clumps instead of gallops), they're on much firmer ground with Smokey Robinson's "You Really Got a Hold on Me." As on "Please Mr. Postman," Lennon dramatically changes the character of the song. In the original 1962 version by The Miracles, Robinson comes across as a man who is feeling fragile and in need of finding strength in the wake of discovering his own vulnerability. Lennon turns those tremulous yearnings into demands and transforms Robinson's tenderness into a resilience that's borne out of having your heart violently seized by the one you desire. In The Beatles' version of "You Really Got a Hold on Me," she has a hold on Lennon, but Lennon makes sure that we know the romantic obsession cuts both ways.

"I Wanna Be Your Man" is a quickly tossed together rave-up for Ringo to sing (The Rolling Stones, who covered it, gave it the much needed pungent drive of an Elmore James blues), but The Beatles responded right after with the rare 1962 Motown track by The Donays called "Devil in His Heart" (retitled "Devil in Her Heart"). The Beatles never shied away from covering girl group songs – like The Cookies' "Chains," and The Shirelles' "Baby It's You" and "Boys" – because it gave them a more encompassing view of romantic love. (The fact that The Beatles, because of their long hair, were always being teased by older adults for 'looking like girls' further broke down the period's traditional gender perceptions.) The girl groups had come out of the Tin Pan Alley of the Brill Building between 1958 and 1965, and would also include The Chantels ("Maybe"), The Paris Sisters ("I Love How You Love Me"), The Crystals ("He's a Rebel") and The Ronettes ("Be My Baby"). While their hits were written mostly by contract songwriters, they turned those tracks into a joyful revelry about 'getting the boy.' If the 'boy' was the ideal love object to these girl groups, the 'girl' was for The Beatles. The warmth and generosity in the sound of those girl group records also had a utopian spirit that fit right into The Beatles' own quest.


"Devil in Her Heart" is a song about romantic betrayal that George Harrison heard while in the U.S. visiting his sister a couple of years before the band would debut there. In The Donays' version, singer Yvonne Allen keeps insisting the boy she met is an angel sent to her, while the background chorus advises her that he's got the devil in his heart. Besides the switching of gender roles, there's another significant difference between The Donays' version of duplicity and The Beatles' particular take. "[Allen]'s pissed at her friends for slandering the guy; she's not going to even consider that they're right," critic Dave Marsh writes about the song in The Beatles' Second Album. "[Harrison]'s not denying anything, just insisting that she's such a great, uh, kisser that he's willing to operate under whatever set of illusions is required...where Allen is strident in her denial of the accusations, he's obstinate in his denial of the truth." What follows "Devil in Her Heart" is "Not a Second Time," one of John Lennon's most underrated songs, a mournful dirge in which his system of defenses – erected against being rejected – is put to the test by his remorseful voice. Although he doesn't want to be hurt a second time, he still can't live without the possible hope of passionate love. "Not a Second Time" never received much airplay, nor was it ever performed live, but this was the first Beatles song to attract serious attention from classical music critics. William Mann, the reviewer with The Times, put unneeded weight on this pensive track by comparing the song to Gustav Mahler's "Song of the Earth" with its "Aeolian cadences." (Upon reading the review, Lennon assumed "Aeolian cadences" were exotic birds.)

With The Beatles concludes with a mammoth punch that echoes its opening. "Money (That's What I Want)" is another Motown cover co-written by the label's founder, Berry Gordy. First released in 1959 and sung by Barrett Strong, "Money" is about a man who substitutes a lust for money for his romantic desires. In The Beatles' version, sung by Lennon as if he is fighting through a hailstorm, he goes for the cash with a raw gusto. But he also lets you know that he's lost a lot of sleep making that choice. Lennon's delivery is resolute, like that of a prisoner who has spent too much time in solitary confinement and is now breaking out of prison without a care that he could be captured by the guards at any moment, or even defeated by his own doubts. Lennon embraces the belief that money will fulfill all its promises of freedom, but it's a last-ditch hope because his soul is torn up by going for the loot. As in "Please Mr. Postman," Lennon is driven by a torment that won't let him settle for less. In this case, it's the attainment of a freedom he feels money will offer him. Lennon goes beyond expressing the song's basic sentiments in a frenzied attempt to let the emotions carry him forward with the sole purpose of making the experience of freedom authentic to him. Within the unprecedented lunacy of Lennon's performance is buried the singer's knowledge that attaining money won't give him the freedom he requires. So he sings like a man motivated only by his hunger, by his belief that this brink of desire will be the only freedom he'll ever know.

The incongruous qualities built into The Beatles' cover of "Money (That's What I Want)" didn't find their equal until 1983, when Cyndi Lauper, a punkier Betty Boop, startled listeners of her debut album, She's So Unusual, with an astounding version of The Brains' 1978 track "Money Changes Everything." The Brains were an Atlanta punk band led by Tom Gray, and "Money Changes Everything" was a modern equivalent of Barrett Strong's "Money (That's What I Want)" in which Gray watches his girl take off with a guy who is rich. As he sings it, Gray appears completely resigned to the pain caused by her departure. It's as if her leaving was inevitable in such callow times, so he takes refuge in his defeat. In his mind, money is clearly the enemy for what it's done to their romance. Cyndi Lauper, on the other hand, infuses her version with some of the same ambiguous hunger John Lennon expressed in The Beatles' cover of "Money (That's What I Want)." Portraying the role of the departing woman, she is as defiant as Lennon, and yet she fully recognizes that her love has been violated by her desire for lucre. But she also sees that in seizing the cash, in an impulsive jump for freedom, she has become fully culpable in the destruction of the love affair. When she spits out "It's all in the past now/ Money changes everything" (with a wincing emphasis on "past"), she expresses the sting of what that past means, and you know that she'll feel it long into the future. With a biting fury, Lauper uncovers the tangled mess left when true love becomes corrupted by the things you fail to see, or can't possibly control.


With The Beatles, which was recorded in 11 sessions and took over 30 hours to complete, would eventually find listeners on American shores in 1964 when (after the single "I Want to Hold Your Hand" went to Number One and The Beatles played The Ed Sullivan Show) Capitol Records issued two LPs in quick succession, Meet The Beatles and The Beatles' Second Album, which cribbed together the songs from With The Beatles supplemented by singles not approved by the group. Today it would be unthinkable that Capitol, a subsidiary of EMI, would turn down the official albums and instead create its own bastardized versions of Beatles records without the group's input. In Canada, Capitol actually released With The Beatles intact, as Beatlemania! With The Beatles, simultaneously with the UK release. Since Paul White, the A&R executive at Capitol Canada, was from England, he knew how big the band would become. Meet The Beatles had the same cover photo as With The Beatles, except that the image was brightened with a blueish tinge, perhaps because Capitol didn't like the starkness of having the band's profiles in half-shadow. (Neither did EMI or The Beatles' manager Brian Epstein, for that matter.) But the cover photo, with its shadow and light, did express the bold complexity of the new sounds within the record.

The cover became so iconic that it was often parodied, as on The Residents' 1974 album Meet The Residents. Genesis also invoked it for the cover art of their 1986 single "Land of Confusion," a song that passionately lamented the broken legacy of the Sixties in the era of Margaret Thatcher and Ronald Reagan. Despite the uncertainty of the times captured in that black-and-white photo, public demand for the record was unabated. While the rest of the world was concentrating on the tragedy of JFK's murder, the police in England were more concerned with keeping control over the crowds crushing into the record stores. With joy spilling into the streets of London, it wouldn't be long, to paraphrase the title of the album's opening song, before dour, grief-stricken America would experience the same.

-- August 7/13

Artist as Apostate: Bob Dylan in 1966



Back in 1966, John Lennon was worried about whether he'd be killed as The Beatles criss-crossed America in a summer filled with race riots and a heated controversy over a comment he made about the group being more popular than Jesus Christ. But there was another performer, one often mistaken for a prophet, having similar qualms that summer: Bob Dylan. Not only did the events of that season of hate alter the path of Dylan's career, they dramatically transformed the artist himself. He went from being a man making history to one who feared becoming its pawn. That summer determined not only his retreat from pop stardom, where a reluctant avatar suddenly saw the possibilities of betrayal; it also changed the game. With Dylan's Another Self Portrait, on sale in stores today, which collects unreleased sessions from the two albums (Self Portrait, New Morning) he made during his retreat from his audience between 1969 and 1971, you can hear in many of its songs the desire for solace. But the quiet in their sound, the soft beauty of "Pretty Saro," the contemplative quest in "Went to See the Gypsy," is deceptive. Another Self Portrait also has room for the tragic seduction of "House Carpenter," and the plaintive account of brutal murder in the traditional "Little Sadie." What all these songs have in common is that they portray a man seeking refuge in the more subtle confinements of the chamber room. But he couldn't hide from a world he helped create.

Just before The Beatles began confronting the many pitfalls of being idolized pop stars in 1965, folk troubadour Bob Dylan had decided to enter the pop arena himself. During the early part of the Sixties, Dylan had been an active member of the American folk revival, a dedicated musical movement that had aligned itself with the Civil Rights struggle and was committed to carrying on the long, ennobled tradition of left-wing activism. The movement was led by such stalwart figures as Pete Seeger, Odetta, Ramblin' Jack Elliott and Joan Baez, and it had as its figurehead the legendary Woody Guthrie. What Elvis had been to the birth of rock, Guthrie certainly was to the heart of the American folk movement. Within this revival was yet another quest for a renewed country, and the music carried a righteous spirit to get them there. Unlike The Beatles' utopian ideals, though, their vision had an authentic set of values attached, and it wasn't located in a place in the mind. These believers looked out into America with an obligation to the dream of JFK's New Frontier. They demanded an America with justice for black and white, men and women. In their music, it was held that the values of the marketplace would never take precedence over the value of human life. They rejected the urban hustle and bustle for what they saw as the honest simplicity of the rural communities. Unlike pop music, perceived by the folk community as an ugly symbol of capitalist corruption, their music set out to document the pure struggle of all peoples, not just one artist's petty self-interest. If you were to write a folk song, it wasn't going to be "I Wanna Be Your Man," but rather, "We Shall Overcome."

Into this sacred world stepped an enigma named Bob Dylan. Dylan had abandoned his actual name of Robert Zimmerman and set out to become a folk-singing legend before the age of 25. By 1965, however, he abandoned the rustic look of the folk hero and donned a leather jacket. Dylan radically altered his repertoire as well by borrowing players from Paul Butterfield's Blues Band, picking up an electric guitar and plugging in. One night, at the 1965 Newport Folk Festival, a loud and unhappy community expressed their displeasure when Dylan took a traditional folk song named "Penny's Farm" and turned it into the loud urban blues of "Maggie's Farm" (a song he had included that same year on his half-electric/half-acoustic Bringing It All Back Home). With this song, he declared his independence from a movement that had recently crowned him their young leader. "I ain't gonna work on Maggie's farm no more," he boldly cried out. It was quite clear from the power of his voice exactly whose farm he wasn't gonna work on. Just before Newport, Dylan stated his mission when he tore up the pop charts with an electrifying six-minute single called "Like a Rolling Stone." In it, he announced to his followers that, unlike their topical songs, his music was no longer going to usher in a better world. Dylan had made it clear to those who loudly booed him, and to anyone who cared to listen, that his music wasn't a product of history. His music was about to make history.

Bob Dylan at Newport, 1965  

By 1966, as Dylan continued to embrace urban blues and rock 'n' roll on his stunning two-record set Blonde on Blonde, many in the folk community declared that Dylan had sold them out. In their eyes, he had embraced the Golden Calf and been seduced by rock's vulgar paganism. He had abandoned the pastoral integrity of their indigenous music. In this perceived act of apostasy, Dylan's decision was deeply affected by the pop storm created by The Beatles. Dylan claimed that he saw a line being drawn and that this British group had been no teeny-bopper fad. He saw a new possibility for himself in reaching a larger audience by providing a greater scope for his music. As he embraced the challenge The Beatles posed, Dylan abandoned the self-righteous dogmatism he saw inherent in the ambitious goals of the folk movement. He made haste for the abstract language of dreams, surrealist tales, and comic allegories filled with real and wildly imagined personalities. When he launched his electric tour in the spring of 1966, the concerts that followed were charged with a peculiar ambiance, a prophecy of what would come to pass by 1968, when assassinations, riots and an escalating war would tear America in half.

To face the angry swarm of betrayed folk fans, Dylan brought on board a Canadian rock group from Toronto called The Hawks. As the American crowds hissed, jeered and loudly booed, drummer Levon Helm (the only American in the group) decided to haul his tail back to Arkansas. Being a proud Southerner, Levon didn't play music to bear insults. After securing drummer Mickey Jones, the group headed to England, the proud home of the Fab Four. But unlike the Fab Four, they weren't greeted with any "yeah, yeah, yeahs." The hostility, in fact, grew so intense that each concert became one more bloody battle in a long, protracted war. Dylan began each concert with a set of his acoustic music, but the lyrics were sometimes slowly drawn out, emphasizing the sound of his voice rather than the literal meaning of the lyric. When he came back from the intermission, however, with the crowd mostly calmed and expectant, he and The Hawks launched into some of the loudest, most powerful rock heard from a live stage. Their highly amplified music took no prisoners and it asked no favours. "It was a musically revolutionary time," remembers The Hawks' lead guitarist Robbie Robertson. "Who else can talk about playing all over North America, Europe and Australia and being booed every single night?"

Bob Dylan and The Hawks

Before Dylan embarked on this tour, there were many who feared for his life. Folk singer Phil Ochs had actually raised that concern a year earlier. "Dylan has become part of so many people's psyches – and there are so many screwed up people in America, and death is such a part of the American scene now," Ochs remarked. In 1965, when Ochs made that frank observation, death was just beginning to be part of the American scene. John Kennedy, Malcolm X and Medgar Evers were already dead from assassins' bullets, but nobody had yet gunned down a pop star, let alone a celebrated folk artist. Ochs understood that Dylan was making himself a lightning rod for the rage of those beginning to feel dispossessed from the dreams of their country. The dispossessed were no longer just alienated loners, like Lee Harvey Oswald; they could now just as easily be angry, forsaken idealists. When they confronted Dylan, these pacifists, who had been dreaming of a new country based on egalitarian values, were coming up against some dark primal emotions within themselves. Listening to their cascading jeers, Dylan certainly wasn't unaware of the fuss he was causing. He could feel the turbulent waves of resentment building with each show – and the audience definitely let him have it. Who was he kidding, offering the frivolous "Leopard-Skin Pill-Box Hat" over "Blowin' in the Wind?" How dare he take those condescending shots at Mr. Jones in "Ballad of a Thin Man?" At the Manchester Free Trade Hall, on May 17, 1966, one frustrated individual decided to speak up. Keith Butler was a mild-mannered Dylan fan from Toronto, attending school at Keele University in England. There would soon come a time when Butler would lead a relatively normal life, happily raising a family and finding mainstream employment as a banker; but on this particular evening, he impulsively stepped forward to change the course of Dylan's show.
If Dylan could change history with his music, Butler thought, he could step into history and change it back.

Butler had actually enjoyed the acoustic half of the show, like most of the folk purists. But when Dylan returned with The Hawks to play electric rock 'n' roll, Butler's mood changed as drastically as the crowd's. In particular, he was resentful about Dylan's radically altered versions of the once folk-flavoured "Baby, Let Me Follow You Down" and "One Too Many Mornings." "I was disappointed, very emotional, and my anger just welled up when he did two songs I loved in that electric guitar way," he told Andy Gill of Mojo years later. With his rage boiling over, Butler stood up and uttered one simple word – and it had the impact of a cold, hard-hitting bullet. At the moment Dylan finished "Ballad of a Thin Man," in the silent second that briefly cut into the endless clatter of displeasure heard in the hall, Butler yelled out to Dylan, "Judas!" After Butler made his remark, the shock waves rippled through the audience as if lightning had shorted out the amplifiers. Some in the crowd nervously cheered him, but Dylan was visibly stunned. In his 1963 "Masters of War," he had already identified the arms merchants with Judas Iscariot, and he was applauded for saying so. A year later, in "With God on Our Side," he asked listeners whether Judas might have had God on his side. Now on this evening, with nobody but The Hawks on his side, a fan in the audience had accused Dylan of being the ultimate betrayer. He quickly realized that this electric music, which he delivered with imagination, freedom and power, was more potent than he could have ever imagined. But he didn't know that it would bring forth a whole different set of consequences than a world-changing tune like "A Hard Rain's Gonna Fall." The only way for him to claim back the truth of this music was to come back hard. "I don't believe you," he told Butler curtly. Then, as the band sought to regain its composure, Dylan called out, "You're a liar!"
Turning back to Robbie Robertson to break Butler's spell and gather the troops, Dylan refused to passively accept the role of the apostate. He calmly told the group, "Play fucking loud." With those words, Dylan unleashed a torrential version of "Like a Rolling Stone." In this 1966 performance, Bob Dylan was telling his followers that they were now on their own, with no direction home. Lines were clearly being drawn in the sand – and the sand was now shifting.


Putting aside for a moment the anti-Semitic intent of yelling "Judas!" at a Jew, the persona of Jesus was taking on a curious shape that year. After all, it was Lennon, mere months after Dylan's confrontation with Keith Butler, who became a target for saying that The Beatles were more popular than Jesus. Now Bob Dylan was being identified as Judas. Pop music, which was once basically a vehicle for immediate gratification, revealed a messianic spirit that was beginning to emerge. Within it, all of pop's participants could play out grander roles with higher stakes to win or lose. In particular, pop stars could believe they were delivering The Word (and even imagine themselves being crucified for doing so – as John Lennon did in "The Ballad of John and Yoko"). The pop audience could also play a crucial role in this sacrificial ritual. They could be like Roman soldiers hoisting their charlatan heroes on the cross, and hammering in the nails just to watch them die for our sins. Before the end of the Sixties, all of this religious masochism was being explicitly acted out in musicals like Jesus Christ Superstar, rock operas like The Who's Tommy, and films like Privilege (1967). The listener, no longer content to be a mere consumer of music, now sought to be a protagonist in a larger story. Viewed in this particular context, Dylan didn't just perform a disappointing concert that Keith Butler happened to attend. For Butler, the show represented a larger drama with core values at stake. Butler stepped right into Dylan's music that night, making himself part of its very fabric, and demanded a change to the concert's outcome. If, for his audience, Dylan had once been a Jesus figure, Butler would take the role back by accusing Dylan of being a false prophet.

In 2002, when singer/songwriter Robyn Hitchcock re-enacted the entire Manchester Free Trade Hall show from 1966 at the Borderline Club in London, many in the audience sought to play Keith Butler during the evening performance. A few yelled "Judas!" – after the wrong song – either unable to remember Butler's place in the story, or perhaps wishing to alter its time line. Maybe they wanted to see if they could change the outcome of the show. But someone did eventually step into Butler's shoes at the correct moment before Hitchcock and his group, imagining themselves as Dylan & The Hawks, found their way into "Like a Rolling Stone." Although the spirit of the evening was all in good fun, with an absence of the possible danger lurking in 1966, Hitchcock's performance held up as a reminder of the shadow side hidden in the allure of utopian ideals. The Beatles' "There's a Place" once held out a hand that invited us to venture to another place, asking us to be an active partner in a dream rather than remaining a passive consumer. "I'll let you be in my dreams, if I can be in yours," Dylan once sang in "Talkin' World War III Blues." But what Dylan and The Beatles were to discover in 1966 was the risk of asking people to take part in your dreams. Once they do, maybe the dream is no longer yours to control.

-- August 27/13

Scorsese's Jukebox


Rock music didn't make its first true appearance in movies until 1955, when Bill Haley & the Comets' "Rock Around the Clock" introduced movie audiences to its power in Richard Brooks' youth drama Blackboard Jungle. This jumping tune, heard over the opening credits, got people hopping with the kind of infectious enthusiasm not seen since the beginning of The Swing Era. Blackboard Jungle was the story of a new teacher (Glenn Ford) who begins a job at a school in the 'wrong' part of town. He initially gets a lot of grief from the underclass students he's trying to teach. But one of his colleagues gets more than just grief. He tries to interest his charges in jazz. But the music of Stan Kenton and Bix Beiderbecke makes no dent in their not-so-impressionable minds. (The poor teacher is forced to watch his prize collection of records get tossed around the room and smashed to bits.) The picture was noted for introducing audiences to the raw and exciting presence of Sidney Poitier, but the lasting memory is of a public so startled by "Rock Around the Clock" that Clare Boothe Luce, the American ambassador to Italy, protested Blackboard Jungle's inclusion in the Venice Film Festival that year because (thanks to Bill Haley & The Comets) it incited people to violence.

But was it the film that incited audiences or was it the music? As one of the teenagers who flocked to the theatre, American composer Frank Zappa recalled how explosive the experience was in an article he wrote years later for Life magazine. "It was the loudest rock sound kids had ever heard at that time," he wrote. "I remember being inspired with awe. In cruddy little teenage rooms across America, kids had been huddling around old radios and cheap record players listening to the 'dirty music' of their lifestyle...But in the theatre, watching Blackboard Jungle, [parents] couldn't tell you to turn it down." Despite the compromised storyline of Blackboard Jungle (which had the old people winning in the end), the movie represented for Zappa "a strange sort of 'endorsement' of the teenage cause: 'They made a movie about us, therefore we exist...'" Pop music was all about telling audiences that they did indeed exist, and sometimes it even informed them of things society didn't deem moral. Until Altman in McCabe & Mrs. Miller, though, rock music had largely been used in movies to either characterize the artists, as in Elvis's films, or enlarge their popularity, as in The Beatles' A Hard Day's Night and Help!. But it was in the film underground where pop once again became a virtual jukebox in Kenneth Anger's Scorpio Rising (1963).

Scorpio Rising, Anger's adoring look at the biker subculture with its pedigree of Nazism, had no dialogue. Instead, the movie's homoerotic fetishes (it also examined the idolatry of cinematic rebels like Marlon Brando and James Dean) were accompanied by an eclectic variety of pop songs that spoke for Scorpio (Bruce Byron). They included Ricky Nelson's bossa-nova take on "Fools Rush In (Where Angels Fear to Tread)," Little Peggy March's pre-feminist anthem "I Will Follow Him" (and also her unnerving celebration of subservience in "Wind Up Doll") and the Shangri-Las' memorable biker tragedy, "Leader of the Pack." One thing that Scorpio Rising makes clear is the manner in which pop songs already borrow from the movies, to the degree that they become films in miniature. (I mean, where would "Leader of the Pack," a tale of romantic woe, be without the dozens of teen screen melodramas before it?) Some critics have also claimed that Scorpio Rising exerted a considerable influence on Martin Scorsese's use of music in his films, especially in his 1967 debut feature, Who's That Knocking at My Door? (which is named after – and features – the infectious 1959 hit doo-wop song by The Genies) and his propulsive Mean Streets (1973). But Anger's pop score doesn't provide the film's heartbeat. It could just as easily be Scorpio's record player going in the background, simply playing his favourite singles as he spiffs up his bike and dons his leather duds. What Scorsese did was something radically different.

Scorpio Rising

Perhaps following Altman's example in McCabe of using pop music as the inner voice of the picture, Scorsese used songs as if they were arias in an opera. In Mean Streets, for instance, the memorable inclusion of The Ronettes' sublime “Be My Baby” over the opening credits sets us up for the spiritual conflicts of the petty hoodlum (Harvey Keitel) who gets seduced by both the Church and the Mob. The Ronettes sound like sirens on the shore luring the protagonist into sin and guilt. What other song could have the potency to make moral rot appear so seductive? When the wildly unpredictable Johnny Boy (Robert De Niro) strolls into the bar, bringing mayhem into the controlled life of Keitel's career criminal, he's accompanied by The Rolling Stones' "Jumpin' Jack Flash," as if Jagger and Richards had someone like Johnny Boy in mind when they penned the song a few years earlier. Unlike director Stanley Kubrick, who continually stripped the pop song of its sensuality (as in A Clockwork Orange, where Gene Kelly's "Singin' in the Rain" becomes a call to rape), Scorsese fed off the sensuality of the music just as his characters fed off the sensuality of the transgressive life they'd chosen. In his next picture, the seldom acknowledged Alice Doesn't Live Here Anymore, Ellen Burstyn's precocious son, Tommy (Alfred Lutter), listens to Mott the Hoople's "All the Way From Memphis" as if it were an instruction manual giving eternal guidance to the rebellious life he plans to have ahead of him. Martin Scorsese always opened up a triad of meanings in popular music by illustrating the many ways that pop intersects with our lives. Unlike classical music, or even jazz, pop music gives depth to the erotic immediacy of surfaces.
The use of Wagner in Apocalypse Now is nothing more than a classical-music joke underlining helicopter Valkyries delivering death and napalm. On the other hand, in the same movie, the water-skiing to The Rolling Stones' "(I Can't Get No) Satisfaction," heard on a transistor radio, carries the soldier who sways sensually to it – and the audience – into his way of dealing with the madness of war.

But sometimes incessant bows to sensuality can become like cocaine hyping up dead air, as the use of music proved to be in Scorsese's Goodfellas and Casino. While it's hard to argue with the inclusion of tracks in Casino like the transcendent "I'll Take You There" by The Staple Singers, or Clarence 'Frogman' Henry's delectable "Ain't Got No Home," or the way Tony Bennett's "Rags to Riches" underscores Henry Hill's early desires to be a gangster in Goodfellas, the music is overall too arbitrary. In Goodfellas, the music (as good as it is) is wall-to-wall, as if Henry Hill just keeps changing the station on his car radio. Sometimes the music doesn't even go with what we're watching, as in the scene where Hill is madly dashing about in a state of paranoia while we inexplicably hear George Harrison's romantic ode "What is Life." Scorsese's promiscuous use of pop music is comparable to Hal Ashby's in Coming Home, his sentimental Vietnam War drama, where the music pretty much drowned out the picture when it wasn't being applied to make points. (The inclusion of The Rolling Stones' "Out of Time" ("You're out of touch my baby") as Bruce Dern's clearly reactionary military officer marches along kills the appeal of the song.) Only the majestic piano coda to Derek & the Dominos' "Layla," used over the discovery of dead gangsters in Henry Hill's criminal family, makes any dramatic sense in Goodfellas.


In Casino, the pop score is even more scattershot than in Goodfellas. How does The Animals' powerful rendition of "The House of the Rising Sun" illuminate the tragedy of Sharon Stone's heroin overdose? Does Scorsese use the entire seven-minute take of The Stones' "Can't You Hear Me Knocking" as we watch the routine of gathering cash at the gambling establishment because he likes it too much to cut it? Better to buy the CD soundtrack than to see the film. The soundtrack CDs of Goodfellas, Casino and Bringing Out the Dead all feature songs that tell stories more compelling than the films they appeared in. This is often true of the music in Quentin Tarantino's pictures, too, which have unmistakably been influenced, for better or worse, by Scorsese. He can be scarily effective (as he was when he used Stealers Wheel's "Stuck in the Middle With You" as Michael Madsen's dance track to inflict torture in Reservoir Dogs), or simply show off his fan fetish, as in his use of the Bernard Herrmann cue from the forgotten 1968 British thriller Twisted Nerve, which Daryl Hannah whistles as she strolls through the hospital to eliminate Uma Thurman in Kill Bill: Vol. 1. Tarantino might have a keen ear for great pop and movie soundtracks (especially the work of Ennio Morricone), but sometimes you feel as if the music is supposed to do the work of the actors by drawing attention to itself. Pop soundtracks such as those for The Graduate and Easy Rider were often released to make more money from a film's popularity. But Tarantino, who treats his audience like incessant pop consumers, isn't after cash. He wants the acclaim that comes from proving he's just as hip as his audience.

Dinah Washington

Who knows if Martin Scorsese will ever be able to startle an audience again with a song the way he did in Mean Streets (especially in that fleeting moment when Robert De Niro's Johnny Boy does a quick bop to Smokey Robinson's "Mickey's Monkey" before heading off to his likely death). But he still shows a wry cleverness in picking tracks. His 2010 psychological thriller, Shutter Island, was less an adaptation of author Dennis Lehane’s mystery than a cluttered labyrinth and virtual fun-house of the director’s favourite film noir tropes. It was hardly fun and barely coherent. On the other hand, the soundtrack Robbie Robertson put together for the movie is both. Featuring Ligeti's modernist “Lontano,” Ingram Marshall’s “Fog Tropes,” scored for brass sextet and fog horns, Mahler’s unfinished Piano Quartet, some prepared piano by John Cage ("Music for Marcel Duchamp"), and the dissonant chords of Krzysztof Penderecki ("Symphony No. 3: Passacaglia – Allegro Moderato"), plus Lonnie Johnson's haunting "Tomorrow Night" and Johnnie Ray's eerie "Cry," the score is the central nervous system for a movie that just isn't there. The modern music Robertson uses has a way of creating a psychological tapestry that takes into account the modernist perspective on the 20th Century.

There's one track, heard over the concluding credits, that conveys a larger tragedy than almost any of the drama itself. Robbie Robertson creates an ingenious mash-up of Dinah Washington's mournfully elegant 1960 hit "This Bitter Earth" (heard once before with powerful resonance in Charles Burnett's 1977 Killer of Sheep) and British composer Max Richter's gorgeously melancholic "On the Nature of Daylight" (from his 2004 album The Blue Notebooks). Robertson offers what critic Bradley Bambarger called "an improbably pure evocation of a shuttering heart." Here's Washington: “This bitter earth/ Well, what fruit it bears/ What good is love that no one shares/ And if my life is like the dust that hides the glow of a rose/ What good am I?/ Heaven only knows.” The song is soaked in the harsh experience of being black in America at the turning point of the Civil Rights struggle in the early Sixties. But it's Richter's strings that provide the tears that Washington can't cry. The song made little sense at the end of Shutter Island, but it takes you somewhere more substantial and stirring when the film is over. If the music in Scorsese's recent films (with the exception of his purely magical Hugo) no longer seems to provide their inner voice, the music itself is never negligible. When Paul Coates, in The Story of the Lost Reflection, defines cinema as a "dream of an after-life from which to comprehend this one," he isn't talking about Martin Scorsese's later movies. The phrase doesn't even begin to describe them. But he could just as easily be describing the enduring appeal of a Martin Scorsese jukebox.

-- September 18/13

You Say You Wanna Revolution?


It's not too hard to take apart comedian Russell Brand's idea of an egalitarian revolution, which he happily endorsed this week on BBC Newsnight with Jeremy Paxman. There are no ideas there; only vague cereal box pronouncements that would make Karl Marx blush. Since Brand had already guest-edited The New Statesman on that very subject of revolution, he was brought on to the show to explain himself. Besides saying that voting only "legitimizes a corrupt system," Brand's dissent was all sound-bite with nothing at stake. But why Brand received so much play on social media isn't so negligible. As Elizabeth Renzetti pointed out in a column in The Globe and Mail yesterday, the political system has so broken faith with its constituents that it has allowed the bromide of a Russell Brand to take hold. "[W]hen the political system looks increasingly absurd – and you need only look to the kindergarten-style scrapping in Canada’s Senate or the tumbleweeds that recently rolled past the monuments in Washington – the absurdists look rational," she writes. "In his interview, Mr. Brand pointed to the fact that the British government is suing the European Union to remove a cap on bankers’ bonuses on the fifth anniversary of the financial crisis – if that isn’t head-spinning farce, what is?" What Brand spoke to is the void left when, as Renzetti puts it, "the web of trust and civic engagement meant to bind a society is fraying." When only 19 per cent of Americans trust their government compared to 75 per cent fifty years ago, Brand comes across as prophetic.

Forty-five years ago, though, revolution was more a palpable fact than a whimsical idea. In 1968, everywhere you looked, real ideals were being put to the test. The Soviet Union brought a totalitarian chill to the Prague Spring when it invaded Czechoslovakia. The assassination of Martin Luther King in April was followed two months later by the shooting death of Democratic presidential candidate Robert Kennedy. Student upheavals in Paris against the Gaullist government were matched by riots in the United States over the escalation of the Vietnam War. During their various world tours, John Lennon had wanted The Beatles to have more freedom to comment on the political tumult surrounding the group, but Brian Epstein, fearing public reaction, steered Lennon against it. With Epstein dead by 1968, though, Lennon knew that there was now no one around to stop him. He immediately went to work on completing a song he had first started composing in India. "Revolution" was written in response to the various left-wing organizations that were vying for The Beatles' support for violent revolution. But instead of throwing his hat into the ring, he composed a stern riposte against violence that would create a huge backlash against the group from certain anti-war activists who had counted on The Beatles for support.

At the time Lennon wrote his song, the peaceful struggle against injustice, whose values were steeped in the non-violent activism of Martin Luther King, had been quickly evolving into forms of violent resistance. There was also something dangerously ideological about the insurgencies now developing in democratic nations. "It was not till Mao Tse-tung launched his Cultural Revolution in 1966 that the European Left found a faith to replace the one shattered by Khrushchev's exposure of Stalin in 1956," Ian MacDonald observed in Revolution in the Head. MacDonald goes on to say that the attraction to the Chinese Cultural Revolution, which was brutally repressive (and served as a mere warm-up for what would take place in 1989 at Tiananmen Square), happened because it "eliminated the preparatory phases of Lenin's model, positing a direct leap to the Communist millennium which would expunge all class distinctions at a stroke." Before long, joining their European counterparts, the ideologues of the West would, as the Stalinists had in the Forties, make their own break with history. "All that remained was to take to the streets and 'tear down the walls'." While Lennon didn't believe in tearing down the walls, he made sure his song blasted the eardrums. One of Lennon's cleverest strokes in "Revolution" was to use the strongest, loudest, most distorted form of rock 'n' roll, the very quality that made it the most revolutionary art form, in order to put across a message that many found anti-revolutionary. Ironically, the song didn't begin that way.

While recording their new album in late May, Lennon had cut a slower, doo-wop version of the song that he wished to see released as The Beatles' next single. McCartney, though, resisted the controversial song, finding it too slow and not commercial enough. (That version would show up on The Beatles as "Revolution 1.") Bristling from McCartney's rejection, Lennon became determined to re-make the song as something commercial and fast. What he came up with was a highly amplified and gritty guitar arrangement that changed the entire character of the song. Lennon plugged his guitar directly into the recording console, which overloaded the channel and created the massive distortion that would earn "Revolution" its spot as the B-side of "Hey Jude." In the earlier version, Lennon had also expressed some ambivalence about his position on the subject of revolution, singing "you can count me out/in." On the single, though, he plainly says count him out. "The lyrics stand today," Lennon stated flatly in 1980, shortly before he died. "They're still my feeling about politics. I want to see the plan." In the Seventies, though, Lennon ended up seduced by the same attitudes he had derided in 1968. In that post-Beatles era, the utopian dream Lennon had once spearheaded had died with his group, so he threw in his lot with former Yippies Jerry Rubin and Abbie Hoffman. By 1980, however, he clearly recognized the mistake he had made. "What I said in 'Revolution' – in all the versions – is change your head." No doubt Lennon knew instinctively that the utopian ideal he first put forth in "There's a Place" in 1962 didn't promise us a better kingdom on earth, or in heaven, but rather a revolution in the mind. "Even the blunt nature of his dire agitprop work...was itself the display of an artistic stance promoting the direct expression of utilitarian ideals," music critic Walter Everett observed of Lennon's political ideals.

The Beatles performing "Revolution"

However, the reaction from the counter-culture was not so generous towards Lennon's ideals. In his book Come Together: John Lennon in His Time, Jon Wiener lists a number of damning quotes from counter-culture publications. Rock critic Jon Landau in Rolling Stone claimed that, in Lennon's "Revolution," "Hubert Humphrey couldn't have said it better." Robert Christgau, in the Village Voice, called for a nuanced response from critics while simultaneously denying one from himself. "It is puritanical to expect musicians, or anyone else, to hew to the proper line," he wrote. "But it is reasonable to request that they not go out of their way to oppose it. Lennon has, and it takes much of the pleasure out of their music for me." The Berkeley Barb claimed the song sounded "like the hawk plank adopted in the Chicago convention of the Democratic Death Party." The New Left Review summed up Lennon's treatise as "a lamentable petty bourgeois cry of fear." The San Francisco-based leftist magazine Ramparts had just one reaction: "Betrayal." The magazine took issue in particular with the idea that a millionaire pop artist could tell us in 1968 that everything was going to be 'alright.' Later that year, singer Nina Simone, perhaps still feeling the sting of Martin Luther King's murder, felt compelled to counter Lennon in her own version of his song, where she urged Lennon to clean his mind.

When The Beatles released "Revolution" that August, The Rolling Stones followed the same week with "Street Fighting Man," their own incendiary response to the rise of violent revolt. Back in March, Mick Jagger had attended an anti-war rally at the U.S. embassy in London, where mounted police had their hands full controlling a crowd of 25,000 people. At the demonstration, he met one of the organizers, activist Tariq Ali, a member of the Trotskyist International Marxist Group. Besides the rally in London, Jagger was also aware of the student rioting in Paris on the Left Bank. By the summer, The Rolling Stones took a new song they were working on called "Did Everyone Pay Their Dues?," and changed the lyrics to address the upheaval that Jagger was witnessing. Unlike "Revolution," though, The Stones' "Street Fighting Man" opens with a sharply strummed acoustic guitar to the sound of marching drums. As Jagger reports the sound of marching feet, to accompany the drums, he invokes a significant line from Martha & the Vandellas' 1964 hit "Dancing in the Street." Where Martha Reeves describes summer as a time for dancing in the street, Jagger rewrites the lyric to tell us that summer is the time for fighting in the street. What Mick Jagger and Keith Richards pick up on is how "Dancing in the Street" became unwittingly associated with street revolt. In "Street Fighting Man," they make that connection even more explicit. At first, "Dancing in the Street" was intended as nothing more than a spirited dance tune, but once it hit the airwaves, the song became a rallying cry for urban American blacks. As riots were tearing apart the inner cities, young black activists, like H. Rap Brown, began using the song as a recruitment anthem.


While The Rolling Stones sound as if they are joining in with their guns blazing, "Street Fighting Man" is even more ambivalent than "Revolution." Jagger recognizes that people are fighting in the street, and he says he wants to join in, but he also understands that all he can do is sing in a rock 'n' roll band. Rather than commit themselves to a position in the song, The Stones instead strike a provocative pose. Where Lennon's "Revolution" is a blatant attack on violent insurrection, Mick Jagger and Keith Richards can only sell rebellious attitude. At the end, Jagger sings about his desire to kill the King, and rails at all his servants, but revolution is just a notion in his head – no more radical than Russell Brand's notions of activism. He may dream of being a street fighting man, but he knows that he's just a singer left observing the battle raging around him. The disappointment for radicals, however, lay with their belief that The Beatles were on their side. In their minds, the music would help change the world, whereas Lennon felt that The Beatles' music existed to free your mind. Where radicals sought to have The Beatles join them at the vanguard of the struggle, Lennon preferred to map out a world of possibilities beyond the institutions and bureaucracies the extremists wished to obliterate. Lennon understood, to paraphrase Pete Townshend a few years down the road in "Won't Get Fooled Again," that the new boss would be just like the old boss.

Not all of the left, though, was critical of "Revolution." The SDS (Students for a Democratic Society) newspaper at Cornell University actually praised Lennon's pacifism. "You can argue about the effectiveness of non-violence as a tactic, but it would be absurd to claim that it is a conservative notion...," the paper stated. Not surprisingly, some conservatives spoke out in favour of the song. William F. Buckley praised "Revolution" in his syndicated column and found himself being attacked by the ultra-right John Birch Society. According to the Birchers, Lennon was no better than Lenin. They thought he was toeing the Moscow line against left-wing infantilism rather than actually being anti-revolution. The only flaw in "Revolution" was its explicitness – its need to tell us what to think. "Hey Jude," on the other hand, was the song that Czech citizens sang while vainly attempting to block Russian tanks that very summer. McCartney's epic masterpiece provided a passionate appraisal of loss while simultaneously transcending the pain that loss can cause. Along with being the most successful Beatles single ever released, selling over five million copies, "Hey Jude" (which borrows its melody from The Drifters' "Save the Last Dance For Me") considers the reconciliation of grief.

The Beatles perform "Hey Jude" on The David Frost Show 

McCartney had composed it during the period when the Lennons were divorcing. As John was taking up with Yoko, Paul's concern was for the emotional welfare of John's five-year-old son Julian. One day, while driving to Weybridge to visit Cynthia and Julian, McCartney started singing a melody with the words "Hey Julian." Later, as he drove home, he changed the lyric from Julian to Jules, then later to Jude because he remembered liking Jud, a character in Rodgers and Hammerstein's 1943 musical Oklahoma! While Julian may have inspired the song, "Hey Jude" also foreshadows the ultimate end of The Beatles. McCartney sings with a profound sense of loss, yet with the hope that better tidings will come. When the song reaches its climax, with the orchestra soaring and the chorus chanting "na-na-na-na, hey Jude," McCartney ardently scats and shouts his way to the slow fade-out. But his shouts here aren't the celebratory, rousing screams of a young man finding his freedom, as they were in "Long Tall Sally" or "I'm Down." McCartney's cries in "Hey Jude" are filled with the release from pain. They are the cries of possibilities lost. Perhaps it's that inherent sense of loss indelibly woven into the fabric of the song that touched Czech nationals in the summer of 1968. They took "Hey Jude" to heart as the song that best expressed what was left of their wistful pride during the demise of the brief freedom of the Prague Spring.

-- October 27/13

The Wild Side: Lou Reed vs Frank Zappa


Lou Reed and Frank Zappa (illustration by Chris Grayson)

It's curious how we recall certain moments only when death intervenes and creates a rent in our day. The sad passing of Lou Reed this past Sunday, at the age of 71, took me immediately back to a typical party I attended as a teenager on a Saturday night in the early Seventies. There's no significant reason to remember this party and I hadn't even thought about it since the night it happened. But that's what death does. It brings dormant moments back to life. That evening was the first time I became aware of Lou Reed and his band, The Velvet Underground. Their debut album, The Velvet Underground & Nico, just happened to be playing on the turntable, and I remember most the nursery rhyme beauty of the opening track, "Sunday Morning," the slashing guitar that droned under the driving beat of "I'm Waiting for the Man," and the pulsating intensity of "Heroin," where John Cale's shrieking viola seemed to create an electric blanket around Reed's determined voice, speaking for his heightened nervous system: the sensations brought on by milk-blood flowing in the veins (all of which made Steppenwolf's popular song "The Pusher" seem even sillier and more self-conscious by comparison). I also loved the Celtic melody that underscored "Venus in Furs," while the flattened-out timbre of Nico's voice on "All Tomorrow's Parties" made me momentarily forget the party I was attending.

Whenever I come across sounds that speak a new language to me, I'm always drawn to their source. In this case, it was the turntable – first to see the record cover (which bore the illustration of a huge banana that seemed to be daring me to try and peel it) and then the vinyl on the platter. When I noticed that they recorded for the Verve label, I perked up because one of my favourite bands, The Mothers of Invention, also recorded for that label. When I picked up the cover and turned it over, I was also happy to see that they shared the same producer, Tom Wilson, which only made me wonder if the two bands had much in common. When the host of the party came over to ask me what I thought of the album, I told him how excited I was to discover another highly unusual group that recorded on the same label as Frank Zappa's group. To my surprise, this launched a huge argument over my taste in music. The host was horrified that I would even speak of Zappa's "comedy group" in the same tone as Lou Reed's "heavier band." While the argument was never resolved, I eventually went back to catch the dissonant feedback and crumbling guitar notes of "European Son."


This debate was nothing new to me. I'd had this very same discussion years earlier with fans of The Rolling Stones who also drew lines in the sand on the order of: If you liked the bad boy Rolling Stones, you surely couldn't enjoy the pop niceties of The Beatles. It was a fierce battleground that I never understood. I had room for both groups and still do. But it was the same silly disagreement here, where the Velvet Underground and The Mothers of Invention were occupying armies that were mortal enemies. What I didn't realize till long after the party was that Lou Reed and Frank Zappa did appear to be artistic rivals and enemies. But when I began writing my Frank Zappa biography, Dangerous Kitchen, I came to argue that both men had more in common than not.

In the worlds of jazz, blues, and rap there has always existed an east-coast/west-coast rivalry. It's no different in rock & roll. Frank Zappa's Mothers of Invention were a satirical rock outfit from Los Angeles that treated musical history as a form of farce, blowing raspberries in the church of good taste with music that raised questions about what we consume and why. The Velvet Underground was a band formed in New York in 1965 by Lou Reed and John Cale and produced by Andy Warhol (who designed the banana cover on their 1967 debut). When it comes to humour in music, irony rarely goes down easily with a pop audience (and it goes down even harder with the hippest of listeners). Part of the reason has everything to do with the very nature of popular music – that it is meant to be popular. The Mothers played that music, but with the awareness that the very form of the genre is about reaching an audience with the kind of earnestness that will make you buy the record. The goal is to reflect the lifestyle of the person purchasing the song. The Mothers satirized that aspect of popular music, just as the serialist composers of the early 20th century questioned the excesses of Romanticism in classical music.

The Velvet Underground

Lou Reed wasn't out to shock anybody in that manner. Reed brought to popular music the seedier underside of desperate romanticism, the kind that lay in the literature and poetry he loved, such as that of Delmore Schwartz and Hubert Selby Jr. As an English major from Syracuse University, Reed thought the bohemian underground was perfect material for rock and roll. "Why wouldn't you listen to it? You have the fun of reading that, and you get the fun of rock on top of it," he once said. The Velvet Underground spoke to a different set of outsiders than The Mothers. But both bands railed against the groups that appealed to hippie utopianism. Reed's songs were often populated by junkies and transvestites and people on the losing end of the wild side. But the songs, despite their sometimes metallic dissonance, were as earnest as most folk music. So the fans of the Velvet Underground weren't interested in parody. To them, "comedy music" (as the party host called it) was something trivial and smug, belonging to the straight world they chose to reject.

The rivalry between The Mothers of Invention and The Velvet Underground actually started on Zappa's home turf in Los Angeles at The Trip on May 3, 1966, when The Mothers opened for Reed's group. Since Zappa's stage rap was already designed to satirize and put down trendy attitudes, the Velvet Underground, with their dark clothes and somber stage presence, became perfect targets for him. In his mind, they didn't fit the L.A. freak environment celebrated on The Mothers' first album, Freak Out!, in 1966. Zappa's freaks didn't take drugs and brood in dark dungeons; they were colourful and absurdist. From the stage, Zappa proclaimed to the partisan crowd, "These guys really suck!" (But, of course, Zappa would make the same kind of comments about his own group, even including on its albums tracks that parodied the consumer's desire to own them – just as Spike Jones had done years earlier with album titles such as The Worst of Spike Jones.) The Velvets didn't take kindly to the insult and Reed returned the favour. "[Zappa] is the single most untalented person I heard in my life," he sneered. "He's a two-bit, pretentious academic, and he can't play rock 'n' roll, because he's a loser. And that's why he dresses funny. He's not happy with himself and I think he's right." In truth, they both dressed funny. It was a clash between two temperamental talents: two fierce innovators who were figureheads for different aspects of the era's counter-culture rather than mere enemies. "Both Zappa and Lou Reed were equally single-minded about the way they approached their material and the direction of their respective groups," critic Billy James wrote in Necessity Is... Frank Zappa and The Mothers of Invention. "They were also both cynical about the whole new hippie scene." While all of that is true, there was also the belief that the Velvet Underground's first album was held up by Verve Records due to the attention The Mothers were getting for Freak Out!.
The Velvet Underground & Nico was actually ready for release by April 1966, but didn't see the light of day until a year later, having been eclipsed by The Mothers. "I know what the problem was," The Velvets' guitarist Sterling Morrison said in 1981. "It was Frank Zappa and his manager, Herb Cohen. They sabotaged us in a number of ways because they wanted to be the first 'freak' release. And we were totally naive. We didn't have a manager who would go to the record company every day, and just drag the whole thing through production." In the rivalry between east coast and west coast acts, this would become a familiar complaint throughout the years.

The Mothers of Invention

When The Mothers came to New York to perform at the Garrick Theatre in 1967, the differences between the two groups became most apparent when Zappa went to hear The Velvets at a local venue. Chris Darrow, of the band Kaleidoscope, wrote about the event:

"The opening night was very crowded and Zappa and members of The Mothers of Invention showed up to show their support... Nico's delivery of her material was very flat, deadpan, and expressionless, and she played as though all of her songs were dirges. She seemed as though she was trying to resurrect the ennui and decadence of Weimar, pre-Hitler Germany. Her icy, Nordic image also added to the detachment of her delivery.... The audience was on her side, as she was in her element and the Warhol contingent was very prominent that night. However, what happened next is what sticks in my mind the most from that night. In between sets, Frank Zappa got up from his seat and walked up on the stage and sat behind the keyboard of Nico's B-3 organ. He proceeded to place his hands indiscriminately on the keyboard in a total, atonal fashion and screamed at the top of his lungs, doing a caricature of Nico's set, the one he had just seen. The words to his impromptu song were the names of vegetables like broccolli, cabbage, asparagus... This 'song' kept going for about a minute or so and then suddenly stopped. He walked off the stage and the show moved on. It was one of the greatest pieces of rock 'n roll theatre that I have ever seen."


Both bands worked in the arena of rock & roll theatre and, while their intent might have been different, both groups appealed to the kind of outsiders who questioned the puritanical underpinnings of the country that spawned them. Lou Reed and Frank Zappa did have different views on drugs (Zappa opposed drug use), but their views were more similar than not. They both questioned the guilelessness of the hippie subculture. Both did frank examinations of sexual transgression. (Lou Reed wrote about sadomasochism in songs like "Venus in Furs," while Frank Zappa did similar material in "Penguin in Bondage.") Lou Reed's "Lonesome Cowboy Bill" (from Loaded in 1970) might have had a hand in inspiring Zappa's "Lonesome Cowboy Burt" from 200 Motels in 1971. Both men spoke their minds eloquently and caustically, usually with contempt for the rock press. Neither had any respect for the PMRC, which sought to censor rock music in the Eighties. Lou Reed and Zappa also became portals to individual freedom for those who were oppressed in Communist countries in Eastern Europe. The Czech president Vaclav Havel would eventually honour both men as abiding spirits of his democratic revolution.

Moon Zappa and Lou Reed at the Rock and Roll Hall of Fame in 1994

But here at home, the fans of both artists continue to bicker. Only a few years ago, the Canadian music critic Carl Wilson wrote me to say kind words about my book on Randy Newman. I wrote back to thank him and asked him if he had read Dangerous Kitchen. "I don't read books about Frank Zappa," was his quick reply. Being a critic sometimes demands a desire to examine not only why you love what you love, but also why you hate what you hate. (And Wilson, who took on the idea of taste in his 33 1/3 book on Celine Dion, obviously knows that.) But lovers of music on the wild side of the fence, perhaps like Carl Wilson, tend to get proprietary about what they love and why. It becomes a chic kind of security blanket against the expediency of the mainstream. But maybe the best way to resolve this particular argument is with facts. In 1994, a year after Frank Zappa died of prostate cancer, he was to be inducted into the Rock and Roll Hall of Fame (after having languished on the ballot for several years). But this oversight paled next to how the whole affair was planned and how it was eventually resolved. A week before the ceremony, guitarist Eddie Van Halen was asked by Zappa's son, Dweezil, to induct his father, but he turned down the offer (he said he didn't give out awards). Zappa's widow, Gail, suggested the Fifties r&b legend Johnny 'Guitar' Watson, who not only inspired Zappa as a teenager, but also played with his group in the Seventies. But the Hall refused. As a last-ditch effort, they came up with one name: Lou Reed. Although Zappa's kids, well aware of the insults traded between these contentious artists, were dead against it, Gail phoned Reed to discuss the possibility. After making his own amends for the years of heated remarks, he agreed to do it. And he spoke eloquently and movingly about someone many perceived as his hated rival. "It's very rare in life to know someone who affects things, changes in a positive way," Reed began.
"Frank Zappa was such a person, and of the many regrets I have in life, not knowing him better is one of them." Listing Zappa's many accomplishments, which also (in many ways) echoed his own, Reed went on to make his strongest remarks. "Frank was a force for reason and honesty in a business deficient in those areas. As we reward some with money for the amusement they supply to the cultural masses, I think the induction of Frank Zappa in the Rock and Roll Hall of Fame distinguishes the Hall as well as the inductee."

Lou Reed, like Frank Zappa, came along in a decade when the doors flew open to anyone possessing an original vision and the will to sustain it. And, like Zappa, Lou Reed was as much a man of his time as he was out of time. Both refused to cater to audience expectations of where they should go next, and they always defied trends. But that's maybe why fans of both men stand guard defending each of their legacies without a thought as to what they actually shared in common. Edgard Varèse, the French composer who inspired Zappa to become one himself, once said, "An artist is never ahead of his time, but most people are far behind theirs." And perhaps it's when they die that we finally have an opportunity to truly catch up to where they've taken us.

-- October 29/13

Chasing Phantoms - From Del Shannon to Neil Young: "Runaway" and "Like a Hurricane"



When I was six and driving in the car with my parents, the radio often provided comfort, either by giving me voices in the larger world beyond the roads we travelled, or music that could take me inside the world of the singer. For me, the rock & roll I heard in 1960 was about finding a place, to paraphrase John Lennon, where I could go when I felt low. The songs of Elvis Presley and Buddy Holly could reach out to the friendless and disenfranchised and invite us to be part of something larger than ourselves. Even if their tunes were about heartache and loss, the mere sharing of that pain gave credence to the idea that one could transcend it because the music was about giving pleasure. In one of his last recorded songs, "It Doesn't Matter Anymore," Buddy Holly playfully teases himself about how foolish he was to be driven crazy by the woman who abandons him. Not only does the singer survive the loss, he understands the price he was willing to pay in the process so he could move on. (It was only in real life, unlike in the nowhere land of the song, that Buddy Holly could lose his life in a plane crash he couldn't control.)

But not all of rock and roll's voices are about the sharing of community. Whenever Roy Orbison's "Blue Bayou" or "Only the Lonely" came on the radio, the desolation that's part of being friendless and disenfranchised was invoked all over again. With his sad eyes hidden behind dark shades, Orbison showed us what it cost to fail in one's quest to find community. Yet you could get so lost in the operatic pining of his theatrical voice that his songs never left a residue of despair. That wasn't the case, however, with rock's other desolate loner: Del Shannon. Born Charles Weedon Westover in Grand Rapids, Michigan, Shannon wrote songs with a stark power emanating from the paranoia brought on by unrequited and lost love. In his best songs, including "Runaway," "Stranger in Town," "Hats Off to Larry" and (his last one) "Walk Away," Del Shannon sang in a friendless voice. Listening to Shannon, who would ultimately commit suicide, is akin to getting periodic bulletins from a desperate hitch-hiker aimlessly cruising from town to town, chasing phantoms and hopelessly seeking answers for why his loved one is now lost to him. His famous high falsetto, which cried out in wonder on "Runaway," didn't express longing, as the doo-wop vocalists of the Fifties did in their songs. Shannon instead carried the anguish of never finding a release from the pain.

Del Shannon

As the counter-culture of the Sixties emerged with its literal quest for community, first through the folk movement that had allied itself with the Civil Rights struggle, and then later with the hippie communalism and freak culture that broke from the mainstream, Del Shannon was abandoned on his lost highway. Although he continued to play Fifties reunion concerts and would write songs and practice at home, Shannon had no place in a counter-culture breaking with the past. But his voice did somehow still find its way into the work of new artists who would continue to reach the ears of the alienated loner. Canadian artist Neil Young often wrote about loners while becoming one himself. Maybe his solitary stance is one reason why he could be a patron saint of the Sixties as well as be claimed by the grunge culture that would emerge some thirty years later. In his high-pitched tremolo voice, Young has a way of reminding you that the hippie dream brought forth some haunting nightmares (as he would document in songs like "Ohio") and that the quest for community could also be the search for a false utopia (as he accounted for in the chilling "Revolution Blues"). His piercing voice cuts through the tenor of the times whether he's alone on his acoustic guitar or piano, sharing the stage with Crosby, Stills and Nash, or in full throttle with his electric band Crazy Horse. If he chooses to travel the lost highway, whether (as he once put it) in the middle of the road or in the ditch, you can sometimes hear Shannon's spirit in the reclusive ache of "Expecting to Fly," the quiet pining of "I Believe in You," and the desperate craving of "Lookin' For a Love." But he perhaps never touched the deeper solitude of Del Shannon better than in his epic track "Like a Hurricane," which he recorded in 1975 and included on his 1977 record, American Stars 'n Bars.


While most of American Stars 'n Bars is an eclectic brew of the same kind of country material that made Harvest (1972) a popular and best-selling album, "Like a Hurricane" lives up to its title. Taking its melody from Shannon's "Runaway," where the singer is obsessed by the lost love of his life ("As I walk along I wonder a-what went wrong/With our love, a love that was so strong/And as I still walk on, I think of the things we've done/Together, a-while our hearts were young"), "Like a Hurricane" comes to terms with the primal power of what brings lovers together and what also tears them apart. Paul Simon in "Slip Slidin' Away" once touched briefly on men's fears of being consumed by a woman's emotional power ("My love for you's so overpowering/I'm afraid that I will disappear"), but it was just a glance in a softly sung pop gospel ballad. A few years after the release of Young's song, Pete Townshend went even further than Paul Simon in his passionate confessional, "A Little is Enough," from his 1980 Empty Glass album. Dropping the character armour that he sometimes donned proudly in The Who, Townshend sings a love song that is as nakedly honest about his fear of vulnerability as it is about the romantic passion that consumes him ("Just like a sailor heading into the seas/There's a gale blowing in my face/The high winds scare me but I need the breeze/And I can't head for any other place"). But where "A Little is Enough" is a powerful testimonial, "Like a Hurricane" pulls us right into an emotional maelstrom for almost nine minutes and doesn't let go. Young's voice shifts, from line to line, between fear and curiosity until the chorus lays out the singer's terror:

You are like a hurricane
There's calm in your eye
And I'm gettin' blown away
To somewhere safer where the feeling stays
I want to love you but I'm getting blown away

In Shannon's "Runaway," one of the most haunting qualities is Max Crook's solo break on the clavioline, a keyboard instrument that was a forerunner of the analog synthesizer. With its eerie high pitches, the clavioline (which The Tornados would employ a year later in their hit "Telstar," and which Jack White would resurrect to great effect in The White Stripes' "Icky Thump") seems to express the latent hysteria beneath Shannon's desperation. In "Like a Hurricane," Frank "Poncho" Sampedro's organ does likewise for Young, resembling a steam engine underscoring the piercing and wailing notes from Young's guitar. (In his 1993 MTV Unplugged appearance, Neil Young played "Like a Hurricane" solo on a pipe organ as if it were a steam engine.)

Over the years, many people who couldn't shake the terror of "Runaway" still tried to catch its lightning, but they barely lit a spark. Bonnie Raitt sounded so quaint in her rather opaque 1977 rendition from Sweet Forgiveness that you forget by the end of the song that there was even a lost lover to pine over. In another instance, Tony Orlando & Dawn came up with the bizarre idea of matching the song with The Turtles' "Happy Together." No one came close to plumbing the neurotic depths of Shannon's original. As for "Like a Hurricane," it also had many takers, including Roxy Music, who went for the haunting melody but lost the core by turning the song's desperation into a gigolo's passing fling. But while Young pays full tribute to Shannon in "Like a Hurricane," he manages to keep his sanity in the high winds of his song. Del Shannon was not so fortunate. If those early tracks revealed the wounds of a man lost in the wilderness, Shannon would return from that exile in the late-Seventies after a bout of alcoholism. Tom Petty would produce his album Drop Down and Get Me, which featured the single "Sea of Love." But on his last release, Rock On! in 1990, produced by Mike Campbell (of Petty's Heartbreakers) and Jeff Lynne, his final single, "Walk Away," brought him full circle from "Runaway."


If "Runaway" was about a man haunted by the mystery of his lover's departure, "Walk Away" reverses the roles. In "Walk Away," Shannon seems to finally find the lost lover that eluded him in his early hits. But instead of being fulfilled by what he's found, he ultimately comes to break the promise that first brought them together ("Every time I have to lie it tears me apart/Every time I see you cry, it takes a piece of my heart/I know that I said, never, never, walk away from you/I know that I said, I'd always be there my whole life through"). Instead of being the victim of the "runaway" this time, Shannon becomes one himself ("I got to walk away, walk away"). And walk away he did. Neil Young could sometimes get to the bottom of a depressive funk, but Del Shannon lived out a life of clinical depression. Shortly after the release of Rock On!, Shannon committed suicide with a .22-caliber rifle while he was on a prescription dose of Prozac.

Art can sometimes build the means to confront the pain of being a solitary figure, but the phantoms we chase – or run from – never really go away. The ghosts that Bruce Springsteen once stared down and ran from in his Seventies hit song, "Born to Run," would catch up to him almost a decade later in Nebraska. Kurt Cobain once tried to stare down the haunts of Leadbelly's tale of horror when he sang "Where Did You Sleep Last Night?," a year before he would be taken by them. Like Shannon, Cobain took his own life with a gun. Pop songs can often be as mysterious as detective novels. But Del Shannon's "Runaway" and Neil Young's "Like a Hurricane" aren't on a mission to solve the riddles of the contingencies of love and loss. That mystery remains the enduring part of why we continue to listen, and why we always carry within us the traces of a conundrum that no song, or artist, has yet solved.

-- December 1/13

Bad Timing & Bad Business: Big Star and Badfinger


Big Star

Last night, I was watching the riveting and touching documentary Big Star: Nothing Can Hurt Me about the Memphis band from the Seventies that commercial success eluded for any number of reasons you can easily classify as bad timing. While they vanished after three superb albums, they ultimately reached an adoring audience a decade later as various independent bands took up their torch. The film is a fascinating study of love and dedication that doesn't elide the self-destructive drives that come out of artistic obsession. Director Drew DeNicola paints a vivid portrait of the group with very few colours in his palette. As Stephanie Zacharek wrote in the Village Voice, Big Star: Nothing Can Hurt Me "honors that sense of mystery, telling the band's story as if whispering it through the cracks in a wall. There's very little footage of the band themselves – their elusive magic found its truest expression in the studio rather than before a live audience." Co-founders Chris Bell and Alex Chilton are now gone, but the remaining witnesses fill in a story of artistic achievement that found a pulse in the shadows. Those shadows became a subterranean force for groups like R.E.M., The Flaming Lips and (especially) The Replacements (who wrote a song called "Alex Chilton"). As Robyn Hitchcock says in the picture, "They were like a letter that got lost in the mail." But Hitchcock also reminds us that the letter finally found its destination in the Eighties. (For me, it took my friend, Adam Nayman, to deliver the mail a few years ago. He wasn't around when the band first released #1 Record in 1972. What was my excuse?)

While I happily mulled over the movie, I was reminded of another Seventies band who had a case of bad timing, but with nowhere near the impact of Big Star – and this band had the benefit of being tutored by The Beatles. When The Beatles departed the stage in 1970, there was no shortage of others who tried to fill the gap they were leaving behind. One tragic case, however, turned out to be a band signed to their Apple label. Badfinger were poised through the early Seventies as the new heir to The Beatles, but their legacy ended in bad business, despair and death. Originally a Swansea, Wales group called The Iveys, they first came to the attention of Beatle roadie Mal Evans who was friends with their manager Bill Collins. Since The Beatles were just signing acts to Apple, Mal convinced the Fab Four that The Iveys were worth the bother. Lead guitarist Pete Ham and rhythm guitarist Tom Evans sang with ringing harmonies that strongly evoked Lennon and McCartney, and when Mal Evans played them an Iveys’ demo tape, the whole studio took notice. “It was their uncanny resemblance to the young Beatles that had made everyone sit up and listen,” recalled Apple employee Richard DiLello. “But it was no conscious aping of their benefactors that had produced that similarity of sound.” The Iveys had inherited the yearning spirit of The Beatles rather than being a facsimile of the band. Their first single was the Beatlesque “Maybe Tomorrow,” which made the Top Ten in Europe and Japan in 1968. Due to its success, The Beatles were interested in grooming the band, but weren’t impressed by their name. Apple associate Neil Aspinall thought of Bad Penny, after Humphrey Lyttelton’s “Bad Penny Blues” which had inspired “Lady Madonna.” Ultimately, Badfinger was taken from “Bad Finger Boogie,” the original title of “With a Little Help From My Friends” (because John Lennon had composed the melody using his middle finger when he had hurt his forefinger).

Badfinger

But Apple was just starting to collapse as Badfinger entered the arena. With The Beatles barely on speaking terms, apparently nobody was talking to Badfinger either. Just then, Ringo was set to star in The Magic Christian (1970), with Peter Sellers, so McCartney was asked to contribute some songs. He had already written and recorded a demo to include in the film called “Come and Get It,” but at the same time, Badfinger was expressing their frustrations in the press about not getting a chance with Apple. McCartney apparently read their complaints. He came to them with “Come and Get It,” telling them that they could record the song provided they recorded it without changing any of the arrangement. He also offered them the opportunity to write some of their own stuff for the film since he was too busy to compose any new material himself. Paul still produced the song – and even added piano – and it was a Top Ten hit on both sides of the ocean. With new sessions pending in early 1970, Mal Evans began producing their new album, No Dice, with Geoff Emerick eventually taking over the controls. Their first single off the album was the punchy “No Matter What,” which went to #8 on the Billboard chart. However, it was the aching “Without You” that would later become the bigger hit – only not by Badfinger. American pop singer Harry Nilsson decided to record a version of the song for his 1971 Nilsson Schmilsson album. Where the Badfinger version is tentative, almost uncertain of the latent romantic despair in the composition, Nilsson found the core of the song’s strengths and his soaring light tenor turned it into a classic lovesick ballad. Nevertheless, No Dice was a hit that caused critic Mike Saunders at Rolling Stone to exclaim that it was “as if John, Paul, George and Ringo had been reincarnated.” Badfinger might have found their niche in the solo Beatles’ inner circle, but it wasn’t always preferable.
“We weren’t preoccupied with sounding like The Beatles, so it got to be a bit of a pain because people were asking all the time questions like, ‘What’s John really like?’ and ‘Is Paul a nice guy?’” said guitarist Joey Molland. “We got really fed up with it. I mean, we loved The Beatles but we didn’t want to talk about them all day.” But they continued to hang out with them. Badfinger performed on Lennon’s Imagine album, played on George Harrison’s All Things Must Pass, and would perform as part of Harrison’s benefit concert for the victims of Bangla Desh in the summer of 1971.

While in the U.S. a year earlier, Badfinger signed a business management contract with Stan Polley. Bill Collins would stay on as manager, but Polley (who had managed both Lou Christie and Al Kooper) became their financial overseer. He had them touring relentlessly with little time to devote to recording their new album, Straight Up, which George Harrison began producing. One song, “Day After Day,” in which Harrison played a pining slide guitar alongside Pete Ham, gave the group their third hit single. Harrison might have even finished the album, but he had work to do on both the concert for Bangla Desh and the subsequent film and soundtrack album. The producer’s reins were then handed to Todd Rundgren who finished the record and helped spawn the album’s second hit single, “Baby Blue.” In 1972, Badfinger was under contract to release one more album for Apple, which was now closing down its operations. Simultaneously, however, allegations about Polley’s mismanagement of finances for Lou Christie were coming to the fore. Unfortunately, Badfinger never questioned Polley’s business affairs, so during the recording of their last Apple record, Ass, Polley negotiated a $3 million deal with Warner Brothers that included an album from the group every six months for the next six years. Although it appeared as if golden days were ahead, their Warners albums, Badfinger and Wish You Were Here, were commercially unsuccessful. But the band continued to tour and attract sell-out crowds. It soon became clear, though, that Polley was coming under suspicion from Warner Brothers when he refused to co-operate in communicating the status of an escrow account of advance funds. According to the contract, Polley was to keep $100,000 in safekeeping in a mutually accessible account for both Warners and the band to access. But Polley never told the label about the account’s whereabouts and he ignored legal warnings to cough up the information.
On December 10, 1974, when the group was about to submit their next album, Head First, to Warners, the label instead issued a lawsuit against both Polley and Badfinger. The legal action cut off any further advance funds to the band, and the label also withdrew Wish You Were Here from distribution.


In winter 1975, the group was in turmoil with no money coming in from anywhere. All their profits earned from touring, recording and publishing were tied up in Polley’s holding companies. Panic began to set in. Pete Ham and his girlfriend were expecting a child and running out of cash. Nobody would book Badfinger because of the restrictive contracts they had with Polley, who was now up to his eyebrows in litigation. Although Ham tried endlessly to reach Polley, he’d never return the call. In total despair over his unravelling life, Ham hanged himself in his garage on April 24, 1975. In his suicide note, he wrote that he loved his girlfriend and that “Stan Polley is a soulless bastard. I will take him with me.” With Ham’s death, Badfinger dissolved. The surviving members did session work until the early Eighties when Tom Evans and guitarist Joey Molland decided to create separate touring bands that both used the group name. It caused huge rows between them until November 19, 1983, when Evans and Molland had a massive fight on the phone over past income owed from royalties. After the call, Evans followed Pete Ham’s example and also hanged himself in his back garden. The surviving group members attempted to keep the name Badfinger alive, especially by playing golden oldies package tours, but by 1990, they were officially dead. What made Badfinger a shadow version of The Beatles was not just the mellifluous pop sound they created, but also the spiritual bond that existed between Pete Ham and Tom Evans – the Lennon/McCartney of the group. “When Pete died, his other half was gone,” said Evans’s widow, Marianne. “He felt lost and lonely. Many times he said, ‘I want to be there, where he is’.” The bond they created together had carried the seeds of The Beatles’ utopian dream into the dark tragic conclusion that ended the Apple era. Once the financial problems of the label were ultimately settled in 1985, Badfinger’s royalties resumed.
Just not soon enough to save Pete Ham and Tom Evans.

-- December 21/13


                                                                2014

When Ordinary People Come to Terms with the Extraordinary: Revisiting David Lynch's The Straight Story (1999)


Richard Farnsworth in The Straight Story

As I watched Alexander Payne's new film, Nebraska, in which Bruce Dern plays Woody, a craggy old man banking his final hopes on some junk mail scam that promises him a million dollars if he hoofs it from Billings, Montana, to Lincoln, Nebraska, to collect it, the picture's plainness left me with a bad case of sensory deprivation. I bailed some thirty minutes in. The smallness of the characters and Payne's need to italicize every irony didn't leave me quite as steamed as his Martian take on family life did in his last movie, The Descendants, but (despite the fine performance here by Dern) the journey undertaken in Nebraska sets up an inevitable ending before we even arrive there. So, following Woody's example, I sought fortune elsewhere and fled the theatre. And I began thinking back to another, somewhat similar road movie that has continued to cast its elliptical spell over me like some fairy tale recovered again years later in my grandparents' treasure chest. David Lynch's The Straight Story (1999) is a straightforward account of one man's journey to seek closure towards the end of his life, but it's by no means simple. This lovely, poignant tale of a stubborn coot who wishes to mend his fractured relationship with his brother – and the world – before he dies examines what happens when ordinary people come to terms with the extraordinary.

Unlike most film directors drawn to populist themes, David Lynch (The Elephant Man, Blue Velvet, Twin Peaks: Fire Walk With Me and Mulholland Drive) creates a believable balance between the light and dark sides of everyday existence. Rather than indulge the quaintness in sentimental virtue (as Frank Capra did in It's a Wonderful Life and Mr. Smith Goes to Washington), or create curdled clichés of suburban angst (the way Sam Mendes did in American Beauty), Lynch's affection for oddness brings us closer to people who don't necessarily live the kind of lives we do. He draws out their capricious characteristics, rather than moralize about their eccentric behaviour, so that the irrational and the rational can coexist in the same world. The Straight Story concerns Alvin Straight (Richard Farnsworth), who one day hears that his brother, who hasn't spoken to him in ten years, has had a stroke. Alvin lives in Iowa while his brother, Lyle, is 300 miles away in Wisconsin. Because he has cataracts and his legs are too weak, he can't drive a car. His daughter, Rose (Sissy Spacek), who lives with him, doesn't have a vehicle or a driver's license either. So Alvin decides to take the trip on his John Deere lawnmower.


As Alvin makes the long journey, and the seasons change, Lynch doesn't use the elements to parallel his protagonist's impending mortality. He adjusts the changing skies the way a painter employs colour to create a panoptic mood that serves to enhance Alvin's varied encounters. It also helps that Richard Farnsworth, who once played the natty bank robber Bill Miner in the late Phillip Borsos' The Grey Fox, has an uncanny knack for taking the folksiness out of being old by depriving Alvin of a hokey dignity. Farnsworth, who was dying from cancer while playing the role (and later committed suicide due to the physical pain), has pools of conflicting emotions continuously flowing through his twinkling eyes as if each tear were trying to trace thoughts not yet formed. When he tells a pregnant runaway a story about his own early life, he's also wistfully wise as he remembers a moment from his childhood when his brother was a true sibling to him. There are always traces of regret around the crinkles of Alvin's smile, especially when he's asked by a young cyclist what he likes least about being old. Alvin answers, "Remembering what it was like to be young." While she doesn't have as much screen time as Farnsworth, Sissy Spacek, playing a physically and mentally impaired woman, doesn't milk the part for empathy (or for Oscar consideration). She conveys the mournful regrets of a woman struggling daily for clarity rather than condescending to her impediment.

When The Straight Story came out in 1999 a number of film critics jumped on the picture as if Lynch had become a token of American Republicanism. Since the movie was both a G-rated David Lynch movie and distributed by Disney, it was as though Lynch were now sanitized and had become a pod person. Writer and culture critic Howard Hampton went right after 'the Lynch Mob' of cineastes who suddenly felt that Lynch had lost his street cred. (It would come to resemble the same attack similar scribes made years later on Peter Jackson when he tackled the 'mainstream' epic The Lord of the Rings.) Hampton saw the nascent puritanism in their pans, influenced as they were by a deep distrust and resentment of what he called "the idiosyncratic, visionary, irreverent strain of American culture," a patrician snobbishness that they shared with other anti-Western intellectuals. "Seeing reactionary conspiracy behind every frame, [John] Patterson [in the LA Weekly] declares that 'non-whites might as well not exist' in Lynch's Iowa, which is like faulting [Abbas] Kiarostami for neglecting to include Iranian Jews in, say, Taste of Cherry," Hampton wrote in 2000. But, unlike Patterson, or Jonathan Rosenbaum who called the picture "propaganda," Hampton perceived Lynch in The Straight Story to be creating an American poetics every bit as sure and enticing as the work of Hou Hsiao-hsien. "[N]o living film-maker has gone further into the unresolved recesses of America's dream life or fashioned so indelible a universe from such spectral matter," he wrote. The very accessibility of The Straight Story shouldn't suggest to viewers that it's any less foreboding than Eraserhead or Blue Velvet. Hardly. At the core of The Straight Story is the same strange world that always fascinated David Lynch. The only difference is that David Lynch – like Alvin Straight – made his peace with it.

-- January 15/14  

A Termite in the Wood: Captain Beefheart & The Magic Band's Trout Mask Replica (1969)


Barack Obama knows his music

Not every record is for its time. Some don't even care about time. In the case of the strange masterpiece, Captain Beefheart & The Magic Band's Trout Mask Replica, it doesn't even pretend to keep time. In the summer of 1969, Trout Mask came out as an abstract sound collage that followed no trend, nor provided even a hint of what was dominating the rock airwaves. Rather than build on what was popular then, it seemed to anticipate the rude gesture of punk almost a decade later. But it did it without the artists involved wanting to upset or alienate anyone. Walt Whitman wrote about the spirit that Trout Mask inherited some seventy years before it was made. "A perfect writer would make words sing, dance, kiss, do the male and female act, bear children, weep, bleed, rage, stab, steal, fire cannon, steer ships, sack cities, charge with cavalry or infantry, or do any thing, that man or woman or the natural powers can do," Whitman wrote in An American Primer. Trout Mask dances and weeps and bleeds and sacks cities by integrating free form verse, the urban blues of Howlin' Wolf, the gospel blues of Blind Willie Johnson, and the free jazz of Ornette Coleman. The record was so dissonantly original that it annoyed more people than it attracted. People even heard it in records that weren't Trout Mask. A friend of mine once bought one of John Coltrane's more wildly improvisational records (Live in Seattle) only to call me later to complain, "This isn't Coltrane. It's Beefheart!" But to understand the shock and disbelief surrounding the release of this 1969 landmark, a wildly original (and unsurpassed) experimental leap in popular music, one that defied all the reasons why we listen to popular music, you have to first consider the music already on the airwaves, or perhaps about to arrive there.

That same summer, the aching harmonies of country rock were just being fully realized when Gram Parsons and Chris Hillman of The Byrds formed The Flying Burrito Brothers with their first record, The Gilded Palace of Sin. Another Byrd, David Crosby, plus former Buffalo Springfield singer/songwriter Stephen Stills and ex-Hollie Graham Nash, brought their own gentle angst to the creation of Crosby, Stills & Nash. British chanteuse Dusty Springfield turned up in Memphis to prove that she was the best white soul singer of her time, while Memphis resident, Elvis Presley, rediscovered his own soul making a confidently crafted studio record, From Elvis in Memphis, shortly after a surprisingly successful television special. The Beatles, on the other hand, were about to acrimoniously depart from the pop stage that they had built seven years earlier in Liverpool with the late summer release of Abbey Road. A ten-year-old singer from Gary, Indiana, named Michael Jackson also emerged with a mission to change R&B along with his brothers in The Jackson 5. Into this eclectic gumbo of pop metamorphosis, Trout Mask Replica appeared on the scene totally oblivious to the musical, political, and cultural environment surrounding it.


Most of the other performers that summer, who were making their shift towards either stardom or oblivion, made their moves with one eye on the audience they carried on their backs. Captain Beefheart & The Magic Band made no concessions to anyone. They came out of a hermitage, not a popular culture. They emerged from a house where they relentlessly practiced the most difficult music they'd ever have to play, and they did it with sounds that nobody expected to hear. For those who did discover the record, it would polarize an already polarized culture. It repelled some just as violently as it attracted listeners. "When I first heard Trout Mask, I about puked," Rolling Stone critic Ed Ward put it (not so delicately). "What is this shit, I thought. People I met talked about it in such glowing terms – not just anybody, mind you, but people I genuinely respected when it came to their music tastes." One of those people he respected was an ambitious and talented writer named Lester Bangs. Bangs wrote about Trout Mask Replica as if the Messiah had just arrived to heal a broken nation. "Captain Beefheart, the only true dadaist in rock, has been victimized repeatedly by public incomprehension and critical authoritarianism," Bangs told Rolling Stone readers. "[His] music [derives] as much from the new free jazz and African chant rhythms as from Delta blues, the songs tended to be rattly and wayward, chattering along on weirdly jabbering high-pitched guitars and sprung rhythms."

Eliot Wald, writing a few years later in Oui, knew what Bangs had heard in Trout Mask, but he also understood what offended listeners as well. "[Lester Bangs] described [Trout Mask] as the most astounding and important work of art ever to appear on a phonograph record," he began. "However it was not to everyone's taste...Rhythms are totally unpredictable; what starts as a blues boogie may end up sounding like a surrealist waltz. Everybody seems to be playing whatever came to mind, including Beefheart, whose sax, musette and simran horn solos (played through tubes that allow him to play two instruments at the same time) swoop and dive, mirroring his incredible four-octave voice. Lyrically, it's absurdist poetry...Trout Mask Replica was not an overnight sensation." Not only was it not an overnight sensation, it took (for some people) many nights of listening to comprehend its strange power. "The first time I heard Trout Mask, when I was fifteen years old, I thought it was the worst thing I ever heard," remembered Matt Groening, the creator of The Simpsons. "I said to myself, 'They're not even trying!' It was just a sloppy cacophony...About the third time, I realized they were doing it on purpose: they meant it to sound exactly this way. About the sixth or seventh time, it clicked in and I thought it was the greatest album I ever heard."

Joe Strummer

The greatest album ever heard? In 1987, Rolling Stone did list it as #33 in their Top 100 Best Rock Albums issue, describing it as "rock's most visionary album." As visionary as Trout Mask is, its influence in the years following its release was not as straightforward as that of other significant pop artists. As critic Steve Huey remarked, "[T]he influence of Trout Mask Replica was felt more in spirit than in direct copycatting, as a catalyst rather than a literal musical starting point." That spirit, which stretched itself down many winding pathways, proved Beefheart right when he pronounced one day that everyone drinks from the same pond. One such drinker from that pond was John Graham Mellor, a middle-class grave digger, who would be reborn as Joe Strummer. Years before he dreamed of making a dent in the rock conglomerate with The Clash, Strummer told critic Greil Marcus, "When I was sixteen, [Trout Mask Replica] was the only record I listened to – for a year." "The Clash have taken Beefheart's aesthetic of scorched vocals, guitar discords, melody reversals, and rhythmic conflict and made the whole seem anything but avant-garde: in their hands that aesthetic speaks with clarity and immediacy, a demand you have to accept or refuse," Marcus wrote in New West. That either/or ultimatum, which became the standard provocation offered by punk in the late Seventies, wasn't exactly the stand that Trout Mask took when it appeared in 1969. Beefheart held a more ambiguous position than punk itself offered. Trout Mask Replica was an inhabitant of the pond, possessing those who perhaps wished to be different fish in it.

Lora Logic

Another artist transformed was Mark Mothersbaugh, the founder of the synth-punk band Devo. Formed in Akron, Ohio, in 1972 by two Kent State art students, Mothersbaugh and Jerry Casale, Devo grew out of the notion of a "devolving" American society, born from the ashes of the fatal shootings of the four students at Kent State by the National Guard. In their mind, mankind was regressing, becoming rigid in its thinking and more authoritarian in attitude. Devo mirrored that world in their music with robotic rhythms and nerdish demeanour. "Beefheart was a major influence on Devo as far as direction goes," Mothersbaugh explained in 1978. "Trout Mask Replica...there's so many people that were affected by that album that he probably doesn't even know about, a silent movement of people." That movement of people seemed silent only because the record, nurtured in isolation, inspired a quiet need to be unique. So its spirit became shared subliminally among a scattering of diverse voices in a wilderness. One such individual was Lora Logic (Susan Whitby), formerly of the punk band X-Ray Spex, who founded the post-punk ensemble Essential Logic in 1978. In her song, "Aerosol Burns," from Fanfare in the Garden, her voice bursts forth like Björk on steroids as she breaks the song's title into spit consonants and vowels. While twisting her saxophone into squeaks curling around the broken sounds, Logic marries some of the raw power of punk honed in earlier bands to Beefheart's style of intricately shifting melodies. But Lora Logic wasn't the only woman inspired by the wilderness of Beefheart's music. Polly Jean Harvey was born in England the year Trout Mask was released. She taught herself guitar by listening to her parents' Beefheart albums. When she recorded her debut, Dry, in 1992, she integrated the primal charge of punk with the raw texture of the blues.
With a wry humour, like Beefheart, she savaged pop convention with a frankness that set her apart from the more self-conscious brooding of Sinéad O'Connor.


One of the more obvious figures drinking from the same pond is Tom Waits, who began as a melancholic singer/songwriter sitting at the piano bellowing heartache and longing as if he were Hoagy Carmichael reborn as a beatnik. With a raspy growl, Waits spent the Seventies depicting the lives of hipster lowlifes in songs like "Bad Liver and a Broken Heart" and "Heartattack and Vine." In 1983, he moved from Asylum Records to Island, after firing his manager and producer, then dramatically changed his recording approach with Swordfishtrombones. His new songs ("Underground," "16 Shells from a Thirty-Ought Six") took on the shape of audio poems, or maybe short stories cured in the percolating dreamscape of the unconscious, in which even his voice became part of the grain of the song. An existential Harry Partch, Waits would include (among the standard bass guitars, pianos, and drums) utilitarian devices like brake drums, metal aunglongs, and buzz saws. He ingested the rough surface of Beefheart's music without surrendering to its primal power. As highly imaginative and riveting as Waits's music is, it is still the music of a very sane man playing in the shadows. On later records, Rain Dogs (1985), Frank's Wild Years (1987), and the terrific Mule Variations (1999), he remade the blues and gospel with the same sonic eclecticism Beefheart put into Trout Mask Replica. But he did it by acting the part of a different fish rather than becoming one. Which is why Tom Waits, as wonderfully innovative as he is, won't scare people away from their stereos.


There are many other contemporary groups that have tried to unlock the mystery of Trout Mask's radioactive appeal and replicate it. The underground post-punk Scottish band, Nectarine No. 9, on their 1994 album Guitar Thieves, drew from the dissonant well of Albert Ayler and Vic Godard & The Subway Sect to create what one critic called "a melancholy elegance." Groups were also popping up naming themselves after Trout Mask songs like Sweet Sweet Bulbs, Bill's Corpse and Dali's Car. A number of female rock bands put together a tribute disc in 2005 called Mama Kangaroos: Women of Philadelphia Sing Captain Beefheart, which covered a great number of the album's tracks, among them the Dadaist sea shanty "Orange Claw Hammer." But music wasn't the only area infiltrated by Trout Mask. In 1997, novelist Robert Rankin wrote an absurdist autobiography titled Sprout Mask Replica. The front jacket is a facsimile of the album cover, featuring a man in a fedora with a sprout face wearing a suit and tie with slogan buttons all over his jacket. The book is an elliptical tale filled with short anecdotes about Rankin's mythical ancestors. While one group, the Crombies, eats metal, Rankin himself is portrayed as a man with the power of a chaos butterfly (an insect of transformation out of Trout Mask's "Pena"). Like Beefheart's record, Rankin isn't interested in telling a formal story. The narrative is told through many threads, some leading down paranoid trails to conspiracy theories. Rankin draws inspiration from the record by making the characters aware that they are in a book, just as Beefheart consistently made the listener aware that they were listening to a record. For an album that few people cared to listen to in 1969, Trout Mask Replica found its way, like a termite through wood, into the unconscious of the culture at large.

-- February 16/14 

Adolescence and Adolescent: Coraline, Pleasantville and Watchmen


Coraline (2009)

When I was a kid, I used to love those pop-up books where, when you turned each page, the characters (and their peculiar characteristics) would jump out at you. In his animated adaptation of Neil Gaiman’s SF fantasy novel Coraline (2009), Henry Selick elegantly employs 3-D to invoke essentially the same effect (as Martin Scorsese would later do with wondrous aplomb in his 2011 Hugo). Yet you don’t find yourself thinking about how Selick (The Nightmare Before Christmas) achieves the kind of macabre splendour he does here; rather, you become saturated by the tempest of a young girl’s runaway imagination. Coraline Jones (voice of Dakota Fanning) has just moved into the Pink Palace Apartments with her socially-conscious parents, Mel (voice of Teri Hatcher) and Charlie (voice of John Hodgman), who are so busy working on their new gardening book that they don’t notice their precocious daughter couldn’t care less about foliage and dirt. Due to her parents’ neglect, she becomes curious about a tiny door in their living room wall. Although she initially fails to find out what’s inside, one night a small mouse leads her behind the door, where she encounters a replica of her family – except these parents are “perfect” and cater to her every whim and desire. What Coraline soon realizes, though, when she sees that her “other” parents have buttons for eyes, is that things aren’t as perfect as they seem.

While Coraline has a direct antecedent in Alice in Wonderland, with the little door substituting for a rabbit hole, the spirit of the story is less hallucinatory than in Lewis Carroll; Coraline is dreamy, cast under a nightshade spell. Wonderfully barmy characters continually turn up as well, like the feral cat (voice of Keith David); the Russian tenant, Mr. Bobinsky (voice of Ian McShane), who has a circus of dancing mice; and Ms. Spink (voice of Jennifer Saunders) and Ms. Forcible (voice of Dawn French), two former actresses who have grown senile (and curiously resemble the psychic sisters giving dire warnings to Donald Sutherland in Nicolas Roeg’s Don’t Look Now). Coraline’s one playmate is the spunky irritant Wyborne (voice of Robert Bailey Jr.), who putters around on an electric bike and wears a variety of masks.

The uncovering of masks is the underlying theme of Coraline, where, out of lonely abandonment, a young girl gains perception by learning to see behind appearances. (It’s no accident that it’s the eyes, what we watch movies with, that reveal the true illusion of the “perfect” family.) In life, we don’t get to choose our parents (any more than they get to choose us), and Coraline is about how we come to terms with that reality. And to reach this particular epiphany, Henry Selick gives us a truly magic carpet ride.

Reese Witherspoon and Tobey Maguire in Pleasantville (1998) 

If you discount the pulpit ranting of Network (1976), there have been very few movies that tap our daily allegiance to television. Although not entirely successful, Gary Ross's Pleasantville (1998) opens up the subject in a number of compelling ways. On a superficial level, Pleasantville plays like a virtuously noble civics lesson out of The Twilight Zone, contrasting the banality of Fifties suburban TV dramas like Father Knows Best and Leave It to Beaver with contemporary realities. Brother and sister David (Tobey Maguire) and Jen (Reese Witherspoon) are the children of divorced parents in the suburbs. While Jen acts out her rebellion by playing the role of the coolest chick around, David longs for the conservative family values expressed on his favourite retrograde television program, Pleasantville. One night, with the help of a wizardly TV repairman (Don Knotts, from The Andy Griffith Show), they fall into their television set and into the Pleasantville show, becoming "Bud" and "Mary Sue," the son and daughter of George (William H. Macy) and Betty Parker (Joan Allen). Besides being totally in black and white, the world they've entered seems like an idyllic American smalltown.

At first glance, it might be the borough of Sam Wood's Kings Row (1942), except Robert Cummings isn't around to discover Freud and Ronald Reagan isn't getting his legs amputated by a psychopathic doctor played by Charles Coburn. You could even confuse the town of Pleasantville with the Lumberton of David Lynch's Blue Velvet (1986), only there's no virginal innocent like Kyle MacLachlan hiding in sultry Isabella Rossellini's closet or Dennis Hopper shouting menacingly that he'll fuck anything that moves. Pleasantville isn't a satiric drama about the darker underside of the white picket fence American town. Folks who live in Pleasantville, where sex and violence are as absent as rain, think it's paradise. Rather than go for a dichotomy between the "fake" and the "real" America, as in Kings Row, Blue Velvet and other lesser films, Pleasantville tries another perspective. David feels right at home in the bosom of his new loving parents – the comfort of a family circle he's never known. At first, Jen is aghast at the blandness all around her and desperate to leave. However, when she introduces one of the boys from school to sex, Pleasantville slowly begins to change from black and white to colour. The town also transforms into a contentious place where art, music and politics evolve into a living history that changes the lives of everyone – including David and Jen – in unexpected ways. In other words, the propulsion of forces that became the cornerstone of the cultural revolution of the Sixties would both expose and transform the America that lay dormant in the shadow of its founding ideals.

The conformist America of the Fifties, explored and smartly satirized in black and white in Invasion of the Body Snatchers (1956), is the antecedent for the insipid homogeneity that swept America while it was napping. But what happens in Pleasantville is the reverse. It's about a "pod" America that's brought back to colourful life. What's more, in Pleasantville, we see an America full of disguises, where masks also become mirrors and the quaint America created by Louis B. Mayer and Irving Thalberg in Hollywood is undermined by the vibrant, sometimes frightening one hidden away. Although this theme is spelled out a little too blatantly in Pleasantville, there are moments that still carry a reverberating tremble. Brief glimpses of a remembered authentic America burst through the movie's obvious metaphors. The attack by the black and white people on the newly created "coloureds" conjures up every Civil Rights march from Selma to Little Rock. Listening once again to Etta James singing "At Last," just as the teenagers start to discover the joys of sex, you can hear a sigh of satisfaction in her voice as if it took centuries to be expressed. Dave Brubeck's "Take Five," with its rare 5/4 time, jumps out of the mix with cool ebullience to underscore the herky-jerky adolescent jump of kids at the soda shop. When the opening notes of Miles Davis's sly and quietly alluring "So What" and Buddy Holly's sprightly "Rave On" are heard, they don't strike pangs of nostalgia in the way songs often do in period films. It's the opposite in Pleasantville – you hear them as if for the first time, and the music has you looking ahead rather than looking back. At its best, Pleasantville is about the desire to both account for and then transcend one's adolescence.

Zack Snyder's adaptation of Watchmen

It’s often assumed that if a movie adaptation is literally faithful to its source book, the film will be successful. Zack Snyder’s largely true rendering of the plot of Alan Moore’s sprawling graphic novel Watchmen unfortunately illustrates the opposite. While Snyder (Dawn of the Dead, Man of Steel) accurately reproduces many of the story points of this Eighties murder mystery set among a group of former superheroes, the tone and spirit couldn’t be more different. Moore drew on the mechanics of film noir to get inside the divided souls of his heroes, while Snyder turns to the simple, visceral world of the action genre, which concerns itself more with the body count. Put simply, Moore’s novel is a study of morals where Snyder’s film is a moralistic study.

Watchmen is set in an alternate reality that comes close to mirroring the America of the Eighties. New York police are investigating the murder of Edward Blake (Jeffrey Dean Morgan), formerly known as “the Comedian,” a superhero whose comic smile hid a dark, violent spirit. When the police end up down blind alleys, the costumed vigilante Rorschach (Jackie Earle Haley), whose image-shifting disguise lives up to his name, picks up the search. While Rorschach tracks down the others in the group that made up the Watchmen, he uncovers a plot by the government to wipe out costumed superheroes. As he further unravels the mystery of Blake’s murder, the Watchmen come to discover something more grave than the death of one of their own.

Alan Moore’s book took the themes of the great adventure superhero comics, like Batman, Superman and the Marvel collection of adventurers, and examined the hidden motives behind masked vigilantes who call themselves heroes. In doing so, he explored how recent American history, from the assassinations of the Sixties to the fake optimism of the Reagan Eighties, rendered the superhero as potentially superfluous, an anachronism with no souls to save. (In the book and film, Richard Nixon gets a third term as President and the threat of nuclear holocaust still looms.) Characters as diverse as Dr. Manhattan (Billy Crudup), a scientist who is mutated by a failed experiment like The Amazing Colossal Man of the Fifties, and the second Silk Spectre, his lover Laurie Juspeczyk (Malin Akerman), become by-products of dashed ideals and false hopes.

But the film fails to distil the dispirited tone of Moore’s prose. For example, in one scene where Rorschach dispatches a depraved killer, the book has Rorschach chain him to an oven, hand him a hacksaw, then set the building on fire, daring the killer to hack his arm off to get free. As Rorschach waits outside, the building is consumed by flames and no one emerges. He remarks to himself, wistfully contemplating his actions, “Nobody got out.” But in the film, Snyder drops the line and instead has Rorschach go on a bloody rampage, gleefully justified in committing murder. All through the picture, Snyder tries to follow the story, but he keeps losing the thread. Watchmen ends up trafficking in a primitive catharsis with all the moral complexity of a Dirty Harry movie. From the opening moments of the picture, Snyder lays on the literal symbols with a trowel: the key music of the time ("The Times They Are A-Changin'"? From when to what?), a badly caricatured Nixon, plus some deadly heavy-handed hard-boiled dialogue that would make Mickey Spillane blush. The picture also clocks in at a hefty two hours and 43 minutes. (The director’s cut is even longer.)

The divided spirit of Watchmen, which cancels out the film's possibility of being relevant as a cultural touchstone, pales next to something like the animated The Incredibles, which, with a playful cleverness, already took us inside the fractured spirit of the superhero and our ambivalent relationship to such figures. And it did so without trashing the intimate bond that we have with those tormented characters. (Christopher Nolan's The Dark Knight trilogy also sank under the solemn weight of its own gloomy, incoherent pretensions.) Watchmen essentially breaks faith with the audience. Instead of engaging our childlike fascination with superheroes and bringing us to childhood's end, Watchmen's infantile machismo plays to our most basic desire for bloodlust.

-- June 3/14  

Unanswered Questions: The Riddle of the Topical Song


"Every folk song is religious in the sense that it is concerned about the origins, ends, and deepest manifestations of life, as experienced by some more or less unified community. It tends to probe, usually without nailing down definite answers, the puzzles of life at their roots."

– John Lovell, Jr., Black Song: The Forge and The Flame.

Some years back, while I was in high school, FM radio still held the promise of surprise, along with a keen sense of artistic danger always lurking. There was the prospect of discovering something you might not have the good fortune to hear again. Late one night, on Toronto's CHUM-FM, I first encountered Creedence Clearwater Revival's "Effigy" from their 1969 album, Willy and the Poor Boys. This epic song, which concluded the first side of their fourth LP, described an act of mob violence without identifying the mob (or the subject of their anger), and it had all the insistence of a news bulletin interrupting regular programming. With a portentous melody built upon the foreboding chords of a dirge, "Effigy" carried some of the same apprehension that the news reporter's commentary did in Orson Welles's famous War of the Worlds radio broadcast just before the Martians started vaporizing the citizens of Grover's Mill, New Jersey. Listening to "Effigy" that evening before bed, I kept expecting the song to conclude just like that news reporter's broadcast did – with death – where the mob would ultimately catch the singer before he could finish documenting the crimes he was witnessing. Songwriter and singer John Fogerty continually outpaced the urgency of what he was seeing until all that was left in his dying questions was why this was all taking place. His chiming guitar, with the clawing force of a chainsaw, soon cut through those questions just like the Martians' vapour ray did through Grover's Mill. His fears quickly faded into the long night as if he'd been finally caught and silenced by the mob. And I never heard "Effigy" on the radio again.

Up to that point, AM radio had been playing, from that same record, Creedence's hit single "Fortunate Son" non-stop. Recorded at the height of the Vietnam War, the song addressed that conflict, and John Fogerty's refusal to support his country's involvement in it, with a clarity and passion that made you want to join him at the barricades in protest. Unlike "Effigy," "Fortunate Son" came from a long and noble tradition of what Bob Dylan once called "finger-pointin'" songs, topical numbers like "John Brown" and "I Ain't Marching Anymore" that would stop you in your tracks. But the topical song, both the lasting kind and the pretenders to the throne, runs the danger of sooner or later becoming co-opted. Such songs lean towards conveniently dividing the world into those who stand on the side of righteousness and those who don't, and, depending on the day, righteousness can turn out to be relative. Topical songs rarely escape their time because they are always fixed on the subject of their time.

On occasion, that problem poses a great pop quiz: How long can these songs stay relevant? Consider Dylan's "The Times They Are a-Changin'," which in 1963 foretold an age of unrest; within a few decades, the Bank of Montreal would gleefully incorporate the tune to promote its credit cards. The Beatles' "Revolution," in which John Lennon passionately laid out his ambivalent views on the political tumult of 1968, would two decades later be in the service of selling Nike running shoes. Imagine Bruce Springsteen's horror in 1984 when Ronald Reagan usurped "Born in the U.S.A.," an angry protest song that accounted for the damaged psyches of Vietnam War vets, as a patriotic number to underscore his idea of a new morning in America. A few years later, Tom Petty had to watch stupefied as David Duke, a former KKK Grand Wizard running as a Republican, chose Petty's defiant track, "I Won't Back Down," as his campaign song. As powerfully persuasive as "Fortunate Son" became in 1969, in its convictions against the Vietnam War, the track really had only one side of the story to tell; and if you shared in its vision, you went to the wall with it, and for it. Fogerty's "Effigy," though, barely had a story to tell, yet it held you in complete suspense for the entire six minutes it took to end. The facts in the song were suggested rather than stated. Who knew what side you were supposed to be on? All we did know was that the singer walked out one night and saw "a fire burning on the palace lawn" as "humble subjects watched with mixed emotions." Rather than solving the enigma of who was being burned in effigy, and why, Fogerty simply raised more questions as the fire spread across the countryside until, by morning, "few were left to watch the ashes die." As a listener, you were left with unanswered questions rather than partisan answers.


When "Fortunate Son" comes on the radio today, it still has the force to take you back to the time it dominated the airwaves, but it doesn't have the importunateness to seize the time we live in now. It remains in the past tense. Even if one wanted to apply it to any war, from Iraq to Gaza, the meanings would have to be imposed on the conflict and by those looking for the song to speak for their side. By contrast, "Effigy" didn't directly address its time, it embodied it, like a swath of tissue being swept up by the political winds. It doesn't tell the listener what to think, but invites us to think along with it. You can't get the song under your control. It escapes your desire to define it, to have it say what you want it to say. Even with its description of the Silent Majority "not keepin' quiet anymore," which evokes the voters that Richard Nixon named as his true supporters, "Effigy" breaks free of the dark shadow of his regime. It doesn't even try to sum up his time. Heard on the radio today, "Effigy" might just as easily speak for the recent murder of black teenager Michael Brown by the police in Ferguson, Missouri, where the flames of anger and protest that followed called up all the racial injustices of the past.

Back in 2006, Canadian singer/songwriter Serena Ryder released an album called If Your Memory Serves You Well, the title taken from the opening line of Bob Dylan & The Band’s song "This Wheel’s on Fire." It was an appropriate title, too. After all, most of Ryder’s record was a trip down the memory lane of some of the best Canadian pop and folk songs of the last forty years. Besides "This Wheel’s on Fire," co-written by the Canadian Rick Danko, Ryder also included versions of Leonard Cohen’s haunting “Sisters of Mercy,” Ian & Sylvia’s indelible “You Were on My Mind” and Paul Anka’s mournful ballad “It Doesn’t Matter Anymore,” which became a posthumous hit for Buddy Holly after he was killed in a plane crash in 1959. But there was one song she included where your memory might not serve you well, a track many would not have pegged as one written by a Canadian. It was a song called “Morning Dew.”


“Morning Dew” was an anti-nuclear-war folk song written by a young Toronto teenager named Bonnie Dobson back in 1961. Dobson's topical song would become a melancholic ode to a world left devastated and abandoned by nuclear holocaust, and a tune that artists from every genre would come to cover. Over the years, performers as diverse as the Grateful Dead, Jeff Beck, Devo and Robert Plant came to be just as intrigued by the song as Serena Ryder was. “Morning Dew” developed a long shelf life even if listeners didn't always recall it. The question of why is intriguing. Dobson wrote her signature song one evening, after a gig at the Ash Grove in Los Angeles, inspired by director Stanley Kramer’s film adaptation of Nevil Shute’s anti-war novel On the Beach, a movie which had all its fingers pointing. On the Beach featured a group of people in Australia waiting to be shrouded by the cloud of radioactive fallout from the bombs that had already taken out the rest of the world. One couple with a child was left contemplating suicide. “Morning Dew” was a protest song, and it was conceived at a time when folk music was attempting to change the world from one that would turn to rubble into something new and egalitarian in spirit. “Morning Dew” is framed as a conversation, but is it between a parent and child, or is it two lovers? Who really knows? Who really cares?

At the time, the exquisite darkness of "Morning Dew" laid clues that we could be facing a nuclear holocaust, but heard today, the singer could be describing global warming. Pinning the song down to specifics would deny it its particular greatness, which is what's made it last to become one of the most covered of Canadian folk songs. And like many lasting songs, it has its origins in another. The nightmare "Morning Dew" suggests had its roots in an earlier Canadian tune that in some ways could be considered its antithesis, a 1950 Ed McCurdy anti-war track called “Last Night I Had the Strangest Dream.” But if “Last Night I Had the Strangest Dream” imagined a better world out of horrifying circumstances, “Morning Dew” was a nightmare about something good that has now tragically ended. “Morning Dew” never really tells you the details of its story any more than Fogerty's narrator did in "Effigy." The song becomes essentially a mystery novel waiting for someone to solve it. It was as if by singing it you could just unlock the door to its bottomless mysteries. Many have tried. “Morning Dew” does unfasten a series of questions, but they never get fully answered. All we do know is that there is nothing left of the world – but how and why it got that way remains a secret. Unlike those many anti-war songs that dozens of people lined up to sing, as if signing up for a different kind of army, “Morning Dew” didn't offer a clear message to sign on to. It didn't provide the affirmation of spirit that folk artists embraced when they sang the many other tunes they turned into social protest.

Woody Guthrie, who knew a thing or two about "finger-pointin'" songs, once said that he didn't so much write songs as pull them out of the air. But he was being somewhat disingenuous with that remark. What Guthrie chose to 'pull out of the air' were tunes that suited his beliefs. "Effigy" and "Morning Dew" aren't so much pulled out of the air as they are the air. Their creations escaped their makers' control, their meanings moving beyond even the songwriters' intentions. (In the case of Bonnie Dobson, "Morning Dew" literally escaped her control for a number of years when songwriter/performer Tim Rose stole it from her by claiming a writing credit on it.) Woody Guthrie, like many political folk artists, wanted his songs to possess answers to the urgent issues he wished to address. John Fogerty and Bonnie Dobson might have also begun their songs with that aim in mind. But when they sat down to find those answers, the elusive questions "Effigy" and "Morning Dew" possessed offered the kind of riddles we are all still seeking solutions for.

-- August 16/14

A Drama of History: The 40th Anniversary of Randy Newman's Good Old Boys



"It's hard to hear a new voice, as hard as it is to listen to an unknown language," D.H. Lawrence wrote in Studies in Classic American Literature (1924). "We just don't listen." Lawrence wasn't just talking about something as basic as the fear of something new. New ideas, as he later suggested, can always be pigeon-holed. "The world fears a new experience more than it fears anything," Lawrence explained. "It can't pigeon-hole a real experience. It can only dodge. The world is a great dodger, and the Americans the greatest. Because they dodge their own very selves." Lawrence was addressing here the varied works of American writers James Fenimore Cooper, Edgar Allen Poe, Nathaniel Hawthorne and Herman Melville. A panoramic and illuminating study, the polemic examines how a number of gifted writers were coming to terms with the experience of a young country still in the process of finding its identity. For an artist who has barely registered on the public's consciousness, except in his movie music and his songs for Pixar pictures, singer/songwriter Randy Newman could be one of Lawrence's great dodgers – an Artful Dodger – and one who deliberately creates paradoxical narratives in his songs. And his music, like the writers of the previous century, has also been on a comparable sojourn. For almost half a century now, the country he depicts with both love and devotion is also riddled with broken promises, violence, paranoia, superstition and arrogance.

In "Davy the Fat Boy" (from his 1968 debut), Davy's best friend promises the boy's dying parents to take care of him, only to stick Davy in a circus freak show after they're gone. In "Suzanne" (12 Songs), Newman's answer to Leonard Cohen's poetic tribute to a lovely and mysterious nymph, a loner finds Suzanne's name in a phone book and stalks her like a rapist. The song "Lucinda" (also from 12 Songs) has the poor girl accidentally chewed up by a beach-cleaning machine while she snoozes in the sand. "Sail Away," perhaps Newman's greatest songs from the album of the same name, features a slave trader coercing African blacks onboard his ship by promising them watermelons and wine (not to mention opportunities to sing songs about Jesus all day) if they only come to America. On the subject of God, the gospel-inspired "God's Song (That's Why I Love Mankind)," from 1972's Sail Away, has God blithely handing down plagues, wars and famine while laughing from his home in Heaven at all the futile prayers offered him. After all, Newman chortles, that's why God loves mankind. In "The World Isn't Fair" (Bad Love), the narrator brings that great agnostic Karl Marx back from the dead to show him a world so unfair that Karl wants to return quickly back to his grave. One more recent song from 2006, "A Few Words in Defense of My Country" (Harps and Angels), Newman tries to confront the malaise and hubris of the Bush Presidency by explaining that Caligula, Torquemada, Hitler and Stalin were worse, but as critic Greil Marcus remarks, it gave him (and listeners) "cold comfort" to realize that.

Randy Newman

Randy Newman's songs, which grapple with people and issues rarely found in chart-topping popular songs, are cleverly constructed, satirical pieces charged with ambiguity. "I'm more in the tradition of Irving Berlin and Harold Arlen and those guys who were just doing their job," Newman once said. "I'm just doing the job, too, with my kind of characters." But as much as Newman owes to Berlin and Arlen, he is doing more than just a job. Many contemporary songwriters besides Newman have attempted to overturn the quaint notion of the American experience. "I sometimes think when I hear Paul Simon or Irving Berlin that we're more interested in America, but we're just trying to get it right," Newman told Timothy White in 2000. While acknowledging Berlin's artistic savvy in depicting America, Newman also recognizes the kind of dodging that disguises it. After all, didn't Berlin, who was Jewish-American, turn "White Christmas" into a song about snow rather than the holiday, and "Easter Parade" into a romantic date rather than the resurrection after the blood and nails? "Berlin can sing about loving Alabam', but God help him if he ever went to Alabama," Newman asserts. When Randy Newman recorded his fifth album Good Old Boys, forty years ago this month, he went directly into that troubled South. With a romantic musical aesthetic out of Irving Berlin, he painted a picture of the American South that captured its many seductive contradictions torn out of the history of a defeated people. Besides being his most commercially successful record (it reached #36 on the Billboard charts), the album has since inspired a new book (David Kastin's Song of the South: Randy Newman's Good Old Boys, Turntable Publishing), and its songs ("Louisiana 1927," "Rednecks") continue to be relevant, invoking the rich dualities within the American character.

Good Old Boys was also Randy Newman's bid to come to terms with an evolving South by looking into its past and subsequently taking us to the present – the horrific 1927 Louisiana flood ("Louisiana 1927"), the corrupt Louisiana governor Huey P. Long ("Kingfish," "Every Man a King"), and Lester Maddox ("Rednecks") – as a means of grappling with a country in crisis as the President of the United States was about to resign in disgrace over Watergate. The opening track, "Rednecks," was perhaps the key to the album's rich complexity. It was inspired by watching Lester Maddox, the segregationist governor of Georgia, being humiliated on The Dick Cavett Show, a sophisticated TV talk program out of New York. While Maddox was indeed a racist, Cavett, along with novelist Truman Capote and football player Jim Brown, decided to spend the show ridiculing and mocking him until he got up and left. Newman had been born in Los Angeles, but he had spent many years as a child in the American South, where the culture, with all its peculiarities and traditions, had seeped into his blood. So he understood the vile characteristics that made up Maddox's racism, but he equally understood the Northern liberal condescension which he felt dictated the show. "The audience hooted and howled, and Maddox was never given a chance to speak," Newman said. "So I wrote the song, and Northerners have recognized ever since that they are as guilty of prejudice as the people of the South." If Maddox had been allowed to speak, Cavett's audience might even have discovered something more complex about the governor than they at first assumed.

Lester Maddox being ridiculed on The Dick Cavett Show in 1970.

Born in a working-class district of Atlanta, Maddox was a high school dropout who found himself embroiled in one of the most contentious periods in American history. It was Maddox who, the day after the 1964 Civil Rights Act had been signed into law in Washington, chased blacks from his Pickrick fried chicken restaurant. Not long after his defiant stand against integration, Maddox became even more obstinate, choosing to sell his restaurant rather than be forced to serve blacks. Adopting a pick handle as a symbol, Maddox soon entered politics, where he made two unsuccessful bids for mayor. In 1966, however, he captured the Democratic nomination for governor after an odd fluke of circumstance. During the general election, Maddox trailed Republican Howard H. "Bo" Callaway, but write-in voters ensured that neither got a majority of the vote. The election instead was thrown to the Democratic-dominated Georgia legislature, which put its support behind Maddox.

To the surprise of just about everyone, the newly elected Maddox urged peace, declaring that there would be no place in Georgia "for those who advocate extremism and violence." He also confounded critics by appointing more blacks to key positions in government than had any of his predecessors. In 1968, however, Maddox reverted to his old form, stirring up resentment when he refused to close the state capitol for the funeral of the assassinated Martin Luther King Jr. and raised hell over the state flags being set at half-staff. Despite his odious behaviour, Lester Maddox could confound both his critics and his admirers, even though he was usually portrayed as a caricature possessed of a slick pate, thick glasses and a folksy humour. Maddox was an easy target for jokes. Since Cavett had already judged Maddox a clown, the governor never had a chance to present himself in any other light. "He wasn't even given a chance to prove what an idiot he was," Newman said. "It was like, they sat Jim Brown next to him, and the crowd was razzin' him...He didn't get a chance to do anything, and they just elected him Governor, in a state of six million or whatever, and if I were a Georgian, I would have been offended, irrespective of the fact that he was a bigot and a fool."

Lester Maddox

"Rednecks" was written from the point of view of a redneck (it began as part of Johnny Cutler's Birthday, a concept album about a Southerner that would eventually evolve into Good Old Boys). The song opens with a description of the Dick Cavett debacle, where Newman tells us that Maddox "may be a fool, but he's our fool," implicating both himself and the country in the calamity of its racist history. From there the song's narrator (originally the character Johnny Cutler) launches into a hilariously self-deprecating appraisal of his Southern heritage with a series of identifiable stereotypes ("We talk funny down here/We drink too much and we laugh too loud/We're too dumb to make it in no Northern town/And we're keeping the niggers down"). The chorus is cleverly designed to get white liberals on Newman's side by smugly confirming the obvious redneck attributes ("We're rednecks, we're rednecks/We don't know our ass from a hole in the ground/We're rednecks, we're rednecks/And we're keepin' the niggers down"). But before we can draw any satisfaction from the putdown of Southern crackers, Newman plays an eloquent piano bridge, in the manner of Stephen Foster, which puts us in the spirit of the Old South. The effect is to pull the rug out from under the Northern liberal conceit. Newman begins to name the ghettoes outside the South where blacks are anything but liberated, including Harlem, Hough in Cleveland, the east side of St. Louis, the Fillmore district in San Francisco and Roxbury in Boston. Taken together, they tell a woeful story of criminal neglect and urban squalor that puts the North on the same playing field as Southern racists.

This hideous irony has a long-standing history, according to Mark Royden Winchell, a professor of English at Clemson University. In his essay "The Dream of the South," a study of the role of Stephen Foster in American music, Winchell explains that the double standard Newman sings about was argued long before the Civil War. "Throughout the 1850s, Southern apologists argued that slaves in the South enjoyed more social and economic security than the typical factory worker in the North," Winchell wrote. "Even in the debates over slavery immediately prior to the War between the States, honest Northern liberals such as Orestes Brownson noted the hypocrisy of Yankee abolitionists decrying slavery in the South while turning a blind eye to the plight of factory workers in their own backyard." For Newman, a simple putdown of the redneck would have been worse than pandering to a liberal conceit – it would have failed to provide the historical context that created him, as well as the forces that divided the country in the post-Civil War years.

Lenny Bruce

In "Rednecks," Newman also gives the lie to one of comic Lenny Bruce's most celebrated routines of the early Sixties, "Are There Any Niggers Here Tonight?" After a short preamble on integration, Bruce asks the audience that question. The crowd, usually consisting of both blacks and whites, would get restless and uneasy. Bruce proceeded undaunted, pointing at the audience and calling out to the "kikes," "niggers," "spics," and "micks." By the end, he was parodying a gambler at a poker game ("I pass with six niggers and eight micks and four spics") and finally arriving at his point:

"The point? That the word's suppression gives it the power, the violence, the viciousness. If President Kennedy got on television and said, 'Tonight, I'd like to introduce all the niggers in my cabinet,' and he yelled 'niggerniggerniggerniggerniggerniggernigger' at every nigger he saw...till nigger didn't mean anything anymore, till nigger lost its meaning – you'd never make any four-year-old nigger cry when he came home from school."

At its best, Lenny Bruce's art combined satire with shock, but here he dips into preaching and phony moralizing. His solution to racism would hardly end racial intolerance, or shield four-year-old black children from the pain associated with the slur. Moreover, if "nigger" lost its meaning, it would also lose its tragic historical significance and, as a consequence, give comfort to racists everywhere. Today, of course, the word is hardly suppressed; it's been uttered quite freely by both black and white rappers, as well as other hipsters. Yet it still has the capacity to cause anger and pain. Newman makes us aware of that pain. Despite Newman's awareness, though, "Rednecks" was banned in Boston during the desegregation busing riots of 1975. So the word still hasn't lost its ability to offend. "There was a black kid in Louisiana who was offended because he was sitting in an audience of fifteen-hundred white people who were roaring at the fact they were rednecks," Newman once told interviewer Paul Zollo.

All through Good Old Boys, Newman provides character dramas that bring us a fuller comprehension of the people and circumstances he depicts. In "Birmingham" (which also began as part of the abandoned Johnny Cutler's Birthday project), danger and menace lie beneath a sense of civic pride. "I like the guy being proud of where he's from, even if the city has a bad reputation, even in the South, for being ugly," Newman would recall to Timothy White. "Now it's entrepreneurial and a second Atlanta, but when I wrote it, it wasn't thought of as anything but sorta dirty and low." At first, "Birmingham" seems as affectionately nostalgic as The Beatles' shimmering "Penny Lane," but then Newman quickly reminds you of the meaner side of that adoration ("Got a big black dog/And his name is Dan/Who lives in my backyard in Birmingham/He is the meanest dog in Alabam'/Get 'em Dan"). The majestic "Louisiana, 1927" renders a Southern tragedy (echoed in 2005 during Katrina), a Louisiana flood that brought wreckage, death and national neglect, yet Newman burrows into the wounded heart of the matter without sacrificing the rage that indifference foments.

Huey P. Long

Following "Louisiana, 1927" are a couple of songs about Huey P. Long, the controversial and demagogic politician who would be elected governor of Louisiana the year after the flood. The songs include Long's own "Every Man a King," followed by "Kingfish," named after Long's nickname, which some claim was inspired by a character of the same name in the radio comedy Amos 'n' Andy. (Long insisted the name had another meaning: "I'm small fry here in Washington. But I'm the Kingfish to the folks down in Louisiana.") In "Kingfish," Newman explores – and parodies – Long's paternal leadership ("Who took on the Standard Oil men/And whipped their ass/Just like he'd promised he'd do?/Ain't no Standard Oil men/Gonna run this state/Gonna be run by little folks/Like me and you"). In the chorus, where Newman identifies the Kingfish, the strings soar in mock triumph as he announces that "the Kingfish gonna save this land." Long would become so legendary that novelist Robert Penn Warren drew on that legend for All the King's Men (1946), which featured a fictional governor named Willie Stark, clearly patterned on Long. The novel, a study of an American style of fascism, would ultimately be adapted into an Academy Award-winning film in 1949. Warren's book brought to light, as Randy Newman would in "Kingfish," the convoluted history of the South. "If you were living in Louisiana you knew you were living in history defining itself before your eyes and you knew that you were not seeing a half-drunk hick buffoon performing an old routine," Warren wrote. "But witnessing a drama which was a version of the world's drama, and the drama of history, too: the old drama of power and ethics." Newman's decision to depict Long, whose political star had risen after the events described in "Louisiana, 1927" and whose career would end in corruption and assassination, couldn't have been more timely.
The recording of Good Old Boys happened to coincide with the downfall of President Richard Nixon, a more contemporary demagogue on the verge of leaving office after the Watergate scandal.

Nixon would also get his own tribute on Good Old Boys in "Mr. President (Have Pity on the Working Man)." It was even recorded on August 8, 1974, the same night Nixon, in an effort to stave off impeachment hearings, gave his resignation speech. Although the song is a simple plea for compassion from America's leader, Newman also addresses Nixon as a figure with some familiar Long-like attributes ("Maybe you're cheatin'/Maybe you're lyin'/Maybe you have lost your mind/Maybe you're only thinking 'bout yourself"). Newman portrays Nixon, like Long, as a dreamer caught up in a quest for absolute power, and sets out on Good Old Boys to illustrate how both dreamers represent the promise and the failure of their country's largest ambitions. To frame "Mr. President (Have Pity on the Working Man)," Newman draws partly from the well of lyricist E.Y. "Yip" Harburg, who co-wrote "Brother, Can You Spare a Dime?" with Jay Gorney during the early part of the Depression. Their song focused on an everyman who is the victim of broken promises. In "Mr. President (Have Pity on the Working Man)," Newman borrows some of Yip's outrage to express his own disgust at a man who made a travesty of his own presidency.


Good Old Boys shares an affinity with the first two records by The Band (Music from Big Pink and The Band), which also celebrated the rich dualities within the American character while parsing the troubled strains of promises broken and promises that couldn't be kept. To paraphrase journalist Pete Hamill on the work of Bob Dylan: If totalitarian art tells us what to feel, Newman's art feels, and invites us to join him. Since doctrinaire thinking has perhaps never been more in vogue than it is today, Newman's work on Good Old Boys comes across as refreshingly candid, music that engages us with ideas and challenges our prejudices and assumptions. Good Old Boys is the music of a dreamer who never tries to find comfort in the dream. Newman doesn't sing of the quaint America found in Phil Alden Robinson's Field of Dreams (1989), in which a farmer builds a baseball diamond on his land in an effort to heal the ruptured values of his country and family. Field of Dreams perpetuates a fraud that neatly airbrushes corruption, segregation and racism from its nostalgic reflection of America, and urges conformity to the comforting moral values it sets before us. Newman's portrait of America on Good Old Boys encompasses all the things we don't find warm and comforting – and that includes the misfits and outsiders he sings about.

"The real American day hasn't begun yet," D.H. Lawrence wrote in 1924 about the young country before him. "American consciousness has so far been a false dawn...You have got to pull the democratic and idealistic clothes off American utterance and see what you can of the dusky body of IT underneath." Good Old Boys represents an avid quest to pull off those clothes and uncover that dusky body of American utterance. To be an American dreamer, Newman's songs continue to remind us, means learning how to live with the unresolvable truths of what those dreams mean – a hardship to endure and a challenge to embrace.

-- September 13/14



                                                                     2015


The Tracks of Our Years: The Music of The Wanderers (1979), Diner (1982) and Wild (2014)


Barry Levinson's Diner.

Most coming of age movies that examine the tracks of our years let the music of the era do the walking for them. George Lucas's American Graffiti (1973), for instance, took a softer, more genial look at the past, providing a perfectly programmed jukebox of iconic songs from the late Fifties and early Sixties in order to wax nostalgic. Lucas was displaying his marketing savvy as well, even before Star Wars (1977), in creating a merchandising scheme to sell albums filled with hits for those needing to drift back happily to their good ol' days. Barry Levinson's Diner (1982), however, gave the lie to the Archie comics sensibility that Lucas trafficked in, and let the songs of Elvis Presley ("Don't Be Cruel"), Fats Domino ("Whole Lotta Loving") and Bobby Darin ("Dream Lover") simply become the air the characters breathed. The Del Vikings' propulsive "Come Go With Me," for example, is used in both films but is much more memorable in Diner. When Tim Daly's Billy Howard, a reticent WASP, arrives at the Baltimore train station to be best man at the wedding of Steve Guttenberg's Eddie Simmons, he's greeted by all his old friends, and Daly strides confidently towards them in perfect time to the Del Vikings. It's as if the tune's seductive swing and rhythm provided a casual sway that only his old hometown could awaken in him. The music in Diner essentially interacts with the characters, like an ambient intoxicant that seems to fuel the comic patter keeping them up all night in their favourite roadhouse haunt.

American Graffiti and Diner both captured key elements of the cultural and political shift between the Fifties and Sixties, a seismic wave that carried Chuck Berry's "Sweet Little Sixteen" literally crashing into The Beach Boys' "Surfin' USA." But no American film made sense of that cultural revolution with more guile and imagination than Phil Kaufman's little-seen adaptation of Richard Price's The Wanderers (1979). It's loosely based on Price's 1974 novel about a youth gang in the Bronx, named after Dion's hit song "The Wanderer," and the other New York City misfits they encounter and clash with. Set in 1963, the plot is a coming of age story not only for the hormonal kids, but also for the adults, and for a nation on the verge of JFK's assassination. In an early scene from the picture, as the forbidding Fordham Baldies stride onto the screen and gather outside the Marine recruitment office, we hear The Four Seasons' triumphant "Walk Like a Man" on the track. The comic use of the song is ironic because, despite the overwhelming presence of these very large young hoods with shaved heads ("Those guys look like a bunch of pricks with ears," opines the diminutive Joey of the Wanderers), their body language carries a tinge of male homoeroticism echoed in the shimmering falsettos of The Four Seasons. The heterosexual relationship the outsized and aptly named leader Terror (Erland van Lidth) has with his opposite number in stature, Pee Wee (Linda Manz), barely counts as straight because one can't begin to comprehend the two of them copulating.

Terror, Pee Wee and the Fordham Baldies.

As the picture goes on, though, the songs begin to function more like operatic arias, speaking for what the characters themselves can't put into words. When Terror decides to sign up to be a Marine and fight in Vietnam, Pee Wee looks as defeated and forlorn as The Shirelles did tracing those same lines of grief in their poignant hit, "Soldier Boy." When Richie Gennaro (Ken Wahl), the leader of the Wanderers, cheats on his girlfriend Despie (Toni Kalem, who would years later play Angie Bonpensiero, the discarded spouse of 'Big Pussy' in The Sopranos, as if that marriage became the future The Wanderers ultimately foretold), The Angels' "My Boyfriend's Back" provides an ironic counterpoint to what is happening before her eyes. All through The Wanderers, the pop soundtrack does more than reflect the times; these tunes suggest why they became essential possessions for the people who heard them and loved them. When JFK is assassinated, Richie and Despie reconcile their differences and decide to marry, but the song that captures the moment of tragedy, while providing a walking bridge for them to the altar, is Ben E. King's "Stand by Me." If Sam Cooke's "A Change is Gonna Come" is forever linked to the Civil Rights struggle, Ben E. King's soft and stirring anthem could forever invoke Dallas in 1963. "When I was shooting Goldstein, we came out on the street one day and I saw people were staggering down the street crying," Phil Kaufman told The Hollywood Interview in 2008. "We were walking around with our cameras and saw a bunch of people standing around a store window, looking in and crying. That was how I found out that JFK had been killed. We duplicated that in The Wanderers with the people looking in the department store window at all the TVs, watching the news that JFK has been assassinated."
The scene itself movingly depicts a familiar zeitgeist moment, but the use of "Stand by Me" enlarges both the national event and the personal, fragile plea made between Richie and Despie. (Years later, "Stand by Me" would be shrunk to the size of a pea in the treacly 1986 Rob Reiner coming of age story that took the song's title, but none of its emotional power.)

Toni Kalem and Ken Wahl in The Wanderers.

Written by King along with Jerry Leiber and Mike Stoller (who had penned songs for Elvis, The Coasters and The Drifters), "Stand by Me" was inspired by "Lord Stand By Me," a 1905 gospel standard by Reverend Charles Albert Tindley, in which Tindley calls on a mighty and higher power to deliver him "when the storms of life are raging." The pop adaptation has since racked up over 400 recorded covers, though King almost didn't record it until Leiber and Stoller helped him complete it. When "Stand by Me" became a hit single in 1961, just as the country was at war with itself over segregation and mapping out the New Frontier, the composition spoke of dark nights that could be overcome. But in 1963, when the song first appeared on an album, its depiction of skies tumbling and falling seemed almost impossible to transcend in the wake of the President's murder. And that's where the song takes you in The Wanderers, where the appealing adolescent cockiness of Dion's "Runaround Sue" and "The Wanderer" gives way to the adult emotions of "Stand by Me" (and, later in the film, to Bob Dylan singing "The Times They Are a-Changin'," which sends Richie into retreat from this new world being born, back into the safe and familiar one that's about to be irrevocably changed by the decade ahead).

But the films I've mentioned so far locate their stories in the distant past and were made some time ago. In his beautifully elliptical adaptation of Cheryl Strayed's 2012 memoir, Wild: From Lost to Found on the Pacific Crest Trail, Jean-Marc Vallée does something equally radical with the music, but to achieve a different impact. A young woman, Strayed (Reese Witherspoon), hikes the Pacific Crest Trail after the death of her mother. But she isn't on this trek to purify her life of the self-destructive habits she acquired growing up in an embattled family (as you find in most life-affirming memoirs). Strayed uses the songs she knows to mark the miles of an umbilical cord that ties her to the mother (Laura Dern, in a beautifully cadenced performance) she lost to cancer. But Strayed also seeks her independence from that bond so that she can discover, as critic Steve Vineberg suggested in Critics at Large, the daughter her mother understood her to be.

As in his 2005 breakthrough, C.R.A.Z.Y., Vallée's coming of age story examines the intricate ways music can shape our perceptions of a world we feel estranged from. Though Wild doesn't showcase the music in the dazzling way C.R.A.Z.Y. did, with set pieces like The Rolling Stones' "Sympathy for the Devil," which plays with beatific blasphemy in the head of the young hero as he watches a Catholic church ritual in his Quebec hometown, Vallée builds lightning-quick visual motifs around the various tunes that play in Cheryl's memory. Song fragments reach out to other fragments, tying together a narrative that only she can string together. In the book, Strayed described her music as a “mix-tape radio station” that played in her head, where she kept “playing and replaying scraps of songs and jingles in an eternal, nonsensical loop.” But Vallée (with the help of Nick Hornby's deft script) makes sense of that loop by including the music Cheryl inherited from her mother and also needs to battle (like Simon and Garfunkel's "El Condor Pasa").

Reese Witherspoon in Wild.

Much of the drama and humour in the picture comes from how these songs possess her and trigger a variety of memories that are inseparable from the music. Memory becomes a chance game of free association, where Portishead's "Glory Box" can suddenly shake hands across the years with Bruce Springsteen's "Tougher Than the Rest," and Free's "Be My Friend" can build a bridge to the Pat Metheny Group's "Are You Going With Me?" Occasionally, a song does something less conventional than just speak to the drama on the screen, as in the scene when Cheryl hitches a ride with a father and son and The Shangri-Las' powerful 1965 melodrama, "I Can Never Go Home Anymore," comes on the radio and pulls Cheryl back into her mother's orbit. While the use of the song may seem obvious (given the subject), it's lead singer Mary Weiss's sense of unrequited loss that makes the song so irresistible to the scene. "I had enough pain in me to pull off anything," Weiss said in a recent interview about a song she recorded when she was all of sixteen.

In an age when girl groups ruled in their own pop kingdom, The Shangri-Las were shimmering alchemists who took the melodramas of the teen angst genre, in heartbreakers like "Leader of the Pack" and "Remember (Walking in the Sand)," and turned them into biting psychodramas. Formed in Queens, New York, the group could make you hurt and simultaneously make you glad that you could indeed be wounded. "I Can Never Go Home Anymore," the tale of a young woman who leaves home for a boy and then has too much pride to reconcile with her mother, is cutting enough to draw blood. (Lines such as "she grew so lonely in the end, the angels picked her for a friend," or Mary Weiss's primal cry of "Mama!," immediately invoke the conclusion of Douglas Sirk's Imitation of Life, when Susan Kohner, a young black woman trying to pass for white, comes back to the funeral of the mother she's rejected and dissolves in grief before our eyes.) For Cheryl in Wild, though, "I Can Never Go Home Anymore" does the opposite of its title: it brings her unexpectedly back to harmonious times with her mother rather than to the endured pain of separation. Unfortunately, the driver's son, engrossed in a book and at a silent remove from his dad, snaps off the radio, leaving his father and Cheryl suspended in time. "I love you, too, son," the father says with a tone of sarcasm that makes the track perhaps even more poignant to consider. Turning off the radio, however, only ends the song. The residue it leaves behind never vanishes.

-- March 7/15

Still Crazy After All These Years: Network (1976)


Peter Finch's Howard Beale is "mad as hell" in Network.

The last few months I've been noticing, especially in my Facebook news feed, a continued reverence for Sidney Lumet's 1976 film Network, his loud and abrasive rendering of Paddy Chayefsky's broad satire about the shift in television journalism from hard news to glib entertainment. The picture seems to be getting acclaimed all over again for its sheer prescience in revealing how the corporate control of television news has turned the sacred screeds of Edward R. Murrow into the boorish rantings of Bill O'Reilly. Whether talking about Donald Trump's candidacy for President, the shout-fests that litter the prime time broadcasts on Fox News, or, more recently, the tragic shooting deaths of TV reporter Alison Parker and photographer Adam Ward live on morning television in Virginia (simply because the news anchor in Network is murdered on air due to poor ratings), folks online are revisiting the picture for clues as to how it all went wrong. You'd think that Lumet and Chayefsky were sages who saw it all coming. I've often made the case that Robert Altman's Nashville (1975) and Arthur Penn's Alice's Restaurant (1969) had their ears to the ground in anticipating the political and cultural changes taking shape around them. But those films were pensive and elliptical works that called upon the audience to contemplate what some of those shifting dramatic themes were all about. Network doesn't allow you to think; it tells you emphatically (and with a tin ear) what to think. Network is a noisy collection of broadside rants that – seen today – are no more perceptive than one of Bill O'Reilly's nightly belches. Instead of being an outrageous and equal opportunity satire that spares nobody, Network is full of homilies that reveal more of Paddy Chayefsky's fortune cookie idea of humanism than any honest reckoning with the dumbing down of the glass teat.

During a drunken night carousing with his old friend, news director Max Schumacher (William Holden), veteran UBS anchorman Howard Beale (Peter Finch) discovers that he's been fired due to low ratings. So Beale announces on his next newscast that he's tired of all the bullshit and is going to shoot himself live on the air. But rather than causing panic and alarm, his angry tirade and suicide threat create a huge boost in the ratings (coming just as the news department is about to be the victim of corporate meddling at the hands of Robert Duvall's company man, Frank Hackett). The popularity of Beale's nightly diatribes opens the door for Diana Christensen (Faye Dunaway), an ambitious careerist who'll do anything for power and success. That includes cutting a deal for an upcoming fall docudrama, The Mao Tse-Tung Hour, with a group of radicals, the Ecumenical Liberation Army, a parody of the Symbionese Liberation Army that kidnapped heiress Patty Hearst. Not content with showing the bald opportunism at work in network news, though, Chayefsky (as he had earlier in the decade in his script for Arthur Hiller's 1971 The Hospital) also delves into the menopausal misery of middle-aged men (Holden in this picture and George C. Scott as a Chief of Medicine in The Hospital) who get seduced by threatening younger women. When Beale discovers that CCA (Communications Corporation of America), the conglomerate that owns UBS, is about to be bought out by a larger Saudi conglomerate, he goes on the offensive against the deal. Those who put Beale on the air quickly send in their chairman, Arthur Jensen (Ned Beatty), to talk Beale into accepting the corporate line and changing his tune on the air. When Beale starts telling people in his tirades that he believes democracy is hopeless, his ratings collapse and he's assassinated.

Sidney Lumet and Paddy Chayefsky on the set of Network.

It's curious that a full-out lampoon of a visual medium happens to be so full of words. Yet that's the key to Network's appeal and success – the fear and distrust of images. From the early emergence of the visual arts, audiences resisted their pagan pull. When movies first arrived, it was theatre people who grew suspicious of a visual art form that called on you to give in to its power and surrender your control. As television appeared on the cultural horizon, movie audiences in turn argued against – and dreaded – a medium that invaded the sanctity of the home. When Network came out, I was studying film in college, and none of us there found cause to discuss it seriously; we were caught up in more challenging work, from Jonah Who Will Be 25 in the Year 2000 to Carrie. (Even visually, cinematographer Owen Roizman's work looked totally washed out, as if he were afraid to demonstrate the dynamic boldness he had shown in The Return of a Man Called Horse.) It was writers who seemed to view the film as some urgent manifesto against everything they feared. One Canadian poet I knew was so alarmed by the message of Network that he was inspired to type a whole novel on toilet paper (using both single and two-ply) called Shit. His work was actually more entertaining and imaginative than Network. (As people continued to use one of these ass-wipers, those new to the commode would be missing essential pieces of the narrative. Even so, I'm still not convinced as to how Network leads to Shit.) Many viewers of Network failed then, and fail now, to see the fashionable fundamentalism at its core. Like a modern-day Moses throwing daggers into the Golden Calf, Paddy Chayefsky wasn't interested in looking at how corporate culture took over network news and why. Network instead drew its power by exploiting the hangover following Watergate, along with the collapse of the counter-culture and the rise of homegrown terrorist networks, to tell us that television had taken away our souls.
But how can you prove spiritual larceny, when most of the characters have no souls to steal?

Peter Finch's Beale is believably seasoned as a veteran anchor, but we're never sure if he's truly mad or whether he's simply playing a holy fool. His key moment of madness, which has since become iconic, only adds to the confusion. Beale goes on the air and marshals the public rage over the state of the nation into a call to action. He asks everyone to stick their head out the window and yell, "I'm mad as hell and I'm not going to take it anymore," and the whole country follows suit. But their passive acquiescence to Beale's request doesn't come across as an act of protest; rather, it makes him out to be merely a mad prophet of conformity – the very thing the picture is supposed to be railing against. Terry Jones's Life of Brian (1979), an absurdist satire of the Christ story, did this gag much better. In the scene where Brian publicly tells his followers that they are all individuals, they cry back in unison, "We are all individuals!" (except for one malcontent who cries, "No, I'm not!"). By the time Ned Beatty's Babbitt figure converts Beale to the corporate line, Network further betrays its rebellion against compliance. When Beale tells his audience that they might as well give in to the corporate agenda, the picture unwittingly swallows the popular despair of those who have already cynically given up on the validity of the political process.

Faye Dunaway and William Holden

William Holden's Max holds the screen with the authority of a leading man, but his affair with Diana Christensen, the blonde ice queen described as "television incarnate," makes her the stock villain used to disguise the priggishness of his character. Despite the thankless role, though, Dunaway is almost as sexy here as she would be a few years later as the fashion photographer in Eyes of Laura Mars. The carnal heat she generates in Network could be the thrill of the game if Chayefsky hadn't rigged the rules to make her both the victim and progenitor of TV's soul-snatching. What's surprising, in retrospect, is the absence of any feminist reaction to the creation of a female character whose desire for power seems to be tied to her sexual neurosis. (Max's menopause, by contrast, is supposed to add to his integrity even when he admits his feelings for Diana to his endearingly faithful wife, played by Beatrice Straight.) Robert Duvall is such a caricature of boardroom malevolence that he resorts to showboating to prove he's the corporate meanie. (He has one plaintive moment, though, after Beale helps nix the Saudi deal, when he laments mournfully, "I'm now a man without a corporation.")

For all the talk of this being one of Sidney Lumet's great New York movies, Network doesn't come close to striking the nerve that his previous Dog Day Afternoon did. Unlike the heavy moralizing in Network, Dog Day Afternoon bristles with the raw excitement of a city fighting to be recognized. The role of the television media there is also far more dramatically convincing. When Al Pacino's bank robber taunts the police with shouts of "Attica! Attica!," and the gathering crowd screams its approval, he's not making a statement like Beale's "mad as hell" moment. Pacino is invoking the complicit violence of the authorities at Attica Prison in 1971, when an uprising of prisoners led to 43 deaths. He's identifying with those we saw victimized on television, and his shouts magnetize the crowd while terrifying the police. The scene is both comic and stirring, and the mounting volume of screaming becomes funny because no one thinks they're being heard even when they're yelling at the top of their lungs. Lumet often worked best when he could catch the rising temperature in a room, as he did in Dog Day Afternoon, but when he had a thing or two to say about social justice, in plodding pictures like Prince of the City (1981), Daniel (1983) and Q&A (1990), the movies suffered badly from tired blood. Network is pretty tired, too, but it's so frenetic that it gives off the sense that it's getting at something.

Why Network is still revered today is perhaps no mystery, given the way television news has increasingly abandoned its in-depth political stories for personality pieces that connect to the average viewer. But Network never respects how the medium differs from magazines, newspapers and even radio, and so never gives us a sense of the changing dynamics. Chayefsky and Lumet are content in Network to play the role of old-time preachers on the stump, trying to tell us how the sanctity of the word is being threatened by the secular image. Their protest is not only profoundly sentimental and antiquated. It's also old news.

-- September 12/15

Out of This World: John Coltrane in Seattle (1965)


John Coltrane and Elvin Jones (drums) at the Half Note in 1965.

For twenty years, between 1947 and 1967, John Coltrane (whose birthday was yesterday) played saxophone engrossed with the desire to reach a place unheard, never before felt, and spiritually solvent. He began his career with an equal longing to be "consumed" by the spirit of Charlie Parker; in actuality, he was consumed (as Parker himself had been) by drugs and alcohol. One day, though, Coltrane had a spiritual awakening through vegetarianism and eastern religion which led him on a quest "to make others happy through music." Who knew then that this journey would take him to the furthest edges of what many would call listenable music? And it would leave some people less than pleased, let alone happy.

His career had begun with Dizzy Gillespie's band in the late Forties, but when Coltrane hooked up with Miles Davis in the mid-Fifties, he began to hone a virtuosity in improvisation. They were an audacious contrast in styles. Where Davis was a master of spareness, Coltrane could never seem to cram enough notes into a bar of music. His heroin problems got him kicked out of Davis's group, but he then began a short stint with pianist Thelonious Monk before kicking his habit permanently. Coltrane had found a mentor in Monk, who taught him methods of creating complex harmonic structures within his sax solos, which in time would become long, difficult excursions into abstract blues. Coltrane could take a pop standard such as Rodgers and Hammerstein's "My Favourite Things" in 1960 and enlarge the melody on the soprano saxophone by building an extended solo over the basic chords of the theme. Within a year, though, in a series of concerts at the Village Vanguard, Coltrane was using melody as merely a starting place for epic solos that built in intensity like a chainsaw cutting through a tornado. Sometimes they would last for close to an hour. "Chasin' the Trane," for instance, featured over eighty choruses built upon a twelve-bar blues. That intensity would reach a spiritual epiphany in 1964 with the luxuriant devotional suite A Love Supreme.


Like Blind Willie Johnson years earlier, Coltrane was possessed by a higher power and a purpose that was expressed through a fervent desire to remake himself through his art. "My music is the spiritual expression of what I am – my faith, my knowledge, my being," Coltrane remarked. But where many would take the path of sanctimony, Coltrane sought out dissonance rather than harmony. It reached its zenith in Seattle in 1965. That year, he was recording a phenomenal amount of music, each piece becoming more abstract than the last. One night, he had a dream in which he and the band had played a show without reference to chords or chordal sequences. In his dream, he discovered that he was seeking, in the words of jazz critic Keith Raether in Earshot Jazz, "dialogues of tonal and atonal sections similar to parallel octave passages found in African vocal music." The sessions Coltrane recorded after his nocturnal vision were the kind that would give other people nightmares. "We did two takes and both had that kind of thing in them that makes people scream," saxophonist Marion Brown explained. During the concert in Seattle, Coltrane decided to take his group, which also included Pharoah Sanders on sax, Elvin Jones on drums, and Jimmy Garrison on bass, through the most atonal abstractions he'd ever played. The purpose? "I don't think I'll know what's missing from my playing until I find it," Coltrane told a journalist from Melody Maker before the show.

One of the tracks, "Evolution," was a thirty-six-minute excursion into an extravagant sheet of combative chords that filled close to two sides of an LP. The Harold Arlen standard "Out of This World" became literally just that. It was so dense in its atonality that the recording engineer, Jan Kurtis, who knew Coltrane's original recording of the song, didn't recognize it until well into the piece. In what became an understatement of perception, Kurtis told Keith Raether: "Coltrane seemed to be thinking about a lot of things. There must have been an enormous amount of music going on inside him." The enormity of that music was overwhelming for most listeners to consume. By Seattle, Coltrane had dispensed with conventional melodies in his own search for what Blind Willie Johnson had been looking for in the gospel blues: the soul of a man. For both artists, the soul of a man was not a harmonious place. So the octane that Coltrane provided was pure turbulence, a streaming of notes too primal to contain, what you might call a speaking in tongues from a spiritual hermitage. For many listeners to the double-LP John Coltrane in Seattle, however, it became much less than that. What they heard were tongues that were garbled – pure noise – no more listenable than pure cacophony. Music from the other side of the fence.

At the turn of the 20th Century, a number of classical composers were growing weary of tonality and wishing to dispense with the adherence to a single key as the one acceptable foundation for a musical composition. In response, Arnold Schoenberg developed a twelve-tone system in which all twelve notes in the chromatic scale were performed before the initial note was played again. Anton Webern offered his own interpretation of twelve-tone serialism by using it to create an abstract sparseness in his pieces. Igor Stravinsky became inspired to take music back to a pre-Romantic era. From there, he could explore form rather than content, ultimately leading him into neoclassicism and interpreting the music of the past. Composer Edgard Varèse wished to clear the decks altogether by reinventing Western music at its core. He explored it as a scientific construct of sounds, creating a whole new world of music yet unheard.

McCoy Tyner, Archie Shepp, John Coltrane and producer Bob Thiele in 1964. (Photo: Chuck Stewart)

As for American jazz music, many of its practitioners already considered it free, since it was built on improvisation, soloing, and liberated voices calling out to one another. By the end of the Fifties, though, there were some who claimed it wasn't free enough. "Free jazz" became a radical deviation from the form that challenged the conventional chord progressions and time signatures at its foundation. It erupted out of the untimely death of Charlie Parker, which opened the door for innovators to rethink the legacy they had inherited. Classically trained pianist Cecil Taylor, for instance, decided to bring the ideas of Schoenberg and Webern into the land of Bud Powell and Horace Silver. In 1957, he appeared at the Newport Jazz Festival with an improvisational style that, as he put it at the time, "imitate[d] on the piano the leaps in space a dancer makes." Those leaps began in a lonely loft where by night, after returning from his dull day job of delivering sandwiches, he would hold "imaginary concerts" of his music, envisioning an audience that could one day hear and appreciate it.

Since classical music and jazz have comparably smaller audiences than pop, Coltrane raised the bar on minority music. "Pop music provides immediate emotional gratifications [rather] than the subtler and deeper and more lasting pleasures of jazz," film critic Pauline Kael wrote in her ambivalent praise of Lady Sings the Blues (1972), the movie biography of singer Billie Holiday. "Pop drives jazz back underground." Yet it's in that underground where a laboratory of experimentation can flourish (as it did for Coltrane and his group). Since the huge dollars and the mass audience never drive that world, lone dreamers (such as Cecil Taylor) can endlessly perform their own imaginary concerts. But these distinct kinds of propulsive forces are only possible in that underground where jazz once resided, despite Miles Davis doing his part in building significant bridges between pop and abstract jazz in his work following In a Silent Way. The stage that Elvis Presley and The Beatles built, though, as big and bold as it was, could never break totally free from the huge business interests that ultimately needed to make money from their art. John Coltrane in Seattle went beyond considerations of ever making money, since the audience for that record could barely find a road map through its many detours – and those detours haven't been taken by anyone since. Whatever secrets John Coltrane discovered in that spiritual quest of playing those dramatic extended solos, he took them to the grave with him. And there ain't nobody who is ever going to bring them back.

-- September 24/15


                                                                   2016

Schizoid: John Gregory Dunne's Monster (1997)


When studying film in college during the Seventies, I read critical books that were about the themes, issues and craft of movie-making. But by the Eighties and Nineties, most of the books I encountered – good ones, I'll admit – like Final Cut (about the Heaven's Gate disaster), Outrageous Conduct (about the Twilight Zone tragedy) and The Devil's Candy (about The Bonfire of the Vanities fiasco) were more about the failure of the American movie industry. (Ironically, most made for more compelling drama than the films that inspired them.) Now it would be tempting to add John Gregory Dunne's Monster (Random House, 1997) to this ignominious list, but for the fact that it is not much more interesting than the movie that spawned it. Monster initially reads as an absorbing and painfully comic tale that pits the creative writer, with his unreasonable demands, against the corporate system, one that produces inhabitants who wear pinstripe suits with suspenders, slick their hair back with grease and have Perrier breath. But the book loses its nerve partway through and turns pretty schizoid. If the first half of Monster suggests how Hollywood's corporate brass turn writers into cookie cutters, by the end, Dunne is practically providing trays to put the cookies on.

Monster deals with the eight years it took screenwriters Dunne and his wife Joan Didion to come up with what became John Avnet's film Up Close and Personal (1996), in which Robert Redford plays Warren Justice, a veteran television newsman who helps Tally Atwater (Michelle Pfeiffer), a star on the rise, find fame while they fall in love. In outline, the book is a field diary of how they tried to negotiate different projects, survive serious illness and witness the death of friends over the time it took to finish writing the film. But Monster also examines how this very formulaic and sentimental movie came out of a very different and contentious story. In 1988, Dunne and Didion were approached to write a screenplay based on Alanna Nash's book Golden Girl, a biography of the late network correspondent and anchorwoman Jessica Savitch. Savitch wasn't just another blond beauty with a talent for the tube. According to Dunne, she was "a small-town girl with more ambition than brains, an overactive libido, a sexual ambivalence, a tenuous hold on the truth, [and] a taste for controlled substances." She would die in 1983, at the age of 35, in a freak car accident. Disney, the studio least likely to show interest in such a gritty story, nibbled at the project first. Through an insane number of rewrites, Dunne and Didion dramatically changed the content, the lives of the characters and the plot trajectory until what remained was a generic love story. It not only bore no resemblance to the life of Jessica Savitch but looked instead like a template of their botched screenplay for the Streisand/Kristofferson version of A Star is Born (1976). What makes Monster so bizarre is their acceptance of the travesty they had to make of their work. It's like watching people gleefully turn into the pods from Invasion of the Body Snatchers.

Michelle Pfeiffer and Robert Redford in Up Close and Personal.

Considering those faults, however, Monster isn't a total loss. It takes its title from an amusing anecdote about a Disney executive who, while bringing out an imaginary monster from an imaginary cage, tells a pugnacious screenwriter that the monster is "our money." But there are many other monsters in the book – and they aren't imaginary. Dunne is at his most pungent when he skewers action director Renny Harlin ("I asked him how he envisioned our first rewrite of Gale Force. 'First act, better whammies,' he said. 'Second act, whammies mount up. Third act, all whammies.'"), or producer Scott Rudin ("Overweight, overbearing, with a black beard and a huge booming laugh, the bully boy's bully boy...the passenger who takes the airphone and keeps it for the length of the flight, hiding it when he's not using it.") Dunne loses his edge, though, when he appraises, with a tad too much sentimentality, Don (Top Gun) Simpson, who fired him and Didion from a UFO action film he was developing before his death from substance abuse. ("If Simpson didn't like something, he hit you right between the eyes with it; he did not send 11 pages of notes to an agent with orders not to show them to the writer.")

Dunne makes clear that if it hadn't been for a 1988 screenwriters' strike that cancelled all other projects, they would never have committed to Golden Girl. And if it hadn't been for the heart surgery he needed, along with the medical insurance, they probably would have abandoned the project when it started to get out of hand. Nonetheless, he still seems too pleased with the final result. But for Michelle Pfeiffer's intuitively crafted performance, Up Close and Personal seems to betray the very nuanced elements that attracted them to the Savitch story in the first place. What John Gregory Dunne comes face to face with in his book is a real monster – the Hollywood corporate machinery and the creatures it creates. By the end, though, he deludes himself into thinking that it was all just a mirage. If truth be known, I think the monster swallowed him up.

-- April 24/16

Careers and Corporatism: Arthur Kent's Risk and Redemption (1996)



Like many television news viewers, I didn't truly become aware of journalist Arthur Kent until 1991, when he was ducking Saddam Hussein's explosive little presents over Dhahran during the first Gulf War. Decked out in his leather jacket, his sweeping dark hair blowing in the night air, he emerged with the sexual panache of a movie star. He was America's own Mel Gibson from Peter Weir's The Year of Living Dangerously, caught in the hail of rocket fire and barely letting it ruffle his locks. This striking image not only won him the moniker "Scud Stud" from female fans rooting for him all over North America, it also won him no end of grief when he refused to be NBC's answer to Geraldo Rivera.

The great story behind Arthur Kent's memoir, Risk and Redemption: Surviving the Network News Wars (Viking, 1996), is how a reputable correspondent, who covered some of the biggest news events of his era, was forced to take the NBC television network to court in a $25 million defamation and fraud suit because the corporate climate of turning hard news into celebrity worship rendered him unable to do his job. But Arthur Kent was also caught in a profoundly ironic trap, because it was becoming the "Scud Stud" that actually brought him to international prominence. Risk and Redemption is Kent's attempt to separate what makes a journalist from what makes a luminary. And although you come away from the book cheering Kent's integrity, intelligence and victory, there is still something romantically self-serving about it. He comes across as someone beyond the temptation of stardom – even though television news, the profession he's chosen, invites it. The incongruity of how the image of the "Scud Stud" (which Kent himself created) shaped a network's perception of him as a journalist, and perhaps implicated him in their corporate plans, doesn't envelop the book as much as I hoped it would.

In the first half of Risk and Redemption, Kent lists his accomplishments and defends his reasons for becoming a newsman. Coming from a line of journalists, including a father who was boldly ethical, Kent demonstrates how he has lived out his family legacy. This includes covering the 1979 war in Afghanistan, the violence of Tiananmen Square, the death of Communism in Berlin, and, of course, Stormin' Norman Schwarzkopf and the adventures of Desert Storm. The stories are adventurous and read like a fairy tale filled with harrowing episodes, but they don't accumulate the resonant power of great prose. Since Kent's greatest strength is in bringing the presence and style of a star to hard news, his writing lacks the body, or the eloquence, of his television pieces. Despite this flaw, Risk and Redemption gets more involving once Kent explores the shifting values of television news and how they affected his case. He illustrates very clearly that when General Electric bought NBC in the mid-Eighties, it started dismantling the foreign desks and dumping seasoned journalists under the guise of cutting costs. But the real reason, according to Kent, was to bring the news departments more in line with the entertainment world in Burbank, which controlled the air time of their flagship news show Dateline. The emphasis of news at NBC began to shift towards a journalism with a prurient fixation on celebrities.


Although the 1991 Gulf War had made him a celebrity, at a time when NBC wanted to build up a bevy of them for their news programs, Kent refused to relent. (He reports that NBC, as far back as 1986, considered having Bill Cosby and Don Johnson anchor some news specials because The Cosby Show and Miami Vice were high in the ratings.) Paradoxically, it was Kent's refusal to accept an assignment to the former Yugoslavia in 1992 (because of the network's poor preparation in sending its team to a war zone) that got him suspended and led to his defamation and fraud suit. Kent eventually won his suit against NBC in 1994, for an undisclosed amount, and reading his account of the legal battle, you can't help relishing his victory. Risk and Redemption outlines how financial cuts can be employed by companies to create an air of desperation that will bring employees into line with corporate policy, rather than serving the stated reason of "balancing the books."

A number of years back, John MacArthur, now president of Harper's Magazine, was asked why he thought most journalists went along with government censorship during Desert Storm. He didn't blame a right-wing conspiracy, or offer up one of Noam Chomsky's pat pronouncements. He just said one word: careerism. MacArthur explained that many current affairs journalists aren't interested in risking their careers on certain news items. A climate of conformity, of trying to stay on the winning side of popular trends, and the desire to be celebrities, he said, make it possible for government censorship to happen. (One can clearly see today how the incessant coverage of Donald Trump's current crusade remains true to that view. TV journalists are ever more patently desperate to be on the hot side of a story rather than delving into the larger and broader issues of a Presidential campaign.)

Thankfully, Kent wasn't one of them. He stood up against government and network censorship. So, despite its contradictions and shortcomings, Risk and Redemption lays out with an impassioned intelligence the continued skirmish between the marketing of information and the interpretation of it. The marketing side continues to win at the moment, but Kent's book – and his past victory over NBC – perhaps shows us that the war may not be over yet.

-- May 14/16

The Masculine Ideal


Paul Newman and Sally Field in Absence of Malice (1981).

During the Christmas season back in 1981, while visiting some friends of mine in rural Caledon, Ontario, I suggested going to neighbouring Orangeville to see a movie. The theatre in town was a duplex, with Sydney Pollack's legal drama Absence of Malice in one house and Burt Reynolds' adaptation of William Diehl's pulp potboiler Sharky's Machine in the other. But rather than making a quick choice on arrival, we ended up hitting a great divide. A huge argument broke out that almost sent us back home. The women were keen on Absence of Malice, in which Paul Newman played a Miami liquor wholesaler, the son of a deceased mobster, who suddenly becomes a suspect in a front-page story about the possible murder of a union official. The men were primed and determined to see Burt Reynolds as an Atlanta narcotics sergeant working a drug case who gets demoted when, in pursuit of a drug dealer, he accidentally shoots a bus driver. We ultimately decided to split up and see our respective films before we ended up seeing nothing. (Full disclosure: I went with the women.)

The quality of both pictures aside, it was curious to see how these men and women – united in life – could be so quickly divided in their movie tastes. It wasn't just the simple notion often implied that men like action pictures and women prefer drama; a line was being drawn on the notion of what a man was. In Absence of Malice, Newman's Michael Gallagher is a reserved and private man who gets drawn into public scrutiny by specious association rather than the quality of his character. He takes us pretty far inside the complications of a basically uncomplicated man so that we can enjoy his desire to clear his name and avenge a dear female friend who gets caught in the crossfire of the investigation. There isn't a need for him to prove himself a man. When Gallagher turns on the investigators and the reporter (Sally Field) who launched the story, he's perfectly at home with his masculinity. His actions are motivated by a code of basic decency rather than a desire to be a hero. Paul Newman provides an appealing portrait of a man in conflict with the world where his prowess is never in jeopardy (as is often the case with Clint Eastwood). Newman shows us that Gallagher's need to act is an affirmation of integrity rather than a need to prove one's strength. Which is why, while sitting in the theatre with the women at the screening, I could feel a palpable synergy between Newman and the audience, as if his actions and his character opened up an ideal concept of who a man could be.

But, unlike Cary Grant, who became an idealized man with great style on the screen, Newman embraced the glamour of the movie star and infused it with the realism of the Method actor. (For all his virtues as an actor, Cary Grant was really a man of our imagination whereas Newman was a real man with imagination.) The other great Method actors, such as Marlon Brando, James Dean and Montgomery Clift, brought the hidden neurosis lurking in masculinity to the surface, but Paul Newman brought out masculine sanity instead. Whether he was playing a scoundrel like Hud, a randy hockey player in Slap Shot, or a martyred Christ in Cool Hand Luke, Paul Newman didn't act out the burden of being a man, but instead showed us how a sane man dealt with that burden. Absence of Malice was hardly a notable movie, but it offered Newman a sketch that he could fill out. It was an actor's showcase where movie-making fever played second fiddle to an exploration of what made a character tick. The audience that night, including the women I watched the film with, came out satisfied at having seen a good story and felt the heat of the sensuous possibilities Paul Newman opened up.

Rachel Ward and Burt Reynolds in Sharky's Machine (1981).

If Paul Newman offered up the hopeful possibilities in being a man, Burt Reynolds in Sharky's Machine (which I saw a month later) gave us the despair that comes with carrying masculinity like a ball and chain. In Sharky's Machine, Reynolds isn't playing the righteous avenger who can prove he's the better man. He's crippled by his sensitivity, as if the world – and the venal criminals in it – won't let him be one. His cop protects a female witness (Rachel Ward), but the story isn't about his feelings for her. Instead we concentrate on what kind of man he has to be to protect her. (To prove his devotion, the movie lingers endlessly on Sharky getting tortured by the men who are after her.) Sharky's Machine is about as brooding and enervating as its star. Reynolds is a man of action in repose who's left contemplating a world that no longer has a place for a real guy. (I'd say about the only thing worse than the raging machismo of the action hero who needs to prove himself a man is the weltschmerz of macho impotence.) Sharky's Machine is hardly a surprising portrait of Burt Reynolds, the actor and star. While Reynolds has always been a gifted performer, he's never seemed to find a warm spot in his talent. With the handsome volatility of the young Brando, and the brashness of Sean Connery, Reynolds has mocked his own appeal and played it for cheap laughs. The darling of TV talk shows, he was the movie star who found a strange sense of dignity by playing the sellout who mocked the profession he sold out to. As a result, his films turned into self-conscious vanity projects which winked at the audience (Smokey and the Bandit, The Cannonball Run). Only in John Boorman's Deliverance did Reynolds risk shaking the image he helped create for himself. 
Playing a suburban man of nature, who constantly tests his masculinity against the wild, Reynolds is the leader of an ill-fated canoe trip that ultimately reveals how little his muscular stature adds up to when he becomes physically wounded and helpless. In Sharky's Machine, however, the body armour stays firmly in place even if Reynolds thinks that including Chet Baker singing "My Funny Valentine" is supposed to show us how Sharky soothes his savage breast.

Coming out of the movie house that night, you could say that both camps were satisfied with their choices. But the separate experiences didn't truly bring the group together as we gathered outside in the cold wind sweeping us to the car. From our discussion afterwards, it was clear that Paul Newman may have provided the women with an evening of tantalizing entertainment that invited them into a masculine world of feeling, where intimacy was also a sign of strength, while for the men, Burt Reynolds confirmed the opposite. They were still alone and misunderstood, stoic and forlorn, their male burden left unrelieved. Having spent the evening with the women, and not the men, left me with a whole different perspective. Through Newman, my sense of being a man may have been affirmed, but it was an isolating experience since I couldn't share it with the men I rode home with. And since I wasn't a woman, I couldn't identify with their feelings except vicariously, even though I had shared the movie with them. I was occupying a nowhere land where the communal experience of moviegoing had become, for me at least, a private island. That hermetic illumination didn't become a weight like Sharky's, though, or even a noble gesture like Gallagher's; it was merely an admission that there was still some distance to cover before that island became a shore.

-- October 6/16


                                                                   2017

All the Criticism That's Fit to Print: Revisiting The Rolling Stone Record Review and The Rolling Stone Record Review II


Led Zeppelin (courtesy of Getty Images)

In March 1969, writer John Mendelsohn was given the assignment for Rolling Stone to review the debut album of Led Zeppelin, a high-octane blues-rock outfit that had just emerged out of the ashes of The Yardbirds – a popular British Invasion band with a string of hits behind them including "Heart Full of Soul" and "For Your Love." Although there were no great expectations that this new ensemble would make history, Mendelsohn's words came to suggest that they might just become history. Chalking up their sound to formula, Mendelsohn remarked that Zeppelin "offers little that its twin, the Jeff Beck Group, didn't say as well or better..." Robert Plant's "howled vocals" were described as "prissy" on their cover of Joan Baez's "Babe I'm Gonna Leave You," and Mendelsohn went on to add that "[Plant] may be as foppish as Rod Stewart, but he's nowhere near so exciting." Jimmy Page gets complimented as an "extraordinarily proficient blues guitarist," but he's also singled out as "a very limited producer and a writer of weak, unimaginative songs." Criticizing them for wasting their talent on "unworthy material," Mendelsohn saw little on that first record to suggest that Led Zeppelin would be talked about fifty years later. "It would seem that, if they're there to fill the void created by the demise of Cream," he wrote, "they will have to find a producer (and editor) and some material worthy of their collective attention."

Now whether you believe that John Mendelsohn was right or wrong in his assessment of the record (and history has certainly proved him wrong on the band's longevity) is hardly the point. What his review of Led Zeppelin did, two years into the life of the counter-culture magazine, was make a claim for a subjective critical voice in reviewing rock music. While the visual arts, literature, film and theatre were already well into their established critical traditions, rock music had no such custom in the late sixties. Rolling Stone – and later Creem and Crawdaddy! – would launch over time a number of writers who would become part of a critical pantheon, including Lester Bangs, Jon Landau, Greil Marcus, Jim Miller, Ed Ward and John Mendelsohn. You'd be hard pressed in the music press today to see someone challenge Led Zeppelin as anything other than a classic rock album (even in the current Rolling Stone). But reading Mendelsohn's review years later, where it's included in the out-of-print paperback The Rolling Stone Record Review (Pocket Books, 1971), you're reminded of just how little remains of true critical writing in pop music criticism. What is more common today (especially in magazines like Mojo and Uncut) is consumer reporting, an objective style of criticism that eliminates the sensibility of the critic from the review. While the writing itself is technically clean and precise, as well as knowledgeable about the artist and their work, the critics aren't required to provide any subjective insights into the music except to reinforce what the artist intended on the record. In a reverse of D.H. Lawrence's famous quote from Studies in Classic American Literature, they never trust the tale, they trust the artist. The Rolling Stone Record Review collection – which gathers Rolling Stone pieces from 1967 to 1970 – chases the tale and holds the artist accountable for it. 
Often sloppy, sometimes blind to what makes the work a success or failure, the voices in this anthology are nonetheless trying to find their way into the grooves of every record they review.


Rolling Stone was co-founded in San Francisco in 1967 by Berkeley drop-out Jann Wenner and music critic Ralph J. Gleason, who both sought to provide compelling commentary on an emerging counter-culture. The Rolling Stone Record Review brings together early artifacts from that quest, and the results are definitely entertaining and sometimes quite bracing. You can smile with the benefit of hindsight reading Arthur Schmidt's review of The Songs of Leonard Cohen, in which he quotes a friend who, after reading Cohen's novel The Favorite Game, told him he didn't understand why people in the novel kept asking the protagonist to sing when the author clearly had a horrible voice. But often you get prescient perceptions – even from Jann Wenner himself. On The Beatles' White Album, Wenner sees clues that anticipate their later break-up when he spots how the music on the record actually splinters the familiar group identity. "[T]here is almost no attempt in this new set to be anything but what the Beatles actually are," he writes. "Four different people, each with songs and styles and abilities. They are no longer Sgt. Pepper's Lonely Hearts Club Band, and it is possible that they are no longer the Beatles." Today a Beatles record like Abbey Road might be greeted with lit incense sticks. What Ed Ward heard instead was overproduced sterility: "The Beatles create a sound that could not possibly exist outside a studio. Electronically altered voices go la la la in chorus, huge orchestras lay down lush textures, and the actual instruments played by the Beatles themselves are all but swallowed up in the process." As a result, according to Ward, George Harrison's "Something," which became his biggest hit as a Beatle, is "vapid" and "oozing like saccharine mashed potatoes." Ward is also prophetic when he claims that "Something" will "be covered by eight or ten artists in the next month and will rate with 'Yesterday' and 'Michelle' as one of the fab four's top money makers." 
In comparing Abbey Road to the group's single "Get Back/Don't Let Me Down," he concludes, "It is ironic that the Beatles should have put out a single with that advice, as well as an admonition not to let them down, followed that advice quite well with the follow up record [Let it Be], and then released an album like this." Would someone dare write those words about Abbey Road today? Greil Marcus opened his review of Bob Dylan's desultory Self Portrait with the famous question, "What is this shit?" and then proceeded to tell us what made it shit by writing an anti-auteurist piece that trusted the tale rather than the artist who told it.

While you can always get a visceral kick out of someone taking a mad stab at debunking an album's greatness, it's even more satisfying when a writer uses the album to reach for a larger vision. Many critics writing here felt the darkness of the Seventies beginning to overshadow the music. This may be why Jon Landau talks about the faint despair lurking in the beauty of The Byrds' The Notorious Byrd Brothers (1968). "They sense the paranoia that is all around us but they do not give up on their search for innocence and natural harmony," he writes. "The Notorious Byrd Brothers is simply the latest rendition of 'Turn, Turn, Turn.' It's just that this time the turning isn't so self-assured or so automatic." Greil Marcus sees a troubling decade looming at the end of the Sixties on The Rolling Stones' masterpiece, Let it Bleed (1969): "[I]n Let it Bleed we can find every role the Stones have played for us – swaggering studs, evil demons, harem keepers and fast life riders – what the Stones meant in the Sixties, what they know very well they've meant to us. But at the beginning and the end you'll find an opening into the Seventies – harder to take, and stronger wine. [Their songs] no longer reach for mastery over other people, but for an uncertain mastery over the more desperate situations the coming years are about to enforce." For some music listeners, this kind of writing kills the pleasure they seek from an album. But I would argue that these critics actually deepen the satisfaction of the music because they are not only finding the means to articulate what it feels like to both enjoy and dislike a record, but they're also trying to show us that art doesn't emerge out of a vacuum. Langdon Winner reminds us of that fact in his review of The Jackson Five's debut album: "The Jackson Five stand in the tradition of super young rock singers that goes back to Frankie Lymon and the Teenagers and, more recently, Little Stevie Wonder. 
Ever since the day that Frankie Lymon lied about his age to producer George Goldner and earned the right to sing lead on 'Why Do Fools Fall in Love,' there has been a prominent place in rock and roll for the very young, exceptional voice."


Some columns in The Rolling Stone Record Review feature writers whose opinions appear to be cured in chemical refreshment, but there are also some who try wild experiments like J.R. Young, who takes on Crosby, Stills, Nash & Young's Déjà vu by turning his review into a fictional short story. Musicians like Al Kooper try their hand at criticism and offer opinions on key records like The Band's Music from Big Pink (1968), where Kooper (who had played with various members of the group) hears The Beach Boys, Hank Williams, The Swan Silvertones and The Coasters as "a varied bunch of influences." But he also raises a criterion more commonly invoked today: authenticity. "When you hear a dishonest record you feel you've been insulted or turned off in comparison. It's the difference between 'Dock of the Bay' and 'This Guy's In Love With You.' Both are excellent compositions, and both were number one. But you believe Otis [Redding] while you sort of question Herb Alpert. You can believe every line in this album, and if you choose to, it can only elevate your listening pleasure immeasurably." What does become glaring, however, as you read through the book is the dearth of female critics. Janet Maslin didn't really arrive until The Rolling Stone Record Review, Volume II (Pocket Books, 1974), where she wrote about Linda Ronstadt's debut album on Capitol Records in 1972. "All of Linda's records have been solid and enjoyable, but no single one of them prepares you for the knockout she can be in live performance," Maslin explains. "Her full talent brings together so many powerful elements that it seems to be almost impossible for any recording to capture her as a whole." What Maslin seemed to be anticipating was producer Peter Asher, who would bring together Ronstadt's full talents on her 1974 Heart Like a Wheel album.

As satisfying a read as The Rolling Stone Record Review is, Volume II has more depth and the writing is much sharper. A period that included the deaths of Brian Jones, Jimi Hendrix, Jim Morrison and Janis Joplin brought out the best in its critics. Lester Bangs talked about the loss of Joplin with an eye towards the impact it was having on both the culture and the music. "I don't know which is worse, the cannibalistic impulse of the public and the pop music industry which mutually encourages artists in disintegration because that's the flash and we really do think that someone else can live out our lives and deaths for us, or the sickly, not to say sickening, spate of 'Eulogies' and 'Memorials' and 'Remembrances' which sweep the rock press as soon as another star done gone. But perhaps they are the same thing," he wrote in his review of the posthumous Joplin in Concert (1972). With time to reflect on the golden age of Motown singles in the sixties, Vince Aletti writes with a passionate eloquence about their value. Describing the tunes as "jukebox spirituals, scripture to dance to, expressions of the emotional life black music has always been most concerned with," he goes on to define Motown's creator, Berry Gordy, as the perfect progenitor of the label. "Berry Gordy knew the psychology of the car radio and the transistor and the jukebox; his records slipped through these media and into people's heads and daily lives so quickly they were almost subliminal," he explains. "The Motown Sound had an insistence that went right past being irresistible." Musician Lenny Kaye, who would later be a valued member of the Patti Smith Group, turns out to be a pretty sharp critic when defining Jimi Hendrix's significance in his review of the 1971 posthumous release, The Cry of Love. "Hendrix was a master of special effects, a guitarist who used electricity in a way that was never as obvious as mere volume," he writes. 
"He took his bag of toys – the fuzz-tone, the wah-wah pedal, the stack of Marshalls – and used them as a series of stepping-stones to create wave upon wave of intense energy, proper settings for a scene of wrath and somehow healing destruction. It was rock and roll that was both quite in tune with and yet far ahead of its time, and in a way, I'm not sure we've ever really fully caught up."


The Rolling Stone Record Review, Volume II doesn't function as a consumer guide to pop music the way so many review pages in magazines do today. Often a writer here will nail a famous record by listening to what's on it rather than deferring to the album's intentions. For instance, Jack Shadolan sums up the calculated sanctimony of the musical Jesus Christ Superstar perfectly, writing that "in this re-working of a great agony, the agony seems to be misplaced . . . an ideal gift for Mom & Dad so they shouldn't call what you listen to degenerate anymore." Mike Saunders and Melissa Mills exalt the guitar work of the lesser-known Robin Trower in Procol Harum while taking a shot at Eric Clapton: "Trower [has] been among the best guitarists extant in rock; using a sparse yet incredibly intense technique, he had done on Shine on Brightly and A Salty Dog what Eric Clapton endlessly bullshitted about but never did – play with the emotional intensity of the blues in a rock framework." Both Record Review volumes also remind you that the pop music of the time was often a hybrid of genres. Ralph J. Gleason, for example, could see the fusion of jazz and funk in both Miles Davis's On the Corner and Santana's Caravanserai in his 1972 review. Long before he'd write the finest biographies of Elvis Presley (Last Train to Memphis, Careless Love), Peter Guralnick, in his 1971 review of Elvis Country, brought up the unsolvable contradictions in an artist who had a multitude of them. "You wonder sometimes just who is controlling Elvis's career," he ponders during what would be Presley's final decade of recording and performing. "In the middle of a typical movie soundtrack album, Spinout, you come across not only a raunchy 'Down in the Alley' but the interpretation by which Bob Dylan would most like to be known, 'Tomorrow is a Long Time.' 
In a bland follow-up to his dynamic Memphis album, Back in Memphis, you find a brilliant and impassioned treatment of the Percy Mayfield blues, 'Stranger in My Own Home Town.' And now at a time when it seemed as if his career must sink beneath the weight of saccharine ballads and those sad imitations of his own imitators, Elvis Presley has come out with a record which gives us some of the very finest and most affecting music since he first recorded for Sun almost 17 years ago." What comes across strongest in this piece is not whether you or I would embrace Elvis Country with the same enthusiasm as Guralnick does; it's how his sensibility as a critic opens up the enigma of Elvis while reviewing this album. I have distinct memories of many of the writers whose reviews are included in both Rolling Stone volumes. But I can barely call up the names of writers I read last week in various magazines like Mojo because the person listening to the record has disappeared from the review.

In the years that followed, Rolling Stone opted for the more consumer-friendly The Rolling Stone Record Guide, edited by Dave Marsh and Nathan Brackett, which streamlines the artists and their albums in a more agreeable way so that Led Zeppelin gets a proper place in the classic rock pantheon. You have to search used bookstores high and low to find the original two volumes of reviews. The Rolling Stone Record Review I & II bring back a time when the music critic wasn't beholden to record companies, or editors, who were afraid of opinions that might affect album sales. What marketing folks did very successfully over the years was seduce writers by offering them access to their artists (or depriving them of that access if they weren't happy with a publication's review). This style of behaviour played not only to the writers' egos, but also to their editors, who needed star quality in order to boost readership and keep their bosses happy. That dubious relationship set the model for reviewing that doesn't read much different from ad copy. In this climate, it's doubtful we'll ever see a review as funny – and as percipient – as Paul Gambaccini's very brief one on The Archies' Greatest Hits. "Contained within these grooves," he writes in The Rolling Stone Record Review, Volume II, "are twelve convincing arguments against the capitalist system."

-- February 27/17

Inheritance: Peter Reich's A Book of Dreams (1973)


(l. to r.) Eva Reich, Jerome Siskind, Peter Reich, Wilhelm Reich, Ilse Ollendorff (Peter's mother) in Maine.

"I am in Lewisburg [Penitentiary]. I am calm, certain in my thoughts, and doing mathematics most of the time. I am kind of 'above things,' fully aware of what is up. Do not worry too much about me, though anything might happen. I know, Pete, that you are strong and decent. At first I thought that you should not visit me here. I do not know. With the world in turmoil I now feel that a boy your age should experience what is coming his way – fully digest it without getting a 'belly ache,' so to speak, nor getting off the right track of truth, fact, honesty, fair play, and being above board – never a sneak ..."
– Letter from Wilhelm Reich to his son, Peter, aged 13, from prison, March 19, 1957.

When my mother passed away recently from cancer, I fulfilled a promise I made to eulogize her at the memorial. For the first time, however, I decided not to write out the tribute as I had for other friends and relatives I'd lost in the past. It might seem a strange choice: we choose our friends over time and throughout our lives, but we begin in the womb of our mothers, so you would think a eulogy for her would need the care of considered thoughts first consigned to paper. But as I was growing up, I came to know a formidable and peripatetic woman who was as daunting as she was fascinating. For one thing, Sheila Courrier-Vezeau had done many things by the time I was 10. Besides being a striking model in her late teens, she would soon after get her pilot's license. To this day, I still have a distinct memory of all the cloud formations she taught me when we took to the sky. If she longed for the stars, she also dove into the depths of the water when she learned to scuba dive. I would often go up to Tobermory, Ontario, on the Great Lakes for summer camping trips, trekking into the woods while she sought out small shipwrecks.

During my early childhood, she wasn't home much since she had to work full time. My grandparents came to raise me in those years. At first, she assisted the famous futurist, Frank Ogden, and helped him with the kind of prescient inventions that predated the computer age. (I wish I had a photo of her – decked in black leather – while I rode on the back of her motorcycle bolting off to Ogden's farm.... but, then again, maybe I'm glad I don't.) Before long, in the early Sixties, she ended up working with anchor Harvey Kirck at CTV News and helping him organize his newsroom. In the end, though, it was real estate that grabbed her passion. From 1972, until she was diagnosed with cancer a year ago, she worked full-time selling houses. It was remarkable to me that she could work so long and so hard right into her eighties. She never seemed to tire.


Though we didn't have a very consistent domestic relationship as I was growing up, her influence was still remarkably felt. It had an indelible impact that I'm still trying to figure out. My own doggedness and fierce independence were undoubtedly an inheritance from her. When I was diagnosed with an incurable cancer, mere months before she would herself be struck, she came to Toronto on a couple of occasions to be with me. She first saw me through a crucial 8-hour spinal operation that saved my life, then took care of me while I was recovering and about to begin chemotherapy. We were encountering in the worst moments what being mother and son meant to both of us, as it appeared then that I wouldn't be in this sphere much longer. Having never had a chance to delve into her maternal side, and with me coming to terms with the fragility of time, my mother drew closer to that role and what it truly meant to her. I was also able to become a son. It was a gift beyond words that those times took place before more tragedy struck. While we talked during those visits, I made a promise to speak of her at whatever memorial was created. But due to the complicated history we had, where I made choices in my life that diverged radically from what she might have had in mind, and which ultimately made us ships that passed periodically at a distance, I needed to feel my way back to my origins with her. For that reason, I decided to speak impromptu at the tribute, as if I were making my way down river streams to the source of the water that had brought me here rather than speaking from a carefully planned arrival.


When re-reading Peter Reich's memoir A Book of Dreams (Harper & Row, 1973) about his famous father, the psychoanalyst Wilhelm Reich, I realized that his book traced similarly tangled and ambiguous roads back to a relationship with an influential and intimidating parent. Unlike some chronicles written by the children of prominent figures, A Book of Dreams isn't a Daddy Dearest trash job, or a revisionist history that robs its subject of his distinctive genius. Reich instead writes a story of his childhood, flashing back and forth through a series of personal dreams he has as an adult, that draws a captivating portrait of his loving and difficult relationship with Wilhelm Reich. At the time of his birth in 1944, Peter entered into a family embroiled in controversy. His father was an Austrian psychoanalyst who had originally studied under Freud. But he later broke from the strictly formal talk therapy of his mentor, and even from some of Freud's ideas on neurosis. In particular, after graduating from medicine at the University of Vienna in 1922, Wilhelm Reich had a desire to reconcile Marxism and psychoanalysis by delving into something he termed character analysis – where he sought the source of repression in the body itself through muscular armour used by individuals to protect themselves from emotions and memories they didn't wish to face. In his book, Character Analysis (1933), Reich wrote that character structure was the result of social processes, whereas muscular armour was a defence that contained the history of the patient's traumas. Dissolving that armour would bring back the memory of the childhood repression that had caused the blockage in the first place. He would eventually come to treat it through body therapy – an aggressive form of deep massage – to free up the feelings and then arrive at talk therapy. 
Reich first tried to introduce his ideas into the society created by the Russian Revolution and argued that neurosis had its roots in sexual attitudes formed by socioeconomic conditions; he hoped the Revolution might provide the ground for a new society that would break those psychic chains of repression. But he quickly became disillusioned when he saw signs of the authoritarian personality in Leninist and Stalinist theory and action. Once, addressing the National Congress of the Central Committee of the Communist Party, he tried in vain to explain the Oedipal conflict as he was shouted down from the stage. Within a few short years, Reich lamented, millions were going off to war to die for the Motherland.

After being denounced by the Stalinists, Reich was chased from Europe by the Nazis, whom he examined in 1933 in his landmark book, The Mass Psychology of Fascism. By the time he wrote The Function of the Orgasm in 1939, in which he explored the direct relationship between neuroses and our inability to surrender completely to the full bio-energetic pleasure of orgasm, he had come to be perceived as a radical in the psychoanalytic community and was denounced by both psychotherapists and scientists. He emigrated to America in 1939, eventually settling in Maine, where he was regarded as insane when he claimed to have discovered the source of life in the Orgone, a biological energy that flowed through the body and existed in the universe as cosmic energy. Those claims began to draw the attention of the FDA (Food and Drug Administration). When he began building boxes called Orgone Accumulators to enhance Orgone energy in his patients' bodies, conducting experiments in curing cancer, and constructing cloud-busters to break up deadly Orgone energy in the sky and bring rain to areas of drought – as well as chasing UFOs – he became the FDA's target. In time, Reich grew more isolated and paranoid about the attacks, and later books like Listen, Little Man (1948) and The Murder of Christ (1953), though filled with sharp insight into the human condition, had tinges of megalomania and narcissism. (In The Murder of Christ, you could easily perceive the ways he felt persecuted by those suffering from what he called "the emotional plague," which he also blamed for the Crucifixion of the Prophet.) In the late Forties, when disparaging articles about Reich and his work appeared in The New Republic and Harper's, the FDA closed in with an injunction that charged him with a "fraud of the first magnitude." During the early Fifties, the agency first had him destroy his accumulators and then required him to burn his research papers and books. 
In Arizona in 1956, Reich was charged with violating that injunction when he shipped an accumulator across state lines and was eventually sentenced to two years at Lewisburg, where he died of a heart attack in November 1957, at the age of 60. He was just days away from his application for parole.

Peter Reich

A Book of Dreams begins in France in 1963 when Peter wakes up from a powerful dream and finds himself in a Paris hospital, having suffered a dislocation of his shoulder from an accident. What follows is a number of childhood memories from the period when his father was going through his troubles with the FDA. One of the first recollections is of his father making him bury a glow-in-the-dark yo-yo because "the glow stuff was deadly just like fluorescent light." The 13-year-old child was losing a favourite toy, but to his concerned scientist parent it was poison:

Glow-in-the-dark light was bad energy and it didn't mix with Orgone Energy, which was good energy. Daddy was trying to kill the bad energy in the atmosphere. Bad energy came from flying saucers and bombs. The cloudbuster cleaned the atmosphere of the deadly orgone – we called it DOR – and fought the flying saucers. Only we called the flying saucers EAs. It was initials. The E stood for something and the A stood for something. Daddy told me what it was but I forgot. We had names for a lot of stuff. The EAs energy was like glow-in-the-dark energy and it made us sick.

Despite the explanation, as Peter writes his memory from the mind of a young adolescent, he doesn't fully understand the concern and his father seems to lose sight of his son's sense of loss:
"I have told you, Peeps, that the glow-in-the-dark paint has a negative charge. It is like fluorescent light...Rather than giving off energy, it draws it away, absorbs it from living things."
"How come the other kids don't get sick then?" 
"But they are, Pete. They are tightly armoured against feeling the deep effects of DOR sickness. They fight it off with toughness and dirty jokes but the sickness still eats them away inside. Their faces become tight and their jaws get rigid because they no longer feel. When they get older, they die of cancer. Sometimes I see armouring in you and that is why I give you treatments."
"All their bellies are hard?"
"Yes. And their way of achieving things is a hard-bellied way. Do you remember the movie we saw with John Wayne [The Wings of Eagles], in which he falls and gets crippled?"
"The one where he plays a navy officer. Yeah. He fell down stairs at night and the doctors told him he would never walk again."
"Ja. You see, when he was sitting in bed, looking down to the end of his cast watching his toes, he resolved to walk again. And he said, over and over again, 'Gonna move that toe, gonna move that toe, gonna move that toe.' You see, that is the rigid way of overcoming things."
"But in the end, he walked, didn't he?" I asked.
"Yes, but you see, to overcome obstacles that way, by force, so-called will power, that is communist. It is the rigid, mechanistic way of accomplishing things. He had to make himself so tight and hard to force himself to walk again that he forgot how to love and be kind."

Peter Reich would elaborate on those movie connections with his father years later when talking to author John Shaplin (Adventures in the Orgasmatron) in 2012. “He thought the movies were about him, and maybe they were," Peter explained. "You see, it’s hard to know where the circle starts. For example, High Noon, he was really into High Noon, and Bad Day at Black Rock. And this is why he wore a cowboy hat: he was Gary Cooper. And when the FDA came up to see him at Orgonon, he was just like Spencer Tracy. He’d say, ‘Listen, mister’ – he used that language. That was really part of his American persona, the movie person. He didn’t make a distinction between that and real life. He could put his hands on you, and he was a healer, he really was. And I think he felt he could heal the world, because his cloud-busters really seemed to work. So he really felt he was in control of everything. And he didn’t understand why other people didn’t see that. He shared the moral certainty that Gary Cooper had in High Noon and Spencer Tracy had in Bad Day at Black Rock, and that Sir Thomas More had in A Man for All Seasons." The day the FDA came to Orgonon was particularly traumatic for Peter, as he had to let the officers onto the property and take part in the destruction of the accumulators. “He was a nineteenth-century scientist, he wasn’t a twentieth-century scientist," he explained to Shaplin. "He didn’t practice science the way scientists do today. He was a nineteenth-century mind who came crashing into twentieth-century America. And boom! The FDA was hot to get a prosecution and he walked right into it. He was sending telegrams to the president of the United States, saying he was stopping hurricanes and claiming that the FDA were Communists. He walked right into it, with his eyes wide open.” As A Book of Dreams reveals, a boy's eyes are less wide open. The needs of a child, often overlooked by preoccupied parents, cloud the judgment of both.

A patient of a Reichian therapist in an Orgone Accumulator box

While much of A Book of Dreams is caught up with Peter Reich's coming to terms with the trauma of the persecution of his father, he also comes to face the psychological dilemma created in his adult life because of it. 

As an unhappy adolescent I followed the Playboy ethic assiduously. Big tits. Love 'em and leave 'em. Sex is a diversion, like sports. I fucked a lot. I masturbated a lot, not as a release of energy, but because fantasy was easier to come by than the dream world portrayed in movies. It ran deeper, too, like the lake which only got darker and darker, because being a real person and letting myself love a woman would have meant sharing all that fear. It would have meant sharing who I was, and I was too loyal for that. In my own way, I wanted his penis too.

Eventually, Peter Reich would find his feet and become a journalist, then a day-care teacher who lived in Vermont with his wife, Susan Gulick; for the past thirty years he has worked at the Boston University School of Medicine. The legacy of his father, ironically, would be picked up by others in psychoanalysis a mere decade after he died, with Fritz Perls creating Gestalt Therapy, former Reich student Alexander Lowen developing bioenergetic analysis (which also worked on the body, but through exercises rather than massage), and Arthur Janov devising primal therapy, which brought the Primal Scream into the world. But Reich's presence has also been felt (and often judged incorrectly) in popular culture. In 1971, the Serbian film director, Dušan Makavejev, made a scatterbrained collage, W.R.: Mysteries of the Organism, that combined a specious knowledge of Wilhelm Reich's work with contemporary looks at counterculture politics and at The Vow, a Stalinist propaganda film from 1946. Makavejev makes an appearance in A Book of Dreams, researching Reich with Peter and shooting scenes at their former compound, but you never get a sense of what his film will become. "Movies are like tangible dreams, colorful moving shadows," Makavejev tells Peter. "When you turn the light on, it disappears. This is a very powerful fact." That powerful fact never emerges in W.R.: Mysteries of the Organism. In its first season in 1990, NBC's Law & Order aired one of its most powerful episodes over two decades of broadcasts, "Indifference," which fictionalized the famous Joel Steinberg-Hedda Nussbaum abuse and murder case; but instead of portraying the Steinberg character as an attorney (as he was in real life), it made him a Reichian therapist who drugs his patients with cocaine and then fucks them as if that were a typical part of Reich's Orgone treatment. If Wilhelm Reich wasn't turned into a misplaced avatar of the Sixties, advocating fucking in the streets, he was pilloried in dramas as a dangerous quack.

Donald Sutherland and Kate Bush in "Cloudbusting"

He fared better in other cultural forms. In music, Patti Smith was the first to tackle Peter Reich's A Book of Dreams in her romantic epic dirge "Birdland" on Horses ("His father died and left him a little farm in New England / All the long black funeral cars left the scene / And the boy was just standing there alone / Looking at the shiny red tractor / Him and his daddy used to sit inside / And circle the blue fields and grease the night"). But most famous is Kate Bush's "Cloudbusting" on her 1985 album, Hounds of Love. In the video of the song she recreates the story of the buried yo-yo and combines it with a dramatization of the cloudbusting machine, playing Peter Reich as a pixie Peter Pan while Donald Sutherland as Wilhelm Reich is a young and dashing long-haired idealist, a Walt Whitman playing the Burt Lancaster role in The Rainmaker and making it pour while the FDA is in hot pursuit. Although the video simplifies the story by having the son heroically make the sun come out just as his father is being taken away, the song itself is a powerful evocation of a boy's lost innocence.

A Book of Dreams is about the complicated love that parents have for their children, which is made more difficult when the parent's persona makes him seem more than human when in truth he is only human. My mother was a career woman with a long and fascinating series of jobs and accomplishments before such a path became common, and that is indeed something to remember. But it came at a price for both of us. She never got to experience what it meant to be a mother raising her first-born son, and I know she felt some guilt and sadness around that fact. I never got to experience the maternal warmth her presence could have provided, warmth that might have helped me share and trust in the qualities I saw in her in my adult life. (Knowing all the cloud formations in the sky doesn't compensate for that.) Maybe Peter Reich never got to reconcile those complications in his life with his father. Fortunately, I did have those few last moments with my mother before she became sick. But A Book of Dreams doesn't set out to fix the past and what it failed to provide. It does, however, make the future more real – as well as the parents who made that future possible.

-- March 16/17

Another America: Remembering Dick Gregory 1932-2017



I arrived home this past week from a short holiday in Florida to the sad news that activist and comedian Dick Gregory had gone to spirit at the age of 84. Although in recent years Gregory existed on the periphery of mainstream culture, a barely remembered figure from an earlier era of Civil Rights reform and anti-war ferment, he was nevertheless still sought out by eager young videographers who'd visit his home as if on a pilgrimage, consulting a famous relic of another America for help in making sense of the current one. But what those quests often yielded on YouTube was a ranting hermit caught up in Truther campaigns who saw conspiracies in everything: "faked" moon landings, 9/11, Prince's death (which he believed was murder), the Rodney King beating tapes (the C.I.A. and the Australian "secret police" were behind the people who filmed it), Bill Cosby being framed for sexual assault because he was attempting to buy a major media company, and so on. Yet Dick Gregory's flights of fantasy, often painfully funny to watch (especially since his protégés didn't possess his knowledge and experience of history), did little to diminish his authenticity as a powerful advocate for justice. Whatever outlandish tales he told those budding militants, Gregory seemed to speak for the idea of a country, and of a citizenry he was once a prominent part of, that these willing apprentices saw rapidly vanishing before their eyes. The fact that he died as white supremacists and American Nazis marched freely and candidly in Charlottesville made that view even more painful to consider.

Given current events in Trump's America, it's damn near impossible to recognize that other country now. When Dick Gregory was first making history in the Sixties and Seventies, there was an authentic sense of a world at stake, ready to grab, and one to be transformed. In that time, Gregory set out to prove that people weren't so much shaped by history as they were those on the precipice of making history. Whether as a stand-up comedian whom Hugh Hefner first booked at the Playboy Club in 1961, or in his landmark appearance as the first African-American to sit on the couch to be interviewed by Jack Paar on The Tonight Show, and then even further to the marches with Martin Luther King Jr., which would culminate in 1968 with Gregory making himself a write-in candidate for President of the United States, Gregory kept making history and keeping another America visible. Students on campus at the time weren't looking for "safe spaces," either; they were going into danger zones to assert that America had promises they demanded be kept. Gregory was there, too, on those front lines, and with a comic zeal that carried with it a political bite that opened up America to its failings. But he also reminded everyone in his barbed routines that pessimism towards the political process and solipsism solved nothing. (When Gregory first spoke out against racial segregation, he turned it into a crackling joke. "Segregation is not all bad," he told one nightclub audience. "Have you ever heard of a collision where the people in the back of the bus got hurt?") He daringly titled his dynamic 1964 autobiography, co-written with Robert Lipsyte, Nigger, and dedicated it to his late mother by saying, "Dear momma, wherever you are, if ever you hear the word 'nigger' again, remember they are advertising my book."

Dick Gregory combined humour and activism as an incendiary form of absurdism where the audience could find the release of laughter while also recognizing the injustice and pain resting at the heart of the joke. He actually accomplished what many felt Lenny Bruce had been doing: producing satire that stripped American culture of its puritan hypocrisies. But Bruce was far less political in his goals and chose instead to shock audiences with his performer's fervour. (Gregory actually invited Bruce to march in the South, an invitation Bruce refused, saying in a routine that "I think people would think I was exploiting the issue for my own dues. Anyway, the marches are sloppy, people shoving back and forth, Al Hibbler and Ray Charles walking into people...") Gregory became especially active in the march on Selma in October 1963, where he spoke for two hours on a public platform on the importance of the voter registration drive. Along with his push for racial integration, his activism went beyond marches and into full political participation in the system. The year before he ran for President, he first took on Chicago mayor Richard Daley in a race for that office in 1967. When he eventually ran for president, he ended up garnering close to 49,000 votes for the Freedom and Peace Party, an accomplishment that landed him on the winning president's master list of political opponents. Dick Gregory invoked even more controversy, however, when he had currency printed with his own image on it – and not as a publicity stunt. Many of these bills ended up circulating, which brought Gregory close to being charged with a federal crime. But he joked that no one would arrest him since "everyone knows a black man will never be on a U.S. bill."


As we passed into Richard Nixon's Seventies, Dick Gregory still remained a force, though he sometimes joked that he wouldn't care to see anyone else as president but Nixon: "Now you see white people hurting that never known hurt before." But was his remark a glib retort to the failures of the previous decade? I doubt it. Perhaps he began to believe that his cutting idealism hadn't brought the country to a sense of fulfillment because larger forces were working against the nation. It's here that his embrace of conspiracy theories took hold, beginning with the biggest: Gregory became an outspoken critic of the Warren Commission's findings on the assassination of JFK by Lee Harvey Oswald. In 1975, he appeared along with assassination researcher Robert J. Groden on Geraldo Rivera's late night talk show, Goodnight America, to premiere the Abraham Zapruder film of the murder. It was the first time the public had ever seen the footage, and it carried the palpable shock of entering a time warp. Many that evening, once they had witnessed the shooting, came to believe that Kennedy's responses to being hit by the bullets finally proved there was more than one shooter. Although Gregory would insist to his dying day that Kennedy's assassination was no accident, even claiming to one contemporary amateur sleuth that there were foretelling clues hidden on an American dollar bill, his embrace of early Truther zealotry was not borne of the same narcissism that fuels those on the fringes today. Like many at the time, Gregory began to believe that there were invisible forces subjugating those without access to power.
Activists on the left like Noam Chomsky and Naomi Klein also like to conveniently divide the world into the powerful and the powerless as a means to define, in the simplest terms possible, the nature of injustice while evading the actual truth, which is often much more complex (and takes us into the psychological pathologies of nations as much as it does their social inequities). But Gregory's passionate embrace of some of those same partisan simplicities was likely fueled by a need to answer for an America passing into darkness. Under those circumstances, there may have been some perverse comfort in believing that unseen conspiratorial forces, the kind that steal your country, are responsible rather than tragic events that emerge from a commingling of circumstance, accident, and the inevitable appearance of loners who pass into our world and inexplicably change our history.

Maybe that was why, while pushing Washington to examine JFK's murder, he was also curious about the fringe terrorist movements that grew out of the Sixties like the Weathermen and, even later, the Symbionese Liberation Army (SLA), which kidnapped Patty Hearst. When I first met Dick Gregory in 1974, I was just beginning college as a film student in Media Arts at Sheridan. One of my colleagues, Richard Kerr, had learned that Gregory would be speaking in Toronto in the Yorkville district where Anne Kalsø had her Earth Shoe emporium. Since Richard was invited to videotape Gregory's appearance on a new Sony portable recorder, he asked me if I wanted to conduct the interview. Having never done an interview in my life, and being shy about the possibility, I was hesitant. But since I was quite familiar with Gregory, along with his comedy and activism, I decided to leap at the opportunity. The political air was then dominated by Nixon's resignation and his pardon over Watergate. While much of the specific content of our talk has now faded with memory, the intensity of its dynamic hasn't. As we talked, Gregory continually made Richard and me aware of the power of the camera and what it said about the control of the information we were recording. When I asked him his thoughts on Gerald Ford pardoning Richard Nixon, he replied, "When the sheriff breaks the law, what do you think the deputy is going to do?" When the SLA came up, he pointed to forces within the CIA as the true culprit and claimed that founder Donald DeFreeze (aka General Field Marshal Cinque) was a government informer. He also asserted that when DeFreeze was killed in the shootout with police in Los Angeles, his headless body was delivered for funeral proceedings to his mother, suggesting that the delivered corpse wasn't even DeFreeze. To confuse matters further, Gregory went on to tell me that it couldn't even be his real name.
"What black mother would have a son named, DeFreeze?" Yet despite his lapses in reality, my very first professional interview was still a baptism by fire where the conversation was less a standard Q & A than a dialogue requiring my full and equal participation in the views expressed. Any questions of objectivity were always being challenged by Gregory. Many years later, with those challenges in mind, Richard Kerr would include choice excerpts from that interview in his poignant and politically vibrant 1988 experimental film about democracy in America, The Last Days of Contrition.


As the Eighties and Nineties proceeded, Dick Gregory continued to march for justice. He stormed the Capitol Building with the women who led the National ERA March for Ratification and Extension on Women's Equality Day in 1978, and he spoke out with the Anti-Apartheid Movement. Having long embraced hunger strikes as part of his arsenal, he also protested the hostage crisis in Tehran, where he travelled in 1980 to help negotiate the hostages' release. A bout of cancer in 1999 led Gregory into becoming an entrepreneur for various health remedies – even becoming a vegetarian and showing up on television infomercials with his "Bahamian Diet Nutritional Drink" to help those who were struggling with obesity. But his efforts never seemed self-serving, despite his appearances on after-hours television where TV preachers and other hucksters dominated the airwaves. By the time YouTube became a vivid part of social media in the post-9/11 culture, Dick Gregory found a niche from which he could address any subject that consumed his interest. When Barack Obama actually became the first African-American president, Gregory was quick to warn that the nation's racial differences had not fully healed. "They get a behaved Negro," Gregory told Power 105.1's The Breakfast Club back in 2016. “He don’t raise his voice. He ain’t never called them a honky. You understand what I’m saying? Can you imagine how lucky they are.” If he'd won in 1968, Gregory said he would have been anything but polite. When asked about the first thing he would have done, he answered, "I would have dug up that Rose Garden and plant me a watermelon patch. And it would be no more state dinners, but watermelon lunches. We’d eat watermelon and spit the seeds on Pennsylvania Avenue.” How Gregory's style of guerrilla theatre would have gone down today is highly debatable, but his response reveals a man unafraid of the America he confronted.
When later asked about the possibility of Donald Trump becoming president, he became more cryptic. All he could say was that anything was possible because "we don't run nothing."


That tone could easily suggest for some a state of despair in the face of Trump, but it would hardly apply to Dick Gregory. America could never disappear in Gregory's mind because liberation was always its goal. "When George Washington was fighting the British, it wasn't so he could build a college. It's liberate," Gregory once told Ray Suarez at Al Jazeera. "So all the black folks you see in America have never been liberated...We talk as a group of people that's educated but not liberated. George Washington didn't have any educated people with him. The song don't say, 'Give me education or give me death' it say, 'Give me liberty or give me death.'" With the price of that liberty still to come, Gregory retained a strong belief in a Universal God whose will in creating the universe couldn't be breached no matter how long mankind danced around its follies and avoided the true taste of liberation.

As Dick Gregory now passes into history, President Donald Trump is making history trying to erase the eight years of the first African-American president as if he never existed. In the face of this, it's hard to imagine what Gregory would have said about Charlottesville had he lived to comment on it, but to his last breath he was never without a view that caught you up short, because no matter how condemning it could be towards the racism and injustices of America, he never lost sight of another America that was the bedrock of his activism. There was a calm that seemed to suggest that America would survive Donald Trump because the issues of America would always require a reminding voice to put the country in full perspective of its aims. That's why Gregory didn't limit his critical barbs to white racism only. When he got angry at Don Cheadle for his compromising portrait of Miles Davis in Miles Ahead, which introduced a pathetic action plot that skirted Davis's indelible impact on jazz, or chastised Cedric the Entertainer for insulting Rosa Parks in Barbershop, he was unequivocal and as clear as a bell. Dick Gregory was fond of saying before every concert appearance, "I thank God that we have all arrived here safely today, and I pray to God that your return and my return will be equally as safe." In light of the fractious country he's left behind, I wish his journey today to also be a peaceful one.

-- August 27/17

Down That Lonesome Road: Sophie Huber's Harry Dean Stanton: Partly Fiction (2012)



Back in 1984, I was scheduled to do a taped radio interview with actor Harry Dean Stanton from his hotel room during the Toronto International Film Festival. Having been a "character" actor with memorable supporting parts in numerous films from Cool Hand Luke to The Rose, Stanton had just landed his first real starring role in Wim Wenders' laconic drama, Paris, Texas, where he played a lost soul estranged from his family who wanders out of the desert one day to reunite with them years after disappearing. The studio, 20th Century Fox, was eager under the circumstances to get Stanton plenty of publicity despite the fact that the actor wasn't the least bit comfortable being thrust into the spotlight. Despite his reluctance to be showered with attention, I didn't help my cause that day by accidentally missing the initial press screening and having to attend the Festival one (which was taking place just before I was to go meet Stanton). Paris, Texas also turned out to be over 2 1/2 hours long, which meant I had to leave the film a half-hour early just to make the interview in time. While I wasn't comfortable having to depart the picture early, I still felt confident enough with what I saw to do the interview. But maybe what I shouldn't have done was tell Stanton that I had to leave before the film ended, because from the time we started rolling tape, he retreated into a cocoon. Looking at me with complete indifference, he let me ask about fifty questions in ten minutes – questions that went beyond Paris, Texas back through his earlier film career and even into his life in music – while he continued to provide cryptic one-word answers that revealed little and answered nothing. Finally exasperated, I turned off the tape recorder and Harry Dean looked at me with the satisfied grin of someone who had just won a round of arm wrestling. As I looked up to say, "Well, that's it," he answered back quickly, "It sure is."
He grabbed his jacket and departed the hotel room with such speed that it was as if he wanted no trace left of ever having been there. For years, I was baffled that we didn't get past our great divide, but having recently seen Sophie Huber's lovely and satisfying documentary, Harry Dean Stanton: Partly Fiction, I came to recognize something I probably missed that day. For an actor whose career lit up the background shadows of movies, being visible and being recognized came with certain obligations from those who wished to engage him. Since Harry Dean Stanton never said a word, or looked into a camera lens, without making that moment matter, it was expected that you probably shouldn't take those moments lightly either.

In describing this veteran character actor of over 200 movies, film critic Michael Sragow rightly points out that Harry Dean Stanton (who died at the age of 91 in 2017) never became a "character-actor star" like Gene Hackman or Robert Duvall; rather, Stanton "developed keener instincts for connecting with other players to create a solid grid of energy and feeling." Unlike Zelig, a nobody who pops up visibly in a crowd of somebodies, Stanton was a somebody who emerged in the background to light a small flame of recognition that would burn into the viewer's memory. When Paul Newman's chain gang rebel Lucas Jackson in Cool Hand Luke is put in solitary after word of his mother's death, it's Stanton back in the barracks singing "Just a Closer Walk With Thee" that fills the scene with poignancy. Stanton's drug-fueled desperation as Kris Kristofferson's burned-out musical pal in Cisco Pike became that picture's sputtering motor that kept reminding Cisco of the counter-culture world he needed to abandon if he had any hope of surviving it. In cult movies like Repo Man, where Stanton possessed an exasperated visage that seemed eaten away by ravenous inertia, he played the car repossessor as a feral Philosopher King. In Straight Time, he brought a desperate, unswerving loyalty to his jewel thief partner Dustin Hoffman, a loyalty that clouded his better judgement and made the bad ending to their crime feel even more painfully inevitable. In countless pictures, Harry Dean Stanton filled the screen with quiet men used to hiding who suddenly found themselves exposed and emotionally naked. What Sophie Huber does (with the help of crack cinematographer Seamus McGarvey) is bring a diffused light to those shaded and hidden corners of the lonely road that Stanton has walked most of his life.

Director Sophie Huber and Harry Dean Stanton. (Photo: Michael Buckner)

Rather than a straightforward biography, though, Huber's film plays like a ballad that provides variations on Stanton's life. Although her questions touch on various details of his life, such as his childhood in Kentucky in a deeply Christian home, his participation in the Battle of Okinawa, and his peripatetic love life, she isn't questing for answers. She is drawn to conjuring up moods instead, so that we have to use our senses to piece together the fragments of Harry Dean Stanton's inner world. Often she will cut to a scene from a movie to illuminate a biographical point, but not to define some part of his personality. Many times in the middle of a story, Stanton will break into song, whether it's a sad Mexican standard or an American pop favourite like Fred Neil's "Everybody's Talkin'" (which Stanton says was inspired by actor Luke Askew and his heroin habit). Mostly Stanton is seen cloaking himself in silence – not to be coy, or to avoid being forthcoming, but to let the stillness of his body speak for him. Guided by a certain Zen perspective, Stanton believes in the void with an understanding that our true being comes out of nothingness. (For an actor used to being in the background, you can easily see how this would be a useful philosophy.)

In Harry Dean Stanton: Partly Fiction, Sophie Huber lets Stanton set the mood and the pace of the picture. Even when she brings in director David Lynch, who playfully peppers Stanton with questions from a prepared sheet of paper; or Kris Kristofferson, who reminisces warmly with Stanton about how Harry got him the lead role in Cisco Pike; or Debbie Harry, who with a ticklish, shy eroticism goes into why she wrote a song about him ("I Want That Man") and then surprisingly comes to meet him; or Wim Wenders, who discusses with loving reflection the nuances Stanton brought to Paris, Texas, the anecdotes become more colour on a canvas than information we're chalking up to define the actor. In many ways, Harry Dean Stanton: Partly Fiction sets us up beautifully for John Carroll Lynch's extraordinary 2017 drama Lucky, an unassuming chamber work about a 90-year-old atheist loner (Harry Dean Stanton) who sees death in his rear-view mirror, and it's beginning to scare him. This being Stanton's final film might bring the picture an added poignancy, but Lynch (who did his own fine character acting in Zodiac and The Founder) keeps the material dry and witty in its eccentricity. While watching it, my mind kept drifting back to David Lynch's The Straight Story, where Stanton appeared at the very end to meet his long-estranged brother, who had traveled across the state to reconcile with him. I wondered if Lucky could be the same guy. (Lucky also has a beautifully acted scene between Stanton and Tom Skerritt discussing devastating war stories, similar to another powerfully affecting scene in The Straight Story.) With a motley cast that ranges from that same David Lynch to former teen idol James Darren, Lucky reveals a deadpan minimalism that isn't devoid of emotional engagement in the way Jim Jarmusch's films often are. Lucky possesses a luminous spareness.

If Harry Dean Stanton, despite his air of detachment, didn't possess such a wry and plucky spirit, Huber's documentary might be too vague and abstract, but her subject is endlessly fascinating. She is also a great listener. An actress herself, Sophie Huber sometimes lets the camera caress Stanton's worn face for threads of meaning found in the road maps seemingly etched there. When we hear Stanton singing the too-familiar "Danny Boy" in a voice that strips the song of any sentimental consideration, or the aching "Blue Eyes Crying in the Rain," she allows his sombre voice to tell us that there are as many mysteries in art as there are in the lines of his face. Harry Dean Stanton: Partly Fiction doesn't give us answers to the enticing existential puzzles of a great character actor, but the pieces Sophie Huber provides lead to a compelling and unforgettable portrait of one.

-- February 3/18
