From solo concerts to restaurant gigs to weddings to playing in bands, Larry Zbikowski had played it all. Yet, 25 years ago, the classically trained guitarist found himself thinking he was just scratching the surface.
At times, while he was improvising, “I found myself thinking in music,” he says. “I became skeptical of the idea that language was the only way to think.”
While working on his master’s degree in classical guitar in the mid-1980s at the University of Minnesota, Zbikowski decided entering academia might provide some answers.
Today, his work is at the forefront of the emerging field of “cognitive musicology”—a term that’s been in circulation only for the last year or two, he says. The discipline loosely draws on such fields as linguistics, psychology, and neuroscience, says Zbikowski, Associate Professor and Director of Graduate Studies in Music.
“It’s not about borrowing complete models from those disciplines and mapping them onto music, but rather using whatever fits from those models,” he explains.
In his current project, with the working title By Crystal Fountains: Music, Language, and Grammar, Zbikowski tries to illuminate the difference between music and language.
“Why did humans develop both language and music?” Zbikowski wonders. “Every culture has both.” There are cases where the two blur, he says, as in hip-hop or pop music. Nonetheless, Zbikowski believes that language and music serve very different functions in human culture. While language adheres to one set of grammatical rules, music follows another.
Discipline ‘Built on Quicksand’
Zbikowski’s approach to these questions began to take shape while he was a PhD student in music at Yale University during the 1980s, when he began to have grave doubts about the discipline of music theory.
“I came to the conclusion,” Zbikowski recalled in his keynote address to the Gesellschaft für Musiktheorie in 2005, “that the discipline of music theory was built on quicksand.”
At the time, it was believed that music theory could reveal the structure of a piece of music. Hoping to find what was meant by structure, Zbikowski turned to cognitive linguistics and cognitive psychology. He became convinced that structure was not an attribute of the music, but a reflection of the cognitive capacities we use to understand it.
Zbikowski developed this perspective in the 2002 book, Conceptualizing Music: Cognitive Structure, Theory, and Analysis. He argued that music theory was not about scales, chords, and intervals, familiar only to those with formal musical training. Instead, theorizing about music is something listeners do every time they try to make sense of a musical experience. His new work draws on this perspective, but focuses on the resources music offers that are different from those of language.
In one recent chapter, Zbikowski recounts how music intended for social dances is shaped around the dance it was created to accompany. The grammar of a waltz, for instance, has to conform to the movement of the dancers; the music is “first and foremost about the dance, rather than about tonal organization,” Zbikowski writes.
Based on his analysis, Zbikowski concludes that the primary function of music is to provide a sonic analog for dynamic processes—whether an 18th-century dance or a leaf falling through the air. But isn’t the primary function of music to trigger an emotion in the listener? In Zbikowski’s conception, as well as that of many researchers on emotion, emotions are similarly dynamic: “Sadness is not a state, but a kind of process that is lived,” he says. “Music, through its sounds and rhythms, can provide an analog for this process.”