Learning Digital Signal Processing through Music & Sound
A different approach to learning DSP
This course and video series is a bit different from the standard university Digital Signal Processing (DSP) curriculum. Most textbooks and classes start with the fundamentals of continuous-time (analog) signals and systems, and from that foundation they derive discrete-time and, ultimately, fully digital systems, which is a mathematically complete approach.1
A math- and theory-based approach made sense in a world where we relied primarily on pen & paper and computation was precious, but I find that many (most?) engineering students have difficulty grasping the analog theory. Some only “get” DSP when actually applying it in realistic and meaningful scenarios. (In fact, that’s how I truly learned it myself.) And today, we have abundant computation and tools to capture, manipulate, and visualize signals in near real time, so let’s use them! Plus, nearly all signal processing is digital now, so I think the discrete/digital domain is the best place to start learning about it.2
In this course, not only do we start with digital signals, we begin with the Discrete Fourier Transform (DFT), one of the most useful and widely applied algorithms ever created. It provides a concrete tool and application (frequency analysis and visualization) that, I believe, makes a great starting point for thinking about signals.
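As a small taste of that starting point (a minimal sketch of my own, not taken from the course materials), here’s what DFT-based frequency analysis can look like in a few lines of Python with NumPy. The synthetic three-note “chord” and the specific NumPy calls are illustrative assumptions, not the course’s code:

```python
# Illustrative sketch: use the DFT (via NumPy's FFT) to find the
# frequencies present in a short synthetic three-note "chord".
import numpy as np

fs = 44100                        # sample rate in Hz (a common audio standard)
t = np.arange(0, 1.0, 1 / fs)     # one second of sample times

# An A-major triad built from pure sine tones: A4, C#5, E5
x = (np.sin(2 * np.pi * 440.00 * t)
     + np.sin(2 * np.pi * 554.37 * t)
     + np.sin(2 * np.pi * 659.25 * t))

spectrum = np.abs(np.fft.rfft(x))         # magnitude of the DFT
freqs = np.fft.rfftfreq(len(x), 1 / fs)   # frequency (Hz) of each DFT bin

# The three strongest bins land at (roughly) the three note frequencies
peaks = np.sort(freqs[np.argsort(spectrum)[-3:]])
print(peaks)                              # approximately [440. 554. 659.]
```

Plotting `spectrum` against `freqs` (with, say, matplotlib) gives a simple frequency visualization of the sort described above.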
Why videos?
The core material for this class is a “video textbook”. This is because I believe strongly in the impact of simultaneous presentation: when you can see, hear, and perhaps even feel a concept, it makes a deeper impression that is more likely to be remembered. I also believe animation and motion graphics, when used effectively, can elucidate complex and challenging topics. Much of this is inspired by Grant Sanderson’s incredible 3Blue1Brown YouTube channel and his general approach, which is to start with concrete examples and work back towards the abstract concepts. In this series, I try to start with a real signal or application of DSP, demonstrate some direct manipulation or utility, and eventually zoom out to generalize for different conditions.
Why music?
Nearly all of the examples and applications I present are from (or for) music. There are multiple reasons for this, and the first is selfish: this is where the bulk of my DSP experience lies. But music is familiar and broadly accessible… most people have listened to lots of it, and we can hear and observe changes to a sound signal right away. It’s also fairly easy for today’s machines to handle computationally (heck, even your phone can do most audio DSP in near real time). Thus, I think music (and sound) offers an ideal entry point for learning digital signal processing.
Is this “DSP for Dummies”?
Look, DSP is hard. There are a lot of details that draw upon different areas of engineering knowledge. And yes, there’s some math and even more coding. My intention is to rigorously impart both a conceptual and a practical understanding of DSP. It’s a university-level class, but I think we can create better understanding through specific applications, using state-of-the-art tools and modern media presentation. My goal is to prepare students to actually use DSP, without the baggage of theory and notation developed nearly a century ago.
Intended audience
- Engineering undergraduate students seeking to learn the fundamentals of DSP and how it is applied.
- Musicians and audio engineers who want to know more about DSP and why certain recording and production settings, practices, and traditions work the way they do.
- Anyone interested in music and sound processing.
- Prerequisites: An understanding of calculus and some knowledge of coding (we’ll be using Python, particularly the NumPy and SciPy libraries, which are now nearly universal). Familiarity with differential equations would also be helpful.
Course Materials
- DSP Video Textbook
- Colab Notebooks on GitHub (coming soon)
Instructor
Youngmoo Kim is Director of the ExCITe Center, an institute at Drexel University for transdisciplinary collaboration and learning, and Professor of Electrical & Computer Engineering. His research group, the Music & Entertainment Technology Laboratory, pursues AI for music, human-machine interfaces and robotics for expressive interaction, and STEAM education. He has also served as Resident Technologist for Opera Philadelphia and serves as an advisor for the Philadelphia Museum of Art. You can view his TEDxPhiladelphia 2019 talk here.
Notes
1 That’s because the traditional electrical engineering curriculum builds upon a stack of prerequisite theoretical knowledge: Electrical circuits > Linear systems > Signals & systems > Analog signal processing > DSP
2 There are some applications that require analog signal processing, but they’re rare. It’s kind of funny: even the analog/continuous-time examples we create for our teaching materials are actually digital signals we’ve massaged into looking and behaving like analog ones. 😀