Social interactions require continually adjusting behavior in response to sensory feedback. For example, during a conversation, sensory cues from our partner (e.g., sounds or facial expressions) shape our speech patterns in real time; our speech signals, in turn, are the sensory cues that modify our partner’s actions. What computations and neural mechanisms govern these interactions? To address this question, my lab focuses on the acoustic communication system of Drosophila. The fly nervous system is relatively simple, and a wealth of neural circuit tools exists to interrogate it; importantly, Drosophila acoustic behaviors are robust and highly quantifiable. During courtship, males produce time-varying songs via wing vibration, while females arbitrate mating decisions. We have discovered that male song patterns are continually sculpted by internal state dynamics and by interactions with the female, over timescales ranging from tens of milliseconds to minutes. On the listener side, we are examining how representations of courtship song drive responsive behaviors. I will discuss our work to map the circuits and computations underlying both song production and perception, and explain how our focus on natural acoustic signals provides a powerful, quantitative handle on the basic building blocks of communication.