In The New Yorker, Kyle Chayka bemoans the creeping blandness that settled into his Spotify listening experience as the company leaned into algorithmic personalization and playlists:
Issues with the listening technology create issues with the music itself; bombarded by generic suggestions and repeats of recent listening, listeners are being conditioned to rely on what Spotify feeds them rather than on what they seek out for themselves. “You’re giving them everything they think they love and it’s all homogenized,” Ford said, pointing to the algorithmic playlists that reorder tracklists, automatically play on shuffle, and add in new, similar songs. Listeners become alienated from their own tastes; when you never encounter things you don’t like, it’s harder to know what you really do.
This observation, that the automation of your tastes can alienate you from them, feels powerful. There’s obviously a useful and meaningful role for “more like this” recommendation and prediction engines. Still, there’s a risk when we overfit those models and strip personal agency and discovery out of the experience. Surely there’s an opportunity to add more texture: a push and pull between lean-back personalization and more effortful exploration.
Let’s dial up the temperature on these models, or at least some of them. Instead of always presenting “more like this” recommendations, we could benefit from “more not like this,” too.
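To make the temperature metaphor concrete, here’s a minimal Python sketch (the track names, scores, and function are hypothetical, not Spotify’s actual system): candidate tracks are weighted by a softmax over their similarity to your recent listening, and raising the temperature flattens that distribution so less-similar tracks start surfacing.

```python
import math
import random

def sample_recommendations(candidates, temperature=1.0, k=3):
    """Sample k tracks from (track, similarity_score) pairs.

    Low temperature concentrates picks on the most similar tracks
    ("more like this"); high temperature flattens the distribution,
    letting dissimilar tracks through ("more not like this").
    """
    scores = [score / temperature for _, score in candidates]
    max_s = max(scores)  # subtract max for numerical stability
    weights = [math.exp(s - max_s) for s in scores]
    total = sum(weights)
    probs = [w / total for w in weights]
    tracks = [track for track, _ in candidates]
    # Samples with replacement; fine for a sketch.
    return random.choices(tracks, weights=probs, k=k)

# Hypothetical candidate pool: (track, similarity to recent listening)
pool = [("Track A", 0.95), ("Track B", 0.90), ("Track C", 0.40),
        ("Track D", 0.20), ("Track E", 0.05)]

lean_back = sample_recommendations(pool, temperature=0.2)   # mostly A and B
exploratory = sample_recommendations(pool, temperature=2.0) # C, D, E show up
```

At temperature 0.2 the picks cluster around the safest, most familiar choices; at 2.0 the long tail gets real probability mass. The interesting product question is when to turn that dial, not whether it exists.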