Machine learning wrote a punk album

Music was generated autoregressively with SampleRNN, a recurrent neural network [Mehri et al. 2017], trained on raw audio from the album Punk in Drublic. The machine listened to Punk in Drublic 26 times over several days and generated 900 minutes of audio. A human listened to the machine's output, chose sections from different points in its evolution, and taped them together into a 20-minute album.
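If "generated autoregressively" sounds abstract: the model predicts one raw audio sample at a time, then feeds each prediction back in as input for the next. Here's a rough toy sketch of that loop in PyTorch. It's not the actual SampleRNN architecture (which stacks RNNs running at several timescales); the model, names, and sizes below are just illustrative.

```python
# Toy sketch of sample-level autoregressive audio generation with an RNN.
# Not the real SampleRNN: a single GRU predicts the next 8-bit audio value
# from everything it has emitted so far.
import torch
import torch.nn as nn

QUANT = 256  # 8-bit quantized audio values, as in mu-law companded waveforms

class TinySampleRNN(nn.Module):
    def __init__(self, hidden=512):
        super().__init__()
        self.embed = nn.Embedding(QUANT, 64)   # previous sample -> vector
        self.gru = nn.GRU(64, hidden, batch_first=True)
        self.out = nn.Linear(hidden, QUANT)    # logits over the next sample

    def forward(self, x, h=None):
        y, h = self.gru(self.embed(x), h)      # (batch, time, hidden)
        return self.out(y), h                  # (batch, time, QUANT)

@torch.no_grad()
def generate(model, n_samples, seed=128):
    """Draw one audio value at a time, feeding each back in as input."""
    model.eval()
    x = torch.tensor([[seed]])                 # start from a mid-level sample
    h, out = None, []
    for _ in range(n_samples):
        logits, h = model(x, h)
        probs = torch.softmax(logits[:, -1], dim=-1)
        x = torch.multinomial(probs, 1)        # sample the next audio value
        out.append(x.item())
    return out

model = TinySampleRNN()
audio = generate(model, n_samples=16000)       # ~1 second at 16 kHz
```

Train something like this on a few hours of an album and it starts hallucinating in that album's voice; generate for long enough and you get the 900 minutes a human then has to curate.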

I’d pay real money to see someone attempt to annotate this on Genius.

More madness at

Via The Outline

Karl Smith is a New Zealander living in Melbourne, Australia. He's an art and design, tech, and pop culture enthusiast. Previous gigs include musician, concierge, picture framer, designer, and product manager.