SPEAKER

Christine Payne

Christine is a research scientist at OpenAI, where she created MuseNet, a transformer-based music generation model that was recently used to co-compose a piece for the BBC Philharmonic. Also a Juilliard-trained pianist, she is particularly interested in human/AI musical collaboration. She holds a master's in neuroscience from Stanford and graduated as valedictorian from Princeton with a degree in physics.

MuseNet is a deep neural network that can generate 4-minute musical compositions with 10 different instruments, and can combine styles from country to Mozart to the Beatles. MuseNet was not explicitly programmed with our understanding of music; instead it discovered patterns of harmony, rhythm, and style by learning to predict the next token in hundreds of thousands of MIDI files. MuseNet pushes the boundaries of AI creativity, both as an independent composer and as a collaborative tool for human artists.
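For readers curious what next-token prediction over MIDI data looks like in practice, the sketch below is a minimal illustration, not MuseNet's actual code: a tiny causal transformer in PyTorch trained with a cross-entropy next-token objective on integer sequences standing in for encoded MIDI events. The vocabulary size, model dimensions, and random toy batch are all illustrative assumptions.

```python
# Minimal sketch of next-token prediction on MIDI-style event tokens.
# All sizes and the toy data are assumptions for illustration only.
import torch
import torch.nn as nn

VOCAB_SIZE = 512   # assumed: note, timing, instrument, and style tokens
SEQ_LEN = 128
D_MODEL = 256

class TinyMusicTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.pos = nn.Embedding(SEQ_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer MIDI-event tokens
        seq_len = tokens.size(1)
        positions = torch.arange(seq_len, device=tokens.device)
        x = self.embed(tokens) + self.pos(positions)
        # Causal mask: each position may only attend to earlier tokens.
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=tokens.device),
            diagonal=1,
        )
        x = self.blocks(x, mask=mask)
        return self.head(x)  # (batch, seq_len, vocab) logits for the next token

model = TinyMusicTransformer()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

# Toy stand-in for a tokenized MIDI corpus: random token sequences.
batch = torch.randint(0, VOCAB_SIZE, (8, SEQ_LEN))
logits = model(batch[:, :-1])  # predict token t+1 from tokens up to t
loss = loss_fn(logits.reshape(-1, VOCAB_SIZE), batch[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
print(f"next-token cross-entropy: {loss.item():.3f}")
```

Generation then works by sampling one token at a time from the model's output distribution and feeding it back in, which is how a model trained this way can continue a melody or mix styles without any hand-coded music theory.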

VIDEOS

Girl Geek X OpenAI Lightning Talks and Panel

More than 100 girl geeks met at OpenAI HQ in San Francisco to connect with OpenAI researchers and learn about their recent work in reinforcement learning, robotics, AI policy, and more!