When I tell people that I get paid to research and algorithmically generate music (among other research topics), I often get asked a few questions: How old were you when you started composing? How many instruments do you play? Did you go to conservatory?
The short answer is — I didn’t know what harmony was until I was 19, never went to conservatory, and don’t play any instruments, thank you very much.
Then there’s the long answer.
Music almost entered my life at the age of 16, when I first learned about computer science and was spending all my tutoring money buying anything I could find that could actually apply programming to solve real “problems,” something tremendously exciting that I’d never been able to do before. Among the many textbooks on computational epistemology, empirical sociology, cell biology, etc., I purchased “Notes from the Metalevel,” an algorithmic composition textbook. I read part of it and enjoyed it, but it didn’t particularly speak to me, and I continued on to other things.
Enter spring of my first year of college. I was hospitalized in a psychiatric facility, and my parents learned that the constant pacing, the grimacing without meaning to, the randomly dropping things, the wandering the halls aimlessly at school weren’t just “Halley being Halley” — they were the signs of prodromal schizophrenia that everyone had missed — especially me.
Personally, I was certain that I was evil and contaminating everyone I knew by letting fantasies involving the FBI get into my head — in a word, I was delusional and lacked insight, while everyone around me had seen me excel in academics and not worried about what we now think of as “the signs.”
I don’t remember a lot about those first six months. Only two things stick out to me vividly. The first was the day I actually believed the doctors who were telling me that it wasn’t my fault, and the huge weight it lifted off my shoulders (much better to be mentally ill than to be evil).
The second was when I first felt ready to go back to programming and happened to take “Notes from the Metalevel” off the shelf. It was like someone had given me a new purpose when I needed one the most — music wasn’t going to “solve” itself, was it? So reading music theory dissertations and using them to generate music became my companion as I completed the partial hospitalization program, as I moved back to being a college student (this time on a good dosage of medication and with lots of therapy), and, eventually, as I entered grad school.
Fortunately, I realized fairly early on that music wasn’t a “problem” to be solved — it was a domain with endless potential (in fact, one of my papers proved mathematically that you can simulate the Big Bang as well from within standard music software as from within any known computational system). And that was part of what made it so great. Music generation is grounded in empirical reality (trust me, you can tell when you’ve failed to generate anything remotely like music), yet it offered the creativity I had previously only associated with my most psychotic thoughts.
That’s not to say that I haven’t struggled with my relationship with music. When I experience small dips of psychosis, even music can get tied to “that FBI stuff.” I can be obsessive about working on music, and have to rely on my wonderful fiancé to remind me that showering is more important than finishing running that one script. But overall, music has been a source of comfort and a blessing for me, as it has for many, many people over the centuries — although perhaps not in exactly the same way.