Syllabus

Introduction to Music Informatics

Informatics I590 (to become I548)

Music N560

 

Instructor: Don Byrd, Visiting Associate Professor of Informatics; Senior Scholar, School of Music
Email: donbyrd@indiana.edu
Phone: 856-0129 (usually) and 856-2230
Offices: Music Library, Simon Center (usually) (ask for me at the Reference or Circulation Desk) and Eigenmann 1124
Office Hours: by appointment; also, I'm nearly always free after class unless I already have an appointment with a student.

Class meets Mon/Weds/Fri from 1:25 to 2:15 PM, in Simon Center M373. To get to M373, you must go into the Music Library; then go up two floors.


Course Overview and Goals

We will briefly cover several areas that are important in music informatics; we won't attempt to cover all important areas, but enough to give students a feeling for the subject in general. The goals of the course include:

Class Format and Requirements

While not primarily a readings course, the course will be organized around readings and, often, systems or projects for each area, with students giving regular presentations to the class on the assigned readings, systems, and projects. A typical class will consist of a 15- or 20-minute presentation by a student plus a short lecture by the instructor, or a longer lecture and demo by the instructor or, occasionally, a guest. I expect a student presenting an assigned paper or system to understand the content sufficiently to present the problem(s) addressed and explain the approach taken and experimental findings or other results to the class. If possible, the student should go further to seek resources and examples that illustrate the principles and/or algorithms discussed in the paper or implemented in the system.

There will be short assignments, including some that involve writing simple programs, and a large final project. For the final project, I expect each student to either implement and extend the findings of one of the papers, or do an independent music-informatics project on a relevant topic. I'll provide a list with a wide variety of possible topics, or you can propose your own. You can do your projects alone or in teams of two.

To keep our feet on the ground, i.e., to keep a strong connection between what we're studying and real music, (1) at the beginning of the course, we'll choose some music each of us is interested in, the idea being to create a collection to use as examples throughout the semester; (2) we'll take a look at, and when appropriate listen to, one or more major systems for each area we cover; and (3) we'll discuss (mostly in terms of technology) music created and/or performed by members of the class.

Preference will be given to systems we can actually try out, but most state-of-the-art systems aren't available, so we'll also discuss some interesting systems we can't touch (e.g., Cope's EMI) and some interesting systems that aren't state-of-the-art. I expect that student presentations will average about 1/4 of class meeting time, not counting presentations of major projects at the end of the semester.

I'll take into account what students are interested in as much as possible in choosing both reading material and systems.

We will have at least one music-informatics researcher as a guest speaker.

Prerequisites

Readings

There is no textbook as such. Music informatics is too new and too fast-moving for anything suitable to exist. Many readings will be available on the Web; others will be on reserve in the Music Library, or I'll hand copies out. They will be selected mostly from recent literature -- not all of it academic -- such as Computer Music Journal, the Journal of New Music Research, and Electronic Musician, as well as proceedings of conferences like the International Conference on Music Information Retrieval, International Computer Music Conference, Computer Music Modeling and Retrieval, and the Joint Conference on Digital Libraries, and books like Pohlmann's Principles of Digital Audio.

 

Course Outline

The following outline of major topics and materials to study (readings, systems, etc.) is approximate and subject to change. However, I don't expect it to change much, except for adding study materials.

(28 Aug. - 15 Sept.) 1. Introduction: Music Research and Music Informatics; Our Own Music; Digital Audio; Programming

We'll discuss what research is and why it matters for us; compose some very simple music, in order to learn more about how music is put together; and we'll each choose some music we're interested in, then listen to and discuss from a technical standpoint everyone's choices. We'll also learn a bit about programming in a simple language, probably R.

Readings:


(11 - 27 Sept.) 2. Acoustics & Psychoacoustics; Music Perception & Cognition; Expectation vs. Perception

Acoustics is the branch of physics that studies sound. When sound is considered in terms of how it's perceived, we have psychoacoustics, which is a matter of psychology and, to a lesser extent, music theory. Just as optical illusions show the complexity of visual perception, auditory illusions dramatize the complexity of our perception of sound. One aspect is that we experience musical sounds partly in the time domain and partly in the frequency domain. We'll consider what this means for music informatics. We'll also see how so-called "perceptual coding" makes it possible to compress audio tremendously with very little loss of fidelity.
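The time-domain/frequency-domain distinction can be made concrete with a few lines of code. The sketch below (a minimal illustration using NumPy, not anything we'll necessarily use in class) generates one second of a pure 440 Hz tone as time-domain samples, then uses the Fourier transform to view the same signal in the frequency domain, where the tone shows up as a single strong component:

```python
import numpy as np

sr = 8000                    # sample rate in Hz (samples per second)
t = np.arange(sr) / sr       # one second of sample times
signal = np.sin(2 * np.pi * 440 * t)   # time domain: a pure 440 Hz tone (A4)

# Frequency domain: magnitude spectrum of the same signal.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sr)

# The strongest frequency component is the tone itself.
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # prints 440
```

Perceptual coders like MP3 work in the frequency domain for exactly this reason: it's there that the encoder can decide which components the ear won't miss.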

Readings:


(29 Sept. - 16 Oct.) 3. Music Representation and Notation; Converting between Representations and Encodings, Acquiring Music in Digital Form, etc.

The list of computer representations for music in use today is incredibly long, and problems in converting between one and another cause an incredible amount of trouble. Why can't people just choose one and be done with it? The short answer is, which one is best depends very much on what you want to do. We'll discuss music in all three basic forms -- audio, time-stamped events (MIDI), and notation -- and how you get any of them into a computer in the first place.
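To make the "time-stamped events" form concrete, here is a minimal sketch (the tuple format is an illustration for this syllabus, not a real file format) of a melody as MIDI-style events, plus the standard equal-temperament conversion from MIDI note number to frequency, which is one bridge between the event and audio representations:

```python
def midi_to_hz(note):
    """Standard equal-temperament conversion: A4 (MIDI note 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Each event: (onset in seconds, duration in seconds, MIDI note number).
melody = [(0.0, 0.5, 60), (0.5, 0.5, 62), (1.0, 1.0, 64)]  # C, D, E

for onset, dur, note in melody:
    print(f"t={onset:.1f}s dur={dur:.1f}s note={note} ({midi_to_hz(note):.2f} Hz)")
```

Note what this representation captures (pitch and timing) and what it discards (timbre, dynamics in this simple form, and everything notational like stems, beams, and enharmonic spelling) -- a small example of why no single representation suits every task.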

Guest talk: Tim Bell, Dept. of Computer Science and Software Engineering, University of Canterbury, New Zealand. Topic: Optical Music Recognition & Digital Music Stands.

Readings and systems:


(18 - 30 Oct.) 4. Music Retrieval by Content: Similarity Scale; Symbolic Music IR; Genre Classification; "Covers"

We'll talk about finding music in all forms (audio, MIDI, and notation) based on its content or its style (the latter as attempted by music-recommender systems like Pandora, Last.fm, MusicStrands, etc.).

Readings and systems:


(1 - 6 Nov.) 5. Digital Music Libraries: Music Retrieval via Metadata; searching vs. browsing

It's possible to find music based on bibliographic information, as library catalogs do, rather than by its content. Naturally, online catalogs like IUCAT allow many more options than do the drawers full of 3-by-5 cards libraries used to have, and computers are capable of supporting many more options than any online catalog. And once you've found the music you want, recent systems like IU's own Variations2 are starting to include powerful built-in features for doing useful things with it.

Systems: iTunes, Variations2

Readings:


(8 - 13 Nov.) 6. Music Similarity, Sampling, and Intellectual Property Rights

Two obvious and important questions are: how does a person or a computer decide that some music is similar to other music, and how can a court decide whether some music -- especially sampled music -- infringes the copyright on other music? A less obvious and perhaps even more important question that some observers have raised is: have recent changes to U.S. law and policies gone too far, interfering with the creation of new music and, especially, of new kinds of music that digital technology has only recently made possible?
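One common way a computer measures melodic similarity (offered here as an illustration of the general idea, not as the definition we'll settle on in class) is edit distance: the number of insertions, deletions, and substitutions needed to turn one pitch sequence into another. A small dynamic-programming sketch:

```python
def edit_distance(a, b):
    """Classic Levenshtein edit distance between two sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                      # delete everything from a
    for j in range(n + 1):
        d[0][j] = j                      # insert everything from b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

# Two melodies as MIDI note numbers, differing in a single note.
tune_a = [60, 62, 64, 65, 67]
tune_b = [60, 62, 63, 65, 67]
print(edit_distance(tune_a, tune_b))  # prints 1
```

Of course, whether a distance of 1 (or 5, or 50) means "similar enough to infringe" is exactly the kind of question a court -- not an algorithm -- has to answer.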

Readings and projects:


(WILL SKIP) 7. User Interfaces and Visualization for Music Systems

Music is so complex and subtle that usability of the system (including factors like how much control a user has as well as how easy it is to learn and use) is an important issue for doing almost anything with music on a computer. A related area is visualization, e.g., of a large collection of music to support finding music in it by browsing.

Systems and Web Sites: Variations2, Music Animation Machine, Ishkur's Electronic Music Guide, NightingaleSearch

Readings:


(15 Nov. - 22 Nov.) 8. Synthesis of Sounds and of Music: New Approaches to Performance, Composition, and Improvisation

A great deal of music we hear these days could not have been made without computers, and they're used in a wide variety of ways. Sometimes computers are used as performing media, to synthesize sounds ranging from realistic imitations of orchestral instruments or human voices to wild effects unlike anything heard before; sometimes they're used to choose notes to create music that sounds surprisingly like the works of composers from Bach to Scott Joplin and beyond; and sometimes they're used as partners in live improvisation.

Readings:


-- Thanksgiving --

(27 Nov.) Reprise: Music Research and Music Informatics

(27 Nov. - 8 Dec.) Presentations of Final Projects

 

Books on Reserve (in the Music Library)

 

Course Requirements and Grading

 

Miscellaneous

If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon.

If you have any comments, questions, or suggestions, please see me during office hours, make an appointment, write me a note (anonymously if you like), or send me email.

University policies on academic dishonesty will be followed. Cite your sources. Students found to be engaging in plagiarism, cheating, or other types of dishonesty will receive an F for the course. For further information, see the IU Code of Student Ethics at http://campuslife.indiana.edu/Code/index1.html .

Late work will not be accepted without prior arrangement for compelling reasons.


Last updated: 29 Sep. 2006
Comments: donbyrd(at)indiana.edu
Copyright 2006, Donald Byrd