From September 2005 to June 2006, a team of thirteen scholars at the University of Southern California's Annenberg Center for Communication explored how new and maturing networking technologies are transforming the way we interact with content, media sources, other individuals and groups, and the world around us.
This site documents the process and the results.
Can you distill music down to a genetic code? Something beyond "three chords and the truth"? The Music Genome Project is working on it. Essentially, they use human intelligence to create metadata on songs. Of course, there's a business model behind it: selling you mixes of songs you'll like based on your musical preferences rather than your buying habits.
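The underlying idea, as publicly described, amounts to scoring each song as a vector of musical attributes and measuring similarity as distance in that space. Here's a minimal sketch of that approach; the attribute names, scales, and songs are all hypothetical, not the project's actual "genes":

```python
from math import sqrt

# Hypothetical attribute scores on a 0-5 scale, as a human analyst might assign them.
songs = {
    "Song A": {"vocal_grit": 4.0, "blues_influence": 5.0, "tempo": 2.0},
    "Song B": {"vocal_grit": 3.5, "blues_influence": 4.5, "tempo": 2.5},
    "Song C": {"vocal_grit": 1.0, "blues_influence": 0.5, "tempo": 4.5},
}

def distance(a, b):
    """Euclidean distance between two attribute vectors."""
    return sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def most_similar(name):
    """Find the song whose attribute profile is closest to the named song's."""
    ref = songs[name]
    candidates = [(distance(ref, v), k) for k, v in songs.items() if k != name]
    return min(candidates)[1]

print(most_similar("Song A"))  # Song B: the nearest attribute profile
```

The appeal is that recommendations come from how the music actually sounds rather than from what other buyers happened to purchase alongside it.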
Interesting concept, and I'd love to see their "scorecards" for the metadata. The problem you run into is that the subjective nature of music drives metadata structures crazy. What does "blue" sound like? I know there is a blue, but my blue is likely different from your blue.