From MusicBrainz Wiki
Revision as of 12:52, 23 March 2015 by Gentlecat (talk | contribs) (Moved dataset idea.)

This page describes ideas that we've had for the AcousticBrainz project.

Data exploration

An interactive system for exploring the data that we already have in AcousticBrainz. For example: find all of the songs that we estimate to be in a certain key, order them by tempo, and then group them by mood.
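The kind of query described above can be sketched in a few lines. This is a minimal in-memory illustration with made-up track data and field names; a real system would pull these descriptors from the AcousticBrainz database.

```python
from itertools import groupby

# Hypothetical sample data; real descriptors would come from AcousticBrainz.
tracks = [
    {"title": "A", "key": "C major", "bpm": 120, "mood": "happy"},
    {"title": "B", "key": "C major", "bpm": 90,  "mood": "sad"},
    {"title": "C", "key": "D minor", "bpm": 100, "mood": "sad"},
    {"title": "D", "key": "C major", "bpm": 140, "mood": "happy"},
]

def explore(tracks, key):
    """Select tracks in `key`, order them by tempo, then group by mood."""
    # Sort by tempo first; the later sort by mood is stable, so the
    # tempo ordering is preserved inside each mood group.
    by_tempo = sorted((t for t in tracks if t["key"] == key),
                      key=lambda t: t["bpm"])
    by_mood = sorted(by_tempo, key=lambda t: t["mood"])
    # groupby requires its input to be sorted by the grouping key
    return {mood: [t["title"] for t in group]
            for mood, group in groupby(by_mood, key=lambda t: t["mood"])}

print(explore(tracks, "C major"))  # → {'happy': ['A', 'D'], 'sad': ['B']}
```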


A search system (which could be part of the above task) that lets you search for tracks by their metadata or by extracted features. This could use an existing search technology (e.g. Solr) or something custom-written for the task. A related task would be to place songs in an n-dimensional similarity space in order to explore tracks that are acoustically similar.

Data accuracy

An investigation of the accuracy of AcousticBrainz compared to other music databases. For example, MusicBrainz has many tags which represent genres, and this information is also available from other services. Lower-level information such as key and BPM is available from services such as the Echo Nest.
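One simple way to frame such a comparison is as an agreement rate over recordings that both databases cover. This is a sketch with fabricated values; the field names and tolerance are assumptions, not anything the services actually expose in this form.

```python
# Hypothetical key/BPM estimates for the same recordings from two sources.
acousticbrainz = {"mbid-1": {"key": "C major", "bpm": 120},
                  "mbid-2": {"key": "A minor", "bpm": 98}}
other_service = {"mbid-1": {"key": "C major", "bpm": 121},
                 "mbid-2": {"key": "C major", "bpm": 98}}

def agreement(ours, theirs, field, tolerance=0):
    """Fraction of shared recordings whose values agree within tolerance."""
    shared = ours.keys() & theirs.keys()
    def matches(mbid):
        a, b = ours[mbid][field], theirs[mbid][field]
        if isinstance(a, (int, float)):
            return abs(a - b) <= tolerance  # numeric fields: tolerance
        return a == b                       # categorical fields: exact match
    return sum(matches(m) for m in shared) / len(shared)

print(agreement(acousticbrainz, other_service, "key"))     # key agreement
print(agreement(acousticbrainz, other_service, "bpm", 2))  # BPM within ±2
```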


Dataset creation

We're interested in developing a tool that lets people build datasets: mappings between tracks in AcousticBrainz and attributes that describe them. These attributes could be broad and subjective, such as genre and mood, or specific descriptions, such as instrumentation or vocal qualities. We already have the technology to train machine learning models from examples of these mappings, but we lack a tool for actually building them. This tool should let people choose attributes and attach them to MusicBrainz IDs. It should then convert the data into a form that our existing tools can use to train a model, and finally present some evaluation statistics, such as the accuracy of the new model.
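The data model behind such a tool could be as simple as a mapping from class names to lists of MusicBrainz IDs. The sketch below is a guess at that shape; the JSON export format is illustrative only, not the actual input format of the existing model-training tools.

```python
import json

class Dataset:
    """A named mapping from class names (e.g. genres) to MusicBrainz IDs."""

    def __init__(self, name):
        self.name = name
        self.classes = {}

    def add(self, class_name, mbid):
        """Attach a recording MBID to a class, ignoring duplicates."""
        members = self.classes.setdefault(class_name, [])
        if mbid not in members:
            members.append(mbid)

    def export(self):
        """Serialize to JSON for a hypothetical downstream training step."""
        return json.dumps({"name": self.name, "classes": self.classes})

ds = Dataset("genre")
ds.add("rock", "mbid-1")
ds.add("jazz", "mbid-2")
ds.add("rock", "mbid-1")  # duplicate, ignored
print(ds.export())
```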