python-musicbrainz3

The python-musicbrainz3 package is the successor of python-musicbrainz2.

NGS is coming, so a rewrite is in order to keep up with MB development. This wiki page collects information, plans, and ideas for this project.

Feel free to edit!

Goals

  • Fully support the NGS data model and web service
  • Require Python 2.6 and aim to be as close to Python 3 as possible (see the module header sketch after this list)
  • Despite NGS's complexity, simple things should be simple
  • Turn today's modules into packages (source files are much too large)
  • Make it more pythonic where possible (naming conventions?)
  • Remove rarely used cruft (what would that be?)
  • Maintain the current amount and quality of documentation
  • Write more test cases
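
A minimal sketch of what the "close to Python 3" goal could mean for each module's header; the exact set of __future__ imports and the Artist class shown here are illustrative, not decided:

  from __future__ import absolute_import, division, print_function, unicode_literals


  class Artist(object):  # new-style classes only, as in Python 3
      """Placeholder entity class, just to illustrate the conventions."""

      def __init__(self, name):
          self.name = name  # text is unicode by default via unicode_literals

      def __repr__(self):
          return 'Artist(name=%r)' % (self.name,)

All four __future__ imports are available in Python 2.6, so such a module runs unchanged on 2.6 and 3.x.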

Project Infrastructure

  • This wiki page is the central source of information
  • Discussion can take place on the development mailing list
  • Source code is hosted in the python-musicbrainz3 git repository
  • Submit patches to Review Board in group python-musicbrainz
  • TODO: Jira?
    • Do we want to use Jira just for bug reports, or as a dumping ground for upcoming tasks too (e.g. 'Do author search')? --alastair
    • I don't know. Using it for tasks sounds like a good idea, too. --matt

Project Plan

  1. Set up project infrastructure
    1. Wiki page (done)
    2. Git repository (done)
    3. Jira project?
    4. Create group in Review Board (done)
  2. Evaluate tools
    1. XML mapping tool (Matt, in progress)
    2. Unit test runner (nose?)
      1. What's the advantage of something like nose over just using the unittest package? I don't have much experience with unit testing in Python. --alastair
    3. Look at ETree or similar for XML (also consider JSON libraries?) (Alastair, in progress; see the parsing sketch after this list)
  3. Create UML diagrams for
    1. the core entities (done)
    2. pymb3's classes
  4. Set up project structure in the repository (Matt, in progress)
  5. Gather data for the parser
    1. Update the MMD RELAX NG schema if necessary
    2. Prepare comprehensive set of example documents for the test suite
      1. I wouldn't worry too much about getting a comprehensive list upfront. Especially since parts of the web service are still not complete, we should be able to build this up as we go. --alastair
      2. I agree. We should start and see what's missing. --matt
  6. Start hacking!
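
To make the ElementTree and unittest questions above more concrete, here is a rough sketch of parsing an MMD artist document with xml.etree.ElementTree and testing it with the plain unittest module. The namespace URI, the document layout, and the parse_artist helper are assumptions for illustration, not a settled design:

  import unittest
  import xml.etree.ElementTree as ET

  # Assumed namespace of the NGS web service documents.
  NS = '{http://musicbrainz.org/ns/mmd-2.0#}'

  EXAMPLE = """\
  <metadata xmlns="http://musicbrainz.org/ns/mmd-2.0#">
    <artist id="00000000-0000-0000-0000-000000000000">
      <name>Example Artist</name>
      <sort-name>Artist, Example</sort-name>
    </artist>
  </metadata>"""

  def parse_artist(xml_text):
      """Return a dict with the artist's id, name, and sort name."""
      root = ET.fromstring(xml_text)
      artist = root.find(NS + 'artist')
      return {
          'id': artist.get('id'),
          'name': artist.findtext(NS + 'name'),
          'sort-name': artist.findtext(NS + 'sort-name'),
      }

  class ParseArtistTest(unittest.TestCase):
      def test_parse_artist(self):
          artist = parse_artist(EXAMPLE)
          self.assertEqual(artist['name'], 'Example Artist')
          self.assertEqual(artist['sort-name'], 'Artist, Example')

  if __name__ == '__main__':
      unittest.main()

Since nose can discover and run plain unittest test cases, writing the tests against unittest keeps both runners open as options.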

Model

This is a simplified model showing only the core entities.

Core Model

The m:n associations displayed with bold lines (Artist to anything, Release to Label) are attributed and will be modeled using association classes (ArtistCredit and ReleaseEvent respectively). We implement 1:n associations using lists.
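
A rough sketch of how the association classes and the list-based association ends could look in Python; every class and attribute name below is a placeholder for discussion, not an agreed API:

  class Artist(object):
      def __init__(self, name):
          self.name = name

  class ArtistCredit(object):
      """Association class for the attributed Artist-to-X links."""
      def __init__(self, artist, credited_name=None, join_phrase=''):
          self.artist = artist                                # the linked Artist
          self.credited_name = credited_name or artist.name   # name as credited
          self.join_phrase = join_phrase                       # e.g. ' feat. '

  class ReleaseEvent(object):
      """Association class for the attributed Release-to-Label link."""
      def __init__(self, label, catalog_number=None, date=None, country=None):
          self.label = label
          self.catalog_number = catalog_number
          self.date = date
          self.country = country

  class Release(object):
      def __init__(self, title):
          self.title = title
          self.artist_credits = []    # attributed m:n end, a plain list
          self.release_events = []    # likewise for the Release-to-Label link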

TODO: Are the Work-to-Recording and ReleaseGroup-to-Release associations ordered?

TODO: Is the URL class used only for ARs (advanced relationships)?

Links

Links to documentation and other relevant information.

Ideas and Open Questions

  • Use an XML mapping package (xml.etree?); see the sketch below
  • Import version history into git?
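
One possible shape for the XML mapping idea: each entity class declares a table mapping element names to attribute names, and a single generic function applies it, instead of hand-written find() calls per class. The _fields table, the map_element helper, and the namespace URI are purely illustrative:

  from __future__ import print_function
  import xml.etree.ElementTree as ET

  NS = '{http://musicbrainz.org/ns/mmd-2.0#}'   # assumed MMD namespace

  class Artist(object):
      # XML element name -> attribute name on the Python object
      _fields = {'name': 'name', 'sort-name': 'sort_name'}

  def map_element(element, cls):
      """Instantiate cls and copy the mapped child elements onto it."""
      obj = cls()
      obj.id = element.get('id')
      for tag, attr in cls._fields.items():
          setattr(obj, attr, element.findtext(NS + tag))
      return obj

  artist_element = ET.fromstring(
      '<artist xmlns="http://musicbrainz.org/ns/mmd-2.0#" id="example-id">'
      '<name>Example Artist</name><sort-name>Artist, Example</sort-name></artist>')
  artist = map_element(artist_element, Artist)
  print(artist.name, artist.sort_name)   # Example Artist Artist, Example

A real mapping layer would also have to handle attributes, lists of child elements, and nested entities, which is exactly what the XML mapping tool evaluation above is about.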