Note-based Representation

muspy.to_note_representation(music: Music, use_start_end: bool = False, encode_velocity: bool = True) → numpy.ndarray[source]

Encode a Music object into note-based representation.

The note-based representation represents music as a sequence of (time, pitch, duration, velocity) tuples. For example, a note Note(time=0, duration=4, pitch=60, velocity=64) will be encoded as the tuple (0, 60, 4, 64). The output shape is N * D, where N is the number of notes and D is 4 when encode_velocity is True, otherwise 3. The values along the second dimension represent time, pitch, duration and velocity (the velocity column is discarded when encode_velocity is False).

Parameters:
  • music (muspy.Music object) – Music object to encode.
  • use_start_end (bool) – Whether to use ‘start’ and ‘end’ to encode the timing rather than ‘time’ and ‘duration’. Defaults to False.
  • encode_velocity (bool) – Whether to encode note velocities. Defaults to True.
Returns:

Encoded array in note-based representation.

Return type:

ndarray, dtype=uint8, shape=(?, 3 or 4)
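The array layout described above can be illustrated with a toy example. This sketch builds the array by hand rather than calling muspy, and the column order (time, pitch, duration, velocity) is an assumption taken from the description above:

```python
import numpy as np

# Toy array in note-based representation (assumed column order:
# time, pitch, duration, velocity), matching the dtype documented above.
notes = np.array([[0, 60, 4, 64],
                  [4, 62, 4, 64]], dtype=np.uint8)
print(notes.shape)        # (2, 4): N = 2 notes, D = 4 with velocity

# With encode_velocity=False the velocity column is discarded, so D = 3.
no_velocity = notes[:, :3]
print(no_velocity.shape)  # (2, 3)
```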

muspy.from_note_representation(array: numpy.ndarray, resolution: int = 24, program: int = 0, is_drum: bool = False, use_start_end: bool = False, encode_velocity: bool = True, default_velocity: int = 64) → muspy.music.Music[source]

Decode note-based representation into a Music object.

Parameters:
  • array (ndarray) – Array in note-based representation to decode. Will be cast to integer if not of integer type.
  • resolution (int) – Time steps per quarter note. Defaults to muspy.DEFAULT_RESOLUTION (24).
  • program (int, optional) – Program number according to General MIDI specification [1]. Acceptable values are 0 to 127. Defaults to 0 (Acoustic Grand Piano).
  • is_drum (bool, optional) – A boolean indicating if it is a percussion track. Defaults to False.
  • use_start_end (bool) – Whether to use ‘start’ and ‘end’ to encode the timing rather than ‘time’ and ‘duration’. Defaults to False.
  • encode_velocity (bool) – Whether the array contains an encoded velocity column. Defaults to True.
  • default_velocity (int) – Default velocity value to use when decoding if encode_velocity is False. Defaults to 64.
Returns:

Decoded Music object.

Return type:

muspy.Music object

References

[1] https://www.midi.org/specifications/item/gm-level-1-sound-set
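The per-row logic of the decoding step can be sketched without muspy. The stand-in Note class and decode_rows helper below are illustrative only (not the real muspy API), and the column order (time, pitch, duration, velocity) is an assumption; note how default_velocity fills in when encode_velocity is False:

```python
from typing import NamedTuple

class Note(NamedTuple):
    """Stand-in for muspy.Note, for illustration only."""
    time: int
    pitch: int
    duration: int
    velocity: int

def decode_rows(rows, encode_velocity=True, default_velocity=64):
    """Sketch of the per-row decoding logic (assumed column order:
    time, pitch, duration, velocity)."""
    notes = []
    for row in rows:
        # When the array carries no velocity column, fall back to the default.
        velocity = row[3] if encode_velocity else default_velocity
        notes.append(Note(time=int(row[0]), pitch=int(row[1]),
                          duration=int(row[2]), velocity=int(velocity)))
    return notes

print(decode_rows([[0, 60, 4, 100]]))
print(decode_rows([[0, 60, 4]], encode_velocity=False))
```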

class muspy.NoteRepresentationProcessor(use_start_end: bool = False, encode_velocity: bool = True, default_velocity: int = 64)[source]

Note-based representation processor.

The note-based representation represents music as a sequence of (time, pitch, duration, velocity) tuples. For example, a note Note(time=0, duration=4, pitch=60, velocity=64) will be encoded as the tuple (0, 60, 4, 64). The output shape is L * D, where L is the number of notes and D is 4 when encode_velocity is True, otherwise 3. The values along the second dimension represent time, pitch, duration and velocity (the velocity column is discarded when encode_velocity is False).

use_start_end

Whether to use ‘start’ and ‘end’ to encode the timing rather than ‘time’ and ‘duration’. Defaults to False.

Type:bool
encode_velocity

Whether to encode note velocities. Defaults to True.

Type:bool
default_velocity

Default velocity value to use when decoding if encode_velocity is False. Defaults to 64.

Type:int
decode(array: numpy.ndarray) → muspy.music.Music[source]

Decode note-based representation into a Music object.

Parameters:array (ndarray) – Array in note-based representation to decode. Will be cast to integer if not of integer type.
Returns:Decoded Music object.
Return type:muspy.Music object

See also

muspy.from_note_representation()
Return a Music object converted from note-based representation.
encode(music: muspy.music.Music) → numpy.ndarray[source]

Encode a Music object into note-based representation.

Parameters:music (muspy.Music object) – Music object to encode.
Returns:Encoded array in note-based representation.
Return type:ndarray (np.uint8)

See also

muspy.to_note_representation()
Convert a Music object into note-based representation.
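How the processor's stored options drive encode and decode can be sketched with a minimal stand-in. ToyNoteProcessor below is not the real muspy class: it operates on plain (time, pitch, duration, velocity) tuples instead of Music objects, and the column order is an assumption, but it shows the use_start_end and encode_velocity round trip:

```python
import numpy as np

class ToyNoteProcessor:
    """Illustrative stand-in for muspy.NoteRepresentationProcessor,
    showing how the stored options shape encode/decode."""

    def __init__(self, use_start_end=False, encode_velocity=True,
                 default_velocity=64):
        self.use_start_end = use_start_end
        self.encode_velocity = encode_velocity
        self.default_velocity = default_velocity

    def encode(self, notes):
        """notes: iterable of (time, pitch, duration, velocity) tuples."""
        rows = []
        for time, pitch, duration, velocity in notes:
            # use_start_end swaps (time, duration) for (start, end).
            a, b = (time, time + duration) if self.use_start_end \
                else (time, duration)
            row = [a, pitch, b, velocity] if self.encode_velocity \
                else [a, pitch, b]
            rows.append(row)
        return np.array(rows, dtype=np.uint8)

    def decode(self, array):
        notes = []
        for row in array:
            time, pitch = int(row[0]), int(row[1])
            duration = int(row[2]) - time if self.use_start_end \
                else int(row[2])
            velocity = int(row[3]) if self.encode_velocity \
                else self.default_velocity
            notes.append((time, pitch, duration, velocity))
        return notes

processor = ToyNoteProcessor(use_start_end=True)
original = [(0, 60, 4, 64), (4, 62, 4, 64)]
assert processor.decode(processor.encode(original)) == original
```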