Event-based Representation

muspy.to_event_representation(music: Music, use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, encode_velocity: bool = False, force_velocity_event: bool = True, max_time_shift: int = 100, velocity_bins: int = 32, dtype=<class 'int'>) → numpy.ndarray[source]

Encode a Music object into event-based representation.

The event-based representation represents music as a sequence of events, including note-on, note-off, time-shift and velocity events. The output shape is M x 1, where M is the number of events. Each value encodes one event. The default configuration uses 0-127 to encode note-on events, 128-255 for note-off events, 256-355 for time-shift events, and 356-387 for velocity events.
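
The sketch below (a minimal illustration, not the library's internal code) maps events to values under this default configuration; the velocity binning formula is an assumption for illustration:

    def note_on(pitch):                     # 0-127
        return pitch

    def note_off(pitch):                    # 128-255
        return 128 + pitch

    def time_shift(ticks):                  # 256-355, shifts of 1-100 ticks
        assert 1 <= ticks <= 100
        return 255 + ticks

    def velocity_event(velocity, velocity_bins=32):  # 356-387 (assumed binning)
        return 356 + velocity * velocity_bins // 128

    # A middle-C quarter note (24 ticks at the default resolution):
    events = [note_on(60), time_shift(24), note_off(60)]
    print(events)  # [60, 279, 188]

    # A shift of 250 ticks exceeds max_time_shift (100) and would be
    # decomposed, e.g. into shifts of 100 + 100 + 50 ticks.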

Parameters:
  • music (muspy.Music) – Music object to encode.
  • use_single_note_off_event (bool, default: False) – Whether to use a single note-off event for all the pitches. If True, the note-off event will close all active notes, which can lead to lossy conversion for polyphonic music.
  • use_end_of_sequence_event (bool, default: False) – Whether to append an end-of-sequence event to the encoded sequence.
  • encode_velocity (bool, default: False) – Whether to encode velocities.
  • force_velocity_event (bool, default: True) – Whether to add a velocity event before every note-on event. If False, velocity events are only used when the note velocity is changed (i.e., different from the previous one).
  • max_time_shift (int, default: 100) – Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events.
  • velocity_bins (int, default: 32) – Number of velocity bins to use.
  • dtype (np.dtype, type or str, default: int) – Data type of the return array.
Returns:

Encoded array in event-based representation.

Return type:

ndarray, shape=(?, 1)
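
A minimal usage sketch, assuming the default configuration:

    import muspy

    music = muspy.Music(resolution=24)
    track = muspy.Track(program=0)
    track.notes.append(muspy.Note(time=0, pitch=60, duration=24, velocity=64))
    music.tracks.append(track)

    encoded = muspy.to_event_representation(music)
    print(encoded.shape)  # (3, 1): note-on, time-shift, note-off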

muspy.from_event_representation(array: numpy.ndarray, resolution: int = 24, program: int = 0, is_drum: bool = False, use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, max_time_shift: int = 100, velocity_bins: int = 32, default_velocity: int = 64, duplicate_note_mode: str = 'fifo') → muspy.music.Music[source]

Decode event-based representation into a Music object.

Parameters:
  • array (ndarray) – Array in event-based representation to decode.
  • resolution (int, default: muspy.DEFAULT_RESOLUTION (24)) – Time steps per quarter note.
  • program (int, default: 0 (Acoustic Grand Piano)) – Program number, according to General MIDI specification [1]. Valid values are 0 to 127.
  • is_drum (bool, default: False) – Whether it is a percussion track.
  • use_single_note_off_event (bool, default: False) – Whether to use a single note-off event for all the pitches. If True, a note-off event will close all active notes, which can lead to lossy conversion for polyphonic music.
  • use_end_of_sequence_event (bool, default: False) – Whether to append an end-of-sequence event to the encoded sequence.
  • max_time_shift (int, default: 100) – Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events.
  • velocity_bins (int, default: 32) – Number of velocity bins to use.
  • default_velocity (int, default: muspy.DEFAULT_VELOCITY (64)) – Default velocity value to use when decoding.
  • duplicate_note_mode ({'fifo', 'lifo', 'all'}, default: 'fifo') –

    Policy for dealing with duplicate notes. When a note-off event is presented while there are multiple corresponding note-on events that have not yet been closed, we need a policy to decide which note-on events to close. This is only effective when use_single_note_off_event is False.

    • 'fifo' (first in, first out): close the earliest note-on
    • 'lifo' (last in, first out): close the latest note-on
    • 'all': close all matching note-on events
Returns:

Decoded Music object.

Return type:

muspy.Music
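
A minimal decoding sketch, reusing the three-event array from the encoding example above:

    import numpy as np
    import muspy

    # note-on 60, time-shift of 24 ticks, note-off 60
    array = np.array([[60], [279], [188]])
    music = muspy.from_event_representation(array, resolution=24)
    note = music.tracks[0].notes[0]
    print(note.pitch, note.duration, note.velocity)  # 60 24 64 (default_velocity)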

References

[1] https://www.midi.org/specifications/item/gm-level-1-sound-set

class muspy.EventRepresentationProcessor(use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, encode_velocity: bool = False, force_velocity_event: bool = True, max_time_shift: int = 100, velocity_bins: int = 32, default_velocity: int = 64)[source]

Event-based representation processor.

The event-based representation represents music as a sequence of events, including note-on, note-off, time-shift and velocity events. The output shape is M x 1, where M is the number of events. Each value encodes one event. The default configuration uses 0-127 to encode note-on events, 128-255 for note-off events, 256-355 for time-shift events, and 356-387 for velocity events.

use_single_note_off_event

Whether to use a single note-off event for all the pitches. If True, the note-off event will close all active notes, which can lead to lossy conversion for polyphonic music.

Type: bool, default: False
use_end_of_sequence_event

Whether to append an end-of-sequence event to the encoded sequence.

Type: bool, default: False
encode_velocity

Whether to encode velocities.

Type: bool, default: False
force_velocity_event

Whether to add a velocity event before every note-on event. If False, velocity events are only used when the note velocity is changed (i.e., different from the previous one).

Type: bool, default: True
max_time_shift

Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events.

Type: int, default: 100
velocity_bins

Number of velocity bins to use.

Type: int, default: 32
default_velocity

Default velocity value to use when decoding.

Type: int, default: 64
decode(array: numpy.ndarray) → muspy.music.Music[source]

Decode event-based representation into a Music object.

Parameters: array (ndarray) – Array in event-based representation to decode. Cast to integer if not of integer type.
Returns: Decoded Music object.
Return type: muspy.Music

See also

muspy.from_event_representation()
Return a Music object converted from event-based representation.
encode(music: muspy.music.Music) → numpy.ndarray[source]

Encode a Music object into event-based representation.

Parameters: music (muspy.Music) – Music object to encode.
Returns: Encoded array in event-based representation.
Return type: ndarray (np.uint16)

See also

muspy.to_event_representation()
Convert a Music object into event-based representation.
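
A minimal usage sketch of the processor, assuming the default configuration plus velocity encoding:

    import muspy

    processor = muspy.EventRepresentationProcessor(encode_velocity=True)

    music = muspy.Music(resolution=24, tracks=[muspy.Track(
        notes=[muspy.Note(time=0, pitch=60, duration=24, velocity=64)])])

    encoded = processor.encode(music)    # ndarray of shape (M, 1)
    decoded = processor.decode(encoded)  # back to a muspy.Music object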