Event-based Representation
muspy.to_event_representation(music: Music, use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, force_velocity_event: bool = True, max_time_shift: int = 100, velocity_bins: int = 32) → numpy.ndarray

Encode a Music object into event-based representation.
The event-based representation represents music as a sequence of events, including note-on, note-off, time-shift and velocity events. The output shape is M x 1, where M is the number of events, and each value encodes one event. The default configuration uses 0-127 to encode note-on events, 128-255 for note-off events, 256-355 for time-shift events, and 356-387 for velocity events.
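As a rough sketch of this default vocabulary, the hypothetical helper below maps an event code back to its meaning. The exact tick offset of the time-shift codes (code 256 meaning a shift of 1 tick) is an assumption, not stated above.

```python
def describe_event(code: int) -> str:
    """Interpret an event code under the default vocabulary (a sketch).

    Assumes time-shift codes 256-355 stand for shifts of 1-100 ticks;
    the exact offset is an assumption, not taken from the docs above.
    """
    if 0 <= code <= 127:
        return f"note-on, pitch {code}"
    if 128 <= code <= 255:
        return f"note-off, pitch {code - 128}"
    if 256 <= code <= 355:
        return f"time-shift, {code - 255} tick(s)"
    if 356 <= code <= 387:
        return f"velocity, bin {code - 356}"
    raise ValueError(f"code {code} is outside the default vocabulary")
```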
Parameters:
- music (muspy.Music object) – Music object to encode.
- use_single_note_off_event (bool) – Whether to use a single note-off event for all pitches. If True, a note-off event will close all active notes, which can lead to lossy conversion for polyphonic music. Defaults to False.
- use_end_of_sequence_event (bool) – Whether to append an end-of-sequence event to the encoded sequence. Defaults to False.
- force_velocity_event (bool) – Whether to add a velocity event before every note-on event. If False, velocity events are only used when the note velocity changes (i.e., differs from the previous one). Defaults to True.
- max_time_shift (int) – Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events. Defaults to 100.
- velocity_bins (int) – Number of velocity bins to use. Defaults to 32.
Returns: Encoded array in event-based representation.
Return type: ndarray, dtype=uint16, shape=(?, 1)
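A minimal usage sketch, assuming muspy's Note(time, pitch, duration, velocity) and Track/Music constructors: encode a one-note Music object and inspect the event codes.

```python
import muspy

# One track, one middle-C quarter note (resolution 24 => duration 24 ticks).
music = muspy.Music(
    resolution=24,
    tracks=[muspy.Track(program=0, notes=[
        muspy.Note(time=0, pitch=60, duration=24, velocity=64),
    ])],
)

encoded = muspy.to_event_representation(music)
print(encoded.shape, encoded.dtype)  # (M, 1) uint16
print(encoded.ravel())
# With the defaults, expect roughly: a velocity event, note-on 60,
# a time-shift of 24 ticks, then note-off 60. A gap longer than
# max_time_shift is split: e.g. a 250-tick rest with max_time_shift=100
# becomes time shifts of 100 + 100 + 50.
```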
muspy.from_event_representation(array: numpy.ndarray, resolution: int = 24, program: int = 0, is_drum: bool = False, use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, max_time_shift: int = 100, velocity_bins: int = 32, default_velocity: int = 64) → muspy.music.Music

Decode event-based representation into a Music object.
Parameters:
- array (ndarray) – Array in event-based representation to decode. Will be cast to integer if not of integer type.
- resolution (int) – Time steps per quarter note. Defaults to muspy.DEFAULT_RESOLUTION.
- program (int, optional) – Program number according to General MIDI specification [1]. Acceptable values are 0 to 127. Defaults to 0 (Acoustic Grand Piano).
- is_drum (bool, optional) – A boolean indicating if it is a percussion track. Defaults to False.
- use_single_note_off_event (bool) – Whether to use a single note-off event for all pitches. If True, a note-off event will close all active notes, which can lead to lossy conversion for polyphonic music. Defaults to False.
- use_end_of_sequence_event (bool) – Whether to append an end-of-sequence event to the encoded sequence. Defaults to False.
- max_time_shift (int) – Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events. Defaults to 100.
- velocity_bins (int) – Number of velocity bins to use. Defaults to 32.
- default_velocity (int) – Default velocity value to use when decoding. Defaults to 64.
Returns: Decoded Music object.
Return type: muspy.Music object
References
[1] https://www.midi.org/specifications/item/gm-level-1-sound-set
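A round-trip sketch: resolution, program, and is_drum are not stored in the event sequence, so they must be passed to the decoder (or left at their defaults). The Note/Track/Music constructors used here are the same assumptions as in the encoding example above.

```python
import muspy

music = muspy.Music(
    resolution=24,
    tracks=[muspy.Track(notes=[
        muspy.Note(time=0, pitch=60, duration=24, velocity=64),
    ])],
)
encoded = muspy.to_event_representation(music)

# Decode with settings matching the encoder; mismatched settings
# (e.g. a different velocity_bins) would corrupt the round trip.
decoded = muspy.from_event_representation(encoded, resolution=24, program=0)
note = decoded.tracks[0].notes[0]
print(note.pitch, note.duration, note.velocity)
# Velocity is quantized into velocity_bins bins, so the decoded value
# may differ slightly from the original.
```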
class muspy.EventRepresentationProcessor(use_single_note_off_event: bool = False, use_end_of_sequence_event: bool = False, force_velocity_event: bool = True, max_time_shift: int = 100, velocity_bins: int = 32, default_velocity: int = 64)

Event-based representation processor.
The event-based representation represents music as a sequence of events, including note-on, note-off, time-shift and velocity events. The output shape is M x 1, where M is the number of events, and each value encodes one event. The default configuration uses 0-127 to encode note-on events, 128-255 for note-off events, 256-355 for time-shift events, and 356-387 for velocity events.
use_single_note_off_event
Whether to use a single note-off event for all pitches. If True, a note-off event will close all active notes, which can lead to lossy conversion for polyphonic music (see the sketch after this attribute list). Defaults to False.
Type: bool
use_end_of_sequence_event
Whether to append an end-of-sequence event to the encoded sequence. Defaults to False.
Type: bool
force_velocity_event
Whether to add a velocity event before every note-on event. If False, velocity events are only used when the note velocity changes (i.e., differs from the previous one). Defaults to True.
Type: bool
max_time_shift
Maximum time shift (in ticks) to be encoded as a separate event. Time shifts larger than max_time_shift will be decomposed into two or more time-shift events. Defaults to 100.
Type: int
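The lossiness noted for use_single_note_off_event can be seen with overlapping notes. A sketch with hypothetical values:

```python
import muspy

# Two notes start together but end at different times. With a single
# shared note-off event, the first note-off closes *both* active notes,
# so the longer note's duration is not preserved on a round trip.
music = muspy.Music(resolution=24, tracks=[muspy.Track(notes=[
    muspy.Note(time=0, pitch=60, duration=48, velocity=64),
    muspy.Note(time=0, pitch=64, duration=24, velocity=64),
])])

encoded = muspy.to_event_representation(music, use_single_note_off_event=True)
decoded = muspy.from_event_representation(encoded, use_single_note_off_event=True)
for note in decoded.tracks[0].notes:
    print(note.pitch, note.duration)  # durations may no longer match the input
```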
decode(array: numpy.ndarray) → muspy.music.Music

Decode event-based representation into a Music object.
Parameters:
- array (ndarray) – Array in event-based representation to decode. Will be cast to integer if not of integer type.
Returns: Decoded Music object.
Return type: muspy.Music object
See also
muspy.from_event_representation()
- Return a Music object converted from event-based representation.
encode(music: muspy.music.Music) → numpy.ndarray

Encode a Music object into event-based representation.
Parameters:
- music (muspy.Music object) – Music object to encode.
Returns: Encoded array in event-based representation.
Return type: ndarray (np.uint16)
See also
muspy.to_event_representation()
- Convert a Music object into event-based representation.
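The processor bundles one configuration for both directions, so encode and decode are guaranteed to use consistent settings. A usage sketch, with the same assumed Note/Track/Music constructors as above:

```python
import muspy

# One processor instance holds the shared encode/decode settings.
processor = muspy.EventRepresentationProcessor(
    use_end_of_sequence_event=True,
    velocity_bins=32,
)

music = muspy.Music(
    resolution=24,
    tracks=[muspy.Track(notes=[
        muspy.Note(time=0, pitch=60, duration=24, velocity=64),
    ])],
)

encoded = processor.encode(music)    # ndarray, shape (M, 1), dtype uint16
decoded = processor.decode(encoded)  # muspy.Music object
```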