Note-based Representation
muspy.to_note_representation(music: Music, use_start_end: bool = False, encode_velocity: bool = True, dtype: Union[numpy.dtype, type, str] = int) → numpy.ndarray

Encode a Music object into note-based representation.

The note-based representation represents music as a sequence of (time, pitch, duration, velocity) tuples. For example, the note Note(time=0, duration=4, pitch=60, velocity=64) is encoded as the tuple (0, 60, 4, 64). The output shape is N * D, where N is the number of notes and D is 4 when encode_velocity is True and 3 otherwise. The values along the second dimension represent time, pitch, duration and velocity (the velocity column is dropped when encode_velocity is False).
Parameters:
- music (muspy.Music) – Music object to encode.
- use_start_end (bool, default: False) – Whether to use 'start' and 'end' to encode the timing rather than 'time' and 'duration'.
- encode_velocity (bool, default: True) – Whether to encode note velocities.
- dtype (np.dtype, type or str, default: int) – Data type of the returned array.

Returns: Encoded array in note-based representation.
Return type: ndarray, shape=(?, 3 or 4)
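To make the column layout concrete, here is a minimal NumPy-only sketch of the encoding described above. The helper to_note_array and the plain note tuples are hypothetical illustrations of the array layout, not part of the muspy API:

```python
import numpy as np

# Hypothetical stand-in for a list of muspy.Note objects, written as
# (time, pitch, duration, velocity) tuples.
notes = [
    (0, 60, 4, 64),  # C4 at time step 0, lasting 4 time steps
    (4, 64, 4, 64),  # E4 starting when the first note ends
]

def to_note_array(notes, use_start_end=False, encode_velocity=True, dtype=int):
    """Sketch of the (time, pitch, duration, velocity) encoding."""
    rows = []
    for time, pitch, duration, velocity in notes:
        # With use_start_end, the timing columns become (start, pitch, end)
        # instead of (time, pitch, duration).
        third = time + duration if use_start_end else duration
        row = [time, pitch, third]
        if encode_velocity:
            row.append(velocity)
        rows.append(row)
    return np.asarray(rows, dtype=dtype)

print(to_note_array(notes))                         # shape (2, 4)
print(to_note_array(notes, encode_velocity=False))  # shape (2, 3)
```

Dropping the velocity column with encode_velocity=False gives the D = 3 shape described above.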
muspy.from_note_representation(array: numpy.ndarray, resolution: int = 24, program: int = 0, is_drum: bool = False, use_start_end: bool = False, encode_velocity: bool = True, default_velocity: int = 64) → muspy.music.Music

Decode note-based representation into a Music object.
Parameters:
- array (ndarray) – Array in note-based representation to decode.
- resolution (int, default: muspy.DEFAULT_RESOLUTION (24)) – Time steps per quarter note.
- program (int, default: 0 (Acoustic Grand Piano)) – Program number, according to the General MIDI specification [1]. Valid values are 0 to 127.
- is_drum (bool, default: False) – Whether it is a percussion track.
- use_start_end (bool, default: False) – Whether 'start' and 'end' encode the timing rather than 'time' and 'duration'.
- encode_velocity (bool, default: True) – Whether the array encodes note velocities.
- default_velocity (int, default: muspy.DEFAULT_VELOCITY (64)) – Default velocity value to use when decoding. Only used when encode_velocity is False.
Returns: Decoded Music object.
Return type: muspy.Music

References
[1] https://www.midi.org/specifications/item/gm-level-1-sound-set
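The inverse mapping can be sketched the same way. The helper below, from_note_array, is a hypothetical illustration of how the columns are interpreted when decoding, not the muspy implementation:

```python
import numpy as np

def from_note_array(array, use_start_end=False, encode_velocity=True,
                    default_velocity=64):
    """Sketch: rows of (time, pitch, duration[, velocity]) -> note tuples."""
    notes = []
    for row in np.asarray(array):
        time, pitch = int(row[0]), int(row[1])
        # With use_start_end, the third column holds the end time, not the
        # duration, so the duration is recovered as end - start.
        duration = int(row[2]) - time if use_start_end else int(row[2])
        # When velocities were not encoded, fall back to default_velocity.
        velocity = int(row[3]) if encode_velocity else default_velocity
        notes.append((time, pitch, duration, velocity))
    return notes

print(from_note_array(np.array([[0, 60, 4, 64], [4, 64, 8, 64]]),
                      use_start_end=True))
```

This is where default_velocity comes into play: a 3-column array carries no velocity information, so every decoded note receives the fallback value.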
class muspy.NoteRepresentationProcessor(use_start_end: bool = False, encode_velocity: bool = True, dtype: Union[numpy.dtype, type, str] = int, default_velocity: int = 64)

Note-based representation processor.

The note-based representation represents music as a sequence of (time, pitch, duration, velocity) tuples. For example, the note Note(time=0, duration=4, pitch=60, velocity=64) is encoded as the tuple (0, 60, 4, 64). The output shape is L * D, where L is the number of notes and D is 4 when encode_velocity is True and 3 otherwise. The values along the second dimension represent time, pitch, duration and velocity (the velocity column is dropped when encode_velocity is False).
Attributes:
- use_start_end (bool, default: False) – Whether to use 'start' and 'end' to encode the timing rather than 'time' and 'duration'.
- default_velocity (int, default: 64) – Default velocity value to use when decoding if encode_velocity is False.
decode(array: numpy.ndarray) → muspy.music.Music

Decode note-based representation into a Music object.

Parameters: array (ndarray) – Array in note-based representation to decode. Cast to integer if not of integer type.
Returns: Decoded Music object.
Return type: muspy.Music

See also: muspy.from_note_representation() – Return a Music object converted from note-based representation.
encode(music: muspy.music.Music) → numpy.ndarray

Encode a Music object into note-based representation.

Parameters: music (muspy.Music) – Music object to encode.
Returns: Encoded array in note-based representation.
Return type: ndarray (np.uint8)

See also: muspy.to_note_representation() – Convert a Music object into note-based representation.
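As a usage illustration of the processor pattern, encode and decode pair up as a roundtrip. The class below, NoteArrayCodec, is a simplified hypothetical stand-in that operates on plain note tuples rather than muspy.Music objects, just to show the intended symmetry of the two methods:

```python
import numpy as np

class NoteArrayCodec:
    """Hypothetical, simplified analogue of NoteRepresentationProcessor."""

    def __init__(self, use_start_end=False, encode_velocity=True,
                 dtype=int, default_velocity=64):
        self.use_start_end = use_start_end
        self.encode_velocity = encode_velocity
        self.dtype = dtype
        self.default_velocity = default_velocity

    def encode(self, notes):
        # notes: iterable of (time, pitch, duration, velocity) tuples.
        rows = []
        for time, pitch, duration, velocity in notes:
            third = time + duration if self.use_start_end else duration
            row = [time, pitch, third]
            if self.encode_velocity:
                row.append(velocity)
            rows.append(row)
        return np.asarray(rows, dtype=self.dtype)

    def decode(self, array):
        notes = []
        for row in np.asarray(array):
            time, pitch = int(row[0]), int(row[1])
            duration = (int(row[2]) - time if self.use_start_end
                        else int(row[2]))
            velocity = (int(row[3]) if self.encode_velocity
                        else self.default_velocity)
            notes.append((time, pitch, duration, velocity))
        return notes

# Encoding then decoding with the same settings recovers the input.
codec = NoteArrayCodec(use_start_end=True)
notes = [(0, 60, 4, 64), (4, 64, 4, 72)]
assert codec.decode(codec.encode(notes)) == notes
```

Holding the settings in one object guarantees that encode and decode agree on column meanings, which is the point of using a processor over the two free functions.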