
Class MelodyRhythmConverter

Converts between a monophonic, quantized NoteSequence containing a melody and a Tensor representing only the rhythm of the melody.

The rhythm is represented as a [numSteps, 1]-shaped Tensor with 1 in the positions corresponding to steps with a note-on and 0 elsewhere.
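As an illustration of this encoding (not the library's internal implementation), the forward direction can be sketched as a plain function that marks each quantized note-on step with a 1; the resulting nested array has the same `[numSteps, 1]` shape as the Tensor described above:

```typescript
// Hypothetical sketch: build a [numSteps, 1]-shaped rhythm array from the
// quantized start steps of a melody's notes -- 1 at each note-on, 0 elsewhere.
function melodyToRhythm(noteStartSteps: number[], numSteps: number): number[][] {
  // Initialize every step to 0 (no note-on).
  const rhythm: number[][] = Array.from({length: numSteps}, () => [0]);
  for (const step of noteStartSteps) {
    if (step >= 0 && step < numSteps) {
      rhythm[step][0] = 1;  // mark a note-on at this step
    }
  }
  return rhythm;
}
```

In the real converter the result would be wrapped as a `tf.Tensor2D` (e.g. via `tf.tensor2d(rhythm, [numSteps, 1])`).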

Since the melody cannot be reconstructed from its rhythm alone, toNoteSequence returns a NoteSequence with drum hits at the note-on steps.
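The inverse direction can be sketched the same way. Pitch information is gone, so each note-on step simply becomes a one-step drum hit; the `DrumNote` interface and the default pitch of 42 (closed hi-hat) here are illustrative stand-ins, not the library's actual types or choice of drum:

```typescript
// Simplified stand-in for the fields of NoteSequence.INote used here.
interface DrumNote {
  pitch: number;
  isDrum: boolean;
  quantizedStartStep: number;
  quantizedEndStep: number;
}

// Hypothetical sketch of the rhythm -> NoteSequence direction: emit a
// one-step drum hit at every step whose rhythm value is 1.
function rhythmToDrumHits(rhythm: number[][], drumPitch = 42): DrumNote[] {
  const notes: DrumNote[] = [];
  rhythm.forEach((row, step) => {
    if (row[0] === 1) {
      notes.push({
        pitch: drumPitch,
        isDrum: true,
        quantizedStartStep: step,
        quantizedEndStep: step + 1,
      });
    }
  });
  return notes;
}
```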

Hierarchy

  • MelodyControlConverter
    • MelodyRhythmConverter

Index

Constructors

constructor

Properties

NUM_SPLITS

NUM_SPLITS: number = 0

SEGMENTED_BY_TRACK

SEGMENTED_BY_TRACK: boolean = false

depth

depth: number

endTensor

endTensor: tf.Tensor1D

ignorePolyphony

ignorePolyphony: boolean

maxPitch

maxPitch: number

melodyControl

melodyControl: MelodyControl

minPitch

minPitch: number

numSegments

numSegments: number

numSteps

numSteps: number

Methods

tensorSteps

  • tensorSteps(tensor: tf.Tensor2D): tf.Scalar

toNoteSequence

  • toNoteSequence(tensor: tf.Tensor2D, stepsPerQuarter?: number, qpm?: number): Promise<NoteSequence>

toTensor

  • toTensor(noteSequence: INoteSequence): tf.Tensor2D
