An experimental work in Python, mainly using GNN and CNN for algorithmic composition.
This work is inspired by this paper and this paper.
The main idea of this work is to generate a melody with a GNN and to change its style with a CNN.
Taking the differences between a melody and an image into consideration, the idea mentioned above is only a tentative version.
This work is still under experiment. The final version will be committed once it is finished.
ABC notation, compared with MIDI, is more human-oriented than machine-oriented, so a preprocessing procedure is necessary. This part extracts useful information from the raw text of the data set by processing the prefixes and suffixes of each note.
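As a rough illustration of what such note-level processing involves (this is only a sketch, not the actual ABCParser.py code; the token syntax assumed here is the standard ABC core of accidental prefixes, a pitch letter, octave marks, and a duration suffix):

```python
import re

# Illustrative only: a simplified ABC note token has the shape
#   [accidental prefix][pitch letter][octave marks][duration suffix]
# e.g. "^c'2" is C sharp, one octave up, at twice the default length.
NOTE_RE = re.compile(r"(?P<prefix>[\^_=]*)"
                     r"(?P<pitch>[A-Ga-g])"
                     r"(?P<octave>[,']*)"
                     r"(?P<duration>\d*/?\d*)")

def parse_note(token):
    """Split an ABC note token into (pitch, duration) strings."""
    m = NOTE_RE.match(token)
    if m is None:
        return None
    pitch = m.group("prefix") + m.group("pitch") + m.group("octave")
    duration = m.group("duration") or "1"  # an empty suffix means the default length
    return pitch, duration

print(parse_note("^c'2"))  # ("^c'", "2")
print(parse_note("G,"))    # ("G,", "1")
```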
The main files in this part and their functions are as follows:
ABCParser.py : parses ABC notation files.
DataPreprocess.py : extracts the pitch and duration information from the data set.
GetPitchDurationData.py : gets the pitch and duration information and writes it to files.
GlobalConstant.py : defines the global constants used in this part.
Pitch.dat and Duration.dat have been uploaded. They contain the extracted information mentioned above, normalized to reduce redundancy in the representation. The normalization process is implemented in DataPreprocess.py.
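The exact normalization scheme is defined in DataPreprocess.py; as a purely hypothetical example of what such a step can look like, pitches could be mapped to integer indices and durations reduced to exact fractions of the default note length, so that equivalent spellings collapse to one value:

```python
from fractions import Fraction

# Hypothetical normalization -- the real scheme lives in DataPreprocess.py.
def normalize(pitches, durations):
    pitch_vocab = {}
    pitch_ids = [pitch_vocab.setdefault(p, len(pitch_vocab)) for p in pitches]
    norm_durations = []
    for d in durations:
        if d.startswith("/"):  # "/2" is ABC shorthand for "1/2"
            d = "1" + d
        norm_durations.append(Fraction(d))
    return pitch_ids, norm_durations, pitch_vocab

ids, durs, vocab = normalize(["^c'", "G,", "^c'"], ["2", "1/2", "2/4"])
print(ids)   # [0, 1, 0]
print(durs)  # [Fraction(2, 1), Fraction(1, 2), Fraction(1, 2)]
```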
This part generates new melodies with a neural network model. The network consists of two parts: the melody model and the rhythm model.
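The exact architecture is still under experiment; the sketch below only illustrates the two-part split, using Keras LSTM layers as placeholders rather than the GNN/CNN components mentioned above. PITCH_VOCAB, DURATION_VOCAB, and SEQ_LEN are assumed hyperparameters, not names from this repository.

```python
# Placeholder sketch of the melody-model / rhythm-model split.
from tensorflow.keras import layers, models

PITCH_VOCAB, DURATION_VOCAB, SEQ_LEN = 128, 32, 16  # assumed sizes

def build_sequence_model(vocab_size):
    """Predict the next symbol from a window of previous symbols."""
    model = models.Sequential([
        layers.Embedding(vocab_size, 64),
        layers.LSTM(128),
        layers.Dense(vocab_size, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model

melody_model = build_sequence_model(PITCH_VOCAB)     # next-pitch model
rhythm_model = build_sequence_model(DURATION_VOCAB)  # next-duration model
```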
The main files in this part and their functions are as follows:
GetData.py : builds the inputs of the melody model and the rhythm model from the pitch and duration data (a rough sketch of this kind of windowing follows the list).
MelodyGenerater.py : builds the models, trains, tests, and evaluates them, and then generates new melodies.
generaterTester.py : the current entry point of the work.
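GetData.py and MelodyGenerater.py themselves are not reproduced here; the sketch below is only an assumed illustration of the data flow they describe: sliding-window training pairs built from a normalized symbol sequence, and a sampling loop that grows a melody from a trained next-symbol model. make_windows, generate, seq_len, and the sampling strategy are illustrative names and choices, not code from this repository.

```python
import numpy as np

# Assumed illustration: turn a symbol sequence into (window, next-symbol) pairs.
def make_windows(sequence, seq_len=16):
    xs, ys = [], []
    for i in range(len(sequence) - seq_len):
        xs.append(sequence[i:i + seq_len])
        ys.append(sequence[i + seq_len])
    return np.array(xs), np.array(ys)

# Assumed illustration: feed the model its own predictions to extend a melody.
def generate(model, seed, length=64):
    melody = list(seed)
    for _ in range(length):
        window = np.array(melody[-len(seed):])[None, :]
        probs = model.predict(window, verbose=0)[0]
        probs = probs / probs.sum()  # guard against floating-point drift
        melody.append(int(np.random.choice(len(probs), p=probs)))
    return melody
```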
This part is very memory-intensive; optimization is in progress.