
Frontiers of Information Technology & Electronic Engineering >> 2024, Volume 25, Issue 1 doi: 10.1631/FITEE.2300359

Style-conditioned music generation with Transformer-GANs

School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510600, China

Received: 2023-05-21 Accepted: 2024-02-19 Available online: 2024-02-19


Abstract

Recently, various algorithms have been developed for generating appealing music. However, style control during generation has been largely overlooked. Music style refers to the representative and distinctive character of a musical work, and it is one of music's most salient qualities. In this paper, we propose an algorithm that creates a complete musical composition from scratch, conditioned on a specified target style. The model introduces a linear Transformer and a patch discriminator. The linear Transformer models musical instrument digital interface (MIDI) event sequences and emphasizes the role of style information. Simultaneously, the patch discriminator applies an adversarial learning mechanism with two novel loss functions to strengthen the modeling of music sequences. Moreover, we establish, for the first time, a discriminative metric for evaluating the style consistency of generated music. Both objective and subjective evaluations of our experimental results indicate that our method outperforms state-of-the-art methods on publicly available datasets.
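To make the two architectural ideas in the abstract concrete, the sketch below illustrates (1) style conditioning, where a learned style embedding is prepended to the MIDI-event token embeddings so every generation step can attend to the target style, and (2) a patch discriminator, which scores every fixed-length window of the sequence rather than emitting a single real/fake score. This is a minimal numpy illustration under assumed names and sizes (`VOCAB`, `STYLES`, `EMB`, `PATCH`, random weights), not the authors' implementation or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, STYLES, EMB, PATCH = 128, 4, 16, 8  # illustrative sizes, not the paper's

# Style conditioning: a learned style embedding is prepended to the
# token embeddings of the MIDI event sequence.
style_table = rng.normal(size=(STYLES, EMB))  # hypothetical learned tables
token_table = rng.normal(size=(VOCAB, EMB))

def embed_with_style(tokens, style_id):
    """Return a (len(tokens)+1, EMB) matrix: style vector first, then tokens."""
    seq = token_table[np.asarray(tokens)]
    return np.vstack([style_table[style_id], seq])

# Patch discriminator: score every fixed-length patch of the embedded
# sequence and let the adversarial loss average over patches, giving the
# generator denser feedback than one whole-sequence score.
W = rng.normal(size=(PATCH * EMB,))  # stand-in for the discriminator's weights

def patch_scores(embedded):
    n = embedded.shape[0] - PATCH + 1
    return np.array([embedded[i:i + PATCH].reshape(-1) @ W for i in range(n)])

tokens = rng.integers(0, VOCAB, size=32)
emb = embed_with_style(tokens, style_id=2)
scores = patch_scores(emb)
assert emb.shape == (33, EMB)            # 32 tokens + 1 style vector
assert scores.shape == (33 - PATCH + 1,)  # one score per sliding patch
```

In the actual model the discriminator is trained adversarially against the Transformer generator; here the random projection only shows where the per-patch scores come from.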
