Reproducing Musicality: Emulating Human Musicality through Immediate Learning and Sequential Evolution

Author

Aran Samson

Date of Award

2021

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Computer Science

Department

Information Systems & Computer Science

First Advisor

Andrei D. Coronel, PhD

Abstract

Musicology is a growing focus in computer science. Past research has had success in automatically generating music through learning-based agents that make use of neural networks and through model- and rule-based approaches. These methods require a significant amount of information, either in the form of a large dataset for learning or a comprehensive set of rules based on musical concepts. This paper explores a model in which a minimal amount of musical information is needed to compose a desired style of music. This paper draws on two concepts: objectness and evolutionary computation. The concept of objectness, an idea directly derived from imagery and pattern recognition, was used to extract specific musical objects from single musical inputs, which are then used as the foundation to algorithmically produce musical pieces that are similar in style to the original inputs. These musical pieces are the product of evolutionary algorithms that implement a sequential evolution approach, wherein a generated output may or may not yet be fully within the fitness thresholds of the input pieces. This method eliminates the need for a large amount of pre-provided data as well as the long processing times commonly associated with machine-learned art pieces. This study aims to show a proof of concept of the implementation of the described model.
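To make the abstract's description concrete, the following is a minimal illustrative sketch (not the dissertation's actual implementation) of an evolutionary loop that works from a single input melody. All names and parameters here (INPUT_MELODY, the interval-based fitness measure, the threshold value) are assumptions chosen for illustration; the dissertation's objectness-based feature extraction and fitness functions are not specified in this abstract.

import random

# Hypothetical "musical object" extracted from a single input melody:
# here simply its pitch-interval profile, standing in for the
# objectness-based features described in the abstract.
INPUT_MELODY = [60, 62, 64, 65, 67, 65, 64, 62]  # MIDI pitches (example input)

def intervals(melody):
    """Return the sequence of pitch intervals between consecutive notes."""
    return [b - a for a, b in zip(melody, melody[1:])]

TARGET_INTERVALS = intervals(INPUT_MELODY)

def fitness(candidate):
    """Distance between a candidate's interval profile and the input's.
    Lower is better; 0 means the profiles match exactly."""
    return sum(abs(c - t) for c, t in zip(intervals(candidate), TARGET_INTERVALS))

def mutate(melody, rate=0.2):
    """Randomly shift some pitches by a small step."""
    return [p + random.choice([-2, -1, 1, 2]) if random.random() < rate else p
            for p in melody]

def evolve(generations=200, pop_size=30, threshold=2):
    # Start from random melodies in the same register as the input.
    population = [[random.randint(55, 72) for _ in INPUT_MELODY]
                  for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=fitness)
        best = population[0]
        # Sequential-evolution idea: intermediate outputs are usable even
        # before they fall within the fitness threshold of the input.
        if fitness(best) <= threshold:
            return best, gen
        # Keep the top half, refill with mutated copies of survivors.
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return min(population, key=fitness), generations

if __name__ == "__main__":
    melody, gens = evolve()
    print(f"Evolved melody after {gens} generations: {melody}")

Because the loop needs only the single input melody and no training corpus, it reflects the abstract's claim of composing in a given style from minimal pre-provided data.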

