Show HN: RaveForce – An OpenAI Gym style toolkit for music generation experiments

RaveForce is a Python package that lets you define your musical task in Python with Glicol syntax, and train an agent to do the task with APIs similar to the OpenAI Gym.

Why RaveForce

It seems that music generation research has been dominated by MIDI generation or audio generation methods, with either supervised learning on a corpus or unsupervised learning with some sample library. But let's consider a simple example: you want to train an agent to play the synth sequencer for you. The goal is to reproduce a famous bass line. Therefore, at each step, the agent needs to make a decision on which note to play and what kind of timbre to produce. The agent can have an observation of what has been synthesised so far, and the reward is calculated by comparing the similarity at that moment.
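
As a rough sketch of that reward idea (this helper is not part of RaveForce; the function name and the spectral-difference measure are assumptions made for this post), the synthesised audio could be compared with the target like this:

import numpy as np
import librosa

def similarity_reward(synthesised, target):
    # compare magnitude spectrograms; 0 means identical, more negative means less similar
    n = min(len(synthesised), len(target))
    s = np.abs(librosa.stft(synthesised[:n]))
    t = np.abs(librosa.stft(target[:n]))
    return -float(np.mean(np.abs(s - t)))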

But it would be very complicated and time-consuming to build a real-world environment (such as a music robot) to cover all the needs of electronic music. Another possibility is to use some built-in Python functions to build our music tasks, but still, for each task, you would need to write some DSP function chains that may never be used again in practice.

A better approach is to find a common ground between our simulation and real-world music practices. Live coding is exactly such a culture, where the artist performs improvised algorithmic music by writing program code in real time.

Therefore, the overall architecture is (a minimal loop sketch follows the list):

Agent
-> modify the live coding code
-> the live coding engine does the non-real-time synthesis
-> get the reward, observation state, etc.
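
A minimal sketch of that loop in Gym-style Python (RandomAgent, act and learn are placeholder names invented for this post, not RaveForce APIs; env is a RaveForce environment created as shown later in the post):

class RandomAgent:
    """Placeholder policy: samples random actions from the env's action space."""
    def __init__(self, env):
        self.env = env
    def act(self, observation):
        return self.env.action_space.sample()
    def learn(self, observation, reward):
        pass  # a real agent would update its policy from the reward here

agent = RandomAgent(env)
observation = env.reset()
done = False
while not done:
    action = agent.act(observation)                      # tweak the live coding parameters
    observation, reward, done, info = env.step(action)   # non-real-time synthesis happens here
    agent.learn(observation, reward)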

This process may still involve some deep neural networks, as the synthesised audio is far more complicated to process than symbolic sequences.
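
For instance, a small 1-D convolutional network could map the raw audio observation to the synth parameters. The sketch below assumes PyTorch, and the layer sizes and names are arbitrary choices for illustration, not taken from the paper:

import torch
import torch.nn as nn

class AudioPolicy(nn.Module):
    """Maps a raw audio observation to n_params synth parameters in [0, 1]."""
    def __init__(self, n_params=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=32, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
            nn.Flatten(),
        )
        self.head = nn.Sequential(nn.Linear(32, n_params), nn.Sigmoid())

    def forward(self, audio):          # audio: (batch, samples)
        x = audio.unsqueeze(1)         # -> (batch, 1, samples)
        return self.head(self.features(x))

policy = AudioPolicy()
params = policy(torch.randn(1, 22050))  # e.g. one second of audio at 22.05 kHz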

Previously, SuperCollider was used for RaveForce. See the paper:

Lan, Qichao, Jim Tørresen, and Alexander Refsum Jensenius. “RaveForce: A Deep Reinforcement Learning Environment for Music Generation.” (2019).

@article{lan2019raveforce,
  title={RaveForce: A Deep Reinforcement Learning Environment for Music Generation},
  author={Lan, Qichao and Torresen, Jim and Jensenius, Alexander Refsum},
  year={2019}
}

Note that the implementation of this paper has been moved to the sc branch.

However, due to the speed limitation of non-real-time synthesis on the hard disk, we switched to Glicol.

Glicol is a new live coding language that can be accessed in the browser:

https://glicol.org

The syntax of Glicol is very similar to synths or sequencers, which perfectly fits our needs. Plus, Glicol is written in Rust and can be called from Python via WebAssembly (there are various methods, but wasm is used because it shares the same format with the Glicol JS bindings).
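
As a rough illustration of that syntax (this particular chain is made up for this post, not taken from the RaveForce examples), a Glicol graph reads like patching a synth: each line names a chain and >> pipes one node into the next:

~env: imp 1.0 >> envperc 0.01 0.1
lead: saw 220.0 >> mul ~env >> lpf 1000.0 1.0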

How to use RaveForce

Install

This is fairly easy:
pip install raveforce

Be familiar with Glicol syntax.

Visit the Glicol website to get familiar with its syntax and design:

https://glicol.org

Python

Since we are going to define our own musical task, we need to make some adjustments to the make method.

Let's consider the simplest example: just let the agent play for 1 step, tweaking the attack, decay and freq of a sine wave synth to simulate a kick drum.

import raveforce
import librosa

target, sr = librosa.load("YOUR_KICK_DRUM_SAMPLE", sr=None)
dur = len(target) / sr

env = gym.make(
    """
     ~env: imp 0.1>> envperc {} {}
    
    kick_drum: sin {}>> mul ~env
    """,
    total_step=1,
    step_len=dur,
    target=target,
    action_space=[
      ["lin", 0.0001, dur-0.0001], 
      ["rel", 0, lambda x: dur-0.0001-x], # connected to para 0
      ["exp", 10, 10000]
    ]
)

Then, use it as a usual Gym env:

observation = env.reset()
action = env.action_space.sample()
print(action)

observation, reward, done, info = env.step(action)
plt.plot(observation) # do your own import: matplotlib.pyplot as plt
print(reward, done, info)

In this case, after 2000 iterations, the rewards make it quite clear that a low attack and a low freq work best for simulating a kick drum, which is reasonable.

The result after 2000 iterations
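
As a sketch of what such a 2000-iteration run could look like, assuming a plain random search over the action space (an assumption for illustration, not necessarily how the result above was produced):

best_reward, best_action = float("-inf"), None
for i in range(2000):
    observation = env.reset()
    action = env.action_space.sample()              # random attack, release and freq
    observation, reward, done, info = env.step(action)
    if reward > best_reward:
        best_reward, best_action = reward, action   # keep the best-sounding parameters so far
print(best_reward, best_action)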

I also made …
