music 

I want to teach Framsticks to dance. I will need some music sensors.

For example, a high/low tune sensor and a loud/quiet sensor, with the
ability to score fast/slow motion when the music is loud/quiet and
high/low body position when the music is high/low.

Is it perhaps already possible?

Cheers,
Lukasz

Szymon Ulatowski:

> I want to teach Framsticks to dance. I will need some music sensors.
>
> For example, a high/low tune sensor and a loud/quiet sensor, with the
> ability to score fast/slow motion when the music is loud/quiet and
> high/low body position when the music is high/low.
>
> Is it perhaps already possible?

You can create such sensors by writing FramScript functions (see the
scripts/*.neuro files).
Replacing the standard fitness calculation with your own rules is also
possible.
The problem is how to pass real music into the Framsticks world (and
make the creatures listen to the input signal in real time through your
sound card or similar equipment). Currently it is not possible to do such
things in the Framsticks scripting system.
However, you can make the Framsticks hear "virtual" music ;-)
Let's define a new parameter in the experiment definition:
musicpattern="slhlhLHLHshshslsl"...
The music is encoded as a string using the following characters: s=silence,
h=high and quiet, H=high and loud, l=low and quiet, L=low and loud.
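The encoding above can be sketched in Python (Framsticks itself would use FramScript; the function name, the per-character step duration, and the numeric pitch/loudness codes are my own illustrative choices):

```python
# Hypothetical sketch of decoding the musicpattern string into a
# (pitch, loudness) pair for a given moment in time.
# Character meanings follow the encoding described above:
#   s = silence, h/H = high tune, l/L = low tune, uppercase = loud.

def decode_music(pattern, time, step_duration=1.0):
    """Return (pitch, loudness) for the pattern position at `time`.

    pitch:    -1 = low, 0 = silence, +1 = high
    loudness:  0 = quiet/silence,    1 = loud
    """
    if not pattern:
        return (0, 0)
    index = int(time / step_duration) % len(pattern)  # loop the pattern
    c = pattern[index]
    if c == 's':
        return (0, 0)
    pitch = 1 if c.lower() == 'h' else -1
    loudness = 1 if c.isupper() else 0
    return (pitch, loudness)

# Example with the pattern from the post:
pattern = "slhlhLHLHshshslsl"
print(decode_music(pattern, 0.0))  # 's' -> (0, 0)
print(decode_music(pattern, 5.0))  # 'L' -> (-1, 1)
```

A sensor neuron and the scoring code could both call one such decoder, so they always agree on what the "music" currently is.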
Now the sensors and the scoring system can return the proper values: they
will know the current pattern position from the current time (e.g. the
creature's age). The drawback is that you can't hear this music... unless you
provide the current pattern information to some external application which
would generate it for you. Unlike reading data into Framsticks, this one is
easy: you only need to emit the message and read it from the messages.out
file. Another possibility is to generate a WAV file directly from the music
pattern string and add it to the movie generated from the OpenGL export or
rendered by POV-Ray.
some hints on writing the script:
- neurons cannot access the ExpProperties data (well, they can, but the
neurons are compiled before the experiment definition is loaded, so it
doesn't work). There is a simple workaround: pass the music pattern into the
Creature.user field when the creature is born.
- the fitness calculation won't be easy... You need to determine whether the
creature's motion is fast/slow/high/low and compare it with the pattern. But
the actual values will vary across the population, so you can't tell whether
the current position or speed counts as "low" before knowing the positions
that follow... Maybe the right approach is to defer the actual scoring until
the creature dies. Before that you can store the current speed and height at
some intervals (i.e. put those values in an array).
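The deferred-scoring hint could be sketched like this (Python rather than FramScript; classifying each sample against the creature's own median, and the whole scoring rule, are my own assumptions about how "fast/slow" and "high/low" might be judged):

```python
from statistics import median

# Hypothetical sketch: record (speed, height) samples during the creature's
# life, one per pattern step, then score the whole dance at death by
# comparing each sample against the creature's own median speed and height.

def dance_score(samples, pattern):
    """samples: list of (speed, height) pairs, one per pattern character."""
    speeds = [s for s, _ in samples]
    heights = [h for _, h in samples]
    mid_speed, mid_height = median(speeds), median(heights)
    score = 0
    for (speed, height), c in zip(samples, pattern):
        if c == 's':
            score += 1 if speed <= mid_speed else 0  # stay still in silence
            continue
        want_fast = c.isupper()          # loud music  -> fast motion
        want_high = c.lower() == 'h'     # high tune   -> high body position
        score += (speed > mid_speed) == want_fast
        score += (height > mid_height) == want_high
    return score
```

Because the thresholds come from the creature's own recorded samples, no population-wide normalization is needed, which is one way around the "values vary across the population" problem.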

good luck! :-)

sz.

williamsharkey:

Music is a measure of intensity over time, like smell.

If you look at framsticks/scripts/foodcircle.script, it places food in a
circle. Perhaps it could be modified to play music.

A script could replace an apple every step, varying apple size to correspond
with a music wave file.
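The apple-size idea might be prepared like this (a Python sketch of my own; the actual food placement would still have to happen in a FramScript such as foodcircle.script, and the function name, step rate, and mono 16-bit WAV assumption are all mine):

```python
import struct
import wave

# Hypothetical sketch: turn a music WAV file into a per-step "apple size"
# sequence, using the peak loudness of each time slice. Assumes a mono,
# 16-bit WAV file.

def apple_sizes(filename, steps_per_second=4, max_size=1.0):
    with wave.open(filename, 'rb') as w:
        rate = w.getframerate()
        data = w.readframes(w.getnframes())
    samples = struct.unpack('<%dh' % (len(data) // 2), data)
    chunk = rate // steps_per_second
    sizes = []
    for i in range(0, len(samples) - chunk + 1, chunk):
        peak = max(abs(s) for s in samples[i:i + chunk])
        sizes.append(max_size * peak / 32767.0)  # loud chunk -> big apple
    return sizes
```

A script could then step through this sequence, replacing the apple each simulation step with one of the precomputed sizes.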

The Framsticks would listen to music with their noses. Bizarre.

I think it would be hard to evolve creatures that dance well because a human
would need to evaluate the dances.

Also, humans listen to frequencies, not displacement. So the creatures would
need to implement a Fourier transform. I bet you could do a Fourier
transform with a neural network, if you are ambitious enough.
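The frequency-versus-displacement point can be illustrated with a plain discrete Fourier transform (a Python sketch of mine, written for clarity rather than speed; an FFT would be used in practice):

```python
import cmath
import math

# Hypothetical sketch: a discrete Fourier transform turns a displacement
# signal into per-frequency magnitudes, which is what a "hearing" creature
# would actually need to sense high vs low tunes.

def dft_magnitudes(signal):
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# A pure tone at 4 cycles per window shows up as a peak in frequency bin 4.
n = 32
tone = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
mags = dft_magnitudes(tone)
print(max(range(n // 2), key=lambda k: mags[k]))  # -> 4
```

Comparing the energy in the low bins against the high bins would give exactly the high/low tune signal the original post asks for.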