Is ChMux supposed to change signal strength as well?

I am trying to "normalize" (not sure if that is the proper term for it) a
signal: flip a negative signal to positive if it is negative, and leave it
alone if it is already positive.

Example:

-0.04 to 0.04
0.23 to 0.23
-0.76 to 0.76

I attempted to do this by channelizing a signal together with a sign-flipped
version of it, then using the original as the selector.

Example:

X[N,1:1][ChMux,2:1,1:1][Ch,1:-1,1:1][*:-.04]

Now, the original charge was -.04, and it comes out positive in the end, but
the signal strength has changed to .0016. Is ChMux supposed to do that,
or is it a bug? If it is supposed to, what is the reason for it, and
what can I do to preserve the original charge strength?

Maciej Komosinski:

> X[N,1:1][ChMux,2:1,1:1][Ch,1:-1,1:1][*:-.04]

Two hints:

1. You can use the Threshold neuron, like [Thr,t:0,lo:1,hi:-1,-1:1]
to get two values on output depending on the sign of input.
2. If you want to have N react immediately, use [N,fo:1,in:0]
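The Thr behavior in hint 1 can be modeled roughly as follows (a Python sketch, not Framsticks code; the exact behavior at input == t is an assumption here):

```python
def thr(x, t=0.0, lo=1.0, hi=-1.0):
    """Rough model of the Thr neuron: output 'lo' below the
    threshold t, 'hi' at or above it (boundary handling assumed)."""
    return lo if x < t else hi

# With t=0, lo=1, hi=-1 the output encodes the sign of the input:
# negative input -> 1, non-negative input -> -1, which makes a clean
# two-valued selector for ChMux.
```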

> Now the original charge was -.04, and it comes out positive in the end, but
> the signal strength has changed to .0016. Is the ChMux supposed to do that,
> or is it a bug? If it is supposed to do that what is the reason for it ...

It is not ChMux that is responsible; it is your -.04
in the "*" neuron. -0.04 is not the output value here, but the
weight between "*" and N (which is added automatically). Thus
your input signal from "*" is multiplied by -0.04 BEFORE
it reaches "Ch".

When you use [*:-.04], two neurons are created: "*" with the
constant output value 1, connected to "N" with the input
weight -.04. Thus you get your desired -0.04 on the N's
output.

You got confused because, most probably, you tested your
network by changing the "*" output value (default 1). All your
changes were multiplied by the -0.04 weight on N's input.

To test your circuit, change the N's output value (AFTER
it is multiplied by the default -0.04), and you will
get what you want on N's output.

Good luck,

MacKo

ps. Strictly speaking, the "*" neuron produces a constant
output of 1. If you use [*:-.04], it is _equivalent_ to
[*][-1:-.04].
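A minimal numeric sketch of the point above (Python, modeling only the arithmetic of the "*" constant and the connection weight, not the actual Framsticks simulation):

```python
star_output = 1.0   # "*" always outputs a constant 1
weight = -0.04      # the ":-.04" part is the connection weight to N

# What N receives as its weighted input: 1 * -0.04 = -0.04.
n_input = star_output * weight

# Changing the "*" output with a neuro probe scales through the same
# weight: an overridden output of x reaches N as x * -0.04, which is
# why all probe tests appeared multiplied by -0.04.
```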

>1. You can use the Threshold neuron, like [Thr,t:0,lo:1,hi:-1,-1:1]
> to get two values on output depending on the sign of input.

I can't test at the moment because I'm at work, but this poses another
question. If there are 2 channels and the first input in a [ChMux] isn't
-1 or +1, do you get a portion of both channels added together as your
output? If so, that would be my problem and would explain the strange result.
In that case, using a [Thr] would fix it.

This is what I get when I test.

[*]
+1.0 <== Output
|
|
[N,-1:-0.04]
-0.04 <== OK so far
|
|
[Ch,-1,-1,-1,1]
Ch:0 +0.04, Ch:1 -0.04 <== OK so far
| |
| |
[ChMux,-2:1,-1:1]
+0.0016 <== I expected +0.04.

I assumed that when choosing between 2 channels in a [ChMux], any
negative signal as the selector would choose the first channel (ch:0) and
any positive signal would choose the second channel (ch:1). But I'm
beginning to think I was wrong in my assumptions.

It appears this is the case. It is working now that I used [Thr] on the
selector signal.


Szymon Ulatowski:

Eric Muirhead wrote:

> It appears this is the case. It is working now that I used [Thr] on the
> selector signal.

[...]
>>[ChMux,-2:1,-1:1]
>>+0.0016 <== I expected +0.04.
>>
>>I assumed that when choosing between 2 channels in a [ChMux], any
>>negative signal as the selector would choose the first channel (ch:0) and
>>any positive signal would choose the second channel (ch:1). But I'm
>>beginning to think I was wrong in my assumptions.

ChMux returns the weighted average signal calculated from two neighbor
channels, e.g.:
-1.0: channel #0
+1.0: channel #1
+0.5: 25% of channel #0 + 75% of channel #1

You can see the exact weighting formula in the scripting version of the
ChMux neuron, which can be found in your scripts_sample folder.
Look for:
[...]
var is1=Neuro.getWeightedInputStateChannel(1,i1);
var is2=Neuro.getWeightedInputStateChannel(1,i1+1);
Neuro.state = is1*w2+is2*w1;
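Assuming the selector maps linearly from -1 (channel 0) to +1 (channel 1), this weighting can be sketched in Python like so (a model of the blending rule, not the actual implementation; the clamping is an assumption):

```python
def chmux(selector, ch0, ch1):
    """Weighted average of two neighbor channels: selector -1 picks
    ch0, +1 picks ch1, and values in between blend linearly."""
    s = max(-1.0, min(1.0, selector))  # clamp to [-1, 1] (assumed)
    w1 = (s + 1.0) / 2.0               # share of channel 1
    w0 = 1.0 - w1                      # share of channel 0
    return ch0 * w0 + ch1 * w1

# The case from this thread: a selector of -0.04 with channels
# +0.04 / -0.04 blends 52% of +0.04 with 48% of -0.04,
# which yields the puzzling +0.0016.
```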

it was also posted here:
news://news.alife.pl/3EE520C1.4CA83488@poczta.onet.pl

sz.

Maciej Komosinski:

> I am trying to "normalize" (not sure if that is the proper term for it) a
> signal: flip a negative signal to positive if it is negative, and leave it
> alone if it is already positive.

Then you want to compute the absolute value of a signal.
Have a look at the "scripts/threshold.neuro" file. You can
easily modify it to get your own "Abs" neuron:

function go()
{
   Neuro.state = Neuro.weightedInputSum;
   if (Neuro.state < 0) Neuro.state = -Neuro.state;
}

Then save it as an "abs.neuro" file, change the neuron name
("Abs") and long name, and remove all properties (they are not
needed for Abs).

After you restart Framsticks, you will be able to
use your Abs neuron in genotypes, and simulate it
in neural circuits.

MacKo

Szymon Ulatowski:

> Then you want to compute an absolute value of a signal.
> Have a look at the "scripts/threshold.neuro" file. You can
> easily modify it to get your own "Abs" neuron:
>
> function go()
> {
>    Neuro.state = Neuro.weightedInputSum;
>    if (Neuro.state<0) Neuro.state=-Neuro.state;
> }

Correction: better to use a temporary variable instead of Neuro.state,
like this:

function go()
{
   var s = Neuro.weightedInputSum;
   if (s < 0.0) s = 0.0 - s; // 0.0-s because the unary minus is broken :-(
   Neuro.state = s;
}

Neuro.state yields the current output, and it is not the same value you
put into Neuro.state moments ago.
"Neuro.state=x;" means that Neuro.state WILL BE set to x during the next
step (unless you use the neuro probe to override it; in that case your "x"
would be ignored).

sz.

Hmm, interesting...

Szymon Ulatowski wrote:
> "Neuro.state=x;" means the Neuro.state WILL BE set to x during the next
> step

Does this mean that a neuron 'delays' the signal by one step?

Frans


Szymon Ulatowski:

Frans Verbaas wrote:
> Hmm, interesting...
> Szymon Ulatowski wrote:
>> "Neuro.state=x;" means the Neuro.state WILL BE set to x during the next
>> step
>
> Does this mean that a neuron 'delays' the signal by one step?

That depends on what 'delays' means.
Neurons are simulated synchronously (first calculate the output signals
for all neurons, then change the outputs for all neurons at the same
moment).
In the neuron {Neuro.state=Neuro.weightedInputSum;} there is one step of
propagation time between the input and the output, but there is no
additional delay, as one might infer from the fact that the Neuro.state
variable doesn't reflect the written value immediately.

sz.
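The synchronous scheme described above can be sketched as a toy Python model (not the Framsticks engine): each step first computes every neuron's next output from the current outputs, then commits all of them at once, so a value written in step N becomes visible to other neurons only in step N+1:

```python
def step(states, update_fns):
    """One synchronous step: compute all next states from the current
    ones, then commit them together (two-phase update)."""
    return [fn(states) for fn in update_fns]

# Two chained "pass-through" neurons: neuron 1 copies neuron 0's output.
fns = [lambda s: 1.0,    # neuron 0: constant source
       lambda s: s[0]]   # neuron 1: copies neuron 0's current output

states = [0.0, 0.0]
states = step(states, fns)  # after step 1: [1.0, 0.0]
states = step(states, fns)  # after step 2: [1.0, 1.0]
```

The signal takes one step per neuron to propagate, but there is no extra delay on top of that: reading the old value during the compute phase is exactly what makes the update synchronous.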