Sunday, August 2, 2009

Quicknet Activation Function in Hidden Layer

In the file QN_MLP_BunchFlVar.cc, the function QN_MLP_BunchFlVar::forward_bunch(size_t n_frames, const float* in, float* out) contains the following code:

// Check if we are doing things differently for the final layer.
if (cur_layer != n_layers - 1)
{
    // This is the intermediate layer non-linearity.
    qn_sigmoid_vf_vf(cur_layer_size, cur_layer_x, cur_layer_y);
}
else
{
    // This is the output layer non-linearity.
    switch(out_layer_type)
    {
    case QN_OUTPUT_SIGMOID:
    case QN_OUTPUT_SIGMOID_XENTROPY:
        qn_sigmoid_vf_vf(cur_layer_size, cur_layer_x, out);
        break;
    case QN_OUTPUT_SOFTMAX:
    {
        size_t i;
        float* layer_x_p = cur_layer_x;
        float* layer_y_p = out;

        for (i = 0; i < n_frames; i++)
        {
            qn_softmax_vf_vf(cur_layer_units, layer_x_p, layer_y_p);
            layer_x_p += cur_layer_units;
            layer_y_p += cur_layer_units;
        }
        break;
    }
    case QN_OUTPUT_LINEAR:
        qn_copy_vf_vf(cur_layer_size, cur_layer_x, out);
        break;
    case QN_OUTPUT_TANH:
        qn_tanh_vf_vf(cur_layer_size, cur_layer_x, out);
        break;
    default:
        assert(0);
    }
}
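Note that the softmax case loops over frames: a bunch holds n_frames frames of cur_layer_units activations each, and softmax must normalize within a single frame. As a rough illustration of what qn_softmax_vf_vf does per frame, here is a minimal sketch (for exposition only; QuickNet's actual vector routine may differ in details such as how it guards against overflow):

#include <cmath>
#include <cstddef>

// Sketch of a per-frame softmax: out[i] = exp(in[i]) / sum_j exp(in[j]).
// The max is subtracted first so that exp() cannot overflow.
static void softmax_sketch(size_t n, const float* in, float* out)
{
    float max = in[0];
    for (size_t i = 1; i < n; i++)
        if (in[i] > max)
            max = in[i];

    float sum = 0.0f;
    for (size_t i = 0; i < n; i++)
    {
        out[i] = std::exp(in[i] - max);
        sum += out[i];
    }
    for (size_t i = 0; i < n; i++)
        out[i] /= sum;
}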

As the first branch shows, in the QuickNet tools the activation function of every hidden layer of an MLP is fixed to sigmoid: qn_sigmoid_vf_vf() is applied unconditionally to all intermediate layers.
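For reference, qn_sigmoid_vf_vf() computes an elementwise logistic sigmoid over a vector of floats. A minimal sketch of such a routine (the real implementation in QuickNet's vector library may be vectorized) is:

#include <cmath>
#include <cstddef>

// Sketch of an elementwise logistic sigmoid: out[i] = 1 / (1 + exp(-in[i])).
static void sigmoid_sketch(size_t n, const float* in, float* out)
{
    for (size_t i = 0; i < n; i++)
        out[i] = 1.0f / (1.0f + std::exp(-in[i]));
}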

Only the output-layer activation function can be chosen by the user, via out_layer_type (sigmoid, sigmoid with cross-entropy, softmax, linear, or tanh); there is no corresponding option for the hidden layers.
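So if you want, say, tanh hidden units, you have to edit forward_bunch() yourself and recompile. For example, the intermediate-layer branch could call qn_tanh_vf_vf() (which already exists, since the output layer uses it) instead of qn_sigmoid_vf_vf():

// Hypothetical edit: replace the hard-coded sigmoid with tanh
// for the intermediate layers. (The backward pass would also need
// the matching tanh derivative for training to remain correct.)
if (cur_layer != n_layers - 1)
{
    // This is the intermediate layer non-linearity.
    qn_tanh_vf_vf(cur_layer_size, cur_layer_x, cur_layer_y);
}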
