Sunday, February 27, 2011
[OpenGL] FreeType and FTGL for Xcode
1) Download the FreeType source from http://www.freetype.org and the FTGL source from http://sourceforge.net/projects/ftgl/
2) Extract both sources and open a Terminal, then navigate into each source folder
3) For each library, run the following steps in Terminal:
a) ./configure
b) make
4) Find the libfreetype.a library file; in my case it is under objs/.libs
5) Copy libfreetype.a and the include folder somewhere convenient; these are the files your apps will need
6) Similarly for FTGL, run ./configure and make, then find libftgl.a, which is under src/.libs/
7) Copy libftgl.a and the FTGL folder under src for future use.
For testing,
In Xcode, create a command-line app.
Copy the code from c-demo.c in FTGL's demo folder into the main.c of the newly created Xcode project.
Drag libftgl.a, libfreetype.a, the header folders FTGL and freetype, and ft2build.h into the project.
To compile the program we also need the "config.h" file from FTGL's folder, and the OpenGL and GLUT frameworks must be added.
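For reference, here is a trimmed-down sketch of such a test program using FTGL's C API (the same API c-demo.c uses). The font path is my own assumption and will almost certainly need changing on your machine:

```c
/* Minimal FTGL/GLUT test (a sketch, not the full c-demo.c). */
#include <stdlib.h>
#include <GLUT/glut.h>   /* OS X GLUT framework */
#include <FTGL/ftgl.h>

static FTGLfont *font = NULL;

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glRasterPos2f(-0.9f, 0.0f);  /* pixmap fonts render at the current raster position */
    ftglRenderFont(font, "Hello, FTGL!", FTGL_RENDER_ALL);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutInitWindowSize(640, 480);
    glutCreateWindow("FTGL test");

    /* Assumed font path; point this at any .ttf file on your system. */
    font = ftglCreatePixmapFont("/Library/Fonts/Arial.ttf");
    if (!font)
        return EXIT_FAILURE;
    ftglSetFontFaceSize(font, 48, 72);  /* 48pt at 72 dpi */

    glutDisplayFunc(display);
    glutMainLoop();
    return EXIT_SUCCESS;
}
```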
Finally, the result: (screenshot of the running demo)
Thursday, February 24, 2011
[misc] speech recognition in the real world - for fun
Sup Mee x
Suck my willy :P
Open The Door Pleasee :D
Going Upp :P
Tuesday, February 22, 2011
[HMM] The GCONST in HTK's HMM definition file
The GCONST value is a precomputed constant for the log likelihood of the Gaussian, namely the log of the normalizing term $(2\pi)^n |\boldsymbol{\Sigma}|$ that appears (under a square root) in the denominator of the Gaussian formula:

$$\mathcal{N}(\mathbf{o}; \boldsymbol{\mu}, \boldsymbol{\Sigma}) = \frac{1}{\sqrt{(2\pi)^n |\boldsymbol{\Sigma}|}} \exp\left( -\frac{1}{2} (\mathbf{o}-\boldsymbol{\mu})^{\top} \boldsymbol{\Sigma}^{-1} (\mathbf{o}-\boldsymbol{\mu}) \right)$$

The stored GCONST is:

$$\mathrm{GCONST} = \log\left( (2\pi)^n |\boldsymbol{\Sigma}| \right) = n \log(2\pi) + \log|\boldsymbol{\Sigma}|$$

where, for the diagonal covariance matrix, the determinant is just the product of the diagonal elements, so $\log|\boldsymbol{\Sigma}| = \sum_{i=1}^{n} \log \sigma_i^2$.
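As a sanity check, here is a small C sketch (my own illustration, not HTK code) that computes GCONST from the diagonal variances:

```c
#include <math.h>

/* Compute the HTK-style GCONST, n*log(2*pi) + sum_i log(sigma_i^2),
   for a diagonal-covariance Gaussian; var[] holds the n variances. */
double gconst(const double *var, int n)
{
    double g = n * log(2.0 * M_PI);
    for (int i = 0; i < n; ++i)
        g += log(var[i]);
    return g;
}
```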
Sunday, February 20, 2011
[MLSS] Singapore 2011
Machine Learning Summer School
13-17 June 2011, Singapore
The home page for the MLSS world series is http://www.mlss.cc/
Lecture notes for previous events: http://www.springerlink.com/content/lrh41y849xdh/
Thursday, February 17, 2011
[NN] A guide to recurrent neural networks and backpropagation
This document is a rather simple introduction to recurrent neural networks (RNNs), giving a basic understanding of how they work.
To model time series, we can use either feedforward networks or RNNs:
1) Adopt temporal windows on the input features for a feedforward NN, i.e. the tapped delay line memory in this paper
2) An RNN is inherently capable of modeling the temporal information in the input sequence due to its recurrent structure (see the sketch after this list)
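To make the recurrence concrete, here is a minimal C sketch of one Elman-style RNN forward step, h_t = tanh(Wxh·x_t + Whh·h_{t-1} + b); all names and sizes are illustrative, not taken from the paper:

```c
#include <math.h>

#define NX 3   /* input dimension (illustrative) */
#define NH 4   /* hidden dimension (illustrative) */

/* One forward step: the hidden state h carries past inputs forward in time. */
void rnn_step(const double Wxh[NH][NX], const double Whh[NH][NH],
              const double b[NH], const double x[NX],
              const double h_prev[NH], double h[NH])
{
    for (int i = 0; i < NH; ++i) {
        double a = b[i];
        for (int j = 0; j < NX; ++j) a += Wxh[i][j] * x[j];
        for (int j = 0; j < NH; ++j) a += Whh[i][j] * h_prev[j];
        h[i] = tanh(a);
    }
}
```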
One major concern is that a simple RNN only makes use of past input information, which may not be sufficient, especially for speech signals.
To incorporate more temporal information, a more complex system is required; the structure after unfolding through time is illustrated below:
(figure: the RNN unfolded through time)
Will this more complex system beat a feedforward NN of similar model complexity?
Another issue is that an RNN only utilizes historical information in the time series, while with windowing in a feedforward NN we can actually exploit both past and future information when predicting the current frame, as sketched below.
However, a feedforward NN processes the current input and the context information with the same set of model parameters, which may not be the best choice.
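Here is a small C sketch of such a tapped-delay-line window, concatenating the frames from t-K to t+K into one feedforward input vector (dimensions are illustrative):

```c
#define DIM 13   /* size of one feature frame, e.g. an MFCC vector (illustrative) */
#define K   4    /* context frames on each side (illustrative) */

/* Copy the 2K+1 frames around position t into win[(2*K+1)*DIM],
   clamping at the sequence boundaries. */
void make_window(const double *frames, int n_frames, int t, double *win)
{
    for (int c = -K; c <= K; ++c) {
        int s = t + c;
        if (s < 0) s = 0;
        if (s >= n_frames) s = n_frames - 1;
        for (int d = 0; d < DIM; ++d)
            win[(c + K) * DIM + d] = frames[s * DIM + d];
    }
}
```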
Maybe the first thing to try is simply to experiment with an RNN and see whether it gives any promising results.
Wednesday, February 16, 2011
Introduction To Bayesian Inference
In the process of working through his wonderful book, Pattern Recognition and Machine Learning
Probabilistic Models for Computational Linguistics
Time to learn more basic and fundamental material for my research