Matlab Interfaces

The matlab project implements mex interfaces to various components of the Darwin library. A number of Matlab scripts (in src/scripts) also facilitate interoperating with Darwin from Matlab. The mex routines demonstrated in this document include mexLearnClassifier, mexEvalClassifier, mexFactorGraphInference, mexMaxFlow, and mexKMeans.

The remainder of this document describes how to compile the mex interfaces and provides some example uses. Pre-compiled mex files can be downloaded from Code Downloads. Additional examples can be found in the projects/matlab/examples directory.

Compiling Mex Projects

Linux

Warning
On newer versions of Ubuntu with older versions of Matlab you may get compiler warnings about gcc version incompatibility. This can be fixed by replacing libstdc++.so.6 and libgcc_s.so.1 in the Matlab library directory, e.g.,
cd external/matlab/sys/os/glnx86
rm libstdc++*
rm libgcc*
ln -s /usr/lib/i386-linux-gnu/libstdc++.so.6
ln -s /usr/lib/i386-linux-gnu/libgcc_s.so.1
You may need to execute these commands with superuser (sudo) privileges.
Note
If you are having problems running Matlab with OpenCV, try starting Matlab with
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libfreetype.so.6 matlab &

Windows

Warning
Make sure you have compiled the Darwin libraries for the correct architecture. For example, if you are running 64-bit Matlab then you will need to compile the Darwin libraries using the x64 configuration.

Mac OS X

Examples

Multi-class Classification

The full suite of Darwin classifiers can be trained and evaluated from Matlab. The following example shows how to train a multi-class logistic classifier (drwnMultiClassLogistic) and then evaluate it (on the training set!).

% load iris dataset
[x, y] = iris_dataset;
x = x'; y = ([0, 1, 2] * y)';
% learn a classifier
options = struct('verbose', 1, 'method', 'drwnMultiClassLogistic');
classifier = mexLearnClassifier(x, y, [], options);
% evaluate classifier (on training data)
options = struct('verbose', 1);
[predictions, scores] = mexEvalClassifier(classifier, x, options);
accuracy = sum(predictions == y) / length(y)

The classifier is stored as an XML string and can be saved for later use. Sometimes it is useful to weight the training examples differently, e.g., so that every class contributes the same mass to the classifier. The following code snippet shows an example:

% weight training examples
numClasses = max(y) + 1;
w = zeros(size(y));
for k = 1:numClasses,
    indx = find(y == k - 1);
    w(indx) = 1.0 / length(indx);
end;
% learn a classifier
options = struct('verbose', 1, 'method', 'drwnMultiClassLogistic');
classifier = mexLearnClassifier(x, y, w, options);
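
As noted above, the classifier returned by mexLearnClassifier is an XML string, so it can be written to disk with standard Matlab file I/O and read back later. A minimal sketch (the file name is illustrative):

% save the learned classifier for later use
fid = fopen('classifier.xml', 'w');
fprintf(fid, '%s', classifier);
fclose(fid);

% ... later: load the classifier and evaluate it on data x
classifier = fileread('classifier.xml');
[predictions, scores] = mexEvalClassifier(classifier, x, struct('verbose', 1));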
Warning
In Darwin classes are labeled from 0 to K-1. A negative class means unknown and is ignored during training.
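
Since examples with a negative label are ignored during training, one simple way to exclude part of the data (for example, a held-out subset) is to relabel it as -1. A minimal sketch with illustrative indices:

% copy the labels and mark every tenth example as unknown (ignored during training)
yTrain = y;
yTrain(1:10:end) = -1;
classifier = mexLearnClassifier(x, yTrain, [], options);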

Sometimes it helps to normalize features to have zero mean and unit variance before training classifiers. You can do this automatically using the 'whiten' option. The normalization parameters are stored with the classifier and will be applied automatically when invoking the mexEvalClassifier function.
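
For example, assuming the 'whiten' option is passed as a flag in the options structure, training with automatic normalization might look like:

% learn a classifier with automatic feature whitening
% (assumption: 'whiten' takes a boolean flag)
options = struct('verbose', 1, 'method', 'drwnMultiClassLogistic', 'whiten', 1);
classifier = mexLearnClassifier(x, y, [], options);
% the normalization parameters are stored with the classifier, so no
% extra steps are needed when calling mexEvalClassifier
[predictions, scores] = mexEvalClassifier(classifier, x, struct('verbose', 1));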

If instead of learning a multi-class logistic classifier (drwnMultiClassLogistic) you want to learn a decision tree classifier (drwnDecisionTree) you can simply change the 'method' option. The 'set' option allows you to configure the internal parameters of the decision tree as the following code snippet shows:

% learn a decision tree classifier
options = struct('verbose', 1, 'method', 'drwnDecisionTree', ...
    'set', 'drwnDecisionTree maxDepth 5');
classifier = mexLearnClassifier(x, y, [], options);

Graphical Model MAP Inference

In order to perform inference in a graphical model we need to define the universe of variables and a factor graph over which inference will be performed. The universe is defined by a vector specifying the cardinality of each variable. For example,

universe = [3, 2, 2];

This defines a universe over three random variables, say, A, B, and C. The first random variable, A, is ternary, while the remaining two are binary.

Next we need to define a factor graph, which is a structure array specifying the set of variables in each factor and the value of the factor for every possible assignment to its variables. The following provides an example factor graph with two (random) factors defined over variables (A, B) and (B, C), respectively.

factors = [];
factors(1).vars = [0 1];
factors(1).data = rand(prod(universe([1 2])), 1);
factors(2).vars = [1 2];
factors(2).data = rand(prod(universe([2 3])), 1);

This defines the energy function $E(A, B, C) = \psi_1(A, B) + \psi_2(B, C)$. Darwin will try to minimize this energy function during inference.

Warning
In Darwin indexes are 0-based, while in Matlab they are 1-based. The first variable has id 0.

We can now run inference (energy minimization) over the factor graph. For example, to run max-product inference and display the resulting MAP assignment we have

options = struct('verbose', 1, 'method', 'drwnMaxProdInference');
assignment = mexFactorGraphInference(universe, factors, [], options);
disp(assignment);
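
Assuming the returned assignment follows the same 0-based convention as the rest of Darwin, add one before using it to index Matlab arrays, e.g.,

% convert to 1-based labels for Matlab-side indexing
matlabAssignment = assignment + 1;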

Suppose that we wish to implement a Markov chain over variables $X_1$ to $X_n$ where we want the labels in the chain to be monotonically non-decreasing. The following code shows how this can be achieved.

N = 10; % length of chain
L = 5;  % size of label space
universe = repmat(L, N, 1);
factors = [];
% unary factors (one per variable) with random costs
for i = 1:N,
    factors(end + 1).vars = i - 1;
    factors(end).data = rand(L, 1);
end;
% pairwise factors between consecutive variables: a large penalty is
% incurred whenever the label decreases along the chain
pairwiseCost = tril(1000 * (ones(L, L) - eye(L, L)));
for i = 1:N - 1,
    factors(end + 1).vars = [i - 1, i];
    factors(end).data = pairwiseCost(:);
end;
% edges between unary and pairwise factors (0-based factor indices)
edges = [0:N-2, 1:N-1; N:2*N-2, N:2*N-2]';
options = struct('verbose', 1, 'method', 'drwnMaxProdInference');
assignment = mexFactorGraphInference(universe, factors, edges, options);
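
As a quick sanity check, the pairwise penalties above make decreasing labels very expensive, so the returned MAP assignment should be monotonically non-decreasing; a short verification sketch:

% verify that the MAP labels never decrease along the chain
assert(all(diff(assignment(:)) >= 0));
disp(assignment);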

Max-Flow/Min-Cut

The mexMaxFlow function provides code for computing the minimum-cut in a directed weighted graph (see Max-flow/Min-cut).

The following example shows how to find the value of the minimum cut on a simple graph from Matlab.

(Figure: the example flow graph, with SOURCE and TARGET nodes and the edge capacities listed in the code below.)
% define edges: each row is [from-node, to-node, capacity]
edgeList = [
    -1 0 3; % SOURCE -> 0
    -1 2 3; % SOURCE -> 2
     0 1 4; % 0 -> 1
     1 2 1; % 1 -> 2
     1 3 2; % 1 -> 3
     2 3 2; % 2 -> 3
     2 4 6; % 2 -> 4
     3 -1 1; % 3 -> TARGET
     4 -1 9; % 4 -> TARGET
];
% run max-flow
[value] = mexMaxFlow(edgeList);
disp(['Value of minimum cut is ', num2str(value)]);
Note
A negative value in the first column denotes an edge from the source; a negative value in the second column denotes an edge to the target. Node indexing is 0-based.

k-Means

The mexKMeans function provides an interface to Darwin's fast k-means code. An example usage is given below.

% get the data (one feature vector per row); generateFeatureVectors is a
% placeholder for your own data loading code
x = generateFeatureVectors();
% run k-means
options = struct('verbose', 1, 'maxiters', 100);
centroids = mexKMeans(5, x, [], options);
% plot results (first two dimensions)
figure;
plot(x(:, 1), x(:, 2), 'b.');
hold on;
plot(centroids(:, 1), centroids(:, 2), 'rx', 'MarkerSize', 5, 'LineWidth', 2);
hold off;
title('Darwin k-Means Example');
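
Since the example above plots centroids(:, 1) against centroids(:, 2), the centroids appear to be returned one per row (matching the row-wise features). Under that assumption, each point can be assigned to its nearest centroid directly in Matlab, for example to colour the plot by cluster; a sketch:

% assign each point to its nearest centroid (squared Euclidean distance)
d = zeros(size(x, 1), size(centroids, 1));
for k = 1:size(centroids, 1),
    d(:, k) = sum(bsxfun(@minus, x, centroids(k, :)).^2, 2);
end;
[~, labels] = min(d, [], 2);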