Code for Link Prediction via Matrix Factorization, ECML '11
You can find MATLAB code for link prediction here. The included functions are:
- link_prediction_test_script.m, a sample script that demonstrates usage on a synthetic network
- factorisationSGDOptimiser.m, for optimisation of the classification loss
- factorisationSGDRankingOptimiser.m, for optimisation of the ranking loss
Example usage
The script link_prediction_test_script.m illustrates the use of both the classification and ranking losses, with and without side-information, on synthetically generated data. You should observe output such as the following:
>> link_prediction_test_script;
testing setting: standard loss, # node features = 0, # link features = 0
rmse of predicted vs true probabilities = 0.2205
optimal auc = 0.6959
predicted auc = 0.6524
...
testing setting: ranking loss, # node features = 4, # link features = 1
rmse of predicted vs true probabilities = 0.3257
optimal auc = 0.7260
predicted auc = 0.7126
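The reported metrics can be computed from the model's predictions as below; this is a hedged sketch using hypothetical variable names (p for predicted probabilities, pTrue for true probabilities, y for binary link labels), since the script computes these quantities internally.

```matlab
% RMSE between predicted and true link probabilities
rmse = sqrt(mean((p - pTrue).^2));

% AUC via the rank-sum (Wilcoxon-Mann-Whitney) formulation; ignores ties
[~, order] = sort(p);
ranks(order) = 1:numel(p);
nPos = sum(y == 1);
nNeg = sum(y == 0);
auc = (sum(ranks(y == 1)) - nPos * (nPos + 1) / 2) / (nPos * nNeg);
```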
Detailed description
For either function, you may simply pass [] for convergenceScoreTr and convergenceScoreTe and set the number of SGD epochs as appropriate; the same applies to computeObjective. These arguments were used for early debugging and are not essential.

Regarding the other arguments, weights is a structure holding the initial values for the parameters. Example initializations for its fields are listed below.
weights = [];
weights.U = randn(k, m); % for k latent features and m nodes
weights.UBias = randn(1, m);
weights.ULatentScaler = randn(k, k); % for asymmetric; for symmetric, use diag(randn(k, 1)) instead
weights.WPair = randn(1, dPair); % for dPair features for each pair
weights.WBias = randn;
weights.WBilinear = randn(1, dBilinear); % for dBilinear features for each node
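To see how these parameters fit together, the following is a hedged sketch of the link score for a pair of nodes (i, j), following the factorization model described in the paper; the indices i, j and the pair-feature vector zPair (1 x dPair) are hypothetical, and the WBilinear term on node features is handled analogously. See the included functions for the actual prediction code.

```matlab
% Latent interaction between nodes i and j, scaled by ULatentScaler
latent = weights.U(:, i)' * weights.ULatentScaler * weights.U(:, j);

% Side-information contribution from pair features, plus global bias
sideInfo = weights.WPair * zPair' + weights.WBias;

% Per-node biases plus the terms above give the raw score
score = latent + weights.UBias(i) + weights.UBias(j) + sideInfo;

% Sigmoid link function maps the score to a link probability
pLink = 1 / (1 + exp(-score));
```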
Similarly, lambda is a structure that has the regularization strengths for different parameters.
lambda = [];
lambda.lambdaLatent = 0.1; % regularization for node's latent vector U
lambda.lambdaRowBias = 0; % regularization for node's bias UBias
lambda.lambdaLatentScaler = 0; % regularization for scaling factors Lambda (in paper)
lambda.lambdaPair = 0; % regularization for weights on pair features
lambda.lambdaBilinear = 0; % regularization for weights on node features
lambda.lambdaScaler = 1; % scaling factor for regularization, can be set to 1 by default
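Putting the pieces together, an invocation might look as follows. The argument order shown here is an assumption for illustration only; consult link_prediction_test_script.m for the definitive signature and usage.

```matlab
% Hedged sketch: the argument list below is an assumption, not the
% verified signature -- see link_prediction_test_script.m.
numEpochs = 25; % number of SGD passes over the training links
weights = factorisationSGDOptimiser(A, weights, lambda, numEpochs, ...
    [], [], []); % pass [] for convergenceScoreTr/Te and computeObjective
```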