Reading in the weights…
# bastardized from https://aboveintelligent.com/face-recognition-with-keras-and-opencv-2baf2a83b799
import numpy as np
from scipy.io import loadmat

data = loadmat(weightsFilename, matlab_compatible=False, struct_as_record=False)
net = data['net'][0,0]
net = net.net_params[0,0]

# first conv layer: 8 filters of size 8x8 over a single input channel
tmp = net.layers[1,0][0,0]
weights = np.zeros((8, 8, 1, 8))
bias = np.zeros(8)
for k in range(8):
    # rotate each kernel 180 degrees before handing it to Keras
    weights[:, :, 0, k] = np.rot90(tmp.k[0,0][0,k], -2)
    bias[k] = tmp.b[0,k]
model.layers[1].set_weights([weights, bias])
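The np.rot90(…, -2) is a 180° kernel rotation. It's needed because MATLAB's conv2 performs true convolution (it flips the kernel), whereas Keras convolution layers actually compute cross-correlation. A quick SciPy sanity check of that identity (a standalone sketch, not part of the conversion script):

import numpy as np
from scipy.signal import convolve2d, correlate2d

x = np.random.rand(12, 12)   # toy input patch
k = np.random.rand(8, 8)     # toy kernel

# convolution equals correlation with the kernel rotated 180 degrees
assert np.allclose(convolve2d(x, k, mode='valid'),
                   correlate2d(x, np.rot90(k, 2), mode='valid'))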
The Keras Flatten() layer gave me some trouble. In MATLAB, the last layer was flattened via…
% concatenate all end layer feature maps into vector
net.fv = [];
for j = 1 : numel(net.layers{n}.a)
    sa = size(net.layers{n}.a{j});
    net.fv = [net.fv; reshape(net.layers{n}.a{j}, sa(1) * sa(2), sa(3))];
end
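The catch is element ordering: MATLAB's reshape walks arrays column-major, so the concatenated vector above is effectively a Fortran-order flatten of the (rows, cols, maps) feature stack, while Keras' Flatten() emits elements row-major (C order). A toy NumPy illustration of the mismatch (assuming nothing beyond NumPy itself):

import numpy as np

x = np.arange(12).reshape(2, 2, 3)  # tiny stand-in for a stack of feature maps
print(x.flatten())     # C order: what Keras Flatten() produces
print(x.flatten('F'))  # Fortran/column-major order: what the MATLAB code produces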
There is a final Dense layer that outputs 2 classes. I read the weights in from MATLAB and then mangle them to produce a MATLAB-faithful computation.
# Dense layer
weights = np.transpose(data['net'][0,0].net_params[0,0].ffW)
# permutation mapping Keras' C-order flatten onto MATLAB's column-major flatten
idx = np.arange(300).reshape((5, 5, 12)).flatten('F')
idx = np.argsort(idx)
weights = weights[idx, :]
bias = data['net'][0,0].net_params[0,0].ffb.flatten()
model.layers[8].set_weights([weights, bias])
Essentially, I mimic the forward mangling process. MATLAB is column-major, so I create a vector of 300 elements, 0 through 299 (my last conv layer output is 5x5x12), and flatten it using Fortran-style ordering. I then recover the reverse mapping via an argsort. Finally, I reorder the rows of the MATLAB weight matrix so that when Keras reorders via Flatten(), everything is in the same ordering as the MATLAB MatConvNet computation.
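A quick sanity check that the permutation does what it should, using random stand-ins for the last conv output and for ffW (a standalone sketch, not part of the conversion script):

import numpy as np

x = np.random.rand(5, 5, 12)  # stand-in for the last conv layer's output
W = np.random.rand(2, 300)    # stand-in for MATLAB's ffW

idx = np.argsort(np.arange(300).reshape((5, 5, 12)).flatten('F'))

matlab_out = W @ x.flatten('F')             # MATLAB: column-major flatten
keras_out = (W.T[idx, :]).T @ x.flatten()   # Keras: C-order flatten with permuted weights
assert np.allclose(matlab_out, keras_out)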
For reference, the MATLAB forward pass for the final output layer is:

net.o = sigm(net.ffW * net.fv + repmat(net.ffb, 1, size(net.fv, 2)));
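Keras' Dense layer computes sigmoid(x · W + b) with x as a row vector, which is why ffW is transposed above. A NumPy equivalent of the MATLAB line, with random stand-ins for the weights and feature vectors (sigm is the logistic sigmoid):

import numpy as np

def sigm(z):
    # logistic sigmoid, matching MATLAB's sigm
    return 1.0 / (1.0 + np.exp(-z))

ffW = np.random.rand(2, 300)  # stand-in for net.ffW
ffb = np.random.rand(2, 1)    # stand-in for net.ffb
fv = np.random.rand(300, 10)  # stand-in for net.fv (10 samples)

o = sigm(ffW @ fv + ffb)      # broadcasting replaces MATLAB's repmat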