Actually, the following is not an issue but a question.
The BNN-PYNQ samples provide a comparison between an FPGA-based BNN and a CPU-based one,
e.g. https://github.com/Xilinx/BNN-PYNQ/blob/master/notebooks/CNV-QNN_Cifar10.ipynb
However, I also want to compare a BNN against a basic CNN on the CPU.
So I first tried to restore cifar10-1w-1a.npz using the script below,
but the script raised the following error.
Could anyone please tell me how to fix this?
Do I need to pass some flags to lasagne.layers.set_all_param_values()?
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/kenichi/.local/lib/python2.7/site-packages/lasagne/layers/helper.py", line 522, in set_all_param_values
(p.get_value().shape, v.shape))
ValueError: mismatch: parameter has shape (64, 3, 3, 3) but value to set has shape (256, 128, 3, 3)
import numpy as np
import theano
import theano.tensor as T
import lasagne
import cnv
from collections import OrderedDict

# Hyper-parameters passed to cnv.genCnv()
learning_parameters = OrderedDict()
learning_parameters.alpha = .1
learning_parameters.epsilon = 1e-4
learning_parameters.W_LR_scale = "Glorot"
learning_parameters.activation_bits = 2
learning_parameters.weight_bits = 2

# Build the CNV network and try to load the pretrained parameters into it
input = T.tensor4('inputs')
cnn = cnv.genCnv(input, 10, learning_parameters)
npz = np.load("cifar10-1w-1a.npz")
lasagne.layers.set_all_param_values(cnn, [npz[v] for v in npz])
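For reference, here is a minimal, self-contained sketch (independent of BNN-PYNQ) of how the positional entries that np.savez creates can be read back in their saved order. I am not sure this is the cause of my mismatch, but iterating an .npz archive with "for v in npz" does not guarantee the arrays come back in the order they were passed to np.savez:

```python
import os
import tempfile

import numpy as np

# Three dummy arrays standing in for network parameters (shapes taken
# from the first parameters in the dump further below).
a = np.zeros((64, 3, 3, 3))
b = np.zeros((64,))
c = np.zeros((256, 128, 3, 3))

path = os.path.join(tempfile.gettempdir(), "params_demo.npz")
np.savez(path, a, b, c)  # stored under the keys arr_0, arr_1, arr_2

npz = np.load(path)
# Index explicitly by 'arr_<i>' instead of iterating the archive, so the
# arrays come back in the order they were passed to np.savez.
values = [npz["arr_%d" % i] for i in range(len(npz.files))]
print([v.shape for v in values])
```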
To check what shapes the parameters have, I modified quantized_net.py around line 300 as follows (the lines marked with a trailing "#" are mine) and ran cifar10.py.
arrays = lasagne.layers.get_all_param_values(model)#
print([v.shape for v in arrays])#
if save_path is not None:
    np.savez(save_path, *arrays)#
    npz = np.load(save_path)#
    print([npz[v].shape for v in npz])#
The output is shown below.
How can I explain the difference between the shapes returned by lasagne.layers.get_all_param_values(model) and the shapes read back from npz = np.load(save_path)?
…
Training...
[(64, 3, 3, 3), (64,), (64,), (64,), (64,), (64,), (64, 64, 3, 3), (64,), (64,), (64,), (64,), (64,), (128, 64, 3, 3), (128,), (128,), (128,), (128,), (128,), (128, 128, 3, 3), (128,), (128,), (128,), (128,), (128,), (256, 128, 3, 3), (256,), (256,), (256,), (256,), (256,), (256, 256, 3, 3), (256,), (256,), (256,), (256,), (256,), (256, 512), (512,), (512,), (512,), (512,), (512,), (512, 512), (512,), (512,), (512,), (512,), (512,), (512, 10), (), (), (), ()]
[(256, 128, 3, 3), (256,), (256,), (256,), (128,), (128,), (128,), (128,), (256,), (256,), (512,), (512,), (512,), (512,), (512, 512), (512,), (512,), (512,), (512, 10), (), (256,), (256,), (256,), (256, 256, 3, 3), (512,), (256, 512), (256,), (256,), (512,), (512,), (128,), (128, 128, 3, 3), (), (), (), (64,), (64,), (128,), (128, 64, 3, 3), (128,), (128,), (128,), (128,), (64,), (64, 3, 3, 3), (64,), (64,), (64,), (64,), (64,), (64, 64, 3, 3), (64,), (64,)]
Epoch 1 of 500 took 1454.11680388s
…
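My current guess (please correct me if I am wrong): np.savez(save_path, *arrays) stores the 53 arrays under the keys arr_0 … arr_52, and the scrambled second list looks like those keys coming back in some order other than numeric. A small sketch showing that even a plain lexicographic sort of those keys shuffles the parameters, while sorting on the numeric suffix restores the save order:

```python
# Keys as generated by np.savez for 53 positional arrays.
keys = ["arr_%d" % i for i in range(53)]

# A plain string sort interleaves arr_1, arr_10, arr_11, ... before arr_2,
# so any order-sensitive consumer (like set_all_param_values) would see
# the parameters shuffled.
print(sorted(keys)[:6])  # ['arr_0', 'arr_1', 'arr_10', 'arr_11', 'arr_12', 'arr_13']

# Sorting on the numeric suffix restores the original order.
ordered = sorted(keys, key=lambda k: int(k.split("_")[1]))
print(ordered[:6])  # ['arr_0', 'arr_1', 'arr_2', 'arr_3', 'arr_4', 'arr_5']
```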