restore cifar10-1w-1a.npz

The BNN-PYNQ sample provides a comparison between an FPGA-based BNN and a CPU-based one.

However, I also want to compare a BNN with a basic CNN on the CPU.
As a first step, I tried to restore cifar10-1w-1a.npz using the script below.

The script returned the following error.
Could anyone tell me how to fix it?
Do I need to pass any extra flags to lasagne.layers.set_all_param_values()?

Traceback (most recent call last):
File "", line 1, in
File "/Users/kenichi/.local/lib/python2.7/site-packages/lasagne/layers/", line 522, in set_all_param_values
(p.get_value().shape, v.shape))
ValueError: mismatch: parameter has shape (64, 3, 3, 3) but value to set has shape (256, 128, 3, 3)

import numpy as np
import theano
import theano.tensor as T
import lasagne
import cnv
from collections import OrderedDict

# Hyper-parameters expected by cnv.genCnv
learning_parameters = OrderedDict()
learning_parameters.alpha = .1
learning_parameters.epsilon = 1e-4
learning_parameters.W_LR_scale = "Glorot"
learning_parameters.activation_bits = 2
learning_parameters.weight_bits = 2

# Build the CNV network for 10 classes (CIFAR-10)
input = T.tensor4('inputs')
cnn = cnv.genCnv(input, 10, learning_parameters)

# Load the pretrained parameters and restore them into the network
npz = np.load("cifar10-1w-1a.npz")
lasagne.layers.set_all_param_values(cnn, [npz[v] for v in npz])
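One thing worth ruling out: iterating over an NpzFile (`for v in npz`) does not guarantee that the arrays come back in the order they were saved, which alone can produce a shape mismatch like the one above. A minimal sketch of a safer loading helper, assuming the archive uses NumPy's default "arr_0", "arr_1", ... keys (as np.savez produces):

```python
import numpy as np

def sorted_npz_arrays(npz):
    # np.savez stores unnamed arrays under keys "arr_0", "arr_1", ...
    # Iterating the NpzFile gives no ordering guarantee (and plain
    # string sorting puts "arr_10" before "arr_2"), so sort the keys
    # by their numeric suffix before restoring.
    keys = sorted(npz.files, key=lambda k: int(k.split('_')[-1]))
    return [npz[k] for k in keys]

# The returned list would then be passed to
# lasagne.layers.set_all_param_values(cnn, values)
```

If the archive uses named keys instead, the same idea applies with whatever ordering the saving script used.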

The error states there's a shape mismatch between a network parameter and the value you are trying to restore into it. I would make sure you are generating the parameters in the right way.
You can take a look at what's going on here. Your cnn argument to set_all_param_values() seems to be generated incorrectly. But frankly I cannot really help you beyond this, as my knowledge of this topic is limited.
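To narrow down where the generated network and the saved weights diverge, it can help to compare the two lists of shapes side by side. A minimal sketch (the comparison function below is a hypothetical helper, not part of Lasagne; `get_all_params` is the real Lasagne call):

```python
def first_shape_mismatch(expected_shapes, loaded_shapes):
    # Walk both shape lists in parallel and report the first
    # position where they disagree, or None if they all match.
    for i, (exp, got) in enumerate(zip(expected_shapes, loaded_shapes)):
        if tuple(exp) != tuple(got):
            return i, tuple(exp), tuple(got)
    return None

# Usage against the network and the archive would look like:
#   expected = [p.get_value().shape
#               for p in lasagne.layers.get_all_params(cnn)]
#   loaded = [npz[k].shape for k in npz]
#   print(first_shape_mismatch(expected, loaded))
```

If the mismatch is at index 0, as the traceback suggests, the network's very first layer differs from the saved model, which points at the genCnv arguments (e.g. the bit-width settings) or at the ordering of the loaded arrays rather than a corrupted file.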

I would recommend trying to get support directly from the team behind BNN-PYNQ, which is different from the PYNQ core team.
Try opening an issue on the BNN-PYNQ GitHub repository.

I posted an issue on the BNN-PYNQ GitHub repository.
