radhen
September 20, 2020, 2:32pm
Dear @PeterOgden @rock @gnatale
How are you? I hope you are doing well.
Hi, I am doing data processing on a PYNQ-Z2. It works well until I increase the dataset size.
source_data = np.memmap('…/dataset/10MB.txt', dtype=np.uint8)
input_buffer = xlnk.cma_array(shape=source_data.shape, dtype=np.int32)
So with just a 10 MB text file as the dataset, it already gives an error message:
RuntimeError: Failed to allocate Memory!
Would you mind letting me know how to handle a big size dataset?
Thank you.
radhen
September 20, 2020, 2:51pm
I solved it already. It turns out I needed to enable memory overcommit, delete the buffers, and restart the Jupyter notebook kernel:
echo 1 > /proc/sys/vm/overcommit_memory
Hopefully this helps others who face this issue too.
Thank you.
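Another way to avoid hitting the contiguous-memory limit in the first place is to process the file in fixed-size chunks rather than allocating one buffer for the whole dataset. This is only a sketch using a plain NumPy array as a stand-in for the staging buffer; on the board you would allocate `chunk_buffer` with `xlnk.cma_array` instead, and the 1 MB chunk size is an arbitrary choice:

```python
import numpy as np

CHUNK = 1 << 20  # 1 MB of uint8 samples per pass (arbitrary choice)

def process_in_chunks(source_data, process):
    # Reuse one fixed-size staging buffer instead of one buffer per dataset.
    # On a PYNQ board this would be xlnk.cma_array((CHUNK,), dtype=np.int32).
    chunk_buffer = np.zeros(CHUNK, dtype=np.int32)
    results = []
    for start in range(0, len(source_data), CHUNK):
        part = source_data[start:start + CHUNK]
        chunk_buffer[:len(part)] = part          # widen uint8 -> int32
        results.append(process(chunk_buffer[:len(part)]))
    return results

# Example with an in-memory array standing in for np.memmap:
data = np.arange(3 * CHUNK // 2, dtype=np.uint8)  # 1.5 chunks of data
sums = process_in_chunks(data, lambda buf: int(buf.sum()))
```

This way only one `CHUNK`-sized region of contiguous memory is ever live, regardless of how large the memmapped file grows.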
rock
September 21, 2020, 3:49pm
You need to manually delete allocated buffers since they are not freed automatically. Glad you found the solution.
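Per the traceback below, `cma_array` registers the free function with `ffi.gc`, so the buffer is returned when the last Python reference to it disappears (e.g. after `del input_buffer`). A minimal stand-alone illustration of that reference-drop pattern, using `weakref.finalize` in place of `ffi.gc` so it runs anywhere:

```python
import weakref

freed = []  # records which buffers have been returned

class Buffer:
    """Toy stand-in for a CMA buffer with an ffi.gc-style finalizer."""
    def __init__(self, name):
        # Runs freed.append(name) when this object is garbage collected.
        weakref.finalize(self, freed.append, name)

buf = Buffer("input_buffer")
del buf  # last reference gone -> finalizer runs, memory returned
```

The practical consequence: a buffer still bound to a notebook variable is never freed, so `del` (or rebinding the name) is what actually releases the contiguous memory.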
@rock I tried a dataset larger than 10 MB, but it results in:
RuntimeError Traceback (most recent call last)
<ipython-input-6-51eb4e9cabfc> in <module>()
1 input_buffer = xlnk.cma_array(shape=source_data.shape, dtype=np.int32)
2 input_buffer[:] = source_data
----> 3 output_buffer = xlnk.cma_array(shape=source_data.shape, dtype=np.int32)
/usr/local/lib/python3.6/dist-packages/pynq/xlnk.py in cma_array(self, shape, dtype, cacheable, pointer, cache)
286 length = elements * dtype.itemsize
287 if pointer is None:
--> 288 raw_pointer = self.cma_alloc(length, cacheable=cacheable)
289 pointer = self.ffi.gc(raw_pointer, self.cma_free, size=length)
290 buffer = self.cma_get_buffer(pointer, length)
/usr/local/lib/python3.6/dist-packages/pynq/xlnk.py in cma_alloc(self, length, cacheable, data_type)
220 buf = self.libxlnk.cma_alloc(length, cacheable)
221 if buf == self.ffi.NULL:
--> 222 raise RuntimeError("Failed to allocate Memory!")
223 self.bufmap[buf] = length
224 return self.ffi.cast(data_type + "*", buf)
RuntimeError: Failed to allocate Memory!
what should I do?
rock
January 29, 2021, 6:24am
I guess you just need to be cautious not to exceed the contiguous-memory limit. On the PYNQ-Z2, I remember it is around a few hundred MB. So you need to free buffers promptly after using them. If you ever run into the situation above, you can call xlnk_reset() to free all contiguous memory and try again.
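That free-after-use discipline can be wrapped in a try/finally so a failure partway through never strands earlier buffers. A hedged sketch: `alloc`, `free`, and `reset_all` below are stand-ins for `xlnk.cma_array`, dropping the buffer reference, and `xlnk.xlnk_reset()` on the board; here a plain Python list tracks allocations so the pattern is runnable anywhere:

```python
class FakeCma:
    """Stand-in for pynq's Xlnk: tracks live 'CMA' buffers by length."""
    def __init__(self, limit):
        self.limit = limit      # bytes of contiguous memory available
        self.live = []          # lengths of outstanding buffers

    def alloc(self, length):
        if sum(self.live) + length > self.limit:
            raise RuntimeError("Failed to allocate Memory!")
        self.live.append(length)
        return length

    def free(self, buf):
        self.live.remove(buf)

    def reset_all(self):        # analogue of xlnk.xlnk_reset()
        self.live.clear()

cma = FakeCma(limit=100)
buf = cma.alloc(60)
try:
    pass  # ... use buf with the accelerator ...
finally:
    cma.free(buf)               # always return the buffer

buf2 = cma.alloc(80)            # the freed space is reusable
cma.reset_all()                 # escape hatch: drop every allocation
```

The reset is the blunt instrument for exactly the stuck state in the traceback above: everything is freed at once, at the cost of invalidating any buffers still in use.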