I have code that works just fine. I am capturing data from hardware and everything looks fine. Here is a snippet of my code:
for i in range(100):
    output_buffer = allocate(shape=(5,), dtype=np.uint32)
    dma.recvchannel.transfer(output_buffer)
    dma.recvchannel.wait()
    # .... and the rest of my code
So the above code works fine for grabbing 100 packets.
But then I needed to add an intentional delay inside the loop for my particular application, so I modified the code to something like this:
for i in range(100):
    time.sleep(0.05)
    output_buffer = allocate(shape=(5,), dtype=np.uint32)
    dma.recvchannel.transfer(output_buffer)
    dma.recvchannel.wait()
    # .... and the rest of my code
But then the code errors out with the following error message:
  File "zcu111_data_server.py", line 71, in <module>
    dma1.recvchannel.wait()
  File "/usr/local/lib/python3.6/dist-packages/pynq/lib/dma.py", line 135, in wait
    raise RuntimeError('DMA channel not started')
RuntimeError: DMA channel not started
I don’t understand what it is about the “sleep()” call that throws off the DMA engine!
Because when I add the delay in a dummy fashion instead of using “sleep()”, I don’t get any error:
for i in range(100):
    # dummy delay
    a = 0
    for p in range(10000):
        a = a + p
    output_buffer = allocate(shape=(5,), dtype=np.uint32)
    dma.recvchannel.transfer(output_buffer)
    dma.recvchannel.wait()
    # .... and the rest of my code
The above code works just fine.
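One thing I noticed while debugging: the dummy loop almost certainly produces a much shorter delay than the 50 ms that sleep() does, so the two versions may not be as equivalent as they look (exact timings depend on the machine). Here is a quick sketch, runnable without any hardware, that compares the two delays:

```python
import time

def dummy_delay(n=10000):
    # Same busy-wait as in my loop above
    a = 0
    for p in range(n):
        a = a + p
    return a

# Time the busy-wait version
t0 = time.perf_counter()
dummy_delay()
dummy_ms = (time.perf_counter() - t0) * 1000

# Time the sleep() version
t0 = time.perf_counter()
time.sleep(0.05)
sleep_ms = (time.perf_counter() - t0) * 1000

print(f"dummy loop: {dummy_ms:.2f} ms, sleep(0.05): {sleep_ms:.2f} ms")
```

On my machine the dummy loop finishes in a small fraction of the 50 ms that sleep() blocks for, so the sleep() version gives the hardware much more time between transfers.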
So there is certainly something about the sleep() function that the DMA doesn’t like.
Any idea or suggestion as to why “sleep()” doesn’t work?