python - Splitting a string into chunks using multiprocessing


I am acquiring data from a microcontroller via UDP. Each packet is a hex string, and I need to split it into equal-sized chunks for processing afterwards. However, since the packets are relatively large (about 700 chars each), the time it takes to split one into chunks is longer than the time between packet arrivals. This introduces latency, and half of the data is lost. I cannot use TCP/IP, because the system needs to operate in real time. How can I multiprocess the following line (if it can be done)?

list_of_chunks = [packet[i:i+16] for i in range(0, len(packet), 16)]
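For reference, a literal multiprocessing version of that line could look like the minimal sketch below (illustrative only, using multiprocessing.Pool; note that each task has to pickle the whole packet, which is a large part of why, as the answer below argues, the overhead swamps the slicing itself):

from multiprocessing import Pool

def take_chunk(args):
    packet, i = args  # each worker slices out one 16-char chunk
    return packet[i:i + 16]

if __name__ == '__main__':
    packet = '0123456789abcdef' * 44  # dummy ~700-char hex string
    with Pool() as pool:
        # one task per chunk; the full packet is pickled to every worker
        list_of_chunks = pool.map(
            take_chunk, [(packet, i) for i in range(0, len(packet), 16)])
    print(len(list_of_chunks))  # 44 chunks of 16 chars each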

I doubt multiprocessing (or multithreading) will speed up what you want done; there is too much overhead. Instead, consider unwinding the loop. In this case you can write a script that creates the code for you.

Here's what I mean:

packet_size = 700
packet = [0] * packet_size  # dummy packet

#list_of_chunks = [packet[i:i+16] for i in range(0, len(packet), 16)]
list_of_chunks = ',\n    '.join('packet[{}:{}]'.format(i, i+16)
                                for i in range(0, len(packet), 16))
print('list_of_chunks = [\n    ' + list_of_chunks + ']')

Output:

list_of_chunks = [
    packet[0:16],
    packet[16:32],
    packet[32:48],
    packet[48:64],
    packet[64:80],
    packet[80:96],
        ...
    packet[624:640],
    packet[640:656],
    packet[656:672],
    packet[672:688],
    packet[688:704]]
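One way to actually use the generated source without pasting it by hand is to build it as a string, compile it once, and exec the code object per packet. This is a sketch under the assumption of a fixed 704-char packet; the names src, unrolled, and split are illustrative only:

n = 16
# build the unrolled "list_of_chunks = [...]" source once
src = ('list_of_chunks = [' +
       ', '.join('packet[{}:{}]'.format(i, i + n)
                 for i in range(0, 704, n)) + ']')
unrolled = compile(src, '<generated>', 'exec')  # compile once, reuse per packet

def split(packet):
    ns = {'packet': packet}
    exec(unrolled, ns)  # runs the unrolled slicing
    return ns['list_of_chunks']

chunks = split('0123456789abcdef' * 44)  # dummy 704-char hex packet
assert len(chunks) == 44 and all(len(c) == 16 for c in chunks)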
