


```python
def main():
    while True:
        naughty_processes = find_new_naughty_processes()
        for process in naughty_processes:
            hobble_process(process)  # parallelised somehow
        time.sleep(10)


def hobble_process(process):
    while True:
        try:
            os.kill(process.pid, signal.SIGSTOP)  # suspend the process
            time.sleep(0.2)                       # keep it suspended most of the time
            os.kill(process.pid, signal.SIGCONT)  # let it run again...
            time.sleep(0.01)                      # ...briefly
        except ProcessLookupError:
            return  # the process has exited, nothing left to hobble
```

Every 10 seconds, go fetch a list of "naughty" processes, and then "hobble" each one of them, by using OS signals to stop and start the process at short intervals. The naughty program still runs, but its execution is suspended for 95% of the time.

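To see the hobbling trick in isolation, here is a small, self-contained sketch (not part of the original program): it spawns a deliberately CPU-hungry child process and throttles it with SIGSTOP/SIGCONT. It assumes a Unix-like OS, and the 0.19s/0.01s durations are just one way of getting the roughly-95%-suspended duty cycle described above.

```python
import os
import signal
import subprocess
import sys
import time

# A stand-in "naughty" program: a child process that just burns CPU.
child = subprocess.Popen([sys.executable, "-c", "while True: pass"])

try:
    for _ in range(100):                    # hobble it for ~20 seconds
        os.kill(child.pid, signal.SIGSTOP)  # freeze the child completely
        time.sleep(0.19)
        os.kill(child.pid, signal.SIGCONT)  # let it run again...
        time.sleep(0.01)                    # ...for about 5% of each cycle
finally:
    child.kill()
    child.wait()
```

Watch the child in top while this runs and its CPU share should collapse to roughly 5%; the exact durations matter much less than the stop/run ratio.
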
The only question was: how to do the "parallelise somehow" part. Asyncio is the hot new thing in the world of Python async stuff, and this seemed like a good potential candidate - I have a fairly simple algorithm, and there are lots of places where I use "time.sleep", which are good places to give back control to some sort of event loop or task manager.

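To make "giving back control" concrete, here is a minimal sketch of two coroutines being interleaved by an event loop. It uses the newer async/await spelling rather than the "yield from" style in the code below, but the idea is the same: every time a coroutine sleeps via asyncio, the loop is free to run whichever other coroutine is ready.

```python
import asyncio


async def ticker(name, interval):
    for i in range(3):
        print(name, "tick", i)
        # This is the "give back control" moment: the coroutine suspends here,
        # and the event loop can run any other coroutine that is due.
        await asyncio.sleep(interval)


async def main():
    # Run both tickers concurrently on a single thread; their output interleaves.
    await asyncio.gather(ticker("fast", 0.1), ticker("slow", 0.25))


asyncio.run(main())
```

That is really all the event loop is doing here: acting as a scheduler that wakes each coroutine back up when its sleep (or its I/O) is done.
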
And sure enough, my first cut of the same code with asyncio was pleasingly similar to the normal procedural code - I just add a few "yield froms" to signify where each function can yield control back to the event loop, ready to be woken up again when there's something for it to do:

```python
def main():
    loop = asyncio.get_event_loop()
    loop.create_task(hobble_processes_forever())
    loop.run_forever()


def hobble_processes_forever():
    already_hobbled = set()
    while True:
        yield from hobble_current_processes(already_hobbled)
        yield from asyncio.sleep(2)


def hobble_current_processes(already_hobbled):
    pids = yield from get_naughty_pids()
    for pid in pids:
        if pid in already_hobbled:
            continue
        already_hobbled.add(pid)
        asyncio.get_event_loop().create_task(hobble_process(pid))


def hobble_process(pid):
    while True:
        os.kill(pid, signal.SIGSTOP)  # suspend the process
        yield from asyncio.sleep(0.2)
        os.kill(pid, signal.SIGCONT)  # let it run again...
        yield from asyncio.sleep(0.01)
```

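For reference, the "yield from" coroutine style has since been superseded by async/await (the generator-based support, asyncio.coroutine, was removed in Python 3.11). Here is a rough sketch of the same structure in the modern syntax - not the code above verbatim, and with get_naughty_pids stubbed out and the stop interval assumed:

```python
import asyncio
import os
import signal


async def get_naughty_pids():
    return []  # stub standing in for the real detection logic


async def hobble_process(pid):
    while True:
        os.kill(pid, signal.SIGSTOP)  # suspend the process
        await asyncio.sleep(0.2)      # keep it stopped ~95% of the time
        os.kill(pid, signal.SIGCONT)  # let it run again...
        await asyncio.sleep(0.01)     # ...briefly


async def hobble_current_processes(already_hobbled):
    pids = await get_naughty_pids()
    for pid in pids:
        if pid in already_hobbled:
            continue
        already_hobbled.add(pid)
        asyncio.create_task(hobble_process(pid))


async def hobble_processes_forever():
    already_hobbled = set()
    while True:
        await hobble_current_processes(already_hobbled)
        await asyncio.sleep(2)


if __name__ == "__main__":
    asyncio.run(hobble_processes_forever())
```

Structurally it is the same program: "yield from" becomes "await", the get_event_loop/run_forever dance becomes asyncio.run, and create_task still hangs one hobbling loop per naughty process off the event loop. (One modern wrinkle: you are supposed to keep a reference to tasks created with asyncio.create_task, or they can be garbage-collected before they finish.)
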
Feel free to skip this next section if you already know asyncio. What is a coroutine, I hear you ask? Or, at least, I hear some of you ask.