
Python asynchronous callbacks and generators


I'm trying to convert a synchronous library to use an internal asynchronous IO framework. I have several methods that look like this:

def foo():
  ....
  sync_call_1()   # synchronous blocking call
  ....
  sync_call_2()   # synchronous blocking call
  ....
  return bar

For each of the synchronous functions (sync_call_*), I have written a corresponding async function that takes a callback. E.g.

def async_call_1(callback=None):
  # do the I/O
  callback()

Now for the Python newbie question -- what's the easiest way to translate the existing methods to use these new async methods instead? That is, the method foo() above now needs to become:

def async_foo(callback):
  # Do the foo() stuff using async_call_*
  callback()

One obvious choice is to pass a callback into each async method which effectively "resumes" the calling foo() function, and then call the outer callback at the very end of the method. However, that makes the code brittle and ugly, and I would need to add a new callback for every call to an async_call_* method.
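To make that concrete, the chained-callback version would end up looking something like this (just a rough sketch of what I mean, not working code):

def async_foo(callback):
  # every async call needs its own "resume" function,
  # so the body of foo() gets chopped into pieces
  def resume_after_1():
    async_call_2(callback=resume_after_2)

  def resume_after_2():
    callback(bar)   # bar would be computed along the way

  async_call_1(callback=resume_after_1)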

Is there an easy way to do this using a Python idiom, such as a generator or coroutine?


UPDATE: take this with a grain of salt, as I'm out of touch with modern Python async developments, including gevent and asyncio, and don't actually have serious experience with async code.


There are 3 common approaches to thread-less async coding in Python:

  1. Callbacks - ugly but workable, Twisted does this well.

  2. Generators - nice but require all your code to follow the style.

  3. Use a Python implementation with real tasklets - Stackless (RIP) and greenlet.

Unfortunately, the whole program should ideally use one style, or things get complicated. If you are OK with your library exposing a fully synchronous interface, you are probably fine, but if you want several calls to your library to work in parallel, especially in parallel with other async code, then you need a common event "reactor" that can work with all the code.

So if you have (or expect the user to have) other async code in the application, adopting the same model is probably smart.

If you don't want to understand the whole mess, consider using bad old threads. They are also ugly, but work with everything else.

If you do want to understand how coroutines might help you - and how they might complicate your life - David Beazley's "A Curious Course on Coroutines and Concurrency" is good stuff.

Greenlets might actually be the cleanest way if you can use the extension. I don't have any experience with them, so I can't say much.
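Purely as a sketch of the idea (untested, and green_call is just a name I made up), wrapping one callback-style call so it looks blocking inside a greenlet would go roughly like this:

import greenlet

def green_call(async_func):
    # make a single callback-style call appear blocking inside a greenlet
    me = greenlet.getcurrent()
    # when the I/O finishes, the event loop's callback jumps back into us
    async_func(callback=me.switch)
    # until then, hand control back to the loop (our parent greenlet)
    me.parent.switch()

def foo():
    # reads just like the synchronous original
    green_call(async_call_1)
    green_call(async_call_2)
    # ... compute and return bar as before

# the event loop side would start it with something like:
#   greenlet.greenlet(foo).switch()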


There are several ways to multiplex tasks. We can't say which is best for your case without deeper knowledge of what you are doing. Probably the easiest and most universal way is to use threads. Take a look at this question for some ideas.
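For instance, pushing the existing synchronous foo() onto a worker thread and reporting back through a callback could look like this (a minimal sketch using only the standard library; run_async is my own name):

import threading

def run_async(func, callback, *args):
    # run the existing blocking function on a worker thread and
    # hand its result to a callback, without rewriting the function
    def worker():
        callback(func(*args))
    threading.Thread(target=worker).start()

# usage: foo() stays exactly as it is, blocking calls and all
run_async(foo, lambda bar: print("foo returned", bar))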


You need to make the function foo async as well. How about this approach?

@make_async
def foo(somearg, callback):
    # This function is now async. Expect a callback argument.
    ...

    # change 
    #       x = sync_call1(somearg, some_other_arg)
    # to the following:
    x = yield async_call1, somearg, some_other_arg
    ...

    # same transformation again
    y = yield async_call2, x
    ...

    # change
    #     return bar
    # to a callback call
    callback(bar)

And make_async can be defined like this:

def make_async(f):
    """Decorator to convert sync function to async
    using the above mentioned transformations"""
    def g(*a, **kw):
        async_call(f(*a, **kw))
    return g

def async_call(it, value=None):
    # This function is the core of async transformation.

    try: 
        # send the current value to the iterator and
        # expect function to call and args to pass to it
        x = it.send(value)
    except StopIteration:
        return

    func = x[0]
    args = list(x[1:])

    # define callback and append it to args
    # (assuming that callback is always the last argument)

    callback = lambda new_value: async_call(it, new_value)
    args.append(callback)

    func(*args)

CAUTION: I haven't tested this
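For what it's worth, here is how I would expect it to fit together end to end (also untested; the async_call1/async_call2 stubs below are placeholders that just invoke their callback immediately, standing in for real I/O):

def async_call1(somearg, some_other_arg, callback):
    # stub: pretend the I/O completed and produced a value
    callback(somearg + some_other_arg)

def async_call2(x, callback):
    callback(x * 2)

@make_async
def foo(somearg, callback):
    x = yield async_call1, somearg, 10
    y = yield async_call2, x
    callback(y)

foo(1, callback=print)   # prints 22 once both "calls" have completed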

