taskman: an initial prototype for a trio.Nursery-like per-task-scope-manager (maybe via a user defined single-yield-generator function?) 😎
#363
base: hilevel_serman
Conversation
tractor/trionics/_supervisor.py
Outdated
    return self.result


    class TaskHandle(Struct):
TODO: show a second example where the `cancel_scope_manager` generator wraps the trio internals listed as attrs into this compound type and then that is what's returned from the `.start_soon()` call?
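Something like this maybe? A minimal sketch only, with assumed names (`handle_yielding_scope_manager` and the shown `TaskHandle` fields are hypothetical, not the PR's actual attrs), of a single-yield generator that packs the per-task `trio` internals into a compound handle which `.start_soon()` would then hand back:

```python
from dataclasses import dataclass

import trio


@dataclass
class TaskHandle:
    # assumed fields for illustration; the PR's actual `TaskHandle(Struct)`
    # may carry different attrs.
    cancel_scope: trio.CancelScope
    nursery: trio.Nursery


def handle_yielding_scope_manager(nursery: trio.Nursery):
    # single-yield generator: pack the per-task trio internals into the
    # compound handle; whatever is yielded here is what `.start_soon()`
    # would hand back to the caller.
    handle = TaskHandle(
        cancel_scope=trio.CancelScope(),
        nursery=nursery,
    )
    outcome = yield handle  # resumed with the task's outcome once it exits
```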
    scope_manager: ContextManager | None = None

    async def start_soon(
I don't really know what the best API would be for doing this with `.start()`?
- you don't really need `.start()` since you can always implement it by getting the child task to `Event.set()`, or even further use this `MaybeOutcome` thing from above.
- how do you get the `cancel_scope_manager` yielded output in that case, since `.start()` would normally return whatever the child passed to `status.started()`?
  - i.e. you can't do `started = await n.start(blah)` and expect `started` to be the `outcome, cs` stuff
- maybe the child task could get access to its own cancel scope "stuff" from somewhere and then choose to `status.started((outcome, cs))` it back to the caller? (a rough sketch of this last idea is just below)
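A self-contained sketch of that idea using plain `trio` only (the `child` task and its body are assumed examples, not the PR's code): the child reports its own cancel scope back through `.started()` so the `.start()` caller receives it:

```python
import trio


async def child(
    task_status: trio.TaskStatus = trio.TASK_STATUS_IGNORED,
):
    # the child allocates its own cancel scope "stuff" and reports it
    # back through `.started()`, so the `.start()` caller receives it.
    cs = trio.CancelScope()
    with cs:
        task_status.started(cs)
        await trio.sleep_forever()


async def main():
    async with trio.open_nursery() as n:
        cs = await n.start(child)  # the "started" value is the child's scope
        cs.cancel()                # cancel just that child


trio.run(main)
```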
Huh, i guess we could just offer some kinda support for hooking into the started value returned by the task? that way the `cancel_scope_manager` could decide what to send back to the `.start()` caller, with the default being the value that the child `.started()`?
huh, likely you could implement that as a double yield generator then?
I dunno what would be best for enforcing that other than decorator params and then `async_fn` checking inside `.start()` against that i guess?
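Purely hypothetical sketch of what such a double-yield protocol could look like (nothing here is actual `trio` or PR API; names are made up): the first yield hands out the per-task scope, the second lets the manager rewrite what `.start()` returns:

```python
import trio


def double_yield_scope_manager(nursery: trio.Nursery):
    cs = trio.CancelScope()
    # 1st yield: hand out the per-task scope; the runtime would resume us
    # with whatever the child passed to `.started()`.
    started_value = yield cs
    # 2nd yield: decide what `.start()` should return to the caller (default
    # would be the child's `.started()` value); resumed with the final outcome.
    outcome = yield (started_value, cs)
```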
tractor/trionics/_supervisor.py
Outdated
    val: str = 'yoyoyo'
    val_outcome, cs = await sn.start_soon(sleep_then_return_val, val)
    res = await val_outcome.unwrap()
As already mentioned above in the comment on the method, probably a better name is warranted for this since you're actually first waiting for the outcome then unwrapping its underlying value/result.
tractor/trionics/_supervisor.py
Outdated
    assert res == val
    print(f'GOT EXPECTED TASK VALUE: {res}')

    print('WAITING FOR CRASH..')
scheduling `sleep_then_err()` results in an assertion error that is caught and handled in the `pdbp` REPL thanks to the `cancel_scope_manager` implementation which does this on std `Exception`s 😎
Note, you could also catch `trio.Cancelled` or wtv else by changing the except line 🏄🏼
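Roughly what that manager shape could look like as a single-yield context-manager; a hedged sketch only (the name is made up and stdlib `pdb.post_mortem()` stands in here for the PR's `pdbp` wiring):

```python
import pdb
from contextlib import contextmanager

import trio


@contextmanager
def crash_handling_scope_manager():
    cs = trio.CancelScope()
    try:
        with cs:
            # the wrapped task body runs "inside" this yield
            yield cs
    except Exception:
        # change this except-line to `trio.Cancelled` (or wtv else) to
        # catch other crash flavours.
        pdb.post_mortem()  # the PR wires in `pdbp` instead of stdlib pdb
        raise
```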
    async def _start_wrapped_in_scope(
        task_status: TaskStatus[
            tuple[CancelScope, Task]
wrong, this should be the type sig of whatever is yielded from the `self.scope_manager` obviously
tractor/trionics/_supervisor.py
Outdated
    # TODO: better way to concat the values delivered by the user
    # provided `.scope_manager` and the outcome?
    return tuple([maybe_outcome] + to_return)
yeah as mentioned in the TODO, don't love this concat-ing of the results from the `scope_manager()`, but i think this goes away if we instead implement our own generator as a `@trio.cancel_scope_manager` protocol since then the exact value tuple yielded can be returned directly?
force-pushed from bc2bd2d to a7af633
tractor/trionics/_supervisor.py
Outdated
    # TODO: this was working before?!
    # nonlocal to_return

    with cs:
Ahh wait, right.. so do we need to activate this here or can it be done in the user's `@task_scope_manager` thing?
tractor/trionics/_supervisor.py
Outdated
    # task = new_tasks.pop()

    n: Nursery = self._n
    cs = CancelScope()
Do we need this allocated here?
See comment(s) below.
force-pushed from e4d4434 to 6c089c0
trio.Nursery per-task-scope-management via a user defined single-yield-generator function 😎
force-pushed from 6c089c0 to 940e65f
tractor/trionics/_supervisor.py
Outdated
    return self.result


    class ScopePerTaskNursery(Struct):
Heh, guess we don't need much of a special name now?
`PerTaskHookableNursery` 😂
tractor/trionics/_supervisor.py
Outdated
    *args,

    name=None,
    scope_manager: ContextManager | None = None,
maybe `task_manager` is a better name?
    name=None,
    scope_manager: ContextManager | None = None,

) -> tuple[CancelScope, Task]:
needs to again be dynamically defined at runtime.. soo use the `ParamSpec` stuff?
tractor/trionics/_supervisor.py
Outdated
    await trio.sleep(0.6)
    print(
        'Cancelling and waiting on {err_outcome.lowlevel_task} '
lul, missing f-string
force-pushed from cda85a3 to b24f0d5
tractor/trionics/_supervisor.py
Outdated
    # TODO: define a decorator to runtime type check that this a generator
    # with a single yield that also delivers a value (of some std type) from
    # the yield expression?
    # @trio.task_scope_manager
maybe just `@trio.task_manager`?
tractor/trionics/_supervisor.py
Outdated
    def add_task_handle_and_crash_handling(
        nursery: Nursery,

        # TODO: is this the only way we can have a per-task scope
no, we don't need the nursery to allocate any cancel scope per task internally!
need to remove this!
force-pushed from 872dcd7 to c7e27ad
Namely distinguishing service "IPC contexts" (opened in a
subactor via a `Portal`) from just local `trio.Task`s started
and managed under the `.service_n` (more or less wrapping in the
interface of a "task-manager" style nursery - aka a one-cancels-one
supervision start).
API changes from original (`piker`) impl,
- mk `.start_service_task()` do ONLY that, start a task with a wrapping
cancel-scope and completion event.
|_ ideally this gets factored-out/re-implemented using the
task-manager/OCO-style-nursery from GH #363.
- change what was the impl of `.start_service_task()` to `.start_service_ctx()`
since it more explicitly defines the functionality of entering
`Portal.open_context()` with a wrapping cs and completion event inside
a bg task (which syncs the ctx's lifetime with termination of the
remote actor runtime).
- factor out what was a `.start_service_ctx()` closure to a new
`_open_and_supervise_service_ctx()` mod-func holding the meat of
the supervision logic.
`ServiceMngr` API brief,
- use `open_service_mngr()` and `get_service_mngr()` to acquire the
actor-global singleton.
- `ServiceMngr.start_service()` and `.cancel_service()` which allow for
straightforward mgmt of "service subactor daemons".
force-pushed from c7e27ad to 23809b8
As was listed in the many todos, this changes the `.start_soon()` impl to instead (manually) `.send()` into the user defined `@task_scope_manager` an `Outcome` from the spawned task. In this case the task manager wraps that in a user defined (and renamed) `TaskOutcome` and delivers that + a containing `trio.CancelScope` to the `.start_soon()` caller. Here the user defined `TaskOutcome` defines a `.wait_for_result()` method that can be used to await the task's exit and handle its underlying returned value or raised error; the implementation could be different and subject to the user's own whims.

Note that by default, if this was added to `trio`'s core, the `@task_scope_manager` would simply be implemented as a `None`-yielding single-yield-generator, or more likely just entirely ignored by the runtime (as in no manual task outcome collecting, generator calling and sending done at all) if the user does not provide a `task_scope_manager` to the nursery at open time.
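For reference, a trimmed standalone sketch of that flow (assumed names, not the PR's actual code) using plain `trio` plus the `outcome` lib: the wrapper runs the user generator up to its single `yield`, hands the yielded value back to the `.start_soon()` caller, and `.send()`s the task's `Outcome` in when it exits:

```python
import trio
from outcome import Outcome, acapture


def task_scope_manager(nursery: trio.Nursery):
    # user hook: allocate a per-task cancel scope, yield it to the caller,
    # then receive the task's Outcome once it exits.
    cs = trio.CancelScope()
    outcome: Outcome = yield cs
    print(f'task completed with outcome: {outcome!r}')


async def start_soon(nursery: trio.Nursery, async_fn, *args):
    mngr = task_scope_manager(nursery)
    cs = next(mngr)  # run the generator up to its single yield

    async def _wrapped_in_scope():
        with cs:
            outcome = await acapture(async_fn, *args)
        try:
            mngr.send(outcome)  # deliver the Outcome to the user generator
        except StopIteration:
            pass

    nursery.start_soon(_wrapped_in_scope)
    # whatever the user generator yielded is handed straight back
    return cs


async def main():
    async with trio.open_nursery() as n:
        cs = await start_soon(n, trio.sleep, 0.1)
        assert isinstance(cs, trio.CancelScope)


trio.run(main)
```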
Turns out the nursery doesn't have to care about allocating a per-task `CancelScope` since the user can just do that in the `@task_scope_manager` if desired B) So just mask all the nursery cs allocating with the intention of removal.

Also add a test for per-task-cancellation by starting the crash task as a `trio.sleep_forever()` but then cancelling it via the user allocated cs and ensuring the crash propagates as expected 💥
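A tiny plain-`trio` illustration of that per-task-cancellation property (assumed example, not the PR's test): a user-allocated scope cancels just the one task while its siblings keep running:

```python
import trio


async def main():
    async with trio.open_nursery() as n:
        cs = trio.CancelScope()  # what the @task_scope_manager would allocate

        async def sleeper():
            with cs:
                await trio.sleep_forever()
            print('sleeper cancelled; nursery + siblings unaffected')

        async def sibling():
            await trio.sleep(0.2)
            print('sibling ran to completion')

        n.start_soon(sleeper)
        n.start_soon(sibling)
        await trio.sleep(0.1)
        cs.cancel()  # cancels only `sleeper`


trio.run(main)
```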
- drop unneeded (and commented) internal cs allocating bits.
- bypass all task manager stuff if no generator is provided by the caller; i.e. just call `.start_soon()` as normal.
- fix `Generator` typing.
- add some prints around task manager.
- wrap in `TaskOutcome.lowlevel_task: Task`.
Allows dynamically importing `pdbp` when enabled and a way for eventually linking with `tractor`'s own debug mode flag.
Since I'd like to decouple the new "task-manager-nursery" lowlevel primitives/abstractions from the higher-level `TaskManagerNursery`-supporting API(s) and default per-task supervision-strat, and because `._mngr` is already purposed for higher-level "on-top-of-nursery" patterns as it is.

Deats,
- move `maybe_open_nursery()` into the new mod.
- adjust the pkg-mod's import to the new sub-mod.
- also draft up this idea for an API which stacks `._beg.collapse_eg()` onto a nursery with the WIP name `open_loose_tn()` but more than likely i'll just discard this idea bc i think the explicit `@acm` stacking is more explicit/pythonic/up-front-grokable despite the extra LoC.
Namely just renaming any `trio.Nursery` instances to `tn`, the primary `@acm`-API to `.trionics.open_taskman()`, and changing out all `print()`s for logger instances with 'info' level enabled by the mod-script usage.
force-pushed from 23809b8 to 653f23a
As per some discussion in the `trio` gitter and starting something toward #22 and supporting our debugger crash mode from within a `trio`-compat task nursery.

Idea: a `trio.Nursery` API for optionally allowing a user to define a per-task manager

But why?

Well if you wanted any of the following (some of which are implemented in the module script in this patch):
- access to the `trio.Nursery` "internals" for any per-task oriented purpose.
- waiting on a task to finish and get its result.
- a way of debugging crashes, (unexpected) cancels or just generally tracing the `trio` per task control flow.
- supervision strategies like `one_for_one` from erlang/elixir and friends.
ToDo:
- offer a `task_manager` context-manager-as-generator API such that the current `TaskOutcome` struct is not tied to the internals of the `ScopePerTaskNursery.start_soon()` implementation.
  - `outcome: Outcome = yield (cs, ..)` is delivered to the user defined mngr via a `Outcome.send(mngr)` call such that only raw `trio` (dependency) internal types are provided to the user for wrapping / adjusting.
    - though note, can't just use `Outcome.send()` bc it would send the underlying value, not the outcome instance; not sure if that's best? (a tiny snippet at the end of this list illustrates the difference)
- way better naming since most of this was whipped up on the first try and getting good names is probably going to be tricky 😂
  - `@task_manager` is maybe better? `@task_scope_manager`?
  - `ScopePerTaskNursery` is terrible.. how about `TaskScopedNursery`? `TaskManagerNursery`?
  - `TaskOutcome.unwrap()` seems wrong, despite it being an example of a user defined handle.
    - renamed for now to `.wait_for_result()` which is a lot more pythonic in terms of plain-ol'-simple-semantics 🏄
    - maybe `.unwind()` is more descriptive? as in unwinding the outcome from the task would imply you both have to: do a function call and the subsequent popping of the stack ..?
      - that's kinda describing an imperative lang in terms of lower level fp abstractions which is not only confusing but a bit moot: we don't need to hear math about stuff we can't change in (python and) the semantics of a method.. :joy:
- test the waters in `trio` core for interest in this idea
  - already chatted with the `anyio` author and njs but don't think they understood what i was proposing (hence this proto 😂)
  - the hook point inside `trio`'s scheduler would be about here: https://github.com/python-trio/trio/blob/master/trio/_core/_run.py#L1601
    - since this is where the nursery-global cancel scope's `CancelStatus` is activated on the currently spawning task.
    - presumably we'd hook in around there and offer the generator-style `@cancel_scope_manager` thing, because we'll need to call `__enter__`/`__exit__` from two different `Runner` methods -> `.spawn_impl()` and `.task_exited()`
  - more likely no one will look at this PR (nor care?) and it'll get relegated to our `trionics` secret sauces pantry yet again 😂
- figure out if it's possible to swap in this `open_nursery()` for all uses of `trio.open_nursery()` when our users enable debug mode 😂
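Tiny snippet illustrating the `Outcome.send()` caveat mentioned in the ToDo list above (assumed minimal example): it delivers the unwrapped value (or throws the error) into the generator, not the `Outcome` instance itself:

```python
from outcome import Value


def mngr():
    got = yield
    print(f'manager received: {got!r}')


g = mngr()
next(g)  # advance to the bare yield
try:
    Value(42).send(g)  # the generator sees 42, not Value(42)
except StopIteration:
    pass
```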
Hopefully much more to come!