
[WIP] Fixing instrumentation for workflows #16290

Open
wants to merge 3 commits into base: main
Conversation

logan-markewich (Collaborator) commented Sep 29, 2024

Since workflows now return a Future directly, instrumentation logs the future instead of the actual result.

To remedy this, we need to handle returned futures in our span decorators.
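
For readers following along, here is a rough, self-contained sketch of that pattern, not the actual llama-index decorator: span_wrapped, span_exit, and span_drop are hypothetical placeholders, and only the Future-wrapping / add_done_callback idea mirrors what the PR does.

    import asyncio
    import functools

    def span_wrapped(span_exit, span_drop):
        # Hypothetical decorator factory: span_exit/span_drop stand in for
        # the dispatcher's span-exit and span-drop hooks.
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                result = func(*args, **kwargs)
                if isinstance(result, asyncio.Future):
                    # The call returned a pending Future, so defer span
                    # logging until it resolves instead of logging the Future.
                    new_future = asyncio.ensure_future(result)

                    def handle_future_result(future: asyncio.Future) -> None:
                        try:
                            span_exit(result=future.result())
                        except BaseException as err:
                            span_drop(err=err)

                    new_future.add_done_callback(handle_future_result)
                    return new_future
                # Plain (non-future) results can be logged right away.
                span_exit(result=result)
                return result
            return wrapper
        return decorator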

This fix seems to work, but I'm seeing the "error" below when using Arize (note that it seems benign, since things are still logged to Arize properly):

ERROR [opentelemetry.context] Failed to detach context
Traceback (most recent call last):
  File "/Users/loganmarkewich/Library/Caches/pypoetry/virtualenvs/llama-index-caVs7DDe-py3.11/lib/python3.11/site-packages/opentelemetry/context/__init__.py", line 154, in detach
    _RUNTIME_CONTEXT.detach(token)
  File "/Users/loganmarkewich/Library/Caches/pypoetry/virtualenvs/llama-index-caVs7DDe-py3.11/lib/python3.11/site-packages/opentelemetry/context/contextvars_context.py", line 50, in detach
    self._current_context.reset(token)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: <Token var=<ContextVar name='current_context' default={} at 0x2acb049f0> at 0x2cb890500> was created in a different Context

A bit more validation is needed.

Fixes #16283

dosubot bot added the size:M label (This PR changes 30-99 lines, ignoring generated files) on Sep 29, 2024
if isinstance(result, asyncio.Future):
    # If the result is a Future, wrap it
    new_future = asyncio.ensure_future(result)
    new_future.add_done_callback(
Contributor
This makes sense to me! I wonder where that Context is being spun up in the Arize error you shared.

Contributor
Thank you for the quick fix @logan-markewich! I just wanted to call out that the same error occurs with MLflow tracing (while the trace itself is logged correctly).

From my observation, the error comes from trying to reset the ContextVar with a token that was created in the main thread, while the reset happens in the async task. Arize and MLflow work fine nevertheless because neither relies on active_span_id. I think setting active_span_id inside handle_future_result should fix it (and it makes more sense for the async task to manage its own active span).
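
For illustration, a minimal standalone sketch of that failure mode (the names here are illustrative, not OpenTelemetry internals): each asyncio task runs in its own copy of the context, so a token created in one task's context cannot be reset from another. OpenTelemetry catches this during detach and logs it as "Failed to detach context", which is why it surfaces as a logged error rather than breaking the trace.

    import asyncio
    import contextvars

    # Stand-in for OpenTelemetry's current-context variable (illustrative only).
    current_context = contextvars.ContextVar("current_context", default={})

    async def main():
        # The token is created in the context of the main task.
        token = current_context.set({"span_id": "root"})

        async def detach_in_task():
            # Tasks run in a copy of the creating context, so resetting the
            # token here raises:
            #   ValueError: <Token ...> was created in a different Context
            current_context.reset(token)

        await asyncio.create_task(detach_in_task())

    asyncio.run(main())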

logan-markewich (Collaborator, Author)

@B-Step62 I don't think I can move the active_span_id into handle_future_result, since it's needed for span_enter, and handle_future_result only runs once the span ends 🤔

Labels
size:M This PR changes 30-99 lines, ignoring generated files.
Development

Successfully merging this pull request may close these issues.

[Bug]: Span dispatcher does not handle WorkflowHandler.pending correctly.
3 participants