Sync from v0.13
examples/online_serving/opentelemetry/README.md (new file, 94 lines)

# Setup OpenTelemetry POC
1. Install the OpenTelemetry packages:

    ```bash
    pip install \
      'opentelemetry-sdk>=1.26.0,<1.27.0' \
      'opentelemetry-api>=1.26.0,<1.27.0' \
      'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
      'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'
    ```

1. Start Jaeger in a docker container:

    ```bash
    # From: https://www.jaegertracing.io/docs/1.57/getting-started/
    docker run --rm --name jaeger \
        -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
        -p 6831:6831/udp \
        -p 6832:6832/udp \
        -p 5778:5778 \
        -p 16686:16686 \
        -p 4317:4317 \
        -p 4318:4318 \
        -p 14250:14250 \
        -p 14268:14268 \
        -p 14269:14269 \
        -p 9411:9411 \
        jaegertracing/all-in-one:1.57
    ```

1. In a new shell, export the Jaeger container's IP address:

    ```bash
    export JAEGER_IP=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' jaeger)
    export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=grpc://$JAEGER_IP:4317
    ```

    Then set vLLM's service name for OpenTelemetry, enable insecure connections to Jaeger, and run vLLM:

    ```bash
    export OTEL_SERVICE_NAME="vllm-server"
    export OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
    vllm serve facebook/opt-125m --otlp-traces-endpoint="$OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"
    ```

1. In a new shell, send requests with trace context from a dummy client:

    ```bash
    export JAEGER_IP=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' jaeger)
    export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=grpc://$JAEGER_IP:4317
    export OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
    export OTEL_SERVICE_NAME="client-service"
    python dummy_client.py
    ```

1. Open the Jaeger web UI: <http://localhost:16686/>

    In the search pane, select the `vllm-server` service and hit `Find Traces`. You should get a list of traces, one for each request.

    

1. Clicking on a trace will show its spans and their tags. In this demo, each trace has two spans: one from the dummy client containing the prompt text, and one from vLLM containing metadata about the request.

    

## Exporter Protocol

OpenTelemetry supports either `grpc` or `http/protobuf` as the transport protocol for trace data in the exporter.
By default, `grpc` is used. To set `http/protobuf` as the protocol, configure the `OTEL_EXPORTER_OTLP_TRACES_PROTOCOL` environment variable as follows:

```bash
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=http/protobuf
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://$JAEGER_IP:4318/v1/traces
vllm serve facebook/opt-125m --otlp-traces-endpoint="$OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"
```

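As a side note on the two endpoint shapes used above: by convention, OTLP over gRPC defaults to port 4317 with a bare `scheme://host:port` endpoint, while `http/protobuf` defaults to port 4318 with a `/v1/traces` path. A small, purely illustrative sketch of that convention (`otlp_traces_endpoint` is a hypothetical helper, not part of any OpenTelemetry API):

```python
# Hypothetical helper illustrating the default OTLP endpoint conventions:
# gRPC uses port 4317 with no path; http/protobuf uses port 4318 plus /v1/traces.
def otlp_traces_endpoint(host: str, protocol: str = "grpc") -> str:
    if protocol == "grpc":
        return f"grpc://{host}:4317"
    if protocol == "http/protobuf":
        return f"http://{host}:4318/v1/traces"
    raise ValueError(f"unsupported protocol: {protocol}")

print(otlp_traces_endpoint("localhost"))                   # grpc://localhost:4317
print(otlp_traces_endpoint("localhost", "http/protobuf"))  # http://localhost:4318/v1/traces
```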
## Instrumentation of FastAPI

OpenTelemetry allows automatic instrumentation of FastAPI.

1. Install the instrumentation library:

    ```bash
    pip install opentelemetry-instrumentation-fastapi
    ```

1. Run vLLM with `opentelemetry-instrument`:

    ```bash
    opentelemetry-instrument vllm serve facebook/opt-125m
    ```

1. Send a request to vLLM and find its trace in Jaeger. It should contain spans from FastAPI.

    

examples/online_serving/opentelemetry/dummy_client.py (new file, 34 lines)

```python
# SPDX-License-Identifier: Apache-2.0
# SPDX-FileCopyrightText: Copyright contributors to the vLLM project

import requests
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from opentelemetry.trace import SpanKind, set_tracer_provider
from opentelemetry.trace.propagation.tracecontext import TraceContextTextMapPropagator

trace_provider = TracerProvider()
set_tracer_provider(trace_provider)

# Export spans both to the OTLP endpoint (Jaeger) and to the console.
trace_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace_provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))

tracer = trace_provider.get_tracer("dummy-client")

url = "http://localhost:8000/v1/completions"
with tracer.start_as_current_span("client-span", kind=SpanKind.CLIENT) as span:
    prompt = "San Francisco is a"
    span.set_attribute("prompt", prompt)
    # Propagate the current trace context to vLLM via the W3C `traceparent` header.
    headers = {}
    TraceContextTextMapPropagator().inject(headers)
    payload = {
        "model": "facebook/opt-125m",
        "prompt": prompt,
        "max_tokens": 10,
        "n": 3,
        "use_beam_search": True,
        "temperature": 0.0,
        # "stream": True,
    }
    response = requests.post(url, headers=headers, json=payload)
```
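The `inject` call in the client fills the `headers` dict with a W3C Trace Context `traceparent` header, which is how vLLM links its server span to the client span. A rough sketch of what that header looks like (format per the W3C Trace Context spec; the generator below is illustrative, not the OpenTelemetry implementation):

```python
import re
import secrets

def make_traceparent() -> str:
    """Build a W3C `traceparent` header: version-traceid-spanid-flags."""
    trace_id = secrets.token_hex(16)  # 32 hex chars
    span_id = secrets.token_hex(8)    # 16 hex chars
    return f"00-{trace_id}-{span_id}-01"  # trace-flags 01 means "sampled"

# The header the propagator emits matches this shape.
TRACEPARENT_RE = re.compile(r"^00-[0-9a-f]{32}-[0-9a-f]{16}-[0-9a-f]{2}$")
print(bool(TRACEPARENT_RE.match(make_traceparent())))  # True
```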