vllm.entrypoints.anthropic.api_server

parser  module-attribute

parser = FlexibleArgumentParser(
    description="vLLM Anthropic-Compatible RESTful API server."
)
build_app

build_app(args: Namespace) -> FastAPI
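
A minimal sketch of how the app factory might be exercised. Which flags the module-level parser accepts is not shown on this page, so the example leaves them to the command line; treat the flag names in the comment as assumptions.

```python
# Hedged sketch: construct the FastAPI app from parsed CLI arguments and
# inspect the routes it exposes.
from vllm.entrypoints.anthropic.api_server import build_app, parser

args = parser.parse_args()          # e.g. --model, --port (assumed flags)
app = build_app(args)               # -> fastapi.FastAPI

for route in app.routes:
    print(getattr(route, "path", route))
```
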
create_messages  async

create_messages(
    request: AnthropicMessagesRequest, raw_request: Request
)
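
create_messages is the handler behind the Anthropic-style messages endpoint. A hedged client-side sketch, assuming a locally running server on port 8000 and the standard Anthropic Messages route (/v1/messages); the model name is an assumption.

```python
# Hedged client sketch: exercise the Anthropic-compatible messages endpoint.
# The URL, port, and model name are assumptions; adjust to your deployment.
import requests

resp = requests.post(
    "http://localhost:8000/v1/messages",
    json={
        "model": "facebook/opt-125m",
        "max_tokens": 128,
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
resp.raise_for_status()
print(resp.json())
```
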
engine_client

engine_client(request: Request) -> EngineClient

health  async
init_app_state  async

init_app_state(
    engine_client: EngineClient,
    state: State,
    args: Namespace,
) -> None
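
init_app_state wires the serving objects onto the FastAPI application state before the server starts handling traffic. A hedged startup sketch, assuming an EngineClient is provided by the caller (how it is built is elided here) and that the parsed args carry a port value; those details are assumptions.

```python
# Hedged startup sketch: attach server state to the app, then serve it.
# engine_client is assumed to be an already-constructed EngineClient.
import uvicorn

from vllm.entrypoints.anthropic.api_server import (
    build_app,
    init_app_state,
    parser,
)


async def serve(engine_client) -> None:
    args = parser.parse_args()
    app = build_app(args)
    await init_app_state(engine_client, app.state, args)
    config = uvicorn.Config(app, host="0.0.0.0", port=args.port)  # args.port is assumed
    await uvicorn.Server(config).serve()
```
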
messages

messages(request: Request) -> AnthropicServingMessages
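
engine_client and messages (above) look like per-request accessors. A hedged sketch of how a custom route added to the built app might use them; that they resolve objects stored by init_app_state is an assumption based on the signatures alone.

```python
# Hedged sketch: use the accessors from a custom FastAPI route handler.
from fastapi import Request

from vllm.entrypoints.anthropic.api_server import engine_client, messages


async def debug_info(raw_request: Request) -> dict:
    client = engine_client(raw_request)   # -> EngineClient
    serving = messages(raw_request)       # -> AnthropicServingMessages
    return {"engine": type(client).__name__, "serving": type(serving).__name__}
```
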
ping  async

Ping check. Endpoint required for SageMaker.
run_server  async

Run a single-worker API server.
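
A hedged launch sketch; run_server is documented as async above, but its signature is not shown on this page, so passing the parsed args is an assumption mirroring the CLI.

```python
# Hedged sketch: launch the Anthropic-compatible server from Python.
# run_server(args) is an assumed signature.
import asyncio

from vllm.entrypoints.anthropic.api_server import parser, run_server

if __name__ == "__main__":
    args = parser.parse_args()      # e.g. python my_launcher.py --model <model>
    asyncio.run(run_server(args))
```
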
run_server_worker  async

Run a single API server worker.
setup_server

Validate API server args, set up signal handler, create socket ready to serve.
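
An illustrative sketch of the pattern that description names, not vLLM's implementation: bind a listening socket up front and install signal handlers so the serving loop can shut down cleanly. All names here are hypothetical.

```python
# Illustrative pattern only: pre-bind a socket and handle termination signals.
import signal
import socket


def setup_listener(host: str, port: int) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind((host, port))
    sock.listen()
    return sock


def install_signal_handlers() -> None:
    def _handler(signum, frame):
        raise KeyboardInterrupt  # let the serving loop unwind and clean up

    signal.signal(signal.SIGTERM, _handler)
    signal.signal(signal.SIGINT, _handler)
```
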