Metadata-Version: 2.4
Name: langgraph-sdk
Version: 0.3.0
Summary: SDK for interacting with LangGraph API
Project-URL: Source, https://github.com/langchain-ai/langgraph/tree/main/libs/sdk-py
Project-URL: Twitter, https://x.com/LangChainAI
Project-URL: Slack, https://www.langchain.com/join-community
Project-URL: Reddit, https://www.reddit.com/r/LangChain/
License-Expression: MIT
License-File: LICENSE
Requires-Python: >=3.10
Requires-Dist: httpx>=0.25.2
Requires-Dist: orjson>=3.10.1
Description-Content-Type: text/markdown
# LangGraph Python SDK

This repository contains the Python SDK for interacting with the LangSmith Deployment REST API.

## Quick Start

To get started with the Python SDK, [install the package](https://pypi.org/project/langgraph-sdk/):

```bash
pip install -U langgraph-sdk
```
You will need a running LangGraph API server. If you're running a server locally with `langgraph-cli`, the SDK automatically points at `http://localhost:8123`; otherwise, specify the server URL when creating a client.
```python
from langgraph_sdk import get_client

# If you're using a remote server, initialize the client with `get_client(url=REMOTE_URL)`
client = get_client()

# List all assistants
assistants = await client.assistants.search()

# We auto-create an assistant for each graph you register in config.
agent = assistants[0]

# Start a new thread
thread = await client.threads.create()

# Start a streaming run
input = {"messages": [{"role": "human", "content": "what's the weather in la"}]}
async for chunk in client.runs.stream(thread["thread_id"], agent["assistant_id"], input=input):
    print(chunk)
```
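Each chunk yielded by the stream is an event/data pair, and a common task is pulling the assistant's reply out of the stream. The sketch below shows one way to do that without a running server; the `StreamPart` namedtuple and the shape of the `"values"` event payload (a `messages` list of dicts with `type` and `content` keys) are assumptions modeled on the SDK's behavior, not part of this README.

```python
from collections import namedtuple

# Standalone stand-in for the (event, data) pairs the stream yields;
# the real SDK yields a similar structure.
StreamPart = namedtuple("StreamPart", ["event", "data"])

def latest_ai_text(chunks):
    """Return the content of the last AI message seen across 'values'
    events, or None if no AI message appeared."""
    text = None
    for part in chunks:
        if part.event != "values":
            continue
        for message in part.data.get("messages", []):
            if message.get("type") == "ai":
                text = message.get("content")
    return text

# Example: a metadata event followed by a full state snapshot.
chunks = [
    StreamPart("metadata", {"run_id": "run-123"}),
    StreamPart("values", {"messages": [
        {"type": "human", "content": "what's the weather in la"},
        {"type": "ai", "content": "It's sunny in LA."},
    ]}),
]
print(latest_ai_text(chunks))  # prints: It's sunny in LA.
```

In a real run you would collect chunks from `client.runs.stream(...)` instead of the hand-built list above.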