Python · AI agents
LangChain / LangGraph output is a fully-wired Python AI agent application: typed LangGraph state machines per workflow, LangChain tool definitions per integration, a Flask API layer exposing agent runs as REST endpoints, persistent memory via PostgreSQL + pgvector, and Celery for long-running agent tasks. The architecture drives which tools, chains, and graphs are generated — not the other way around.
Every LangChain / LangGraph app generated by Archiet ships with these baseline pieces. No template-cutting, no placeholder TODOs.
One representative file from a generated LangChain / LangGraph app. Your generated output will include many more files like this one, customized to the entities and flows in your blueprint.
from langgraph.graph import StateGraph, END
from typing import TypedDict

from langchain_core.messages import HumanMessage, SystemMessage

# `llm` and `vector_store` are initialized elsewhere in the generated app.


class WorkflowState(TypedDict):
    input: str
    context: list[str]
    result: str | None


def retrieve(state: WorkflowState) -> WorkflowState:
    # Pull the top-k most relevant documents for the query.
    docs = vector_store.similarity_search(state["input"], k=4)
    return {**state, "context": [d.page_content for d in docs]}


def generate(state: WorkflowState) -> WorkflowState:
    # Answer the question grounded in the retrieved context.
    response = llm.invoke([
        SystemMessage(content="Use the context to answer."),
        HumanMessage(content=f"Context: {state['context']}\n\nQ: {state['input']}"),
    ])
    return {**state, "result": response.content}


graph = StateGraph(WorkflowState)
graph.add_node("retrieve", retrieve)
graph.add_node("generate", generate)
graph.add_edge("retrieve", "generate")
graph.add_edge("generate", END)
graph.set_entry_point("retrieve")
app = graph.compile()

Pick LangChain/LangGraph when your architecture includes AI agents, RAG pipelines, or multi-step reasoning workflows. Archiet generates the graph topology from your business requirements — you get a production agent system, not a Jupyter notebook demo.