Flowise
Visual drag-and-drop builder for LLM applications, chatbots, and AI agent workflows.
Overview
Flowise gives you a canvas to build LLM-powered applications without writing backend code. Connect components visually — language models, memory, tools, and data sources — then deploy your flow as a REST API in one click. It complements Langflow with a different node library and workflow style, giving you flexibility in how you build.
Key Features
- Visual Canvas — Drag and connect nodes to build LLM chains, chatbots, and agents
- Component Library — Pre-built nodes for models, memory, vector stores, tools, and more
- REST API Deployment — Every flow is instantly available as an API endpoint
- Chat Playground — Test your flows interactively before deploying
- Credential Management — Securely store API keys and connection strings
- Persistent Storage — Flow configurations saved to your workspace database
Getting Started
- From the Hub, click Flowise to launch
- The Flowise canvas opens in your browser
- Click + Add New to create a new chatflow or agentflow
- Drag components from the sidebar onto the canvas
- Connect nodes by dragging from an output port to an input port
- Click Save and then Chat to test your flow
Building Flows
Chatflows
Chatflows are conversational pipelines — great for building chatbots with memory, RAG systems, and Q&A interfaces:
- Add a Chat Model node (OpenAI, Anthropic, etc.)
- Add a Conversation Chain or LLM Chain node
- Optionally add Memory and Vector Store nodes for context
- Connect them and save
Agentflows
Agentflows are tool-using agents that reason and act:
- Add an Agent node as the orchestrator
- Add Tool nodes (web search, code execution, APIs)
- Connect tools to the agent
- Add a Chat Model to power the agent’s reasoning
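Conceptually, an agentflow runs a reason-act loop: the chat model decides which tool to call, the tool runs, and its result is fed back until the model produces a final answer. The toy sketch below illustrates that loop; all names are illustrative, not Flowise internals:

```python
def run_agent(llm, tools: dict, question: str, max_steps: int = 5) -> str:
    """Toy reason-act loop. `llm` is any callable that returns either
    ("tool", name, args) to request a tool call, or ("final", answer)."""
    scratchpad = []  # accumulated tool calls and their observations
    for _ in range(max_steps):
        decision = llm(question, scratchpad)
        if decision[0] == "final":
            return decision[1]
        _, name, args = decision
        observation = tools[name](args)               # act: run the chosen tool
        scratchpad.append((name, args, observation))  # feed the result back
    return "Stopped after max_steps without a final answer."
```

In Flowise the Agent node plays the role of this loop, the connected Chat Model makes the decisions, and each Tool node is an entry in the tool set.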
Deploying as an API
Once your flow is saved, Flowise automatically exposes it as a REST API:
```bash
curl -X POST "https://your-hub/flowise/api/v1/prediction/{flowId}" \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Calliope AI?"}'
```

The API supports streaming responses, session management, and file uploads, depending on your flow configuration.
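Streaming responses typically arrive as server-sent events. The sketch below assumes standard `data: ...` SSE framing and that streaming is requested by adding a flag such as `"streaming": true` to the request body; both assumptions should be checked against your Flowise version:

```python
import json


def parse_sse_events(raw: str) -> list:
    """Split a raw server-sent-events stream into its data payloads.
    Assumes standard SSE framing: events separated by blank lines,
    payloads on 'data: ' lines (verify your version's event format)."""
    events = []
    for block in raw.split("\n\n"):
        for line in block.splitlines():
            if line.startswith("data: "):
                events.append(line[len("data: "):])
    return events


# Hypothetical usage: POST {"question": "...", "streaming": true} to the
# prediction endpoint, read the response incrementally, and pass the
# accumulated text through parse_sse_events to recover each chunk.
```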
Supported Integrations
- Language Models: OpenAI, Anthropic, Google, Mistral, Ollama, and more
- Vector Stores: Pinecone, Qdrant, Chroma, Weaviate, PostgreSQL pgvector
- Document Loaders: PDF, CSV, JSON, web pages, Notion, Confluence
- Tools: Web search, code execution, custom APIs, calculators
When to Use Flowise vs Langflow
| Scenario | Use |
|---|---|
| Building chatbots with memory | Either works well |
| Complex multi-agent orchestration | Langflow |
| Quick chatflow prototyping | Flowise |
| RAG pipelines with many vector stores | Either |
| API-first deployment | Flowise |
| Production agent pipelines | Langflow |
Both tools are available in your Hub — try both and use what fits your workflow.