Flowise

Visual drag-and-drop builder for LLM applications, chatbots, and AI agent workflows.

Overview

Flowise gives you a canvas to build LLM-powered applications without writing backend code. Connect components visually — language models, memory, tools, and data sources — then deploy your flow as a REST API in one click. It complements Langflow with a different node library and workflow style, giving you flexibility in how you build.

Key Features

  • Visual Canvas — Drag and connect nodes to build LLM chains, chatbots, and agents
  • Component Library — Pre-built nodes for models, memory, vector stores, tools, and more
  • REST API Deployment — Every flow is instantly available as an API endpoint
  • Chat Playground — Test your flows interactively before deploying
  • Credential Management — Securely store API keys and connection strings
  • Persistent Storage — Flow configurations saved to your workspace database

Getting Started

  1. From the Hub, click Flowise to launch
  2. The Flowise canvas opens in your browser
  3. Click + Add New to create a new chatflow or agentflow
  4. Drag components from the sidebar onto the canvas
  5. Connect nodes by dragging from an output port to an input port
  6. Click Save and then Chat to test your flow

Building Flows

Chatflows

Chatflows are conversational pipelines — great for building chatbots with memory, RAG systems, and Q&A interfaces:

  1. Add a Chat Model node (OpenAI, Anthropic, etc.)
  2. Add a Conversation Chain or LLM Chain node
  3. Optionally add Memory and Vector Store nodes for context
  4. Connect them and save
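
Once a chatflow like this is deployed (see "Deploying as an API" below), the Memory node can carry context across turns by pinning each request to a session. A minimal sketch of building such a request — the `overrideConfig.sessionId` field and the flow ID here are illustrative assumptions, so check your Flowise version's API reference:

```python
import json

# Hypothetical Hub URL for illustration; substitute your own deployment.
FLOWISE_BASE = "https://your-hub/flowise/api/v1/prediction"

def chat_request(flow_id: str, question: str, session_id: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a prediction call that reuses a
    session ID, so the flow's Memory node can keep conversation context."""
    url = f"{FLOWISE_BASE}/{flow_id}"
    body = json.dumps({
        "question": question,
        # Assumed field: keys the Memory node to this conversation
        "overrideConfig": {"sessionId": session_id},
    }).encode("utf-8")
    return url, body

url, body = chat_request("my-flow-id", "What did I just ask?", "user-42")
```

Sending two requests with the same `sessionId` lets the second turn see the first; a new session ID starts a fresh conversation.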

Agentflows

Agentflows are tool-using agents that reason and act:

  1. Add an Agent node as the orchestrator
  2. Add Tool nodes (web search, code execution, APIs)
  3. Connect tools to the agent
  4. Add a Chat Model to power the agent’s reasoning

Deploying as an API

Once your flow is saved, Flowise automatically exposes it as a REST API:

curl -X POST "https://your-hub/flowise/api/v1/prediction/{flowId}" \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Calliope AI?"}'

The API supports streaming responses, session management, and file uploads depending on your flow configuration.

Supported Integrations

Language Models: OpenAI, Anthropic, Google, Mistral, Ollama, and more

Vector Stores: Pinecone, Qdrant, Chroma, Weaviate, PostgreSQL pgvector

Document Loaders: PDF, CSV, JSON, web pages, Notion, Confluence

Tools: Web search, code execution, custom APIs, calculators

When to Use Flowise vs Langflow

| Scenario                              | Use              |
| ------------------------------------- | ---------------- |
| Building chatbots with memory         | Either works well |
| Complex multi-agent orchestration     | Langflow         |
| Quick chatflow prototyping            | Flowise          |
| RAG pipelines with many vector stores | Either           |
| API-first deployment                  | Flowise          |
| Production agent pipelines            | Langflow         |

Both tools are available in your Hub — try both and use what fits your workflow.