About Flowise
Drag & drop UI to build your customized LLM flow
Features
- LLM Orchestration: Connect LLMs with memory, data loaders, caching, moderation, and more: LangChain, LlamaIndex, 100+ integrations
- Agents & Assistants: Create autonomous agents that can use tools to execute different tasks: Custom Tools, OpenAI Assistant, Function Agent
- API, SDK, Embed: Extend and integrate with your applications using the API, SDK, and embedded chat (see the sketch after this list): APIs, Embedded Widget, React SDK
- Open source LLMs: Run in air-gapped environments with local LLMs, embeddings, and vector databases: HuggingFace, Ollama, LocalAI, Replicate, Llama2, Mistral, Vicuna, Orca, Llava; self-host on AWS, Azure, GCP
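As a minimal sketch of the API route mentioned above: a published chatflow can be queried over HTTP via Flowise's Prediction endpoint (`POST /api/v1/prediction/<chatflow-id>` with a JSON `question`). The host, chatflow ID, and API key below are placeholders for your own deployment, and the response shape (a `text` field) should be verified against your Flowise version.

```typescript
// Minimal sketch: query a Flowise chatflow via the Prediction API.
// Host, chatflow ID, and API key are placeholders for your own deployment.
async function askFlowise(question: string): Promise<string> {
  const response = await fetch(
    "http://stack.localhost/api/v1/prediction/<your-chatflow-id>",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Only needed if the chatflow is protected with an API key.
        Authorization: "Bearer <your-api-key>",
      },
      body: JSON.stringify({ question }),
    }
  );

  if (!response.ok) {
    throw new Error(`Flowise request failed: ${response.status}`);
  }

  const result = await response.json();
  // The generated answer is returned in the `text` field.
  return result.text;
}

askFlowise("Summarize the uploaded documents.")
  .then(console.log)
  .catch(console.error);
```

The same endpoint is what the Embedded Widget and React SDK call under the hood, so this is also a quick way to smoke-test a flow before embedding it.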
Environment variables
- PROJECT: flowise
- DOMAIN: stack.localhost
- DATABASE_PATH: /root/.flowise
- APIKEY_PATH: /root/.flowise
- SECRETKEY_PATH: /root/.flowise
- LOG_PATH: /root/.flowise/logs
- BLOB_STORAGE_PATH: /root/.flowise/storage
- IFRAME_ORIGINS: *
- DISABLE_FLOWISE_TELEMETRY: true
- ENABLE_METRICS: false
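For reference, here is an illustrative sketch of how the Flowise-specific variables above could be passed to a container. It assumes the official `flowiseai/flowise` image and its default port 3000; the service name and volume are illustrative, not this stack's actual compose file. PROJECT and DOMAIN look like stack-level settings (project name and routed domain) consumed by the stack or reverse proxy rather than by Flowise itself, so they are omitted here.

```yaml
# Illustrative sketch only: adapt to your stack's real compose file.
services:
  flowise:
    image: flowiseai/flowise
    restart: unless-stopped
    ports:
      - "3000:3000"                     # Flowise listens on 3000 by default
    environment:
      - DATABASE_PATH=/root/.flowise
      - APIKEY_PATH=/root/.flowise
      - SECRETKEY_PATH=/root/.flowise
      - LOG_PATH=/root/.flowise/logs
      - BLOB_STORAGE_PATH=/root/.flowise/storage
      - IFRAME_ORIGINS=*
      - DISABLE_FLOWISE_TELEMETRY=true
      - ENABLE_METRICS=false
    volumes:
      - flowise_data:/root/.flowise     # persist flows, keys, logs, and storage

volumes:
  flowise_data:
```

Note that `IFRAME_ORIGINS=*` allows the chat widget to be embedded from any origin; restrict it to your own domains in production.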