Getting Started

Prerequisites

  • Docker and Docker Compose
  • Node.js 18+
  • Python 3.12+
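
A quick way to confirm these tools are on your PATH before continuing (a minimal sketch; it only reports, it does not install anything):

```shell
# check_prereqs prints a "found:" or "missing:" line for each required tool.
check_prereqs() {
  for cmd in docker node python3; do
    if command -v "$cmd" >/dev/null 2>&1; then
      echo "found: $cmd"
    else
      echo "missing: $cmd"
    fi
  done
}

check_prereqs
```

If anything is reported missing, install it before running the quick-start steps below.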

Quick Start

1. Clone and Configure

git clone https://github.com/sensyze/dataflow.git
cd dataflow
cp .env.frontend.example .env.frontend
cp .env.backend.example .env.backend

2. Configure Environment Variables

Edit .env.backend (created in the previous step) and set:

SUPABASE_DATABASE_URL=postgresql://...
SUPABASE_SERVICE_ROLE_KEY=...
GEMINI_API_KEY=...
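
A misconfigured variable usually only surfaces once the stack is running, so it can help to fail fast. The following is a hypothetical validation helper, not part of the repo; the variable names come from the list above:

```python
import os

# Required settings from .env.backend (see the list above).
REQUIRED_VARS = [
    "SUPABASE_DATABASE_URL",
    "SUPABASE_SERVICE_ROLE_KEY",
    "GEMINI_API_KEY",
]


def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing required settings: {', '.join(missing)}")
    print("All required settings are present.")
```

Run it after sourcing your env file to catch typos before `make dev`.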

3. Start the Stack

make dev

This starts all services:

  • Sensyze Dataflow Server - FastAPI + Temporal worker
  • Temporal Server - Workflow orchestration
  • Redis - Caching
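
Under the hood, `make dev` brings these services up with Docker Compose. A minimal sketch of what such a compose file could look like (the image tags, build context, and port mappings here are assumptions for illustration, not the project's actual file):

```yaml
services:
  dataflow-server:        # FastAPI app + Temporal worker
    build: ./dataflow-server
    ports:
      - "8000:8000"       # API (docs at /docs)
    env_file: .env.backend
    depends_on: [temporal, redis]

  temporal:               # workflow orchestration
    image: temporalio/auto-setup:latest
    ports:
      - "7233:7233"

  redis:                  # caching
    image: redis:7
```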

4. Access Services

Service          URL
App              http://localhost:3000
API Docs         http://localhost:8000/docs
Temporal UI      http://localhost:8080
Dask Dashboard   http://localhost:8787

Common Commands

Command                                    Description
make dev                                   Start/rebuild the stack (recommended)
make dev-fresh                             Force a rebuild, ignoring the Docker cache
make dev-clean                             Remove everything and rebuild
make status                                Check the health of all services
docker compose logs -f dataflow-server     View server logs
docker compose exec dataflow-server bash   Shell into the container

Verify Installation

Test API

curl -X POST http://localhost:8000/python/test \
-H "Content-Type: application/json" \
-d '{"code": "def transform(df): return df", "input_data": [{"a": 1}]}'

Run Tests

cd dataflow-server
pytest tests/