# Getting Started
## Prerequisites
- Docker and Docker Compose
- Node.js 18+
- Python 3.12+
## Quick Start
### 1. Clone and Configure

```bash
git clone https://github.com/sensyze/dataflow.git
cd dataflow
cp .env.frontend.example .env.frontend
cp .env.backend.example .env.backend
```
### 2. Configure Environment Variables

Edit the copied `.env.frontend` and `.env.backend` files and set:

```bash
SUPABASE_DATABASE_URL=postgresql://...
SUPABASE_SERVICE_ROLE_KEY=...
GEMINI_API_KEY=...
```
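Missing or empty variables typically surface only as runtime failures after the stack is up. A small preflight check can catch them before `make dev` (this helper is illustrative and not part of the repo; the variable names are the ones listed above):

```python
import os

# Variables this section says the stack requires.
REQUIRED_VARS = [
    "SUPABASE_DATABASE_URL",
    "SUPABASE_SERVICE_ROLE_KEY",
    "GEMINI_API_KEY",
]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All required environment variables are set.")
```

Run it in the same shell (or container) where the server will start, so it sees the same environment.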
### 3. Start the Stack

```bash
make dev
```
This starts all services:
- Sensyze Dataflow Server - FastAPI + Temporal worker
- Temporal Server - Workflow orchestration
- Redis - Caching
### 4. Access Services
| Service | URL |
|---|---|
| App | http://localhost:3000 |
| API Docs | http://localhost:8000/docs |
| Temporal UI | http://localhost:8080 |
| Dask Dashboard | http://localhost:8787 |
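A quick way to confirm all four services came up is to probe their local ports. This sketch is not part of the repo; the ports are taken from the table above, so adjust them if your compose file maps different ones:

```python
import socket

# Local ports from the Access Services table above.
SERVICES = {
    "App": 3000,
    "API Docs": 8000,
    "Temporal UI": 8080,
    "Dask Dashboard": 8787,
}

def is_listening(port, host="localhost", timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if is_listening(port) else "down"
        print(f"{name:15s} :{port}  {status}")
```

A port reported `down` usually means the container is still starting or failed; `docker compose logs -f <service>` shows why.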
## Common Commands
| Command | Description |
|---|---|
| `make dev` | Start/rebuild stack (recommended) |
| `make dev-fresh` | Force rebuild, ignore Docker cache |
| `make dev-clean` | Remove everything and rebuild |
| `make status` | Check health of all services |
| `docker compose logs -f dataflow-server` | View server logs |
| `docker compose exec dataflow-server bash` | Shell into container |
## Verify Installation
### Test API

From the host machine, use the `localhost` address from the table above (`dataflow-server` only resolves inside the Docker network):

```bash
curl -X POST http://localhost:8000/python/test \
  -H "Content-Type: application/json" \
  -d '{"code": "def transform(df): return df", "input_data": [{"a": 1}]}'
```
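The same request can be issued from Python with the standard library, which is handy in scripts or notebooks. This is an illustrative client for the request shape shown in the curl command above, not an official SDK; `run_transform` assumes the endpoint returns JSON:

```python
import json
from urllib import request

# Host-side URL from the Access Services table.
API_URL = "http://localhost:8000/python/test"

def build_payload(code, input_data):
    """Serialize the request body used by the curl example above."""
    return json.dumps({"code": code, "input_data": input_data}).encode()

def run_transform(code, input_data, url=API_URL):
    """POST a transform snippet and return the decoded JSON response."""
    req = request.Request(
        url,
        data=build_payload(code, input_data),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `run_transform("def transform(df): return df", [{"a": 1}])` sends the identity transform over a one-row input, matching the curl call above.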
### Run Tests

```bash
cd dataflow-server
pytest tests/
```