
Lights that follow people.

Mount a camera. Press Track. Every moving head locks onto every person in the room — automatically. No programming, no cue lists, no operator.

40 Hz
Art-Net update rate
16
detection classes
< 200 ms
detection-to-beam latency
1-click
to start tracking
Free
MIT open source

How it works

From camera to beam in four steps

SlyLED runs YOLOv8n on the camera node itself so detection never leaves your network. The orchestrator converts detected positions into pan/tilt angles and fires Art-Net at 40 Hz.
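The wire format on the output side is plain ArtDMX over UDP. SlyLED's own packet code isn't reproduced here, but a minimal ArtDMX frame per the Art-Net 4 specification looks like this (the `artdmx_packet` helper name is illustrative):

```python
import socket
import struct

def artdmx_packet(universe: int, dmx: bytes, sequence: int = 0) -> bytes:
    """Build one ArtDMX (OpOutput, 0x5000) packet per the Art-Net 4 spec."""
    assert 1 <= len(dmx) <= 512
    return (
        b"Art-Net\x00"                  # 8-byte packet ID
        + struct.pack("<H", 0x5000)     # OpCode, little-endian
        + struct.pack(">H", 14)         # protocol version, big-endian
        + bytes([sequence, 0])          # sequence counter, physical port
        + struct.pack("<H", universe)   # SubUni + Net, little-endian
        + struct.pack(">H", len(dmx))   # data length, big-endian
        + dmx                           # channel values
    )

# 40 Hz means one packet per universe every 25 ms, e.g.:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
# sock.sendto(artdmx_packet(0, bytes(512)), ("255.255.255.255", 6454))
```

Any Art-Net node or console listening on port 6454 will accept frames shaped like this, which is why no proprietary output hardware is needed.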

01
📷

Camera sees the stage

A USB camera on a Raspberry Pi or Orange Pi captures frames. YOLOv8n runs on-device — no cloud, no latency, no subscription.
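The node's exact post-processing isn't shown on this page; a common convention (sketched here with a hypothetical `floor_points` helper) is to keep only wanted classes above a confidence threshold and reduce each box to its bottom-center pixel, the point where the subject meets the stage floor:

```python
from typing import NamedTuple

class Detection(NamedTuple):
    cls: int    # COCO class index (0 = person)
    conf: float # detector confidence, 0..1
    x1: float; y1: float; x2: float; y2: float  # pixel bounding box

PERSON = 0

def floor_points(dets, min_conf=0.5, classes=frozenset({PERSON})):
    """Filter detections to enabled classes, then map each box to
    its bottom-center pixel for downstream pixel-to-stage mapping."""
    return [((d.x1 + d.x2) / 2.0, d.y2)
            for d in dets
            if d.cls in classes and d.conf >= min_conf]
```

Only these small (u, v) points leave the camera node, which is what keeps the detection-to-beam path under the latency budget.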

02
🧭

Calibration maps pixels to space

A one-time beam-detection calibration links camera coordinates to real-world stage positions. Accuracy within ±5 cm.

03
📐

Fixture profiles convert to pan/tilt

SlyLED knows every fixture's position and OFL profile. It converts XYZ coordinates to exact DMX pan/tilt values via inverse kinematics.
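The geometry behind that conversion is two arctangents. As a sketch (function name and the tilt-zero-is-straight-down convention are assumptions, not SlyLED's published API):

```python
import math

def pan_tilt(fixture_xyz, target_xyz):
    """Aim angles from a truss-mounted head toward a stage point.
    Both positions are (x, y, z) metres in the same frame, z up.
    Returns (pan, tilt) in degrees; tilt 0 deg = pointing straight down."""
    dx = target_xyz[0] - fixture_xyz[0]
    dy = target_xyz[1] - fixture_xyz[1]
    dz = target_xyz[2] - fixture_xyz[2]  # negative when target is below
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(math.hypot(dx, dy), -dz))
    return pan, tilt

# Head mounted 4 m up, performer on the floor 2 m away horizontally:
# pan_tilt((0, 0, 4), (2, 0, 0))  ->  pan 0 deg, tilt ~26.6 deg
```

The resulting angles are then scaled into 16-bit DMX values using the pan/tilt ranges declared in the fixture's OFL profile, which is why the same math works for any make of moving head.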

04

Beams glide. Show runs.

40 Hz Art-Net output. Smooth cubic interpolation means heads glide — not snap. Even fast runners stay lit.
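SlyLED's exact interpolator isn't reproduced here; a cubic "smoothstep" ease is one common way to get the glide described above, because its velocity is zero at both endpoints:

```python
def smoothstep(a, b, t):
    """Cubic Hermite ease from a to b; t in [0, 1].
    Zero velocity at both ends, so a head accelerates into the
    move and decelerates out of it instead of snapping."""
    t = max(0.0, min(1.0, t))
    s = t * t * (3.0 - 2.0 * t)
    return a + (b - a) * s
```

At 40 Hz each interpolation step spans 25 ms, so a pan move from 10 deg to 50 deg is emitted as a short run of intermediate angles rather than a single jump.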

Top-down beam tracking

This canvas shows what SlyLED sees: four fixtures mounted on a truss, three people walking the stage floor. Each fixture independently locks onto the nearest unassigned person. Assignment resolves in real time as people cross paths.

  • 1:1 exclusive lock — one head per person, no doubling
  • Round-robin cycling when more people than fixtures
  • Spread mode to maximize stage coverage
  • Proximity re-ID across detection gaps (500 mm threshold)
  • Per-fixture color — each head gets its own beam color
  • Temporal markers appear live in 3D viewport
Top-down view — truss overhead, stage floor below
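The exclusive-lock behavior above can be sketched as a greedy nearest-first matching (the `lock_targets` helper and dict shapes are illustrative, not SlyLED's internal data model):

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def lock_targets(fixtures, people):
    """Greedy 1:1 exclusive lock: consider all fixture/person pairs
    by ascending distance; each fixture and each person is claimed
    at most once. fixtures, people: id -> (x, y) stage position in mm."""
    pairs = sorted(
        ((_dist(fp, pp), f, p)
         for f, fp in fixtures.items()
         for p, pp in people.items()),
        key=lambda t: t[0])
    locks, used_f, used_p = {}, set(), set()
    for _, f, p in pairs:
        if f not in used_f and p not in used_p:
            locks[f] = p
            used_f.add(f)
            used_p.add(p)
    return locks
```

A re-ID pass would sit on top of this: if a person's detection drops and a new one reappears within the 500 mm threshold, the existing lock is kept instead of being reassigned.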

Why SlyLED tracking is different

Every competitor charges thousands for follow-spot automation. SlyLED is free and does more.

🧠

On-device AI, zero cloud

Detection runs on the camera node itself. Nothing leaves your network. No API keys, no subscription, no outage risk. Works in a venue with no internet.

🎯

Any fixture, any make

SlyLED uses OFL profiles — 700+ fixtures supported. If it has pan and tilt channels, tracking works. No proprietary hardware required.

No programming required

Legacy follow-spot systems require DMX programming and manual cueing. SlyLED is one click. The show adapts to performers, not the other way around.

🔗

Integrates with everything

Combine tracking with timeline automation, spatial effects, and manual overrides. Tracking is a mode, not a separate system.

Who uses it

Built for shows where the performer is unpredictable — which is every show.

🎤

Live music

Vocalist moves freely. Spot beams follow them on stage while the rest of the rig runs the programmed show.

🎭

Theatre

Actors hit their marks — or don't. SlyLED tracks wherever they actually are, not where the cue list expected them.

💃

Dance

Up to 8 performers tracked simultaneously across the full stage width. No follow-spot operator needed.

🎪

Corporate events

Speaker moves around the stage naturally. Clean, professional lighting follows without a technician watching.

🎠

Installations

Interactive art where the light responds to visitors. 16 object classes — track people, props, even pets.

🏫

Education

Affordable enough for school theatre budgets. Full system under $500 in hardware. No licensing fees ever.

Screenshots

Stage at T=0 — three beams waiting to acquire
T=0 — fixtures idle, waiting for first detection
T=5s — two people detected, beams acquired
T=5s — beams acquired on two performers
T=10s — full tracking, all fixtures locked
T=10s — full stage coverage, all fixtures tracking
Camera configuration dialog
Per-camera config — classes, sensitivity, resolution, FPS

Your next show runs itself.

Download SlyLED, mount a USB camera, run the calibration wizard. Your moving heads will be tracking performers before the headliner's soundcheck.

Download SlyLED Free
See All Features