Omni-Dromenon Engine (Metasystem Master)
Collective audience input shaping live art in real time
The Problem
Live performance has always been a negotiation between performers and audiences — but the feedback channels are coarse. An audience member can clap or not clap. They cannot communicate "I want the texture to thin out while the harmonic tension increases." Existing tools for interactive performance are either too simple (binary voting, applause meters) or too complex (custom Max/MSP patches that take months per piece).[1] There's nothing in between: a general-purpose engine that works across genres while remaining configurable enough for each. The challenge of designing spectator experiences that move beyond passive consumption has been well documented in HCI research, yet few systems bridge the gap between audience agency and artistic coherence.[5]
The Design Decision
The critical insight: the audience is a co-performer operating a collective instrument, not a data source.[2] And the performer is never subordinate to the crowd. Three override modes (absolute, blend, lock) give the performer graduated control — they can fully override a parameter, blend their intent with the audience's at any ratio, or lock it entirely. This approach reflects the principle that human-centered systems must keep humans in command of consequential decisions, even when the system aggregates collective intelligence.[3] The resulting performances are negotiated in real time, at sub-second latency, across every parameter the performer exposes.
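A minimal sketch, with hypothetical type and field names, of how the three override modes could resolve a single exposed parameter:

type OverrideMode = 'absolute' | 'blend' | 'lock';

interface Override {
  mode: OverrideMode;
  value: number;        // the performer's intended value, normalized 0..1
  blendRatio?: number;  // for 'blend': 0 = all audience, 1 = all performer
}

// Resolve one parameter from the audience consensus and an optional performer override.
function resolveParameter(
  audienceValue: number,
  currentValue: number,
  override?: Override
): number {
  if (!override) return audienceValue;            // no override: the crowd's consensus stands
  switch (override.mode) {
    case 'absolute':
      return override.value;                      // performer fully overrides the crowd
    case 'blend': {
      const r = override.blendRatio ?? 0.5;       // mix performer and audience at any ratio
      return r * override.value + (1 - r) * audienceValue;
    }
    case 'lock':
      return currentValue;                        // parameter frozen at its current value
  }
}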
Architecture
┌──────────────────────────────────────────────┐
│ NGINX REVERSE PROXY │
├──────────────────────────────────────────────┤
│ │
│ ┌──────────────────────────────────┐ │
│ │ CORE ENGINE (Port 3000) │ │
│ │ Express + Socket.io │ Redis │
│ │ ┌──────────┐ ┌──────────────┐ │◄──7 │
│ │ │ REST API │ │ WebSocket │ │ │
│ │ │ │ │ /audience ns │ │ Chroma │
│ │ │ │ │ /performer ns│ │◄──DB │
│ │ └────┬─────┘ └──────┬──────┘ │ │
│ │ └───────┬───────┘ │ │
│ │ Parameter Bus │ │
│ │ Consensus Engine │ │
│ │ OSC Bridge │ │
│ └──────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────┐ │
│ │ PERFORMANCE SDK (Port 3001) │ │
│ │ React 18 + Vite │ │
│ │ Audience UI · Performer Dash │ │
│ └──────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────┐ │
│ │ AUDIO SYNTHESIS BRIDGE │ │
│ │ OSC Server + WebAudio Engine │ │
│ └──────────────────────────────────┘ │
└──────────────────────────────────────────────┘
Data flow:
Phone → WebSocket /audience → Parameter Bus
→ Consensus (spatial × temporal × cluster)
→ Outlier rejection → Smoothing
→ Performer override check
  → Audience UI + Performer Dashboard + OSC

The architecture reflects distributed systems principles where message-passing between isolated namespaces ensures fault tolerance: a misbehaving audience client cannot affect the performer channel, and vice versa.[10] The Redis adapter enables horizontal scaling across multiple Node.js processes, a necessity for production-ready deployment under real audience load.[9]
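A sketch of how the namespace split and Redis adapter could be wired, using the standard Socket.io and node-redis APIs; the event names and in-memory stand-ins are illustrative, not the engine's actual code:

import { Server } from 'socket.io';
import { createAdapter } from '@socket.io/redis-adapter';
import { createClient } from 'redis';

const io = new Server(3000);

// Redis adapter fans events out across Node.js processes for horizontal scaling.
const pubClient = createClient({ url: 'redis://localhost:6379' });
const subClient = pubClient.duplicate();
await Promise.all([pubClient.connect(), subClient.connect()]);
io.adapter(createAdapter(pubClient, subClient));

// Stand-ins for the Parameter Bus and the performer override table.
const inputQueue: unknown[] = [];
const overrides = new Map<string, unknown>();

// Strictly separated namespaces: audience traffic never reaches performer handlers.
io.of('/audience').on('connection', socket => {
  socket.on('input', (payload: unknown) => inputQueue.push(payload)); // hypothetical event name
});

io.of('/performer').on('connection', socket => {
  // Authentication middleware would normally gate this namespace.
  socket.on('override', (o: { parameter: string }) => overrides.set(o.parameter, o));
});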
The Consensus Algorithm
Audience inputs are batched, never processed individually. The consensus loop runs every 50ms, computing weighted averages across three axes whose weights sum to ~1.0. This approach draws on research into social creativity, where collective input must be structured to avoid both tyranny of the majority and cacophony.[4]
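A minimal sketch of the batch-and-tick structure, assuming the AudienceInput, ParameterState, ConsensusConfig, and Override types used by computeConsensus further below; activeConfig, performerOverrides, and the broadcast function are placeholders:

declare const activeConfig: ConsensusConfig;                   // current genre preset
declare const performerOverrides: Map<string, Override>;       // live override table
declare function broadcastState(state: ParameterState): void;  // audience UI, dashboard, OSC

const TICK_MS = 50;                        // consensus loop period
let inputQueue: AudienceInput[] = [];      // inputs buffered between ticks
let previousState: ParameterState;         // last published state, read by the smoothing step

// Audience inputs are only ever enqueued; nothing is applied per-input.
function onAudienceInput(input: AudienceInput): void {
  inputQueue.push(input);
}

setInterval(() => {
  const batch = inputQueue;
  inputQueue = [];                         // start a fresh batch for the next tick
  if (batch.length === 0) return;          // nothing new: hold the previous state
  previousState = computeConsensus(batch, activeConfig, performerOverrides);
  broadcastState(previousState);
}, TICK_MS);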
| Genre Preset | Spatial | Temporal | Consensus | Rationale |
|---|---|---|---|---|
| Electronic Music | 0.3 | 0.5 | 0.2 | Rhythmic immediacy |
| Ballet | 0.5 | 0.2 | 0.3 | Spatial proximity to dancer |
| Opera | 0.2 | 0.3 | 0.5 | Collective dramatic coherence |
| Installation | 0.7 | 0.1 | 0.2 | Location is almost everything |
| Theatre | 0.4 | 0.3 | 0.3 | Balanced narrative needs |
Spatial weighting uses exponential decay from the stage — closer audience members have more influence, reflecting the qualitative difference of proximity. Temporal weighting ensures the system responds to the audience's current state (5s decay window), not their historical average. Consensus weighting detects clusters: converging inputs amplify each other, producing decisive group movements rather than perpetual averages. The cluster detection mechanism resonates with Csikszentmihalyi's observations on how group flow states emerge when individual contributions align toward a shared creative target.[7]
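Hedged sketches of the spatial and temporal steps called in the code below. The input shape and the 10 m decay constant are assumptions, and the blend form (interpolating between "no effect" and "full decay" by the preset weight) is one plausible reading rather than the engine's exact math:

// Assumed per-input shape; the production type may differ.
interface AudienceInput {
  parameter: string;   // which exposed parameter this input targets
  value: number;       // normalized 0..1
  distance: number;    // metres from the stage
  timestamp: number;   // ms since epoch
  weight: number;      // running weight, starts at 1.0
}

// Spatial step: exponential decay with distance from the stage.
function applyExponentialDecay(
  input: AudienceInput,
  distance: number,
  spatialWeight: number
): AudienceInput {
  const proximity = Math.exp(-distance / 10);            // 10 m decay constant (illustrative)
  const factor = 1 - spatialWeight + spatialWeight * proximity;
  return { ...input, weight: input.weight * factor };
}

// Temporal step: the 5 s window from the prose; stale inputs lose influence.
function applyTemporalDecay(
  input: AudienceInput,
  now: number,
  temporalWeight: number
): AudienceInput {
  const recency = Math.exp(-(now - input.timestamp) / 5000);
  const factor = 1 - temporalWeight + temporalWeight * recency;
  return { ...input, weight: input.weight * factor };
}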
interface ConsensusConfig {
  spatial: number;          // proximity weight
  temporal: number;         // recency weight
  cluster: number;          // convergence weight
  outlierThreshold: number; // z-score cutoff
  smoothingFactor: number;  // EMA alpha
}

function computeConsensus(
  batch: AudienceInput[],
  config: ConsensusConfig,
  performerOverrides: Map<string, Override>
): ParameterState {
  // 1. Weight each input by proximity to stage
  const spatialWeighted = batch.map(input =>
    applyExponentialDecay(input, input.distance, config.spatial)
  );

  // 2. Apply temporal decay (5s window)
  const temporalWeighted = spatialWeighted.map(input =>
    applyTemporalDecay(input, Date.now(), config.temporal)
  );

  // 3. Detect clusters via DBSCAN, amplify convergence
  const clusters = detectClusters(temporalWeighted);
  const clusterWeighted = amplifyConvergence(
    temporalWeighted, clusters, config.cluster
  );

  // 4. Reject outliers beyond z-score threshold
  const filtered = rejectOutliers(
    clusterWeighted, config.outlierThreshold
  );

  // 5. Compute weighted average per parameter
  const consensus = weightedAverage(filtered);

  // 6. Exponential smoothing against previousState (the last published
  //    consensus, held at module scope) to prevent jitter
  const smoothed = exponentialSmooth(
    consensus, previousState, config.smoothingFactor
  );

  // 7. Apply performer overrides
  return applyOverrides(smoothed, performerOverrides);
}

Implementation
Built as a pnpm monorepo (TypeScript + Python) with five packages: core-engine (Express + Socket.io + Redis), performance-sdk (React 18 + Vite), audio-synthesis-bridge (OSC + WebAudio), documentation, and example applications. The frontend SDK leverages patterns from the Processing community's tradition of making creative coding accessible through well-designed abstractions.[8] The core engine handles two strictly separated Socket.io namespaces — /audience for many concurrent clients (target: 1,000+) and /performer for authenticated controllers. Z-score outlier rejection (threshold: 2.5 SD) and exponential smoothing (factor: 0.3) prevent individual inputs from dominating.
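For illustration, number-level sketches of those two safeguards with the stated defaults (2.5 SD cutoff, EMA alpha 0.3); the engine applies them per parameter across weighted inputs, which is simplified here to flat arrays:

// Z-score outlier rejection: drop values more than `threshold` standard deviations
// from the batch mean.
function rejectOutliers(values: number[], threshold = 2.5): number[] {
  if (values.length === 0) return values;
  const mean = values.reduce((sum, v) => sum + v, 0) / values.length;
  const sd = Math.sqrt(
    values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / values.length
  );
  if (sd === 0) return values;                      // identical inputs: keep them all
  return values.filter(v => Math.abs((v - mean) / sd) <= threshold);
}

// Exponential moving average: blend the new consensus value with the previous tick's
// so exposed parameters glide instead of jumping.
function exponentialSmooth(next: number, previous: number | undefined, alpha = 0.3): number {
  return previous === undefined ? next : alpha * next + (1 - alpha) * previous;
}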
Why This Is Art
The consensus algorithms aren't backstage plumbing — they're the medium. Who gets weighted more? What happens when the crowd and the performer disagree? These are artistic questions answered by system design. This positions the engine within the lineage of generative art, where the system's rules constitute the artwork itself.[6] The engine consumes theoretical foundations from ORGAN-I (recursive-engine's identity models inform how performers and audience maintain coherence across transformations) and produces a framework that could become a commercial product in ORGAN-III. Bourriaud's concept of relational aesthetics — art defined by the human relations it produces rather than the objects it creates — finds its most literal technical expression here: the engine's entire purpose is to structure the relationship between performer and audience.[12] This is ORGAN-II at its most ambitious: art that treats its own governance as part of the aesthetic.
Tradeoffs & Lessons
- Generality vs. genre-specific optimization — A general-purpose engine across ballet, electronic music, theatre, and installation means no genre gets perfectly tailored behavior. The preset system mitigates this, but custom Max/MSP patches will always outperform a generalized solution for a single piece. The tradeoff is worth it for rapid deployment across genres.
- Performer override as compositional tool — Initially designed as a safety valve, performer override became the most interesting artistic element. The creative tension between crowd desire and performer resistance produces dynamics impossible in either autocratic or purely democratic systems. Bishop's analysis of participatory art's political dimensions — who holds power, who concedes it, and under what conditions — proved unexpectedly relevant to the override system's design.[11]
- Monorepo complexity — Five packages in a pnpm workspace means more build configuration, more dependency management, more CI complexity. The alternative (five separate repos) would be worse for a system where packages share types and build in lockstep.
- WebSocket at scale — Targeting 1,000+ concurrent audience connections pushes Socket.io's single-process limits. Redis adapter handles horizontal scaling but adds operational complexity.
By the Numbers
References
- Machover, Tod. Hyperinstruments: A Progress Report. MIT Media Lab, 1992.
- Weinberg, Gil. Interconnected Musical Networks. MIT Press, 2005.
- Shneiderman, Ben. Human-Centered AI. Oxford University Press, 2022.
- Fischer, Gerhard. Social Creativity: Turning Barriers into Opportunities for Design. ACM, 2004.
- Reeves, Stuart, et al. Designing the Spectator Experience. CHI, 2005.
- Galanter, Philip. What Is Generative Art? Digital Creativity, 2003.
- Csikszentmihalyi, Mihaly. Creativity: Flow and the Psychology of Discovery and Invention. Harper Perennial, 1996.
- Reas, Casey and Ben Fry. Processing: A Programming Handbook. MIT Press, 2007.
- Nygard, Michael T. Release It! Design and Deploy Production-Ready Software. Pragmatic Bookshelf, 2018.
- Tanenbaum, Andrew S., and Maarten van Steen. Distributed Systems: Principles and Paradigms. Pearson, 2007.
- Bishop, Claire. Artificial Hells: Participatory Art and the Politics of Spectatorship. Verso, 2012.
- Bourriaud, Nicolas. Relational Aesthetics. Les Presses du Réel, 2002.