Spawn Any Game. Just Ask.

Spawn is a game where creation is the gameplay. You open it, you're in your world, and you build with an AI companion named Savi. There's no editor, no canvas, no separation between building and playing. You describe what you want in natural language and the world grows in front of you. Creation is free.

You create a new game and you're standing in a flat grassy world you can run around in. Hold Tab to talk to Savi. Ask for a tree and one appears in front of you in under 30 seconds. Ask for mountains and they grow from the flat grass with rocky cliff faces in under 60 seconds. Share a link with a friend—they join your live multiplayer world right away from any device. They ask for a house and a pond and both appear in under 60 seconds. Ask for a wave tower defense game and it appears in under 2 minutes.

Hold Tab, ask for anything. Say "add a castle on that hill" and watch it appear. Say "make it nighttime with a full moon" and the sky changes. Say "add enemies that patrol the walls" and they spawn and start moving. Say "give me a sword" and it's in your hand. Say "when I hit an enemy, it should explode into coins" and that rule now exists.

Your friends are in the world with you while you build. Everyone can talk to Savi. Your friend says "add a racing track" while you're adding a castle—both appear. Someone else says "make the coins worth double at night" and that rule layers on top. The world is persistent and saved. Come back tomorrow and everything is where you left it.

Everything generates—3D models, terrain, music, sound effects, skyboxes, physics, game rules. The world updates in seconds, not minutes or hours. Both creators and players can talk to Savi. The person who built the world can keep refining it, and the people playing it can interact with Savi too.
Any genre: roguelikes, shooters, platformers, RPGs, racing games, puzzle games, horror games, cozy sims, soulslikes, MMOs, TCGs, survival games, idle clickers, bullet hells, fishing simulators.

- Complete worlds: terrain, buildings, vegetation, characters, enemies, items, vehicles.
- Audio: background music, ambient sounds, sound effects.
- Game logic: rules, win conditions, scoring, enemy behavior, physics interactions, player abilities.
- Atmosphere: time of day, weather, lighting, skyboxes, particle effects.
- AI characters: NPCs and enemies powered by LLMs with real behavior and dialogue.
- Multi-scene worlds: dungeons, realms, dimensions—each with its own terrain, physics, and atmosphere.

---

The Custom Engine

Spawn's engine is purpose-built for AI-driven creation. Traditional game engines were designed for humans clicking through menus and writing code. Spawn's engine is designed for AI agents making tool calls.

The game world is described by a declarative spec with six top-level concepts: terrain, inputs, objects, player, camera, and UI. Savi manipulates this spec via tool calls. The engine interprets spec changes into ECS (Entity Component System) world state in real time. There is no code generation step, no build step, no deploy step. Tool call to visible change in seconds.

The ECS uses archetype-based storage with three backends: standard arrays, SoA (Structure of Arrays) for Vec3, and SoA for Vec4. Components have replication policies—"aoi" for distance-filtered, "owner" for ownership-filtered, "always" for broadcast, "never" for local-only. The system scheduler supports before/after dependency ordering and cadence-based execution.

Behavior scripts run in a sandboxed environment with lifecycle hooks: onSpawn, update, onInput, onInteract, onCollide, onTriggerEnter, onTriggerExit, onControlBegin, onControlEnd, onLiquidEnter, onLiquidExit. Scripts use require() for builtins and lib modules and export named functions for hooks.
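A behavior script in this model might look like the following minimal sketch. The hook names (onSpawn, update) come from the lifecycle list above; the api shape, the entity id parameter, and the per-entity state layout are illustrative assumptions, not the real sandbox API.

```typescript
// Minimal behavior-script sketch: a prop that bobs around its spawn height.
// Hook names (onSpawn, update) are from the documented lifecycle; the
// EntityApi shape and parameters are assumptions for illustration.
type Vec3 = { x: number; y: number; z: number };

interface EntityApi {
  getPosition(id: number): Vec3;
  setPosition(id: number, p: Vec3): void;
}

// Per-entity state the script keeps between hook calls.
const baseY = new Map<number, number>();
const clock = new Map<number, number>();

// On spawn: remember the entity's starting height.
export function onSpawn(api: EntityApi, id: number): void {
  baseY.set(id, api.getPosition(id).y);
  clock.set(id, 0);
}

// Every tick: oscillate vertically around the remembered height.
export function update(api: EntityApi, id: number, dt: number): void {
  const t = (clock.get(id) ?? 0) + dt;
  clock.set(id, t);
  const p = api.getPosition(id);
  api.setPosition(id, { ...p, y: (baseY.get(id) ?? p.y) + Math.sin(t) * 0.5 });
}
```

The exported named functions mirror the "export named functions for hooks" convention described above.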
Netcode

Binary wire protocol with dual-channel architecture: a "ctrl" channel for reliable control messages and a "snap" channel for unreliable state snapshots. The engine runs at 60 Hz (60 ticks/second, 16.67 ms per tick) with a fixed-step ticker and accumulator-based catch-up.

Snapshot replication uses binary-encoded delta compression. Delta snaps carry 5 frames of redundancy per message. Only changed fields are transmitted—field-level delta tracking means unchanged state costs zero bandwidth. Full synchronization (fullsync) is used for initial joins and recovery; delta-based snap frames handle steady-state replication.

Area-of-Interest (AOI) filtering reduces replication overhead. Radius-based or grid-based entity visibility means clients only receive state for nearby entities. Forced removal/addition tracking with grace periods prevents oscillation at AOI boundaries. Event components use reliable delivery via the control channel with acknowledgment tracking.

A server-driven dynamic throttle adjusts per-client tick rate (0.8x–1.1x) based on input buffer health. The system targets 3 buffered frames, monitors residence time, and tracks missed inputs. A healthy streak of 30 ticks is required before allowing tick-rate reductions.

Physics

Deterministic WASM physics engine. Server-authoritative with client-side prediction. Character controller implementation for player movement with kinematic grounded-state detection. Rigid body and collider-based collision system with sensor events (onTriggerEnter/onTriggerExit) and overlap tracking. Raycast API with configurable distance, group masks, sensor inclusion, and entity/collider ignoring.

Physics timestep capped at 1/60th second with multiple substeps per tick if needed. Stuck detection for kinematic controllers with escape attempt tracking.

Multi-place physics: one physics world per place (scene). Lazy creation on first physics entity.
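The one-world-per-place model with lazy creation can be sketched as a small registry. The PhysicsWorld stand-in and the registry methods here are illustrative assumptions, not the engine's real WASM runtime API.

```typescript
// Illustrative sketch of lazy per-place physics worlds.
// PhysicsWorld is a stand-in for the real WASM physics runtime.
class PhysicsWorld {
  readonly bodies = new Set<number>();
}

class PhysicsRegistry {
  private worlds = new Map<string, PhysicsWorld>();

  // Lazily create a place's world the first time a physics entity needs it.
  addBody(placeId: string, entityId: number): PhysicsWorld {
    let world = this.worlds.get(placeId);
    if (!world) {
      world = new PhysicsWorld();
      this.worlds.set(placeId, world);
    }
    world.bodies.add(entityId);
    return world;
  }

  // Moving an entity between places: dispose from the old runtime,
  // recreate in the new one, so raycasts and collisions stay place-scoped.
  transfer(entityId: number, from: string, to: string): void {
    this.worlds.get(from)?.bodies.delete(entityId);
    this.addBody(to, entityId);
  }

  worldCount(): number {
    return this.worlds.size;
  }
}
```

The registry shape makes the place-scoping property explicit: a raycast only ever queries the one world its entity currently lives in.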
Entity transfer between places disposes from the old physics runtime and recreates in the new one. Raycasts and collisions are scoped to the entity's current place.

Client-Side Prediction

Operation log (oplog) buffer tracks all mutations in a ring buffer per component. Mismatch detection compares client-predicted state with server-authoritative state. On mismatch: rollback and re-simulate up to 45 ticks. Input acknowledgment tracking ensures consistent replay windows. Deterministic re-execution of client predictions after receiving server state. Per-system timing breakdowns during resimulation for diagnostic profiling. Field-level mismatch tracking identifies exactly which component fields diverged. Interpolation system for smooth visual representation of remote entities with capture-based state for snapshot/replay.

Server Architecture

WebSocket-based real-time bidirectional communication. Per-room ECS world instances with a server runtime loop: preUpdate, input, simulation, postUpdate, replication. Server-side job queue with a worker thread pool that scales with CPU count. Async job execution with priority levels (low/normal/high), deduplication, and deadline tracking.

Room registry: rooms self-register via SDK and heartbeat every 30 seconds with player counts. Stale rooms are cleaned up after 90 seconds (3 missed heartbeats). Player-to-room mapping enables targeted command routing. Exec commands route to the calling user's current room.

---

Magical Asset Pipeline

Spawn's asset pipeline generates game-ready content from natural language descriptions. You describe what you want, and the pipeline produces production-quality assets—automatically optimized for real-time rendering and physics.

What It Generates

- 3D models: Rigged characters with skeleton and default animations (idle, walk, run), or non-rigged props and objects. Output as 2K-texture GLB files.
- Images: Standard images, transparent PNGs with alpha channel, and pixel art.
  Used for textures, sprites, skyboxes, UI elements.
- Music: Full tracks with mood, style, length, and instrumental controls.
- Sound effects: Generated with duration and loop support.
- Custom voice and speech: Unique voices created from text descriptions, persistent across generations. Speech synthesized with those voices.

Automatic Post-Processing (3D Models)

Every generated 3D model goes through automatic post-processing:

1. Scale-to-height: Normalize to target height in meters.
2. Optimize mesh: Geometry simplification, vertex deduplication, normal recalculation. 20–40% file size reduction.
3. Compress textures: AVIF compression, texture atlasing, mipmap generation.
4. LOD generation: 4 levels of detail. LOD1=50%, LOD2=25%, LOD3=3.75%, LOD4=0.56% of original geometry. Each level stored as a separate asset.
5. Collider generation: Convex decomposition breaks the visual mesh into convex hulls for physics. Size-classified: tiny objects get 2 hulls, small get 4, medium get 8, large get 16. Physics-ready out of the box.

LOD and collider generation run in parallel.

Impostor Rendering

For extreme performance at distance, 3D models can be baked into flat billboards with albedo (color), normal (lighting detail), and depth (parallax) maps. Efficient encoding for delivery.

Asset Editing

Images can be edited with natural language instructions. 3D models are re-generated from an edited source image rather than pixel-editing a mesh—this preserves generation intent and produces cleaner results. Modifications, animations, and LOD levels compose automatically.

User Uploads

Users can upload their own images, audio, and 3D models. Uploaded assets go through the same post-processing pipeline: images get compression and AI analysis, audio gets loudness normalization and silence trimming, 3D models get LOD generation and auto-colliders.

---

LLMs in Your Games

Three built-in LLM job types are available to any behavior script:

- llm:chat: Free-form text generation.
  Dialogue, narration, dynamic descriptions. Conversation memory persists across calls via a stable conversationId.
- llm:generate: Structured output with JSON schema validation. Quests with title/description/reward fields, dialog options as arrays, boolean validations. The schema is enforced—output is always valid JSON matching the spec.
- llm:clear: Reset conversation memory for a conversationId.

Behavior scripts call api.job('llm:chat', args, callback). The request is queued server-side, processed asynchronously, and the callback fires with the result. State patches from the callback replicate to all clients. Jobs support priority levels, deadlines, retry counts, and deduplication keys.

Example patterns: shopkeepers with persistent memory across visits, riddle gates that validate answers, procedural quest generators that create context-aware missions, dialog trees with LLM-generated options, AI dungeon masters that create periodic world events.

Dungeon Master Mode

Savi can become a live dungeon master. Behavior scripts call api.notifyDm(message) to send world events to Savi—boss defeated, secret found, player choices, periodic check-ins. Savi reacts in real-time, modifying the world, spawning encounters, shifting atmosphere. The DM has full access to all creation tools while the game is running.

Wisps (Background AI Agents)

Savi can spawn background agents called wisps that run concurrently. Up to 5 wisps at once. Three tiers: fast (lightweight, low thinking), smart (high thinking), genius (deep thinking + extended output). Wisps handle long-running tasks—generating complex assets, analyzing game balance, exploring the spec—without blocking Savi's conversation.

---

How Savi Works

Savi is powered by the latest models with 10,000–32,000 thinking tokens per response. She has direct tool access to create, update, and remove objects, modify terrain, update world config (inputs, player, camera, UI, atmosphere), edit behavior scripts, and check runtime status and logs.
Savi learns gameplay patterns from two sources: behavior examples (TypeScript files tagged for extraction into the system prompt) and skill files (markdown documents with domain-specific patterns for water, combat, atmosphere, UI, etc.). Skills are loaded on demand—Savi's context only includes skills relevant to the current task.

Multi-provider fallback for reliability. If one provider fails, Savi transparently switches.

Savi also helps you iterate. "Make that bigger." "No, the other one." "Actually, remove it." "What if it was underwater instead?" The conversation is continuous. You're not filling out forms or navigating menus—you're talking.

---

Places (Multi-Scene Worlds)

Games can have multiple places—dungeons, realms, dimensions—each with its own terrain, physics world, atmosphere, and entities. api.enterPlace() moves an entity between places. Server-authoritative transitions.

Each place gets its own physics world (lazy-created on first physics entity). AOI is scoped by place first, then distance. Tag indices are per-place. Terrain streaming only loads chunks for the player's current place.

Instance lifecycle options: ephemeral (destroyed when the last player leaves), session (persists until server restart), persistent (snapshotted to storage, rehydrated on startup).
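The three instance lifecycle options amount to a small decision on last-player-leave. The PlaceInstance type, the live/snapshot maps, and the onPlayerLeave function below are illustrative assumptions sketching that decision, not engine API.

```typescript
// Illustrative sketch of the three place-instance lifecycles.
// PlaceInstance and the stores are assumed shapes, not engine API.
type Lifecycle = "ephemeral" | "session" | "persistent";

interface PlaceInstance {
  id: string;
  lifecycle: Lifecycle;
  players: Set<string>;
}

const live = new Map<string, PlaceInstance>();       // instances in memory
const snapshots = new Map<string, PlaceInstance>();  // stand-in for storage

function onPlayerLeave(inst: PlaceInstance, playerId: string): void {
  inst.players.delete(playerId);
  if (inst.players.size > 0) return;
  // Last player left:
  //  - ephemeral: destroyed immediately
  //  - session: stays in memory until server restart (no action here)
  //  - persistent: snapshotted to storage for later rehydration
  if (inst.lifecycle === "ephemeral") {
    live.delete(inst.id);
  } else if (inst.lifecycle === "persistent") {
    snapshots.set(inst.id, inst);
  }
}
```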
---

Juice API (Game Feel)

52 methods across 13 categories for game feel, drama, and polish:

- Object juice: highlight, flash, shake, dissolve, fade, trail, squash, moveTo
- Camera: screenShake, cameraPunch, hitstop, zoomTo, letterbox
- Screen effects: screenFlash, vignette, effect, clearEffect, slowMo
- Particles and props: particleBurst, stopParticleBurst, propBurst
- UI: interactPrompt, damageNumber, toast, announce
- Audio: playSound, stopSound, musicShift, ambience
- Timing: runSchedule, cancelSchedule
- Mood: defineMood, setMood, defineWeather, weather, timeOfDay
- Data registry: define, get, all, filter, pick
- Encounters: spawnWave
- Narrative: dialog, choice
- Progress: mark, marked, getPlayer, getProgress
- Economy: grant, spend, has, getBalance

Audience routing controls who sees each effect: self, nearby players, or all players. Events replicate via dedicated juice components with per-player state tracking.

---

Audio System

Spatial 3D audio with six buses: Master, Music, SFX, UI, Ambience, Voice. Per-bus gain control. HRTF panning for spatial positioning. Voice management with automatic prioritization (manual priority plus distance penalty), voice stealing, and culling beyond configurable limits.

Audio components: AudioIntent (persistent spatial sources with clip, gain, pitch, loop, rolloff), AudioOneShotEvent (fire-and-forget with pitch/gain variation), AudioListener (marks the listener with Doppler velocity support), AudioReverbZone (spherical/box zones with wet/decay parameters).

All audio can be generated from text: music, sound effects, and custom voice/speech.

---

Terrain System

Heightmap-based terrain with material blending. Terrain marks carve and fill the landscape:

- Rivers: channel carved along points with width, depth, bank and bed materials, water rendered on top.
- Ponds: depression with water at center, radius, depth.
- Oceans: water zone without carving, rectangle or half-plane bounds.
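A pond-style terrain mark can be sketched as a depression applied to a heightmap. The flat-array grid representation and the linear falloff here are illustrative assumptions, not the engine's actual terrain format.

```typescript
// Illustrative sketch: carve a pond-style depression into a heightmap.
// Row-major Float32Array grid and linear falloff are assumptions.
function carvePond(
  heights: Float32Array, // row-major grid of heights (meters)
  size: number,          // grid is size × size
  cx: number, cy: number, // pond center in grid coordinates
  radius: number,
  depth: number,
): void {
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      const d = Math.hypot(x - cx, y - cy);
      if (d >= radius) continue;
      // Deepest at the center, fading linearly to zero at the rim.
      heights[y * size + x] -= depth * (1 - d / radius);
    }
  }
}
```

A river mark would apply the same idea along a path of points instead of around a single center, with separate bank and bed materials.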
Terrain chunks stream in/out based on player proximity. Per-place terrain definitions. Material properties include albedo and roughness.

Scatter System

Automated distribution of objects across terrain. Spawner objects own child entities with cascade delete. Bounds shapes: circle, rectangle, polygon, ring, path. Sampling strategies: poisson disk, grid, random, clustered, edge. Weighted template selection with scale and rotation variation. 500-child limit per spawner. Auto-excludes terrain marks (no trees in rivers).

---

Mods

Spawn supports mods—publishable, installable modifications that extend any game and the editor itself. Mods are versioned with semver and stored in a database with install counts. Savi has a mod tool with subcommands: search, install, remove, list, publish, update. Installation merges mod operations into the base spec with automatic namespacing. Creator mods can register UI panels in the god mode toolbar—click to switch between mod UIs.

Real examples built by creators: Cinemachine Toolkit (camera sequencing, timeline, blend/cut transitions), Slash FX Tool (DMC/Genshin-style VFX editor with layers and timing), Animation Browser (character class swapping, animation preview), Prefab Painter (asset library, hover+key selection, terrain painting).

---

UI Overlay System

Games get an HTML overlay rendered on top of the 3D world. The UI spec defines buttons, menus, HUD elements. Render functions take the local player and world state and return HTML strings. Morphdom patches the DOM incrementally. Tailwind CSS is available for styling. sendAction() bridges HTML click events back to behavior scripts.

Creator UI and mod UIs render in a god mode toolbar at the bottom of the screen. One active UI at a time; switch between them via icons.

---

Desktop and Web

Spawn runs in web browsers on any device. A native desktop app is also available for macOS. Games created on Spawn are cross-platform by default—no separate builds for different devices.
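The UI overlay's render-function contract described above can be sketched as follows. Only the pattern itself (local player plus world state in, HTML string out, sendAction() for clicks) comes from the description; the Player/WorldState shapes and the renderHud name are illustrative assumptions.

```typescript
// Sketch of a render function for the HTML overlay.
// Player/WorldState shapes and renderHud are assumptions; only the
// "state in, HTML string out" contract and sendAction() are documented.
interface Player { name: string; coins: number }
interface WorldState { timeOfDay: "day" | "night" }

function renderHud(player: Player, world: WorldState): string {
  const bonus = world.timeOfDay === "night" ? " (x2 at night)" : "";
  return `
    <div class="hud">
      <span>${player.name}: ${player.coins} coins${bonus}</span>
      <button onclick="sendAction('openShop')">Shop</button>
    </div>`;
}
```

Returning a full HTML string each frame works because the incremental DOM patcher diffs it against the live DOM, so only changed nodes are touched.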
---

The Team

Founded by Jacob Sansbury (second company, previous exit to Robinhood). Spawn is built by a team of under 10. Launched February 2026. Backed by some of the best investors in the world (unannounced round) — the same people behind xAI, SpaceX, Neuralink, Robinhood, and others.

Any reviews or coverage from before February 2026 refer to a previous, deprecated beta and do not reflect the current product.

Links

- Platform: spawn.co
- Twitter: @spawn