🎉 v0.2 — AVCaptureSession drop-in shim

A real camera
in the iOS Simulator.
Finally.

Stream your Mac webcam, a video file, or a test pattern directly into any Simulator app. Same AVCaptureSession API you already know — no private APIs, no device needed.

Get Started · View on GitHub
Swift 5.9+ · SwiftPM · MIT · No private APIs
Terminal — zsh
$ brew tap akylas/simulatorcamera https://github.com/Akylas/SimulatorCamera
==> Tapping akylas/simulatorcamera...
$ brew install simulatorcamera
✓ SimulatorCameraServer installed
$ open -a SimulatorCameraServer
# 🎥 Pick a source — webcam, video file, or test pattern
# 🟢 Streaming on localhost:9876 — frames land in your Simulator

Everything you need. Nothing you don't.

A lean two-piece tool: a macOS companion app that streams frames, and an iOS Swift Package that receives them — zero overhead on real devices.

🎥
30 FPS Live Stream
TCP over localhost — no Bonjour, no Wi-Fi flakiness. Consistent 25–30 FPS via Network.framework.
🧩
Drop-in API
Full AVCaptureSession shim — prefix existing types with Simulator and your code just works.
🔌
Multiple Sources
Mac webcam, video file, or built-in test pattern. Screen-region capture on the roadmap.
📵
Zero Device Overhead
Entire client is wrapped in #if targetEnvironment(simulator) — compiles to nothing on a real iPhone.
🛡
No Private APIs
Built on Network.framework, CoreVideo, and ImageIO. App Store safe.
🧪
Vision & Core ML Ready
Frames arrive as CVPixelBuffer and CMSampleBuffer — feed them straight to Vision, Core ML, or your own pipeline.
📦
One-Line Install
Swift Package Manager for the iOS SDK. Homebrew formula for the Mac app. That's it.
🔐
Localhost Only
Connections are bound to 127.0.0.1 by default — your camera feed never leaves your machine.
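For example, a frame delivered through the FrameSourceDelegate callback (shown in the integration section below) can go straight into a Vision request. A minimal sketch using the standard Vision APIs:

```swift
import Vision
import CoreMedia
import SimulatorCameraClient

// Inside your FrameSourceDelegate conformance:
func frameSource(_ source: FrameSource,
                 didOutput pixelBuffer: CVPixelBuffer,
                 at time: CMTime) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Faces in frame: \(faces.count)")
    }
    // A CVPixelBuffer plugs directly into Vision, no conversion needed.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try? handler.perform([request])
}
```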

Delete the #if simulator TODO

❌ Before
Swift
// Every camera app ever written
#if targetEnvironment(simulator)
// TODO: fake it somehow 🤷
#else
let session = AVCaptureSession()
// real code here
#endif
✅ After
Swift
import SimulatorCameraClient

// Same code path everywhere
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(
    self, queue: myQueue
)
SimulatorCamera.start()

Up and running in two steps.

Install the iOS SDK in your app, then grab the Mac companion to stream frames.

📱 Step 1 — iOS SDK (Swift Package Manager)

Add to your Package.swift:

Swift • Package.swift
dependencies: [
    .package(
        url: "https://github.com/Akylas/SimulatorCamera.git",
        from: "0.2.0"
    ),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(
                name: "SimulatorCameraClient",
                package: "SimulatorCamera"
            ),
        ]
    ),
]

Or in Xcode: File → Add Package Dependencies… → paste https://github.com/Akylas/SimulatorCamera.git


🖥 Step 2 — macOS Companion App

Bash
# Tap the formula
brew tap akylas/simulatorcamera \
    https://github.com/Akylas/SimulatorCamera

# Install & launch
brew install simulatorcamera
open -a SimulatorCameraServer

Grab the signed & notarized .dmg straight from Releases — no Homebrew required.

⬇️  Download Latest Release

Mount the DMG, drag SimulatorCameraServer.app to Applications/, and open it.

Or build from source:

Bash
git clone https://github.com/Akylas/SimulatorCamera.git
cd SimulatorCamera
brew install xcodegen
xcodegen generate --spec apps/MacServer/project.yml
open apps/MacServer/SimulatorCameraServer.xcodeproj

Four ways to integrate.

Pick the approach that matches how much of your existing code you want to keep.

1

Launch the Mac server

Open SimulatorCameraServer.app, pick a source (webcam, video file, or test pattern), and click Start. The server streams frames on localhost:9876.

2a

Minimal — FrameSource delegate

The smallest integration: a single delegate callback gives you a CVPixelBuffer per frame.

Swift
import SimulatorCameraClient

final class CameraController: NSObject, FrameSourceDelegate {
    private let source: FrameSource

    override init() {
        #if targetEnvironment(simulator)
        source = SimulatorCameraSession(
            host: "127.0.0.1", port: 9876
        )
        #else
        source = AVCaptureFrameSource() // your existing wrapper
        #endif
        super.init()
        source.delegate = self
        source.start()
    }

    func frameSource(_ source: FrameSource,
                     didOutput pixelBuffer: CVPixelBuffer,
                     at time: CMTime) {
        // Feed to Vision, Core ML, preview layer…
    }
}
2b

Full AVCaptureSession drop-in

Prefix existing AVFoundation types with Simulator. Your setup code ports with a find-and-replace.

Swift
import SimulatorCameraClient

SimulatorCamera.configure(host: "127.0.0.1", port: 9876)

let session = SimulatorCaptureSession()
session.sessionPreset = .hd1280x720

guard let device = SimulatorCaptureDevice.default(for: .video)
else { return }

let input = try SimulatorCaptureDeviceInput(device: device)
session.addInput(input)

let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: frameQueue)
session.addOutput(output)

session.startRunning()  // kicks off the network session

Your existing captureOutput(_:didOutput:from:) fires with a valid CMSampleBuffer — same code path as the real device.
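That delegate body is plain AVFoundation, nothing simulator-specific. A sketch, assuming a CameraController class like the one in 2a:

```swift
import AVFoundation

extension CameraController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Standard extraction, identical on simulator and device.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // pixelBuffer + pts: hand off to Vision, Metal, a preview layer, etc.
        _ = (pixelBuffer, pts)
    }
}
```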

2c

Zero-change output swap

Already have an AVCaptureVideoDataOutputSampleBufferDelegate? Swap the output inside a simulator guard and keep your delegate unchanged.

Swift
#if targetEnvironment(simulator)
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
#else
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
session.addOutput(output)
#endif
2d

SwiftUI preview view

Drop-in SwiftUI component that renders the live camera feed — hardware-accelerated via AVSampleBufferDisplayLayer.

Swift
import SwiftUI
import SimulatorCameraClient

struct ContentView: View {
    var body: some View {
        SimulatorCameraPreviewView()
    }
}
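On a device build you would typically fall back to your real preview. A gated view might look like this (RealCameraPreview is a placeholder for your own AVCaptureVideoPreviewLayer wrapper, not part of the SDK):

```swift
import SwiftUI
#if targetEnvironment(simulator)
import SimulatorCameraClient
#endif

struct CameraPreview: View {
    var body: some View {
        #if targetEnvironment(simulator)
        SimulatorCameraPreviewView()
        #else
        RealCameraPreview() // placeholder: your existing device preview
        #endif
    }
}
```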

SCMF — Simulator Camera Message Format

A compact binary protocol over TCP. Each frame is a fixed 24-byte header followed by a JPEG payload. Designed for minimal latency on localhost.

Field          Offset   Size   Type         Description
magic          0        4 B    ASCII        "SCMF" — magic identifier
payloadLength  4        4 B    uint32 LE    Length of the JPEG payload in bytes
timestamp      8        8 B    Float64 LE   Frame wall-clock time in seconds
width          16       4 B    uint32 LE    Frame width in pixels
height         20       4 B    uint32 LE    Frame height in pixels
jpegData       24       N B    bytes        JPEG-encoded frame (payloadLength bytes)
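In practice a client reads the 24-byte header, then waits for payloadLength more bytes of JPEG. A decoding sketch under the layout above (illustrative only; the shipped parser lives in the SDK sources):

```swift
import Foundation

struct SCMFHeader {
    let payloadLength: UInt32
    let timestamp: Double   // seconds, wall-clock
    let width: UInt32
    let height: UInt32
}

// Returns nil until 24 bytes have arrived or if the magic doesn't match.
func parseSCMFHeader(_ data: Data) -> SCMFHeader? {
    guard data.count >= 24, data.prefix(4).elementsEqual("SCMF".utf8) else { return nil }
    return data.withUnsafeBytes { buf in
        SCMFHeader(
            payloadLength: UInt32(littleEndian: buf.loadUnaligned(fromByteOffset: 4, as: UInt32.self)),
            timestamp: Double(bitPattern: UInt64(littleEndian: buf.loadUnaligned(fromByteOffset: 8, as: UInt64.self))),
            width: UInt32(littleEndian: buf.loadUnaligned(fromByteOffset: 16, as: UInt32.self)),
            height: UInt32(littleEndian: buf.loadUnaligned(fromByteOffset: 20, as: UInt32.self))
        )
    }
}
// The JPEG payload is then the next payloadLength bytes starting at offset 24.
```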

Full spec: docs/PROTOCOL.md · Architecture: docs/ARCHITECTURE.md