Stream your Mac webcam, a video file, or a test pattern directly into any Simulator app.
Same AVCaptureSession API you already know — no private APIs, no device needed.
A lean two-piece tool: a macOS companion app that streams frames, and an iOS Swift Package that receives them — zero overhead on real devices.
- AVCaptureSession shim — prefix existing types with Simulator and your code just works.
- #if targetEnvironment(simulator) — compiles to nothing on a real iPhone.
- Built on Network.framework, CoreVideo, and ImageIO. App Store safe.
- CVPixelBuffer and CMSampleBuffer — feed them straight to VisionKit, CreateML, or your own pipeline.
- 127.0.0.1 by default — your camera feed never leaves your machine.

No more #if simulator TODOs:
```swift
// Every camera app ever written
#if targetEnvironment(simulator)
// TODO: fake it somehow 🤷
#else
let session = AVCaptureSession()
// real code here
#endif
```
```swift
import SimulatorCameraClient

// Same code path everywhere
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
```
Install the iOS SDK in your app, then grab the Mac companion to stream frames.
Add to your Package.swift:
```swift
dependencies: [
    .package(
        url: "https://github.com/Akylas/SimulatorCamera.git",
        from: "1.0.0"
    ),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(
                name: "SimulatorCameraClient",
                package: "SimulatorCamera"
            ),
        ]
    ),
]
```
Or in Xcode: File → Add Package Dependencies… → paste
https://github.com/Akylas/SimulatorCamera.git
```sh
# Tap the formula
brew tap akylas/simulatorcamera \
  https://github.com/Akylas/SimulatorCamera

# Install & launch
brew install simulatorcamera
open -a SimulatorCameraServer
```
Grab the signed & notarized .dmg straight from Releases — no Homebrew required.
Mount the DMG, drag SimulatorCameraServer.app to Applications/, and open it.
```sh
git clone https://github.com/Akylas/SimulatorCamera.git
cd SimulatorCamera
brew install xcodegen
xcodegen generate --spec apps/MacServer/project.yml
open apps/MacServer/SimulatorCameraServer.xcodeproj
```
Pick the approach that matches how much of your existing code you want to keep.
Open SimulatorCameraServer.app, pick a source (webcam, video file, or test pattern), and click Start. The server streams frames on localhost:9876.
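Before wiring up the client, you can sanity-check that the server is reachable. A minimal probe using plain Network.framework — nothing here is SimulatorCamera-specific:

```swift
import Network

// Probe the local frame server; the state flips to .ready once
// SimulatorCameraServer is listening on localhost:9876.
let probe = NWConnection(host: "127.0.0.1", port: 9876, using: .tcp)
probe.stateUpdateHandler = { state in
    if case .ready = state {
        print("Frame server reachable")
        probe.cancel()
    }
}
probe.start(queue: .main)
```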
**FrameSource delegate**

The smallest integration: a single delegate callback gives you a CVPixelBuffer per frame.
```swift
import SimulatorCameraClient
import CoreMedia
import CoreVideo

final class CameraController: NSObject, FrameSourceDelegate {
    private let source: FrameSource

    override init() {
        #if targetEnvironment(simulator)
        source = SimulatorCameraSession(host: "127.0.0.1", port: 9876)
        #else
        source = AVCaptureFrameSource() // your existing wrapper
        #endif
        super.init()
        source.delegate = self
        source.start()
    }

    func frameSource(_ source: FrameSource,
                     didOutput pixelBuffer: CVPixelBuffer,
                     at time: CMTime) {
        // Feed to Vision, Core ML, preview layer…
    }
}
```
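As one concrete use of that callback, each frame can go straight into Vision. A sketch using standard Vision API — the face-detection request is only an example; any VNRequest works:

```swift
import Vision
import CoreMedia

// Example frameSource(_:didOutput:at:) body: run face detection
// on every frame the simulator client delivers.
func frameSource(_ source: FrameSource,
                 didOutput pixelBuffer: CVPixelBuffer,
                 at time: CMTime) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Detected \(faces.count) face(s)")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```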
**AVCaptureSession drop-in**

Prefix existing AVFoundation types with Simulator. Your setup code ports with a find-and-replace.
```swift
import SimulatorCameraClient

SimulatorCamera.configure(host: "127.0.0.1", port: 9876)

let session = SimulatorCaptureSession()
session.sessionPreset = .hd1280x720

guard let device = SimulatorCaptureDevice.default(for: .video) else { return }
let input = try SimulatorCaptureDeviceInput(device: device)
session.addInput(input)

let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: frameQueue)
session.addOutput(output)

session.startRunning() // kicks off the network session
```
Your existing captureOutput(_:didOutput:from:) fires with a valid CMSampleBuffer — same code path as the real device.
Already have an AVCaptureVideoDataOutputSampleBufferDelegate? Swap the output inside a simulator guard and keep your delegate unchanged.
```swift
#if targetEnvironment(simulator)
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
#else
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
session.addOutput(output)
#endif
```
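Either way, the delegate body itself stays standard AVFoundation. For example, pulling the pixel buffer out of each sample — nothing here is SimulatorCamera-specific:

```swift
import AVFoundation

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Fires identically for SimulatorCameraOutput and AVCaptureVideoDataOutput.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("Frame \(width)x\(height)")
}
```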
Drop-in SwiftUI component that renders the live camera feed — hardware-accelerated via AVSampleBufferDisplayLayer.
```swift
import SwiftUI
import SimulatorCameraClient

struct ContentView: View {
    var body: some View {
        SimulatorCameraPreviewView()
    }
}
```
A compact binary protocol over TCP. Each frame is a fixed 24-byte header followed by a JPEG payload. Designed for minimal latency on localhost.
| Field | Offset | Size | Type | Description |
|---|---|---|---|---|
| magic | 0 | 4 B | ASCII | "SCMF" — magic identifier |
| payloadLength | 4 | 4 B | uint32 LE | Length of the JPEG payload in bytes |
| timestamp | 8 | 8 B | Float64 LE | Frame wall-clock time in seconds |
| width | 16 | 4 B | uint32 LE | Frame width in pixels |
| height | 20 | 4 B | uint32 LE | Frame height in pixels |
| jpegData | 24 | N B | bytes | JPEG-encoded frame (payloadLength bytes) |
Full spec: docs/PROTOCOL.md · Architecture: docs/ARCHITECTURE.md
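If you're writing your own receiver, decoding the header takes only a few lines. A minimal sketch matching the table above — SCMFHeader and its initializer are illustrative names, not part of the shipped client:

```swift
import Foundation

// Decodes the fixed 24-byte "SCMF" frame header described above.
// All multi-byte fields are little-endian, per the spec table.
struct SCMFHeader {
    static let byteCount = 24

    let payloadLength: UInt32
    let timestamp: Float64   // wall-clock seconds
    let width: UInt32
    let height: UInt32

    init?(parsing data: Data) {
        guard data.count >= Self.byteCount,
              data.prefix(4).elementsEqual("SCMF".utf8) else { return nil }
        let (length, time, w, h): (UInt32, UInt64, UInt32, UInt32) = data.withUnsafeBytes { raw in
            (raw.loadUnaligned(fromByteOffset: 4, as: UInt32.self),
             raw.loadUnaligned(fromByteOffset: 8, as: UInt64.self),
             raw.loadUnaligned(fromByteOffset: 16, as: UInt32.self),
             raw.loadUnaligned(fromByteOffset: 20, as: UInt32.self))
        }
        payloadLength = UInt32(littleEndian: length)
        timestamp = Float64(bitPattern: UInt64(littleEndian: time))
        width = UInt32(littleEndian: w)
        height = UInt32(littleEndian: h)
    }
}
```

After the header, read exactly payloadLength more bytes and hand the JPEG to ImageIO or UIImage(data:).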
MIT-licensed, no paid tier, no telemetry. Maintained on tips — if it saves you a device-build loop, consider sponsoring.