Vol. 01 · Rel. 0.1.0 · macOS 13+ · Apple Silicon

Adaptive Screen

— pro lumine ambientis —

A menu bar instrument for keeping your display honest at every hour of the day.

Your eyes adapt to ambient light in seconds; your monitor does not. Adaptive Screen watches the room through your webcam in seven-hundred-millisecond glimpses — brief enough that the camera LED flickers and goes dark — and drives your external display's brightness, contrast, and warmth across the curve you set, or the curve it learns from watching you adjust over the course of a week.

No cloud round-trip. No frame retention. No marketing overlay. The webcam is powered on for less than a second at a time and only the Y-plane is read. What gets persisted is exactly one 64-bit float.

Download AdaptiveScreen.dmg · 412 KB · v0.1.0
Signed with a Developer ID certificate · Notarized by Apple · No App Store
01

Method

camera → luma → curve → DDC/CI

Snapshot

Every 20 to 60 minutes the app opens the camera, waits ~700 ms for exposure to settle, captures one frame, stops the camera. The LED blinks briefly.
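A minimal sketch of that cycle with AVFoundation, assuming default-device selection and omitting the error handling, `canAdd` checks, and Continuity Camera filtering a real implementation needs; the class name is illustrative, not the app's actual code:

```swift
import AVFoundation

/// One snapshot: start the session, let auto-exposure settle ~700 ms,
/// keep a single frame, then stop the session so the LED goes dark.
final class SnapshotSketch: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private var settled = false
    private var deliver: ((CVPixelBuffer) -> Void)?

    func captureOneFrame(_ deliver: @escaping (CVPixelBuffer) -> Void) throws {
        self.deliver = deliver
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "luma.capture"))
        session.addOutput(output)
        session.startRunning()                              // LED turns on
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.7) { self.settled = true }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard settled, let frame = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        settled = false
        session.stopRunning()                               // LED goes dark
        deliver?(frame)
    }
}
```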

Compute

The Y-plane of the pixel buffer is averaged into a single luminance scalar 0–255. The pixel buffer is never copied or decoded into an image format.
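The averaging step amounts to one integer pass over the plane. A sketch under the assumption of an 8-bit planar Y buffer (function name is illustrative); note that `bytesPerRow` may exceed `width` because rows are padded, so the padding must be skipped rather than averaged in:

```swift
import Foundation

/// Average a planar luminance (Y) buffer into one scalar in 0...255,
/// skipping the row-padding bytes beyond `width` in each row.
func meanLuma(_ bytes: [UInt8], width: Int, height: Int, bytesPerRow: Int) -> Double {
    var sum = 0
    for row in 0..<height {
        let start = row * bytesPerRow
        for col in 0..<width {
            sum += Int(bytes[start + col])   // accumulate as integers, no image decode
        }
    }
    return Double(sum) / Double(width * height)
}
```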

Curve

The luma is looked up against your control curve, or — if you haven't drawn one yet — against a weighted-mean model trained on your own manual brightness adjustments.
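The curve lookup is plain piecewise-linear interpolation between sorted control points. A minimal sketch; the function name and the 50 % fallback for an empty curve are illustrative, not the app's API:

```swift
import Foundation

/// Piecewise-linear lookup of a brightness level (0–100) from a luma
/// reading (0–255) against sorted control points.
func brightness(forLuma luma: Double,
                curve: [(luma: Double, level: Double)]) -> Double {
    guard let first = curve.first, let last = curve.last else { return 50 }
    if luma <= first.luma { return first.level }   // clamp below the curve
    if luma >= last.luma { return last.level }     // clamp above the curve
    for i in 1..<curve.count where luma <= curve[i].luma {
        let (x0, y0) = curve[i - 1]
        let (x1, y1) = curve[i]
        let t = (luma - x0) / (x1 - x0)            // position within the segment
        return y0 + t * (y1 - y0)
    }
    return last.level
}
```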

DDC/CI

The target brightness rides over DDC/CI to the monitor — the same bus its OSD buttons speak on. No software tint, no framebuffer overlay, no GPU color manipulation.
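On the wire this is the DDC/CI "Set VCP Feature" message; in the VESA MCCS tables, VCP code 0x10 is luminance. A sketch of the framing and XOR checksum, separate from the IOAVService transport, assuming the standard packet layout:

```swift
import Foundation

/// Build a DDC/CI "Set VCP Feature" payload. The checksum XORs the
/// destination address 0x6E with every payload byte.
func setVCPPacket(code: UInt8, value: UInt16) -> [UInt8] {
    var packet: [UInt8] = [
        0x51,                    // source address
        0x84,                    // 0x80 | payload length (4)
        0x03,                    // Set VCP Feature opcode
        code,                    // e.g. 0x10 for luminance
        UInt8(value >> 8),
        UInt8(value & 0xFF),
    ]
    packet.append(packet.reduce(0x6E) { $0 ^ $1 })   // checksum
    return packet
}
```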

02

Calibration

your curve, or one it learns

Define your own ambient-light-to-brightness mapping by dragging control points on a grid. Click empty space to add a point; click an existing point to remove it. A purple marker tracks the live luminance reading against your curve so you can see the target brightness move in real time.

If you haven't drawn a curve, Adaptive Screen uses a small weighted-mean model trained on the brightness choices you make manually — partitioned by time-of-day bucket so your late-night preference doesn't interfere with your morning one. Once the model has enough signal you can bake it into a curve and edit from there.
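The fallback model can be sketched as a mean over recorded adjustments, weighted by how close each sample's luma was to the current reading, with a coarse time-of-day bucket keeping the partitions separate. The weighting kernel, bucket width, and type names here are illustrative, not the app's actual model:

```swift
import Foundation

/// One manually chosen brightness, recorded with the luma at that moment.
struct Adjustment { let luma: Double; let level: Double }

/// Predict a brightness level from past adjustments in the current bucket,
/// weighting each sample by proximity in luma to the current reading.
func predictLevel(luma: Double, samples: [Adjustment]) -> Double? {
    guard !samples.isEmpty else { return nil }
    var weighted = 0.0, total = 0.0
    for s in samples {
        let w = 1.0 / (1.0 + abs(s.luma - luma))   // nearer lumas count more
        weighted += w * s.level
        total += w
    }
    return weighted / total
}

/// Four coarse buckets (0–5, 6–11, 12–17, 18–23) keep late-night
/// choices from polluting morning ones.
func bucket(forHour hour: Int) -> Int { hour / 6 }
```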

[Curve editor preview: live luma marker on a dark-to-bright axis, brightness 0 %–100 %]
03

Solar cadence

denser samples when light moves fastest

Morning and evening demand denser samples — the light changes fastest then. The snapshot scheduler uses your stored latitude and longitude to compute local sunrise and sunset from NOAA's solar algorithm, then switches automatically: 20-minute intervals within an hour of either event, 60 minutes otherwise. Set your location by clicking the embedded MapKit view, or let the app infer it from your time zone.
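The switching logic itself is small; the NOAA sunrise/sunset calculation that feeds it is not reproduced here. A sketch, with all times as seconds since local midnight and the function name illustrative:

```swift
import Foundation

/// Choose the next snapshot interval: 20 minutes within an hour of
/// sunrise or sunset, 60 minutes otherwise.
func sampleInterval(now: TimeInterval,
                    sunrise: TimeInterval,
                    sunset: TimeInterval) -> TimeInterval {
    let nearSunEvent = min(abs(now - sunrise), abs(now - sunset)) <= 3600
    return nearSunEvent ? 20 * 60 : 60 * 60
}
```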

[Timeline 00:00–24:00: dense 20-minute sampling within an hour of each sun event, 60-minute sampling otherwise]
04

Instruments

what's in the menu
  1. 01

    Per-monitor profiles

    Every setting — curve, warmth, mode, friendly name — is keyed on the monitor's EDID (vendor × model × serial). The Dell at the office and the LG at home each remember their own adjustments, no matter which Mac you plug them into.

  2. 02

    Idle on built-in

    When only the MacBook display is attached, the app sits quietly. macOS handles that one better than any third-party tool could.

  3. 03

    Native controls, indexed

    Color preset, picture mode, and input source are populated by parsing the monitor's DDC/CI capabilities string, not a hard-coded list. If your monitor exposes a Low-Blue-Light mode, it shows up here.

  4. 04

    iCloud key-value sync

    Curves, monitor names, manual overrides, and location ride between Macs through the ubiquitous KV store. One account, one tuning — across every desk.

  5. 05

    Warmth adjustment

    A single slider maps to per-channel DDC RGB gains, preceded by a User-1 preset write that unlocks the gain registers on monitors that gate them behind a mode switch.

  6. 06

    Debug API

    A JSON snapshot of every monitor read, camera transition, and scheduler decision is written to Application Support every three seconds, so "it doesn't work on mine" is a solvable problem, not a shrug.
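The per-monitor keying in item 01 can be sketched as a small identity struct derived from the EDID; field and type names here are illustrative, not the app's actual schema:

```swift
import Foundation

/// Profiles are keyed on the monitor's identity from its EDID, not on the
/// port or Mac it happens to be attached to.
struct MonitorKey: Hashable, Codable {
    let vendorID: UInt16   // EDID manufacturer ID
    let productID: UInt16  // EDID product code
    let serial: UInt32     // EDID serial number

    /// Stable string form used to key stored settings.
    var storageKey: String { "\(vendorID)-\(productID)-\(serial)" }
}
```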
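The capabilities parsing in item 03 can be sketched as a minimal reader of the `vcp(...)` group. This handles the common `vcp(.. ..(.. ..) ..)` shape, not the full capabilities grammar, and the function name is illustrative:

```swift
import Foundation

/// Pull the `vcp(...)` group out of a DDC/CI capabilities string and return
/// each VCP code with its advertised discrete values (an empty array means
/// the code is continuous).
func parseVCP(_ capabilities: String) -> [UInt8: [UInt8]] {
    guard let start = capabilities.range(of: "vcp(")?.upperBound else { return [:] }
    // Collect the group body up to its matching closing parenthesis.
    var depth = 1
    var body = ""
    for ch in capabilities[start...] {
        if ch == "(" { depth += 1 }
        if ch == ")" { depth -= 1; if depth == 0 { break } }
        body.append(ch)
    }
    // Tokenize: pad parentheses with spaces, then read hex bytes in order.
    let tokens = body.replacingOccurrences(of: "(", with: " ( ")
                     .replacingOccurrences(of: ")", with: " ) ")
                     .split(separator: " ")
    var table: [UInt8: [UInt8]] = [:]
    var lastCode: UInt8?
    var inValueList = false
    for token in tokens {
        switch token {
        case "(": inValueList = true
        case ")": inValueList = false
        default:
            guard let byte = UInt8(token, radix: 16) else { continue }
            if inValueList, let code = lastCode {
                table[code, default: []].append(byte)   // discrete value
            } else {
                table[byte] = []                        // new VCP code
                lastCode = byte
            }
        }
    }
    return table
}
```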
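The warmth mapping in item 05 comes down to tapering two channels. In the MCCS tables, VCP codes 0x16, 0x18, and 0x1A are the red, green, and blue video gains; the taper constants below are illustrative, not the app's actual tuning:

```swift
import Foundation

/// Map one warmth slider (0 = neutral, 1 = warmest) onto per-channel DDC
/// gains. Warming leaves red alone and pulls green and blue down, green
/// less steeply than blue.
func rgbGains(warmth: Double, neutral: UInt16 = 100) -> (red: UInt16, green: UInt16, blue: UInt16) {
    let w = min(max(warmth, 0), 1)                 // clamp the slider
    let green = Double(neutral) * (1 - 0.25 * w)
    let blue  = Double(neutral) * (1 - 0.50 * w)
    return (neutral, UInt16(green.rounded()), UInt16(blue.rounded()))
}
```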
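The debug snapshot in item 06 can be sketched with Codable; the field names are assumptions, not the app's actual schema:

```swift
import Foundation

/// Shape of the periodic debug snapshot.
struct DebugSnapshot: Codable {
    let timestamp: Date
    let lastLuma: Double?
    let cameraState: String          // e.g. "idle", "settling", "capturing"
    let nextSampleIn: TimeInterval   // seconds until the next snapshot
    let monitors: [String: Int]      // profile key -> last brightness written
}

/// Serialize the snapshot and replace the previous file atomically, so a
/// reader never sees a half-written JSON document.
func writeSnapshot(_ snapshot: DebugSnapshot, to url: URL) throws {
    let encoder = JSONEncoder()
    encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
    try encoder.encode(snapshot).write(to: url, options: .atomic)
}
```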

05

On the camera

the only part that matters

Adaptive Screen never sees an image.

The webcam is powered on for roughly 700 ms per snapshot — just long enough for exposure to settle and one frame to arrive. The pixel buffer is locked, the Y-plane (luminance only, no chroma) is read byte-by-byte into an integer accumulator, and the buffer is unlocked. What gets persisted is exactly one Double. There is no image storage, no network transmission, no frame retention past the scope of the function that computed the average.

The whole camera-handling surface is about a hundred lines of Swift: one file, LightSensor.swift, readable in a sitting.

One scalar per frame. That's the whole signal surface.
06

Specifications

what you'll need to run it
Operating system: macOS 13 Ventura or later
Architecture: Apple Silicon (M-series)
Display: External monitor with DDC/CI over DisplayPort, HDMI, or Thunderbolt
Camera: Any built-in or USB webcam; the app filters out Continuity Camera
Distribution: Developer ID Application, notarized by Apple · not sandboxed
Private SPI: IOAVService, the same interface used by Lunar, BetterDisplay, and m1ddc
Bundled code: 100 % first-party Swift · no third-party binaries
Size: 412 KB .dmg · ~1.2 MB installed
07

Distribution

one download, two formats
.dmg 412 KB · v0.1.0

AdaptiveScreen.dmg

Disk image for macOS 13 or later, Apple Silicon. Signed with a Developer ID certificate and notarized by Apple.

Download

First launch will prompt for camera access. Grant it; the app will start sampling and the menu-bar symbol settles on its reading within a few seconds. If you previously denied the prompt, use Reset camera permission from the app menu.