
HighwayHarmonics

Inspired by Mercedes’ Sound Drive: live car data controls how music plays. Built with Xcode and Swift — work in progress.

Status: In progress
Stack: Swift · Xcode · AudioKit
Hardware: OBD-II Bluetooth scanner
Platform: iOS

Overview

I saw Mercedes’ Sound Drive and was fascinated by the idea of steering music with live car data. My approach uses an OBD-II scanner that streams data over Bluetooth to the phone. The app’s audio engine then maps those signals to musical parameters, so driving literally shapes the sound.

How it works

  • Data: The OBD-II scanner sends live car data over Bluetooth to the iPhone (a rough Core Bluetooth sketch follows this list).
  • Mapping: Values like speed feed envelopes/curves that control stem volumes and filters.
  • Engine: AudioKit processes the stems; the app applies filter cuts and fades stems in and out based on the mapping.
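
As a concrete illustration of the data step, here is a minimal Core Bluetooth sketch for polling vehicle speed. It assumes a BLE adapter that exposes UART-style write/notify characteristics and answers ELM327-style commands; the class name, the onSpeed callback, and the simplified discovery logic are placeholders rather than the app's actual code. Mode 01 / PID 0D is the standard OBD-II request for vehicle speed.

```swift
import CoreBluetooth

/// Minimal sketch: connect to a BLE OBD-II adapter and poll vehicle speed (mode 01, PID 0D).
final class OBDReader: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var adapter: CBPeripheral?
    private var writeCharacteristic: CBCharacteristic?

    /// Called with the latest speed in km/h; this is what feeds the audio mapping.
    var onSpeed: ((Double) -> Void)?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil)   // in practice, filter by the adapter's service UUID
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        adapter = peripheral
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices(nil)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        peripheral.services?.forEach { peripheral.discoverCharacteristics(nil, for: $0) }
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService,
                    error: Error?) {
        for characteristic in service.characteristics ?? [] {
            if characteristic.properties.contains(.write) { writeCharacteristic = characteristic }
            if characteristic.properties.contains(.notify) { peripheral.setNotifyValue(true, for: characteristic) }
        }
    }

    /// ELM327-style request for vehicle speed: the ASCII command "010D".
    func requestSpeed() {
        guard let characteristic = writeCharacteristic,
              let data = "010D\r".data(using: .ascii) else { return }
        adapter?.writeValue(data, for: characteristic, type: .withResponse)
    }

    func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        // A typical reply looks like "41 0D 2A"; the third byte is speed in km/h. Parsing here is a placeholder.
        guard let raw = characteristic.value,
              let text = String(data: raw, encoding: .ascii) else { return }
        let bytes = text.split(separator: " ").compactMap { UInt8($0, radix: 16) }
        if bytes.count >= 3, bytes[0] == 0x41, bytes[1] == 0x0D {
            onSpeed?(Double(bytes[2]))
        }
    }
}
```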

Current focus

I’m deep in the AudioKit integration — a completely new world for me. There’s almost no material online for this exact context, so I’m figuring it out myself. Music theory isn’t the problem; it’s wiring up a reliable audio graph and getting the behavior right. UI is not a priority now: first solid audio logic, then Bluetooth connection; UI comes last.
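
For orientation, this is roughly the shape of graph I'm working toward: one player per stem, each running through its own low-pass filter and fader into a shared mixer. It's only a sketch, assuming AudioKit 5's AudioEngine, AudioPlayer, LowPassFilter, Fader, and Mixer nodes; the stem files and starting values are placeholders.

```swift
import AudioKit
import AVFoundation

/// Sketch of the stem graph: each stem plays through its own low-pass filter and fader,
/// and everything sums into one mixer that feeds the engine output.
final class StemEngine {
    let engine = AudioEngine()
    private(set) var players: [AudioPlayer] = []
    private(set) var filters: [LowPassFilter] = []
    private(set) var faders: [Fader] = []
    private let mixer = Mixer()

    init(stemURLs: [URL]) throws {
        for url in stemURLs {
            let file = try AVAudioFile(forReading: url)
            guard let player = AudioPlayer(file: file) else { continue }
            player.isLooping = true
            let filter = LowPassFilter(player, cutoffFrequency: 200)  // start mostly closed
            let fader = Fader(filter, gain: 0)                        // start silent
            players.append(player)
            filters.append(filter)
            faders.append(fader)
            mixer.addInput(fader)
        }
        engine.output = mixer
    }

    func start() throws {
        try engine.start()
        players.forEach { $0.play() }
    }
}
```

The idea behind this shape is that live values from the car only ever touch faders[i].gain and filters[i].cutoffFrequency, which keeps the mapping layer separate from the graph itself.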

Key behavior example

  • At standstill, music plays but only the bass is audible — with a high/low cut applied.
  • As speed increases, more stems fade in and their filters open progressively (see the mapping sketch below).
  • Smooth transitions (no clicks) via envelopes and live data updates.
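
Sketched as a plain mapping function, the idea looks like this. The stem order, speed thresholds, and cutoff range are invented numbers for illustration; only the shape mirrors the behavior above: bass keeps a gain floor so it stays audible at standstill, and the other stems fade in and open their filters as speed rises.

```swift
import Foundation

/// One stem's control values at a given moment.
struct StemLevel {
    var gain: Float     // 0…1
    var cutoff: Float   // low-pass cutoff in Hz
}

/// Illustrative mapping: bass is always audible, the other stems fade in above
/// their own speed threshold and their filters open as speed climbs.
func stemLevels(forSpeed kmh: Double) -> [StemLevel] {
    // (speed where the stem starts fading in, speed where it is fully open) — made-up values
    let ranges: [(start: Double, full: Double)] = [
        (0, 30),    // bass
        (15, 50),   // drums
        (30, 80),   // chords
        (50, 110),  // lead
    ]
    return ranges.enumerated().map { index, range in
        // 0 at the start threshold, 1 once the stem is fully open, clamped in between
        let t = max(0, min(1, (kmh - range.start) / (range.full - range.start)))
        // bass keeps a gain floor so it stays audible at standstill; the others start silent
        let gain = index == 0 ? Float(max(0.6, t)) : Float(t)
        // open the low-pass from 200 Hz toward 18 kHz as the stem comes in
        let cutoff = Float(200 + t * (18_000 - 200))
        return StemLevel(gain: gain, cutoff: cutoff)
    }
}

// At 0 km/h only the bass has gain and its filter is mostly closed;
// at 80 km/h three stems are fully in and the lead is halfway there.
let idle = stemLevels(forSpeed: 0)
let cruising = stemLevels(forSpeed: 80)
```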

What I’m learning

  • Designing an AudioKit graph (mixers, filters, nodes) for real-time control.
  • Mapping continuous sensor data to musical parameters (curves, thresholds, smoothing; a small smoothing sketch follows this list).
  • Swift and Xcode.
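
On the smoothing point, the kind of thing I mean is a one-pole (exponential) smoother applied to each incoming reading before it drives any audio parameter; the smoothing factor below is an arbitrary example value.

```swift
import Foundation

/// One-pole (exponential) smoother: each new reading moves the output only part
/// of the way toward the target, which keeps parameter changes click-free.
struct Smoother {
    private var value: Double
    let factor: Double   // 0…1, smaller = smoother but slower to react

    init(initial: Double = 0, factor: Double = 0.15) {
        self.value = initial
        self.factor = factor
    }

    mutating func update(_ target: Double) -> Double {
        value += (target - value) * factor
        return value
    }
}

// Usage: raw OBD-II speed readings go in, the smoothed value drives the mapping.
var speedSmoother = Smoother()
let smoothedSpeed = speedSmoother.update(42)   // e.g. a fresh reading of 42 km/h
```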

Roadmap

  • Finish the heart of the project: the audio logic.
  • Implement Bluetooth classes for OBD-II scanner connection.
  • Fine-tune the audio logic for even smoother transitions.
  • Build a minimal UI after the engine and Bluetooth are stable.
  • In-car testing and iteration.
  • Integrate some form of AI in the future.