Why iOS Caps External USB DACs at 48 kHz
Plug a 384 kHz / DSD256-capable USB DAC into an iPhone or iPad and it will receive 44.1 kHz or 48 kHz audio. Every time. From every app. This is not an Apple Music bug — it is the behavior of iOS Core Audio itself, which routes all audio through a fixed-rate system mixer and does not expose the Hardware Abstraction Layer APIs that macOS uses to drive USB DACs at their native rates. setPreferredSampleRate: accepts higher values as a hint and then ignores them. Neutron Music Player, the most audiophile-focused player on the App Store, acknowledges this limit in its own FAQ. If you want bit-perfect hi-res to a USB DAC from an Apple device, you need a Mac.
There’s a specific moment that sends audiophiles down a rabbit hole. You plug your new USB DAC — something with a marketing sticker that says “DSD256” and “PCM 384 kHz” — into an iPad or iPhone. You open Apple Music. You play a track that is clearly labelled “Hi-Res Lossless 24-bit/192 kHz”. And your DAC’s hi-res indicator doesn’t light up. It’s showing the green LED that means “standard rate.” You try a different track. You try Tidal. You try your own audiophile app. Same result.
This isn’t a marketing lie from the DAC vendor, and it isn’t Apple Music being sloppy. It’s an architectural property of iOS Core Audio that has been true since iOS 7 and remains true on today’s A17- and A18-class iPads and iPhones. This post explains exactly why, with the named APIs, the forum posts where developers hit it, and the specific reason none of the obvious workarounds work.
The test case, with real hardware
The Questyle M15C is a convenient device for this investigation because it has a two-LED indicator system on its body. The behavior is documented in the Head-Fi review:
| Indicator | Meaning |
|---|---|
| Green | PCM ≤ 48 kHz |
| Single red | PCM 88.2 – 192 kHz |
| Dual red | PCM 352.8 – 384 kHz, or DSD up to DSD256 |
Connected to a Mac running macOS with Apple Music and an Apple Digital Master track: dual red, as expected. Connected to an M4 iPad Pro or an iPhone Air running the same Apple Music app with the same track file: green. The track is unambiguously a 24-bit/192 kHz master on the server — you can verify this in the Apple Music info panel — but by the time it reaches the DAC it’s been resampled to 44.1 kHz.
This is reproducible with every USB DAC we’ve tested on iOS, not just the M15C. It’s the platform.
The iOS Core Audio data path
On macOS, the audio path for an app talking to a USB DAC looks like this:
```
App → AVAudioEngine / AudioUnit → AUHAL → HAL → USB Audio Class 2 driver → DAC
                                    ^
                                    |
                  App can query/set nominal sample rate here
                  via kAudioDevicePropertyNominalSampleRate
```

The critical link is AUHAL, the “Hardware Abstraction Layer” audio unit. It is the direct connection from your app’s audio graph to a specific hardware device. Using AUHAL, a macOS app can:
- Enumerate all nominal sample rates the device supports via `kAudioDevicePropertyAvailableNominalSampleRates`
- Set the device’s current rate via `kAudioDevicePropertyNominalSampleRate`
- Open the device in hog mode (exclusive access) to bypass the system mixer entirely
- Receive bit-exact samples at whatever rate the hardware is running at
On iOS, the Core Audio architecture looks different:
```
App → AVAudioEngine / AURemoteIO → System mixer (fixed rate) → Route → DAC
                                         ^
                                         |
                           This is where the ceiling lives.
                           No API reaches past it.
```

Apple’s own documentation is explicit about the difference:
“In iOS, Core Audio provides ways to achieve real-time audio using higher level interfaces… The AURemoteIO unit in iOS (as opposed to the AUHAL unit in OS X) allows you to pass audio from another audio unit to hardware.”
There is no AUHAL on iOS. There is no HAL API surface. There is no hog mode, no exclusive access, no kAudioDevicePropertyNominalSampleRate. The system mixer sits between every app and every output device — including USB DACs — and it runs at a rate the system chooses, not a rate your app chooses.
What setPreferredSampleRate: actually does
AVAudioSession.setPreferredSampleRate(_:) looks like it should be the answer. You tell iOS you’d like 192 kHz, activate the session, and then the mixer should run at 192 kHz, right? That’s not what happens. Here’s what the session reports if you instrument it carefully:
```
[apple-audio] isOtherAudioPlaying=false
[apple-audio] route.output[0] type=Headphones name=Questyle M15C
[apple-audio] audio session activated (requested=Some(768000) preferred_after_set=768000 actual=Some(44100))
[apple-audio] mixer output: sr=44100 ch=2
```

The interesting detail is the split between `preferred_after_set` and `actual`:
- `preferredSampleRate` — the value iOS stored as your preference. It returns whatever you passed, up to a very large number, and always accepts your request without error.
- `sampleRate` (after `setActive(true)`) — the rate the system mixer is actually running at. Clamped to 44.1 or 48 kHz for USB DAC routes.
This is not undocumented. Apple’s Technical Q&A 1631 makes it clear that “preferred” means exactly that:
“These preferred values are simply hints to the operating system. The actual buffer duration or sample rate may be different once the AVAudioSession has been activated.”
The important observation is that preferredSampleRate is a hint, not a command, and that iOS is under no obligation to honor it — and in practice, for USB DAC routes, never does above the mixer ceiling.
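Because the preference is stored verbatim while the mixer does what it wants, the only reliable way to know what happened is to compare the two values after activation. A plain-Rust check mirroring the `preferred_after_set` vs `actual` split in the log (this is our helper, not an Apple API):

```rust
/// Decide, from the two rates the session reports after activation,
/// whether the route actually honored a sample-rate request.
/// Our check, not an AVAudioSession API.
fn request_was_honored(preferred_after_set: f64, actual: f64) -> bool {
    // The session reports floating-point rates; treat values within
    // 1 Hz of each other as equal.
    (preferred_after_set - actual).abs() < 1.0
}
```

On iOS with a USB DAC route this check fails for anything above 48 kHz; on macOS, once HAL has set the device's nominal rate, it passes.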
The route classification wrinkle (which also doesn’t matter)
iOS exposes a port type for each audio route via `AVAudioSessionPortDescription.portType`. The two relevant values for this discussion are:

- `AVAudioSessionPortUSBAudio` — a generic USB audio device
- `AVAudioSessionPortHeadphones` — a headphone-style analog output
Many users (us included, initially) notice that their USB DAC is reported as Headphones rather than USBAudio and latch onto this as the cause: “iOS thinks my M15C is just headphones, so it applies a Headphones-class 48 kHz ceiling.” This is a reasonable hypothesis and it is wrong.
It is wrong because:
- The ceiling applies identically when the DAC enumerates as `USBAudio`. Neutron Music Player, Roon’s iOS client, and every other audiophile app report the same 44.1/48 kHz clamp regardless of port type.
- iOS’s own route classification is driven by the DAC’s USB descriptors and whether it advertises an analog-output endpoint. Some DACs enumerate as `Headphones` on USB-C iPads simply because they expose a 3.5 mm jack. This is cosmetic — it changes what the route prints as, not how the mixer runs.
- The only place port classification does matter is that iOS forces different default rates for USB Microphone (48 kHz) vs generic USB audio (44.1 kHz) on the input side. That’s well-documented in the Neutron Forum iOS 18 thread, but it doesn’t open a hi-res door on the output side.
The port type is a red herring. The mixer ceiling is not contingent on it.
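The asymmetry is easy to state as a model: classification shifts the input-side default, while the output ceiling is constant. Sketched in plain Rust (the enum and both functions are ours, encoding the forum-reported defaults, not any AVAudioSession type):

```rust
/// Our model of USB port classification on iOS, not an Apple type.
#[derive(Clone, Copy, PartialEq)]
enum UsbPortClass {
    Microphone,   // device enumerating as a USB microphone
    GenericAudio, // generic USB audio device
}

/// Default *input* rate iOS applies per class, per the Neutron
/// Forum iOS 18 thread.
fn default_input_rate_hz(class: UsbPortClass) -> u32 {
    match class {
        UsbPortClass::Microphone => 48_000,
        UsbPortClass::GenericAudio => 44_100,
    }
}

/// The *output* ceiling is the same no matter how the port classifies.
fn output_ceiling_hz(_class: UsbPortClass) -> u32 {
    48_000
}
```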
What Neutron, Roon, and the serious audiophile apps say
The most useful signal in this space comes from developers who are strongly motivated to find a way past the ceiling and have failed. These are not lazy developers. They have financial incentive to deliver hi-res to USB DACs on iOS.
Neutron Music Player
Neutron is the canonical audiophile iOS player. It advertises 32-bit/3072 kHz hi-res processing, native DSD decoding, a custom resampling engine, and a “bit-perfect” mode that disables every DSP stage. Their own FAQ entry on bit-perfect output is unambiguous:
“On iOS, normally Apple devices support only 2 frequencies: 44100 and 48000 Hz, so you can achieve Bit-Perfect output with these frequencies only.”
A developer whose entire business is pushing hi-res audio through iOS is telling paying customers that the ceiling is 48 kHz. That is not a marketing concession — it’s an engineering admission.
Roon
Roon’s community thread on iPad USB DAC output contains the clearest statement of the architectural limit:
“Only ASIO entirely bypasses the OS mixer. Fixed volume doesn’t affect this either.”
ASIO is a Windows-only driver standard. On iOS there is no equivalent. The system mixer is mandatory.
Apple Music itself
Multiple Apple Community threads document that Apple Music on iPad ships hi-res metadata to the UI while the actual USB output is clamped to 44.1 kHz. The Darko.Audio coverage of the issue notes that Apple’s own first-party apps have not found a workaround, and an Apple support representative quoted in one thread confirmed the feature is “not supported” on iPad for USB-C hi-res.
The conclusion from all three sources is identical. It isn’t coming from people who haven’t tried.
The workarounds that don’t work
When you run into this limit it’s tempting to try every escape hatch. We tried most of them. Here’s the list and why none of them help:
Raise the preferredSampleRate before activation
`setPreferredSampleRate(192000)` → `setActive(true)` → `session.sampleRate == 44100`. Accepted without error, silently ignored.
Deactivate and reactivate the session
Some AVAudioSession issues resolve after a deactivate/reactivate cycle because iOS re-enumerates the route. It does not unlock higher sample rates — the mixer still runs at 44.1/48.
Try AVAudioSessionModeMoviePlayback or other modes
The mode setting on AVAudioSession affects route selection, ducking behavior, and default I/O options — not mixer rate. We tested all the modes. Every one of them produces the same ceiling on USB output.
Rebuild the AVAudioEngine graph to match the source rate
`engine.connect(playerNode, to: mainMixer, format: AVAudioFormat(standardFormatWithSampleRate: 192000, channels: 2))` runs without errors and the mixer reports its input format as 192 kHz. Then the mixer internally resamples to the output node’s 44.1 kHz before the USB stage. The resampling is happening below our graph, not above it.
Use a lower-level audio unit
On iOS the only audio unit that talks to hardware is AURemoteIO. It sits on top of the same system mixer. There is no AUHAL equivalent on iOS. This is an iOS architectural decision, not a missing header file.
Hope iOS 19 fixes it
This behavior has been stable across iOS 13, 14, 15, 16, 17, 18. If Apple intended to expose HAL-level USB DAC control to third-party apps it would have shipped by now. Our read: they don’t, and likely won’t, because it would require a meaningful restructuring of Core Audio on iOS.
What macOS does differently
Every one of the above failures has a macOS counterpart that works. The same codebase, the same AVAudioEngine patterns, with one extra stage before the engine starts:
```rust
// Enumerate supported rates for the device (HAL property, macOS only)
let addr = AudioObjectPropertyAddress {
    mSelector: kAudioDevicePropertyAvailableNominalSampleRates,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain,
};
let rates = AudioObjectGetPropertyData(
    device_id,
    &addr,
    // ...
);

// Pick the highest rate ≤ source rate
let target_rate = choose_best_rate(source_rate, &rates);

// Set the device's hardware rate
let addr = AudioObjectPropertyAddress {
    mSelector: kAudioDevicePropertyNominalSampleRate,
    ..addr
};
AudioObjectSetPropertyData(
    device_id,
    &addr,
    &target_rate,
    // ...
);

// Now AVAudioEngine will see the device running at target_rate
// and will connect nodes at that rate with no resampling
```

The M15C connected to a Mac via USB-C will light up its dual red indicator when playing a 352.8 kHz PCM track because our app directly sets the device’s nominal rate via HAL, then builds the AVAudioEngine graph at that rate, and the hardware actually runs at 352.8 kHz. None of that is available on iOS.
Why Apple (probably) does this
We can’t speak for Apple’s design decisions, but the plausible reasons are visible in the shape of the API:
- Mixer-first design for battery life and glitch-free playback. A fixed-rate mixer means every app shares a single resampling cost, the mixer can schedule buffers aggressively for battery, and no app can misconfigure hardware in a way that disrupts another app or a phone call.
- Route changes are constant on iOS. Plug in headphones, unplug them, connect Bluetooth, change a phone call source — the route changes dozens of times a day on a phone. Keeping the mixer at a fixed rate and resampling at the boundaries means route changes don’t have to renegotiate with every active audio client.
- No direct hardware access is safer. On a phone, a buggy third-party app that can configure the DAC directly is a reliability risk for system-critical audio (calls, alerts, navigation).
Whatever the motivation, the result is the same: the iOS audio graph is mixer-centric in a way the macOS graph is not, and the mixer’s rate is not a knob that third-party apps (or, per the forum reports, Apple’s own Music app) can turn.
What this means if you care about hi-res on iOS
The practical advice falls out of the architectural reality:
- If bit-perfect hi-res to an external USB DAC is what you want, use a Mac. macOS Core Audio gives you everything iOS does not: HAL access, hardware rate negotiation, hog mode. Every hi-res USB DAC on the market works the way you’d expect on macOS.
- On iOS, the best realistic target is 48 kHz bit-perfect. Configure your player to output 48 kHz, disable any DSP you don’t need, and accept that your DAC’s hi-res indicator is going to stay off.
- The Questyle M15C green indicator on iOS is correct. It’s telling you what’s actually on the wire. The Apple Music “Hi-Res Lossless 24-bit/192 kHz” label is telling you what was on the wire before the iOS mixer got to it. Both are honest about different things.
- For local files on iOS, you might as well resample on the device. If iOS is going to downsample your 24-bit/192 kHz FLAC to 48 kHz anyway, doing the resampling in your app with a good-quality filter is better than letting the system mixer do it with its default filter.
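To make the last point concrete, here is a toy linear-interpolation resampler in Rust. This is only to show the shape of in-app resampling; a shipping app should use a windowed-sinc or polyphase filter (or an existing resampling library) for audible quality:

```rust
/// Toy linear-interpolation resampler for mono f32 samples.
/// Illustrative only: linear interpolation aliases audibly; a real
/// player would use a windowed-sinc or polyphase filter instead.
fn resample_linear(input: &[f32], from_hz: f64, to_hz: f64) -> Vec<f32> {
    if input.is_empty() || from_hz <= 0.0 || to_hz <= 0.0 {
        return Vec::new();
    }
    // How far we step through the input per output sample.
    let ratio = from_hz / to_hz;
    let out_len = ((input.len() as f64) / ratio).floor() as usize;
    let mut out = Vec::with_capacity(out_len);
    for i in 0..out_len {
        let pos = i as f64 * ratio;
        let idx = pos as usize;
        let frac = (pos - idx as f64) as f32;
        // Interpolate between the two neighboring input samples,
        // clamping at the end of the buffer.
        let a = input[idx];
        let b = input[(idx + 1).min(input.len() - 1)];
        out.push(a + (b - a) * frac);
    }
    out
}
```

Downsampling a 192 kHz buffer to 48 kHz this way yields one output sample per four input samples, produced by your code with your chosen filter, before the system mixer ever sees the stream.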
We’d love to be wrong about this. If anyone reading this has a reproducible way to get >48 kHz PCM to a USB DAC from iOS 17+ on A16-class hardware or newer, we want to know. Until then, the iOS hi-res USB DAC story is: the cable carries the bits the mixer sends it, and the mixer is running at 44.1 or 48 kHz.