The iPad's Software Problem Is Permanent
This is a $2,000 iPad Pro with an M5 chip, 16 gigs of RAM, and more GPU cores than an iMac. And this,
this is a 5-year-old M1 Mac Mini in a 3D-printed case that cost just $800 when it was new. And the performance is not close: the iPad absolutely destroys this Mac Mini. One of
these devices can run a 20 billion parameter AI model, but the other won't even attempt to load
it. One of these devices can edit RED RAW camera footage natively, the other doesn't support the
codec. One of these devices can tweak settings inside of Death Stranding to balance performance
and quality, but the other gives you just two options, HDR on or off. Guess which one is which.
My friend Jason Snell recently wrote something that's been rattling around in my head. There
was a time when Apple clearly viewed the iPad as the future of computing, while the Mac seemed
destined for obsolescence. Now, ironically, at a moment when the Mac has roared back to the center
of Apple's universe, the iPad feels closer than ever to fulfilling its original promise.
Except it doesn't, not really, because while the iPad has gained windowing and external display
support, pro apps, all the trappings of a real computer, underneath it all, iPadOS is still a
fundamentally mobile operating system with mobile constraints baked into its very DNA. Meanwhile,
the Mac is rumored to be getting everything the iPad does best: touchscreens, OLED displays,
thinner designs. I mean, we're heading towards the future that Apple swore would never come, where
the Mac becomes more like the iPad. And the iPad itself, despite being more advanced than ever,
seems forever tied to architectural decisions made 15 years ago when it was criticized upon launch
for being just a big iPhone. And the proof that this isn't thermal throttling, weak hardware, or even a lack of pro apps, but instead a fundamental OS limitation and the way Apple treats the iPad as a second-class citizen, is a month-long investigation that I just wrapped: testing, benchmarking, and discovering three core problems that make five-year-old budget Macs like this one objectively more capable than this $2,000 iPad Pro. I'm calling them the jettison problem,
curated limitations, and the foundation gap, and together they explain why this dynamic is not
really likely to change anytime soon. Computers run out of RAM all of the time. Macs especially,
because until recently Apple was shipping entry-level configs with a paltry 8 gigs.
What distinguishes a real computer from something that isn't, isn't whether it runs out of memory,
it's instead how the operating system deals with that reality. And the best way to demonstrate
this is to deliberately stress nearly all available memory by loading a local AI model,
something that more and more people are doing. Now I wanted to run GPT OSS,
it's a newish trendy 20 billion parameter model that has been quantized and optimized for Apple's
MLX. And it's a fairly small model, but it still takes about 11 to 13 gigabytes of RAM when loaded
into memory. And since both my M5 iPad Pro and my M5 MacBook Pro have the same chip, same RAM,
same storage, I tried loading the model onto both, and the iPad refused. Not just in one app,
but in multiple apps designed specifically for running local AI models. The moment any app
attempted to allocate that much memory, the app would either crash or politely inform me that the
model was just too large to run on this device. Now, running those same App Store distributed apps
on the Mac didn't fare much better. Unlike on the iPad, the models would load into memory, but the memory pressure was so high that they would just get stuck in a repeat loop. And this
is a pretty chronic symptom of insufficient RAM when running local AI models. And I mean,
it makes sense. While the model is trying to use 13 gigabytes of memory,
Activity Monitor is suggesting that the operating system is using 2 to 4 gigs by itself. And you
can't give the model more RAM than exists in the machine, right? But then I tried LM Studio,
a popular tool downloaded from the web, and I disabled its default memory safety guardrails.
The system ground to a crawl, taking multiple seconds to switch between windows, beach balls everywhere. But the model did load, and it ran pretty quickly too, at about 50 tokens per
second. A rather impressive speed. So I tried this on an M4 MacBook Air. And well, that also
worked. And I tried it on this five-year-old M1 Mac Mini. It ran it too, just more slowly.
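The memory math behind all of this is simple enough to sketch. A minimal Python back-of-envelope using the video's numbers (Activity Monitor's 2 to 4 gig range for the OS is approximate, and this ignores every other running app):

```python
# A 13 GB model on a 16 GB machine whose OS alone uses 2 to 4 GB leaves,
# at best, a sliver of headroom -- and at worst a shortfall that only
# swap can cover.
PHYSICAL_RAM_GB = 16
MODEL_GB = 13

def swap_needed(model_gb, os_gb, ram_gb):
    """Gigabytes that must spill to swap (0 if everything fits)."""
    return max(0.0, model_gb + os_gb - ram_gb)

for os_gb in (2, 4):  # Activity Monitor's observed range for the OS
    print(f"OS at {os_gb} GB -> {swap_needed(MODEL_GB, os_gb, PHYSICAL_RAM_GB)} GB must swap")
```

Even in the best case the margin is nearly zero, which is why the Mac leans on swap the moment anything else wants memory.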
So what's happening here? Why can the Mac do this but the iPad can't? Well, it boils down to a core
philosophical divide between these two platforms. Mac OS is built to preserve work at all costs,
even if it means slowing things down. When the 16 gig Mac is asked to load that 13 gig model, the
total demand from the OS and the app does exceed the physical RAM available, but the Mac doesn't
fail, it negotiates. It activates virtual memory, pushing other apps and even its own idle system
processes into a swap file on the SSD. The model loads, the workload completes, and the only cost
is that molasses feeling of that swap. iPadOS, by contrast, makes the opposite trade-off, and that
is to preserve responsiveness at all costs. It has no memory swap. Now, when memory pressure climbs,
both systems, both on macOS and iPadOS, send low memory warnings to the running apps, a signal that
they should voluntarily free up resources, but this is where their policies diverge. On macOS,
that warning is a polite suggestion. Apps like LM Studio can just ignore it and keep running,
relying on swap to handle the overflow. But on iPadOS, ignoring that warning has consequences.
If apps don't release enough memory to alleviate the memory pressure, the system calls for a Jetsam event, forcibly terminating apps to reclaim that memory. Background processes go first,
but if memory pressure continues, even the foreground apps can be killed. And this is
the fatal problem for our AI model: it can't dial back its all-or-nothing 13 gig request.
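The policy split can be sketched as a toy model. To be clear, this is nothing like Apple's actual implementation, and the per-process cap value below is a pure placeholder, since the real cap is undocumented and variable:

```python
# Toy sketch of the divergence: macOS spills over-commitment to swap,
# while iPadOS enforces a per-process cap and jettisons the offender.
# The 13 GB request is from the video; the 10 GB cap is a placeholder.

def macos_alloc(request_gb, free_ram_gb):
    """macOS: never refuse -- anything past free RAM goes to swap."""
    swapped = max(0.0, request_gb - free_ram_gb)
    return {"loaded": True, "swapped_gb": swapped}

def ipados_alloc(request_gb, per_process_cap_gb):
    """iPadOS: exceed the invisible cap and the Jetsam event kills you."""
    if request_gb > per_process_cap_gb:
        return {"loaded": False, "reason": "jetsam"}
    return {"loaded": True, "swapped_gb": 0.0}

print(macos_alloc(13, free_ram_gb=12))          # loads, ~1 GB in swap
print(ipados_alloc(13, per_process_cap_gb=10))  # over the cap: killed
```

The point of the sketch: the Mac's answer is a degree (how much swap), while the iPad's answer is binary (load or die).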
And on the iPad, this massive upfront allocation is not a negotiation, it's a violation. Even
with 16 gigs of hardware, iPadOS enforces a hard, non-negotiable per-process memory cap, its exact value variable and undocumented, to guarantee system stability and a fluid UI. And that 13 gig model,
well, it crosses the iPad's invisible line, and the sentence from the Jetsam event is immediate,
resulting in what appears to be a crash, but is actually just the system self-preserving
by jettisoning the app. And this isn't some specific AI edge case either. There are a
number of pro apps, from video and photo editing, to 3D modeling, to music and podcast production,
that all push against this same invisible ceiling, a problem further compounded by iPadOS's new windowing environment, where multiple apps run alongside each other. Speaking of
pushing the iPad to its limits, if you're actually using your iPad for creative work, you know that
the glass surface isn't exactly ideal for writing or drawing, and that's where today's sponsor
Paperlike comes in. Look, I've tried a lot of screen protectors that claim to give you a paper
feel, and most either look like garbage or feel like you're dragging your pencil across sandpaper.
The new Paperlike 3 is different because they've spent years perfecting their nano dot surface
technology, microtextures, engineered to deliver the exact friction that your hand naturally
expects when you're writing on actual paper, all without compromising screen clarity like
every other paper-feel protector I've tried. And as someone who has admittedly tried and removed a
previous generation of the Paperlike, the new Paperlike 3 is truly impressive. But honestly,
the real breakthrough here is their new butterfly application system. If you've ever ruined a $40
screen protector before because of dust or misalignment or just general frustration, you
know the pain. This four-layer clean sheet creates a clean room effect, keeping your iPad sealed from
airborne particles until the exact moment of application. The helper tool handles alignment,
layer sliding, bubble removal, everything. And there's even an interactive on-screen guide
that walks you through each step at your own pace. I installed mine in about two minutes,
with zero bubbles, zero dust, and zero desire to throw my iPad across the room. Which is a plus,
eh? Also, because Paperlike 3 stays under Apple's maximum screen protector thickness,
your Apple Pencil maintains full precision and responsiveness. And unlike those harsh gritty
protectors that destroy pencil tips, Paperlike's surface is engineered to protect both your
screen and your Apple Pencil while giving you that premium paper feel. They're engineered in Germany,
manufactured in Switzerland, and frankly, if you're doing any serious writing or drawing on
your iPad, they're really worth checking out. Link is below. But even when apps don't hit those hard
memory limits, there's a second issue: curated limitations. And that's something that gaming
illustrates perfectly. Look, Apple has been aggressively pushing for AAA titles to appear
on their hardware, and with the remarkable M5 on board, the iPad Pro should handle gaming in
spades. But when I first launched Death Stranding, the image looked, honestly,
horrible. It was wildly soft, with an insane amount of upscaling and ghosting. It was like
I was playing on an 11 inch PS Vita from 2005. I know the Vita came out in 2011, alright? Back
off. Using an app developed by fellow YouTuber Mr. MacWrite that brings Apple's MetalHUD to any game,
well I discovered why. The iPad was rendering Death Stranding at 784x540 internally, and
then temporally upscaling it to 1567x1080, and then stretching it to the iPad's native resolution of
2420x1668. This is insane behavior that would only be acceptable if the iPad just couldn't push more.
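Run the numbers on that chain and the waste is obvious. A quick sketch:

```python
# The render chain the Metal HUD revealed: internal render -> temporal
# upscale -> stretch to the panel. Comparing pixel counts shows how
# little of the display the GPU is actually drawing.
internal = (784, 540)    # internal render resolution
upscaled = (1567, 1080)  # after temporal upscaling
native   = (2420, 1668)  # iPad Pro panel resolution

px = lambda wh: wh[0] * wh[1]

print(f"internal: {px(internal):,} px")
print(f"native:   {px(native):,} px")
print(f"the GPU renders {px(internal) / px(native):.1%} of the panel's pixels")
```

The GPU is drawing roughly a tenth of the panel's pixels and letting upscalers fake the rest.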
But the frame rate was pegged at 60fps, and the iPad wasn't getting remotely warm, so I suspected
there was more juice to squeeze. But here's the problem: there are literally no graphics settings.
You can adjust HDR brightness, and that's it. Now the Mac version? The Mac version gives you a full
suite of graphics options like you would expect from a computer game. You have resolution scaling,
texture quality, shadow detail, anti-aliasing methods, metal effects, upscaling controls. You
can push settings so high that your Mac becomes a slideshow. On the iPad? No. You get nothing.
You lose! Good day, sir! But Death Stranding has got a bit of a hack because both games are
running the same binary. You can copy the graphics config file from macOS to iPadOS's game directory,
and the game on the iPad will use those parameters instead. The M5 MacBook Pro can render the game at
1080p native, no upscaling, max settings at about 85 FPS, with the fan, frankly,
screaming. Those exact same settings on the iPad Pro yielded over 70 FPS. Super impressive
for about 20 seconds. The tablet dropped into the mid 50s, and then after about 10 minutes,
it settled into the mid 30s as the entire back of the iPad became noticeably toasty. For comparison,
this similarly fanless M4 MacBook Air averaged a consistent 47 FPS throughout. So I strapped
a Peltier cooler, a thermoelectric plate drawing an enormous 20 watts, directly to the back of the iPad. Frame rates jumped about 40% to around 50 FPS on average,
outperforming the M4 MacBook Air. And while this illustrates that, yes, thermal throttling is real,
it's not really the main problem. I mean, 1080p in the mid-30s on a thin and light tablet isn't too
bad. It looks a lot better than the trash settings that are enabled by default, settings that haven't
been updated in nearly a year and clearly don't take advantage of the M5's capabilities. I mean,
the hardware can handle a lot more. Now, you might think that this is just Kojima's fault,
a lazy developer not optimizing for new hardware. And in a way, I guess it is. But this problem
exists because of the way that Apple discourages developers from providing those settings. And I
quote straight from the horse's mouth: "Choose a few sets of options that work all of the time rather than presenting all of the options to players. Avoid setting combinations that lead to instability or poor performance. Aim to provide default settings that give the best experience to the largest number of people. Minimize the number of settings you offer." Apple is explicitly
telling developers, don't give users control. Just pick some safe defaults and lock them in.
Hitman World of Assassination is the poster child for this approach. It's one of the iPad's most recent
AAA releases and notably is not a lazy Mac port. It is an iOS-exclusive binary built specifically
for the iPad. So you'd think that if anything was perfectly optimized, it'd be this. It has three
graphics settings, better than Death Stranding's zero: you get a frame limiter, MetalFX upscaling mode selection, and mirror quality sliders. That's it. I dug into each render resolution
for each mode and found something honestly pretty bizarre. The spatial upscaling mode runs at the
same 834p resolution that the native mode runs in, and it uses a simple stretching algorithm
that honestly looks pretty blurry, but does hit a locked 60fps. The Temporal Performance mode,
which uses a smarter DLSS-like algorithm, runs at that exact same 834p resolution, but it drops
the frame rates to around 50 FPS because the algorithm itself has a pretty high fixed cost.
You're getting 10 fewer frames per second for an image that's honestly just as blurry. However,
Temporal Quality mode renders at nearly three times the resolution of temporal performance,
while only dropping frame rates by about 15% to 45 FPS, which proves, at least to me, that the
render resolution is not the bottleneck. The most obvious sign of wasted potential is that in the
quality mode, even in heavy action, you're never dropping below 40 FPS, which makes the 30 FPS lock
option in the settings completely useless. I mean, a properly optimized game would use
this massive performance headroom to push a much higher internal resolution, but instead, users
are stuck with inefficient, static presets that frankly fall short of this iPad's capabilities.
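A rough way to quantify that headroom is pixel throughput, pixels delivered per second. The exact resolutions aren't published beyond 834p, so this sketch uses relative pixel counts, with temporal quality at roughly 3x per the measurements above:

```python
# Relative pixel throughput of Hitman's three modes, using the video's
# frame rates. Resolutions are relative (base = 834p render), since the
# absolute widths per mode aren't documented.
modes = {
    "spatial (60 fps)":          {"rel_pixels": 1.0, "fps": 60},
    "temporal_perf (50 fps)":    {"rel_pixels": 1.0, "fps": 50},
    "temporal_quality (45 fps)": {"rel_pixels": 3.0, "fps": 45},
}
throughput = {name: m["rel_pixels"] * m["fps"] for name, m in modes.items()}
for name, t in throughput.items():
    print(f"{name}: {t:.0f} relative pixels/sec")
```

Quality mode pushes 2.7 times the pixel throughput of performance mode while giving up only 10% of its frame rate, which is exactly why the static presets feel like they're leaving performance on the table.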
And you can't change them like you would be able to on a Mac or a PC because, well, this is an
iPad. Apple told developers not to let you. And that brings us to the last, the foundation gap,
something I'll illustrate with a video editing benchmark within Blackmagic's DaVinci Resolve,
an app that Apple themselves have highlighted multiple times in multiple keynotes over many,
many years to illustrate how powerful the iPad really is. And hot dang, look at that. I mean,
the M5 iPad Pro outperformed the macOS-sporting M4 MacBook Air handily. Except, well, I had to create
a fake benchmark to even get these numbers. You see, my original benchmark used a mixture of HEVC,
AV1, ProRes, and RED RAW footage codecs that you actually encounter all of the time in professional
work. The iPad couldn't play back half of it. Despite the M5 having hardware AV1 encode and
decode engines, iPadOS doesn't support the common MKV container that said AV1 file was in. And then
RED's R3D SDK exists to handle RED RAW footage, but only for macOS, Windows, and Linux. There is
no support for iPadOS. So, even though the iPad is running the exact same DaVinci Resolve binary as
the Mac, like, you can literally reveal parts of the Mac app that they've lazily hidden on the iPad
with hotkeys, with menu bar support and all, you just can't edit the same footage. Frustratingly,
Blackmagic doesn't even maintain a list of what is and what is not supported on the iPad
itself. It may seem a bit silly to blame Apple for Blackmagic and RED not taking the iPad all that seriously, but frankly, neither does Apple! It was only recently that Apple FINALLY added
support for editing footage off of an external drive, which… like what the hell? This is toddler
stuff, this should have been a day one feature! And while the iPad's Final Cut Pro has one feature
that the Mac version doesn't, Live Multicam, it lacks a lot of other really basic stuff. You don't
have the same full suite of color grading tools, media tagging and keywords are not supported
equally, advanced audio editing is not present, third-party plugins are not supported, and even project round-tripping is one-way: you can open an iPad Final Cut project on the Mac, but not the other
way around. And just like Resolve, the iPad lacks support for over two dozen file formats that are
supported on the Mac. And that's to say nothing of Adobe, whose just-released Premiere for iPad may as well be an iMovie clone: completely stripped down and featureless. Not desktop class
at all. Look, I love this iPad. The hardware is incredible. The OS is fantastic for what it does
well. And I get that a lot of people just want a really good big phone. That's valid. But Apple is
also finally starting to take the iPad seriously as a pro device. Windowing, external displays,
adequate memory, that's real progress. But it's also going to take more than interface changes to
get there. The jettison problem still kills apps when memory pressure climbs. Curated limitations still prevent users from making the choices that would let them push this hardware, choices they can make on a Mac. And the foundation gap runs deeper still, with architectural differences that no point update can really address. The iPad remains fundamentally
constrained by decisions made when it was just a bigger iPhone. There are still days that I reach
for my $750 MacBook Air because my $2,000 iPad Pro can't do what I need it to. Seldom is the
reverse true. And I've made peace with that, but most people probably won't. This is too expensive
and too limited to be their real computer. Thanks so much for watching, and as always, stay snazzy.