I wish I had the expertise to do such in-depth reverse engineering of firmware blobs.
The DCP is actually the thing that's stopping me from providing native brightness control on the HDMI port of the newer Macs inside Lunar (https://lunar.fyi). Users have to either switch to a Thunderbolt port to get native brightness control for their monitor, or use a software dimming solution like Gamma Table alteration.
It's not clear what's going on, but it seems that the HDMI port of the 2018+ Macs uses an MCDP29xx chip inside, which converts the HDMI signal to DisplayPort internally, so that Apple doesn't have to decode both HDMI and DP video signals. (That is also the reason why even the newest MacBook and Mac Studio have only HDMI 2.0: that's the most the converter chip supports.)
When sending DDC commands through the IOAVServiceWriteI2C call, monitors connected to the HDMI port lose the video signal, flicker, or crash completely and need a power cycle to come back.
The Thunderbolt ports, however, send the DDC command as expected when IOAVServiceWriteI2C is called.
After @marcan42 from Asahi Linux pointed out the DCPAVFamilyProxy kexts, I looked into them and found several different writei2c methods and some MCDP29xx-specific code, but no clue how to call them from userspace.
I guess I'll have to look into how the analysed exploit is using the RPC, and also check the methods' assembly inside the firmware blob itself. I was not aware that most userspace methods are now shims for remotely calling the embedded code.
The exploit is quite complicated to pull together. Is there any chance that someone created it based on iOS sources? I assume NSO and similar actors would already have bought stolen source code.
> This sideloading works because the app is signed with an enterprise certificate, which can be purchased for $299 via the Apple Enterprise developer program.
From linked post, the actor is identified as a "commercial spyware" company.
So... I'd like to assume that Apple has cancelled their enterprise cert and will refuse to sell them another after this abuse, right? Surely there are terms of service that forbid using an enterprise cert maliciously, to fraudulently pretend to be another app and trick users that are not part of your "enterprise"?
Right? (cue Anakin and Padme meme).
But seriously... will they?
Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.
In reality it costs the bad guys $299 to bypass this limitation, while your average user is locked out of this feature.
If it's that "easy" to get sideloading working with an enterprise cert, why won't e.g. Epic Games go that route to provide apps outside the App Store? Am I missing something?
What's interesting to me is that on its face Apple's architecture was the right thing from the perspective of modern security thought: split out "driver" layers that can be driven (in a reasonably direct way) by untrusted code and put them on dedicated hardware. That way you're insulated from the Spectre/Meltdown et al. family of information leaks due to all the cached state on modern devices.
Except the software architecture needed to make this happen turns out to be so complicated that it effectively opens up new holes anyway.
(Also worth noting: this is a rare example of an "Inverse Conway's Law" effect. A single design from a single organization had complicated internal interconnectivity, owing to the fact that it grew out of an environment with free internal communication. So an attempt to split it up turned into a mess. Someone should have come in, split the teams properly, and written an interface spec.)