I don't think they would waste a high-value capacitor just to keep an LED lit longer; besides, an LED fed directly from a capacitor would visibly dim as the capacitor discharges. It's more likely that the signal driving the LED comes out of a monostable implemented in code: pin_on() drives the LED on; pin_off() waits n seconds, then drives the LED off.
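If it helps to picture it, here's a minimal sketch of that software monostable, assuming a simple polled main loop. pin_on() and pin_off() are the names from the comment above; set_led() is a hypothetical stand-in for the real GPIO register write, and the POSIX timing is just for the demo:

    /* Software monostable: the LED turns on immediately but is
     * held on for HOLD_SECONDS after the camera stops. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <time.h>

    #define HOLD_SECONDS 3          /* minimum time the LED stays lit */

    static bool   led_state    = false;
    static time_t off_deadline = 0;  /* when a pending turn-off fires */

    static void set_led(bool on)     /* stand-in for the GPIO write */
    {
        led_state = on;
        printf("LED %s\n", on ? "ON" : "OFF");
    }

    void pin_on(void)                /* camera started streaming */
    {
        off_deadline = 0;            /* cancel any pending turn-off */
        if (!led_state)
            set_led(true);
    }

    void pin_off(void)               /* camera stopped: arm the monostable */
    {
        off_deadline = time(NULL) + HOLD_SECONDS;
    }

    void tick(void)                  /* call periodically from the main loop */
    {
        if (off_deadline && time(NULL) >= off_deadline) {
            off_deadline = 0;
            set_led(false);
        }
    }

    int main(void)
    {
        pin_on();                    /* LED lights immediately */
        pin_off();                   /* LED stays lit HOLD_SECONDS more */
        while (led_state) {
            tick();
            /* sleep 100 ms so the demo loop doesn't spin hot */
            struct timespec ts = { 0, 100 * 1000 * 1000 };
            nanosleep(&ts, NULL);
        }
        return 0;
    }

Note the key behavior: pin_on() cancels any pending turn-off, so rapid off/on cycles never blink the LED, which is exactly what a hardware monostable would give you.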
This is Apple, so that assumption isn't a given the way it would be for a non-enterprise HP or Lenovo machine. They absolutely would spend a capacitor if that's what it takes: they are intensely focused on camera privacy and have made a point of it in their security marketing for years; otherwise they wouldn't let hardware security engineers brag about it, much less talk about it publicly, at all.
If you are designing an ASIC for the camera, you can include all the logic gates needed to control the LED at a cost close to zero; it wouldn't affect the production cost of the ASIC at all. A capacitor, by contrast, is an additional item on the BOM, and charging it takes more current than the LED alone does, so the driver in the IC has to be sized larger.
The trick would then be to keep using the camera until that capacitor is discharged. I'm pretty sure most camera sensors nowadays can run from rails below an LED's forward voltage (sensor cores commonly run at roughly 1.2–2.8 V, while a typical indicator LED needs around 2 V or more to light).