This gives some interesting food for thought, but there is yet another way to deal with accurate timing: use a camera that is not buffered and does not feed a stream, but instead requires a 'start frame' signal from the PC. If it's an interline unit, this can all be accomplished within the driver.
I'll use an example of hardware I have here. The all-sky is just an SX Superstar under the covers. With minor changes to the driver, it would be easy to introduce a mode where the end of one frame becomes the start of the next. The sequence goes like this at the very low level: at the start of the sequence, flush all accumulators and trigger the start of the exposure. At the end of the exposure time, latch the data and download it. When the next frame time is complete, latch again. This works with an interline sensor and creates back-to-back frames with zero time lost between frames. To address sensitivity, bin the camera; to address frame rates, once on target, read out only the small piece of the sensor where the target star sits.
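That low-level sequence can be sketched as follows. This is only a toy model of the idea, not the real SX driver code; the class and function names (InterlineSensorSim, flush_accumulators, latch_frame, download_frame) are stand-ins I've invented for illustration:

```python
import time

class InterlineSensorSim:
    """Toy stand-in for an interline CCD: the latch that ends frame N
    also begins integration for frame N+1, so no time is lost."""
    def flush_accumulators(self):
        pass  # clear any charge accumulated before the sequence starts
    def latch_frame(self):
        # shift charge into the interline storage columns; the next
        # frame starts integrating in the photosites immediately
        return time.monotonic()
    def download_frame(self):
        pass  # read out the latched frame while the next one integrates

def run_sequence(sensor, exposure_s, n_frames):
    """Back-to-back exposures: end of frame N == start of frame N+1."""
    sensor.flush_accumulators()
    boundaries = [time.monotonic()]  # start of frame 1
    for _ in range(n_frames):
        time.sleep(exposure_s)                    # integration period
        boundaries.append(sensor.latch_frame())   # ends this frame, starts next
        sensor.download_frame()                   # overlaps next integration
    return boundaries

b = run_sequence(InterlineSensorSim(), 0.01, 5)
```

The key point the sketch shows is that one timestamp serves double duty as the end of one frame and the start of the next, so there is no gap to account for.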
When running in this mode with an interline sensor, you lose no time to data download, and it's possible to get very accurate frame times from the host clock: each and every frame is fetched from the sensor with no buffers creating internal lags, and the host generates the data-latch timing. We know the exact start and end times for each exposure cycle, because the host triggered it directly by latching the data in the sensor. Frame start and end times should be accurate to on the order of 1 ms, and because it's an interline sensor, the end time for frame one is the start time for frame two, and so on. The only potential bottleneck in this scenario is the ability of the host program to consume the data as fast as the driver delivers it.
It would only take a small code change in the driver to create the 'back to back' mode, so that once the sequence is triggered, the driver delivers frames continually and the end of frame one becomes the start of frame two, without any external triggers required. It's essentially adding a streaming mode to the core CCD driver code.
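The streaming mode, and the consumer-side bottleneck mentioned above, can be modeled as a producer/consumer pair. Again this is an illustrative sketch, not driver code; the queue here simply stands in for whatever mechanism hands frames from driver to client:

```python
import queue
import threading
import time

def driver_stream(frame_queue, exposure_s, n_frames, stop):
    """Producer: once triggered, keep latching frames back to back."""
    for i in range(n_frames):
        if stop.is_set():
            break
        time.sleep(exposure_s)                 # integration period
        # the latch timestamp is the end of frame i and start of frame i+1
        frame_queue.put((i, time.monotonic()))

def client_consume(frame_queue, n_frames, results):
    """Consumer: must drain frames as fast as the driver delivers them,
    otherwise they back up -- the one real bottleneck in this scheme."""
    for _ in range(n_frames):
        idx, t_latch = frame_queue.get()
        results.append((idx, t_latch))

q = queue.Queue()
stop = threading.Event()
results = []
prod = threading.Thread(target=driver_stream, args=(q, 0.005, 10, stop))
cons = threading.Thread(target=client_consume, args=(q, 10, results))
prod.start(); cons.start()
prod.join(); cons.join()
```

As long as the consumer keeps the queue near empty, frame timestamps come straight from the latch times and no external trigger is needed after the initial one.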
I currently manipulate the all-sky over a wifi link, so there is considerable lag between the client program on the computer in the house and the one out at the all-sky. This isn't a big deal in our case, because I'm using 10-second exposures. I don't remember the exact numbers, but when I was working on hardware directly connected to the development computer, I believe the download time for a full unbinned frame was on the order of just over half a second. That would be cut by a factor of 4 if the frame was binned, and by another factor of 4 if one was grabbing only 1/4 of the sensor. It should be able to keep up handily grabbing 1/4 of the sensor binned, with very accurate timing at 10 Hz frame rates. Timing accuracy will be limited only by your ability to keep the host clock synchronized.
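The arithmetic works out roughly like this. Note the 0.55 s full-frame figure below is only my recollection, not a measured spec:

```python
full_frame_s = 0.55          # recalled full-frame, unbinned download time
binned_s = full_frame_s / 4  # binning cuts the pixel count by 4x
roi_s = binned_s / 4         # grabbing 1/4 of the sensor cuts it 4x again
frame_budget_s = 1.0 / 10    # 100 ms per frame at 10 Hz

headroom = frame_budget_s - roi_s
print(f"binned + ROI download ~= {roi_s * 1000:.0f} ms, "
      f"leaving {headroom * 1000:.0f} ms of each 100 ms frame")
```

With those numbers the download fits in roughly a third of each 100 ms frame period, which is why 10 Hz looks comfortable.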
It's difficult for me to experiment with some of this right now because I don't have a hardwired link to the all-sky yet, but it should be in place any day now; the electrician was scheduled to put the fiber in last week but got pre-empted onto another job. Once the gigabit connection is in place, it'll be pretty straightforward to experiment with this a bit and see how well it can be made to work. If it works well in that scenario, then doing an actual occultation timing is just a case of putting a similar camera onto a different optical setup; a 1.8 mm focal length is not really conducive to this project.