The maximum supported data rate for SPDIF is about 6 Mbit/s, so where
did you get that 10 MHz requirement? Even if there are two transitions
(half cycles) within a single bit time, the frequency is still only
about 6 MHz.
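For concreteness, the rate arithmetic for a common 48 kHz stream works out like this (a quick sketch; the frame parameters are the standard SPDIF ones, the 48 kHz rate is just a convenient example):

```python
# Rough SPDIF rate arithmetic for a 48 kHz stereo stream (example numbers).
sample_rate = 48_000          # Hz, audio sample rate
bits_per_subframe = 32        # SPDIF subframe length
subframes_per_frame = 2       # left + right channel

bit_rate = sample_rate * bits_per_subframe * subframes_per_frame
print(bit_rate)               # -> 3072000 bit/s

# Biphase-mark coding gives at most two transitions per bit cell, so the
# highest transition rate the receiver has to track is twice the bit rate:
max_transition_rate = 2 * bit_rate
print(max_transition_rate)    # -> 6144000, i.e. ~6 MHz
```

So even the densest transition pattern of a 48 kHz stream stays around 6 MHz, nowhere near 10 MHz.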
In SPDIF, the serial bit clock (after dividing down) is typically
used for driving the DAC, which requires jitter-free timing. After
initially acquiring the bit rate over a wide range, you need a
narrow-bandwidth PLL to hold the rate as constant as possible.
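The acquire-wide / track-narrow idea can be sketched as a toy second-order software PLL. This is purely my own illustration, not anyone's actual receiver: frequencies are normalized to the sample clock, the Kp/Ki values are made up, and the phase error is sampled every tick rather than only at data transitions as a real SPDIF receiver would.

```python
def run_pll(f_in, f0, kp=0.02, ki=2e-4, n=20_000):
    """Toy NCO + PI-filter PLL. f_in is the true input frequency and f0 the
    initial estimate, both in cycles per sample (< 0.5). Small kp/ki give a
    narrow loop bandwidth, which is what rejects jitter once the approximate
    rate is already known. Returns the NCO frequency after n samples."""
    in_ph = nco_ph = 0.0      # phase accumulators, in cycles
    integ = 0.0               # integral path: removes the frequency offset
    f_nco = f0
    for _ in range(n):
        in_ph = (in_ph + f_in) % 1.0
        nco_ph = (nco_ph + f_nco) % 1.0
        # Phase detector: wrapped phase error in [-0.5, 0.5) cycles.
        err = (in_ph - nco_ph + 0.5) % 1.0 - 0.5
        integ += ki * err
        f_nco = f0 + kp * err + integ
    return f_nco

# e.g. a ~6.14 MHz signal on a 100 MHz clock, starting 2% off:
f = run_pll(0.0614, 0.060)    # converges very close to 0.0614
```

The initial coarse measurement corresponds to picking a good `f0`; the narrow loop then only has to absorb the small residual offset and the jitter.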
Yes, as I read more of this thread I realize this is almost exactly what
I implemented in an FPGA a few years ago as part of a design still in
production. At one end is a data link and an ADC which samples a time
code signal. The data link has no clock, but rather an approximate rate
is set by the software, and the hardware slaves itself to the data transitions.
The CODEC is slaved to the data rate. The aggregate data bundle (fixed
number of data bits and the CODEC sample) is shipped over an IP network
where at the other end another copy of this unit receives the data and
reconstitutes both the data stream and the time code signal in
synchrony. This requires syncing up to the average data rate of the IP
packets and clocking the CODEC in lock step with clocking the data.
IIRC, the loop lock was actually regulated by the amount of data in the
FIFO. The FIFO delay had to match the delay of the sigma-delta CODEC.
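That FIFO-fill regulation scheme can be modeled in a few lines. This is my own reconstruction of the idea, not the actual FPGA design: jittery packet arrivals fill a FIFO, and a PI controller steers the consumer clock so the occupancy stays at a setpoint, which locks the average output rate to the average input rate.

```python
import random

def simulate(n=50_000, setpoint=64.0, kp=4e-3, ki=4e-6):
    """Steer a consumer clock from FIFO occupancy. Returns (depth, rate).
    All rates are in samples per simulation tick; gains are made up."""
    rng = random.Random(1)
    fifo = setpoint           # samples currently buffered
    integ = 0.9               # integrator doubles as the rate estimate;
                              # start it deliberately ~10% low
    for _ in range(n):
        # Producer: jittery arrivals averaging 1.0 sample per tick,
        # standing in for IP packets with variable delay.
        fifo += 1.0 + rng.uniform(-0.5, 0.5)
        # PI controller: the depth error steers the consumer's rate.
        err = fifo - setpoint
        integ += ki * err
        out_rate = integ + kp * err
        # Consumer drains the FIFO at the steered rate.
        fifo -= out_rate
    return fifo, out_rate

depth, rate = simulate()      # rate settles close to 1.0, depth near 64
```

The loop bandwidth has to be low enough to average out the packet jitter, and, as noted above, the FIFO depth setpoint fixes the latency that has to match the CODEC's own pipeline delay.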
The receiver half of this should be *very* much like what the OP wants
to do.