Hi,
I'm a bit confused about the relationship between frequency and data rate and wondered if someone could help clarify, please.
Say I want to achieve a data rate of 10 Mbit/s - 10 million bits transmitted across the link every second - and am using on-off modulation (say +5V = '1' bit and 0V = '0' bit), which by default is NRZ (non-return-to-zero). What is the required frequency: 10 MHz, or could we say that each cycle can carry 2 bits, so a frequency of 5 MHz would do?
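Here's my rough back-of-envelope working so far, just a sketch of how I currently understand it (I'm assuming the worst-case NRZ pattern is an alternating 1010... stream, and that its fundamental is what matters):

```python
# Back-of-envelope numbers for the plain NRZ case (my assumptions in comments).
bit_rate = 10e6                # target data rate: 10 Mbit/s

bits_per_symbol = 1            # on-off NRZ: one voltage level per bit
symbol_rate = bit_rate / bits_per_symbol        # 10 Msymbols/s

# Worst case is alternating 1010..., where two bits make one full cycle,
# so the fundamental frequency is half the symbol rate (is this the right
# way to think about the "2 bits per cycle" idea?).
fundamental_hz = symbol_rate / 2                # 5 MHz

print(f"Symbol rate:         {symbol_rate / 1e6:.1f} Msymbols/s")
print(f"1010... fundamental: {fundamental_hz / 1e6:.1f} MHz")
```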
Also, to improve on this, could I use a different modulation scheme or form of coding? I've read that QPSK and QAM are more 'spectrally efficient' and can transmit more bits per symbol, but isn't the frequency determined by the BIT rate, not the SYMBOL rate?
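And here's the comparison I'm imagining for the second question, if the required bandwidth really follows the symbol rate rather than the bit rate (that's just my assumption, and it's exactly the part I'd like confirmed):

```python
# Same 10 Mbit/s target, but with more bits packed into each symbol.
bit_rate = 10e6
for name, bits_per_symbol in [("NRZ/OOK", 1), ("QPSK", 2), ("16-QAM", 4)]:
    symbol_rate = bit_rate / bits_per_symbol    # symbols per second for the same bit rate
    print(f"{name}: {symbol_rate / 1e6:.1f} Msymbols/s to carry {bit_rate / 1e6:.0f} Mbit/s")
```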
Thanks for any tips!