O.K., finally you reveal what your deadtime is.
According to figure 3 (Deadband_Min) from your post #21 the deadband can be adjusted from 2 to 256 clock cycles.
Your ADC will give you an output OUT = Vin/Vref, where Vin is your input voltage (0.5V...3.25V) and Vref is your reference voltage (unknown to me). If I read the table in your post #21 correctly, this translates to values between 4 (for 0.5V) and 24 (at 3.25V). If not, you need to make a new table.
Anyway, working with the values from the table, and assuming a clock cycle time of 50ns (the number you have stated so far, although it contradicts the 40ns step size in the table from post #21), inserting these values into the configuration (figure 3 Deadband_Min from your post #21) gives:
4 -> 4*50ns = 200ns
24 -> 24*50ns = 1.2µs
This is close to, but not exactly, what you requested. Check your requirements: do you need to be as exact as stated, or can you work with this slight deviation?
You will never be able to hit the value exactly, because a 205ns deadtime cannot be generated from a 50ns period clock. 200ns is the closest you will ever get.
As for the maximum value, you could either scale down the analog input or multiply the ADC's digital output to get to the correct scale.
What I cannot tell you is how to route the ADC's output to the deadtime input of the PWM generator in your SOC design software.
It also totally escapes me why you would use a 16 bit ADC to digitize an analog input signal into 24 steps. 5 bits are enough to encode 24 steps; the nearest common ADC resolution is 8 bits.