So, I wanted to try simulating a virtual op-amp as a comparator, and I noticed something that bothers me, and I have no clue why it happens.
With the supply voltages +Vcc = 15 V and -Vcc = 0 V, the inverting input at 3 V, and the non-inverting input at 7 V, I assumed the output would be 15 V or slightly less, but it was 13 V.
When I swapped the inputs, the output was 2 V and not 0 V, and when I changed -Vcc to -15 V the output was about 13 V.
Why is the output always about 2 V away from the supply rails, no matter what I change in the circuit? Is the simulation broken, or is there some constant drop caused by the semiconductor junctions inside the op-amp? I know a proper comparator IC would fix this, but I want to understand the details of how and why it happens. Use as much math as you need.
The above simulation was run in NI Multisim using a DC sweep on VDC1 (0 V to 10 V, 0.01 V step), with the sweep's default parameters unchanged.
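To make the "constant drop" idea concrete, here is a minimal back-of-the-envelope sketch. It assumes a classic BJT (non-rail-to-rail) op-amp model, where the output stage loses roughly two base-emitter junction drops plus a saturation voltage between each supply rail and the output pin; the specific numbers (0.7 V and 0.2 V) are illustrative assumptions, not values from any particular Multisim model.

```python
# Hedged sketch: why a generic BJT op-amp output stops short of its rails.
# Assumption: two stacked junction drops (driver + output transistor) plus
# one collector-emitter saturation voltage separate each rail from the
# output pin. These are textbook-style numbers, not datasheet values.

V_BE = 0.7      # assumed base-emitter drop per transistor (V)
V_CE_SAT = 0.2  # assumed collector-emitter saturation voltage (V)

# Total assumed headroom lost at each rail (~1.6 V here; real parts
# typically lose about 1-2 V, which matches the ~2 V seen in Multisim):
HEADROOM = 2 * V_BE + V_CE_SAT

def output_swing(v_cc_pos, v_cc_neg):
    """Approximate (max, min) output voltage for a non-rail-to-rail op-amp."""
    return v_cc_pos - HEADROOM, v_cc_neg + HEADROOM

# The first case in the question: +Vcc = 15 V, -Vcc = 0 V
v_max, v_min = output_swing(15.0, 0.0)
print(v_max, v_min)  # 13.4 1.6 -- close to the observed 13 V and 2 V
```

Under these assumptions, the saturated output lands about 1.6 V inside each rail, which is in the same ballpark as the 13 V / 2 V the simulation reported; a rail-to-rail (CMOS-output) comparator model would not show this gap.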