I'm not arguing that the grid can't or won't take any, the majority, or
all of the generated power.
The question here is - what exactly must the inverters do in order to
get as much current as the PV system can supply into the grid?
The inverter must convert the DC from the PV array to the frequency and
voltage of the grid. To make current flow into the grid it must try to
raise its output voltage slightly. Because there are losses between the
inverter and the grid, the voltage at the inverter will be a little
higher than the grid voltage.
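For a rough sense of scale (all figures below are invented for
illustration, not taken from this thread), the current pushed into the
grid follows from that small voltage difference and the line resistance:

    # Ohm's law between the inverter terminals and the grid; every figure
    # here is an assumed example value, not a measurement.
    grid_voltage = 120.0        # volts at the utility side (assumed)
    line_resistance = 0.05      # ohms between inverter and grid (assumed)
    inverter_voltage = 122.0    # volts the inverter holds at its output (assumed)

    export_current = (inverter_voltage - grid_voltage) / line_resistance
    export_power = grid_voltage * export_current

    print(f"export current: {export_current:.1f} A")        # 40.0 A
    print(f"export power:   {export_power / 1000:.1f} kW")  # 4.8 kW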
If our analogy is pipes, water, and water pressure, then we have some
pipes sitting at 120 PSI and we have a pump that must generate at least
121 PSI in order to push water into the already pressurized pipes.
Fairly good analogy: the internal resistance of the pipe must be
overcome by a slightly higher pressure. Don't forget that somewhere
someone else has to reduce the water flow into the pipe system in order
to avoid a pressure buildup, because the water in the pipe system is
used up at the same rate it is supplied.
Not sure I understand what you're trying to say there.
See the pipe analogy above: the power lines from the inverter have some
resistance, which results in a voltage drop. Therefore the voltage
measured at the inverter will be slightly higher than the voltage
measured some distance away.
No, I don't agree.
Why? Take a hypothetical grid with 1 megawatt of consumption. The
generating machinery produces that energy at a set voltage. Mr
Homeowner connects to the grid with a 10 kW PV array. If the power
utility made no adjustment, the overall voltage of the grid would
increase. That is fine for small fluctuations, but if enough PV arrays
came online, energy production would have to decrease somewhere or bad
things would happen due to high grid voltage.
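A back-of-the-envelope sketch of that point, using the hypothetical
1 MW / 10 kW figures above (the number of arrays is just for
illustration):

    # If conventional generation does not back off, every PV array that comes
    # online adds surplus generation, which is what drives the voltage up.
    grid_load = 1_000_000.0            # 1 MW of consumption (from the example)
    conventional_generation = 1_000_000.0
    pv_per_array = 10_000.0            # 10 kW per homeowner (from the example)

    for arrays in (1, 10, 100):
        surplus = conventional_generation + arrays * pv_per_array - grid_load
        print(f"{arrays:3d} arrays online -> {surplus / 1000:.0f} kW surplus "
              f"({surplus / grid_load:.0%} of load) unless generation backs off")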
Hypothetically speaking, let's assume the local grid load is just a
bunch of incandescent lights. A typical residential PV system might be,
say, 5 kW. At 120 volts, that's about 42 amps. How are you going to
push 42 amps out to the grid?
You cannot unless your local load is zero. You must subtract the local
load from the generated PV array power if the house load is lower. If
the house load is higher than the PV array output, then you will use
all the PV array power, with the difference supplied from the grid.
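A sketch of that bookkeeping; the 5 kW and 120 V figures come from the
question above, the house-load values are made up:

    # Net export is PV output minus the local house load; a negative result
    # means the shortfall is imported from the grid instead.
    pv_output_w = 5000.0        # 5 kW array (from the example above)
    service_voltage = 120.0     # volts (from the example above)

    for house_load_w in (0.0, 2000.0, 5000.0, 7000.0):
        net_w = pv_output_w - house_load_w
        amps = abs(net_w) / service_voltage
        if net_w > 0:
            direction = "exported to the grid"
        elif net_w < 0:
            direction = "imported from the grid"
        else:
            direction = "no net grid flow"
        print(f"house load {house_load_w / 1000:.0f} kW -> {amps:.1f} A {direction}")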
They're going to burn a little brighter -
Correct, due to a slightly raised voltage if there is a voltage drop
between the inverter and the grid. (There is some drop.)
they're going to use all of the current that the local grid was already
supplying to them, plus they're going to use your current as well.
Not possible: the current is controlled by the internal resistance in
the lamp. They will draw a current given by voltage/resistance. So when
the PV array produces current, the grid current is reduced.
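A small Ohm's-law sketch of that point; the 100 W lamp and the PV
current are assumed figures for illustration:

    # Each lamp draws I = V / R set by its own resistance, no matter where the
    # current comes from, so PV current displaces grid current.
    voltage = 120.0
    lamp_resistance = voltage ** 2 / 100.0      # a nominal 100 W lamp (assumed)
    current_per_lamp = voltage / lamp_resistance

    lamps = 20
    total_lamp_current = lamps * current_per_lamp
    pv_current = 10.0                           # amps from the PV array (assumed)
    grid_current = max(total_lamp_current - pv_current, 0.0)

    print(f"each lamp draws {current_per_lamp:.2f} A")             # 0.83 A
    print(f"lamps draw {total_lamp_current:.1f} A in total")       # 16.7 A
    print(f"grid supplies {grid_current:.1f} A, the PV array the rest")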
The voltage increase you will see at the output of the inverter is very
small, but it does depend on the cables used.
An example: I have a 300-foot underground cable to the nearest utility
transformer and a 100 A service panel.
If I max out the power, I will have a voltage drop over the cable of
about 6 volts - much higher than a normal household would see.
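Working backwards from those figures (roughly 6 V of drop at the full
100 A), the implied cable loop resistance is about 0.06 ohms:

    # Figures from the post above: ~6 V drop at the 100 A service rating.
    max_current = 100.0     # amps
    voltage_drop = 6.0      # volts at full load

    loop_resistance = voltage_drop / max_current    # V = I * R  ->  0.06 ohms
    cable_loss = voltage_drop * max_current         # heat dissipated in the cable

    print(f"implied loop resistance: {loop_resistance:.2f} ohms")
    print(f"power lost in the cable at 100 A: {cable_loss:.0f} W")   # 600 W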
When your PV array is producing full power and your house load matches
that, the voltage difference between the grid and the inverter is zero.
At any other house load, current will flow in the power utility lines,
and the inverter voltage increase is a function of the loss in those
lines.
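Putting that last point into numbers (the line resistance is an assumed
figure; the PV and load values echo the earlier 5 kW example):

    # The inverter terminal voltage only rises above the grid voltage when net
    # current flows out through the utility lines.
    grid_voltage = 120.0
    line_resistance = 0.06      # ohms, assumed for illustration
    pv_output_w = 5000.0        # 5 kW array, as in the earlier example

    for house_load_w in (5000.0, 3000.0, 0.0):
        net_export_a = (pv_output_w - house_load_w) / grid_voltage
        inverter_voltage = grid_voltage + net_export_a * line_resistance
        print(f"house load {house_load_w / 1000:.0f} kW -> export "
              f"{net_export_a:.1f} A, inverter at {inverter_voltage:.2f} V")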