The given grid has a bias factor of λ = 266.6 MW/Hz. The simpleGen component generates 100 MW in order to supply the internal grid load of 100 MW, so the system is in balance.
At t = 10 s the grid loses 133.3 MW of production (see the loadStep parameters), and two different cases can be investigated:
When simulating the system with the setting grid.mu = 0, one should expect the frequency deviation Δf (see grid.dF.y) to settle at Δf = -133.3 MW / 266.6 MW/Hz = -0.5 Hz.
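The static relationship above can be checked with a few lines of arithmetic. The values below are taken directly from the example; only the variable names are chosen here for illustration:

```python
# Static frequency deviation without load self-regulation (grid.mu = 0).
# The bias factor lambda relates a power imbalance to the settled
# frequency deviation: delta_f = -delta_P / lambda.
lam = 266.6      # grid bias factor in MW/Hz (from the example)
delta_P = 133.3  # production lost at t = 10 s, in MW

delta_f = -delta_P / lam
print(f"delta_f = {delta_f:.3f} Hz")  # -0.500 Hz
```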
Changing the self-regulation setting to grid.mu = 2 should result in a frequency deviation Δf (see grid.dF.y) that settles at a slightly smaller magnitude, Δf ≈ -0.496 Hz.
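A minimal sketch of this second case, assuming grid.mu is the load self-regulation factor in %/Hz acting on the 100 MW load (an interpretation inferred from the numbers in the text, not stated explicitly):

```python
# Static frequency deviation with load self-regulation.
# Assumption: mu is given in %/Hz of the nominal load, so the
# frequency-dependent load relief adds mu/100 * P_load MW/Hz to the
# effective bias factor.
lam = 266.6       # grid bias factor in MW/Hz
delta_P = 133.3   # production lost in MW
mu = 2.0          # assumed load self-regulation in %/Hz
P_load = 100.0    # internal grid load in MW

lam_eff = lam + mu / 100.0 * P_load   # 268.6 MW/Hz
delta_f = -delta_P / lam_eff
print(f"delta_f = {delta_f:.4f} Hz")  # about -0.4963 Hz
```

With this assumption the result reproduces the ≈ -0.496 Hz quoted above, since the self-regulating load relieves 2 MW per Hz of frequency drop.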
Note: When doing the calculation by hand, using Δf = -0.5 Hz to compute a new λ that includes the effect of self-regulation, one will get a smaller expected Δf than the simulation shows. Can you see why?