You can do the math yourself: power (W) = voltage (V) × current (A). With 120 V and a 20 A circuit breaker you can safely drive about 2400 × 0.9 = 2160 W without any blackout. In theory you could pull the full 20 A, but bulb ballasts draw a fixed amount of power: the power doesn't drop with resistance, instead the current rises until the power level is reached. Worst-case voltage drop (even when wired properly by a pro, if you're really far from the panel) should be around 5%, and if you plug that into the formula, getting 2400 W to the bulbs with a 5% drop would need about 21 A, which is enough to trip the breaker. So we take that 5% into account, plus another 5% to stay safe. But I may be wrong... If you get someone to redo your wires and your panel (your circuit breakers), you should go with a higher rating: sum all the power your devices use, divide by the voltage, and pick a circuit breaker rated higher than that demand. When you're in the safe range there is a 99% chance it won't trip (especially if you replaced it with a new one) for, I'd estimate, 10 years or so...
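The derating above can be sketched as a quick script (my own sketch of the reasoning, using the 5% + 5% margin from the post, not an electrical-code calculation):

```python
# Usable continuous power on a breaker, with the ~10% derate
# (~5% voltage drop + ~5% safety margin) described above.
voltage = 120   # volts (US residential)
breaker = 20    # breaker rating in amperes
derate = 0.90   # assumption: 5% drop + 5% margin

usable_watts = voltage * breaker * derate
print(usable_watts)  # 2160.0
```

So on a standard 120 V / 20 A circuit you'd plan around 2160 W of lamps and gear, not the theoretical 2400 W.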
If you got lost, here's a summary:
Sum the total power you need. Say 4 × 600 W lamps + 200 W dehumidifier + 150 W vents + 50 W pump + 300 W headroom for the future (you never know) = 3100 W.
Divide by your voltage: 3100 / 120 = 25.83 amperes needed. You need a 30 A circuit breaker MINIMUM, and wires rated for 30 A, which would be 10 gauge according to this chart:
https://activerain-store.s3.amazonaws.com/image_store/uploads/3/9/2/9/9/ar13634460999293.jpg (I'm European, I don't understand your wire gauge system and never will; please check with a professional from your country whether it's right.) And no more blackouts... Hope this makes sense! Peace...
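The summary steps can be sketched the same way (the load names and the list of standard breaker sizes are my assumptions for illustration; have an electrician confirm the real sizing):

```python
voltage = 120  # volts

# Loads from the example above, in watts
loads_w = {
    "4x 600W lamps": 4 * 600,
    "dehumidifier": 200,
    "vents": 150,
    "pump": 50,
    "future headroom": 300,
}

total_w = sum(loads_w.values())       # 3100 W
amps_needed = total_w / voltage       # ~25.83 A

# Pick the next common breaker size ABOVE the demand
# (assumed list of standard sizes)
standard_breakers = [15, 20, 30, 40, 50]
breaker = next(b for b in standard_breakers if b > amps_needed)

print(total_w, round(amps_needed, 2), breaker)  # 3100 25.83 30
```

Which lands on the same answer as the summary: 25.83 A of demand, so a 30 A breaker with 30 A-rated wire.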