Depends entirely on the gauge of the wire. You can google charts that give the resistance per foot for any wire gauge; with that and the load current you can calculate the wire loss (P = I²R) for a given run.
For instance, if you have 14 gauge wire (the minimum build standard for household 120VAC circuits), its resistance is about .0025 ohms per foot. At a typical max load of 15 amps, that works out to about .56W per foot of wire. So on a standard household (120VAC) 15A circuit with an 1800W load, 40 feet from the breaker panel, the wire losses come to about 22.5W, roughly 1.25% of the load wattage. Over the same wires, if a 240V supply is used, the current is cut in half (7.5A), and since loss goes as the square of the current, the wire loss drops to a quarter: about .14W per foot. At 40 ft, that's roughly 5.6W, or about 0.3% of the 1800W load.
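If you want to play with the numbers yourself, here's a rough Python sketch of that same I²R calculation. The function name and constants are just illustrative; the 0.0025 ohms/ft figure and 40 ft run are the example values above, treated the same way (40 ft of wire), so swap in your own gauge, length, and load:

```python
# Rough sketch of the I^2 * R wire-loss calculation from the example above.

def wire_loss_watts(load_watts, supply_volts, ohms_per_foot, run_feet):
    """Resistive loss in the wire for a given load, supply voltage, and run length."""
    current = load_watts / supply_volts              # I = P / V
    return current ** 2 * ohms_per_foot * run_feet   # P_loss = I^2 * R

OHMS_PER_FOOT_14AWG = 0.0025   # approximate chart value for 14 gauge copper
LOAD_W = 1800                  # example load
RUN_FT = 40                    # example distance from the breaker panel

for volts in (120, 240):
    loss = wire_loss_watts(LOAD_W, volts, OHMS_PER_FOOT_14AWG, RUN_FT)
    print(f"{volts} V: {loss:.1f} W lost ({100 * loss / LOAD_W:.2f}% of load)")
```

Running it prints about 22.5W (1.25%) for 120V and about 5.6W (0.31%) for 240V, matching the figures above.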
That's where the difference comes from - wire losses. But those losses are generally small anyway (unless the electrician did a shoddy job and undersized the wiring), and the difference is pretty negligible until you get into very long wire runs.