That's not hard to believe. The last time I did any real(ish) solar calculations was for a buddy's house, many years ago, to show him that it would never pay off. That system converted to AC and stepped up to line voltage at the meter. So thanks for the correction.
Ah. Knowing this: use 14 gauge. Any voltage dropped in the wire is voltage the controller doesn't have to burn off, so it dissipates less power as heat. That means there's a ton of room to be lossy in transmission (up to the 8.5 watts, minus some required conversion cost).
And we know the current is 2.22 amps. So a 15 amp-hour/day load needs about 6.76 hours of charge time.
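A quick sanity check on that arithmetic, in Python (the 40 W / 18 V panel numbers are from earlier in the thread; the 15 Ah/day figure is the load we've been assuming):

```
# Charge-time sanity check for a 40 W / 18 V panel feeding a 15 Ah/day load.
panel_watts = 40.0
panel_volts = 18.0
daily_amp_hours = 15.0

charge_amps = panel_watts / panel_volts        # 2.22 A
hours = daily_amp_hours / charge_amps          # ~6.76 h
print(f"{charge_amps:.2f} A -> {hours:.2f} h of full sun per day")
```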
We don't have to guess the efficiency; it's dialed into the controller, and it's linear. (Yuck.) At 40 watts in you lose 8.5 watts; double the input current and you lose double the power, 17 watts.
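To make that "linear" point concrete, here's a minimal sketch of the loss model. The charging voltage isn't stated anywhere above, so I'm backing it out of the 8.5 W figure; treat that number as an assumption.

```
# Linear controller loss model: the controller simply burns off the
# panel-to-battery voltage difference as heat at whatever current flows.
v_in = 18.0                      # panel voltage at the controller input
v_batt = v_in - 8.5 / 2.22       # ~14.17 V, backed out of the 8.5 W figure

def controller_loss(amps):
    """Watts dissipated in the controller at a given input current."""
    return (v_in - v_batt) * amps

print(controller_loss(2.22))     # 8.5 W at rated panel current
print(controller_loss(4.44))     # 17.0 W -- double the current, double the loss
```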
I'm back. I thought I would back up the "use 14 gauge" argument a bit.
From "Amateur Radio Relay Handbook" 1985 I find resistance in 14 AWG is 2.575 ohms / 1000ft.
If we use a 100 ft drop cord (that ought to be plenty, eh?), then we add 0.2575 ohms to each of the positive and negative leads, for a total of 0.515 ohms.
That drops our 18 V input by 2.22 A x 0.515 ohms, about 1.14 V (a 6.35% loss), leaving 16.86 V at the controller.
Since we're constant current, and assuming 16.86 V still meets the controller's minimum input voltage requirement, we end up with the same 2.22 amps into the battery. The cord itself dissipates about 2.5 watts (2.22 squared x 0.515 ohms), so the controller only needs to shed roughly 5.9 watts as heat instead of 8.5.
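And here's the whole 100 ft / 14 AWG argument in one runnable sketch. The wire resistance is the handbook figure above; the charging voltage is still the value I backed out of the 8.5 W loss, so it's an assumption:

```
# End-to-end numbers for a 100 ft, 14 AWG drop cord between panel and controller.
ohms_per_kft = 2.575              # 14 AWG, from the handbook table above
run_ft = 100.0                    # one-way cord length
v_panel = 18.0
amps = 40.0 / v_panel             # 2.22 A (constant-current assumption)
v_batt = v_panel - 8.5 / amps     # ~14.17 V, assumed charging voltage

r_wire = 2 * run_ft * ohms_per_kft / 1000.0        # 0.515 ohm, both leads
v_drop = amps * r_wire                             # ~1.14 V lost in the cord
v_at_controller = v_panel - v_drop                 # ~16.86 V
p_wire = amps ** 2 * r_wire                        # ~2.5 W heats the cord
p_controller = (v_at_controller - v_batt) * amps   # ~5.96 W left in the controller

print(f"{v_at_controller:.2f} V at the controller input")
print(f"{p_wire:.1f} W in the cord, {p_controller:.1f} W in the controller")
```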
And since the controller is so inefficient, I could even argue that 14 gauge is the right choice: it will let the controller run cooler.
Keep in mind that there are other losses in other places too. Hook everything together and test the voltage at the controller input. If it meets the controller's minimum voltage requirement, you're good to go.