I don't believe that this is actually a problem of phase (delay). Doing a fair amount of handwaving over the math, light travels about a foot per nanosecond, so we'll use that as the propagation speed up the wire (yes, I know that's not strictly correct in a transmission line, but bear with me). How much differential wire length would be needed to induce a 10 deg phase shift at 20 kHz (that would be a swag at a just-noticeable difference)? Let's see: the period of a 20 kHz signal is 5e-5 sec (50 µs). A 10 deg phase shift would be 1/36 of that period, or about 1.4e-6 sec (1.4 µs). At a foot per nanosecond, that works out to roughly 1400 feet of wire.
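For anyone who wants to check the arithmetic, here's the same back-of-the-envelope calculation as a short script (the one-foot-per-nanosecond propagation speed is the same hand-wave as above, not a real transmission-line figure):

```python
# Back-of-the-envelope: how much *extra* wire length produces a
# 10-degree phase shift at 20 kHz, assuming signals travel roughly
# one foot per nanosecond (hand-wavy, but fine for an estimate).

FREQ_HZ = 20_000     # top of the audio band
PHASE_DEG = 10.0     # swag at a just-noticeable difference
FT_PER_NS = 1.0      # assumed propagation speed

period_s = 1.0 / FREQ_HZ                   # 5e-5 s, i.e. 50 microseconds
shift_s = period_s * (PHASE_DEG / 360.0)   # 1/36 of a period
extra_ft = (shift_s * 1e9) * FT_PER_NS     # nanoseconds -> feet

print(f"period of 20 kHz: {period_s * 1e6:.0f} us")
print(f"10-degree shift:  {shift_s * 1e6:.2f} us")
print(f"extra wire:       {extra_ft:.0f} feet")
```

Running it gives a shift of about 1.4 µs, or on the order of fourteen hundred feet of differential wire length.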
So if you have one speaker wire that's 1400 feet longer than the other, you may experience some phasing problems in the high end. Assuming that you can get into the room with that huge freakin' spool of wire in the way, of course. (;-)
A much bigger problem comes from the inductive and especially capacitive loading created by the wire, especially if you are driving line levels out to the power amps located in the active speakers. If one cable is much longer, and you are driving single-ended signals, you can expect parasitic RC losses to be noticeable in the high end *long* before you would see any phase-delay nightweirdies. Even well-driven balanced differential signals will experience different attenuation over skewed cable lengths, and at some point (well less than 1400 feet!) you would be able to hear it. What point? Search me; that is completely installation-dependent...
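To put a rough number on that RC claim: the driver's output impedance and the cable's capacitance to ground form a low-pass filter, and you can estimate where the treble starts to sag. The figures below (600 ohm source impedance, 30 pF per foot of cable) are plausible-sounding assumptions for illustration, not measurements of any particular gear:

```python
import math

# Rough RC-loading sketch: a single-ended line driver sees the cable
# as a capacitor to ground, giving a first-order low-pass filter.
# Both constants below are ASSUMED values for illustration only.
R_SOURCE_OHMS = 600.0   # assumed line-driver output impedance
CAP_PF_PER_FT = 30.0    # assumed cable capacitance per foot

def corner_freq_hz(length_ft: float) -> float:
    """-3 dB frequency of the source-impedance / cable-capacitance filter."""
    c_farads = length_ft * CAP_PF_PER_FT * 1e-12
    return 1.0 / (2.0 * math.pi * R_SOURCE_OHMS * c_farads)

for feet in (10, 100, 450, 1400):
    print(f"{feet:5d} ft -> corner freq ~{corner_freq_hz(feet) / 1000:.0f} kHz")
```

With these assumed numbers, the corner frequency dips into the audio band (20 kHz) somewhere around 450 feet of cable, comfortably short of the thousand-plus feet it would take for phase delay to matter. A badly skewed pair of long runs would sound different in the treble first.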
Anyway, if anybody really needs a rule of thumb, keeping the wires on the same order of magnitude lengthwise makes mathematical sense for loading the line drivers equally. And keeping them within a factor of 2 of each other would be in some sense "better", although I'm not at all convinced that you could actually hear it with decent quality interconnect cables. Perfectly matched identical lengths are always nice conceptually- but not worth losing sleep over, IMNSHO.
The wire, she ain't the problem...