Since the signal must travel through the positive wire to the destination and then return through the negative wire, the resistances of the two conductors are in series and therefore add. The thicker the cable (the lower the wire gauge number), the lower the resistance.
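To put rough numbers on that, here is a minimal Python sketch that estimates per-foot resistance from the standard AWG diameter formula (assuming solid annealed copper at about 20 C; the function names are just illustrative) and the out-and-back resistance of a cable:

    def awg_ohms_per_foot(gauge, rho_cmil_ft=10.37):
        # Approximate resistance per foot of solid copper wire at ~20 C.
        # rho_cmil_ft is copper resistivity in ohm-circular-mil per foot.
        diameter_mils = 5.0 * 92 ** ((36 - gauge) / 39)  # standard AWG definition
        circular_mils = diameter_mils ** 2
        return rho_cmil_ft / circular_mils

    def cable_round_trip_ohms(gauge, length_ft):
        # The signal goes out on one conductor and back on the other,
        # so the cable contributes twice its length in wire resistance.
        return 2 * length_ft * awg_ohms_per_foot(gauge)

    print(awg_ohms_per_foot(24))         # ~0.0257 ohm per foot
    print(cable_round_trip_ohms(24, 6))  # ~0.31 ohm for a 6 ft cable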
If the cable is intended to carry line-level signals and the destination is a high-impedance input, there will be negligible current flow through the cable and the wire gauge is essentially irrelevant. For example, assume a home CDP output of 2V rms maximum driving an amplifier with a 10K ohm input impedance; the maximum current is then 0.2mA rms. If you're using 24AWG wire, which has about 0.0257 ohms of resistance per foot, the voltage drop is only about 5 microvolts per foot of wire. A 6 foot cable (roughly 12 feet of wire, out and back) drops around 60 microvolts, which is completely negligible against a 2V signal.
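The same arithmetic in Python, using the assumed figures from the example above (0.0257 ohm/ft for 24AWG and a 6 foot run):

    # Line-level case: 2 V rms source into a 10k ohm input through 6 ft of 24 AWG.
    v_source = 2.0             # V rms, CDP maximum output
    r_load = 10_000.0          # ohm, amplifier input impedance
    r_cable = 2 * 6 * 0.0257   # ohm, out-and-back through the 6 ft cable (~0.31)

    i = v_source / (r_load + r_cable)   # ~0.2 mA rms
    v_drop = i * r_cable                # voltage lost in the cable
    print(f"{i*1e3:.3f} mA rms, {v_drop*1e6:.1f} uV dropped")  # ~0.200 mA, ~62 uV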
Wire gauge only becomes a factor when the cable carries significant current, so that the wire resistance causes a real voltage loss. A speaker cable is the obvious example. Assume a 100W amplifier driving an 8 ohm speaker at full output (about 28.3V rms); the rms current is then about 3.5A. If you used the same 6 foot 24AWG cable, the voltage drop across it would be roughly 1V, several percent of the signal. That is considerable loss!
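And the speaker-level case, again assuming the same 6 feet of 24AWG:

    import math

    # Speaker case: 100 W into 8 ohm, same 6 ft of 24 AWG cable.
    p = 100.0                  # W, amplifier rated power
    r_speaker = 8.0            # ohm
    r_cable = 2 * 6 * 0.0257   # ohm, out-and-back (~0.31)

    v_amp = math.sqrt(p * r_speaker)     # ~28.3 V rms at full output
    i = v_amp / (r_speaker + r_cable)    # ~3.4 A rms
    v_drop = i * r_cable                 # ~1 V lost in the cable
    print(f"{i:.2f} A rms, {v_drop:.2f} V dropped across the cable")

Rerunning the same sketch with a heavier gauge (for example 12AWG, roughly 0.0016 ohm per foot) brings the drop down to under 0.1V, which is why speaker cables are made much thicker than interconnects.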