Here's the gist of what I thought were the salient points...
Ones and zeroes are ones and zeroes; they don't get degraded passing through various pieces of electronics the way an analog signal would. The place where errors are most likely to occur is the actual burning process: occasionally a one or a zero (I'm being simplistic) doesn't get burned, either because of mechanical/physical problems with the laser or because of imperfections on the disc surface. This is why the hardware that plays back the CD employs error correction, to replace the bits that are missing.
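To make the error-correction idea concrete, here's a toy sketch in Python (my own illustration, not what players actually run; audio CDs use the much stronger Cross-Interleaved Reed-Solomon Coding). A Hamming(7,4) code stores 4 data bits in 7 bits so that any single flipped bit can be located and fixed:

    def hamming_encode(d):
        """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4              # parity over codeword positions 1,3,5,7
        p2 = d1 ^ d3 ^ d4              # parity over codeword positions 2,3,6,7
        p3 = d2 ^ d3 ^ d4              # parity over codeword positions 4,5,6,7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming_decode(c):
        """Correct up to one flipped bit, then return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        bad = s1 + 2 * s2 + 4 * s3     # 1-based position of the flipped bit, 0 if none
        if bad:
            c = c[:]
            c[bad - 1] ^= 1            # flip it back
        return [c[2], c[4], c[5], c[6]]

    data = [1, 0, 1, 1]
    codeword = hamming_encode(data)
    codeword[5] ^= 1                   # simulate one bit botched in the burn
    assert hamming_decode(codeword) == data   # the original data comes back

The real scheme on a disc is interleaved so it can recover whole runs of damaged bits, which is why a lightly scratched CD can still play perfectly.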
From that point the argument switched to:
1) Which CD manufacturers made the best blank discs.
2) Which CD burners were the most/least error-prone.
3) How much the write speed of the burner factors into the equation.
#3 was the most interesting to me because it applies most directly to people in my situation: I'm burning CDs from my computer's CD-RW drive, and probably the smartest thing I can do is burn at 1X to decrease the possibility of errors.
I know the error-correction part sounds pretty suspect, but I remember taking two or three classes on the topic when I got my BSEE back in 1983. It's really fascinating how widely it's used and how well it works.
Hope I didn't start another flame war on the topic.