Found an article describing the voltage-measurement method of setting gain: http://knowledge.sonicelectronix.co...-amplifier-gains-using-a-digital-multi-meter/
This is the part I was not quite grasping:
My 4-channel puts out 63 watts x 4 @ 4 ohms (per birth cert). Using V = √(P × R): 63 × 4 = 252, and √252 ≈ 15.87 V. So I need to play a sine-wave test tone of ~1,000 Hz and turn the gain up until I read that voltage at the speaker outputs. I may just run a sweep and look for the peak.
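The math above is just Ohm's law rearranged; here's a quick sketch of it (the function name is mine, the numbers are from the birth cert):

```python
import math

def target_voltage(watts_rms, ohms):
    """Target AC voltage (V RMS) at the speaker terminals for full rated power.
    From P = V^2 / R, so V = sqrt(P * R)."""
    return math.sqrt(watts_rms * ohms)

# One channel: 63 W RMS into a 4-ohm load
print(round(target_voltage(63, 4), 2))  # → 15.87
```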
Reading further on a few other forums, it sounds like I need to do this test with a load on the amp (speaker and meter connected at the same time). I can check it both ways.
I was thinking, since the speakers are rated for less than the amp delivers, maybe it's better to use the speaker's rated power (20 W), which works out to about 8.9 V. I'm not sure why I calculated 12.6 V that day; probably because I combined both channels (40 W, which gives √(40 × 4) ≈ 12.65 V).
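Checking both of those figures the same way (4 Ω load assumed in both cases):

```python
import math

# Target voltage if gain is set to the speaker's rating, not the amp's:
v_20w = math.sqrt(20 * 4)  # 20 W RMS speaker on a 4-ohm load
v_40w = math.sqrt(40 * 4)  # both speakers' ratings combined, the 12.6 V figure

print(round(v_20w, 2), round(v_40w, 2))  # → 8.94 12.65
```

So the 12.6 V number does line up with the two ratings added together.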
I think the OEM speakers are normally very efficient, so they should be loud enough with little power. If they're rated at 20 W RMS I would stick to that instead of setting the gain to the maximum the amp can output. You CAN blow them by sending more power than they're rated for. And since that 20 W is an RMS figure and nobody really knows the peak rating, a continuous full-level test tone is actually more likely to blow them during gain setup than normal music playback is.
A 100 Hz test tone works well for woofers; 1,000 Hz is a bit too high, I think. I would really avoid trying to set the tweeters with a test tone at all (people usually use 5 kHz, but it's a lot easier to blow tweeters that way).
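If you don't have a test-tone download handy, you can generate one yourself. This is a minimal sketch using Python's standard `wave` module; the filename, duration, and full-scale level are my own choices (a full-scale, unclipped tone is what the method assumes the head unit is playing):

```python
import math
import struct
import wave

def write_tone(path, freq_hz=100, seconds=5, rate=44100):
    """Write a mono 16-bit full-scale sine test tone to a WAV file."""
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        frames = bytearray()
        for n in range(int(seconds * rate)):
            sample = int(32767 * math.sin(2 * math.pi * freq_hz * n / rate))
            frames += struct.pack("<h", sample)
        w.writeframes(bytes(frames))

write_tone("tone_100hz.wav")  # 100 Hz, 5 s, suitable for woofers
```

Burn it to a CD or put it on a USB stick with the head unit's EQ flat.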
One thing to note: I believe each OEM woofer and tweeter pair shares a channel, wired in parallel. Two 4-ohm drivers straight in parallel would actually be a 2-ohm load, not 8 (parallel halves the impedance; series wiring is what gives 8 ohms). In practice there's a crossover in between, so each driver mostly sees its own frequency band and the channel still looks like roughly a 4-ohm load, in which case around 16 V would still be fine (I'm just guessing here).
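The wiring math, and what each assumed load would mean for the target voltage at the amp's rated 63 W (all of this assumes nominal 4-ohm drivers):

```python
import math

# Two 4-ohm drivers combined on one channel:
parallel = 1 / (1 / 4 + 1 / 4)  # parallel halves the impedance → 2.0 ohms
series = 4 + 4                  # series adds → 8 ohms

# Target voltage for 63 W into each possible load
for ohms in (parallel, 4, series):
    print(f"{ohms} ohm -> {math.sqrt(63 * ohms):.2f} V for 63 W")
```

So the load assumption matters a lot: the target swings from about 11 V at 2 ohms to about 22 V at 8 ohms, with ~16 V being the 4-ohm case.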