DC Input protection - trip point ???

JimGore

AVForums Grandmaster
*
Joined
Jul 8, 2005
Messages
4,488
Reaction score
32
Location
Jhb
Hi Guys,

I am fine-tuning my circuit design for DC input protection on my big amplifier build. The amplifier is fully DC coupled (obviously), so I reckon I will be clever and monitor for DC on the input; if an excessive amount of DC is present, the circuitry will automatically switch a polypropylene input capacitor into the signal path, removing the DC but leaving the amplifier no longer DC coupled.
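For what it's worth, here is a rough sketch of the detector behaviour I have in mind, written as a little Python sanity check rather than actual circuitry - the sample rate, corner frequency and thresholds are placeholders of mine, not design values. The idea is to low-pass the input hard enough that audio averages out, then compare the residual DC against the trip point with some hysteresis so the relay doesn't chatter:

import math

FS = 48000           # sample rate used for this sanity check (Hz)
TRIP_MV = 50.0       # proposed trip threshold (mV) - placeholder value
RELEASE_MV = 25.0    # hysteresis: release well below the trip point
CUTOFF_HZ = 1.0      # detector low-pass corner; well below the audio band

# First-order low-pass coefficient (exponential moving average)
alpha = 1.0 - math.exp(-2.0 * math.pi * CUTOFF_HZ / FS)

def simulate(dc_offset_mv, seconds=3.0, signal_mv=500.0, freq=100.0):
    """Feed a 100 Hz tone plus a DC offset through the detector."""
    avg = 0.0
    tripped = False
    for i in range(int(seconds * FS)):
        sample = signal_mv * math.sin(2.0 * math.pi * freq * i / FS) + dc_offset_mv
        avg += alpha * (sample - avg)       # running DC estimate
        if not tripped and abs(avg) > TRIP_MV:
            tripped = True                  # switch the polyprop cap in
        elif tripped and abs(avg) < RELEASE_MV:
            tripped = False                 # offset gone - back to DC coupled
    return avg, tripped

for offset in (0.0, 30.0, 80.0):
    dc, tripped = simulate(offset)
    print(f"offset {offset:5.1f} mV -> detector sees {dc:6.1f} mV, tripped={tripped}")

With a 1 Hz detector corner the 100 Hz tone averages out and only the offset is left, so 30 mV stays below the trip point and 80 mV trips it.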

The question is: at what point should the protection circuit trip? I am thinking I wouldn't want to see more than 50 mV - what do you think? The servo inside the amplifier should be able to take care of 50 mV at the input, but beyond that it could potentially get interesting.
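To put a rough number on "interesting": if the servo can't null it, whatever DC gets in is multiplied by the amplifier's closed-loop gain. Assuming, purely for illustration, a voltage gain of about 26 dB (roughly 20x - substitute the actual design figure):

gain_db = 26.0                      # assumed voltage gain, not the real design value
gain = 10 ** (gain_db / 20.0)       # 26 dB is roughly 20x
for offset_mv in (50, 100, 500):
    print(f"{offset_mv} mV in -> {offset_mv * gain / 1000:.1f} V DC at the output")

So 50 mV in would already be about 1 V DC at the output if nothing corrects it, which is why I don't fancy letting much more than that through.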

Although, come to think of it, in a fully balanced amplifier the DC offset would (in theory, hopefully) be the same on both the HOT and COLD legs, so it may simply be rejected as common mode. Then again, there is no real guarantee that HOT and COLD will carry the same offset. Hmmm....  :thinking:
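Just to spell that out: the amplifier only responds to the difference between HOT and COLD, so matched offsets cancel and only the mismatch appears as DC that the servo or protection has to deal with. A trivial sketch with made-up numbers:

def differential_dc(hot_offset_mv, cold_offset_mv):
    """DC the amplifier actually sees is the HOT/COLD mismatch."""
    return hot_offset_mv - cold_offset_mv

print(differential_dc(100.0, 100.0))   # matched offsets -> 0 mV, rejected as common mode
print(differential_dc(100.0, 30.0))    # mismatched -> 70 mV gets through to the detector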

Thanks for your input!

Kind regards,
Ian.
 