DIY & Tutorials
DIY For Audio
DC Input protection - trip point ???
<blockquote data-quote="JimGore" data-source="post: 815323" data-attributes="member: 3"><p>Hi Guys,</p><p></p><p>I am fine-tuning my circuit design for DC <u>input</u> protection on my big amplifier build. The amplifier is fully DC coupled, so my plan is to monitor the input for DC and, if an excessive amount is present, have the circuitry automatically switch a polypropylene input capacitor into the signal path. That removes the DC, at the cost of the amplifier no longer being DC coupled.</p><p></p><p>The question is: at what point should the protection circuit trip? I am thinking I wouldn't want to see more than 50 mV - what do you think? The servo inside the amplifier should be able to handle 50 mV at the input, but beyond that it could get interesting.</p><p></p><p>Although, come to think of it, in a fully balanced amplifier the DC offset would (in theory, hopefully) be the same on both the HOT and COLD legs, so it might simply be rejected as common mode. Then again, there is no real guarantee that HOT and COLD will carry the same offset. Hmmm.... :thinking:</p><p></p><p>Thanks for your input!</p><p></p><p>Kind regards,</p><p>Ian.</p></blockquote><p></p>
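The detect-and-trip behaviour described in the post can be sketched in a few lines. This is only an illustrative model, not the poster's actual circuit: a slow one-pole low-pass extracts the DC component of the sampled input, and the protection "trips" (i.e. would switch in the polypropylene coupling cap) once the estimate exceeds the 50 mV figure mentioned above. The 25 mV release threshold and the filter coefficient are assumptions added here for hysteresis and a sub-20 Hz corner.

```python
import math

TRIP_MV = 50.0      # trip threshold, per the 50 mV figure in the post
RELEASE_MV = 25.0   # assumed release threshold (hysteresis, so it doesn't chatter)
ALPHA = 0.001       # one-pole low-pass coefficient; corner ~8 Hz at fs = 48 kHz

class DcProtect:
    """Toy model of DC-offset detection with trip/release hysteresis."""

    def __init__(self):
        self.dc = 0.0        # running DC estimate, in mV
        self.tripped = False

    def step(self, sample_mv):
        # One-pole IIR low-pass: y += alpha * (x - y).
        # Audio-band content is heavily attenuated; DC passes through.
        self.dc += ALPHA * (sample_mv - self.dc)
        if not self.tripped and abs(self.dc) > TRIP_MV:
            self.tripped = True   # switch the coupling cap into the path
        elif self.tripped and abs(self.dc) < RELEASE_MV:
            self.tripped = False  # DC gone: restore direct coupling
        return self.tripped

# Feed it a 1 kHz, 500 mV tone riding on 100 mV of DC at 48 kHz:
prot = DcProtect()
for n in range(20000):
    audio = 500.0 * math.sin(2 * math.pi * 1000 * n / 48000)
    prot.step(audio + 100.0)
print(prot.tripped)  # → True
```

A real implementation would of course do the filtering in analogue (or on a small MCU with an ADC), but the same points hold: the averaging time constant sets how fast a fault is caught, and hysteresis around the trip point keeps the relay or switch from chattering on a borderline offset.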