If it's connected to ~20' of 40pF/foot guitar cable, then it sees ~~1000pF of capacitance.
BTW shouldn't that be roughly 800pF?
I said it was rough!
(So rough that I made a mistake in the formula I used for generating the graph, which overstated the effect of multiple sections. The principle remains, but I will correct the post. Oh, and -3dB should be ~70% of the voltage, not 50%! )
I'm just looking at a classic low-pass filter where the resistor (R) is the output impedance of the previous stage and the capacitor (C) is the capacitance of the screened cable (length x capacitance per unit length, from the manufacturer's data sheet).
(Capacitive reactance: Xc = 1/(2 x pi x f x C), with f in Hertz and C in Farads)
Vout/Vin = Xc/sqrt(Xc^2 + R^2)
When a 1M pot is set at mid value, it looks like two 500k resistors in parallel to small signals - i.e. 250k, which is the worst case I could think of for R.
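The calculation above can be sketched in a few lines of Python. (A worst-case sketch using that 250k source impedance and the ~800pF of 20' of 40pF/foot cable from earlier in the thread; the values are illustrative, not a claim about any particular amp.)

```python
import math

def rc_lowpass(R, C, f):
    """Vout/Vin for the RC low-pass: Xc / sqrt(Xc^2 + R^2)."""
    xc = 1.0 / (2.0 * math.pi * f * C)  # capacitive reactance in ohms
    return xc / math.sqrt(xc**2 + R**2)

def f_minus3db(R, C):
    """-3dB corner frequency, where Vout/Vin = 1/sqrt(2) (~70%)."""
    return 1.0 / (2.0 * math.pi * R * C)

R = 250e3    # 1M pot at mid-rotation: 500k || 500k
C = 800e-12  # 20 feet of 40pF/foot cable

print(f_minus3db(R, C))         # ~796 Hz -- well down into the audio band
print(rc_lowpass(R, C, 796.0))  # ~0.707, i.e. ~70% of the voltage
```

At the corner frequency the output really is ~70% of the input, which is the "-3dB is 70%, not 50%" correction mentioned above.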
Yeah, I didn't think the figures sounded right, I just wasn't hearing the losses you described and I really should be hearing something. The link above has the co-ax at 26pF. Thankfully I don't know where you'd buy coax with a spec of 100pF per foot.
Seems to be squared up now. Still, if each run is only 20pF to ground (8" of 30pF/ft cable) instead of 100pF, that puts the -3dB point out at 32kHz. And, some people would insist that using low pF cable such as that would make the amp too bright unless some other HF reduction is included.
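For what it's worth, that 32kHz figure checks out against the same worst-case 250k source impedance used above (a quick sanity check; the 250k and the 20pF/100pF run capacitances are the values already quoted in this thread):

```python
import math

def f_minus3db(R, C):
    """-3dB corner of the RC low-pass formed by source impedance and cable capacitance."""
    return 1.0 / (2.0 * math.pi * R * C)

R = 250e3  # worst-case source impedance (1M pot at mid-rotation)
print(f_minus3db(R, 20e-12))   # 8" of 30pF/ft cable: ~32 kHz, well above audio
print(f_minus3db(R, 100e-12))  # a 100pF run: ~6.4 kHz, audibly dull
```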
martin manning wrote:Seems to be squared up now. Still, if each run is only 20pF to ground (8" of 30pF/ft cable) instead of 100pF, that puts the -3dB point out at 32kHz. And, some people would insist that using low pF cable such as that would make the amp too bright unless some other HF reduction is included.
Agreed.
The fun starts when someone tries to 'improve' a proven design - either by running shielded cable where there should be none and killing the high end, or by replacing 'normal' screened cable with super-duper low capacitance (PTFE??) cable and making it too bright.
...however, the original point was to demonstrate that these things are predictable and can be calculated.
(It sounds like I'm arguing with you, but I'm not! )
Not specifically germane (to the Rocket), but on my Express, I shielded the input and the treble and mid/bass lines (separately) and there is too much mid/bass. It matters.
Is there any difference in tying the shield to ground vs tying it to B+? Other than the obvious safety concerns.
I don't like giving anecdotal responses, but Gerald Weber had an article out on tying the shield to the plate, so I experimented with it. I found it was okay after the tone controls, with no apparent colouration; however, on the first stage there was a loss of highs, which I put down to the high output impedance of the guitar.
LeftyStrat wrote:So another question. Is there any difference in tying the shield to ground vs tying it to B+? Other than the obvious safety concerns.
There is no significant difference in frequency response between tying the shield to the plate, B+, or ground, except when the shielded run is between a series grid resistor and the grid. In that case there is a big difference when it's tied to the plate, because the cable capacitance is subject to the Miller effect, whereby it is multiplied by roughly one plus the voltage gain of the stage. For example, 20pF of cable capacitance after a 34k grid stopper (you leave the pair of 68k input resistors on the input jack, say) will move the -3dB point from 20kHz (shield connected to ground or B+) down to 3.6kHz (shield connected to the plate). The frequency shift of the -3dB point will be about the same for a shielded run from a 1M volume pot wiper (2.5 octaves).
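The shield-to-plate case can be sketched like this. (The stage gain of 64 is my assumption for a typical 12AX7-type stage, not a figure from the post; with it, 20pF of cable after a 34k grid stopper lands very close to the 3.6kHz quoted above.)

```python
import math

def f_minus3db(R, C):
    """-3dB corner of the RC low-pass formed by R and C."""
    return 1.0 / (2.0 * math.pi * R * C)

R_stop = 34e3     # 34k grid stopper (the pair of 68k input resistors in parallel)
C_cable = 20e-12  # capacitance of the shielded run
A = 64            # assumed voltage gain of the stage (typical 12AX7-ish value)

# Shield tied to the plate: the cable capacitance appears grid-to-plate,
# so it is Miller-multiplied by (1 + A)
C_miller = C_cable * (1 + A)
print(f_minus3db(R_stop, C_miller))  # ~3.6 kHz
```

With the shield at ground or B+ the 20pF stays just 20pF (the stage's own input capacitance then dominates the corner instead), which is why those two connections sound alike.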
Edit: Yeah, I guess it is funny in that tragic way. There was lost brightness due to the co-ax, then it was too bright, then hooking the shield to the plate came up and I overlooked Miller's effect.