One quantity to rule them all
This discussion is set firmly in the context of electronic circuits with currents and voltages, but root mean square is a common quantity in other domains as well, wherever a single number is useful to describe how “big” a signal is.
1. Effective values
Consider a DC voltage source \(V_X\) connected in parallel with a resistor \(R_{\mathrm{load}}\). The power absorbed by this resistor is then:
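\[
P = \frac{V_X^2}{R_{\mathrm{load}}}
\]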
Now imagine that this voltage source varies with time, \(v_X(t)\), and is periodic with period \(T\). It makes no difference what the shape of this waveform is, just that it eventually repeats.
- Definition: The effective voltage \(V_{\mathrm{eff}}\) is the value of a constant (DC) source that supplies the same average power to the resistor \(R_{\mathrm{load}}\) as the signal source \(v_X(t)\).
Compute the average power dissipated in the load resistor over one period of an arbitrary waveform \(v_X(t)\):
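\[
P_{\mathrm{avg}} = \frac{1}{T} \int_0^T \frac{v_X(t)^2}{R_{\mathrm{load}}} \, dt
\]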
A DC source of the correct value, the effective voltage, will supply the same average power to this resistor:
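\[
P_{\mathrm{avg}} = \frac{V_{\mathrm{eff}}^2}{R_{\mathrm{load}}}
\quad \Longrightarrow \quad
V_{\mathrm{eff}}^2 = \frac{1}{T} \int_0^T v_X(t)^2 \, dt
\]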
Throw in a square-root and out pops our definition of root-mean-square:[1]
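\[
V_{\mathrm{RMS}} = V_{\mathrm{eff}} = \sqrt{ \frac{1}{T} \int_0^T v_X(t)^2 \, dt }
\]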
2. Composite waveforms
Superposition is a powerful property of linear systems because it allows decomposing a signal into a sum of several signals (usually sinusoids, but not necessarily), which can then be treated separately.
- How does superposition work for RMS quantities? Be careful, it may not be what you would guess! Power quantities behave differently than voltage and current quantities.
Consider a signal composed of two periodic waveforms \(v_1\) and \(v_2\) with periods \(T_1\) and \(T_2\). Again, the shapes don’t matter as long as each part is periodic, which is the whole point of the effective value idea. Since both are periodic, the composite signal is also periodic with period \(T\) equal to the least common multiple of \(T_1\) and \(T_2\). Note that \(T\) can be any common multiple of the individual periods.
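The composite signal is then
\[
v_X(t) = v_1(t) + v_2(t)
\]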
Compute the (square of the) RMS of this signal by using the definition:
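\[
(v_{X\,\mathrm{RMS}})^2 = \frac{1}{T} \int_0^T \left( v_1(t) + v_2(t) \right)^2 dt
\]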
Since integration is a linear operator, expand into three parts to better inspect each term:
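\[
(v_{X\,\mathrm{RMS}})^2
= \frac{1}{T} \int_0^T v_1(t)^2 \, dt
+ \frac{2}{T} \int_0^T v_1(t) \, v_2(t) \, dt
+ \frac{1}{T} \int_0^T v_2(t)^2 \, dt
\]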
We see familiar expressions in the first and third terms: they are simply the squares of the RMS values of the two component signals, \((v_{1\,\mathrm{RMS}})^2\) and \((v_{2\,\mathrm{RMS}})^2\). The second term requires careful treatment.
- If signals \(v_1\) and \(v_2\) are orthogonal over the interval \([0,T]\), then this term is zero (that is the very definition of orthogonality). It turns out that any two sinusoids of different frequencies, integrated over the least common multiple of their respective periods, are orthogonal.
- Another way to say this: the second term goes to zero if \(v_1\) and \(v_2\) are uncorrelated.
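In either case, the middle term vanishes:
\[
\frac{2}{T} \int_0^T v_1(t) \, v_2(t) \, dt = 0
\]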
Therefore, for the sum of uncorrelated signals, the composite’s RMS value is
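\[
v_{X\,\mathrm{RMS}} = \sqrt{ (v_{1\,\mathrm{RMS}})^2 + (v_{2\,\mathrm{RMS}})^2 }
\]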
This is called the root sum (of the) squares or RSS for short.
To repeat: RSS, root sum of the squares, only works when the signals are (choose one, choose all):
- orthogonal over the interval \([0,T]\)
- uncorrelated
- sinusoids of different frequencies (a special case of orthogonal)
3. RMS of common wave shapes
Few people relish the thought of computing an integral just to get an “effective value,” but that’s the definition.
Fortunately, there are a few common wave shapes that we encounter all the time.
3.1. Sinusoid
Signal: \(v_X(t) = A \sin(2\pi f t)\)
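Apply the definition over one period \(T = 1/f\), using the fact that \(\sin^2\) averages to \(1/2\) over a full period:
\[
v_{X\,\mathrm{RMS}} = \sqrt{ \frac{1}{T} \int_0^T A^2 \sin^2(2\pi f t) \, dt } = \sqrt{ \frac{A^2}{2} } = \frac{A}{\sqrt{2}} \approx 0.707 \, A
\]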
- Repeat this calculation if the signal is \(v_X(t) = A \cos(2\pi f t)\). Does the result change?
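Hint: a cosine is just a time-shifted sine, and shifting a periodic waveform does not change its average power.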
- If you have a 120 VRMS signal (say, from a power outlet), what is the peak voltage value?
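As a quick check using the previous result: for a sinusoid, \(V_{\mathrm{peak}} = \sqrt{2} \, V_{\mathrm{RMS}}\), so
\[
V_{\mathrm{peak}} = \sqrt{2} \times 120 \approx 170 \ \mathrm{V}
\]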
3.2. Triangle
This is a fun one! Set up a triangle wave that is centered around zero, meaning that the average value must be zero. It is useful to specify the “duty cycle” \(D\) as the proportion of the period the waveform spends linearly rising (during the other portion, the signal is linearly falling). Then you only have to complete this integral once and get a parameterized result.
You need to do this by hand at least once in your life before you memorize the result.
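A sketch of the setup, assuming peak amplitude \(A\) and duty cycle \(D\): the rising portion is \(v_X(t) = A \left( \frac{2t}{DT} - 1 \right)\) for \(0 \le t \le DT\), and the falling portion is its mirror image. Each segment contributes the same mean square:
\[
\frac{1}{DT} \int_0^{DT} A^2 \left( \frac{2t}{DT} - 1 \right)^2 dt = \frac{A^2}{3}
\quad \Longrightarrow \quad
v_{X\,\mathrm{RMS}} = \frac{A}{\sqrt{3}}
\]
Notice that \(D\) drops out: the duty cycle does not affect the RMS value of a zero-centered triangle wave.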