One quantity to rule them all

This discussion is set firmly in the context of electronic circuits with currents and voltages, but root mean square is a common quantity in other domains as well, wherever a single number is useful for describing how “big” a signal is.

1. Effective values

Consider a DC voltage source \(V_X\) connected across a resistor \(R_{\mathrm{load}}\). The power absorbed by this resistor is then:

\[P_{\mathrm{load}} = V_X I_X = V_X \dfrac{V_X}{R_{\mathrm{load}}} = \dfrac{V_X^2}{R_{\mathrm{load}}}\]

Now imagine that this voltage source varies with time and is periodic with period T. It makes no difference what the shape of this waveform is, just that it repeats eventually.

Figure 1. Simple circuit for computing power
Definition

The effective voltage \(V_{\mathrm{eff}}\) is the value of a constant (DC) source that supplies the same average power to the resistor \(R_{\mathrm{load}}\) as the signal source \(v_X(t)\).

Compute the average power dissipated in the load resistor over a period of an arbitrary waveform \(v_X(t)\).

\[\begin{align} P_R &= \dfrac{1}{T} \int\limits_0^T p(t)\, dt \\ &= \dfrac{1}{T} \int\limits_0^T v_X(t) i_X(t)\, dt \\ &= \dfrac{1}{T} \int\limits_0^T v_X(t) \left(\dfrac{v_X(t)}{R}\right)\, dt \\ &= \dfrac{1}{R} \dfrac{1}{T} \int\limits_0^T v_X^2(t)\, dt \\ \end{align}\]

A DC source of the correct value, the effective voltage, supplies the same power to this resistor:

\[\dfrac{V_{\mathrm{eff}}^2}{R} = P_R = \dfrac{1}{R} \dfrac{1}{T} \int\limits_0^T v_X^2(t)\, dt\]
\[V_{\mathrm{eff}}^2 = \dfrac{1}{T} \int\limits_0^T v_X^2(t)\, dt\]

Throw in a square root and out pops our definition of root-mean-square:[1]

\[V_{\mathrm{RMS}} = V_{\mathrm{eff}} = \sqrt{\dfrac{1}{T} \int\limits_0^T v_X^2(t)\, dt}\]
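The RMS definition is easy to check numerically: sample one period, average the squared samples, and take the square root. Below is a minimal sketch; the `rms` helper and its midpoint-rule integration are my own illustrative choices, not part of the original development.

```python
import math

def rms(v, T, n=100_000):
    """Approximate RMS of a periodic signal v(t) over one period T:
    sqrt((1/T) * integral of v(t)^2 dt), via the midpoint rule."""
    dt = T / n
    # average of v(t)^2 sampled at segment midpoints over one period
    mean_square = sum(v((k + 0.5) * dt) ** 2 for k in range(n)) / n
    return math.sqrt(mean_square)

# Example: a 1 V amplitude, 50 Hz sine -- but any periodic v(t) works.
A, f = 1.0, 50.0
print(rms(lambda t: A * math.sin(2 * math.pi * f * t), T=1 / f))
```

Because the definition only cares about one period, the same helper works unchanged for any of the wave shapes discussed later.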

Important observations
Figure 2. Because S-M-R would be weird
  • An RMS quantity is a scalar value --- not a function of time.

  • RMS is synonymous with effective value.

  • RMS is, at its core, defined in terms of power.

2. Composite waveforms

Superposition is a powerful property of linear systems because it allows decomposing a signal into a sum of several signals (usually sinusoids, but not necessarily), which can then be treated separately.

  • How does superposition work for RMS quantities?

Be careful, it may not be what you would guess! Power quantities behave differently than voltage and current quantities.

Consider a signal composed of two periodic waveforms v1 and v2 with periods T1 and T2. Again, the shapes don’t matter as long as each part is periodic, which is the whole point of the effective value idea. Since both are periodic, the composite signal is also periodic with period T equal to any common multiple of T1 and T2; the least common multiple, when it exists, is the natural choice.

\[v_{\mathrm{total}}(t) = v_1(t) + v_2(t)\]

Compute the (square of the) RMS of this signal by using the definition

\[\begin{align} V_{\mathrm{RMS}}^2 &= \dfrac{1}{T} \int\limits_0^T \left[ v_1(t) + v_2(t) \right]^2\, dt \\ &= \dfrac{1}{T} \int\limits_0^T \left[ v_1^2(t) + 2 v_1(t) v_2(t) + v_2^2(t) \right]\, dt \\ \end{align}\]

Since integration is a linear operator, expand into three parts to better inspect each term

\[\begin{equation} \begin{split} V_{\mathrm{RMS}}^2 = &\dfrac{1}{T} \int_0^T v_1^2(t)\,dt \\ &+ \dfrac{2}{T} \int_0^T v_1(t) v_2(t)\,dt \\ &+ \dfrac{1}{T} \int_0^T v_2^2(t)\,dt \\ \end{split} \end{equation}\]

We see familiar expressions in the first and third terms --- they are simply the squared RMS values of the two component signals, \((v_{1\,\mathrm{RMS}})^2\) and \((v_{2\,\mathrm{RMS}})^2\). The second term requires careful treatment.

  • If signals \(v_1\) and \(v_2\) are orthogonal over the interval \([0,T]\), then this term is zero (that is the very definition of orthogonality). It turns out that any two sinusoids of different frequencies, integrated over the least common multiple of their periods, are orthogonal.

  • Another way to say this: the second term goes to zero if \(v_1\) and \(v_2\) are uncorrelated.

Therefore, for a sum of mutually uncorrelated signals, the composite’s RMS value is

\[V_{\mathrm{total}\,\mathrm{RMS}} = \sqrt{v_{1\,\mathrm{RMS}}^2 + v_{2\,\mathrm{RMS}}^2 + v_{3\,\mathrm{RMS}}^2 + \ldots}\]

This is called the root sum (of the) squares or RSS for short.


To repeat: RSS, root sum of the squares, only works when the signals are (choose one, choose all):

  • uncorrelated

  • orthogonal
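A quick numerical experiment makes the distinction concrete. In this sketch (the `rms` helper and the chosen amplitudes and frequencies are illustrative assumptions), two sinusoids at different frequencies obey RSS, while a signal added to itself, which is perfectly correlated, does not:

```python
import math

def rms(v, T, n=200_000):
    """Numerical RMS of v(t) over one period T (midpoint rule)."""
    dt = T / n
    return math.sqrt(sum(v((k + 0.5) * dt) ** 2 for k in range(n)) / n)

A1, A2 = 3.0, 4.0
f1, f2 = 50.0, 120.0   # different frequencies -> orthogonal sinusoids
T = 1.0                # 1 s is a common multiple of both periods
v1 = lambda t: A1 * math.sin(2 * math.pi * f1 * t)
v2 = lambda t: A2 * math.sin(2 * math.pi * f2 * t)

rss = math.sqrt(rms(v1, T) ** 2 + rms(v2, T) ** 2)
total = rms(lambda t: v1(t) + v2(t), T)
print(total, rss)      # these agree: the cross term vanished

# Counterexample: a signal summed with itself is perfectly correlated,
# so the cross term does NOT vanish and RSS underestimates the true RMS.
print(rms(lambda t: v1(t) + v1(t), T))  # 2 * RMS(v1), not sqrt(2) * RMS(v1)
```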

3. RMS of common wave shapes

Few people relish the thought of computing an integral just to get an “effective value,” but that’s the definition.

Fortunately, there are a few common wave shapes that we encounter all the time.

3.1. sinusoid

Signal: \(v_X(t) = A \sin(2\pi f t)\)

\[\begin{align} V_{\mathrm{RMS}} &= \sqrt{\dfrac{1}{T} \int\limits_0^T A^2\sin^2(2\pi f t)\, dt} \\ &= \sqrt{\dfrac{1}{T} \int\limits_0^T \dfrac{A^2}{2}\left(1 - \cos(2\cdot 2\pi f t)\right)\, dt} \\ &= \sqrt{\dfrac{A^2}{2T}\,T - \dfrac{A^2}{2T}\underbrace{\int\limits_0^T \cos(4\pi f t)\, dt}_{=\,0}} \\ &= \sqrt{\dfrac{A^2}{2}} \\ &= \dfrac{A}{\sqrt{2}} \end{align}\]
  • Repeat this calculation for the signal \(v_X(t) = A \cos(2\pi f t)\). Does the result change?

  • If you have a 120 VRMS signal (say, from a power outlet), what is the peak voltage value?
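If you want to check your hand calculations, a small numerical cross-check works well (the `rms` helper here is an illustrative sketch, not part of the text):

```python
import math

def rms(v, T, n=100_000):
    """Numerical RMS of v(t) over one period T (midpoint rule)."""
    dt = T / n
    return math.sqrt(sum(v((k + 0.5) * dt) ** 2 for k in range(n)) / n)

A, f = 1.0, 60.0
r_sin = rms(lambda t: A * math.sin(2 * math.pi * f * t), 1 / f)
r_cos = rms(lambda t: A * math.cos(2 * math.pi * f * t), 1 / f)
print(r_sin, r_cos)  # a pure phase shift does not change the RMS value
```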

3.2. triangle

This is a fun one! Set up a triangle wave centered around zero, meaning the average value is zero. It is useful to specify the “duty cycle” as the proportion of the period the waveform spends linearly rising (the signal falls linearly for the remainder). Then you only need to complete this integral once to get a parameterized result.

You need to do this by hand at least once in your life before you memorize the result.
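Once you have done the integral by hand, a numerical sketch can confirm your parameterized result. The `tri` helper and its duty-cycle parameter `d` below are my own illustrative construction:

```python
import math

def tri(t, T=1.0, d=0.3, A=1.0):
    """Zero-centered triangle wave (illustrative): rises from -A to +A
    over the first d*T of each period, then falls back over the rest."""
    t = t % T
    if t < d * T:
        return -A + 2 * A * t / (d * T)
    return A - 2 * A * (t - d * T) / ((1 - d) * T)

def rms(v, T, n=200_000):
    """Numerical RMS of v(t) over one period T (midpoint rule)."""
    dt = T / n
    return math.sqrt(sum(v((k + 0.5) * dt) ** 2 for k in range(n)) / n)

# Sweep the duty cycle and compare against your hand-derived result.
for d in (0.1, 0.5, 0.9):
    print(d, rms(lambda t: tri(t, d=d), 1.0))
```

Notice whether the duty cycle shows up in the answer at all --- the sweep should match whatever your parameterized integral predicts.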

3.3. square

Another fun one.

50% duty cycle, centered around 0 with peaks at ±A.

X% duty cycle, with peaks at ±A.
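As a cross-check after working these by hand, here is a numerical sketch; the `square` helper and its `duty` parameter are illustrative assumptions:

```python
import math

def square(t, T=1.0, duty=0.5, A=1.0):
    """Square wave (illustrative): +A for the first duty*T of each
    period, -A for the remainder."""
    return A if (t % T) < duty * T else -A

def rms(v, T, n=200_000):
    """Numerical RMS of v(t) over one period T (midpoint rule)."""
    dt = T / n
    return math.sqrt(sum(v((k + 0.5) * dt) ** 2 for k in range(n)) / n)

# Squaring erases the sign, so watch what happens as the duty cycle changes.
for duty in (0.5, 0.25, 0.9):
    print(duty, rms(lambda t: square(t, duty=duty), 1.0))
```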


1. To the mathematical folk who just twitched: yes, solving for Veff properly includes a ± in front of the square root. Remember that the Veff definition involving power always squares first, so nothing changes if you take the negative value; the same power is dissipated in the resistor. By convention, we take the positive value to reduce the negativity in our lives.