From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: alt.comp.dcom.modems,alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 5 Aug 1999 02:17:56 GMT

Jason Spashett <jason_spashett@neuk.net> wrote:

>-dBm does anyone know what this measurement means? it's the Rx
>LEVEL from an at&v1 report, but why is it minus? Anyone?

Generally the letters "dB" with another character (or more)
tacked on the end mean dB compared to some value that the
character(s) represent.  In the case of an "m", it is a reference
to a milliwatt.  A negative number means less than a milliwatt
and a positive number means more than a milliwatt.  Hence if you
have a receive level of -23 dBm, it means your signal is 23 dB
less than a milliwatt.
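
Just to make that arithmetic concrete, here is a little Python
sketch (the function names are only mine, for illustration): dBm
is simply ten times the base-10 log of the power in milliwatts.

    import math

    def dbm_to_milliwatts(dbm):
        """Absolute power, in milliwatts, for a given dBm level."""
        return 10 ** (dbm / 10.0)

    def milliwatts_to_dbm(milliwatts):
        """dBm level for a given power in milliwatts."""
        return 10.0 * math.log10(milliwatts)

    print(dbm_to_milliwatts(-23))    # ~0.005 mW, i.e. 23 dB below one milliwatt
    print(milliwatts_to_dbm(1.0))    # 0.0 dBm, exactly one milliwatt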

That is the short answer to your question.  What follows is
the much more interesting peripheral information (which is
probably more than you wanted to ever hear about!).

Another variation that is commonly seen adds a number after the
letter, such as "dBm0".  This is a little trickier to
understand.  It means the level compared to a milliwatt _after_
the value is adjusted to make a "correct" value be 0 dBm.  Hence
a circuit that goes through many different devices with various
amounts of gain or loss would have the exact correct signal
value of "0 dBm0" at every point, even though the actual
level might be anything.  If the value measured should be -16
dBm (which is called a "-16 dBm Test Level Point" and
abbreviated -16 dBm TLP), and in fact it is measured at -18 dBm,
then that reading would be called "-2 dBm0" (meaning it is 2 dB
lower than it should be).

The significance of the above (which confuses many people) is
that data signals are always set to -13 dBm0 or lower.  Hence if
a Plain Old Telephone Service (POTS) line is supposed to have a
receive level of -9 dBm, a 0 dBm0 signal would be -9 dBm.  And
that is what a test tone would be.  But data signals are at -13
dBm0, so it would read -22 dBm if it is exactly correct.
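
To put numbers on that in code (again, a sketch of my own, nothing
more), the expected dBm reading at a test point is just the TLP
plus the dBm0 value:

    def expected_dbm(tlp_dbm, signal_dbm0):
        """Reading, in dBm, expected at a point with the given TLP."""
        return tlp_dbm + signal_dbm0

    # POTS line with a -9 dBm receive TLP:
    print(expected_dbm(-9, 0))      # test tone at 0 dBm0 reads -9 dBm
    print(expected_dbm(-9, -13))    # data signal at -13 dBm0 reads -22 dBm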

  Floyd



--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: alt.comp.dcom.modems,alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 6 Aug 1999 03:34:07 GMT

In article <37A9A7AC.813372A8@prairie.ca>,
Southwest Telephone  <suitedrm@prairie.ca> wrote:
>> (which is probably more than you wanted to ever hear about!).
>
>At Collins we measured some of our MUX signals in dBrnC0

No, you didn't.  You measured *noise* energy with that reading,
not signal level!

>deciBells-residual noise-C message weighting-referenced to 0 dbm at 600
>ohms (or termination)

dBrn is "dB compared to reference noise level".  That is a scale
where 1 milliwatt will read 90 dBrn, and 0 dBrn would be -90
dBm.  C-message is a weighting filter that is designed to match
the human ear and the response of the telephone receiver, such
that the measured noise gives a relatively good indication of
how much it will interfere with a voice conversation.

dBrn0 is an adjusted figure to show the value that would be read
if test tone level were 0 dBm.  Hence if the Test Level Point is
-10 dB (for example at the end of a cable with 10 dB of loss),
then the measured value of dBrnC would have 10 added to it to
become dBrnC0.  That way noise levels can be compared to signal
levels, because while a 30 dBrnC noise level might be just OK if
the signal level is +7 dBm, it would be horrible if the signal
level is supposed to be -16 dBm.  The first amounts to a 23
dBrnC0 noise, and the second is a 46 dBrnC0 noise.
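
A quick sketch of those conversions (my own illustration, using
the definitions above: 0 dBrn is -90 dBm, and dBrnC0 is dBrnC
with the TLP subtracted out):

    def dbm_to_dbrn(dbm):
        """dBrn reading for a given dBm level (0 dBrn is defined as -90 dBm)."""
        return dbm + 90.0

    def dbrnc_to_dbrnc0(dbrnc, tlp_dbm):
        """Correct a C-message noise reading to what it would read at a 0 dBm TLP."""
        return dbrnc - tlp_dbm

    print(dbm_to_dbrn(0))               # 1 milliwatt reads 90 dBrn
    print(dbrnc_to_dbrnc0(30, +7))      # 23 dBrnC0 -- tolerable at a +7 TLP
    print(dbrnc_to_dbrnc0(30, -16))     # 46 dBrnC0 -- horrible at a -16 TLP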

Once again, the 600 ohms is not significant, and the same noise
power is available whether it is 600 ohms or some other odd
value.
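
The arithmetic behind that, as a sketch of mine (the voltages are
just illustrative): power is the voltage squared divided by the
impedance, so the same milliwatt is 0 dBm no matter what the
impedance happens to be.

    import math

    def dbm_from_voltage(v_rms, impedance_ohms):
        """Power in dBm computed from an RMS voltage across a known impedance."""
        milliwatts = (v_rms ** 2 / impedance_ohms) * 1000.0
        return 10.0 * math.log10(milliwatts)

    # One milliwatt is 0 dBm whatever the impedance:
    print(dbm_from_voltage(0.7746, 600))    # ~0 dBm  (0.775 volts across 600 ohms)
    print(dbm_from_voltage(0.3873, 150))    # ~0 dBm  (0.387 volts across 150 ohms)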

>btw, how loud is a Bell?

There is no loudness to a Bell.  (Does that ring true?)

A deciBell is approximately the smallest _change_ in volume that
can be detected by the human ear.  3 dB is a 2:1 power ratio, 6
dB is 4:1, and 9 dB is 8:1.  (Each 3 dB of loss cuts the power in
half, and each 3 dB of gain doubles it.)  Hence 10 dB (a Bell) is
a 10:1 power change, just a bit more than the 8:1 of 9 dB.  Of
course that is a very noticeable change.
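
The same numbers worked out in a short sketch (just my
illustration):

    def db_to_power_ratio(db):
        """Power ratio corresponding to a given number of dB."""
        return 10 ** (db / 10.0)

    print(db_to_power_ratio(3))     # ~2.0  -- 3 dB is roughly 2:1
    print(db_to_power_ratio(9))     # ~7.9  -- 9 dB is roughly 8:1
    print(db_to_power_ratio(10))    # 10.0  -- 10 dB (a Bell) is exactly 10:1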

  Floyd


--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: alt.comp.dcom.modems,alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 8 Aug 1999 02:36:40 GMT

H. Peter Anvin <hpa@transmeta.com> wrote:
>floyd@ptialaska.net wrote:
>>
>> Another variation that is commonly seen adds a number after
>> the letter, such as "dBm0".  This is a little trickier to
>> understand.  It means the level compared to a milliwatt
>> _after_ the value is adjusted to make a "correct" value be 0
>> dBm.
>>
>> Hence a circuit that goes through many different devices with
>> various amounts of gain or loss would have the exact correct
>> signal value of "0 dBm0" at every point, even though the actual
>> level might be anything.  If the value measured
>> should be -16 dBm (which is called a "-16 dBm Test Level
>> Point" and abbreviated -16 dBm TLP), and in fact it is
>> measured at -18 dBm, then that reading would be called "-2
>> dBm0" (meaning it is 2 dB lower than it should be).
>>

I'm not sure that I understand which part of this makes no
sense!  The concept of a TLP is certainly confusing to most
people when a simple attempt is made to explain it, and I'm
quite sure that my attempt was far too simple.  I need a good
picture to demonstrate, but that gets difficult.  (XEmacs to the
rescue...)

>That makes no sense.  The "m" refers to the reference (i.e. 0
>dBm == 1 mW), so although you can say that the "correct" value
>is 0 dB (meaning you're taking the values relative to whatever
>the correctly calibrated circuit is), it doesn't make it dBm
>for that, since that refers to a *specific* calibration scale.

No.  The correct value, _adjusted_ as if it were 0 dBm.  (And it
is definitely "dBm", not dB.  "dB" refers to a ratio, while "dBm"
refers to a specific power level.)

Let me try drawing a simple circuit to demonstrate.  Let's say
this is a one-way only circuit (so I don't have to put both
transmit and receive levels on it).  The part shown comes out of
a channel bank, goes to a pad, is then transported over a cable
pair to some distant point.  The levels shown are for a test
tone.


  +-----+      +-----+               +-----+      +-----+
  |     |      |     |     cable     |     |      |     |
  |Chan |      | 2 dB|  (6 dB loss)  |15 dB|      |     |
  |Bank |----->| PAD |->-----//----->| PAD |----->|Modem|
  |(Rcv)|  +7  |     | +5         -1 |     | -16  |     |
  |     | dBm  |     |dBm        dBm |     | dBm  |     |
  +-----+      +-----+               +-----+      +-----+

At each point, the level shown is the correct level for a test
tone, and is also the TLP.  Hence, the output of the channel
bank is called a +7 TLP, and the input to the modem is called a
-16 TLP.

If we measured this circuit, and at each point the levels were
exactly what is listed, then at each point the level would be
"0 dBm0".

But let's consider a more likely set of measurements!  The
channel bank output is actually a little low, it reads +6 dBm
(or -1 dBm0); the first pad isn't really 2 dB, it is 2.3 dB, so
the signal hitting the cable is +3.7 dBm (-1.3 dBm0); the cable
is wet today and therefore has 7.5 dB of loss instead of 6 dB, so
the level out of the cable is -3.8 dBm (-2.8 dBm0); and the pad
before the modem is actually a 14 dB pad instead of 15 dB, so
the level going into the modem is -17.8 dBm (-1.8 dBm0).

(One way to demonstrate the usefulness of dBm0 is to fabricate an
example like the one above!  I guarantee that it was easiest to
decide how far off each particular loss was, calculate how that
would change the dBm0 value, and only then figure the dBm value.
Figuring the dBm value before the dBm0 value is just a lot
harder.)

To show the usefulness of using dBm0 values, here is a table:

   Test Point     dBm0

   ChanBk Out     -1
   Cable Head     -1.3
   Cable End      -2.8
   Modem Input    -1.8

Each value shows how far from the correct value the level actually
is.  Here is a table of the raw (so-called "uncorrected") readings:

   Test Point     dBm     TLP

   ChanBk Out     +6      +7.0
   Cable Head     +3.7    +5.0
   Cable End      -3.8    -1.0
   Modem Input    -17.8   -16.0

Clearly another column is needed to indicate what the TLP is!
Otherwise that table is worthless.
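
For what it is worth, here is a sketch (mine, using the made-up
readings above) that produces both tables from the measured dBm
values and the TLPs; the dBm0 value is simply the measured dBm
minus the TLP.

    readings = [                 # (test point, measured dBm, TLP in dBm)
        ("ChanBk Out",    +6.0,  +7.0),
        ("Cable Head",    +3.7,  +5.0),
        ("Cable End",     -3.8,  -1.0),
        ("Modem Input",  -17.8, -16.0),
    ]

    for point, dbm, tlp in readings:
        dbm0 = dbm - tlp         # "correct" the reading to what it would be at a 0 dBm TLP
        print(f"{point:12}  {dbm:+6.1f} dBm at a {tlp:+5.1f} TLP  ->  {dbm0:+.1f} dBm0")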

Another practical example would be a leased line modem circuit
that goes across the country, with testboard access at 10
different locations.  If the level end-to-end is not correct,
the problem must be traced by checking with each testboard.  If
each testboard reports a dBm level, it must also report the TLP
if the location asking is to know whether that level is too
high, too low, or just right.  Reporting a dBm0 value, which is
"corrected" for whatever the TLP is, eliminates that confusion,
because the value given _is_ the amount by which the signal is
too high or too low.

  Floyd



--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: varney@ihgp2.ih.lucent.com (Al Varney)
Newsgroups: alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 17 Sep 1999 21:00:01 GMT

In article <7qu7ft$nv5$1@nyheter.chalmers.se>,
Maxime Flament <Maxime.Flament@s2.chalmers.se> wrote:
>Floyd Davidson <floyd@ptialaska.net> wrote in message
>news:7pnhui$2dop@enews3.newsguy.com...
>> NO SPAM  <nospam@rsccd.org> wrote:
>> >Floyd Davidson wrote:
>> >> That is but one example.  The point is that the definition of dBm
>> >> is independent of the impedance.  Power can be measured in
>> >> circuits with virtually any impedance, and there is no requirement
>> >> that a circuit be 600 ohms to measure power in dBm as your
>> >> original post indicated.
>> >
>> >I didn't indicate that, nor did I contradict what you said. I just said
>> >that practically applied measurement of power is usually done with a
>> >voltmeter, into a known impedance.  Without knowing the impedance, the
>> >voltage can't be directly related to the power.  You admitted that you
>> >were being pedantic.
>>
>> The original comment that I responded to said "dBm is zero at 1
>> milliwatt into 600 ohms."
>>
>> It isn't.  That was my point.  It is a valid point.
>
>Indeed, "dBm is zero at 1 milliwatt into 600 ohms." has no meaning: 1 mW in
>600Ohms or 1200Ohms is still 1mW.

   Still, Floyd, the above phrase keeps coming up.  Did you ever wonder why?
Well, it's time for The Rest of the Story....

   In January 1938, Bell Labs, CBS and NBC engineers began trying to move
the broadcast industry and the Bell System toward a common standard for
the design and usage of "volume indicators".  At the time, such indicators
were rather ad-hoc devices, using a reference of 1, 6, 10, 12.5 or 50
milliwatts into 500 or 600 ohms.  Scales were sometimes in the new "dB"
unit, sometimes in volts, sometimes in percentage of peak power, etc.

   The first documented case for a "volume indicator" was in setting up
the public address system for use on November 11, 1921 (Armistice Day, now
Veterans Day), when the Unknown Soldier was to be buried at Arlington.
A public address system in those days was not a local affair -- large
audiences were expected in New York and San Francisco to listen to the
occasion.  The Bell System (and Western Electric) supplied loudspeakers,
amplifiers and a communications infrastructure (15kHz lines) to support
this "simul-cast".

   Amplifier overload was detected in the testing of the system, and the
Bell System responded by building (and patenting) a "volume indicator"
(Types 518 and 203).  The reference level was set to read 0 dB at a point
just below distortion in the telephone repeaters (amplifiers) carrying
this type of high-quality voice.  The user was to adjust "volume" to the
point where the meter needle reached 0 dB (mid-scale) about once every
ten seconds.

   From that point, usage of these types of meters exploded both within
the Bell System and within organizations involved in sound broadcast.
But there was no agreement on the reference values, the responsiveness
of the meters, the tracking of RMS vs. peak voltages or even the units
to be used on the meter scales.

   This would not have been such a problem if each organization had worked
in a closed environment.  But broadcasters had two external organizations to
satisfy:  the FCC, who had strict limits on broadcast power modulation levels
and the Bell System, who interconnected most radio stations that made up
a "broadcast network".  There were frequent disagreements between Bell
System personnel and broadcast studios regarding the control and monitoring
of overload levels in these networks.  The lack of standards in the area
of "volume" was the biggest cause of these disagreements.

   As an example of the complexity of these networks, a typical one had
15,000 miles of circuits, hundreds of amplifiers and 50-100 radio stations
over a broad geographic area.  Every 15 minutes or so, some stations would
be added or dropped from the network feed, and the feed itself might move
between studios in various cities.  At all times, the volume of the material
arriving at each station, and the volume through each amplifier, had to be
monitored and adjusted for a consistent broadcast signal.  Bell personnel
were responsible for volume inside the broadcast lines and radio studio
or station engineers were responsible for volume at their ends of the
connections.

   So, the trio of Bell System/NBC/CBS and the Weston Instrument company
conducted tests over a year-long period to determine a single standard for
the volume indicator.  This included the scale color, type of pointer,
responsiveness, sensitivity, decay rate, etc.  Included was the requirement
for calibration:

   "The reading of the volume indicator shall be 0 vu when it is connected
    to a 600-ohm resistance in which is flowing one milliwatt of sine-wave
    power at 1000 cycles per second."

   The rationale for such a specification?  The desire to reinforce that
the meter was to measure POWER, not VOLTAGE, while acknowledging that all
indicators (to that point) were voltage-based and thus a common impedance
was needed to avoid the need to readjust meters to multiple values.  In
effect, it specified the impedance of the monitoring point(s), while
stating that it was power, not voltage, that was being measured.

   Oh, yes, there was another requirement for these new volume indicators.
The units on the scale were to be "vu" or "VU", not "dB", so that users would
know the instrument conformed to the new "volume indicator standard".
And the hope was that the term VU would be restricted in the future to only
those instruments conforming to the requirements.

---
   I have similar information on the history of the "decibel" which I hope
to find time to put together.

Al Varney


From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 18 Sep 1999 07:54:34 GMT

Al Varney <varney@lucent.com> wrote:
>
>   "The reading of the volume indicator shall be 0 vu when it is connected
>    to a 600-ohm resistance in which is flowing one milliwatt of sine-wave
>    power at 1000 cycles per second."
>
>   The rationale for such a specification?  The desire to reinforce that
>the meter was to measure POWER, not VOLTAGE, while acknowledging that all
>indicators (to that point) were voltage-based and thus a common impedance
>was needed to avoid the need to readjust meters to multiple values.  In
>effect, it specified the impedance of the monitoring point(s), while
>stating that it was power, not voltage, that was being measured.
>
>   Oh, yes, there was another requirement for these new volume indicators.
>The units on the scale were to be "vu" or "VU", not "dB", so that users would
>know the instrument conformed to the new "volume indicator standard".
>And the hope was that the term VU would be restricted in the future to only
>those instruments conforming to the requirements.

And indeed, to this day a VU meter is a device that works on 600 ohm
circuits, only.  Unlike a dB meter.  Which is why the broadcast industry
uses only 600 ohm circuits and VU meters, while the telephone industry
uses an array of different circuit impedances, and measures them with
dB meters.

(BTW, Al... thanks so much, once again, for doing research with
the wonderful resources you have access to.  The articles you
write are just absolute treasures for the rest of us! It takes a
great deal of time and no small amount of effort to verify all
the facts, and then the talent needed to recount and summarize
the information is in itself far beyond what most of us are able
to do.)

  Floyd


--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: varney@ihgp2.ih.lucent.com (Al Varney)
Newsgroups: alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 20 Sep 1999 21:24:58 GMT

In article <7rvgfq$1pqi@enews3.newsguy.com>,
Floyd Davidson <floyd@ptialaska.net> wrote:

>And indeed, to this day a VU meter is a device that works on 600 ohm
>circuits, only.  Unlike a dB meter.  Which is why the broadcast industry
>uses only 600 ohm circuits and VU meters, while the telephone industry
>uses an array of different circuit impedances, and measures them with
>dB meters.

   Adding to the confusion, the telephone industry (unlike broadcast)
also specifies dBw measurements at points in the network where 0dB does
not equal 1 milliwatt.  While the topic of Transmission Level Point (TLP)
is of less concern in all-digital networks, it still adds a level of
complexity to voice transmission telephone networks.  And another level
is added with the use of (multiple) circuit noise measurements (dBrn,
dBa, etc.).

>(BTW, Al... thanks so much, once again, for doing research with
>the wonderful resources you have access to.  The articles you
>write are just absolute treasures for the rest of us! It takes a
>great deal of time and no small amount of effort to verify all
>the facts, and then the talent needed to recount and summarize
>the information is in itself far beyond what most of us are able
>to do.)

   Blush ... Thanks, Floyd.  But your praise really belongs to many
unsung telephony pioneers whose written words are still preserved
at Bell Labs libraries.  For example, much of my VU information was
from the Bell System Technical Journal, Vol 19, 1940, p. 94-137,
"A New Standard Volume Indicator and Reference Level", by D. K. Gannett
(Bell Labs), H. A. Chinn (CBS) and R. M. Morris (NBC).  This was
also presented at a joint AIEE/IRE meeting in San Francisco (June 1939)
and at the IRE Convention in New York (September 1939).  Additional
material was from the June 1940 Bell Labs Record and several prior
documents relating to the Bell System's "program transmission networks"
and work on loudspeaker/public-address systems.

   As for the effort, I have to acknowledge a love of technical history
developed at an early age.  A corresponding skill at writing/presentation
is, unfortunately, lacking.  Reading these old journals is an education;
while looking up some articles on a particular topic, I almost always come
across something else of interest.  For example (BSTJ, 1935):


   Using a combination of open wire, undersea cable and HF radio circuits,
the first (known) 2-way telephone call around the world took place on
25 April, 1935.  The 4-wire circuit went from New York to San Francisco
via open wire (and some cable), then to Bandoeng, Java via 10 MHz radio,
to Amsterdam via 19 MHz radio, to London via land/sea cable and back
to New York via 12 MHz/18 MHz radio.  [I believe Bandung is the common
spelling these days.]

   A total of 980 tubes were used in amplifying the signals, a gain of
about 2000 dB in each direction.  Total one-way delay was about .25 second,
with the loaded cable sections contributing over half that delay (but only
15% of the total length).

   A number of persons (including the President of AT&T) conversed over
this circuit for about 30 minutes, even though the end points were only
a few rooms apart.  [No estimate of the "cost" of this call was available.]

   I don't have documentation on the HF radio used in 1935, but commercial
HF radiotelephone service overseas didn't begin until 1929.  Those units
used a 6-tube final RF section with water-cooled tubes and a 10KV plate
supply for a 15KW transmitter.  The San Francisco-Bandoeng route is about
8700 air miles.  Note that it was 1956 before a trans-oceanic undersea
cable was available, so for 25 years HF radio was the primary means of
trans-oceanic international calling.

Al Varney


From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: alt.dcom.telecom,comp.dcom.telecom.tech
Subject: Re: -dBm?
Date: 21 Sep 1999 14:19:07 GMT

In article <37E6D63B.854FB836@mail.rsccd.org>,
NO SPAM  <NOSPAM@mail.rsccd.org> wrote:
>Floyd Davidson wrote:
>[snip]
>
>> And indeed, to this day a VU meter is a device that works on 600 ohm
>> circuits, only.  Unlike a dB meter.  Which is why the broadcast industry
>> uses only 600 ohm circuits and VU meters, while the telephone industry
>> uses an array of different circuit impedances, and measures them with
>> dB meters.
>
>Remember that the usual audio line output is low impedance, so it's
>constant voltage.  This means that for most purposes, the voltage will
>be the same whether it's driving a 10k load or a 600 ohm load.
>
>And of course the power will only be correct for the correct
>impedance.  But then the line level is what is being displayed.

In fact, the description above is not valid.  The voltage will not
remain the same, and the power delivered will of course be
_drastically_ different for different load impedances.

Generally both dB and Vu meters are voltage measuring devices that
depend upon the impedance being correct to provide a calibrated
power reading.  That is commonly known and has been commented on
by several people in this thread.

The difference between measuring the voltage across a 10K ohm load
and across a 600 ohm load will show up as about a 6 dB change on a
_volt_ meter that is calibrated for power.  In essence, that means
the voltage will be nearly twice as high across the 10K ohm load.

There are two values which are "trigger" points for anyone who
does much in the way of measuring levels.  One is knowing that a
3 dB drop means the circuit is "double terminated".  (That happens
if you use a test set with a 600 ohm termination to measure a
circuit which already has a 600 ohm load tied to it.)  The second
is a 6 dB rise in level, which means the circuit is not terminated
at all.  (That happens when a circuit is measured without its
normal 600 ohm load, and the test set is configured for a
"bridging measurement", which is generally a 10K ohm impedance.)

One of the more embarrassing things that can be done is to make a
long trip to some remote location and adjust circuit levels, only
to later discover that they are exactly either 3 dB or 6 dB off!
(In my case that can be _really_ embarrassing, because some of my
"remote" locations cost upwards of $2000 for chartered aircraft to
visit!)
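
Those two rules of thumb fall straight out of the voltage dividers
involved.  Here is a sketch of mine, assuming an ideal 600 ohm
source (the exact numbers come out closer to 3.5 dB and 5.5 dB,
which is near enough to spot in the field):

    import math

    def level_error_db(load_ohms, source_ohms=600.0, nominal_load_ohms=600.0):
        """dB change in the reading, relative to a single correct 600 ohm
        termination, when the actual load is 'load_ohms'."""
        v_nominal = nominal_load_ohms / (source_ohms + nominal_load_ohms)
        v_actual = load_ohms / (source_ohms + load_ohms)
        return 20.0 * math.log10(v_actual / v_nominal)

    double_termination = 1.0 / (1.0 / 600 + 1.0 / 600)   # two 600 ohm loads in parallel
    print(level_error_db(double_termination))            # ~ -3.5 dB: the "3 dB drop"
    print(level_error_db(10000))                          # ~ +5.5 dB: the "6 dB rise" on a 10K bridge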

  Floyd


--
Floyd L. Davidson                                floyd@ptialaska.net
Ukpeagvik (Barrow, Alaska)                       floyd@barrow.com

