From: floyd@polarnet.com (Floyd Davidson)
Newsgroups: comp.dcom.modems
Subject: Re: Allowable phone line signal level?
Date: 1 Sep 1996 06:56:22 GMT

renneber@tmx.mhs.oz.au wrote:
>Bill Garfield wrote:
>> 
>> Typical "good" levels will vary from a peak of about -15 to down around
>> -24.   Much below that and the signal-to-noise ratio starts falling off
>> pretty quickly.
>> ...
>
>-15 to -24 dB sounds right.
>
>Interestingly enough, if the signal levels are too high you can have
>problems too.
>
>In part, this is because at high signal levels, the line
>transformers along the call path (eg in the modems, and perhaps in the CO and
>DLC equipment) generate significant distortion products, due to the
>transformers' non-linear characteristics.
>
>The higher the power level, the more these distortion products cause in-band
>interference. Unless you have an atrocious connection, you shouldn't gain
>anything by increasing the received signal level above -10 dBm.

Not raising the _transmit_ signal level above -10 dBm is good
advice, but the above comments on why are not correct.
Transformers in the telephone system are designed to handle not
only a signal, but also loop current, which is far higher in level
than a modem signal.  Transformers are rarely, if ever, the
source of significant signal distortion in the PSTN.

Hot levels, however, do commonly result in distortion.  From the
transmit end, let's consider a modem set to have an average output
power level of -10 dBm.  The average cable for a local loop has 3
dB loss, so the signal arrives at the telco switch at about -13
dBm.  That is by design, not accident.  The "Test Level Point" on
the transmit side, at the telco, is 0 dBm (that means a test tone
level at that point should be 0 dBm).  Data signals are set to
-13 dBm0 (dBm0 means dB relative to the test tone level)
because the _peak_ instantaneous power in the signal may be as
high as 13 dB above the average power.  The circuit has to pass
the peak power undistorted for the modem to function properly.

So, with an average send power of -10 dBm at the modem, and
a 3 dB loss on the local loop, the peak power will be 0 dBm at the
point where the telco digitizes the analog signal with an AD/DA
codec.  The codec does not encode any level higher than +5 dBm.
All levels above that level are coded as +5 dBm, and are thus hard
limited at that level.  (And of course hard limiting will cause
serious distortion of a modem signal.) While it appears that we
have a 5 dB range to work with, in fact we don't.  The average
cable on a local loop may have 3 dB loss, but if you live across
the street from the telco it might be less than half a dB.  Likewise
a telephone circuit is allowed to have frequency vs. amplitude
variations of as much as 3 dB above test tone level at frequencies
other than the 1000 Hz test tone.  Hence, for either of two
reasons, the dynamic range available for increased transmit level
can be reduced to only 2 dB, and if both are experienced it would be
possible for even the -10 dBm modem level to cause limiting in the
telephone system, though that is very unlikely.
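
To put numbers on that, here is a rough sketch in Python.  The 13
dB peak factor and the +5 dBm codec ceiling are the figures above;
the function and its name are just mine, for illustration:

    # Headroom between modem signal peaks at the codec and the codec
    # clipping point.  Negative headroom means peak limiting.
    PEAK_FACTOR_DB = 13.0     # peak power above average, per above
    CODEC_CEILING_DBM = 5.0   # codec hard-limits above this level

    def tx_headroom_db(tx_avg_dbm, loop_loss_db, freq_ripple_db=0.0):
        peak_at_codec = (tx_avg_dbm - loop_loss_db
                         + PEAK_FACTOR_DB + freq_ripple_db)
        return CODEC_CEILING_DBM - peak_at_codec

    print(tx_headroom_db(-10.0, 3.0))       # -> 5.0: average loop,
                                            #    5 dB to spare
    print(tx_headroom_db(-10.0, 0.5, 3.0))  # -> -0.5: short loop plus
                                            #    ripple, limiting even
                                            #    at -10 dBm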

Suffice to say that transmit levels should not be adjusted higher
without understanding the possible distortion and reduced
connection speeds that might result.

On the receive side, a level of -15 dBm is probably actually a
little higher than it should be.  The 0 dBm test tone sent at the
distant codec is intended to arrive at a telephone subscriber with
an average power level which is never higher than -6 dBm.  Since
the peak-to-average ratio is 13 dB, the measured level for a modem
signal should be about -19 dBm at the greatest.  However, once
again the variation with frequency can be as high as +3 dB, so it
would be possible to have a signal of -16 dBm on a line that is
within specifications and still not experience any signal
distortion due to peak limiting in the sending codec.  At a
measured level of -15 dBm the signal is probably, at best, within
1 dB of limiting back at the transmit end.

That is the theory of it from a circuit engineering standpoint,
but it also should be noted that practical experience with trunks
and foreign exchange lines indicates that levels which are 4 dB
hot will cause v.32 and v.34 modems to have degraded performance.
It should also be noted that modems are generally designed to the
very same specifications, and experience setting pad levels on
foreign exchange circuits (where padding is between the codec and
the modem) indicates that the receive signals sent to a modem will
cause serious degradation if the level is more than about 3-4 dB
too high.

Floyd


-- 
Floyd L. Davidson          Salcha, Alaska         floyd@tanana.polarnet.com



From: floyd@polarnet.com (Floyd Davidson)
Newsgroups: comp.dcom.modems
Subject: Re: What about Tx level?
Date: 6 Sep 1996 22:13:39 GMT

[emailed and posted]

nkal@hol.gr wrote:
>Does anyone know what the Tx level should be?
>My fastblazer(8840) is configurable for -9 --> -15 dBm , and my  Rx signal
>is usually -18 dBm..
>I've read that The Rx level should not exceed -9 dBm (i.e -8..-7..) cause this
>indicates that the the Tx signal from the remote was somewhere over-amplified,
>since the specs are that no modem should be transmitting > -9dBm.
>Also someone here said something about maximum Rx signal best being around -18
>not "louder"..So if I set to -9 am I doing wrong? Can someone clarify what 
>happens @ -9 and @ -15 ? 

The "correct" level is -13 dBm at the line card (whether the line
card is in a switch, a remote unit, or a cable carrier system)
where the telco converts your analog signal to a digital signal.
Hence if you know that your loop is a rather long one, increasing
the transmit level a bit might not hurt.  But if your loop is very
short the higher level may result in signal distortion due to peak
limiting in the telco's facilities.

If you have the ability to measure the loss on your loop it is
then fairly easy to determine an exact setting for the modem
transmit level.  But lacking that ability, the commonly used
-10 dBm level (which assumes about 3 dB loss) is probably a
very good compromise.  It isn't critical at all below the point
where limiting occurs.  A signal that is 2 dB below limiting is
not really any better than one which is 6 dB below limiting in
most cases.  But just 1 or 2 dB too high will cause significant
degradation of the signal and a slower connection.  Hence a
compromise that is on the low side for most connections, and still
workable for connections with high levels, is a good working
solution.
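
If you can measure the loop loss, the arithmetic is trivial.  A
rough Python sketch (the -13 dBm line card target is from above;
the -15 to -9 dBm clamp is the FastBlazer's adjustment range
quoted in the question):

    # Pick a transmit level that lands on the -13 dBm line card target.
    LINE_CARD_TARGET_DBM = -13.0

    def tx_setting_dbm(loop_loss_db, lo=-15.0, hi=-9.0):
        # Ideal level, clamped to what the modem can be set to.
        ideal = LINE_CARD_TARGET_DBM + loop_loss_db
        return max(lo, min(hi, ideal))

    print(tx_setting_dbm(3.0))  # -> -10.0, the usual compromise
    print(tx_setting_dbm(0.5))  # -> -12.5, short loop: back it off
    print(tx_setting_dbm(7.0))  # -> -9.0, long loop: max the modem allows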

The design target for receive level on a telephone loop is a test
tone level of -6 dBm maximum.  The data level should be 13 dB
below that, so the receive level probably should never exceed -19
dBm for a modem data signal.

The -18 dBm figure you are reading indicates two things.  First,
the modem may not measure the level accurately.  It might actually
be a few dB different from what it says.  That being the case, the
measurement it gives is good for comparing to other measurements it
gives, but little else.  For example if one call reads -18 dBm and
another reads -22 dBm it is probably very safe to assume the
second is close to 4 dB lower than the first.  The absolute levels
are questionable though.

However, I personally would draw one more conclusion from the -18
dBm reading, and would assume that you have a relatively low loss
loop.  Without any other specific information I'd be more likely
to reduce the modem transmit level than to increase it.  But I
wouldn't do either unless the calls coming in at -18 dBm receive
level are consistently connecting at lower speeds than those with
lower levels.

Floyd
-- 
Floyd L. Davidson          Salcha, Alaska         floyd@tanana.polarnet.com

From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: comp.dcom.modems
Subject: Re: Peak vs. Average Modem Power Level?
Date: 1 Dec 1999 11:51:56 GMT

In article <821pj8$bqa$1@ssauraab-i-1.production.compuserve.com>,
Alan Tabourian  <71203.1307@CompuServe.COM> wrote:
>After the initial line probing that is supposed to take place at a
>-6dBm level, modems generally connect at a -10dBm nominal level.
>
>Could someone explain whether the -10dBm level refers to the peak
>level that would be observed or whether this applies the average
>level.

That is the average level.  Actual peaks will approach 12-13 dB
higher than the average (the closer to a gaussian distribution
the more difference there will be between the peak and the average).

The design target assumes 2-3 dB cable loss, a -10 to -11 dBm signal,
and therefore a level at the telco of about 0 dBm for peaks (and
about -13 dBm for average).  That provides only a 4 dB margin
for high levels, because digital carrier cannot quantize a
level greater than +4 dBm.  If for any reason at any point the
levels become 4 dB higher than expected the result will be peak
limiting, which causes severe distortion and quickly degrades
modem throughput.
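
The Gaussian figure is easy to check numerically.  A rough sketch
using Python and numpy (the sample count and the exact margins are
arbitrary choices):

    # How often does Gaussian noise exceed its average power by N dB?
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000_000)   # unit average power (0 dB)

    for margin_db in (10.0, 12.0, 13.0):
        thresh = 10.0 ** (margin_db / 20.0)      # amplitude threshold
        frac = float(np.mean(np.abs(x) > thresh))
        print(f"{margin_db} dB over average: {frac:.1e} of samples")

    # At 13 dB the exceedance is down to a few parts per million, so
    # the peaks "approach" 12-13 dB but essentially never pass it.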

  Floyd

--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: comp.dcom.modems
Subject: Re: Peak vs. Average Modem Power Level?
Date: 2 Dec 1999 00:52:29 GMT

<tabourian@compuserve.com> wrote:
>Hi Floyd,
>
>Tks for your response.  Let me expand on the question, just to
>make sure that I have not gone off on a tangent.

This is interesting!


>I am trying to run a V.34 modem over a narrow band FM link,
>mind you not a two-way radio style TX/Rx unit, but a custom
>designed FM link that is meant to provide adequate performance
>(bandwidth, group delay, SNR, etc) to support V.34 operation.

One question though, will this be used to connect to other
circuit facilities, such as PCM digital carrier or digital
switching systems?  Or will the two modems both be directly
tied to the FM radio channel interface points?

Also, I'm assuming you would really like to see 33.6Kbps...
but how low can you go and still accept the results?  My
guess is that 33.6K might be difficult, 28.8K is possible,
and 24 or 26.4K seems very reasonable.  I can't see how
anything would not work at 21.6K!

>The issue really has to do with SNR.  Theoretically, and I have
>read this over and over again, an analog phone line will
>deliver no better that 35-38dB SNR due to mu/A law
>quantitisation.  At first I assumed that if I deliver this
>level of SNR, then I am home free.

Assuming you are going to tandem your FM channel with other
facilities, which most likely will include digital carrier or
switching systems, you need at least 6 dB better than that just
to not contribute significantly to the SNR.  43 dB is the number
that comes to mind (but that should also be a _minimum_, not the
design target.  60 dB might be a good design target...  :-)

If your channel provides a 37 dB SNR, and is in tandem with a
PCM channel then the overall SNR is going to be no better than
34 dB.  The signal level will remain the same, but the noise
from those two channels will add together, and since they have
identical noise power to contribute, the increase in noise will
be 3 dB.  Ideally you can design an FM radio to provide much
better SNR than the PCM channel, and it will contribute no
significant noise (less than 1 dB change in SNR).  That is of
course ignoring the issue that you bring up farther down in
regard to idle channel noise.
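
The noise addition works like this, sketched in Python (equal
signal levels are assumed throughout, so only the noise powers
add):

    # Combine per-facility SNRs for channels in tandem: the signal
    # level is the same end to end, so the noise powers simply add.
    import math

    def tandem_snr_db(*snrs_db):
        noise = sum(10.0 ** (-s / 10.0) for s in snrs_db)
        return -10.0 * math.log10(noise)

    print(tandem_snr_db(37, 37))  # -> ~34.0: two equal channels, -3 dB
    print(tandem_snr_db(60, 37))  # -> ~37.0: a good FM link adds almost
                                  #    nothing to the PCM channel noise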

>The first issue that came up, is that modems connect at a
>-10dBm nominal level.  In a wireline environment, the SNR level
>does not change appreciably if the level drops from 0dBm to
>-10dBm.  Of course in an analog FM link, a -10dBm level drop
>means a drop of -10dB in SNR.

Because of the A/mu-Law quantizing, the SNR of a PCM channel
will maintain relatively the same SNR for different signal
levels within a reasonable range, but will exhibit very low
noise on an idle channel.  Other facilities used for typical
telephone circuits do not have that characteristic.  Both PCM
and FM radio channels have a distinct upper limit to signal
level though.  With PCM it is a very hard limit, and clipping
takes place with perhaps less than 1 dB compression prior to the
knee, and then absolute limiting.  With an FM radio channel
there is a knee in the intermod vs. modulation index plot, and
higher signal levels cause dramatically increasing amounts of
intermod past that point.

Hence peak levels are very significant for FM radio channels,
just as they are for PCM channels.  Usually in the telephone
industry we are very concerned with one difference though, which
is that on wideband multichannel systems the noise problems
with high levels in PCM multiplexers only affect the channel
with the high level, while with a multichannel FM radio a high
level in any channel can affect all channels.

Of course, in your situation there is only one channel, so the
difference is moot in that respect.

>At that point I beefed up the specs so that I called for 48 to
>50dB SNR at 0dBm for the NBFM link.
>
>Next I learned that both V.34 and specially V.34+ use a
>modified (packed) constellation and rely on the very low idle
>channel noise present over the telephone network on the order
>of -60 or even -70dB to make it all work.  Translating this
>into my NBFM world, means that we need an SNR of 60 or 70dB
>which would be next to impossible to achieve.  I got this info
>from a couple of codec app. engineers at Motorola and Lucent.

Is 60 dB impossible to achieve?

I hadn't realized the significance of idle channel noise in
relation to v.34 before, but maybe that explains why we didn't
get all that many high speed v.34 connections over analog radio
facilities (all of which are wideband FM radios).  But I do know
that I've seen 28.8 connections sometimes.  (All of the single
channel per carrier systems that I've experienced, however, are
lucky to get 21.6K connections.  But these were universally
designed to minimize bandwidth on satellite systems, so that is
probably not indicative.)

Consider that if there is always a tandem PCM channel involved,
then choosing levels such that the idle channel noise power from
the FM radio channel is low enough to fall below the range
that a PCM encoder can quantize will result in an overall
end-to-end reduction in idle channel noise compared to the
FM channel alone.  Clearly the dynamic range is going to be
limited, but you also might consider leased line modems that
automatically adjust levels too, if these are not going to
be dialup lines.
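
To illustrate the quantizing floor idea, here is a toy model in
Python.  The continuous mu-law companding curve with a 256-step
mid-tread quantizer stands in for a real G.711 codec, so the exact
numbers are only approximate:

    # Noise below the PCM encoder's bottom quantizing step vanishes.
    import math

    MU = 255.0

    def compress(x):   # continuous mu-law companding curve
        return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

    def expand(y):
        return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

    def codec(x, levels=256):
        step = 2.0 / levels
        y = compress(max(-1.0, min(1.0, x)))
        return expand(step * round(y / step))   # mid-tread: 0 maps to 0

    print(codec(5e-5))   # -> 0.0: about -86 dB of overload encodes
                         #    as exact silence
    print(codec(5e-4))   # -> ~5.4e-4: about -66 dB, above the floor,
                         #    so it survives

An FM channel whose idle noise sits below that bottom step
contributes nothing once the PCM hop re-encodes the signal.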

>I then thought of one idea that might buy me an extra 10dB or
>so, which is to play games so that I set up the system such
>that -10dBm corresponds to full deviation (equivalent to 0dBm)
>on the NBFM link, where we can tolerate the +4dB over deviation
>during the initial training period.
>
>From the looks of things though, based on the info you
>mentioned this scheme is not going to work.
>
>Can you recommend anything and/or correct any misunderstanding
>that I might have caught myself up in.

I think you have it pretty well understood.  I had to read that
three times to catch some of the minor nuances in things you
said (I wasn't thinking of "wireline environment" to mean a PCM
channel, but rather a cable pair itself.  That led to a couple of
big paragraphs that were deleted when I realized that I hadn't
read it the way you wrote it! :-)

It does appear to me that a test lash up would be very easy to
accomplish, and would quickly indicate what kind of performance
could be expected.  I'd like to see the results of v.34
connections for the FM channel alone, and for that and a PCM
channel in tandem, and in both cases with the deviation on the
FM channel adjusted in 5 dB steps from far too low to far too
high.

  Floyd




--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)



From: tabourian@compuserve.com
Newsgroups: comp.dcom.modems
Subject: Re: Peak vs. Average Modem Power Level?
Date: Sun, 05 Dec 1999 22:14:36 GMT

Hi Floyd,

At 03:52 PM 12/1/99 -0900, you wrote:
>>I am trying to run a V.34 modem over a narrow band FM link,
>>mind you not a two-way radio style TX/Rx unit, but a custom
>>designed FM link that is meant to provide adequate performance
>>(bandwidth, group delay, SNR, etc) to support V.34 operation.
>
>One question though, will this be used to connect to other
>circuit facilities, such as PCM digital carrier or digital
>switching systems?  Or will the two modems both be directly
>tied to the FM radio channel interface points?

OK, it looks like I am going to need to go into more detail; I was
trying to simplify things so as to minimise the potential for
misunderstandings.

The application is a wireless local loop system.

We have a terminal at the subscriber end.  We have a DSP with two codecs
that operate in LINEAR mode (very important) and we use digital filters.

At the base station we use mu/A law encoding for voice again with a DSP
and digital filters and the base station is interconnected to the PSTN
via E1/T1.

>Also, I'm assuming you would really like to see 33.6Kbps...
>but how low can you go and still accept the results?  My
>guess is that 33.6K might be difficult, 28.8K is possible,
>and 24 or 26.4K seems very reasonable.  I can't see how
>anything would not work at 21.6K!

Of course 33.6kbps would be ideal, but I guess I will settle for
28.8kbps if I have to.

>>The issue really has to do with SNR.  Theoretically, and I have
>>read this over and over again, an analog phone line will
>>deliver no better that 35-38dB SNR due to mu/A law
>>quantitisation.  At first I assumed that if I deliver this
>>level of SNR, then I am home free.
>
>Assuming you are going to tandem your FM channel with other
>facilities, which most likely will include digital carrier or
>switching systems, you need at least 6 dB better than that just
>to not contribute significantly to the SNR.  43 dB is the number
>that comes to mind (but that should also be a _minimum_, not the
>design target.  60 dB might be a good design target...  :-)

I don't understand why you are saying that a minimum of 6dB better SNR
is required.  Why would the modem need to see 6dB better performance
(and better than which performance are you referring to?)

>If your channel provides a 37 dB SNR, and is in tandem with a
>PCM channel then the overall SNR is going to be no better than
>34 dB.  The signal level will remain the same, but the noise
>from those two channels will add together, and since they have
>identical noise power to contribute, the increase in noise will
>be 3 dB.  Ideally you can design an FM radio to provide much
>better SNR than the PCM channel, and it will contribute no
>significant noise (less than 1 dB change in SNR).  That is of
>course ignoring the issue that you bring up farther down in
>regard to idle channel noise.

I am confused here.  The way I looked at it is that the entire path
between a terminal and a base station (two codec conversions and digital
filters in the terminal, and the codec conversion and filters in the
base station) should be equivalent to a single mu/A law conversion and
deliver the SNR of a single mu/A law conversion, hence the reason I
first called for 35-38dB of SNR.  Note that my understanding is that
V.34 was designed to be able to handle a maximum of  three mu/A law
conversion and when I investigated this further was told that this holds
for 28.8kbps but that for 33.6kbps only two conversions are allowed.

>>The first issue that came up, is that modems connect at a
>>-10dBm nominal level.  In a wireline environment, the SNR level
>>does not change appreciably if the level drops from 0dBm to
>>-10dBm.  Of course in an analog FM link, a -10dBm level drop
>>means a drop of -10dB in SNR.
>
>Because of the A/mu-Law quantizing, the SNR of a PCM channel
>will maintain relatively the same SNR for different signal
>levels within a reasonable range, but will exhibit very low
>noise on an idle channel.  Other facilities used for typical
>telephone circuits do not have that characteristic.  Both PCM
>and FM radio channels have a distinct upper limit to signal
>level though.  With PCM it is a very hard limit, and clipping
>takes place with perhaps less than 1 dB compression prior to the
>knee, and then absolute limiting.  With an FM radio channel
>there is a knee in the intermod vs. modulation index plot, and
>higher signal levels cause dramatically increasing amounts of
>intermod past that point.
>
>Hence peak levels are very significant for FM radio channels,
>just as they are for PCM channels.  Usually in the telephone
>industry we are very concerned with one difference though, which
>is that on wideband multichannel systems the noise problems
>with high levels in PCM multiplexers only affect the channel
>with the high level, while with a multichannel FM radio a high
>level in any channel can affect all channels.
>
>Of course, in your situation there is only one channel, so the
>difference is moot in that respect.

We do have multiple channels although in most cases, we will not have
adjacent channels present so that should not be a problem.  Clearly
though, one thing we need to check is the peak deviation that we are
experiencing when a modem is connected at -10dBm nominal level.  If we
are over deviating, we will end up quickly taking a hit in the FM link
due to distortion.

>>At that point I beefed up the specs so that I called for 48 to
>>50dB SNR at 0dBm for the NBFM link.
>>
>>Next I learned that both V.34 and specially V.34+ use a
>>modified (packed) constellation and rely on the very low idle
>>channel noise present over the telephone network on the order
>>of -60 or even -70dB to make it all work.  Translating this
>>into my NBFM world, means that we need an SNR of 60 or 70dB
>>which would be next to impossible to achieve.  I got this info
>>from a couple of codec app. engineers at Motorola and Lucent.
>
>Is 60 dB impossible to achieve?

For a narrowband FM with +/-5 kHz deviation, it is hard to achieve even
50dB SNR.  The only way to maybe be able to achieve this sort of
performance is with a very expensive design using digital I/F
techniques.  Note that it is not just an issue of SNR with respect to
noise that we have to worry about, but also distortion in the
demodulator (we are using low group delay 2nd I/F filters), hence the
reason I suggested that in an ideal system, the demodulator would need
to be implemented in the digital domain with digital I/F filters.

>I hadn't realized the significance of idle channel noise in
>relation to v.34 before, but maybe that explains why we didn't
>get all that many high speed v.34 connections over analog radio
>facilities (all of which are wideband FM radios).  But I do know
>that I've seen 28.8 connections sometimes.  (All of the single
>channel per carrier systems that I've experienced, however, are
>lucky to get 21.6K connections.  But these were universally
>designed to minimize bandwidth on satellite systems, so that is
>probably not indicative.)

Probably not.  Right now we are achieving 21.6kbps very reliably.  We
have identified potential problem areas in the hardware and in the
digital filter implementation and I am now trying to gather all of the
relevant information so that we can redesign the hardware and filters.

>Consider that if there is always a tandem PCM channel involved,
>then choosing levels such that the idle channel noise power from
>the FM radio channel is low enough to fall below the range
>that a PCM encoder can quantize will result in an overall
>end-to-end reduction in idle channel noise compared to the
>FM channel alone.  Clearly the dynamic range is going to be
>limited, but you also might consider leased line modems that
>automatically adjust levels too, if these are not going to
>be dialup lines.

This is a dialup line with regular modems.  I didn't understand, though,
your suggestion of choosing levels such that the idle channel noise
power from the FM radio channel is low enough not to be within range
that a PCM encoder can quantize.  Can you expand on this?

>>I then thought of one idea that might buy me an extra 10dB or
>>so, which is to play games so that I set up the system such
>>that -10dBm corresponds to full deviation (equivalent to 0dBm)
>>on the NBFM link, where we can tolerate the +4dB over deviation
>>during the initial training period.
>>
>>From the looks of things though, based on the info you
>>mentioned this scheme is not going to work.
>>
>>Can you recommend anything and/or correct any misunderstanding
>>that I might have caught myself up in.
>
>I think you have it pretty well understood.  I had to read that
>three times to catch some of the minor nuances in things you
>said (I wasn't thinking of "wireline environment" to mean a PCM
>channel, but rather a cable pair itself.  That led to a couple of
>big paragraphs that were deleted when I realized that I hadn't
>read it the way you wrote it! :-)

Sorry about that, I should have been more clear in my description.

>It does appear to me that a test lash up would be very easy to
>accomplish, and would quickly indicate what kind of performance
>could be expected.  I'd like to see the results of v.34
>connections for the FM channel alone, and for that and a PCM
>channel in tandem, and in both cases with the deviation on the
>FM channel adjusted in 5 dB steps from far too low to far too
>high.

It is actually not that easy.  We have so many variables that affect
things and we don't have the right type of setup to do the testing where we
can isolate the effect of every potential thing that can affect the
performance.  I am working on it though!

Tks for your feedback.

Alan+


From: floyd@ptialaska.net (Floyd Davidson)
Newsgroups: comp.dcom.telecom.tech
Subject: Re: Subtle Problem
Date: 10 Dec 1999 13:14:24 GMT

Phil Borod <miamipab@shadow.net> wrote:
>
>You can see from some of my notes that I have formed some
>opinions, but since the problem hasn't yet been resolved, any
>ideas are appreciated..

  [snipped for effect]

>ISP                              #1      #1      #2       #3       #3

>Receive  Signal Power  (-dBm)  29       12      5        13       12
>Transmit Signal Power  (-dBm)  10       14      14       14       14
>Round Trip Delay   [2] (msec)  8        7       7        6        5

>Transmit Frame Count           384      4622    1023     2223     4185
>Transmit Frame Error Count     2        2       1        0        3
>Receive  Frame Count           796      8570    2533     5039     7350
>Receive  Frame Error Count     62       97      4        67       45
>Retrain by Local  Modem        8        1       0        1        0
>Retrain by Remote Modem        0        1       0        0        0
>Robbed-Bit Signaling           NA       01      04       04       08
>Digital Loss           (dB)    NA       3       6        3        3

There are several interesting things in the figures there.
Unfortunately, with so little information it is impossible to
really be sure about anything.  If each of these were a
composite of 10-20 calls to that particular ISP, then it would
be more meaningful.

Note that there are two of these calls with some unusual levels
indicated.  The first column is a call with clearly very low
levels in both directions.  The transmit output level has been
cranked up to max to compensate for the distant end reporting
low receive levels, and the receive level on this end is well
below normal.  It is not surprising that a less than ideal
connection resulted.  However, the other figures indicate that
we can't really tell much by the values of these numbers!

The best connection seems to be column three, but it has an
absolutely impossible receive level being reported.  Rest
assured that if the receive level really was -5 dBm, you would
not get any connection.  The typical range should be (these are
real values, not modem reported values) from -19 to -22 dBm in
an ideal world.  A signal that is more than 4 dB too hot will
result in peak limiting in transit through any PCM digital
carrier or switching system.  A signal that is 14 dB too high
would be just absolutely undecipherable!

Which means, since obviously you did get a _good_ connection,
that we cannot take these numbers very seriously as far as the
absolute values go.  It happens that we have no reason to think
we can compare between a v.34 connection and a v.90 connection
either.  So we can't compare column 1 to the others at all.

The fact that all v.90 connections seem to have high reported
receive levels and have apparently adjusted the send level to as
low as it can be, suggests that all levels probably are at least
1-2 dB high.  The only real conclusions we can draw are that
column 1 probably does have low signal levels, and that they are
low in both directions.  The others seem to have high levels in
both directions, and the connection in column 3 was higher than
the others by some measurable amount.

Calls to column 1 are routed on different trunk facilities than
calls to the other locations.  Or at least that is what happened
when this data was collected.

If you have intermittently different results to the first
location, it is likely that there are multiple trunk groups
between telephone offices, and some are on different facilities
than others.  Perhaps some are on an old system and some are on
newer systems.  Often the older system is not nearly as good...
Usually those will be overflow trunks for when the others are
busy, so you may see them in use only during peak hours.
Likewise, depending on the topology of the local switching
network, it is possible that there are alternate routes for any
or all of these locations that pass through the normal routes to
the others!  It may be that when no trunks are available to the
office where ISP 3 is located, that those calls are then routed
towards the office where ISP 1 is located, and relayed from
there to the final destination.  That would cause rare problems
to ISP 3 and common problems to ISP 1 that look identical other
than in the frequency.  The overflow, again, would only happen
during peak hours and you will not catch an example when testing
at midnight.

The gist of it is, we can guess all we like, but we can't tell
what is happening.  The telco, which has absolutely no incentive
to do this, could figure it out in a relatively short time and
know _exactly_ what is happening.  But it would be a waste of
their time, and probably yours too because nothing is likely to
be done to change it.

  Floyd

--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)


From: Floyd Davidson <floyd@ptialaska.net>
Newsgroups: comp.dcom.modems
Subject: Re: What's easiest way to test phone line?
Date: 03 Mar 2000 10:50:43 -0900

"LGC" <lcrawford@trxinc.com> wrote:
>Here's an ati11 after about a half hour of flipping pages:
>ati11
>
>    Description                         Status
>    ---------------                     ------------
>    Last Connection                     V.90
>    Initial Transmit Carrier Rate       26400
>    Initial Receive  Carrier Rate       50666
>    Final   Transmit Carrier Rate       26400
>    Final   Receive  Carrier Rate       50666
>    Protocol Negotiation Result         LAPM
>    Data Compression Result             V42bis
>    Estimated Noise Level               153
>    Receive  Signal Power Level  (-dBm) 15
>    Transmit Signal Power Level  (-dBm) 14

The levels indicated here are consistent with a very short
cable.  -14 dBm transmit is probably as low as it can go.
The -15 dBm receive level is a tad higher than what is probably
optimum.  The Test Level Point for a subscriber location
ideally is -6 dBm, and data should be 13 dB below that, or
-19 dBm.  If we are to believe your modem (risky assumption),
you could reduce the level by 4 dB and be better off.  Maybe.

Since you are interested in wasting time for a trivially better
connection... :-) why not try putting a pad in series with the
line to drop the level a bit.  Just be careful to use resistors
with a high enough power rating that they don't burn up right away
loop current.  Any padding should have a 600 ohm impedance too,
and be balanced:


         +-----+        +-----+
     o---| Rs  |----o---| Rs  |----o
         +-----+    |   +-----+
                  +---+
                  |   |
                  |Rp |
                  |   |
                  +---+
                    |
                  -----   2 ufd non-polarized  250 VDC
                  -----       plastic capacitor
         +-----+    |   +-----+
     o---| Rs  |----o---| Rs  |----o
         +-----+        +-----+


For series resistances Rs and a parallel resistance Rp, you can plug
either value into this formula and obtain the other value:

                     2RsRp + 600Rp
     600 =  2Rs + -------------------
                     2Rs + Rp + 600

That will maintain a 600 ohm impedance in both directions.
Values up to about 100 ohms might be useful for Rs, as that works
out to very nearly a 6 dB loss (with Rp at 800 ohms).
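
Solved for Rp, with the loss worked out from the standard
image-attenuation formula for a symmetric T/H section, a rough
Python sketch:

    # Given the series arms Rs, find the shunt Rp that holds a 600 ohm
    # image impedance, and the resulting pad loss.
    import math

    Z0 = 600.0

    def h_pad(rs):
        rp = (Z0**2 - (2.0 * rs)**2) / (4.0 * rs)  # formula above, solved
        loss_db = 8.686 * math.acosh(1.0 + 2.0 * rs / rp)
        return rp, loss_db

    for rs in (50, 100, 150, 200):
        rp, loss = h_pad(rs)
        print(f"Rs = {rs:3d}  Rp = {rp:6.0f}  loss = {loss:4.1f} dB")
    # Rs = 100, Rp = 800 is very nearly the 6 dB pad mentioned above.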

>Other than the initial & final rates, I don't have a good feel
>for what the other numbers are, although it appears noise level
>& error count is low.  Maybe my house wiring is pretty good
>after all.  Could be as Tech_Head says, the impedance could be
>in the modem itself.  What do you think?

The impedance of both the modem and the line card in the switch
is a 600 ohm resistor with a 2 ufd series capacitor.  That is
definitely not going to be a problem.  The only variation would
be that the shunt capacitance each end is balanced for will be a
lot more than actually exists in a few hundred feet of
cable.  You might buy a set of good grade plastic caps,
and try values of .002, .005, .01, and .02 ufd across the line
in various combinations.
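
For what it is worth, that termination is easy to examine
numerically.  A rough Python sketch of its magnitude across the
voice band (the three spot frequencies are arbitrary):

    # Magnitude of the 600 ohm + 2 uF series termination across the
    # voice band: the capacitor only matters at the bottom edge.
    import math

    def z_mag_ohms(f_hz, r=600.0, c=2e-6):
        xc = 1.0 / (2.0 * math.pi * f_hz * c)   # capacitive reactance
        return math.hypot(r, xc)

    for f in (300, 1000, 3400):
        print(f"{f:5d} Hz: |Z| = {z_mag_ohms(f):3.0f} ohms")
    # -> about 656, 605, and 600 ohms: essentially 600 in-band.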

Note that a 3 dB pad will swamp any minor variations in the
resistance component of the line impedance very nicely.  You
might want to try the capacitors on both sides of the pad (and
if you see any change on either side, buy another set of caps
and try both sides at once).

However...  I would suspect the impedance simply is not even
significant (the modem has a *very* fancy echo canceller built
into it, which means the impedance match just isn't all that
important).  The levels might do something, and using CAT5
cable might do something.

--
Floyd L. Davidson                          floyd@barrow.com
Ukpeagvik (Barrow, Alaska)




From: floyd@tanana.polarnet.com (Floyd Davidson)
Newsgroups: comp.dcom.modems
Subject: Re: When will FCC repeal 53K limitation?
Date: 16 Sep 1998 06:23:14 GMT

Alan Fowler <amfowler@melbpc.org.au> wrote:
>	To the best of my knowledge, most telephone Regulatory
>Bodies around the world require the output power of a modem to
>be about 3 dB less than that of speech.  A modem puts out a
>reasonably constant power level all the time it is connected.

Actually the _peak_ levels experienced for both voice and data
are exactly the same.  A testtone level is the maximum level,
and voice levels are set so that voice peaks never exceed
testtone level.  The _average_ power in speech signals is
considerably less, and as read on anything other than a peak
reading meter it will appear to be peaking at least 10 dB below
testtone levels.

A modem is set to 13 dB below testtone level.  The reason is
that it can be shown that for Gaussian distributed white
noise (which all modems aspire to, but never quite reach) the
peak levels experienced will be 12.5 dB above the average power
of the entire signal.

Hence testtone, voice, and data levels are all set such that
_peaks_ are the same.  However, only a testtone also averages
the same as the peak.  Note that on an averaging meter the
voice peaks will be read at about 3 dB higher in level than
the modem data tones, but in fact both are hitting peaks that
are at a much higher, and equal, level.

Now, despite what I just said...  loud talkers can and do
often exceed testtone level.  There is a little bit of fudge
factor built into every system, and the greatest amount was
available from a wideband analog system, while the least is
available from  PCM digital carrier (with the exception that
a great deal of early analog satellite carrier was even
more sensitive, but little of that remains in service).  An
old wideband (1200 channel or greater) analog system might
tolerate several signals that were averaging 10 dB hot, with
no ill effect to anything.  A narrow band (300 channel analog)
might handle 1 such hot channel.  A digital carrier system
will clip at approximately 3.5 dB and go into hard limiting
before 4 dB.  No other channels are affected, as would be
the case with an analog system, but that one channel is
severely impaired.  (Some old analog satellite carrier would
hard limit at 0.5 dB over testtone!)

The significance of the slop factors that are built in is
that a short loop has about 2 dB less loss than the modem
is designed for, and if the distant end also has a short loop,
the two modems will be right at the margin and any portion
of the circuit which is even 1 dB less loss than it should
be will cause severely distorted signals, and hence lower
data throughput for v.34 modems.

  Floyd

--
Floyd L. Davidson                                floyd@ptialaska.net
Ukpeagvik (Barrow, Alaska)                       floyd@barrow.com

Newsgroups: comp.dcom.modems
From: floyd@icefog.polarnet.com (Floyd L. Davidson)
Subject: Re: noise?
Date: 6 Jan 1998 17:41:52 -0900

John Navas <Usenet@NavasGrp.Dublin.CA.US> wrote:
>"Brian Thomforde" <brian@nospam-truckdriver.com> wrote:
>
>> I brought my laptop while visiting Grama and grampa over the holidays, I
>>have used this setup at many motels and even thru cell phone
>>connections.....here's the question
>>I get dumped from my ISP all the time(only here) sometimes right away other
>>times after a short time (5 minutes) I thought it was line noise but once in
>>a frustrated moment i picket up the phone to listen as the modem connected
>>and didn't get dumped so i left it off the hook it works fine now (off the
>>hook) WHY???
>
>See my FAQ.  Sounds like premises equipment problems.

It sure sounds like hot receive levels to me.  The primary
effect of having a phone off the hook (other than adding a lot
of room noise to the line noise) is to double terminate the
line.  That will cause the received signal to be split between the
phone and the modem, and half of the signal power will go to the
phone and half to the modem, hence reducing the signal level by
3 dB (half power).
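
The 3 dB figure is easy to sanity check in a couple of lines of
Python (the fussier second figure assumes a 600 ohm source feeding
two bridged 600 ohm loads):

    # Off-hook phone bridged across the line: received power splits
    # between two equal loads.
    import math
    print(10.0 * math.log10(2.0))            # -> 3.0 dB, by the simple
                                             #    half-power argument
    # With the 600 ohm source feeding 600||600 = 300 ohms, the modem
    # voltage drops from Vs/2 to Vs/3, or:
    print(20.0 * math.log10((1/2) / (1/3)))  # -> ~3.5 dB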

It happens that v.32 and v.34 modems are not very tolerant of
receive levels that are too high, and as little as 2 dB over the
threshold would cause exactly the trouble described above.

My guess is this happens on a line that is very close to the
phone company's CO, or very close to a remote unit or SLIC type
of unit?  It is possible that asking the telco to verify the
line configuration would get some help, but it also might take a
lot of effort to get anyone to look in the right places.

If the modem is one that will report signal quality, the highest
reasonable receive level would be about -16 dBm, though that may
also cause problems.  However, many modems are not exactly
accurate when measuring levels so some modems may work fine and
report levels that high while others die at -18 dBm indicated.
The "design" level is -22 dBm (a received testtone level would
be -9 dBm, and the data level is -13 dBm0, or 13 dB lower than a
testtone level.  (Some lines might actually have a design level
of -6 dBm testtone too, which would result in -19 dBm data to a
modem.  That might be right on the edge...)

Floyd
--
Floyd L. Davidson                                floyd@polarnet.com
Salcha, Alaska                               or: floyd@tanana.polarnet.com



