
AT&T Bands in Las Vegas – 850 GSM/EDGE, 1900 UMTS/3G

Last time I was in Las Vegas it was for MIX 10 and Windows Phone 7 (back when it included ‘series’ at the end). This time, the reason is CES 2011 with AnandTech and a whole bunch more mobile devices.

I thought it was interesting last time I came that most casino floors in Las Vegas had shockingly poor or non-existent UMTS (3G) coverage on AT&T. I guess I didn’t find it too shocking, since coverage inside buildings in a dense urban environment is probably the most challenging case for mobile networks, but it seemed to be a consistent problem. After getting frustrated about 6 hours into my stay, I decided to switch entirely to EDGE for the duration, just because of how annoying being constantly handed between GSM/EDGE and UMTS is when you’re trying to get things done. For whatever reason, back then I didn’t think to pull up field test on the iPhone 3GS I was carrying at the time to see which bands were assigned to which network technology.

Now that I’m back, I decided to check. Thankfully, Apple has restored most if not all of the Field Test data products in iOS 4.2.1 – a huge step forward from 4.1, which only let you show signal strength in dBm at the top left, and a far cry from 4.0, which shipped with no field test whatsoever. To save potential readers some googling: to get there, enter *3001#12345#* from the dialer and hit call – if it hasn’t been removed yet, you’ll get dumped into Field Test on iOS.

On EDGE, tapping on GSM RR Info makes it immediately obvious why I saw that behavior last time I was here:

The ARFCN dictates what channel inside what band we’re on, and 142 happens to lie inside the GSM 850 band. It’s basically a number used to refer to the FDD pair of frequencies the phone is currently using. You can calculate exactly what frequencies the downlink and uplink are on with a little math and a reference guide (there’s a good table here), but the short version is that ARFCNs 128 through 251 map to the GSM 850 band, so with an ARFCN of 142 we know immediately that GSM/EDGE is on AT&T’s 850 MHz spectrum.
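For the curious, the arithmetic is simple enough to script. Here’s a minimal sketch for the GSM 850 case, using the standard band plan constants (200 kHz channel spacing, 45 MHz duplex offset); the helper name is mine, not anything exposed by field test:

```python
def gsm850_frequencies(arfcn):
    """Return (uplink, downlink) center frequencies in MHz for a GSM 850 ARFCN."""
    if not 128 <= arfcn <= 251:
        raise ValueError("GSM 850 ARFCNs run from 128 to 251")
    uplink = round(824.2 + 0.2 * (arfcn - 128), 1)   # 200 kHz channel spacing
    downlink = round(uplink + 45.0, 1)               # 45 MHz duplex offset for GSM 850
    return uplink, downlink

print(gsm850_frequencies(142))  # (827.0, 872.0) – squarely in AT&T's 850 MHz spectrum
```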

Now, what about UMTS/3G? Enabling 3G (look at how weak that signal is…) and going into UMTS RR info, I saw the following:

Looking at the fields “Downlink Frequency” and “Uplink Frequency” we can see the device’s UARFCN channel numbers – the same concept as ARFCN, just with a U for UMTS. Again, with a reference aid (read: Wikipedia) we can see that UMTS/3G is running in the PCS 1900 MHz band.
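Same idea, different constants. Here’s a rough sketch for the PCS 1900 (UMTS Band II) general channels, where the UARFCN is essentially the center frequency divided by 200 kHz; Band II also defines a handful of “additional” channels with an offset, which this ignores:

```python
def pcs1900_downlink_mhz(uarfcn_dl):
    """Downlink center frequency in MHz for a PCS 1900 (Band II) downlink UARFCN."""
    if not 9662 <= uarfcn_dl <= 9938:
        raise ValueError("Band II downlink UARFCNs run from 9662 to 9938")
    return round(0.2 * uarfcn_dl, 1)   # e.g. 9812 -> 1962.4 MHz

def pcs1900_uplink_mhz(uarfcn_ul):
    """Uplink center frequency in MHz for a PCS 1900 (Band II) uplink UARFCN."""
    if not 9262 <= uarfcn_ul <= 9538:
        raise ValueError("Band II uplink UARFCNs run from 9262 to 9538")
    return round(0.2 * uarfcn_ul, 1)   # uplink sits 80 MHz below the paired downlink
```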

Remember that higher frequencies are less effective at propagating through buildings, so it’s pretty obvious now why getting good 3G coverage on AT&T is a challenge deep inside a casino in Las Vegas. There’s nothing inherently wrong with putting GSM/EDGE on 850 and UMTS on 1900; it’s just interesting how immediately obvious the difference is in practice walking around. Propagation is already a challenge in dense urban environments with lots of people moving around, and I’m sure that doesn’t help in Las Vegas. AT&T promised to move all of its 3G (UMTS) network onto the 850 MHz band (wherever it’s licensed to use it) by the end of 2010, but sadly that hasn’t happened quite yet, at least in this market. I’ll keep checking, but thus far it’s been solidly on 1900 PCS. Oh well.

AT&T Observations and Bandwidth

Bandwidth and Latency Data

I’ve always kind of been obsessed with bandwidth. I find myself constantly testing latency, bandwidth, and connection quality (mostly, in fact, through smokeping). Needless to say, that same obsession applies to my mobile habit, especially given AT&T’s reputation for congestion.

It sounds weird, but the two most-run applications on my iPhone are Speedtest.net Speed Test and Xtreme Labs SpeedTest. The Xtreme Labs test used to be my favorite, largely because of its superior accuracy and stability. As great as Speedtest.net’s website is for testing, the iPhone app continually fell short: tests ended before throughput stabilized, often the test would start but the data wouldn’t begin being counted until a second later (skewing the average), or it’d just crash entirely. I could go on and on about the myriad problems I saw, which no doubt contributed negatively to perceptions of network performance.

A few months ago, I wrote a big review and threw it up on the App Store. In the review, I noted that being able to export data would be an amazing feature. At the time, I had emailed Xtreme Labs and asked whether I could get a sample of my speed test results for analysis (I have yet to hear back). On Feb. 2nd, Ookla finally got around to releasing an update to the Speedtest.net app; it included the ability to export data as CSV.

Since then, I’ve been using it exclusively. I’ve gathered a bit of data, and thought it relevant to finally go over some of it. This is all from my iPhone 3GS in the Tucson, AZ market, largely in the central area. I’ve gathered a relatively modest 76 data points. Stats follow:

Gathered Statistics

              Downstream (kbps)   Upstream (kbps)   Latency (ms)
Average              1880.3             263.3           1029.2
St. Dev.             1179.6             101.6           1140.2
Max                  4279.0             356.0           6011.0
Min                    82.0              18.0            366.0

These stats really mirror my perceptions. Speeds on UMTS/HSPA vary from extremely fast (over 4.2 megabits/s!) to as slow as 82 kilobits/s, but generally hang out around 1.2 megabits/s. On the whole, this is much faster than the average 600 kilobits/s I used to see when I was on Sprint across 3 different HTC phones.
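If you want to crunch your own export the same way, something like the sketch below works. Note that I’m assuming column names of 'Download', 'Upload', and 'Latency' (in kbps, kbps, and ms); check the headers in your own CSV and adjust accordingly.

```python
import csv
import statistics

def summarize(path, columns=("Download", "Upload", "Latency")):
    """Print average, standard deviation, max, and min for each metric column."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    for col in columns:
        values = [float(row[col]) for row in rows if row.get(col)]
        print(f"{col:>8}: avg={statistics.mean(values):.1f}  "
              f"stdev={statistics.stdev(values):.1f}  "
              f"max={max(values):.1f}  min={min(values):.1f}")

summarize("speedtest_export.csv")  # path to wherever you saved the exported CSV
```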

Next, I became curious whether there was any correlation between time of day and down/up speeds. Given the sensitivity of cellular data networks to user congestion (through cell breathing, strain on backhaul, and of course the air link itself), I expected to see a strong correlation. I decided to plot my data per hour, and got the following:

Downstream and Upstream Bandwidth

Some interesting trends appear…

  1. I apparently sample at roughly the same time each day (given the large vertical lines that are evident if you squint hard enough). Makes sense because I habitually test after class, while walking to the next.
  2. There is a relatively large variation per day for those regular samples, sometimes upwards of a megabit.
  3. There does appear to be a rough correlation between time of day and bandwidth, but the fact that I’m moving around from cell to cell during the day makes it difficult to gauge (a rough way to check this on your own export is sketched just after this list).
  4. Upstream bandwidth is extremely regular, and relatively fast at that.
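As promised above, here’s a rough way to eyeball trend #3 on your own export: bucket the samples by hour of day and average the downstream throughput. Again, the 'Date' and 'Download' column names and the timestamp format are assumptions about the export, so adjust them to whatever yours actually contains.

```python
import csv
from collections import defaultdict
from datetime import datetime

def downstream_by_hour(path):
    """Average downstream throughput (kbps) for each hour of the day."""
    buckets = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed timestamp format – change to match the export.
            hour = datetime.strptime(row["Date"], "%Y-%m-%d %H:%M:%S").hour
            buckets[hour].append(float(row["Download"]))
    return {hour: sum(v) / len(v) for hour, v in sorted(buckets.items())}

for hour, kbps in downstream_by_hour("speedtest_export.csv").items():
    print(f"{hour:02d}:00  {kbps:7.1f} kbps")
```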

I’m still mentally processing what to make of the whole dataset. Obviously, I’m going to continue testing and gathering more data, and hopefully more trends will emerge. You can grab the data here in Excel form. I’ve redacted my latitude and longitude, just because my daily trends would be pretty easily extracted from those points, and that’s just creepy.

3G Bands – Where is the 850?

Lately I’ve been getting a surprising number of search hits regarding AT&T’s 850/1900 MHz coverage here in Tucson.

To be honest, I’ve read a number of different things: everything from certainty that our market has migrated HSPA (3G) to 850 MHz, to claims that AT&T doesn’t even have a license for that band in Arizona. For those of you that don’t know, migrating 3G to 850 MHz is favorable because lower frequencies propagate better through walls and buildings than the 1900 MHz band does. In general, there’s an industry-wide trend to move 3G to lower frequencies for exactly that reason.

I’ve been personally interested in this for some time, and finally decided to take the time to look it up.

Maps, maps, maps…

The data I’ve found is conflicting. Cellularmaps.com shows the following on this page:

AT&T 1900 MHz

AT&T 850 MHz

Note that the entire state of Arizona doesn’t have 850 MHz coverage/licensing.

However, the GSM authority over at GSM World shows three very different maps:

HSPA 3G Coverage (yellow)

AT&T 850 MHz coverage

AT&T 1900 MHz coverage

Note that the 3G data coverage map is labeled ambiguously; HSPA coverage exists, but it could be on either 1900 or 850. However, what we do glean is that (at least according to GSM World) there is equal 850 and 1900 MHz coverage in Tucson and the surrounding area. This contradicts the earlier map.

Then you have maps like these, which are relatively difficult to decipher but supposedly show regions of 800-band coverage from Cingular and AT&T before the merger:

Cingular 800, AT&T 850

Finally, you have websites such as these that claim Arizona is only 1900 MHz.

So what’s the reality? Uncertain at this point.

The map given by cellularmaps.com is sourced from 2008, whereas the GSM World maps are undated and ostensibly current. The other maps are also undated, but the majority consensus is that AT&T isn’t licensed to use the 850 MHz (cellular 800) band in this market.

If anyone knows about some better resources or information, I’d love to see it.

Update – 3/24/2010

I finally spoke with someone at AT&T, and it turns out that my initial suspicions were correct – Arizona does not have the 850 MHz UMTS Band 5. It’s as simple as that.

Oh well, at least we know now!

Mobile Phone Signal Bars – Thoughts

Something that’s bugged me for a long time is how crude and arbitrary signal bars on mobile phones are. With a few limited exceptions, virtually every phone has the exact same design: four or five bars in ascending order of height, which correspond roughly to the signal strength perceived by the radio stack.

Or do they? Let me just start by saying this is an absolutely horrible way to present a quality metric, and I’m shocked that years later it’s still essentially the de facto standard. Let me convince you.

It isn’t 1990 anymore…

Let’s start from the beginning. The signal bar metaphor is a throwback to times when screens were expensive, physically small, monochromatic if not 8 shades of grey, and anything over 100×100 pixels was outrageously lavish. Displaying the actual RSSI (Received Signal Strength Indicator) number would’ve been difficult and confusing for consumers, varying between 8 already similar shades of grey would have been hard to read, and making one bar breathe in size could have sacrificed too much screen real estate.

It made sense in that context to abstract the signal quality visualization into something that was both simple, and readable. Thus, the “bars” metaphor was born.

Since then, there have been few if any deviations from that design. In fact, the only major departure thus far has been Nokia, which has steadfastly adhered to a visualization that makes sense:

Ascending Strength Bars

Another Example

Namely, their display metaphor is vertically ascending bars that mirror call quality/strength. This makes sense because it strikes an optimal balance between screen use and communicating the quality in an easy-to-understand fashion. Moreover, they have 8 levels of signal, with 0-7 bars showing. Nokia should be applauded for largely adhering to this vertical format. (In fact, you could argue that the reason nobody else has adopted a similar metaphor is that Nokia has patented it, but I haven’t searched around.)

It’s 2010, and the granularity of the quality metric on most phones is arbitrarily limited to 4 or 5 levels at best.

Better designs?

An optimal design balances understandability with level of detail. At one extreme, you could simply display the raw RSSI in dB; at the other, you could sacrifice all information reporting and display something boolean: “Can Call” Yes/No.

Personally, I’m waiting for something that either leverages color (by sweeping through a variety of colors corresponding to signal strength) or utilizes every pixel of length for displaying the signal strength in a much more analogue way.

Excuse the crudity of this rendering

Green and red are obvious choices for color, given their nearly universal meanings of OK and OH NOES, respectively. Something that literally takes advantage of every pixel by breathing around, instead of arbitrarily limiting itself to just 4 or 5 levels, also wouldn’t be hard to understand.
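As a toy illustration of what I mean (purely hypothetical – no shipping phone does this), mapping the RSSI linearly onto a bar length and a red-to-green color might look something like the sketch below; the -113 to -51 endpoints are just illustrative.

```python
def signal_indicator(rssi_dbm, width_px=100, floor=-113.0, ceiling=-51.0):
    """Map an RSSI onto a bar length in pixels and a red-to-green RGB color.

    The floor/ceiling endpoints are illustrative, not any standard cutoff.
    """
    # Clamp to the displayable range and normalize to 0..1.
    fraction = (min(max(rssi_dbm, floor), ceiling) - floor) / (ceiling - floor)
    length_px = round(fraction * width_px)
    color = (round(255 * (1 - fraction)), round(255 * fraction), 0)  # red -> green
    return length_px, color

print(signal_indicator(-85))  # roughly mid-strength: about half the bar, a yellowish color
```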

Fundamentally, however, the bars still have completely arbitrary meaning. What constitutes maximum “bars” on one network and device has a totally different meaning on another device or carrier. Even worse, comparing the same visual indicator across devices on the same network can often be misleading. For example, over the past few months I’ve made a habit of switching between the actual RSSI and the resulting visualization, and I’ve noticed that the iPhone seems to have a very optimistic reporting algorithm.

There’s an important distinction to be made between the way signal is reported for WCDMA versus GSM as well:

First off one needs to understand that WCDMA (3G) is not the same thing as GSM (2G) and the bars or even the signal strength can not be compared in the same way, you are not comparing apples to apples. The RSCP values or the signal strength in WCDMA is not the most important value when dealing to the quality of the call from a radio point of view, it’s actually the signal quality (or the parameter Ec/No) that needs also to be taken into account. Source

That said, the cutoff for 4 bars on WCDMA seems to be relatively low, around -100 dB or lower. 3 bars seems to be around -103 dB, 2 bars around -107 dB, and 1 bar anything at or below that. Even then, I’ve noticed that the iPhone seems to run a weighted average, preferring to gradually decrease the reported level instead of allowing the sharp drops that usually actually occur.
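Put together, the mapping I think I’m seeing looks roughly like the sketch below. To be clear, these thresholds are my empirical guesses from flipping between field test and the bar display on one device; they are not Apple’s actual algorithm, and they ignore the averaging behavior described above.

```python
def observed_wcdma_bars(rssi_dbm):
    """Rough bar count for a given WCDMA signal level, per my observed cutoffs (unofficial)."""
    if rssi_dbm >= -100:
        return 4
    if rssi_dbm >= -103:
        return 3
    if rssi_dbm >= -107:
        return 2
    return 1

# Note how little dynamic range separates 4 bars from 1:
for rssi in (-95, -101, -105, -110):
    print(f"{rssi} dB -> {observed_wcdma_bars(rssi)} bars")
```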

Signal in dB

Use dB if you’re not averse to math

What you’re reading isn’t really dBm, dBmV, or anything truly physical, but rather a quality metric that also happens to be reported in dB. For whatever reason, most people are averse to understanding dB; the most important thing to remember is that 3 dB corresponds to a factor of 2. Thus, a change of -3 dB means that your signal has halved in power/quality.

The notation dBm is referenced to 1 mW. Strictly speaking, to convert to dBm given a signal in mW:

Signal_{dBm}=10\log_{10}\left(\frac{Signal_{mW}}{1\,\mathrm{mW}}\right)

Likewise, to convert a signal from dBm back to mW:

Signal_{mW}=1\,\mathrm{mW}\times 10^{\left(\frac{Signal_{dBm}}{10}\right)}
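Or, for the math-averse, the two conversions above in code form (a trivial sketch):

```python
import math

def mw_to_dbm(power_mw):
    """Convert a power in milliwatts to dBm (referenced to 1 mW)."""
    return 10.0 * math.log10(power_mw / 1.0)

def dbm_to_mw(power_dbm):
    """Convert a power in dBm back to milliwatts."""
    return 10.0 ** (power_dbm / 10.0)

print(mw_to_dbm(2.0))   # ~3.01 dBm: doubling the power adds ~3 dB
print(dbm_to_mw(-100))  # 1e-10 mW: a very weak received signal
```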

But even the received signal power, or a quality metric derived from SNR, isn’t the full picture.

In fact, most of the time, complaints about iPhones failing to make calls properly stem from overloaded signaling channels used to set up calls, or from situations where the phone is in a perfectly acceptable signal area but the node itself is too overloaded. So, as an end user, you’re left without the quality metrics you need to judge whether you should or shouldn’t be able to complete a data or voice transaction. The quality of your connection isn’t purely a function of client-tower proximity; node congestion matters just as much.

Carriers have a lot to gain from making sure their users are properly informed about network conditions, both so they can make educated decisions about what to expect in their locale and so they can properly diagnose what’s going on when the worst happens. Worse, perhaps, carriers have even more to gain from misreporting or misrepresenting signal as being better than it really is. Arguably, the cutoffs I’ve seen on my iPhone 3GS are overly optimistic and compressed into ~13 dB. From my perspective, as soon as you’re below about -105 dB, connection quality is going to suffer on WCDMA; however, that shows up as a misleading 3-4 bars.

Conclusions:

What we need is simple:

  1. Transparency and standardization of reporting – Standardize on a visualization that is uniform across technologies and devices. Choose something that makes sense, so customers can compare hardware in the same area and diagnose issues.
  2. Advanced modes – For those of us who can read and understand dB and the real quality metrics from the hardware, give us the option to display them. Hell, perhaps you’ll even encourage some people to delve deeper and become RF engineers someday. It’s annoying to have to launch a field test application every time we want to know why something is the way it is.
  3. Leverage recent advances in displays - Limiting display granularity to 4 or 5 levels doesn’t make sense anymore; we aren’t constrained by tiny monochromatic screens.
  4. Tower load reporting - Be honest with subscribers and have the tower report some sort of quality metric/received SNR of its own, so we know which side of the link is messed up. If a node is congested, tell the user. Again, people are more likely to be happy if they’re at least made aware of the link quality rather than left in the dark.