Posts tagged RF
See the update at the bottom for the real deal: I was partly wrong about some of the antennas in the iPhone 4, though I was indeed right about the connector locations at the bottom, and partly right about the top.
I’ve been following the iPhone 4G/HD leak saga like a hawk, and until now I haven’t been able to really add anything to all that’s been said. However, today, Gizmodo published pictures of the inside of the iPhone 4G hardware they obtained. They didn’t talk about much other than the absurd number of screws (upwards of 30), battery size, packaging, and potential ease of replacement. In fact, their primary aim seems to have been locating “APPLE” markings on the few ribbon cables inside, rather than picking apart Apple’s hardware choices. No doubt disassembly was challenging, potentially explaining why there aren’t any photos of the iPhone with the “connect to iTunes” lock screen (broken after disassembly?).
They neglected to remove the EMI shields atop the interesting bits on the PCB, which I would’ve considered the biggest news about the device. So we still know virtually nothing about the SoC, how much NAND flash is onboard, the RAM, the hugely important baseband (and whether this thing is potentially dual CDMA/GSM and UMTS so it could work on Verizon/Sprint alongside T-Mobile and AT&T), the WiFi or Bluetooth choices (likely the same as the iPad, however), or anything else you’d expect to glean with those shields removed. In short, all the squares in this diagram from the iPhone 3GS are big question marks for the iPhone 4G. Still, we can make very good guesses about the likely choices.
However, being the RF-obsessed dude I am, I scrutinized the photos for some time looking for other details, and I think I’ve found a few interesting things.
First and foremost, I think that there are two discrete antenna assemblies in the phone. One at the top, one at the bottom (as you’d hold it in your hand).
Note that the phone in this picture has been rotated; the red circled area on the hardware is actually the bottom. Now, look at the two places I’ve marked with the white arrows. You can very clearly see a pigtail and standard radio connector on the top one, and a connector pad at the tip of the arrow at right. This is 100% certainly an antenna, and it’s also in the same region of the hardware (at the bottom) as the 3GS.
Above is what I’m talking about at 100% resolution.
Above shows the antenna before being removed, with the pigtail clearly connected to the mainboard PCB. We can make an educated guess that whatever is under the EMI shield next door is the baseband.
Now, compare and contrast to the iPhone 3GS’s ribbon/kapton antenna assembly:
And see it inside the black plastic holder (only the trailing ribbon connector is visible at bottom left):
If I’m not mistaken, the two connectors there are for discrete antennas inside: one for the cellular radio, one for WiFi/Bluetooth. I’m not intimately familiar with the 3GS internals, but there appears to be only one antenna assembly, at the bottom.
Now, in the iPhone 4G photos, there appears to be a possible second antenna at the top.
I’ve labeled the connector that I can make out. Given the similar black packaging (possibly housing the flex PCB like in the 3GS), it seems likely this is another antenna.
I’ll leave you to speculate about why Apple might potentially want two discrete cellular antennas in their next generation phone…
After looking through the FCC OET internal photos of a huge number of other dual CDMA/UMTS phone designs, all of which require only one antenna, I’m pretty sure the other top component is something less insidious. It’s entirely possible this is nothing more than a connector or some support structure, or perhaps it is indeed an antenna, but for WiFi (802.11n?). Whatever the case, I’m uncertain what this thing is, or whether it’s part of the baseband. The part at the bottom is obviously an antenna, but I’m increasingly unsure about the top part.
We’ll see what it is as time goes on and better pictures become available, but I’m not confident it’s an antenna anymore.
Of course, we now know the real deal with the iPhone 4. I was wrong about what the antennas were, but right about the connectors. Up at the top, if you scrutinize iFixit’s teardown, you can see a small gold pad right above a test junction for the WiFi/GPS/BT 2.4 GHz antenna. There’s a trace on the EMI shield which leads to a contact screw (gold, so it’s visible) leading directly to the antenna. So the connector for the 2.4 GHz antenna is up at the top near that seam.
For the UMTS/GSM antenna, the connector snakes across from the PCB to the left side of the phone facing up (facing down, it snakes to the right, like in this photo):
You can see the test point and connector at the left, the pigtail leading to the right across the EMI shield, and the gold screw which connects the whole deal to the aluminum antenna.
Of course, the interesting part is that this becomes the most active region of the antenna. It’s a monopole rather than a dipole in this configuration. The result is that, over a quarter wavelength, that part of the aluminum radiates RF very actively. Interestingly, this is also where your palm rests.
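For a sense of scale, the active quarter-wave section works out to just a few centimeters at cellular frequencies. Here’s a back-of-the-envelope sketch in Python (the band frequencies are standard GSM/UMTS allocations, not anything Apple has published):

```python
# Rough quarter-wavelength lengths at common cellular bands.
C = 299_792_458  # speed of light, m/s

def quarter_wavelength_cm(freq_mhz):
    """Length of a quarter-wave element, in centimeters."""
    wavelength_m = C / (freq_mhz * 1e6)
    return 100 * wavelength_m / 4

for band_mhz in (850, 900, 1900, 2100):
    print(f"{band_mhz} MHz -> ~{quarter_wavelength_cm(band_mhz):.1f} cm")
    # e.g. 850 MHz -> ~8.8 cm, 2100 MHz -> ~3.6 cm
```

A few centimeters of the stainless band is all it takes, which is why a palm planted over that region can matter.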
I’m going to talk about the real deal on AnandTech shortly, so stay tuned…
It’s live here now: http://www.anandtech.com/show/3794/the-iphone-4-review/1
Something that’s bugged me for a long time is how crude and arbitrary signal bars on mobile phones are. With a few limited exceptions, virtually every phone has the exact same design: four or five bars in ascending order by height, which correspond roughly to the perceived signal strength of the radio stack.
Or does it? Let me just start by saying this is an absolutely horrible way to present a quality metric, and I’m shocked that, years later, it’s still essentially the de facto standard. Let me convince you.
It isn’t 1990 anymore…
Let’s start from the beginning. The signal bar metaphor is a throwback to times when screens were expensive, physically small, monochromatic if not 8 shades of grey, and anything over 100×100 pixels was outrageously lavish. Displaying the actual RSSI (Received Signal Strength Indicator) number would’ve been difficult and confusing for consumers, varying a bar across 8 already hard-to-distinguish shades of grey wouldn’t have worked, and making one bar breathe in size would have sacrificed too much screen real estate.
It made sense in that context to abstract the signal quality visualization into something that was both simple, and readable. Thus, the “bars” metaphor was born.
Since then, there have been few if any deviations away from that design. In fact, the only major departure thus far has been Nokia, which has steadfastly adhered to a visualization that makes sense:
Namely, their display metaphor is vertically ascending bars that mirror call quality/strength. This makes sense, because it’s an optimal balance between screen use and communicating the quality in an easy to understand fashion. Moreover, they have 8 levels of signal, 0-7 bars showing. Nokia should be applauded for largely adhering to this vertical format. (In fact, you could argue that the reason nobody has adopted a similar metaphor is because Nokia has patented it, but I haven’t searched around)
It’s 2010, and the granularity of the quality metric on most phones is arbitrarily limited to 4 or 5 levels at best.
Thus, an optimal design balances understandability with level of detail. On one extreme, you could simply display the RSSI in dB; on the other, sacrifice all information and report something boolean: “Can Call” Yes/No.
Personally, I’m waiting for something that either leverages color (by sweeping through a variety of colors corresponding to signal strength) or utilizes every pixel of length for displaying the signal strength in a much more analogue way.
Green and red are obvious choices for color, given their nearly universal meaning for OK and OH NOES, respectively. Something that literally takes advantage of every pixel by breathing around instead of arbitrarily limiting itself to just 4 or 5 levels also wouldn’t be hard to understand.
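As a sketch of what I mean by sweeping through colors, here’s a hypothetical Python mapping from RSSI to an RGB value that fades from red to green. The function name and the clamping behavior are mine; the -113/-51 dBm bounds mirror the conventional GSM RSSI reporting range:

```python
def rssi_to_rgb(rssi_dbm, floor=-113.0, ceiling=-51.0):
    """Map an RSSI reading onto a red-to-green color sweep.

    floor/ceiling follow the conventional GSM RSSI reporting
    range; readings outside the range are clamped.
    """
    frac = (rssi_dbm - floor) / (ceiling - floor)
    frac = max(0.0, min(1.0, frac))
    # Full red at the floor, full green at the ceiling,
    # with a continuous blend in between.
    return (int(255 * (1 - frac)), int(255 * frac), 0)

print(rssi_to_rgb(-51))   # strong signal -> (0, 255, 0)
print(rssi_to_rgb(-113))  # weak signal   -> (255, 0, 0)
```

Every intermediate reading gets its own shade, instead of being quantized into 4 or 5 arbitrary buckets.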
Fundamentally, however, the bars still have completely arbitrary meaning. What constitutes maximum “bars” on one network and device has a totally different meaning on another device or carrier. Even worse, comparing the same visual indicator across devices on the same network can often be misleading. For example, the past few months I’ve made a habit of switching between the actual RSSI and the resulting visualization, and I’ve noticed that the iPhone seems to have a very optimistic reporting algorithm.
There’s an important distinction to be made between the way signal is reported for WCDMA versus GSM as well:
First off one needs to understand that WCDMA (3G) is not the same thing as GSM (2G) and the bars or even the signal strength can not be compared in the same way, you are not comparing apples to apples. The RSCP values or the signal strength in WCDMA is not the most important value when dealing to the quality of the call from a radio point of view, it’s actually the signal quality (or the parameter Ec/No) that needs also to be taken into account. Source
That said, the cutoff for 4 bars on WCDMA seems to be set quite low, around -100 dB; 3 bars seems to be around -103 dB, 2 bars around -107 dB, and 1 bar anything below that. Even then, I’ve noticed that the iPhone seems to run a weighted average, preferring to decrease the reading gradually rather than allowing sharp declines.
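Those observed cutoffs, plus the gradual decline, could be modeled roughly like this in Python. To be clear, the thresholds are my estimates from watching the readout, and the smoothing constant is pure guesswork, not anything Apple has published:

```python
def iphone_bars_wcdma(rssi_db):
    """Bars from RSSI, using the cutoffs I observed above
    (estimates, not official Apple values)."""
    if rssi_db >= -100:
        return 4
    if rssi_db >= -103:
        return 3
    if rssi_db >= -107:
        return 2
    return 1

def smoothed_rssi(previous, current, alpha=0.2):
    """Exponential moving average mimicking the gradual decline;
    alpha here is a guess."""
    return (1 - alpha) * previous + alpha * current

print(iphone_bars_wcdma(-95))           # 4 bars
print(iphone_bars_wcdma(-105))          # 2 bars
print(smoothed_rssi(-100, -110))        # -102.0, not an abrupt drop
```

The averaging explains why a sudden dead spot can still show healthy bars for a few seconds.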
Use dB if you’re not averse to math
What you’re reading isn’t really dBm, dBmV, or anything truly physical, but rather a quality metric that happens to be reported in dB. For whatever reason, most people are averse to dB; the most important thing to remember is that 3 dB corresponds to a factor of 2. Thus, a change of -3 dB means your signal has halved in power/quality.
The notation dBm is referenced to 1 mW. Strictly speaking, to convert a signal P in mW to dBm:

P(dBm) = 10 · log10( P(mW) / 1 mW )

Likewise, to convert a signal from dBm back to mW:

P(mW) = 1 mW · 10^( P(dBm) / 10 )
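A quick Python sketch of both conversions:

```python
import math

def mw_to_dbm(p_mw):
    """Power in mW -> dBm (referenced to 1 mW)."""
    return 10 * math.log10(p_mw)

def dbm_to_mw(p_dbm):
    """dBm -> power in mW (inverse of the above)."""
    return 10 ** (p_dbm / 10)

print(mw_to_dbm(1))     # 0.0: 1 mW is exactly 0 dBm
print(mw_to_dbm(2))     # ~3.01: doubling the power adds ~3 dB
print(dbm_to_mw(-100))  # 1e-10 mW: why weak signals need a log scale
```

The last line is the real argument for dB: received cellular power spans ten-plus orders of magnitude, which is unreadable in linear units.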
But even directly considering the received power strength or the quality metric from SNR isn’t the full picture.
In fact, most of the time, complaints about iPhones failing to make calls stem from overloaded signaling channels used to set up calls, or situations where, even though the phone is in a completely acceptable signal area, the node is simply too overloaded. So, as an end user, you’re left without the quality metrics you need to judge whether you should or should not be able to make a data/voice transaction. The signal quality metric isn’t purely a function of client-tower proximity; node congestion matters just as much.
Carriers have a lot to gain from making sure their users are properly informed about network conditions, both so they can make educated decisions about what to expect in their locale and so they can properly diagnose what’s going on when the worst happens. Worse, perhaps, carriers have even more to gain from misreporting or misrepresenting signal as being better than reality. Arguably, the cutoffs I’ve seen on my iPhone 3GS are overly optimistic and compressed into a ~13 dB range. From my perspective, as soon as you’re below about -105 dB, connection quality is going to suffer on WCDMA, yet that shows up as a misleading 3-4 bars.
What we need is simple:
- Transparency and standardization of reporting – Standardize a certain visualization that is uniform across technology and devices. Choose something that makes sense, so customers can compare hardware in the same area and diagnose issues.
- Advanced modes – For those of us who can read and understand dB and the real quality metrics from the hardware, give us the option to display them. Hell, perhaps you’ll even encourage some people to delve deeper and become RF engineers in the future. It’s annoying to have to launch a Field Test application every time we want to know why something is the way it is.
- Leverage recent advances in displays - Limiting display granularity to 4 or 5 levels doesn’t make sense anymore; we aren’t constrained by tiny monochromatic screens.
- Tower load reporting - Be honest with subscribers and have the tower report some sort of quality metric/received SNR of its own so we know which path of the link is messed up. If a node is congested, tell the user. Again, people are more likely to be happy if they’re at least made aware of the link quality rather than left in the dark.