Aren't you the same one who keeps asking this same question every week? Switch back to the bar graph and wait for full bars, then switch to the numerical display and you'll see what the optimal level is. The bar graph is just a graphical representation of the numerical value, split into sections based on ranges of that value. Find your own range.
Perhaps changing back to the bar graph will rid you of your obsessive-compulsive need for a perfect signal all the time.
dBm is a standard unit for measuring power relative to a 1 milliwatt reference. The minus sign means the ratio is less than one, which will always be the case with a cell phone, since you're normally not close enough to a tower to receive a signal stronger than 1 mW. -80 dBm is a typical five-bar signal; -100 dBm is low but still usable. I don't know what the lowest usable signal level is.
Since dBm is a logarithmic unit, every 10 dBm corresponds to a factor-of-10 change in power. Using values we typically see on cell phones:
-80 dBm = 10 pW
-90 dBm = 1 pW
-100 dBm = 0.1 pW
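If you want to check those numbers yourself, here's a quick Python sketch of the conversion (the function name `dbm_to_mw` is just mine, not from any phone's software):

```python
def dbm_to_mw(dbm):
    """Convert a dBm reading to milliwatts: P(mW) = 10^(dBm / 10)."""
    return 10 ** (dbm / 10)

# Typical cellphone readings; note 1 pW = 1e-9 mW,
# so -80 dBm should come out to 1e-8 mW = 10 pW.
for dbm in (-80, -90, -100):
    print(f"{dbm} dBm = {dbm_to_mw(dbm):.1e} mW")
```

Each step of -10 dBm divides the power by 10, which is why the printed values drop by a factor of ten per line.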
Last edited by JeffH; 05-07-08 at 10:52 PM.
Reason: added example of log scale
Quote:
A dBm is a standard unit for measuring levels of power in relation to a 1 milliwatt reference signal. The minus sign means the ratio is less than one, which will always be the case with a cell phone, since you're not normally close enough to a tower to receive a signal power >1mW. -80 dBm is a typical 5-bars signal. -100 dBm is low, but still usable. I don't know what the lowest usable signal level is.
I thought the lower the number, the better the signal, so -100 dBm would be a better signal than -80 dBm.
I could be wrong, but that's always what I thought.
EDIT: Ahhh, I take that back. I just checked it on my phone: it was at -107 dBm at one point, and that was about 1 bar.