Sunday, June 4, 2017

Understanding iBeacon distancing


Trying to grasp the basic concept of how distancing with iBeacon (beacon / Bluetooth LE) can work. Is there any real documentation on how far exactly an iBeacon can measure? Let's say I am 300 feet away... is it possible for an iBeacon to detect this?

--
The distance estimate provided by iOS is based on the ratio of the iBeacon signal strength (rssi) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in rssi at 1 meter away. Each iBeacon must be calibrated with this txPower value to allow accurate distance estimates.
When we were building the Android iBeacon library we had to come up with our own independent algorithm, because the iOS CoreLocation source code is not available. We took a number of RSSI measurements at known distances, then did a best-fit curve to match our data points. The algorithm we came up with is shown below as Java code.
Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.
protected static double calculateAccuracy(int txPower, double rssi) {
  if (rssi == 0) {
    return -1.0; // if we cannot determine accuracy, return -1.
  }

  double ratio = rssi * 1.0 / txPower;
  if (ratio < 1.0) {
    // Signal stronger than the 1-meter calibration value: closer than 1 m.
    return Math.pow(ratio, 10);
  }
  else {
    // Beyond the 1-meter calibration point: best-fit curve from measured data.
    double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
    return accuracy;
  }
}
Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV
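For illustration, a hypothetical call to the method above might look like the following (assuming a calibrated txPower of -59 dBm, a common default for many beacons, and a measured RSSI of -72 dBm; both values are made up for the example):

// Hypothetical inputs: txPower = -59 dBm (1-meter calibration), rssi = -72 dBm.
double meters = calculateAccuracy(-59, -72.0);
System.out.println("Estimated distance: " + meters + " m"); // roughly 4 m with these inputs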
--
I have been investigating the matter of accuracy/RSSI/proximity with iBeacons very thoroughly, and I really think that all the resources on the Internet (blogs, posts on StackOverflow) get it wrong.
davidgyoung (accepted answer, > 100 upvotes) says:

Note that the term "accuracy" here is iOS speak for distance in meters.

Actually, most people say this but I have no idea why! The documentation makes it very clear that CLBeacon.accuracy:

Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.
Let me repeat: one sigma accuracy in meters. All of the top ten pages on Google on the subject have the term "one sigma" only as a quotation from the docs, but none of them analyses the term, which is core to understanding this.
It is very important to explain what one sigma accuracy actually is. The following URLs are a good start: http://en.wikipedia.org/wiki/Standard_error and http://en.wikipedia.org/wiki/Uncertainty
In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.), and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:

1. the mean (which is easy to understand: it is the value at which the peak of the curve occurs).
2. the standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ very much from each other, so the measurement has poor quality.

One sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurement is X, and one sigma is σ, then 68% of all measurements will be between X - σ and X + σ.
Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, then 68% of measurements were between 6 m and 14 m.
When we measure distance with beacons, we get an RSSI and a 1-meter calibration value, which allow us to estimate the distance in meters. But every measurement gives different values, which form a Gaussian curve. And one sigma (and thus accuracy) is the accuracy of the measurement, not the distance!
This may be misleading, because when we move the beacon further away, one sigma actually increases because the signal is worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.
The author of the accepted answer has a hypothesis that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty because when he increased power to the maximum value, the accuracy value was very small for 1, 5 and even 15 meters. Before increasing the power, the accuracy was quite close to the distance values. I personally think this is correct behavior, because the higher the power level, the smaller the impact of interference. And it's strange that Estimote beacons don't behave this way.
I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think we shouldn't ignore the "one sigma" term from the docs; I would like to start a discussion about it.
It may be that Apple's algorithm for accuracy just collects recent measurements and analyses their Gaussian distribution, and that's how it sets accuracy. I wouldn't exclude the possibility that they use info from the accelerometer to detect whether the user is moving (and how fast) in order to reset the previous distance distribution, because those values have certainly changed.
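To make that hypothesis concrete, here is a minimal Java sketch (my own illustration, not Apple's actual code) that keeps a window of recent distance estimates and derives a mean and a one-sigma value from them:

import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative only: keeps a window of recent distance estimates and
// reports their mean and one sigma (standard deviation).
class SigmaWindow {
  private final Deque<Double> samples = new ArrayDeque<>();
  private final int capacity;

  SigmaWindow(int capacity) { this.capacity = capacity; }

  void add(double meters) {
    if (samples.size() == capacity) samples.removeFirst();
    samples.addLast(meters);
  }

  double mean() {
    if (samples.isEmpty()) return Double.NaN;
    double sum = 0;
    for (double s : samples) sum += s;
    return sum / samples.size();
  }

  // One sigma: about 68% of samples fall within mean() +/- sigma().
  double sigma() {
    if (samples.isEmpty()) return Double.NaN;
    double m = mean(), sq = 0;
    for (double s : samples) sq += (s - m) * (s - m);
    return Math.sqrt(sq / samples.size());
  }
}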

iBeacon uses Bluetooth Low Energy (LE) to keep aware of locations, and the distance/range of Bluetooth LE is 160 ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).



Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.
A path-loss-based scheme like this is only approximate and is subject to variation with things like antenna angles, intervening objects, and presumably a noisy RF environment. In comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.
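As a rough sketch of what such a path-loss scheme looks like (a generic log-distance model, not necessarily what iOS or any particular stack actually implements), the distance can be estimated from the difference between the claimed 1-meter transmit power and the measured RSSI:

// Generic log-distance path-loss estimate (illustrative sketch only).
// txPower: claimed RSSI at 1 m; rssi: measured RSSI;
// pathLossExponent: roughly 2.0 in free space, higher indoors (assumed values).
static double estimateDistanceMeters(int txPower, double rssi, double pathLossExponent) {
  return Math.pow(10.0, (txPower - rssi) / (10.0 * pathLossExponent));
}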
As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.

It's possible, but it depends on the power output of the beacon you're receiving, other rf sources nearby, obstacles and other environmental factors. Best thing to do is try it out in the environment you're interested in.

With multiple phones and beacons at the same location, it's going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
You're going to see lots of solutions simply instructing the user to "please hold your phone here", to reduce customer frustration.

--

How to improve Bluetooth distance measuring using RSSI?


For my project I need to estimate the distance between a smartphone and a Bluetooth module. The estimation doesn't have to be very precise; I only need to determine the distance with a margin of error of about 50 cm.
I tested the RSSI of two Bluetooth modules at distance steps of 10 cm. I measured the RSSI 5 times for each step and took the average of the 5 measurements. The averages are shown in the graph below:


The red and blue lines represent the two Bluetooth modules. You can see that the results are not very linear. One of the reasons for this is interference, so I searched for ways to tackle the interference issue. Two ways I found are:
⦁ Signal-to-Noise Ratio (SNR): Understanding iBeacon distancing
⦁ The ratio of the iBeacon signal strength (RSSI) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in RSSI at 1 meter.
However, I don't really understand how the above techniques would be used to get more accuracy. For SNR I need the noise value; how do I even get the noise value?
For the RSSI/txPower ratio, I can get the txPower by simply measuring the RSSI at 1 meter from the module, so I know all the needed values. But I don't know what to do from there. How do I use these values to get more accurate distance estimations?
Are there any other techniques I can use to improve accuracy?

--
You are running into the practical limitations of this technology. Getting estimation accuracy of +/- 50 cm may be possible under ideal conditions at short distances (under 2 meters), but not at long distances of over 10 meters.
To answer your specific questions:
1. No, there is no practical way to know what part of a single RSSI measurement comes from signal and what part comes from noise. You can take an average over many samples, which partially removes noise if the transmitter and receiver are stationary over the sample interval (see the sketch after this list).
2. The techniques you ask about do work to give you a distance estimate, but they have the limitations of the technology described above.
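As a concrete example of the averaging mentioned in point 1, one simple option is to smooth the RSSI with an exponential moving average before converting it to a distance (a sketch only; the smoothing factor is an assumed value you would tune for your own setup):

// Illustrative exponential moving average for RSSI smoothing.
class RssiSmoother {
  private final double alpha;  // smoothing factor, e.g. 0.1 - 0.3 (assumed; tune per setup)
  private Double smoothed;     // null until the first sample arrives

  RssiSmoother(double alpha) { this.alpha = alpha; }

  double update(double rssi) {
    smoothed = (smoothed == null) ? rssi : alpha * rssi + (1 - alpha) * smoothed;
    return smoothed;
  }
}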

--

Android AOSP - Definition of scan interval and scan window in android source code


I have downloaded the AOSP source code for Lollipop 5.0. In API level 21, under the Bluetooth Low Energy scan settings, there are three options for scanning BLE devices: SCAN_MODE_BALANCED, SCAN_MODE_LOW_LATENCY, and SCAN_MODE_LOW_POWER. Are these based on different scan interval and scan window values? If so, where can I find the values defined for these constants in the source code directory?

--
I found the values below in http://androidxref.com/5.0.0_r2/xref/packages/apps/Bluetooth/src/com/android/bluetooth/gatt/ScanManager.java while grepping for the keyword "SCAN_MODE_BALANCED":
    /**
     * Scan params corresponding to regular scan setting
     */
    private static final int SCAN_MODE_LOW_POWER_WINDOW_MS = 500;
    private static final int SCAN_MODE_LOW_POWER_INTERVAL_MS = 5000;
    private static final int SCAN_MODE_BALANCED_WINDOW_MS = 2000;
    private static final int SCAN_MODE_BALANCED_INTERVAL_MS = 5000;
    private static final int SCAN_MODE_LOW_LATENCY_WINDOW_MS = 5000;
    private static final int SCAN_MODE_LOW_LATENCY_INTERVAL_MS = 5000;

    /**
     * Scan params corresponding to batch scan setting
     */
    private static final int SCAN_MODE_BATCH_LOW_POWER_WINDOW_MS = 1500;
    private static final int SCAN_MODE_BATCH_LOW_POWER_INTERVAL_MS = 150000;
    private static final int SCAN_MODE_BATCH_BALANCED_WINDOW_MS = 1500;
    private static final int SCAN_MODE_BATCH_BALANCED_INTERVAL_MS = 15000;
    private static final int SCAN_MODE_BATCH_LOW_LATENCY_WINDOW_MS = 1500;
    private static final int SCAN_MODE_BATCH_LOW_LATENCY_INTERVAL_MS = 5000;
Also check out ScanManager.ScanNative.configureRegularScanParams(). Two parameters, scanWindow and scanInterval, are set according to the scan setting (ScanSettings.SCAN_MODE_LOW_POWER, ScanSettings.SCAN_MODE_BALANCED, ScanSettings.SCAN_MODE_LOW_LATENCY), converted into BLE units, and then passed to gattSetScanParametersNative().
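For reference, the conversion into BLE units is simple arithmetic, since the controller expects the scan interval and window in units of 0.625 ms (a sketch of the arithmetic, not the actual AOSP helper):

// BLE scan interval and scan window are expressed in units of 0.625 ms.
static int millisToBleScanUnits(int millis) {
  return (int) (millis / 0.625);  // e.g. 5000 ms -> 8000 units
}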
Hope this helps.

--
I'm not sure if this is accurate or if you can even use it to find the values you need, but I found some code from Google regarding scanning settings:
  // Constants for Scan Cycle
  // Low Power: 2.5 minute period with 1.5 seconds active (1% duty cycle)
  /* @VisibleForTesting */ static final int LOW_POWER_IDLE_MILLIS = 148500;
  /* @VisibleForTesting */ static final int LOW_POWER_ACTIVE_MILLIS = 1500;

  // Balanced: 15 second period with 1.5 second active (10% duty cycle)
  /* @VisibleForTesting */ static final int BALANCED_IDLE_MILLIS = 13500;
  /* @VisibleForTesting */ static final int BALANCED_ACTIVE_MILLIS = 1500;

  // Low Latency: 1.67 second period with 1.5 seconds active (90% duty cycle)
  /* @VisibleForTesting */ static final int LOW_LATENCY_IDLE_MILLIS = 167;
  /* @VisibleForTesting */ static final int LOW_LATENCY_ACTIVE_MILLIS = 1500;
--

Bluetooth-Low-Energy RSSI changes periodically on Android devices


I noticed that the signal strength of Bluetooth Low Energy received on Android devices varies in cycles. The graph below represents the RSSI values of one BLE beacon over two minutes. The receiving Android device and the beacon were both static, with a distance of 1 meter. I made sure that there was as little interference as possible. The Android device was a Nexus 5, but I had the same phenomenon with other Android devices, all running API 21. I could not test it on iOS yet.
You can see that there are 3 major levels for the RSSI repeating every 15 seconds, like low -> middle -> high -> low -> middle -> high etc.
My guess is that the reason lies on the Android side; I am not sure whether it is due to hardware or software.
Why is the RSSI cyclic over time? Can someone explain?

--
As per Android AOSP - Definition of scan interval and scan window in android source code, the scan interval in any scanning mode is 5000 ms.
I would assume that your graph was generated via an application that used continuous scanning - i.e. a scan window of 5000 ms, which is basically continuous.
The scanner will rotate between channels 37, 38, and 39 after every scan interval, which accounts for the differences you observe. Channels 37, 38, and 39 are not contiguous in the BLE spectrum - 37 is at 2402 MHz whereas 39 is at 2480 MHz. The difference in wavelength means that the multipath fade (interference from reflections) will be different for each channel (http://www.cl.cam.ac.uk/~rmf25/papers/BLE.pdf) - you say that the devices were static, so provided that nothing else was moving, the interference will also be static.
On iOS, the scan interval (foreground) is reportedly 40 ms, which means that you should not experience this precise effect.

After reading a lot on the topic, I might have come to an answer.
Bluetooth Low Energy beacons use three different channels for advertising, which is their adaptation of frequency hopping to avoid interference with other 2.4 GHz signals. This happens much more slowly than for classic Bluetooth (1600 hops/s) - according to my measurements, around every 5 seconds.
The received signal strength obviously depends on the frequency, so if the frequency changes to another channel, the RSSI is different. How to deal with that is a different question.
UPDATE: After following up on this issue, I have to update my remarks:
It is very likely that the three levels, each lasting around 5 s, are not directly due to the beacon's slow frequency hopping, but to the Android device scanning on each channel separately and switching to the next after such a time interval.
A way to overcome this behavior is to start and stop the scan process in a loop, so that each scan lasts clearly less than 5 s. When starting a scan, the device seems to always begin scanning on the same channel, and the scan is restarted before it can switch to a different channel. With the restarts, the pattern is not detectable anymore - with the disadvantage that the channel is "fixed" and may suffer interference on this frequency. Thanks to Airsource Ltd for bringing me back to this question.
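For anyone who wants to try this, a minimal Android sketch of such a restart loop might look like the following (assuming API 21+, an already-obtained BluetoothLeScanner, and a restart period of 4 seconds, which is just an example value below the observed ~5 s channel switch):

import android.bluetooth.le.BluetoothLeScanner;
import android.bluetooth.le.ScanCallback;
import android.os.Handler;
import android.os.Looper;

// Illustrative sketch: restart the BLE scan before the ~5 s channel switch.
class RestartingScanner {
  private static final long RESTART_PERIOD_MS = 4000; // example value, below ~5 s

  private final BluetoothLeScanner scanner;
  private final ScanCallback callback;
  private final Handler handler = new Handler(Looper.getMainLooper());
  private final Runnable restart = new Runnable() {
    @Override
    public void run() {
      scanner.stopScan(callback);
      scanner.startScan(callback);
      handler.postDelayed(this, RESTART_PERIOD_MS);
    }
  };

  RestartingScanner(BluetoothLeScanner scanner, ScanCallback callback) {
    this.scanner = scanner;
    this.callback = callback;
  }

  void start() {
    scanner.startScan(callback);
    handler.postDelayed(restart, RESTART_PERIOD_MS);
  }

  void stop() {
    handler.removeCallbacks(restart);
    scanner.stopScan(callback);
  }
}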
--

BLE - Notify about disconnect


I am working on an app using BLE. However, I am stuck trying to figure out how to keep the scanning-and-connecting process going whenever a disconnection occurs. I want this to be more automatic. My "scanning" happens once the app is initialized, so I guess that's why it does not restart the scanning - the screen is not being re-initialized. I tried setting a timer, but:
1. I still don't understand timers very well, and
2. my app stops at the last command of the clock.

So, can you please help me? A screenshot is attached.



(In summary, I want the app to start scanning once a disconnection occurs.)

--
This is not easy to do, because your device will stop advertising once it is connected. It seems that a state change is only detected after 20 seconds or so.
The way to do this is to have the device communicate with your phone constantly (at the cost of battery life). Then, when you do not receive data, the connection may be lost.
There are also BLE devices that support notification of range events, like beacons. I do not know what your device can do in this respect.

-- 
To add: maybe just using the when BluetoothLE.RssiChanged (received signal strength) event could help. A strength of 0 would mean disconnected. Let us know if it works; I never tried it.

refer to: Bluetooth-Low-Energy RSSI changes periodically on Android devices

-- 
You said "The way to do this is to have the device communicate with your phone constantly (at the cost of battery life)."
How do I do that?

I have a beacon that seems to disconnect after a couple of seconds of being connected.

-- 
Put a Clock in your app, enable it after the beacon is connected, set a timer interval of 500 milliseconds (or whatever seems suitable), and every time the timer event fires (use the when Timer... event block), send something to the beacon. I do not know what your beacon expects as communication messages, so you will have to experiment.
That is exactly the fun of using App Inventor: experiment and find out.

-- 

How to get the Android version of the device


I have the same problem with the Android version and the ListView component.
Therefore, I am trying to detect the Android version so I can change the font size of the ListView component.
But how does that work?
Any help will be appreciated.

--
Take a look at my "special tools extension" on my blog:

https://nmd-apps.jimdo.com/extensions/nmd-extensions/
You can get many things with my extension.


--
Many thanks to Mika, with the extension it works perfectly.

-- 

2 Questions about MIT App Inventor


I am a newbie at MIT App Inventor; here are my 2 questions:
1) Is the MIT App Inventor service free? (Meaning: can I make as many apps as I want for free?)
2) After making an app, can I monetize it using AdMob?

--
Yes, the App Inventor service is free. 

--