This isn’t an indoor navigation/location problem. This is a maximum radio reception distance problem. Normally Bluetooth has a ~60’ radius. Decrease the transmitter power, part of the Bluetooth LE standard, and shrink the radius down to ~8’ between average phones. Then only phones that are within that distance will receive the temp ID beacon transmission, with no need to know the exact relative positions. (There are beacon apps for phones that already do this.)
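The power-reduction arithmetic can be sketched under an idealised free-space assumption (real indoor environments will attenuate and reflect differently); in free space, received power falls off as 1/r², so range scales with the square root of transmit power:

```python
import math

def power_cut_db(current_range_ft: float, target_range_ft: float) -> float:
    """dB reduction in TX power needed to shrink free-space range.

    Under the 1/r^2 law, halving the range needs 6 dB less power;
    in general the cut is 20*log10(old_range / new_range).
    """
    return 20 * math.log10(current_range_ft / target_range_ft)

# Shrinking a ~60 ft radius down to ~8 ft takes roughly a 17.5 dB power cut.
reduction = power_cut_db(60, 8)
```

This is only the free-space figure; the posts below explain why real phones and real rooms add several dB of uncertainty on top.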
My monitor died and I’m not getting a new one till next Friday. I feel your pain.
Discourse lets you select a post and all replies from a desktop browser. It’s really a boon for long, sprawling topics like this one. Mobile only lets you select one post at a time. Since folks are pretty well behaved in this topic, I’m in no rush to swoop in and fix things.
in related news
Regarding contact tracing, I can theoretically see the benefit and could see the worth in accepting some risk if it were actually effective. The implementation seems really questionable from a practical standpoint, though: what would the acceptable false-negative rate be? Knowing how fickle and error-prone tech solutions generally are, it seems like epidemiologists should make that call, not engineers, both in terms of efficacy and in weighing the consequences of relying on this as a metric. On its face it strikes me as a boondoggle, and in the worst scenario a pretext for the development of association warrants, deployed in haste in the inevitable future security crisis “X”. The mere existence of the tech could make leaving one’s phone at home a reasonably suspicious act.
The engineer I spoke to basically said “naaaah, not that simple” when I suggested that decreasing power or measuring signal strength could be used. They insisted that accurate clocks were the key (which I know are the basics for GNSS, of course).
I think indoor navigation has a lot of similar problems. Attenuation and reflection influence the resolution/position problem for both wifi and Bluetooth beacons. That’s why I mentioned it after Euclid got a mention.
So, I have worked some on problems with distance measurements using radio-frequency signals. There are three basic approaches for this problem:
1: Triangulation using the relative phase of reception on three antennas with a common reference clock. This will get you the direction of the transmitter, and if you have three such systems with known locations you can localise the transmitter perfectly. This requires specialised equipment with fixed installation points.
It also suffers from multi-path problems (that is, reflections of the original signal) that render the problem much more difficult. With more than three receiver systems, there are algorithms for attempting to resolve this and find the original emitter. As far as I know, no one has a working system for Bluetooth. And even if they did, that would be a lot of dedicated equipment to install absolutely everywhere!
This is not what is proposed.
2: Time-of-flight measurements. You ping-pong the signal: Device 1 transmits; Device 2 receives, and retransmits a signal along with information on its own delay between reception and re-transmission; Device 1 receives this, and uses the information, along with knowledge of its own delays, to calculate the time of flight between the two devices.
The time of flight is 3.33 ns per meter. That is the accuracy with which you need to know the delays in your devices in order to have useful data down to the meter. This is not information that exists for a phone Bluetooth system. You would also need a good clock, though not an excessively great one: if you can count cycles on a 333 MHz clock, you have the resolution. (Though it should be better, because you will have additional errors in the delay-calculation part.)
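The ping-pong arithmetic can be sketched in a few lines; the delay values here are hypothetical round numbers, chosen only to show that metre-level accuracy needs nanosecond-level knowledge of every delay:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_s: float, remote_delay_s: float,
                   local_delay_s: float) -> float:
    """Device 1 measures the total round-trip time, subtracts Device 2's
    reported turnaround delay and its own internal delay, and halves the
    remainder to get the one-way time of flight."""
    flight_time_s = round_trip_s - remote_delay_s - local_delay_s
    return (flight_time_s / 2) * SPEED_OF_LIGHT_M_PER_S

# For a 10 m separation the round trip is only ~66.7 ns on top of a
# (hypothetical) 1000 ns turnaround delay in the responding device.
d = tof_distance_m(round_trip_s=1066.71e-9, remote_delay_s=1000e-9,
                   local_delay_s=0.0)
```

A 3 ns error anywhere in those delay terms shifts the answer by about a metre, which is the whole problem.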
This is not what is proposed either. Can’t do it with a phone. Ultra Wideband (UWB) system chips that can do it exist, but they are not in your phone.
3: Received Signal Strength Indication (RSSI), or rather, known signal decay with distance. This is the only thing that could be done with mobile phones as they are constructed today. It is theoretically well understood: the transmitted power spreads over the surface of a growing sphere, so it falls off according to a 1/r^2 law. There is a well-known equation (the Friis equation) for the exact attenuation at a given frequency, so knowing the power transmitted and the power received will theoretically tell you the distance of transmission.
The 1/r^2 relationship means that at twice the distance, ¼ of the power arrives. That is -6 dB in logarithmic units (which is how power is measured in RF). Now, the RSSI meter in a phone is not that accurate; count on a couple of dB of error. The transmission power is not that well known either! The phone may say it is using a 4 dBm nominal output power, but it will actually be anywhere from 6 dBm to 0 dBm, or so. It will also vary by Bluetooth channel used; for most chips the signal is stronger lower in the band. (One chip I measured had a difference of 5 dB between the lowest and highest channel. The chip-to-chip difference was also a few dB.)
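Inverting the Friis equation gives a distance estimate from a path-loss figure. A minimal sketch, assuming a 2.44 GHz carrier (mid-band), unity antenna gains, and ideal free space; as noted above, real phones deviate from every one of these assumptions by several dB:

```python
import math

C = 299_792_458.0          # speed of light, m/s
FREQ_HZ = 2.44e9           # assumed Bluetooth channel frequency
WAVELENGTH_M = C / FREQ_HZ

def estimate_distance_m(tx_power_dbm: float, rssi_dbm: float) -> float:
    """Free-space distance from RSSI: every extra 6 dB of path loss
    doubles the estimated distance (the 1/r^2 law in log form)."""
    path_loss_db = tx_power_dbm - rssi_dbm
    # Friis free-space path loss: loss_db = 20*log10(4*pi*d / wavelength)
    return (WAVELENGTH_M / (4 * math.pi)) * 10 ** (path_loss_db / 20)

d1 = estimate_distance_m(tx_power_dbm=4, rssi_dbm=-56)  # 60 dB path loss
d2 = estimate_distance_m(tx_power_dbm=4, rssi_dbm=-62)  # 66 dB: twice as far
```

Note what the couple-of-dB errors discussed above do here: a 6 dB total error in TX power plus RSSI reading already doubles or halves the estimate.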
This all only works for free-space propagation. If there are obstacles, there will be more uncertainty. Reflections will also make it work less well, as will putting the phone close to a human body. The errors accumulate, and I think one would not get better than a factor of 4 or 5.
I have no idea if this could in any way be considered good enough.
dBm: power relative to 1 mW. 1 mW is 0 dBm, 3 dBm would be 2 mW, etc.
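The dBm conversion in that footnote is just a base-10 exponential, e.g.:

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert dBm to milliwatts: 0 dBm = 1 mW, and each +3 dB
    (very nearly) doubles the power."""
    return 10 ** (dbm / 10)

p0 = dbm_to_mw(0)   # 1 mW
p3 = dbm_to_mw(3)   # ~2 mW (1.995 mW exactly)
```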
4 : Using other localisation data too. GPS, wifi localisation, etc. Maybe this is what is proposed?
Thank you very much for this explanation.
I am very curious if Google and Apple have a rabbit up their sleeves. Because if they don’t, from both what the engineer I asked and you said, this sounds like a huge tech promise which cannot be kept without cutting corners. Namely, using location data of “Alice” and “Bob”. Which makes the whole thing a privacy nightmare despite promising otherwise.
One question that remains is why this isn’t all over the news articles covering the Google/Apple joint venture.
Another is: did we miss anything crucial, so we are just not getting it?
The third is: can anyone relay our concerns to the virologists and epidemiologists who are currently banking on the promised Bluetooth tech to get us out of measures which are extremely hard on the economy?
The system that Google/Apple are proposing does not track phone positions. It doesn’t use GPS outside, and it doesn’t need to localize phones indoors. I’m not sure that Luther’s engineer friend understood that.
All they are checking is the proximity of the phones. i.e. were the phones close enough that the owners might have been within the fuzzy distance for possible COVID-19 spread?
Apple and Google have both had Bluetooth beacon APIs for years, and probably have assembled a library of tricks and tweaks for refining proximity estimates based on RSSI.
On the other hand, most ARM SoCs do have a high-resolution timer, though it is usually not available to non-system apps for security reasons (think counting cycles on system calls). Since Apple and Google say that they’re making additions to their OSs, perhaps they’re adding code for a ping-response distance-measurement API? (The response would have to include an average delay for that phone’s chipset/drivers. Hm, the phone should be able to ping itself to measure that.)
That could be handy for a lot of things other than this application, as well as many unintended consequences.
- The Contact Tracing Bluetooth Specification does not use location for proximity detection. It strictly uses Bluetooth beaconing to detect proximity.
If bluetooth can tell you what is nearby, then it can give you an idea of how proximity changes with time by looking at how long the two devices were nearby. For example if you and I walk in opposite directions on a path, then you get an indication of relative motion as the BT contact comes up and goes away. If another person walks at 90 degrees to the two of us, then there is more information.
Governments should just equip us all with GPS tracking bracelets and get it over with, because that is the limiting case in all of this.
I am not betting on it. As I said: if they have a white rabbit, they better pull it out, and fast.
So far, beacons are a different thing than phones: fewer problems with variable environmental attenuation, fewer problems with variable reflection. Most importantly, they have a known position. Several of them could triangulate a phone with ease.
I don’t know shit, so let’s ask @Aciantis: is it even the same hardware, or do they use other chips? Other power sources? (As they explained, that will have an effect on the measurements.)
Time is of the essence, since the duration of the contact very much influences your probability of getting infected.
Regarding the idea to use it for proximity, and this:
[quote="Michael_R_Smith, post:18, topic:168306"]
Governments should just equip us all with GPS tracking bracelets and get it over with, because that is the limiting case in all of this.
[/quote]
Are you trolling? O_o
Proximity by duration: nope, doesn’t work this way.
GPS: even if it would work indoors and wouldn’t be a total dystopian fuckshit, I don’t think we could just ramp up production of dedicated devices.
Let us keep discussing BT solutions here, shall we? That’s what was proposed.
No, I don’t think it can be a ping-response measurement. Counting cycles isn’t your only problem: you have to know when the signal starts. BT uses 1 MHz (or 2 MHz for BT5) channel bandwidths, and that bandwidth limits the speed with which the signal ramps up. It takes 125 ns (¼ period) at 2 MHz to start the signal. That’s a pretty slow rise time compared to the 3.3 ns per meter time of flight (6.6 ns return trip), which makes the start time really fuzzy to measure. This is the reason that time-of-flight systems all use ultra-wideband (UWB) signals: to get cleaner rise times.
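The mismatch between the signal's rise time and the quantity being measured is easy to put in numbers (using 2 m here as a representative contact distance):

```python
C_M_PER_NS = 0.299792458   # speed of light, metres per nanosecond

bandwidth_hz = 2e6                            # BT5 channel bandwidth
rise_time_ns = (1 / bandwidth_hz) / 4 * 1e9   # quarter period = 125 ns
round_trip_ns = 2 * 2.0 / C_M_PER_NS          # 2 m there and back: ~13.3 ns
```

The edge you would have to time is roughly an order of magnitude slower than the round-trip delay you are trying to resolve, which is why narrowband BT is a poor time-of-flight instrument.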
And no, you can’t ping yourself. The BT chip is either sending or receiving; a switch selects which path you use, and there is some dead time between sending and receiving. Well, the switch is leaky, so some signal will spill over into the receiver when you transmit. Theoretically I suppose you could use this to measure your own transmitted signal in your receiver, but I don’t know if you have enough control of the chip to practically do this. The BT chip usually has its own 8-bit processor to handle all the really low-level crap, like taking the IQ signals and doing symbol detection, and it does not give you access from higher up the chain. It just spits out bits.
I’m guessing that it is just proximity based on RSSI with black magic on top. (Machine learning!) And just living with the uncertainty in distance. I’m going to check in with what some other people I am working with on BT stuff have to say and see if they have some enlightening ideas.
BT beacons are the same type of RF frontend. They really don’t work that well for distance measurements either. The question they can answer is, is it close?
And it works more the other way around. The beacon beacons. That is, it spits out a regular signal. Your phone knows if it is close to the beacon, not the other way around. The ones I have seen in use are in places like museums. You walk up to an exhibit, phone in hand, and the app you installed at the entrance knows what you are close to, and it displays some information on the object, or maybe plays some audio in your headphones.
A collection of dedicated devices with fixed locations (not beacons; beacons transmit, they don’t receive) could triangulate better, sure. But we shouldn’t call it triangulation: it is not the easy triangulation math if you want some accuracy. More like, with the fixed installed devices, you pass many times with a phone via known locations to create a training set. Then it is black-magic time! (Machine learning)
The Health Institute here in Norway is rolling out an app soon, possibly even this afternoon. There have been objections from privacy advocates, but without effect. (Note: “two feet” is really “two meters”.)
Thanks for your ongoing effort to explain BT distance measurement and for clearing up my misconceptions about beacons!
Thank you for this, too. I have the feeling that this is going to be really interesting, and also going to be interesting breaking it down to people like me, other mutants, and especially decision makers.
Virology and epidemiology already proved hard to communicate. Adding technology and physics… I would pray if I had any belief in that kind of stuff.
Right now, I am really worried.
So some regression and some feedback. No problem there.
another interesting thing is that the traditional way to do contact tracing involves interviews – often phone calls to people to discuss where they’ve been and when.
only, you need to hire a lot of people to do the interviews, and i guess nobody is looking for work right now… /s
to me… so far… phone-based contact tracing seems a bit like electronic voting: sexy and exciting because we are supposed to be living in the future, while in reality the old-school pen-and-paper methods are probably still best.
the us government though – and maybe the general public – would rather give a few private companies big bundles of cash for developing something tech-centric and fragile vs. giving a bunch of individuals each a small amount of cash for doing something routine and reliable.
somewhat related: the supreme court is going to be doing arguments by phone.
whoever thought to use the phone instead of something like zoom deserves plenty of praise imo. reliable and well tested, with no special hardware or training needed.
I don’t have much time for serious research, but a quick look around at scientific papers turned up some hits for techniques other than RSSI.
HOWEVER, everything was either proof of concept or even more basic research. Phase shift seemed to be one of the more promising avenues. But to me, as someone with only a very poor basic understanding of the matter, this is not really clear.
Anyone here willing to dig at ISI, Google Scholar and SciHub?
Using the existing BT/BLE hardware in phones or with some newer generation stuff? 5G radios use beamforming techniques so they’re not as constrained by the inverse square law effect. (there is also a direction finding feature in the newer implementation of BLE, but I don’t think it’s widely deployed). One might imagine these coming into play somehow - though I’d find it hard to believe that it would make contact tracing more rather than less effective (increasing the number of uncontrollable variables).
Just from a first-principles perspective, RSSI is going to fail because radio signal is not a reliable proxy for actual distance / exposure potential (for that matter, even “actual distance” isn’t, considering that walls and multi-level structures exist). What about being in traffic: is every adjacent vehicle a potential exposure source, does isWindowsOpen get evaluated?
What about a scenario where the radio is blocked by a granite counter-top, yet the face of the infected person is a couple dozen inches away? Radios that are many tens of feet apart could have a greater mean free path than the pair standing in for you and that person.
I could see this being useful from a zoomed out population survey perspective (and maybe that’s really all that is actually intended, despite some misdirecting press) - but the utility as a useful marker for individual exposure potential seems extremely low.
I seriously don’t understand enough to even grasp the principles. I looked at some recent papers and found, e.g., phase-shift approaches. But the papers used dedicated hardware under lab conditions, very much basic research, and I can’t even understand the details. So: no idea what else could be used, but I have the impression we are years, not months, away from a usable technology.
An update on the Norway app, which has been deployed for several days:
Apparently, on top of everything else, the part of the system which is supposed to text you if you’ve been near a registered positive is not very secure. I don’t know if that is just bad design in the Norwegian app or is a structural problem with the idea.
Very different kind of problem, but not the last one, we have to assume.