Bruce Schneier had a blog entry yesterday about the security of partial fingerprints. His main point is that a US court recently ruled that partial fingerprints cannot be used in a murder case. Among other things, he links to a news article (update: link removed, no longer available) describing this ruling.
Now, this seems to me to be an effect of sampling frequency. Research has shown that the fingerprints of two different individuals are different. The problem is that law enforcement agencies don’t seem to check the whole fingerprint. They check only a few spots of it. In other words, they effectively apply a sampling algorithm when they enumerate a fingerprint. Now, I am no expert on fingerprints, but I do know the weaknesses of sampling. If the sampling is too coarse, you can get wrong results: two different objects can end up identified by the same sampled key. (You might call it the same hash key if you like.)
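The collision idea above can be sketched in a few lines of Python. This is a toy model, not how any real fingerprint reader works: a "fingerprint" is just a list of bits, and `SAMPLE_POINTS` stands in for the handful of spots a matcher might check. The point is only that two prints which differ almost everywhere can still produce the same sampled key.

```python
import random

# Hypothetical positions a matcher samples (assumption for illustration).
SAMPLE_POINTS = [3, 17, 42, 61, 80]
PRINT_LENGTH = 100

def sampled_key(full_print):
    """Reduce a full print to just the features at the sampled positions."""
    return tuple(full_print[i] for i in SAMPLE_POINTS)

random.seed(1)
alice = [random.randint(0, 1) for _ in range(PRINT_LENGTH)]

# Construct a different print that agrees with Alice's only at the
# sampled positions -- every other bit is flipped.
impostor = [bit if i in SAMPLE_POINTS else 1 - bit
            for i, bit in enumerate(alice)]

print(alice != impostor)                            # the full prints differ
print(sampled_key(alice) == sampled_key(impostor))  # but the keys collide
```

With only five sampled positions out of a hundred, collisions like this are easy to construct; the coarser the sampling, the larger the space of distinct prints that map to one key.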
The article also references two other cases where fingerprints were wrongly matched, and the judge “criticized the common method of fingerprint identification as overly subjective and lacking in standards”. Now, the reason I am blogging about this is that we are seeing fingerprint readers being used in a lot of devices, from laptops to airline check-in points.
As everybody who has seen the Mythbusters episode on hacking fingerprint readers knows, such technology is not 100% secure. It only has to be secure enough. I have heard of airline passengers being wrongly identified by electronic fingerprint readers (in Norway). I would like to know whether this was caused by a software or hardware malfunction, or whether the product did not use a “sampling frequency” capable of distinguishing enough different passengers.
Anyway, we have to be aware of the weaknesses of the technology we are using, and if there are problems, we have to address them accordingly.
Link to entry on Digg