
Your Daughter Arrested By Your Own DNA? Ancestry Sites & Law Enforcement

Back in 2009, I wrote an article on Disney theme parks sharing facial-recognition-enhanced photos of park-goers with the Department of Homeland Security to expand the DHS's base population photo database. Shortly thereafter, the theme parks were joined by cruise lines, vacation resorts and nearly all hotel check-in desks, domestic and international. Now firmly in possession of billions of citizen and visitor photos, law enforcement has moved on to absorbing as much DNA from the public as it can, often to identify relatives of people already on file in connection with crimes.

This 2015 Fusion article describes the acquisition of genetic IDs from family ancestry sites like Ancestry.com and 23andMe:

When companies like Ancestry.com and 23andMe first invited people to send in their DNA for genealogy tracing and medical diagnostic tests, privacy advocates warned about the creation of giant genetic databases that might one day be used against participants by law enforcement. DNA, after all, can be a key to solving crimes. It “has serious information about you and your family,” genetic privacy advocate Jeremy Gruber told me back in 2010 when such services were just getting popular.

Now, five years later, when 23andMe and Ancestry both have over a million  customers, those warnings are looking prescient. “Your relative’s DNA could turn you into a suspect,” warns Wired, writing about a case from earlier this year, in which New Orleans filmmaker Michael Usry became a suspect in an unsolved murder case after cops did a familial genetic search using semen collected in 1996. The cops searched an Ancestry.com database and got a familial match to a saliva sample Usry’s father had given years earlier. Usry was ultimately determined to be innocent and the Electronic Frontier Foundation called it a “wild goose chase” that demonstrated “the very real threats to privacy and civil liberties posed by law enforcement access to private genetic databases.”

The FBI maintains a national genetic database with samples from convicts and arrestees, but this was the most public example of cops turning to private genetic databases to find a suspect. But it’s not the only time it’s happened, and it means that people who submitted genetic samples for reasons of health, curiosity, or to advance science could now end up in a genetic line-up of criminal suspects.

Both Ancestry.com and 23andMe stipulate in their privacy policies that they will turn information over to law enforcement if served with a court order. 23andMe says it’s received a couple of requests from both state law enforcement and the FBI, but that it has “successfully resisted them.”

23andMe’s first privacy officer Kate Black, who joined the company in February, says 23andMe plans to launch a transparency report, like those published by Google, Facebook and Twitter, within the next month or so. The report, she says, will reveal how many government requests for information the company has received, and presumably, how many it complies with. (Update: The company released the report a week later.)

“In the event we are required by law to make a disclosure, we will notify the affected customer through the contact information provided to us, unless doing so would violate the law or a court order,” said Black by email.

Ancestry.com would not say specifically how many requests it’s gotten from law enforcement. It wanted to clarify that in the Usry case, the particular database searched was a publicly available one that Ancestry has since taken offline with a message about the site being “used for purposes other than that which it was intended.” Police came to Ancestry.com with a warrant to get the name that matched the DNA.

“On occasion when required by law to do so, and in this instance we were, we have cooperated with law enforcement and the courts to provide only the specific information requested but we don’t comment on the specifics of cases,” said a spokesperson.

As NYU law professor Erin Murphy told the New Orleans Advocate regarding the Usry case, gathering DNA information is “a series of totally reasonable steps by law enforcement.” If you’re a cop trying to solve a crime, and you have DNA at your disposal, you’re going to want to use it to further your investigation. But the fact that your signing up for 23andMe or Ancestry.com means that you and all of your current and future family members could become genetic criminal suspects is not something most users probably have in mind when trying to find out where their ancestors came from.

“It has this really Orwellian state feeling to it,” Murphy said to the Advocate.

If the idea of investigators poking through your DNA freaks you out, both Ancestry.com and 23andMe have options to delete your information with the sites. 23andMe says it will delete information within 30 days upon request.

Another example of familial DNA invasion:

From pri.org:

DNA is taken from the crime scene and compared against a federally regulated FBI-run database used to process DNA evidence, called CODIS. The process can take as long as 18 months before a match is identified. In the meantime, the perpetrator has committed a string of other crimes.

But some local police departments claim they can get faster results — as little as 30 days — by using private labs and local DNA databases.

Frederick Harran, director of public safety at the Bensalem Police Department in Pennsylvania, said, “18 months is not prevention, that’s not what they pay me for.”

“I would agree the federal database is a good thing, but we’re just moving too slow,” he claims.

So more and more law enforcement agencies are turning to local databases. But with loose regulations, that can present troubling scenarios. Take this real example from Melbourne, Florida.

A few teenagers were sitting in a parked car, when a police officer pulled up and requested someone provide a DNA sample. The officer gave one boy a cotton swab and a consent form. Once the officer made the collection, he went back on patrol as usual.

Increasingly, local police departments are collecting consensual DNA samples, processed using private labs. It’s happening in cities across Florida, Pennsylvania, Connecticut and North Carolina.

The potential issues for these databases vary state by state. In Florida, minors are allowed to consent to having their DNA collected, which isn’t true in other states, like Pennsylvania. But simply maintaining the databases allows each jurisdiction to test every sample already collected, meaning that the DNA from a minor crime scene from years before could be immediately matched with the new sample.
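
The kind of database lookup described above can be sketched in a few lines. This is a toy illustration only, not the actual CODIS matching algorithm: the locus names below are real core STR loci, but every genotype, sample ID and profile is invented for the example.

```python
# Toy STR-profile matching of the kind DNA databases perform.
# Real CODIS profiles cover 20 core loci; three are shown here,
# and all genotype data below is invented for illustration.
crime_scene = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)}

local_db = {
    "sample_0417": {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)},
    "sample_0981": {"D3S1358": (14, 17), "vWA": (17, 18), "FGA": (22, 21)},
    "sample_1133": {"D3S1358": (12, 13), "vWA": (14, 14), "FGA": (26, 28)},
}

def full_match(a, b):
    """Direct hit: both alleles agree at every locus (order-insensitive)."""
    return all(sorted(a[locus]) == sorted(b[locus]) for locus in a)

def shared_allele_loci(a, b):
    """Count loci sharing at least one allele. Close relatives share many
    such loci, which is what a familial search screens for."""
    return sum(bool(set(a[locus]) & set(b[locus])) for locus in a)

direct_hits = [sid for sid, prof in local_db.items()
               if full_match(crime_scene, prof)]
familial_leads = [sid for sid, prof in local_db.items()
                  if sid not in direct_hits
                  and shared_allele_loci(crime_scene, prof) == len(crime_scene)]
```

Note that `sample_0981` never fully matches the evidence yet shares an allele at every locus, so it surfaces as a lead, which is exactly how a relative who never gave a sample can be dragged into an investigation.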

Stephen Mercer, chief attorney for the Forensics Division of the Maryland Office of the Public Defender, finds the practice deeply troubling.

“The collection procedures highlight the very real threat to liberty interests that local DNA databanks pose,” Mercer said. “The usual suspects are targeted, so we see bias along the lines of race being amplified through the criminal justice system.”

Granted, many may think, “Well, if you have nothing to hide…” That’s not the point. The innocent, unindicted individual should retain basic control over whether her DNA is used to implicate her relatives in suspected criminal acts. There is something perverse in having one’s DNA finger one’s own flesh and blood for the government’s purposes. Identification by familial DNA isn’t a slippery slope… it’s a well-greased slalom of privacy infringement.

We will be looking into the matter of DNA familial finger-pointing in depth and will report back as developments warrant.

BNI Operatives: Situationally aware.

As always, stay safe.

Operation Mickey Mouse? Facial Recognition & Disneyland

There is an internet story of a software engineer who, while visiting Disneyland, went on a ride and was then offered, by a theme park employee, a photo of himself and his girlfriend to buy, with his credit card information already linked to it. The engineer emphatically stated that he had not entered any of his personal or credit card information at any of the theme park’s registers. So, based on his professional experience, he determined the system had to be using facial recognition technology to access and activate his personal facial and credit card information. He had never signed an agreement allowing the Mouse & Co. to do so, and believed that this use was illegal. He also stated that he believed Disney was sharing facial recognition data with the military.

As it turns out, he may not be wrong, or at least not very far from the truth.

To understand how his claim of passive facial recognition might hold up, let’s first define facial recognition technology: facial recognition software (FRS) can pick someone’s face out of a crowd, extract the face from the rest of the scene and compare it to a database of stored images. For this to work, the software has to know how to differentiate between a face and the rest of the background; it does so by recognizing a face and then measuring its various features.

Every face has numerous distinguishable landmarks, the different peaks and valleys that make up facial features. FRS defines these landmarks as nodal points. Each human face has approximately 80 nodal points. Among those measured by the software are:

• Distance between the eyes

• Width of the nose

• Depth of the eye sockets

• Shape of the cheekbones

• Length of the jaw line

These nodal points are measured to create a numerical code, called a faceprint, that represents the face in the database.
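
A crude sense of how nodal-point measurements become a numerical code can be sketched as follows. This is a toy illustration with invented landmark coordinates, not any vendor's actual faceprint format; real systems use roughly 80 points and far more sophisticated normalization.

```python
import math

# Hypothetical 2D landmark coordinates (pixel positions) for one face.
# Real systems measure ~80 nodal points; five are shown for illustration.
landmarks = {
    "left_eye": (120, 95),
    "right_eye": (180, 95),
    "nose_tip": (150, 140),
    "left_cheekbone": (105, 150),
    "jaw_bottom": (150, 210),
}

def faceprint(points):
    """Build a crude 'faceprint': normalized pairwise distances between
    landmarks. Dividing by the inter-eye distance makes the code
    scale-invariant, so the same face photographed at different sizes
    yields similar numbers."""
    scale = math.dist(points["left_eye"], points["right_eye"])
    names = sorted(points)
    dists = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            dists.append(round(math.dist(points[a], points[b]) / scale, 3))
    return dists

print(faceprint(landmarks))
```

Five landmarks yield ten pairwise distances; the inter-eye pair normalizes to exactly 1.0 by construction.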

Next, let’s review how facial recognition occurs.  (In the past, FRS use was limited to 2D facial images, and subject to many environmental factors, such as lighting or blurring, that restricted its use to primary law enforcement agencies for comparative analysis v. existing pictures of the subjects of interest. We are now well past that stage and into comparing live 3D images to networked databases worldwide.)

3D Facial Recognition

Facial recognition software now uses a 3D model, which provides more accuracy than its 2D predecessor. Capturing a real-time 3D image of a person’s facial surface, 3D facial recognition uses distinctive features of the face — as outlined above — to identify the subject. These areas are unique to each person and change little over time.

Using depth and an axis of measurement that is not affected by lighting, 3D facial recognition can even be used in darkness and has the ability to recognize a subject at different view angles with the potential to recognize up to 90 degrees (a face in profile).

Using the 3D software, the system goes through a series of steps to verify the identity of an individual.

Detection

Acquiring an image can be accomplished by digitally scanning an existing photograph (2D) or by using a video image to acquire a live picture of a subject (3D).

Alignment

Once it detects a face, the system determines the head’s position, size and pose. As stated earlier, the subject has the potential to be recognized up to 90 degrees.

Measurement

The system then measures the curves of the face on a sub-millimeter scale and creates a template.

Representation

The system translates the template into a unique code. This coding gives each template a set of numbers to represent the features on a subject’s face.

Matching

If the image is 3D and the database contains 3D images, matching takes place without any changes being made to the image. However, databases that still hold only 2D images present a challenge: a live, moving 3D subject must be compared to a flat, stable image. Newer systems address this by identifying a few reference points (usually three) in the 3D capture. For example, the outside corner of the eye, the inside corner of the eye and the tip of the nose are pulled out and measured. Once those measurements are in place, an algorithm (a step-by-step procedure) is applied to convert the image to 2D. After conversion, the software compares the image with the 2D images in the database to find a potential match.
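
The 3D-to-2D conversion step can be pictured as a pose correction followed by dropping the depth axis. This is a simplified geometric sketch with made-up landmark coordinates, not the actual conversion algorithm the systems use.

```python
import math

# Hypothetical 3D landmark positions (x, y, z) in millimeters, captured
# with the head turned 30 degrees away from the camera.
landmarks_3d = {
    "outer_eye": (-45.0, 10.0, -20.0),
    "inner_eye": (-15.0, 10.0, -5.0),
    "nose_tip": (0.0, -20.0, 25.0),
}

def rotate_y(point, degrees):
    """Rotate a 3D point about the vertical (y) axis."""
    x, y, z = point
    r = math.radians(degrees)
    return (x * math.cos(r) + z * math.sin(r), y,
            -x * math.sin(r) + z * math.cos(r))

def to_frontal_2d(points, head_yaw):
    """Undo the head's yaw so the face 'looks at' the camera, then drop
    the depth coordinate to obtain a flat, frontal 2D view that can be
    compared against a 2D database image."""
    return {name: rotate_y(p, -head_yaw)[:2] for name, p in points.items()}

frontal = to_frontal_2d(landmarks_3d, head_yaw=30.0)
print(frontal)
```

The same idea, with a full 3D face model instead of three points, is what lets the software compare an angled live capture against a straight-on mug shot or license photo.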

Verification or Identification

In verification, an image is matched to only one image in the database (1:1). For example, an image taken of a subject may be matched to an image in the Department of Motor Vehicles database to verify the subject is who he says he is. If identification is the goal, then the image is compared to all images in the database resulting in a score for each potential match (1:N). In this instance, you may take an image and compare it to a database of mug shots to identify who the subject is.
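
The 1:1 versus 1:N distinction can be shown with a toy matcher. The faceprint vectors, names and threshold below are all invented for illustration; real systems use much richer templates and calibrated score thresholds.

```python
import math

# Hypothetical faceprint vectors (e.g., normalized nodal-point distances).
database = {
    "alice": [1.00, 0.52, 0.77, 0.31],
    "bob":   [0.95, 0.60, 0.70, 0.40],
    "carol": [1.10, 0.45, 0.80, 0.28],
}

def verify(probe, enrolled, threshold=0.1):
    """1:1 verification: is the probe close enough to the single
    template of the claimed identity?"""
    return math.dist(probe, enrolled) <= threshold

def identify(probe, db):
    """1:N identification: score the probe against every template,
    returning (name, distance) pairs, best match first."""
    scores = [(name, round(math.dist(probe, tpl), 4))
              for name, tpl in db.items()]
    return sorted(scores, key=lambda s: s[1])

probe = [1.01, 0.50, 0.78, 0.30]
print(verify(probe, database["alice"]))  # 1:1 check against a claimed identity
print(identify(probe, database))         # ranked candidates across the database
```

Verification answers "is this person who he says he is?"; identification answers "who is this person?", and the mug-shot scenario in the paragraph above is the 1:N case.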

Facial Recognition Systems Uses

Law enforcement: Aside from the obvious background identification and history checks on arrested suspects, law enforcement uses the system to capture random faces in crowds and match them against terrorist databases.

Government agencies: Some government agencies have also been using the systems for

– security

– monitoring voter fraud

– eliminating “buddy punching” (the practice of a coworker clocking in for a friend or displaying that friend’s ID card for scanning)

– tracking foreign visitors and frequent flyers (The Department of Homeland Security has implemented a program called US-VISIT, United States Visitor and Immigrant Status Indicator Technology, aimed at foreign travelers gaining entry to the United States. When a foreign traveler receives his visa, he submits fingerprints and has his photograph taken. The fingerprints and photograph are checked against a database of known criminals and suspected terrorists. Likewise, the TSA runs its Registered Traveler program through FRS.)

Other applications currently in use include ATM and check-cashing security and access to your own laptop or desktop via the monitor’s facial recognition program.

To get back to our irate software engineer: he is correct in identifying Disneyland’s use of facial recognition software and its sharing of that data with the United States Department of Defense. This collusion is referred to as Operation Mickey Mouse (not joking) and has been in effect for decades. Who would suspect the family-friendly theme park of being a de facto arm of the government?

Now, the vast majority of us will never really notice how much facial recognition has crept into our lives, but if there is a foul-up, you can expect it to be a big deal. All government FR databases will have to be updated if a modification (e.g., a surgically enhanced face) occurs.

Our operatives: Situationally aware.

As always, stay safe.

DeepFace: FB’s Facial Recognition Software Can, and Probably Will, Follow You Everywhere Online


(We believe our readers will easily make the connection between facial recognition capabilities and their applications in law enforcement and the field of law. For this reason, we sought and received permission to reprint this article in its entirety from ExtremeTech. This technology has advanced so quickly that its implications for the future are limitless, for good, bad or indifferent as these applications resolve. Read on and draw your own conclusions.)

Facebook’s facial recognition research project, DeepFace (yes really), is now very nearly as accurate as the human brain. DeepFace can look at two photos, and irrespective of lighting or angle, can say with 97.25% accuracy whether the photos contain the same face. Humans can perform the same task with 97.53% accuracy. DeepFace is currently just a research project, but in the future it will likely be used to help with facial recognition on the Facebook website. It would also be irresponsible if we didn’t mention the true power of facial recognition, which Facebook is surely investigating: Tracking your face across the entirety of the web, and in real life, as you move from shop to shop, producing some very lucrative behavioral tracking data indeed.

The DeepFace software, developed by the Facebook AI research group in Menlo Park, California, is underpinned by an advanced deep learning neural network. A neural network, as you may already know, is a piece of software that simulates a (very basic) approximation of how real neurons work. Deep learning is one of many methods of performing machine learning; basically, it looks at a huge body of data (for example, human faces) and tries to develop a high-level abstraction (of a human face) by looking for recurring patterns (cheeks, eyebrow, etc). In this case, DeepFace consists of a bunch of neurons nine layers deep, and then a learning process that sees the creation of 120 million connections (synapses) between those neurons, based on a corpus of four million photos of faces. (Read more about Facebook’s efforts in deep learning.)

Once the learning process is complete, every image that’s fed into the system passes through the synapses in a different way, producing a unique fingerprint at the bottom of the nine layers of neurons. For example, one neuron might simply ask “does the face have a heavy brow?” — if yes, one synapse is followed, if no, another route is taken. This is a very simplistic description of DeepFace and deep learning neural networks, but hopefully you get the idea.
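
The "unique fingerprint" idea can be made concrete with a toy verification check: two photos are declared the same person when their embedding vectors point in nearly the same direction. The vectors and threshold below are invented for illustration; real DeepFace representations are far longer and the matching thresholds are empirically tuned.

```python
import math

# Hypothetical 'fingerprints' (embedding vectors) produced by the final
# layer of a face-recognition network; real ones are far longer.
emb_photo_a = [0.12, -0.45, 0.88, 0.05, -0.31]
emb_photo_b = [0.10, -0.40, 0.90, 0.07, -0.28]  # same person, new photo
emb_photo_c = [-0.70, 0.22, -0.10, 0.55, 0.40]  # a different person

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical
    direction, 0 means unrelated, negative means opposed."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def same_face(u, v, threshold=0.8):
    """Verification: two photos match if their fingerprints point in
    nearly the same direction."""
    return cosine_similarity(u, v) >= threshold

print(same_face(emb_photo_a, emb_photo_b))
print(same_face(emb_photo_a, emb_photo_c))
```

The key property, and what makes the technology so powerful, is that any new photo anywhere can be reduced to such a vector and compared against a stored one at trivial cost.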

Sylvester Stallone, going through DeepFace's forward-facing algorithm

Anyway, the complexities of machine learning aside, the proof is very much in the eating: DeepFace, when comparing two different photos of the same person’s face, can verify a match with 97.25% accuracy. Humans, performing the same verification test on the same set of photos, scored slightly higher at 97.53%. DeepFace isn’t impacted by varied lighting between the two photos, and photos from odd angles are automatically transformed (using a 3D model of an “average” forward-looking face) so that all comparisons are done with a standardized, forward-looking photo. The research paper indicates that performance — one of the most important factors when discussing the usefulness of a machine learning/computer vision algorithm — is excellent, “closing the vast majority of [the] performance gap.”

Facebook facial recognition fail

Facebook tries to impress upon us that verification (matching two images of the same face) isn’t the same as recognition (looking at a new photo and connecting it to the name of an existing user)… but that’s a lie. DeepFace could clearly be used to trawl through every photo on the internet, and link it back to your Facebook profile (assuming your profile contains photos of your face, anyway). Facebook.com already has a facial recognition algorithm in place that analyzes your uploaded photos and prompts you with tags if a match is made. I don’t know the accuracy of the current system, but in my experience it only really works with forward-facing photos, and can produce a lot of false matches. Assuming the DeepFace team can continue to improve accuracy (and there’s no reason they won’t), Facebook may find itself in the possession of some very powerful software indeed. [Research paper: “DeepFace: Closing the Gap to Human-Level Performance in Face Verification“]

What it chooses to do with that software, of course, remains a mystery. It will obviously eventually be used to shore up the existing facial recognition solution on Facebook.com, ensuring that every photo of you on the social network is connected to your account (even if they don’t show a visible tag). From there, it’s hard to imagine that Zuckerberg and co will keep DeepFace purely confined to Facebook.com — there’s too much money to be earned by scanning the rest of the public web for matches. Another possibility would be branching out into real-world face tracking — there are obvious applications in security and CCTV, but also in commercial settings, where tracking someone’s real-world shopping habits could be very lucrative. As we’ve discussed before, Facebook (like Google) becomes exponentially more powerful and valuable (both to you and its shareholders) the more it knows about you.

BNI Operatives: Street smart; info savvy.

As always; stay safe.
