In a twist on old punch-clock technology, workers at Beijing Sunrise Technology step before a device that scans their faces before they enter a secure work area.
A New Jersey health care provider in a high-crime area uses facial recognition technology at the entrance to its emergency room to determine if a person entering the building has a criminal history and may pose a danger to staff. It pays a subscription fee to access several national databases to verify the identities of people in its database. Additionally, it reportedly uses the same technology to screen vendors coming onto the premises.
Lu Shaoshua of Beijing Sunrise Technology told NTDTV, “The more advanced software saves a lot of trouble for the human resources department in checking on work attendance” and makes the company’s warehouse security more efficient.
Facial recognition technology is creeping into other business uses as well, stated Jennifer Lynch, staff attorney with the San Francisco-based Electronic Frontier Foundation, in written testimony before a U.S. Senate subcommittee hearing July 18, 2012, on privacy, technology and the law.
“Private companies are using biometric identification for everything from preventing unauthorized access to computers and corporate facilities to preventing unauthorized access to the [company] gym,” she stated. But the benefits come with dangers and risks, she added.
“Data accumulation and sharing can be good for solving crimes across jurisdictions or borders, but can also perpetuate racial and ethnic profiling, social stigma, and inaccuracies throughout all systems and can allow for government tracking and surveillance on a level not before possible,” she noted.
She was among a panel of experts at the hearing, “What Facial Recognition Technology Means for Privacy and Civil Liberties,” and she urged legislative action to curb the over-collection and overuse of biometrics. As of Sept. 12, 2012, no such legislation had been introduced or was pending in either the Senate or the House.
Biometric Data and Workplace Impact
Sen. Al Franken, D-Minn., who chairs the Senate Judiciary Subcommittee on Privacy, Technology and the Law and called the hearing, acknowledged that while “there’s nothing inherently right or wrong with facial recognition technology,” its use raises potential privacy issues.
“It’s a tool that can be used for great good,” he said. “But if we don’t stop and carefully consider the way we use this technology, it could also be abused in ways that could threaten basic aspects of our privacy and civil liberties.”
At the New Jersey health care facility, for example, the technology use is limited to the emergency room’s primary entrance, and a staff member who receives an “alert” after scanning a patient’s driver’s license does not learn why the person is flagged. That information “is filtered to high-level security personnel” at the hospital and to law enforcement, according to a 2011 Security Director News report.
Unlike fingerprints, face prints create “acute privacy concerns,” Franken said. “Once someone has your face print, they can get your name, they can find your social networking account, and they can find and track you in the street, in the stores that you visit, the government buildings you enter and the photos your friends post online.”
Using facial recognition technology beyond checking attendance or maintaining security could become a slippery slope into privacy problems if employers or their vendors begin using it to source potential job candidates.
However, Dawn A. Haag-Hatterer, J.D., SPHR, a member of the Society for Human Resource Management’s (SHRM) Technology and HR Management Special Expertise Panel, thinks the cost of the technology will be so prohibitive that many employers will not use it.
There’s also the matter of accessing an FBI or some other governmental master database and paying a requisite fee, she added.
Instead, smaller employers may opt for lower-cost alternatives that compare photos already in their systems, such as badge photos, against an employee’s unique identifier, she said.
The downside to that approach, she told SHRM Online in an e-mail, is that less-robust systems will lack the capability “of conducting an intensive analysis based on facial coordinates used in biometrics.”
The investment decision, then, is sometimes “based solely on the employer wanting to have the ‘latest and greatest’ ” in its effort to discourage time-punch fraud.
If an employer uses large outsourced recruitment agencies with the resources to use this technology, she said, it’s advisable for the service-level agreement to state whether the employer accepts the vendor’s use of facial recognition software.
Haag-Hatterer also thinks it’s a good idea to address biometric data in company policy: add a sentence or two to the company’s privacy policies stipulating how the employer might use and safeguard biometric data in the event it is collected.
“How people use the information is the deadly part,” she said. If employers make hiring decisions based on what they find in photos, such as a picture of an applicant at a political rally, she predicts an influx of discrimination charges with the U.S. Department of Labor.
Paul Belliveau, senior principal at Infosys and co-chair of SHRM’s technology special expertise panel, concurred with Haag-Hatterer on the importance of weighing the risks of using facial recognition technology.
“You don’t want to do mining just for the mining’s sake,” he said. “If there’s a compelling reason and a good business reason, then you’ll need to use caution with respect to what you’re using it for,” he told SHRM Online. He further advised employers to put some type of policy in place to cover this issue.
‘Augmented Reality’ and Privacy Issues
Facebook allows its more than 900 million users to tag the photos they upload. Unless a Facebook user opts out—a six-click process, according to Franken—Facebook’s “Tag Suggest” feature “uses face recognition to automatically match uploaded photos to other photos a Facebook user is tagged in, grouping similar photos together and suggesting the name of a user’s friend in the photo,” Lynch noted.
Because Facebook is designed to promote social engagement, it “establishes associations between and among users and between users and the companies, organizations and causes they find relevant to their lives,” she pointed out.
However, Robert Sherman, manager of privacy and public policy for Facebook, said during the subcommittee hearing that the company’s technology does not enable people to reliably identify others with whom they have no relationship. Additionally, its face templates “work only with our proprietary software”; even law enforcement cannot use them “to reliably identify an unknown person.” Facebook also has privacy settings that allow people to block others from tagging them in photos, Sherman said in written testimony, and individuals can click a link to request the removal of objectionable photos.
In Germany, data protection officials “are demanding that Facebook destroy its photographic database of faces collected” in that country and request “explicit consent” before creating a user’s digital face print, according to an Aug. 15, 2012, article in The New York Times. Data protection laws in Europe require users to give explicit consent to access their photos, it reported; Facebook requires users to opt out of this practice.
Alessandro Acquisti, Ralph Gross and Fred Stutzman, all of Carnegie Mellon University, are the authors of a 2011 study, Faces of Facebook: Privacy in the Age of Augmented Reality. It was funded by the National Science Foundation and the U.S. Army Research Office.
In the paper, they raise questions about the future of privacy in an “augmented reality” where data found both online and offline “will seamlessly blend.”
In one experiment conducted for the paper, they identified individuals on an online dating site where members use a pseudonym; in another, using students’ profile photos on Facebook, they identified students as they walked on campus.
“If an individual’s face in the street can be identified using a face recognizer and identified images from social network sites such as Facebook or LinkedIn, then it becomes possible not just to identify that individual, but also to infer additional, and more sensitive, information” using the person’s presumed name, they wrote in an online question-and-answer format on the university’s website.
Illinois’ Biometric Information Privacy Act of 2008 addresses the collection, use, safeguarding, handling, storage, retention and destruction of “biometric identifiers.”
Biometric identifiers include a retina or iris scan, fingerprint, voiceprint, hand scan, or face print. The law applies to private entities such as individuals, partnerships, corporations, limited liability companies, associations or other groups.
While Illinois is one of only a few states addressing biometric privacy, “if biometrics become prevalent, more states will regulate their use by employers,” much like recent laws pertaining to the privacy of social media passwords and the use of credit history checks for employment decisions, said Philip L. Gordon, attorney and chair of the Privacy and Data Protection Practice Group at Littler Mendelson in Denver.
If the data is not encrypted, a security breach “can trigger a legal obligation to provide notification,” he added.
There’s also an employee relations aspect to the use of facial recognition technology in the workplace, he said. With any new technology, employers need to be sensitive to the impact its use will have on employees or job applicants, Gordon pointed out.
“Employers should expect some pushback from an employee relations perspective and think about how they’re going to address that,” he said. He advised employers to be forthright with employees about using technology that could be perceived as invasive.
Gordon suggested that employers vet facial recognition technology before committing. Questions to ask: Do you really need it? Is it providing a benefit? If so, do the benefits outweigh the risks?
If the answer to the last question is yes:
- Take steps to mitigate the risks and comply with any applicable law, such as safeguarding the data against theft.
- Put policies and procedures in place to make sure the information is used appropriately in the recruitment process. That may include obtaining the individual’s consent, depending on state law. If there is no such state law, provide notice.
- Check for any state laws that impose specific legal requirements in the collection of biometric data.
- Consider the employee relations issues, and think carefully about how to communicate with your workforce about the new technology.
- Consider whether implementing the technology is a mandatory subject of collective bargaining, if the workplace is unionized.
Gordon has not had any inquiries about using facial recognition technology, nor has he seen any reports of its use in recruitment. He noted that employers are “taking on some significant risk” if they use it for that purpose.
“The benefits need to be carefully scrutinized … and weighed against the risks,” he said. “There’s always the temptation to use the newest and hottest technology out there. It’s important first to decide what the value is to the organization, whether it really is adding value.”
Kathy Gurchiek is associate editor for HR News.