Facial Recognition and Privacy Issues | UPSC



      HEADLINES:

Big Brother is watching you; actually your face

      WHY IN NEWS:

Hot from DownToEarth!

SYLLABUS COVERED: GS 2 & 3: Right to Privacy, Artificial Intelligence, Data Security

      LEARNING: 

For PRELIMS, it is important to understand this technology and the relevant reports.

For MAINS, this is an important topic. Keep an eye on its significance, market share, hurdles, legal backing and accountability issues.

      ISSUE: 

Facial recognition has become a frontline policing tool in India amid fears that it is prone to errors and allows the state to expand surveillance without much oversight.

HOW WE RECOGNISE A FACE

  • Have you ever wondered how you recognise a face?
  • Intuitively, right!
  • We often do not give much attention to this special, though not exclusive, ability of humans.
  • We humans have a ‘facial vocabulary’ that enables us to recognise at least 5,000 faces, their peculiarities and profiles.

This vocabulary is organised in such a way that we instantly create memory associations and parallels with faces.

EXAMPLE
We instantly recognise the faces of our relatives and friends, and rarely fail to place a person we have met at a social event, even if only for the second time.

  • This intuitive knowledge, which deploys millions of permutations through specialised cells and circuitry to instantly tell us who we are talking to, is an amazing biological feat.

FACIAL RECOGNITION TECHNOLOGY


  • Even more awe-inspiring are the technological interventions that are trying to replicate this biological process.

Technologies are being deployed to create and sustain a surveillance system that has never been seen before.
 

  • Our facial features — scanned through every possible source — are being converted into a gigantic data pool.
  • Using algorithms, millions of these faces can be compared and assessed to identify or verify a person.
  • This holds regardless of whether the person is a culprit, a dreaded terrorist in disguise, a visitor to a protected area or a rioter.
  • Face-recognition technology is becoming commonplace, used in most smartphones for unlocking.

EXAMPLE
Mobile applications, such as Instagram and Snapchat, use the technology to tag individuals and apply filters to photographs.

  • While there is a range of facial recognition techniques, prevalent models rely on using an image to create a mathematical representation of a person’s face; a minimal sketch of this idea follows below.
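
As a rough illustration of what such a mathematical representation looks like, here is a minimal Python sketch that compares two face "embeddings". The 128-dimension vectors, the random numbers standing in for a real model's output, and the 0.6 threshold are all assumptions made for the example, not details of any system mentioned here.

```python
# Minimal sketch: comparing two face embeddings (mathematical representations).
# The vectors below are random stand-ins; a real system would obtain them from
# a trained neural network. The 0.6 threshold is an assumed value.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return similarity between two embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(probe: np.ndarray, reference: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match when the embeddings are sufficiently similar."""
    return cosine_similarity(probe, reference) >= threshold

rng = np.random.default_rng(seed=0)
scanned_face = rng.normal(size=128)   # embedding of a freshly scanned face
stored_face = rng.normal(size=128)    # embedding stored in a database
print(same_person(scanned_face, stored_face))
```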

FAST-EMERGING MARKET

  • The global facial recognition market is projected to grow annually at 22 per cent over the next two years to become a $9.6-billion trade (see the quick calculation after this list).
  • In recent years, three-dimensional facial recognition devices have captured a significant market as retailers deploy them to gauge customers’ facial gestures.
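
A quick back-of-the-envelope check of what that projection implies for the present market size, assuming the 22 per cent growth compounds over the two years (the compounding assumption is mine; the article only gives the rate and the end figure):

$$
\text{present size} \approx \frac{\$9.6\ \text{billion}}{(1 + 0.22)^2} = \frac{9.6}{1.4884} \approx \$6.4\ \text{billion}
$$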

By assessing customers’ facial expressions and even bodily responses, retailers are able to gain better insights into consumer behaviour.

  • They can even predict how and when a buyer might purchase products in the future.
  • This helps increase sales.

DATASETS OF FACIAL ALGORITHMS

All facial recognition surveillance systems require three kinds of datasets:

  • A training dataset to prepare the system
  • A dataset of suspected people, usually called ‘people of interest’
  • Finally, the master image database (a rough sketch of how these three datasets fit together follows below)
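
Purely for illustration, here is how the three datasets might sit together in code; the field names and Python layout are my assumptions, not drawn from any actual system design.

```python
# Illustrative data layout for the three datasets a facial recognition
# surveillance system needs. Names and types are assumptions for the sketch.
from dataclasses import dataclass, field

@dataclass
class FaceRecord:
    person_id: str
    embedding: list[float]                        # mathematical representation of the face
    metadata: dict = field(default_factory=dict)  # e.g. name, age, address

@dataclass
class SurveillanceDatasets:
    training_set: list[FaceRecord]        # used to train/tune the recognition model
    people_of_interest: list[FaceRecord]  # watchlist of suspects to look for
    master_gallery: list[FaceRecord]      # master image database searched at run time
```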

The fear is that in a country as diverse as India, in terms of race and ethnicity, if the ‘other race effect’ creeps into the system, correcting it will become an uphill challenge.

WHERE TO DRAW THE LINE 

  • While many question the necessity of this technology, others have raised alarm as it can be used by the state to invade privacy and intensify mass surveillance.
  • In India, we recently witnessed this being played out in a courtroom over the February 2020 Delhi riots, where the line between conscientious citizens and anarchic rioters falls in a grey area.
  • Driving licence and voter identity databases were tapped to identify and apprehend 1,900 alleged rioters.

ACCURACY OF FACIAL RECOGNITION

Facial recognition systems deployed in India have shown an accuracy rate of less than 1 per cent.

  • The system could not even distinguish between boys and girls. –Union Ministry of Women and Child Development
  • In 2018, Delhi Police admitted in the high court that the accuracy of its facial recognition system was not more than 2 per cent.

LEVELS OF RECOGNITION

  • The first level of facial recognition includes the detection of a human face from an image or video.
  • Smartphone cameras use this to autofocus.
  • The second level involves creating a facial signature of individuals by extracting and cataloguing unique features of their face.

These may include the length of the jawline, the spacing between the eyes, dimensions of the nose, mouth and ears. 

  • At the final level, the facial signatures are compared with a database of human images and videos.
  • As the steps increase, so do the complexity and the chances of error (a minimal end-to-end sketch follows below).
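
To make the three levels concrete, here is a minimal end-to-end sketch assuming the open-source face_recognition Python library is installed; the file names and the 0.6 tolerance are placeholders, and this illustrates only the generic pipeline, not any specific deployment.

```python
# Sketch of the three levels, assuming `pip install face_recognition`.
# "cctv_frame.jpg" and "database_photo.jpg" are placeholder file names.
import face_recognition

# Level 1: detect human faces in an image.
probe_image = face_recognition.load_image_file("cctv_frame.jpg")
face_locations = face_recognition.face_locations(probe_image)

# Level 2: extract a facial signature (a 128-number encoding) per detected face.
probe_encodings = face_recognition.face_encodings(probe_image, face_locations)

# Level 3: compare each signature against a database of known faces.
known_image = face_recognition.load_image_file("database_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

for encoding in probe_encodings:
    is_match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"match={is_match}, distance={distance:.3f}")
```

The final comparison is where errors are most likely to creep in, since everything hinges on how faithfully the stored encoding represents the person.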

LIFE OF FACIAL RECOGNITION IN INDIA

  • The life of facial recognition software in India began benevolently, with the aim of identifying missing children.
  • In those circumstances, an accuracy rate of even 1 per cent is admirable.

The same statistics seem totalitarian and dystopian when they are capable of implicating citizens with criminality.

  • India is betting big on the technology and is on its way to creating one of the world’s largest face recognition-based surveillance systems.

NATIONAL AUTOMATED FACIAL RECOGNITION SYSTEM

  • The National Automated Facial Recognition System is being developed by the National Crime Records Bureau (NCRB).
  • It is claimed that the system will automatically identify and verify criminals, missing persons, unidentified bodies and unknown traced persons.

The National Automated Facial Recognition System will have a searchable visual database of “missing persons, unidentified found persons, arrested foreigners, unidentified dead bodies and criminals based around dynamic police databases”.

  • It will also have individual information, such as name, age, addresses and special physical characteristics.


  • The database will be accessible through mobile phones and will be available with the state police, along with the Union home ministry and NCRB.
  • It can be accessed by 2,500 users at the same time.

The system will also provide matching based on images of modified facial features, such as faces altered by plastic surgery, ageing or beards.

  • The project document claims that it will “play a very vital role in improving outcomes in the area of criminal identification and verification” by “quick and timely information availability”.
  • But not everyone is smitten by the government’s tall claim.

LEGAL TANGLES

  • The proposed system has no legal backing, claims Internet Freedom Foundation (IFF).
  • IFF’s notice draws strength from the Supreme Court verdict in August 2017.

JUSTICE K S PUTTASWAMY (RETD) AND ANR V UNION OF INDIA AND ORS CASE: The Supreme Court held that privacy constitutes a fundamental right under Article 21 of the Indian Constitution, which guarantees the ‘right to life and personal liberty’.

  • It added that any interference in an individual’s privacy by the state should be done only in a manner that is “fair, just and reasonable”.

It explained that the state can interfere with an individual’s privacy only if the interference is supported by law and is proportional to the objective.

  • The Information Technology Act, 2000, classifies biometric data as a type of sensitive personal data.
  • It also has rules for the collection, disclosure and sharing of such information.
  • The checks mentioned in the Act cover only ‘body corporates’ and do not apply to the state’s use of biometric facial data.
  • The proposed surveillance system is also a disproportionate response as it requires the deployment of facial recognition technology on large segments of the population without their consent.

OTHER HURDLES

  • The decision to have no legal framework means that the coercive technology is being implemented by the executive with no accountability.
  • In the Aadhaar card case, the apex court also noted that while the disclosure of information in the interest of national security cannot be faulted:

The power to make such decisions should preferably be vested in the hands of a judicial officer and not concentrated with the executive.

  • The apex court’s insistence makes sense, as the chances of misuse are higher when several departments, such as transport, hold citizens’ images.

EXAMPLE
RTOs have high-resolution images of most citizens that can serve as a natural database for face recognition programmes.

  • These could easily be combined with public surveillance or other cameras in the construction of a comprehensive system of identification and tracking.


SURVEILLANCE SYSTEM AND TRENDS

  • In Telangana, the police recently used its surveillance system to track people suspected of having the novel coronavirus disease (COVID-19).
  • The state is trying to set up the surveillance system without prior discussion or consultation about the implications of the project.

Little is known about the criteria used for the selection of the technology partner, the security protocols that are in place and the accuracy of the results.

  • Transparency about the use of FRTs (facial recognition technologies) becomes all the more important when it is used in the context of criminal investigations.
  • Even the sketchy information available in the request for proposal document released by NCRB to attract bidders shows how coercive the system could be.
  • NCRB claimed that the database will automatically collect information from CCTV cameras installed across the country.

GLOBAL OBSESSION

  • Without legal safeguards, facial recognition technology is set to undermine democratic values.
  • China, which currently has the largest facial recognition system in place, used it to identify and target pro-democracy protestors.
  • It has also set up a system to track minority communities like Uighur Muslims, mostly living in the Xinjiang province.

US authorities used the technology to hunt down protestors during the Black Lives Matter agitations.

  • In January this year, the US recorded its first case of misidentification by facial recognition technology.
  • This is the biggest fear, as most countries, including India and the US, lack a legal framework that can bring accountability to the system.

Almost 85 per cent of countries with facial recognition systems employ it for surveillance, suggests the Artificial Intelligence Global Surveillance Index 2019.

BACKGROUND

  • The technology came into existence in the late 1960s.
  • The accuracy improved in the 1970s as researchers drew on more facial markers, such as lip thickness.
  • But the real progress came in the 1980s and 1990s with new methods to locate a face in an image and extract its features, making fully automated facial recognition possible.

The most radical advances came from 2010 onwards when deep neural networks helped in the mastery of face recognition.
 

  • In 2011, the technology helped confirm the identity of Osama bin Laden when he was killed in a US raid.

CORPORATES AND FACIAL RECOGNITION SYSTEM

  • Facebook rolled out the technology for photo tagging and in 2014, its DeepFace program became the first to reach near-human performance in face recognition.
  • With the improvement in technology came the obvious exploitation.
  • IBM, for instance, has closed its facial recognition technology division.
  • Amazon has put a moratorium on the technology for a year.
  • Microsoft has announced it will not sell its facial recognition technology to the police in places without federal regulation.

But corporate restraint will not prevent the abuse of this technology — even if some firms stand down, others are eager to step in. 
 

  • Corporations are also expanding the scope of facial recognition to study and predict human behaviour.
  • Israeli company Faception has made software that it claims can read an individual’s face and predict their intelligence quotient.

      IASbhai WINDUP: 

GENDER JUSTICE

  • Delhi is currently in the process of setting up about 300,000 CCTV cameras being operated by the Delhi Police.
  • One of the stated objectives of this expansion project is to secure the safety of women.

In reality, such technologies often end up having a disproportionate negative impact on women and other marginalised groups.

  • The Delhi Police initially claimed that its surveillance system would be used specifically for identifying missing children and bodies.
  • Studies show that even highly accurate facial recognition algorithms struggle with children, as their facial signatures keep changing.
  • They also have limited success with dead bodies, which are usually mutilated or decomposed.
      SOURCES: DownToEarth

 
