Look who’s watching


In 2021, Delhi Chief Minister Arvind Kejriwal boasted that the city had beaten Shanghai, New York, and London to have the highest number of closed-circuit TV cameras per square mile.

The updated Comparitech report detailing CCTV presence in different parts of the world (excluding China) found that Delhi, with 1,446 cameras per square mile, is the most surveilled city in the world, followed by Chennai (614) and Singapore (387). 

While the Delhi Chief Minister thought this was a statistic to brag about, the truth is that the use of CCTVs raises many unanswered and disturbing questions. The increasing use of facial recognition technology (FRT) by the State governments of Punjab, Gujarat, Tamil Nadu, Andhra Pradesh and Maharashtra, for everything from law enforcement to education, raises questions about the potential misuse of this data in the absence of proper laws in the area.

The newly proposed Data Protection Bill does not deal with surveillance, unlike the previous version of the proposed law, which addressed it at least as a general principle. It is interesting that in the same month this new Bill was introduced, the government think tank NITI Aayog released a draft paper on the need for responsible use of FRT. The paper calls for a “codified data protection regime at the earliest,” and also asks that the new data protection law tackle this issue.

Surveillance technology has the potential to cause a huge humanitarian crisis. But governments justify the large-scale installation of CCTVs by saying that cameras will deter people from breaking the law and will aid investigators when needed. That argument, however, has been debunked several times: report after report claims these cameras play no role in deterring crime. Yes, you can always say, it doesn’t hurt anyone, does it? But this claim may not hold.

For instance, is there a statute of limitations that suggests how long a video can be saved? What happens to the videos taken of people who aren’t violating the law? Can the videos be sold to private companies? This is worrying, especially in a country that has no laws governing the use of CCTV cameras, the footage they capture, and their possible usage. 

There is also the Information Technology Act and the overarching right to privacy that can be invoked under the Indian legal system, but neither deals with CCTV, or the more sophisticated FRT, as a specific issue. Nor can we argue that, like AI, the technology is still evolving and lawmakers need time to process it. For reference, the first CCTV camera was installed in Germany during the Second World War.

The United Kingdom (London has the sixth highest number of CCTVs per square mile) has a clear set of rules for camera surveillance, first introduced in 2013 and updated in 2021. In no uncertain terms, these rules ensure that cameras are used for public protection and support, not spying. In the US, CCTV surveillance laws vary from state to state.

Not just cameras

As the technology matures, both camera companies and governments are increasingly adding features such as facial recognition to the mix.

China’s use of CCTVs and FRT certainly gives you an idea of the kind of programmes a central authority can build on a facial database of all its citizens. The recent cases of China cracking down on those protesting strict Covid measures should give you an idea of how terrifying CCTV surveillance coupled with artificial intelligence (AI) technology can be. 

Many regions, such as the European Union, are considering FRT as part of their larger AI regulation. In Australia, FRT is covered under the larger privacy law and also the country’s Human Rights Commission, which is developing a view on how FRT deployments should be regulated.

There are examples from India of how governments have exploited the absence of clear laws or mandates on CCTVs and FRT. The Delhi and Uttar Pradesh governments were reported to have used FRT to identify protestors at the controversial Citizenship Amendment Act protests. Yet Delhi Police itself had admitted that it purchased the facial recognition software to trace missing children, while the stated reason for deploying FRT in UP was women’s safety. Both breached those mandates.

Similarly, Haryana used drones to keep a check on farmers’ protests in 2021, and many state governments have used drone cameras to keep an eye on political rallies. If these images are run through an AI-powered facial recognition system tomorrow to, say, single out people of a specific community, there is little that an ordinary citizen could do about it.

This is why laws and boundaries are important.

Growing up in the glare of a camera

And what would the implications of this be for schools? In New Delhi and Andhra Pradesh, cameras are being installed in schools. The argument has been that this will keep teachers motivated and let parents know what their children are doing during class. The CBSE and the Telangana government also use FRT in some form or other.

Digital rights organisations have filed Right to Information requests with the Delhi government, which confirmed the use of FRT in conjunction with CCTVs in government-run schools. The Delhi government is also fighting a court case over the use of CCTV cameras installed in classrooms. Apart from the privacy aspect, there is little to no information on who handles these videos, where they are stored, and whether they are ever deleted. Can you imagine the videos of our children being sold to large companies?

There also needs to be a separate set of rules for deploying invasive technology that can potentially track minors. A framework is needed that protects individual privacy and lays out clear guidelines on where this video footage is stored, for how long, and which authority has ownership.

We are far past the argument that privacy is a rich man’s construct. It is a fundamental right of the Indian people under the Constitution. The harms of institutional, personal, criminal, and discriminatory targeted abuse of CCTV footage have been documented for as long as two decades.

We have a long way to go before the possibility of these harms is adequately addressed.
