How anonymization technology ensures the ethical use of video surveillance

Anonymization of video streams enables companies to leverage the power of AI-based video analytics while protecting personal privacy

Ethical challenges in the new era of video surveillance

For decades, video surveillance has been essential for organizations to maintain on-site safety, security, and efficiency. Recent technological developments have made video surveillance more affordable and more widely deployed than ever before. Paired with AI-driven video analytics, modern surveillance tools depend less on human intervention: they are more accurate than the human eye, never lose focus, and never get tired.

With the ever-growing power of AI, countless hours of video footage have become a wealth of actionable data. In the wrong hands, however, this data can also serve purposes that extend far beyond safety, security, or efficiency. Cameras can monitor traffic, safeguard schools, protect perimeters, alert guards, and keep citizens safe. But they can also track the movements of individuals without their permission.

How, then, do companies leverage the power of video surveillance while minimizing the adverse impact on personal privacy and democratic freedoms?

Protect rather than pursue

At Irisity, we believe in developing cutting-edge AI in the service of humanity, via security systems that detect suspicious activity rather than expose personal identity. Anonymization technology allows users to mask sensitive foreground objects in a camera’s field of view, such as people and vehicles, or entire public areas, in real time. The system can be combined with any other type of analytics, such as intrusion detection, violent behavior detection, or flame detection. Anonymization aligns with the “Protect not Pursue” philosophy by enabling monitoring companies to maintain the safety of their security assets while protecting the personal privacy of individuals captured on video.

Anonymization can be applied in two different ways, both of which help mitigate the unethical use of video surveillance data and enable users to comply with data protection laws:

  1. Permanent filter: Sensitive foreground objects in a camera’s field of view, such as people and vehicles, or entire public areas, are permanently replaced with a fully destructive pixelation filter. The original video is never backed up or saved, and the process is irreversible. Data subjects’ privacy is ensured even if the video is intercepted by an unwanted third party.
  2. Liftable mask: A more flexible option for anonymization, where the original video can be retrieved by personnel with special authorization. For example, operators at a security center handle anonymized video in their day-to-day work, while personnel with the appropriate security clearance may deactivate the mask to assist authorities during a police investigation.
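To make the "permanent filter" option concrete, the sketch below shows one common way a fully destructive pixelation filter can work: each tile inside a detected region is replaced by its average color, so the original pixels cannot be recovered from the output. This is a minimal illustration, not Irisity's actual implementation; the bounding box is assumed to come from an upstream object detector.

```python
import numpy as np

def pixelate_region(frame, box, block=16):
    """Destructively pixelate a region of a video frame, in place.

    frame : H x W x 3 uint8 image (one video frame)
    box   : (y0, y1, x0, x1) bounding box, e.g. from a person detector
    block : tile size in pixels; larger blocks destroy more detail

    Every block x block tile inside the box is overwritten with its
    mean color, so the original content is irreversibly discarded.
    """
    y0, y1, x0, x1 = box
    region = frame[y0:y1, x0:x1]
    h, w = region.shape[:2]
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = region[y:y + block, x:x + block]
            # Replace the whole tile with its per-channel mean color.
            tile[...] = tile.mean(axis=(0, 1), keepdims=True)
    return frame

# Hypothetical usage: anonymize a detected person in a synthetic frame.
frame = (np.arange(64 * 64 * 3, dtype=np.uint32) % 251).astype(np.uint8)
frame = frame.reshape(64, 64, 3)
anonymized = pixelate_region(frame.copy(), (0, 32, 0, 32), block=8)
```

Because only the averaged output ever leaves the function, streaming the result (and discarding the source frame) gives the irreversibility described above; the liftable-mask variant would instead keep an encrypted copy of the original frame accessible only to authorized personnel.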

Fully destructive anonymization reduces the administrative hurdles involved in installing new video security systems. Real-time anonymization enables the use of cameras in sensitive areas, such as schools, hospitals, and public spaces, without jeopardizing the personal privacy of children, patients, or the general public.

GDPR data breaches can cripple your business

Given how much personal data video surveillance inherently captures, regulations like the GDPR have introduced new challenges for companies operating in the security industry. With fines for violating the GDPR of up to 4% of a company’s annual worldwide turnover, operating within the legal guidelines is in the best interest of both individual privacy and the financial well-being of the organizations that handle personal data. Anonymization can help you comply with data protection laws and reduce the risk of large fines in the era of GDPR.

New perspectives for companies in the security industry

Laws are changing, and so are people’s perceptions of video surveillance. Even if the technology to identify individuals is at hand, is it always necessary? What about in schools, healthcare facilities, and homes? In many scenarios, personal privacy is key to making the very individuals we want to protect feel safe, secure, and at ease. Zooming out and changing perspective is critical when analyzing the moral aspects of our industry. Is face recognition ethical? In certain cases the technology could be crucial, but in others it may not be. What might even the right technology be used for if it ends up in the wrong hands?

At Irisity, we believe that enhanced AI performance, ethics, and privacy go hand in hand – and that improved AI ethics and integrity in the security industry will become our most important legacy.

Read more about IRIS Anonymization >>

Read more: 

G4S Lithuania Customer Story

Anonymization & GDPR: How to comply with data protection laws in video security

Anonymizing video streams enables companies to work proactively with data privacy. By making personal data unidentifiable, it greatly reduces the risk of large fines following a data breach.

Stockholm Metro

Increased safety with IRIS™ analytics at Stockholm Metro stations

To increase personal safety and security for all travelers at their stations, Stockholm Metro (SL) has decided to invest in the new AI-based surveillance system IRIS™ Rail, which will send real-time alerts whenever someone or something enters the track area.

Solutions for Alarm Central

The municipality couldn’t get a permit – installed anonymized cameras instead

More and more municipalities are applying for permits to install their own security cameras in public places. Skellefteå municipality applied, but did not get approval from the Privacy Protection Authority. Now they are installing anonymized cameras instead.