Page 8 - Campus Security Today, November/December 2023
PRIVACY IN SCHOOLS
UNDERSTANDING THE DIFFERENCES
AUTHOR
AMY BOLIN IS THE SALES DEVELOPMENT MANAGER AT I-PRO AMERICAS
When it comes to video security systems in schools, there are many misconceptions about how AI, analytics, and facial recognition intersect and overlap. Where does one technology end and the next begin? Which ones raise more privacy concerns on campuses? In this article, we will examine the differences between these technologies and address privacy issues as we aim to protect students, staff and property.
AI, MACHINE AND DEEP LEARNING
AI broadly represents a machine’s capacity to replicate human decision-making at a basic level. Machine learning, a subset of AI, allows security systems to adaptively improve their performance from exposure to data (like identifying suspicious behaviors). Deep learning, a branch of machine learning, uses intricate neural networks to thoroughly analyze extensive data.
Specifically, deep learning excels in image analysis, identifying humans, vehicles, and specific traits. These techniques have often been collectively referred to as AI. Many tech sectors have blurred the distinctions, causing confusion about this technology’s ongoing evolution.
AI can bolster security systems by rapidly processing vast data sets, discerning patterns, forecasting potential threats and initiating automated reactions. For instance, an AI-driven security system could notice an unusual human movement near a campus perimeter and notify security before any breach occurs. The system identifies a human (distinguishing it from an animal), and that information is then sent to the analytic which decides this movement is within a zone and/or time of day when humans should not be present. This triggers an alert so security teams can investigate.
Crucially, this AI-guided “human detection” remains anonymous. The only discernible features might be clothing colors or accessories like bags or hats. From a privacy perspective, this vague depiction is generally deemed non-intrusive. Moreover, this data processing can occur “on the edge,” meaning AI-enabled cameras can detect objects, relay this to an onboard analytic, and almost instantaneously notify a VMS when time is crucial.
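The zone and time-of-day logic described above can be sketched as a simple rule check. This is a minimal illustration, not the API of any particular camera or VMS; the zone names, hours, and field names are all hypothetical.

```python
from datetime import time

# Hypothetical sketch of the rule an analytic might apply to an
# anonymous "human detected" event. Zone names and the after-hours
# window are illustrative assumptions, not values from a real system.

RESTRICTED_ZONES = {"perimeter_north", "loading_dock"}
AFTER_HOURS = (time(22, 0), time(6, 0))  # 10 PM to 6 AM

def should_alert(object_class: str, zone: str, detected_at: time) -> bool:
    """Alert only when a human is detected in a restricted zone after hours."""
    if object_class != "human":        # ignore animals, vehicles, etc.
        return False
    if zone not in RESTRICTED_ZONES:   # movement elsewhere is routine
        return False
    start, end = AFTER_HOURS
    # The window wraps past midnight, so a detection matches if it falls
    # on either side of the wrap.
    return detected_at >= start or detected_at <= end
```

Note that nothing in this rule identifies who the person is; the event carries only an object class, a location, and a time.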
ANALYTICS AND PRIVACY
Many campuses use video analytics to retrospectively identify individuals after an incident. While valuable, this reactive strategy should be complemented by proactive initiatives that prevent events from escalating. Analytics can be applied to data from security cameras, sensors,
and other devices to provide actionable insights. For example, a camera’s AI-based video analytics can identify congestion in certain areas, unauthorized vehicle parking, or objects left unattended for long durations.
Modern analytics work closely with machine and deep learning to correctly identify events that warrant further attention from security staff. In campus environments, there will never be enough staff to continuously monitor video feeds, particularly when a physical presence contributes more to smooth day-to-day operations.
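One of the analytics mentioned above, flagging objects left unattended for long durations, reduces to a dwell-time check over tracked objects. The sketch below assumes hypothetical object IDs and timestamps in seconds; real analytics also handle object tracking and re-identification, which are omitted here.

```python
# Hypothetical sketch of an "object left unattended" analytic: flag any
# tracked object that has remained in view longer than a dwell limit.
# Object IDs and the five-minute threshold are illustrative assumptions.

DWELL_LIMIT_S = 300  # five minutes, in seconds

def unattended_objects(first_seen: dict[str, float], now: float) -> list[str]:
    """Return IDs of tracked objects present longer than the dwell limit.

    first_seen maps an object's tracking ID to the timestamp (seconds)
    when the tracker first observed it.
    """
    return [obj_id for obj_id, t0 in first_seen.items()
            if now - t0 > DWELL_LIMIT_S]
```

As with human detection, the output is anonymous: the analytic reports that *something* has lingered, and a staff member decides whether it matters.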
FACIAL RECOGNITION
Facial recognition technology predates the advent of AI-based cameras. This technology works by identifying or verifying a person through a comparison of selected facial features from an image or video to faces stored in a database.
However, the use of facial recognition is controversial due to privacy concerns and potential biases inherent in the algorithms. While AI can enhance the accuracy of facial recognition, the two are not the same technology. Unlike the anonymous data produced by deep learning AI algorithms, facial recognition centers on the face itself, typically comparing it to a watchlist of persons of interest.
It is important to understand that facial recognition systems do not “recognize” individuals they have not been explicitly set to search for. Thus, a facial recognition system deployed on a campus would, in most cases, not recognize the vast majority of faces it captures, because only watchlisted individuals produce a match.
It is essential to understand that facial recognition systems almost invariably necessitate a server to compare against a database of stored facial data. So, unlike anonymous AI-based analytics, facial recognition is not commonly processed “on the edge” by the camera itself. The deployment of facial recognition is deserving of careful scrutiny, especially given instances worldwide where it has been misused to infringe upon human rights.
AI, analytics and facial recognition are distinct technologies. While AI and analytics often intersect in the sphere of physical security systems, facial recognition should not be automatically tied to either technology. AI can provide the “intelligence” fueling many of these processes, analytics provides the methodology and tools for deriving insights from harvested data, and facial recognition is a specialized technique that identifies or verifies people based on their facial characteristics.
In discussions about privacy, it is vital not to lump these potent, independent technologies under a singular AI label. Such generalizations can foster misunderstandings, risking the neglect of valuable, anonymous tools vital for ensuring safety on campuses.