Articles

Toronto, September 2010: The study, supported by the Public Security Technical Program (PSTP), will examine persistent surveillance capabilities in Maritime, Great Lakes, and St. Lawrence Seaway (GLSLS) border regions, and Canada’s preparedness to counter the small vessel threat. It will employ a systematic, interdisciplinary qualitative analysis to better understand current and emerging capability gaps relating to port-based (GLSLS and maritime) and coastline/shoreline persistent small vessel surveillance, and evaluate potential technological approaches to address these gaps.

This will provide a roadmap for designing a Surveillance, Intelligence and Interdiction (SII) solution aimed at countering asymmetric threats and improving border management in the maritime and GLSLS inshore regions. This will be achieved by identifying the most suitable and complementary technologies for identifying and tracking anomalous targets and target behaviour, together with a multi-sensor information fusion approach that will allow for persistent surveillance and the accurate, robust and timely identification of small vessels, both compliant and non-compliant.
Consortium Partners:

Communications Research Centre Canada (CRC) – (Federal Lead and Project Champion)

A.U.G. Signals Ltd.

Blue Force Global Special Services Group Ltd.

CFN Consultants

AKW Global Enterprises

For more information, please contact Tatyana Litvak, the Project Manager.

Toronto, July 2010: DRDC Atlantic is developing the capability to detect, recognize and identify small sea craft in images from EO/IR sensors (visible, night vision and infrared cameras). To support these efforts, DRDC Atlantic has awarded AUG Signals a contract to develop a software tool consisting of a user interface and several specific image analysis algorithms that will further enhance small craft detection and tracking capabilities.

Detectability and the associated processing tasks depend on various parameters, including sea-background clutter, the white-water wake from the boat, and diurnal environmental conditions. The technologies to be developed under this contract include conversion of video streams to image sequences, determination of the sky-sea horizon, correction of large-scale sensor background non-uniformity, demarcation of small craft and their white-water wakes, time-range synchronization, small craft tracking, image enhancement, and pixel-based contrast metrics.
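As a rough illustration of what a pixel-based contrast metric can look like (the actual contract deliverables are not described publicly, and the masks and values below are purely hypothetical), a minimal Python sketch might measure a candidate small craft against the surrounding sea clutter:

    import numpy as np

    def target_contrast(frame, target_mask, background_mask):
        """Illustrative pixel-based contrast metric: mean target intensity
        relative to the local background, normalized by background clutter."""
        target = frame[target_mask].astype(float)
        background = frame[background_mask].astype(float)
        # Signal-to-clutter style contrast; the small epsilon guards against
        # a perfectly uniform background.
        return (target.mean() - background.mean()) / (background.std() + 1e-6)

    # Usage: in practice the masks would come from the small craft / wake
    # demarcation step mentioned above; here they are synthetic.
    frame = np.random.rand(480, 640)
    target_mask = np.zeros(frame.shape, dtype=bool)
    target_mask[200:220, 300:340] = True
    print(target_contrast(frame, target_mask, ~target_mask))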

For more information, please contact Dr. Abhijit Sinha, Project Manager.

San Francisco, June 2010: AUG Signals announced today that it has been chosen by The Artemis Project™ as a winner of the 2010 Top 50 Water Companies Competition. This award distinguishes AUG Signals as a leading company in one of the greatest high-growth industries of the 21st Century. AUG Signals was selected by a panel of industry experts based on four criteria: technology, intellectual property and know-how, team, and market potential.

“The Artemis Project’s Top 50 Water Companies Competition winners have excelled in key areas of the emerging advanced water technology sector,” said Laura Shenkar, Principal of The Artemis Project™. “We are excited to showcase these innovative companies and congratulate them for their achievements in creating solutions that will reinvent the water landscape.”

About The Artemis Project™

Established in 2000, The Artemis Project™ is a boutique consulting practice that brings unique capabilities to 21st century water management. It combines an understanding of the most advanced water resource management solutions with an international network of developers, investors and users of advanced water technology.

As the leading authority on applying advanced water solutions to business operations, The Artemis Project™ specializes in developing holistic water management strategies for major corporations. The Artemis Project™ also supports product launches of advanced water technology into business operations worldwide. The Artemis Project™ actively participates in water industry events and supports environmental policy initiatives. More information is available at http://www.theartemisproject.com.

The Artemis Project™ cooperates with the BlueTech Blog – the hub for discussion of advanced water technology. Recently, AUG Signals’ water technologies have been showcased by the BlueTech Blog. Please visit It’s Time for the Smart Water Grid and Smart Water Saves Water, Money and Lives.

Toronto, March 2010: DRDC Suffield has awarded AUG Signals a contract to develop a module that will perform fault-tolerant state estimation (FTSE) using global positioning system (GPS), inertial measurement unit (IMU), odometer and visual sensors.

The project will:

- Develop an accurate kinematic model that will contribute to precise navigation for UGVs
- Detect sensor failures quickly while maintaining a very low false alarm rate
- Isolate, accommodate or repair detected faulty sensors/sensor systems, and return a previously faulty sensor to use in the kinematic state estimation once it becomes operational again
- Develop an ego-motion module that will help demonstrate the utility of such a system, facilitate experimentation in the short run, and contribute to the development of a state-of-the-art, visual-information-based operational navigation system in the long run
- Provide a base for experimentation and support the future development of robust operational FTSE, not only for UGVs but also for other challenging applications such as UAV or UUV navigation
- Allow the sensor API module to be modified to meet future needs for sensor upgrades and varying input data types, without changing the FTSE module itself
- Facilitate fast and cost-effective future development in this area under supervision from DRDC Suffield
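As a hedged sketch of the fault-detection idea behind FTSE (this is a generic residual test, not the DRDC Suffield deliverable, and the GPS numbers are hypothetical), a filter can flag a sensor whose measurement disagrees too strongly with the predicted state:

    import numpy as np
    from scipy.stats import chi2

    def innovation_fault_test(z, z_pred, S, prob=0.999):
        """Flag a measurement as suspect when its normalized innovation
        squared (NIS) exceeds a chi-square gate."""
        nu = z - z_pred                              # innovation (residual)
        nis = float(nu @ np.linalg.inv(S) @ nu)      # normalized innovation squared
        gate = chi2.ppf(prob, df=z.size)             # e.g. a 99.9% gate
        return nis > gate, nis

    # Hypothetical GPS position residual against the filter's prediction
    z      = np.array([12.4, -3.1])   # measured position (m)
    z_pred = np.array([10.0, -2.8])   # predicted position (m)
    S      = np.diag([1.0, 1.0])      # innovation covariance (m^2)
    print(innovation_fault_test(z, z_pred, S))

A sensor that repeatedly fails such a test would be isolated, then re-admitted to the state estimate once its residuals return to normal, in line with the project objectives listed above.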

For more information, please contact Dr. Abhijit Sinha, Project Manager.

Toronto, September 2009: The end of the Cold War era sparked renewed interest in asymmetric warfare, as adversaries transformed into entities with significantly fewer resources that use significantly different tactics than conventional forces. For the Navy, this transition dictates the development of sensor systems that can detect and identify low-contrast objects, such as small boats, before they can breach a ship's defensive layers. The effectiveness of asymmetric adversaries, and the importance of counter-measures against them, is apparent from events such as the USS Cole attack and the piracy attacks near the coast of Somalia, where Canada is providing security to commercial ships as part of a multinational team.

AUG Signals has been awarded a DRDC-Atlantic contract that aims to analyze an extensive set of EO/IR image data for the purposes of:

1. Estimation of contrast signatures of small boat pixels in four EO/IR bands for different aspect angles, ranges, boat sizes and boat types.

2. Estimation of contrast signatures of white-water bow wake pixels in four EO/IR bands for different aspect angles, ranges, and boat sizes, types and speeds.

Over the course of the project, a fully automated algorithm will be developed to detect small boats and their wakes and to separate the two. This will be accomplished by a three-stage procedure:

1. CFAR detection of small boats together with their wakes in single frame images
2. Clutter reduction in CFAR detection results
3. Separation of small boats from wakes using image sequences

AUG Signals’ detection, clutter reduction and segmentation algorithms will be adapted for the demarcation of small boats and wakes using EO/IR image sequences. EO/IR contrast signatures of boats and wakes will be estimated by adapting an algorithm available in the literature.
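For readers unfamiliar with CFAR, the sketch below shows a generic two-dimensional cell-averaging CFAR detector in Python. It illustrates only the principle of adapting the detection threshold to local clutter; it is not the project's multi-stage algorithm, and the exponential-clutter scaling factor is an assumption of the example:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def ca_cfar_2d(image, guard=2, train=8, pfa=1e-4):
        """Cell-averaging CFAR over a 2D intensity image: estimate the local
        clutter level from a ring of training cells around a guard region,
        then scale the detection threshold with that estimate."""
        img = image.astype(float)
        full = 2 * (guard + train) + 1
        inner = 2 * guard + 1
        # Local sums via mean filters (sum = mean * window area)
        sum_full = uniform_filter(img, size=full) * full**2
        sum_inner = uniform_filter(img, size=inner) * inner**2
        n_train = full**2 - inner**2
        noise = (sum_full - sum_inner) / n_train
        alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)  # scaling for exponential clutter
        return img > alpha * noise

    # Synthetic sea-clutter frame with one bright "small boat" cluster
    frame = np.random.exponential(scale=1.0, size=(256, 256))
    frame[120:124, 120:124] += 25.0
    print(ca_cfar_2d(frame).sum(), "pixels flagged")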

For more information, please contact Dr. Abhijit Sinha.

Toronto, June 2009: ISTPCanada has announced it will fund AUG Signals’ and SoftTeam Solutions’ project aimed at the development of a software product that will automatically and non-invasively detect and recognize tumours using positron emission tomography and magnetic resonance imaging. Developing an automated means to detect tumours will eliminate the time-consuming, exhausting and potentially error-prone task of manually tracing out tumours, allowing medical experts to concentrate on analysis and diagnosis and thereby speeding up and improving tumour diagnosis.

Consortium Partners:

AUG Signals (Toronto)
SoftTeam Solutions (Chennai)
Hospital For Sick Children (Toronto)
Christian Medical College and Hospital (Vellore)
Dr. Kamakshi Memorial Hospital (Chennai)

For more information, please contact Tatyana Litvak.

Toronto, March 2009: Sustainable Development Technology Canada (SDTC), an arm’s-length corporation created by the Government of Canada as part of its commitment to create a healthy environment and a high quality of life for all Canadians, will support AUG Signals’ large-scale demonstration project of the Intelligent Drinking Water Monitoring System (IDWMS) in EPCOR’s Edmonton Waterworks System.

IDWMS is an early warning digital signal processing software system, with associated sensor sites, designed to provide water purveyors and the appropriate health authorities with a real-time, continuous drinking water surveillance tool that facilitates early detection and identification of waterborne anomalies, addressing accidental and intentional contamination scenarios as well as infrastructure failure. This includes identification of waterborne contaminants, both chemical and biological, and detection of water leaks resulting from ruptured pipes.
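As a loose illustration only (the IDWMS algorithms themselves are not public, and the window and threshold below are hypothetical), an early-warning check on a single water-quality parameter can be as simple as flagging departures from a rolling baseline:

    import numpy as np

    def rolling_zscore_alarms(readings, window=60, threshold=4.0):
        """Flag samples whose deviation from the recent baseline exceeds a
        z-score threshold; 'readings' is one water-quality parameter
        (e.g. conductivity) sampled at a fixed interval."""
        alarms = np.zeros(len(readings), dtype=bool)
        for i in range(window, len(readings)):
            baseline = readings[i - window:i]
            mu, sigma = baseline.mean(), baseline.std() + 1e-9
            alarms[i] = abs(readings[i] - mu) / sigma > threshold
        return alarms

    # Simulated sensor stream with a step change part-way through
    rng = np.random.default_rng(0)
    signal = np.concatenate([rng.normal(500, 2, 300), rng.normal(530, 2, 100)])
    print(np.where(rolling_zscore_alarms(signal))[0][:5])

A fielded system would, of course, fuse many parameters and sensor sites and classify the anomaly type, as described above.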

Consortium Members

A.U.G. Signals Ltd.
EPCOR Water Services
Communications Research Centre Canada
National Water Research Institute
University of Toronto
University of Calgary
FuseForward International Inc.

For more information, please contact Tatyana Litvak.

Toronto, February 2009: AUG Signals has been awarded a DRDC Atlantic Image Processing Support contract with the objective of implementing an Image Processing Library (IPL) containing several image processing tools to be applied to Automated Ship Image Acquisition (ASIA) images in order to extract specific information. The IPL suite will have the following capabilities:

1) Evaluate the quality of each ASIA image based on factors including exposure, focus, contrast, size, and obscurity.
2) Perform image enhancement.
3) Extract the required ship information (such as size, orientation, location, presence of wake, and so on) by outlining the ship in the image.
4) Recognize the text printed on the body of the ship.

IPL will be implemented by integrating several AUG Signals software modules into a complete software suite: image evaluation and enhancement module, image segmentation and detection module, motion segmentation module, multi-frame image enhancement module and OCR module.
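As a simple, hypothetical example of the kind of quality indicators an image evaluation module might compute (this is not the IPL implementation, and the file name is a placeholder), exposure, contrast and focus can each be estimated in one line with OpenCV:

    import cv2

    def image_quality_report(gray):
        """Per-image quality indicators: exposure proxy, global contrast,
        and a Laplacian-variance sharpness (focus) score."""
        return {
            "exposure": float(gray.mean()),
            "contrast": float(gray.std()),
            "focus": float(cv2.Laplacian(gray, cv2.CV_64F).var()),
        }

    gray = cv2.imread("asia_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical ASIA image
    if gray is not None:
        print(image_quality_report(gray))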

For more information, please contact Dr. Ting Liu.

AUTOMATIC IMAGE REGISTRATION (AIR)

As technology moves forward, airborne and space-borne systems will continue to generate progressively larger images – more bits per pixel and improved spatial, spectral and temporal resolutions. At the same time, the quantity of imagery and data requiring analysis will grow exponentially, and traditional manual processes of analyzing data will no longer keep pace with the deluge of information.

In addition, combining images in such a way that the relevant information is not lost or degraded continues to be a challenge. Registration is particularly challenging when fusing imagery from different sensors with different resolutions, and when fusing images of a scene acquired from different aspects. Manual image registration techniques require the user to mentally correlate the features of one image with another in an attempt to get a more comprehensive understanding of the scene in question. This process is not only tedious but error-prone as well.

To prevent expensive and vital imagery from going to waste, Automatic Image Registration (AIR) is needed to assist image analysts in performing their tasks. AIR can be used to facilitate strategic, operational and tactical decisions in both military and civilian applications.

In comparison with other registration methods, such as manual or intensity-based registration, AUG Signals’ AIR product has several advantages and innovations, providing:

  • Fully automatic multi-layer co-registration of images from similar or dissimilar sensors
    • Increased number of automatically identified control points between multi-sensor image pairs;
    • Increased spatial registration accuracy for multi-sensor image co-registration;
    • Increased similarity between the images by extracting feature layers;
    • Automatic vector-to-image data conflation

  • Fully automatic processing; the algorithm does not require any prior knowledge of the images
  • Sub-pixel accuracy, in the RMS error sense
  • Automatic registration capabilities that employ different transformation functions
  • Intelligent and efficient control point estimation, with global consistency checking to eliminate mismatched points
  • Robust operation across many different kinds of multi-sensor images
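As a generic illustration of feature-based registration with a global consistency check (not the proprietary AIR algorithm; the feature detector, thresholds and file names are assumptions of the example), the sketch below matches control points between two images and rejects mismatches with RANSAC:

    import cv2
    import numpy as np

    def register_pair(reference, target, min_matches=10):
        """Co-register 'target' onto 'reference' using automatically detected
        control points and a RANSAC homography to reject mismatched points."""
        orb = cv2.ORB_create(4000)
        k1, d1 = orb.detectAndCompute(reference, None)
        k2, d2 = orb.detectAndCompute(target, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
        if len(matches) < min_matches:
            raise RuntimeError("not enough control points")
        src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # consistency check
        h, w = reference.shape[:2]
        return cv2.warpPerspective(target, H, (w, h)), int(inliers.sum())

    # ref = cv2.imread("optical.png", 0); tgt = cv2.imread("second_sensor.png", 0)  # placeholders
    # registered, n_control_points = register_pair(ref, tgt)

Intensity-based feature detectors such as ORB generally struggle across dissimilar sensors, which is precisely the gap that the feature-layer extraction listed above is designed to close.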

Please see AUG Signals’ Automatic Image Registration Demo


 For more information or a quote, please contact us.

DETECTION AND IDENTIFICATION FOR VIDEO SURVEILLANCE SYSTEMS (DIVISS)

Intelligent video surveillance, or video analytics, is the next generation of security applications. Whether it is a crowded airport, a subway platform, a nuclear plant, an office high-rise, or a vigorously patrolled border, there is an acute need to be aware of any and all suspicious activity on the premises.

AUG Signals has developed unique multi-sensor video registration and enhancement technologies to provide better detection and identification capabilities for video surveillance.

Video Motion Analysis (VMA)

DIViSS has the capability to filter out moving objects that are inherent to a particular scene, without flagging them as a possible security breach, by placing a “virtual fence” around such disturbances. Rain, falling leaves, even moving cars will not trigger detection events, provided that these are intrinsic to the monitored environment, while objects and entities foreign to the milieu will be automatically detected. This allows robust, 24/7 outdoor system operation.

The decision as to which objects should be filtered out can be programmed by the end user (i.e. the operator instructs the system which objects to ignore – this can be based on several characteristics, such as size, shape, velocity, etc.) or self-learned. The VMA feature allows DIViSS to learn the scene, either upon being prompted by the operator or independently. This involves the system briefly switching to a “learning mode” and then continuing normal operation.

The VMA also provides the exact segmentation of each object, the number of moving objects, and the direction of movement.
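Below is a minimal sketch of the general idea in Python, assuming a learned background model and a user-defined ignore region (the fence polygon, video file name and size threshold are hypothetical, and DIViSS’ own learning mode is not shown):

    import cv2
    import numpy as np

    # Hypothetical "virtual fence": detections whose centre falls inside this
    # polygon are ignored (e.g. a tree that always moves in the wind).
    FENCE = np.array([[100, 100], [220, 100], [220, 260], [100, 260]], dtype=np.int32)

    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    cap = cv2.VideoCapture("camera_feed.avi")  # placeholder source
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                    # learned background model
        mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) < 150:                  # size-based filtering
                continue
            x, y, w, h = cv2.boundingRect(c)
            centre = (float(x + w / 2), float(y + h / 2))
            if cv2.pointPolygonTest(FENCE, centre, False) >= 0:
                continue                                  # inside the virtual fence: ignore
            print("detection at", (x, y, w, h))
    cap.release()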

Static Detection

DIViSS can detect and flag an object that has been added to the scene (e.g., an object left unattended) or removed from it (e.g., an object missing). This operation is performed automatically and in real time, includes determination of rotation and scaling factors, and does not require any prior knowledge of the object or image.
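As a simplified illustration of added/removed-object detection by differencing against a reference scene (the thresholds are hypothetical, and the rotation and scaling estimation performed by DIViSS is not shown):

    import cv2

    def static_changes(reference_gray, current_gray, min_area=200):
        """Return bounding boxes of regions that were added to or removed
        from the scene relative to a reference image of the empty scene."""
        diff = cv2.absdiff(reference_gray, current_gray)
        mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)[1]
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # suppress speckle
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]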

CFAR Target Detection

Target detection is a fundamental capability for all intelligent video surveillance applications, and the false alarms inherent to this process reduce the value of target surveillance. AUG Signals’ cutting-edge multi-Constant False Alarm Rate (CFAR) detection technologies vary the detection threshold as a function of the sensed environment, significantly reducing the false alarm rate.

Sequential frame registration – camera stabilization

Sequential images from individual sensors are registered by a global motion estimation method that achieves accurate image registration and provides the viewer with a smooth spatiotemporal evolution of events. Correct alignment of sequential images serves two purposes:

  1. Stabilizes the camera(s) in order to eliminate possible sequential frame distortion due to camera vibration in outdoor environments or camera pan-tilt-zoom operation
  2. Provides aligned frame sequence for multi-frame deblurring and super-resolution enhancement using information fusion

Sequential image registration is performed with subpixel accuracy. The registration operations are performed on grayscale versions of the original frames to reduce computational complexity (allowing real-time implementation) while maintaining almost the same level of accuracy.
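A generic global-motion-estimation sketch follows, assuming a similarity model (rotation, translation, scale) between consecutive grayscale frames; it illustrates the stabilization idea rather than the actual DIViSS method:

    import cv2
    import numpy as np

    def stabilize(prev_gray, curr_gray):
        """Estimate global (camera) motion between consecutive frames and warp
        the current frame back into the previous frame's coordinates."""
        pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                           qualityLevel=0.01, minDistance=10)
        pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts_prev, None)
        good_prev = pts_prev[status.flatten() == 1]
        good_curr = pts_curr[status.flatten() == 1]
        # Robust similarity transform as the global motion model
        M, _ = cv2.estimateAffinePartial2D(good_curr, good_prev, method=cv2.RANSAC)
        h, w = curr_gray.shape
        return cv2.warpAffine(curr_gray, M, (w, h))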

Fusion and enhancement – multi-frame fusion

While other commercially available products cannot enhance images beyond single-frame sharpening and de-noising because they are unable to accurately register the complex motions of multiple objects, AUG Signals’ unique multi-frame registration and fusion techniques enable multi-frame image enhancement (super-resolution capability).

Super-resolution increases the spatial resolution of video images on a frame-by-frame basis to meet end-user needs (e.g., to assist online investigations and increase screening capabilities on request). Based on correctly aligned sequential frames, AUG Signals’ super-resolution algorithm obtains a super-resolution image by performing multi-frame fusion. The algorithm is able to increase the spatial resolution far beyond the sensor’s resolution, facilitating the recognition and identification of fine details of an object (e.g., fine labels or other features that are smaller than the video sensor resolution).
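The simplest way to see why aligned frames help is a shift-and-add sketch: upsample each registered frame and average. This toy example, written under the assumption that the frames are already registered to subpixel accuracy, only suppresses noise and aliasing; it does not reproduce the proprietary results shown below:

    import cv2
    import numpy as np

    def shift_and_add(aligned_frames, scale=2):
        """Crude multi-frame fusion: upsample each registered frame and
        average. Accurate subpixel registration is what lets the average
        recover detail that no single frame contains."""
        acc = None
        for f in aligned_frames:
            up = cv2.resize(f.astype(np.float32), None, fx=scale, fy=scale,
                            interpolation=cv2.INTER_CUBIC)
            acc = up if acc is None else acc + up
        return (acc / len(aligned_frames)).astype(np.uint8)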

[Image comparison: original, state-of-the-art technology, AUG Signals multi-frame fusion]

AUG Signals has also developed multi-frame blur reduction algorithms to compensate for the loss in resolution of moving objects that occurs due to the finite exposure time of cameras:

[Image comparison: original, state-of-the-art technology, AUG Signals multi-frame blur reduction technology]

Fusion and enhancement – multi-sensor fusion

Recent advancements in sensing technologies have allowed the development of a wide range of cameras, from visible-light sensors (e.g., CCD, CMOS) to infrared (IR) sensors, electromagnetic radar sensors, thermal sensors and more. These different types of sensors provide complementary scene interpretation and surveillance capabilities. Multi-sensor capabilities allow video surveillance to be adapted for applications in less than ideal environments; for example, IR cameras can be used in low-light situations while CCD can be employed when sufficient lighting is available. Additionally, AUG Signals has developed the necessary technology to allow various sensors to be used simultaneously. The fusion of data streams coming in from multiple sensors provides a synergistically more comprehensive and accurate representation of the scene and enhanced object detection.

In real-time automatic implementations, computational complexity is a major constraint on multi-sensor registration algorithms: the state-of-the-art spatial-domain registration methods are all computationally expensive. AUG Signals’ multi-sensor registration algorithm is very efficient in estimating the motion parameters between the reference image and each of the other images, achieving real-time implementation while outperforming other existing frequency-domain registration methods.
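As background on why frequency-domain registration is attractive in real time, the phase-correlation sketch below estimates a pure translation between two frames with two FFTs; AUG Signals’ algorithm handles more general motion models, which this illustration does not:

    import numpy as np

    def phase_correlation_shift(reference, image):
        """Estimate the (dy, dx) translation that maps 'reference' onto
        'image' from the normalized cross-power spectrum."""
        F1, F2 = np.fft.fft2(reference), np.fft.fft2(image)
        cross = F2 * np.conj(F1)
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap shifts larger than half the image size to negative values
        if dy > reference.shape[0] // 2: dy -= reference.shape[0]
        if dx > reference.shape[1] // 2: dx -= reference.shape[1]
        return int(dy), int(dx)

    a = np.random.rand(128, 128)
    b = np.roll(a, (5, -7), axis=(0, 1))
    print(phase_correlation_shift(a, b))   # expected: (5, -7)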

[Images: reference image from sensor 1, reference image from sensor 2, registered images]

Video content authentication

The security and protection of the content of video recordings are crucial for most video surveillance applications. AUG Signals’ watermark-based authentication techniques provide tampering tracking and analysis and ensure content integrity by imposing an encoded, invisible tag (i.e. the “watermark”). Cropping or otherwise modifying the image (including frame deletion and object removal and substitution) alters the watermark, which allows the user to easily identify and pinpoint image tampering as well as estimate the degree of tampering. In addition to incorporating the most advanced security measures, AUG Signals’ video content authentication distinguishes malicious attacks on content integrity from incidental distortions.

AUG Signals’ video content authentication is invariant to image format conversion, features low computational complexity and storage requirements, and introduces no significant delay in real-time streaming applications.
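As a deliberately simplified illustration of watermark-based tamper localization (AUG Signals’ scheme is more sophisticated, in particular in distinguishing malicious attacks from incidental distortions, which this fragile sketch cannot do), a keyed least-significant-bit watermark can localize edited regions:

    import numpy as np

    def embed_watermark(frame, key=1234):
        """Embed a keyed pseudo-random binary watermark in the least
        significant bit of every pixel of a grayscale frame."""
        rng = np.random.default_rng(key)
        mark = rng.integers(0, 2, size=frame.shape, dtype=np.uint8)
        return (frame & 0xFE) | mark, mark

    def tamper_map(frame, mark):
        """Per-pixel map of LSB disagreement with the expected watermark;
        clusters of disagreement localize the tampering."""
        return (frame & 1) != mark

    frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
    marked, mark = embed_watermark(frame)
    marked[100:120, 100:140] = 0          # simulated object removal
    print(tamper_map(marked, mark).sum(), "suspicious pixels")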

 
