AXIS Object Analytics
User Manual

AXIS COMMUNICATIONS

Solution overview

About the application
AXIS Object Analytics detects and classifies moving objects, specifically humans or vehicles. You can choose different conditions for detection, such as objects that move or stay longer than a set time (BETA) within a predefined area, or that cross a defined line. Upon detection, Axis network devices or third-party software can perform different actions, such as recording video, playing an audio message, or alerting security staff.

Considerations

For best results, the camera must be appropriately mounted, and there are requirements on the scene, the image, and the objects.

The following elements characterize an appropriately mounted camera:

  1. Mounting height
  2. Tilt
  3. Detection area
  4. Minimum detection distance
  5. Maximum detection distance

The following considerations apply to the mounting of all camera types:

Mounting position

If you mount the camera so that it looks down on the scene from directly above, it’s difficult for the application to classify objects.

Tilt

The camera must be sufficiently oriented towards the ground so that the center of the image is below the horizon. Mount the camera so that the minimum detection distance is longer than half of the camera’s mounting height (minimum detection distance > camera mounting height / 2).

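As a quick worked check of this rule, the following sketch (illustrative Python, not part of the product) tests whether a planned mounting satisfies the inequality:

    # Illustrative check of the tilt rule: the minimum detection distance
    # must be longer than half the camera's mounting height.
    def tilt_ok(mounting_height_m: float, min_detection_distance_m: float) -> bool:
        return min_detection_distance_m > mounting_height_m / 2

    print(tilt_ok(3.0, 1.4))  # False: 1.4 m is not longer than 3.0 m / 2
    print(tilt_ok(3.0, 2.0))  # True
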
Detection area

An object’s point of detection must be inside the detection area. For a human, the point of detection is at the feet; for a vehicle, it’s at the center.

Maximum detection distance

The maximum detection distance depends on:

  • Camera type and model
  • Camera lens. A higher focal range allows for a longer detection distance.
  • Weather. For example, heavy rain or snow can affect the detection distance and accuracy.
  • Light. Detection accuracy and range can be affected by insufficient illumination.
  • Camera load

We recommend that you use AXIS Site Designer to determine the maximum detection distance for different camera models at your site.

Roll

The camera’s roll angle must be nearly equal to zero. This means the image should be level with the horizon.

Field of view

The camera’s field of view must be fixed.

Vibrations

The application tolerates small camera vibrations, but you get the best performance when the camera is not subject to vibrations.

Object size

For a human to be detected, the minimum height is 4% of the total image height. For a vehicle, the minimum height is 3% of the total image height. However, this requires perfect image conditions and no obstructions to the view. To minimize the risk of missed detections, we recommend a height of at least 8% for humans and 6% for vehicles.

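To make these percentages concrete, here is a small illustrative calculation; the 1080-line image height is an assumption for the example:

    # Convert the minimum-height percentages above into pixels for a
    # hypothetical image height of 1080 lines.
    IMAGE_HEIGHT = 1080
    minimums = {"human": 0.04, "vehicle": 0.03}      # absolute minimums
    recommended = {"human": 0.08, "vehicle": 0.06}   # to reduce missed detections

    for obj in minimums:
        print(f"{obj}: minimum {minimums[obj] * IMAGE_HEIGHT:.0f} px, "
              f"recommended {recommended[obj] * IMAGE_HEIGHT:.0f} px")
    # human: minimum 43 px, recommended 86 px
    # vehicle: minimum 32 px, recommended 65 px
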
Object visibility

Detection accuracy can be affected:

  • if objects are only partially visible due to, for example, foliage. It’s particularly important that characteristic features, such as legs or wheels, are visible.

  • when the scene is crowded with objects, for example during traffic congestion.

  • when you use the time in area (BETA) functionality, if objects frequently overlap each other.

Contrast

There needs to be a certain level of contrast between objects and the background. Fog, direct light shining on the camera, or an overly noisy image can cause contrast issues. You can increase the level of illumination and adjust the image settings to improve the level of contrast.

  • When you use a day-and-night camera with artificial lighting, we recommend at least 50 lux in the entire detection area.
  • When you use built-in IR lighting, the maximum detection distance depends on the camera and the environment.

Expected movement of objects in the scene

Objects that approach the camera in a straight line need to move for a longer time before they get detected compared to objects that move perpendicular to the camera’s field of view.

Human pose

Humans need to move in a somewhat upright position.

Object motion

Objects need to move within the scene for at least 2 seconds.

Recommended image settings

Before you start to use the application, we recommend that you turn on Forensic WDR and barrel distortion correction, if they are available for your camera.

Barrel distortion is a lens effect where straight lines appear increasingly bent closer to the edges of the frame.

Conditions where detections can be delayed or missed

  • Fog
  • Direct light shining on the camera
  • Inadequate light
  • Overly noisy image

Situations that can trigger false alarms

  • Partially hidden people or vehicles. For example, a small van that appears from behind a wall can look like a person since the vehicle is high and narrow.
  • Insects on the camera lens. Note that day-and-night cameras with infrared spots attract insects and spiders.
  • A combination of car headlights and heavy rain.
  • Human-size animals.
  • Strong light causing shadows.

Get started

  1. Log in to the product’s webpage as an administrator and go to Settings > Apps > AXIS Object Analytics.

  2. Select the application.

  3. Start the application and click Open.

  4. In the welcome screen, click Step-by-step to follow the recommended setup procedure.

  5. In step 1, read through the considerations.

  6. In step 2, select whether you want the application to detect humans, vehicles, or both. Read more about Classification of objects.

  7. Select if you want the application to detect when objects move or stay longer than a set time (BETA) inside a defined area, or when they cross a defined line. To learn more, see Object in area and Line crossing.

  8. For PTZ cameras, you can choose to restrict detection to a specific preset position. Select it from the list.

  9. Adjust the default line that objects need to cross, or the area where objects need to move.
    To find out how to adjust the default line or include area, see Adjust virtual line or area.

  10. In step 3, verify your settings.

You have now created one scenario. To rename or modify it, click Open.

To create more scenarios, click +.

To watch the following videos, go to the web version of this document: www.axis.com/products/online-manual/62075#t10056465

  • Create scenario: object in area
  • Create scenario: time in area
  • Create scenario: line crossing

Adjust virtual line or area
  • To reshape a virtual line or area, click and drag one of the anchor points.
  • To move a virtual line or area, click and drag.
  • To remove a corner, right-click the corner.

Virtual line

  • To reset the virtual line to its default size, click Scene > Reset line.
  • To change the direction that objects should move to be detected, click Scene > Change trigger direction. The red arrows next to the line show the current direction. Actions trigger when objects cross the line in the direction of the arrows.

Area

  • To reset the include area to its default size, click Scene > Reset area.
  • To create an area inside the include area where you don’t want objects to be detected, click Scene > Add exclude area.

Configure the application

Modify a scenario

To modify a scenario, click Scenarios and click Open in the scenario card.

  • To rename the scenario, click the edit icon.

  • To change what type of objects to detect, click Triggering objects.
    Note
    If you select Any motion, the application doesn’t classify objects. Instead, the application detects any object that moves in the scene. It can, for example, be animals, swaying foliage, flags, or shadows. To ignore small objects or objects that only appear for a short time, you can use filters. For more information, see Filters.

  • In an Object in area scenario: To allow objects to stay inside the include area for a certain time before the application triggers an action, click Triggering objects and turn on Time in area (BETA). Set the allowed time.

  • To adjust the virtual line or area, click Scene.

Calibrate perspective

If the scene has a significant depth, you need to calibrate the perspective to remove false alarms due to small objects. During calibration, the application compares the height of the objects as they appear in the image with the actual heights of the corresponding physical objects. The application uses the calibrated perspective to calculate the object size.

Place vertical bars in the image to calibrate perspective. The bars represent physical objects at different distances from the camera.

  1. Go to Settings > Advanced > Perspective and click +.

  2. In the live view, choose two objects of the same known height that are located on the ground and at different distances from the camera.
    You can use, for example, fence poles or a human.

  3. Place the bars by the objects and adjust the length of each bar to the height of the object.

  4. Select the scenarios you want to apply the perspective to.

  5. Enter the height of the objects in Perspective bar height.

  6. Click Save.

Example
If there is a fence with 2-meter-high poles extending from the camera towards the horizon, position the bars at the fence poles, adjust their lengths, and enter 200 cm (6 ft 7 in) in the fields.

Important
Make sure the bars don’t overlap each other in height.
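
To make the geometry concrete, here is a minimal sketch of the kind of size estimate a perspective calibration enables. The linear model, function name, and pixel values are illustrative assumptions, not the application’s actual algorithm.

    # Two calibration bars of known physical height (e.g. 200 cm fence poles)
    # give the expected pixel height of such an object at any image row.
    def expected_pixel_height(y_row, bar1, bar2):
        # bar1, bar2: (image_row, pixel_height) for the two reference bars
        (y1, h1), (y2, h2) = bar1, bar2
        slope = (h2 - h1) / (y2 - y1)
        return h1 + slope * (y_row - y1)

    # Hypothetical values: a near pole spans 120 px at row 900, a far one
    # spans 30 px at row 400. A 2 m object at row 650 should then span about:
    print(expected_pixel_height(650, (900, 120), (400, 30)))  # 75.0 px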

Add burnt-in metadata overlays to video streams

To show the event that was detected in the live and recorded video stream, turn on metadata overlay. When you turn on metadata overlay, the application shows:

  • A rectangle around detected objects.
  • The area or line of the scenario where the object was detected.

If you turn on trajectories, the application also shows a line that outlines the path that an object has taken.

If several scenarios get triggered at the same time, overlays are shown for all of them in all streams with the selected resolution.

Important
The metadata overlays are burnt into the video stream at the selected resolution. You can’t remove them from recorded video.

Note
If you use view areas, the metadata overlays only appear in the first view area. The default name of the first view area is View area 1.

  1. In the application’s webpage, go to Settings > Advanced and, depending on your camera:

    • Turn on Metadata overlay.
    • Under Metadata overlay, select the resolution in which burnt-in metadata overlays should appear. You can only select one resolution, and the setting applies to all scenarios.
  2. To show the path an object has taken, select Trajectories.

Restrict detection to a PTZ preset position

For PTZ cameras, you can restrict detection to a specific preset position.

  1. Go to Scenarios and click Open in a scenario card, or click + to create a new scenario.
  2. Click Scene and select a preset position from the list.

Note
Each time the preset position changes, the application needs to recalibrate. We recommend that you wait at least 15 seconds before you change between preset positions in a guard tour.

Learn more

Classification of objects

The application can classify two types of objects: humans and vehicles. The application shows a rectangle around classified objects. Objects classified as humans get a red rectangle, and objects classified as vehicles get a blue rectangle.

For cameras with deep learning, vehicles can be further categorized into trucks, buses, cars, and bikes.

If you use the time in area (BETA) functionality, the rectangle is yellow until the time condition has been fulfilled. If the object then stays inside the include area for another 30 seconds, the rectangle becomes dashed.

Each classified object has a point of detection that the application uses to decide if an object is inside or outside an include area, or when it crosses a virtual line. For a human, the point of detection is at its feet, and for a vehicle, it’s at its center. If a human’s feet or a vehicle’s center is obstructed from the camera’s view, the application assumes the location of the point of detection.

Note: We recommend that you take the assumed location of an object’s point of detection into consideration when you draw the include area or virtual line.
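
The point-of-detection geometry can be summarized in a small illustrative sketch; the bounding-box representation and coordinates below are assumptions for the example, not the application’s internal format.

    # Derive a point of detection from a bounding box (x, y, width, height),
    # with y growing downward, as the text above describes.
    def point_of_detection(box, object_class):
        x, y, w, h = box
        if object_class == "human":
            return (x + w / 2, y + h)      # at the feet: bottom center
        return (x + w / 2, y + h / 2)      # vehicle: center of the box

    print(point_of_detection((100, 50, 40, 120), "human"))     # (120.0, 170.0)
    print(point_of_detection((300, 200, 160, 80), "vehicle"))  # (380.0, 240.0)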

For the best possible results:

  • At some point, the entire object needs to be visible in the scene.
  • The object needs to be in motion within the scene for at least 2 seconds.
  • For cameras with machine learning, humans need to move in a somewhat upright position. For cameras with deep learning, this is not a requirement.
  • The upper body of a human needs to be visible.
  • Objects need to stand out from the background.
  • Reduce motion blur.

Object in area

When you use the trigger condition Object in area, the application detects objects that move inside a defined area. This area is called an include area.

You can turn on time in area (BETA) to set a time limit for how long an object is allowed to stay inside the include area before the application triggers an action. When an object enters the include area, the counter starts. If the object moves out of the include area before the set time limit is reached, the counter resets. It’s the object’s point of detection that must be inside the include area for the counter to keep counting. The time in area (BETA) functionality is suitable for areas where humans or vehicles are only supposed to stay for a short while, like tunnels or school yards after hours.
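
The counter behavior can be captured in a minimal sketch (illustrative logic only, not the application’s source):

    # Time-in-area logic as described above: the counter resets whenever the
    # object's point of detection leaves the include area.
    def triggers(inside_per_second, time_limit_s):
        counter = 0
        for inside in inside_per_second:              # one sample per second
            counter = counter + 1 if inside else 0    # leaving resets the counter
            if counter >= time_limit_s:
                return True                           # action triggers
        return False

    print(triggers([True, True, False, True, True], 3))  # False: the exit reset it
    print(triggers([True, True, True], 3))               # True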

Include area

The include area is the area where the application detects selected object types. The application triggers actions for an object if its point of detection is inside the include area. The application ignores objects that are outside the include area.

Reshape and resize the area so that it only covers the part of the scene where you want to detect objects. If you use the time in area (BETA) functionality, it’s important to only include parts of the scene where objects don’t frequently overlap each other. The default include area rectangle can be changed to a polygon with up to 10 corners.

Recommendation

If there’s a busy road or sidewalk close to the include area, draw the include area so that objects outside the include area don’t accidentally get detected. This means you should avoid drawing the include area too close to the busy road or sidewalk.

Exclude areas

An exclude area is an area inside the include area in which selected object types don’t get detected. Use exclude areas if there are areas inside the include area that trigger a lot of unwanted actions. You can create up to 5 exclude areas.

Move, reshape, and resize the area so that it covers the desired part of the scene. The default rectangle can be changed to a polygon with up to 10 corners.

Recommendation

Place exclude areas inside the include area. Use exclude areas to cover areas where you don’t want to detect objects.

Line crossing

When you use the trigger condition Line crossing, the application detects objects that cross a virtually defined line.

The virtual line is a yellow line in the image. Objects of the selected type that cross the line in a certain direction get detected. The red arrows on the line show the current direction. Actions trigger when objects cross the line in the direction indicated by the arrows.

To trigger an action, the object’s point of detection must cross the line. Objects that only touch the line don’t trigger actions. For example, a man approaching the line doesn’t trigger an action until his point of detection has crossed it; once it has, he does.

For information about the point of detection, see Classification of objects.
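
The crossing test can be pictured with a small geometric sketch; this illustrates the concept only and is not the application’s implementation:

    # A point crosses a directed virtual line when the sign of the 2D cross
    # product flips, i.e. the point moves from one side of the line to the other.
    def side(a, b, p):
        # > 0 if p is left of the line a->b, < 0 if right, 0 if on the line
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    def crosses_in_direction(a, b, prev_p, curr_p):
        return side(a, b, prev_p) < 0 <= side(a, b, curr_p)

    line_start, line_end = (0, 0), (0, 10)  # a vertical virtual line
    print(crosses_in_direction(line_start, line_end, (1, 5), (-1, 5)))  # True
    print(crosses_in_direction(line_start, line_end, (-1, 5), (1, 5)))  # False: wrong direction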

Virtual line recommendations

Adjust the virtual line so that:

  • objects are clearly visible in the image before they cross the line.
  • an object’s point of detection is likely to cross the line.
  • objects can’t enter the protected area without passing the line.

Integration

Set up alarms in AXIS Camera Station

This example explains how to set up a rule in AXIS Camera Station to alert the operator and record video that includes metadata overlays when AXIS Object Analytics detects an object.

Before you start

You need:

  • an Axis network camera with AXIS Object Analytics set up and running, see Get started.
  • metadata overlays turned on in the application, see Add burnt-in metadata overlays to video streams.
  • a computer with AXIS Camera Station installed.

To watch a video of this example, go to the web version of this document: www.axis.com/products/online-manual/62075#t10145372

Add the camera to AXIS Camera Station

  1. In AXIS Camera Station, add the camera. See the user manual for AXIS Camera Station.

Create a device event trigger

  1. Go to Configuration > Recording and events > Action rules and click New.
  2. Click Add to add a trigger.
  3. Select Device event from the list of triggers and click Ok.
  4. In the Configure device event trigger section:
    • In Device, select the camera.
    • In Event, select one of the scenarios for AXIS Object Analytics.
    • In Trigger period, set an interval time between two successive triggers. Use this function to reduce the number of successive recordings. If an additional trigger occurs within this interval, the recording continues and the trigger period starts over from that point in time (see the sketch after these steps).
    • In Filters, set active to Yes.
  5. Click Ok.
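
The trigger-period behavior referenced above can be sketched as follows (illustrative logic, with hypothetical trigger times in seconds):

    # Triggers that fall inside the trigger period extend the ongoing recording
    # instead of starting a new one; each trigger restarts the period.
    def count_recordings(trigger_times_s, trigger_period_s):
        recordings, window_end = 0, None
        for t in trigger_times_s:
            if window_end is None or t >= window_end:
                recordings += 1                    # a new recording starts
            window_end = t + trigger_period_s      # the period starts over
        return recordings

    print(count_recordings([0, 5, 8, 40], 10))  # 2: the first three triggers merge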

Create actions to raise alarms and record video

  1. Click Next.

  2. Click Add to add an action.

  3. Select Raise alarm from the list of actions and click Ok.
    Note
    The alarm message is what the operator sees when an alarm is raised.

  4. In the Alarm message section, enter an alarm title and description.

  5. Click Ok.

  6. Click Add to add another action.

  7. Select Record from the list of actions and click Ok.

  8. In the list of cameras, select the camera to use for recording.
    Important
    To include metadata overlays in the recording, make sure you select a profile with the same resolution as the one selected for metadata overlays in the application.

  9. Select a profile and set the prebuffer and postbuffer.

  10. Click Ok.

Specify when the alarm is active

  1. Click Next.
  2. If you only want the alarm to be active during certain hours, select Custom schedule.
  3. Select a schedule from the list.
  4. Click Next.
  5. Enter a name for the rule.
  6. Click Finish.

Note
To see the metadata overlays in the live view, make sure you select the streaming profile that matches the one you set in the application.

Record video when an object gets detected

This example explains how to set up the Axis device to record video to an SD card when the application detects an object.

  1. In the device’s web interface, go to Apps and make sure the application is started. (To verify this from a script instead, see the sketch after these steps.)

  2. To check that the SD card is mounted, go to System > Storage.

  3. Go to System > Events and add a rule.

  4. Type a name for the rule.

  5. In the list of conditions, under Application, select the application scenario. To trigger the same action for all scenarios, select Object Analytics: Any Scenario.

  6. In the list of actions, under Recordings, select Record video.

  7. In the list of storage options, select SD-DISK.

  8. Select a Camera and a Stream profile.
    To show metadata overlays, make sure metadata overlay is turned on in the application for the same resolution as the one used in the stream profile.
    Note
    We don’t recommend using a scenario with time in area (BETA) to trigger recordings if the time an object is allowed to stay inside the include area is more than 30 seconds. The reason is that it’s challenging to use a prebuffer time longer than 30 seconds, which is required if you want to see what happened before the object was detected.

  9. If you want the recording to start before the object was detected, enter a Prebuffer time.

  10. Click Save.

  11. To test the rule, go to the application’s webpage and open the scenario. Click Test alarm. This generates an event, as if the scenario had triggered for real. If you have turned on metadata overlays, a red or blue rectangle appears.
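
If you prefer to check step 1 from a script, the VAPIX application API can list installed applications and their status. The device address and credentials below are placeholders; treat this as a hedged sketch rather than a complete integration.

    # Query the device for installed ACAP applications and their status.
    import requests
    from requests.auth import HTTPDigestAuth

    resp = requests.get(
        "http://192.168.0.90/axis-cgi/applications/list.cgi",  # placeholder address
        auth=HTTPDigestAuth("root", "pass"),                   # placeholder credentials
        timeout=5,
    )
    resp.raise_for_status()
    print(resp.text)  # XML listing of applications, including their running state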

Troubleshooting

Problems detecting objects

  • When the image is unstable: turn on Electronic image stabilization (EIS) in the Image tab of the product’s webpage.
  • At image edges, where the image looks distorted: turn on Barrel distortion correction (BDC) in the Image tab of the product’s webpage.
  • Immediately after objects appear: objects need to be fully visible in the scene before the application can detect them.
  • In other situations: the objects may blend into the background if they have a similar color, or the light in the scene may be poor. Try to improve the light.

Problems with false alarms

  • Due to small animals that appear large in the image: calibrate the perspective. See Calibrate perspective.

Problems with metadata overlays

  • On a second client: metadata overlays are only visible for one client at a time.

Problems with the video stream

  • In the Firefox browser for cameras with high resolutions: try the Google Chrome™ browser instead.

Filters

If you have set up the application to detect any motion, you may experience false alarms. You can then use filters.

  • Short-lived objects – Use this to ignore objects that only appear in the image for a short period of time.
  • Small objects – Use this to ignore small objects.
  • Swaying objects – Use this to ignore objects that only move a short distance.

Filter recommendations

  • Filters are applied to all moving objects found by the application and should be set up with care to make sure that no important objects are ignored.
  • Set up one filter at a time and test it before you turn on another filter.
  • Change the filter settings carefully until you’ve reached the desired result.

The short-lived objects filter

Use the short-lived objects filter to avoid detecting objects that only appear for a short period of time, such as light beams from a passing car or quickly moving shadows.

When you turn on the short-lived objects filter and the application finds a moving object, the object doesn’t trigger an action until the set time has passed. If the action is to start a recording, configure the pre-trigger time so that the recording also includes the time the object moved in the scene before it triggered the action.
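
For example (values hypothetical): with a 3-second filter, a recording that should show the object’s movement before the trigger needs a pre-trigger time of at least the filter time plus a margin.

    # Prebuffer arithmetic for the short-lived objects filter (illustrative).
    filter_time_s = 3                  # hypothetical filter setting
    margin_s = 2                       # extra context before the object appeared
    prebuffer_s = filter_time_s + margin_s
    print(prebuffer_s)                 # 5 seconds of prebuffer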

Set up the short-lived objects filter

  1. Click Scenarios and select an existing scenario or click + to create a new scenario.
  2. Click Triggering objects and make sure Any motion is selected.
  3. Go to Filters > Short-lived objects.
  4. Enter the number of seconds in the field. The number of seconds is the minimum time that must pass before the object triggers an action. Start with a small number.
  5. If the result is not satisfactory, increase the filter time in small steps.

The swaying objects filter

The swaying objects filter ignores objects that only move a short distance, for example swaying foliage, flags, and their shadows. If the swaying objects are large, for example large ponds or large trees, use exclude areas instead of the filter. The filter is applied to all detected swaying objects and, if the value is too large, important objects might not trigger actions.

When the swaying objects filter is turned on and the application detects an object, the object doesn’t trigger an action until it has moved a distance larger than the filter size.
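
As an illustration of this threshold, here is a sketch that assumes displacement is measured in pixels against a filter size given as a percentage of the image width; the application’s real measurement may differ.

    import math

    # An object only triggers once it has moved farther from its starting
    # point than the filter distance.
    def exceeds_filter(path, filter_pct, image_width_px):
        limit_px = filter_pct / 100 * image_width_px
        x0, y0 = path[0]
        return any(math.hypot(x - x0, y - y0) > limit_px for x, y in path)

    swaying = [(100, 100), (104, 98), (101, 103)]   # small back-and-forth motion
    walking = [(100, 100), (140, 100), (200, 100)]  # steady movement
    print(exceeds_filter(swaying, 3, 1920))   # False: limit is about 58 px
    print(exceeds_filter(walking, 3, 1920))   # True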

Set up the swaying objects filter

You set the filter size with an ellipse in the live view. The filter ignores any object that moves a shorter distance than the distance from the center to the edge of the ellipse.

Note

  • The filter applies to all objects in the image, not just objects in the same position as the setup ellipse.
  • We recommend that you begin with a small filter size.
  1. Click Scenarios and select an existing scenario or click + to create a new scenario.
  2. Click Triggering objects and make sure Any motion is selected.
  3. Go to Filters > Swaying objects.
  4. Enter how far objects are allowed to move, as a percentage of the screen, before an action triggers.

The small objects filter

The small objects filter reduces false alarms by ignoring objects that are small, for example small animals.

Note

  • The filter applies to all objects in the image, not just objects in the same position as the setup rectangle.
  • The application ignores objects that are smaller than both the entered height and the entered width.
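
The “smaller than both” condition reads as the following predicate (illustrative sketch, with sizes as a percentage of the image):

    # An object is ignored only if it is smaller than BOTH filter dimensions.
    def ignored(obj_w, obj_h, filter_w, filter_h):
        return obj_w < filter_w and obj_h < filter_h

    print(ignored(2, 3, 5, 5))  # True: smaller in both width and height
    print(ignored(6, 3, 5, 5))  # False: wider than the filter width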

Set up the small objects filter

  1. Click Scenarios and select an existing scenario or click + to create a new scenario.

  2. Click Triggering objects and make sure Any motion is selected.

  3. Go to Filters > Small objects.
    Note
    If you have calibrated the perspective, enter the width and height of the objects to ignore in centimeters (inches) instead of as a percentage of the image.

  4. Enter the width and height of the objects to ignore as a percentage of the image.

User Manual
AXIS Object Analytics
Ver. M8.6
Date: June 2022
Part No. T10144039
© Axis Communications AB, 2019 – 2022
