Overview Fisheye Node

The fisheye node is used to pre-process fisheye camera input feeds. It provides warping/dewarping and distortion/undistortion of incoming frames.

Input and Output
  1. Input: A frame grabbed from a video file, an IP camera, or a USB camera.
  2. Output: Redis (default), TCP, or UDP frame carrying the distort/undistort/warp/dewarp result (see the consumer sketch below).
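
Since Redis is the default output, a downstream service can read the processed frame back with a few lines of OpenCV. The sketch below is illustrative only; the Redis key name ("fisheye:frame") and the JPEG encoding of the frame payload are assumptions, not documented behaviour of the node.

  # Illustrative Redis consumer for the node's output frame.
  # Assumptions (not documented): frames are stored under the key
  # "fisheye:frame" as JPEG-encoded bytes.
  import cv2
  import numpy as np
  import redis

  r = redis.Redis(host="localhost", port=6379)
  raw = r.get("fisheye:frame")                      # assumed key name
  if raw is not None:
      buf = np.frombuffer(raw, dtype=np.uint8)
      frame = cv2.imdecode(buf, cv2.IMREAD_COLOR)   # bytes -> BGR image
      cv2.imwrite("dewarped.jpg", frame)
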
Node Parameters
The following parameters are used in the Fisheye node.

Name: Input the node name used in a specific flow.
  1. default: fisheye
  2. type: string
Mode: Lets you select the mode from the available options [Distort, Undistort, Warp, Dewarp].
Overwrite Model: Determines whether to calculate a new fisheye calibration matrix (based on your input) or to use the pre-calculated matrix.
  1. default: true
  2. type: boolean
Depending on the mode you select, you will be presented with different parameters to configure.

Warp/Dewarp Parameters: Define the parameters that configure the warp/dewarp mapping (a sketch of one possible polar unwrap follows this list):
  1. dewarp_start_angle: the angle (degrees) at which the dewarp starts
  2. dewarp_cover_angle: the angle (degrees) covered from start to end
  3. dewarp_radius_outer: the outer radius (pixels) of the dewarp calibration (polygon transformation)
  4. dewarp_radius_inner: the inner radius (pixels) of the dewarp calibration (polygon transformation)
  5. dewarp_scale_factor: the zoom ratio
  6. dewarp_aspect_ratio: the aspect ratio between width and height
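
The node's exact warp/dewarp implementation is not documented, but the parameters above map naturally onto a polar unwrap of the fisheye annulus. The following sketch shows one possible interpretation using OpenCV's remap; the assumption that the fisheye circle is centred in the frame is an illustration choice, not the node's behaviour.

  # One possible polar dewarp driven by the parameters above (illustrative,
  # not the node's actual implementation). Assumes the fisheye circle is
  # centred in the input frame.
  import cv2
  import numpy as np

  def dewarp(img, start_angle, cover_angle, r_outer, r_inner,
             scale_factor=1.0, aspect_ratio=1.0):
      h, w = img.shape[:2]
      cx, cy = w / 2.0, h / 2.0                         # assumed fisheye centre

      out_h = int((r_outer - r_inner) * scale_factor)   # radial extent -> rows
      out_w = int(out_h * aspect_ratio)                 # width from aspect ratio

      # For every output pixel, find the matching polar coordinate in the source.
      ys, xs = np.meshgrid(np.arange(out_h), np.arange(out_w), indexing="ij")
      angle = np.radians(start_angle) + xs / out_w * np.radians(cover_angle)
      radius = r_inner + ys / out_h * (r_outer - r_inner)
      map_x = (cx + radius * np.cos(angle)).astype(np.float32)
      map_y = (cy + radius * np.sin(angle)).astype(np.float32)
      return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

  panorama = dewarp(cv2.imread("fisheye_frame.jpg"),
                    start_angle=0, cover_angle=360,
                    r_outer=540, r_inner=100)
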
Distort/Undistort Parameters: Define the parameters to configure the distort/undistort calibration matrix. The following formula is applied:

  distort_matrix = [
    [(mat_w * .5) * horizon_coef, 0.0, (mat_w * .5)],
    [0.0, (mat_h * .5) * vertical_coef, (mat_h * .5)],
    [0.0, 0.0, 1.0] ]
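
In OpenCV terms this is the camera intrinsic matrix: the focal lengths are half the frame width/height scaled by horizon_coef and vertical_coef, and the principal point is the frame centre. The sketch below builds that matrix and applies OpenCV's fisheye undistortion; the distortion coefficients D and the coefficient values are placeholders, since the node derives its own values from calibration.

  # Build the matrix from the formula above and apply OpenCV's fisheye
  # undistortion. D and the coefficient values are placeholders; the node
  # computes its own values during calibration.
  import cv2
  import numpy as np

  def build_matrix(mat_w, mat_h, horizon_coef=1.0, vertical_coef=1.0):
      return np.array([
          [(mat_w * .5) * horizon_coef, 0.0, (mat_w * .5)],
          [0.0, (mat_h * .5) * vertical_coef, (mat_h * .5)],
          [0.0, 0.0, 1.0]], dtype=np.float64)

  img = cv2.imread("fisheye_frame.jpg")
  h, w = img.shape[:2]
  K = build_matrix(w, h)
  D = np.zeros((4, 1))                                  # placeholder fisheye coefficients
  undistorted = cv2.fisheye.undistortImage(img, K, D, Knew=K)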


