Data Fusion

Lookout+ Nav Exclusive

Data fusion is only available with Lookout+ Nav. This feature requires the advanced sensor suite and processing capabilities of the Nav version.

Lookout+ Nav intelligently combines data from multiple sources to create a comprehensive maritime picture. By fusing information from cameras, AIS, radar, and other sensors, the system provides enhanced situational awareness and more accurate object tracking than any single source could achieve alone.

Data Sources

Lookout+ Nav can integrate data from four primary sources:

Automatic Identification System (AIS)

  • High-quality vessel data including length, beam, heading, and speed
  • Identification information with vessel names and registration details
  • Position reports broadcast by participating vessels
  • Metadata priority: AIS dimensional data is preferred over camera estimates
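
For orientation, the sketch below models these AIS fields as a plain record; the class and field names are illustrative assumptions, not the Lookout+ schema.

```python
from dataclasses import dataclass

@dataclass
class AisReport:
    """Hypothetical container for one AIS position report."""
    mmsi: int            # identification: unique vessel identity
    name: str            # identification: broadcast vessel name
    length_m: float      # dimensional data, preferred over camera estimates
    beam_m: float        # dimensional data, preferred over camera estimates
    heading_deg: float   # true heading
    sog_kn: float        # speed over ground, in knots
    lat: float           # broadcast position
    lon: float
```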

Technical Implementation

Data fusion operates in ray coordinate space using unit vectors from a central reference point called the ray_reference.
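
As a minimal illustration of ray coordinates, the following converts a contact's position in a local Cartesian frame into a unit vector from the ray_reference; the function name and the flat 2D frame are assumptions.

```python
import math

def to_unit_ray(target_xy, ray_reference=(0.0, 0.0)):
    """Unit vector pointing from ray_reference toward the target."""
    dx = target_xy[0] - ray_reference[0]
    dy = target_xy[1] - ray_reference[1]
    r = math.hypot(dx, dy)
    if r == 0.0:
        raise ValueError("target coincides with ray_reference")
    return (dx / r, dy / r)

# A contact 100 m east / 100 m north and one 200 m east / 200 m north map
# to the same ray, which is what makes angular comparison cheap.
print(to_unit_ray((100.0, 100.0)))  # (0.7071..., 0.7071...)
```

Comparing rays rather than absolute positions lets bearing-only camera detections be matched directly against ranged AIS/ARPA contacts.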

Fusion Parameters

  • Angular Threshold: Objects within 1 degree (default) of each other are candidates for fusion (see the sketch after this list)
  • Range Correlation: Close-range objects prioritized for association
  • Camera-First Architecture: Visual detections anchor the fusion process
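
A minimal sketch of the angular gate, assuming the 2D unit rays described above; ANGULAR_THRESHOLD_DEG mirrors the documented 1-degree default, and the function names are illustrative.

```python
import math

ANGULAR_THRESHOLD_DEG = 1.0  # the documented default

def angular_separation_deg(ray_a, ray_b):
    """Angle between two 2D unit rays via the dot product."""
    dot = max(-1.0, min(1.0, ray_a[0] * ray_b[0] + ray_a[1] * ray_b[1]))
    return math.degrees(math.acos(dot))

def are_fusion_candidates(ray_a, ray_b, threshold=ANGULAR_THRESHOLD_DEG):
    """True when two detections are close enough in angle to consider fusing."""
    return angular_separation_deg(ray_a, ray_b) <= threshold
```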

Association Logic

The system intelligently matches and merges tracks using:

  1. Spatial Correlation: Objects within angular and range thresholds
  2. Temporal Consistency: Maintaining track continuity over time
  3. Data Quality Assessment: Prioritizing higher-quality source information
  4. Cross-Boundary Tracking: Following objects between sensor coverage areas
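
One hedged sketch of how the spatial step might combine these rules: a greedy one-to-one association that gates on angle and gives close-range camera tracks first pick, in line with the camera-first, close-range-priority behavior described above. The track fields (id, ray, range_m) are assumptions.

```python
import math

def angular_separation_deg(ray_a, ray_b):
    """Angle in degrees between two 2D unit rays."""
    dot = max(-1.0, min(1.0, ray_a[0] * ray_b[0] + ray_a[1] * ray_b[1]))
    return math.degrees(math.acos(dot))

def associate(camera_tracks, sensor_tracks, ang_gate_deg=1.0):
    """Greedy one-to-one pairing; close-range camera tracks choose first."""
    pairs, used = [], set()
    for cam in sorted(camera_tracks, key=lambda t: t["range_m"]):
        best, best_sep = None, ang_gate_deg
        for sen in sensor_tracks:
            if sen["id"] in used:
                continue
            sep = angular_separation_deg(cam["ray"], sen["ray"])
            if sep <= best_sep:
                best, best_sep = sen, sep
        if best is not None:
            used.add(best["id"])          # prevents duplicate associations
            pairs.append((cam["id"], best["id"]))
    return pairs
```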

Fusion Behavior

Track Association

When multiple sources detect the same object:

  • Best Available Data: System selects highest quality information for each attribute
  • Continuous Updates: Real-time refinement as new data becomes available
  • Confidence Scoring: Combined confidence levels from all contributing sources
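
A sketch of per-attribute selection and confidence combination when a camera track fuses with an AIS track. The 500 m near/far cutoff and the noisy-OR confidence formula are illustrative assumptions, not documented values.

```python
def first_present(*values):
    """Return the first value that is not None (avoids the falsy-zero trap)."""
    for v in values:
        if v is not None:
            return v
    return None

NEAR_RANGE_M = 500.0  # assumed "close range" cutoff; not a documented value

def merge_tracks(camera, ais):
    """Pick the best available value for each attribute of a fused track."""
    fused = {
        # AIS dimensional and kinematic data outrank camera estimates
        "length_m": first_present(ais.get("length_m"), camera.get("length_m")),
        "beam_m":   first_present(ais.get("beam_m"), camera.get("beam_m")),
        "heading":  first_present(ais.get("heading"), camera.get("heading")),
        "velocity": first_present(ais.get("velocity"), camera.get("velocity")),
        # camera position wins at close range, AIS/ARPA at long range
        "position": camera["position"] if camera["range_m"] < NEAR_RANGE_M
                    else ais["position"],
    }
    # noisy-OR: probability that at least one contributing source is correct
    fused["confidence"] = 1.0 - (1.0 - camera["conf"]) * (1.0 - ais["conf"])
    return fused
```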

Back-Projection

Objects detected by AIS or ARPA but not visible to cameras:

  • Bounding Box Display: Shows estimated object position even without visual confirmation
  • "Last Seen" Indication: because there is no camera detection, no camera section appears in the track information
  • Position Estimation: Maintains track using last known position and velocity
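
To make back-projection concrete, a bounding box can be sized from AIS dimensions and range alone. A minimal sketch, assuming a broadside aspect:

```python
import math

def back_projected_box_width_deg(ais_length_m, range_m):
    """Approximate angular width of a vessel seen broadside at a given range."""
    return math.degrees(2.0 * math.atan((ais_length_m / 2.0) / range_m))

# A 30 m vessel at 1 km subtends roughly 1.7 degrees of the camera view.
print(round(back_projected_box_width_deg(30.0, 1000.0), 2))  # 1.72
```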

Position Prediction

For non-visual sources (AIS/ARPA):

  • Forward Projection: Estimates current position from last reported location
  • Track Continuity: Maintains association with camera detections when objects become visible
  • Duplicate Prevention: Avoids creating multiple tracks for the same object
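
A minimal dead-reckoning sketch of forward projection from the last reported position; the flat-earth approximation and the function signature are assumptions for illustration.

```python
import math

def predict_position(lat_deg, lon_deg, sog_kn, cog_deg, dt_s):
    """Dead-reckon from the last AIS report using a flat-earth approximation."""
    dist_m = sog_kn * 0.514444 * dt_s           # knots -> meters traveled
    north = dist_m * math.cos(math.radians(cog_deg))
    east = dist_m * math.sin(math.radians(cog_deg))
    dlat = north / 111_320.0                    # meters per degree of latitude
    dlon = east / (111_320.0 * math.cos(math.radians(lat_deg)))
    return lat_deg + dlat, lon_deg + dlon

# Vessel at 10 kn heading due east, 60 s since the last report:
print(predict_position(47.6, -122.3, 10.0, 90.0, 60.0))
```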

Data Fusion Example

Example showing fusion behavior: Track 1 represents a back-projected object (no camera detection), while Track 4 shows successful fusion between camera and synthetic source data.

Data Quality Hierarchy

Source Reliability

Different sources provide varying quality information:

| Data Type  | AIS/ARPA  | Camera                  | Preferred Source              |
|------------|-----------|-------------------------|-------------------------------|
| Position   | Good      | Excellent (close range) | Camera (near), AIS/ARPA (far) |
| Heading    | Excellent | Good                    | AIS/ARPA                      |
| Dimensions | Excellent | Estimated               | AIS/ARPA                      |
| Velocity   | Excellent | Estimated               | AIS/ARPA                      |

Information Priority

When sources conflict, the system applies intelligent prioritization:

  • AIS dimensional data (length, beam) preferred over camera estimates
  • Camera classification, cross-referenced with the AIS vessel type code, preferred when both are available
  • AIS/ARPA velocity preferred over camera-derived motion estimates
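
One hypothetical way to encode these rules is as per-attribute source rankings; the names and structure below are assumptions, not the product's actual configuration.

```python
ATTRIBUTE_PRIORITY = {
    "length_m": ["ais", "camera"],          # AIS dimensions beat camera estimates
    "beam_m":   ["ais", "camera"],
    "velocity": ["ais", "arpa", "camera"],  # reported motion beats visual estimates
    "class":    ["camera", "ais"],          # visual class, AIS type code as fallback
}

def pick(attr, reports):
    """Return the value from the highest-priority source reporting attr."""
    for source in ATTRIBUTE_PRIORITY[attr]:
        value = reports.get(source, {}).get(attr)
        if value is not None:
            return value
    return None
```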

Operational Considerations

Environmental Factors

Fusion effectiveness varies with conditions:

Optimal Fusion:

  • Clear visibility enabling camera and AIS/ARPA correlation
  • Multiple overlapping camera views
  • Active AIS transponders on target vessels

Degraded Conditions:

  • Fog/Darkness: Camera detection may fail, relying on AIS/ARPA back-projection
  • AIS Gaps: Non-cooperative targets visible only to cameras and radar
  • Radar Shadows: Objects hidden from radar but visible to cameras