| Name: | Description: | Size: | Format: |
|---|---|---|---|
|  |  | 1.78 MB | Adobe PDF |
Authors
Abstract(s)
The accuracy and speed of impact detection and fire correction are critical factors for effective fire support. This task is traditionally performed by Forward Observers (FOs), whose role, although essential, is often constrained by positioning limitations, reduced visibility due to terrain features, difficulties in precisely locating impact points, and increased exposure to enemy threats.

The use of Unmanned Aerial Vehicles (UAVs) emerges as a promising alternative, enabling continuous and potentially automated observation. This dissertation proposes the development of an autonomous mortar impact detection system based on aerial imagery captured by UAVs. Two distinct approaches are explored: classical computer vision methods, based on background subtraction and motion analysis, and deep learning models, namely the YOLOv8 architecture. The work includes the acquisition and annotation of a dedicated dataset, the development of the detection algorithms, and the subsequent evaluation and comparison of their performance.
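Since the abstract only names the two detection approaches at a high level, the sketch below illustrates how they might be combined in practice: an OpenCV MOG2 background subtractor flags candidate motion regions in the UAV footage (e.g., the dust plume of an impact), and a YOLOv8 detector is then run to confirm impacts. This is a minimal sketch under stated assumptions, not the dissertation's implementation; the video path, the weights file `mortar_impact_yolov8.pt`, and the area threshold are hypothetical values chosen for illustration.

```python
# Minimal sketch: candidate mortar-impact detection on UAV video.
# Assumptions (not from the dissertation): the video path, the custom
# YOLOv8 weights file, and the minimum contour area used to filter noise.

import cv2
from ultralytics import YOLO

VIDEO_PATH = "uav_footage.mp4"        # hypothetical input clip
WEIGHTS = "mortar_impact_yolov8.pt"   # hypothetical custom-trained weights
MIN_AREA = 500                        # pixels; assumed tuning value

model = YOLO(WEIGHTS)                 # YOLOv8 detector (ultralytics API)
subtractor = cv2.createBackgroundSubtractorMOG2(history=300, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

cap = cv2.VideoCapture(VIDEO_PATH)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Classical approach: background subtraction + motion analysis.
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    moving_regions = [c for c in contours if cv2.contourArea(c) > MIN_AREA]

    # Deep learning approach: run YOLOv8 only when motion is present,
    # to avoid spending computation on static frames.
    if moving_regions:
        results = model(frame, verbose=False)
        for box in results[0].boxes:
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)

    cv2.imshow("impact detection sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Gating the YOLOv8 call on detected motion is only one possible design; the dissertation itself evaluates and compares the classical and deep learning approaches as separate methods.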
Description
Keywords
UAV; Forward Observation; Impact Detection; Computer Vision
