
Technologies such as camera traps and acoustic sensors provide detailed data on animal movements and behaviors. The integration of machine learning algorithms with remote sensing data has further enhanced the ability to automate species identification and habitat monitoring, making these tools indispensable for wildlife conservation efforts (Drakshayini et al., 2023).

2.2 Use of remote sensing in habitat mapping and change detection
Remote sensing technologies are particularly valuable for habitat mapping and change detection. By analyzing satellite images and aerial photographs, researchers can monitor changes in land cover, vegetation, and other habitat features over time. This information is crucial for understanding the impacts of environmental changes, such as deforestation and climate change, on wildlife habitats. Remote sensing data also enable the detection of habitat degradation and fragmentation, allowing conservationists to take timely action to mitigate these threats (Drakshayini et al., 2023).

2.3 Monitoring animal movements and behavior
Monitoring animal movements and behavior is another critical application of remote sensing technologies. Camera traps and acoustic sensors are widely used to track individual animals and to study their behavior in their natural habitats. These tools provide non-invasive methods for collecting data on animal density, migration patterns, and social interactions. For example, camera trap images combined with acoustic analysis have been used to monitor endangered species such as tigers, cheetahs, and sea turtles, providing valuable insights into their population dynamics and habitat use (Drakshayini et al., 2023). The analysis of animal vocalizations with machine learning techniques has also revealed important information about species behavior and habitat quality, further highlighting the potential of remote sensing technologies for wildlife monitoring (Drakshayini et al., 2023).

In conclusion, the integration of remote sensing technologies with machine learning and other advanced analytical tools offers significant advantages for wildlife monitoring and conservation. These technologies provide efficient, non-invasive methods for tracking and studying wildlife populations and their habitats, enabling conservationists to make informed decisions and take targeted actions to protect and preserve endangered species and their ecosystems.
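The kind of automated analysis described above can be illustrated with a minimal change-detection sketch: two co-registered satellite scenes are compared via the Normalized Difference Vegetation Index (NDVI), and pixels with a marked NDVI decline are flagged. The arrays, band values, and the 0.2 decline threshold below are hypothetical placeholders chosen for illustration, not parameters from the studies cited above.

import numpy as np

def ndvi(red, nir):
    # Normalized Difference Vegetation Index; a small epsilon avoids division by zero.
    return (nir - red) / (nir + red + 1e-9)

def vegetation_change(red_t1, nir_t1, red_t2, nir_t2, threshold=0.2):
    # NDVI difference between two acquisition dates; pixels whose NDVI dropped
    # by more than the (hypothetical) threshold are flagged as possible degradation.
    delta = ndvi(red_t2, nir_t2) - ndvi(red_t1, nir_t1)
    return delta, delta < -threshold

# Random reflectance values stand in for real imagery in this sketch.
rng = np.random.default_rng(0)
red_t1, nir_t1 = rng.random((2, 100, 100))
red_t2, nir_t2 = rng.random((2, 100, 100))
delta, degraded = vegetation_change(red_t1, nir_t1, red_t2, nir_t2)
print(f"Pixels flagged as degraded: {degraded.sum()} of {degraded.size}")

In practice, the red and near-infrared bands would come from a source such as Landsat or Sentinel-2 imagery, and the change threshold would be calibrated to the habitat and sensor being used.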
3 Synergistic Integration of Genomics and Remote Sensing
The integration of genomics and remote sensing technologies offers a powerful approach to wildlife monitoring, providing comprehensive insights into species populations, genetic diversity, and environmental changes. This section explores the methodological integration, successful case studies, and the challenges associated with combining these technologies.

3.1 Methodological integration for comprehensive monitoring
The methodological integration of genomics and remote sensing involves combining genetic data with spatial and environmental information to enhance wildlife monitoring. Genomics provides detailed insights into population genetics, such as effective population size, inbreeding, and genetic diversity, which are critical for conservation efforts (Hohenlohe et al., 2020; Thaden et al., 2020). Remote sensing, on the other hand, offers large-scale, long-term environmental data, enabling the monitoring of habitat changes and species distributions (Stephenson, 2019; Thackeray and Hampton, 2020). For instance, reduced single nucleotide polymorphism (SNP) panels allow the genotyping of degraded DNA samples collected non-invasively, such as from feces or hair, which can be integrated with remote sensing data to monitor population structure and hybridization (Thaden et al., 2020). Additionally, genomic methods such as DNA barcoding and metagenomics can be mapped to remote sensing indicators to provide a comprehensive assessment of environmental status and biodiversity (Bourlat et al., 2013; Yamasaki et al., 2017).
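One way such integration can be operationalized is sketched below: a genotype matrix from a reduced SNP panel is summarized by principal component analysis, and the leading axis of genetic variation is compared against a remote-sensing covariate such as NDVI at each sampling site. The panel size, genotypes, and covariate values are illustrative assumptions, not data from the cited studies.

import numpy as np

rng = np.random.default_rng(1)
n_samples, n_snps = 60, 96  # e.g., a reduced 96-SNP panel (hypothetical size)
genotypes = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)  # 0/1/2 allele counts
ndvi_at_site = rng.random(n_samples)  # hypothetical remote-sensing covariate per sampling site

# Center the genotype matrix and extract principal components via SVD.
centered = genotypes - genotypes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = centered @ vt[0]  # first axis of genetic variation for each individual

# Relate genetic structure to the environmental covariate.
r = np.corrcoef(pc1, ndvi_at_site)[0, 1]
print(f"Correlation between genetic PC1 and NDVI: {r:.2f}")

In a real analysis, the genotypes would come from non-invasively collected samples typed on the SNP panel, and the covariate could be any habitat variable extracted from satellite imagery at the coordinates of each sample.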
