Research

My research focuses on the development of GIScience methodologies, spatial-temporal simulation and modeling approaches, and visual analytics, particularly for movement and spatiotemporal processes.

Computational Movement Ecology: Agent-Based Simulation of Tiger Movement

I am currently part of an international, interdisciplinary team, together with Prof. James D. Smith (tiger biologist, UMN Department of Fisheries, Wildlife and Conservation Biology), Prof. Sean Ahearn (Hunter College, CUNY), and the Department of National Parks, Wildlife and Plant Conservation of Thailand. Our research aims to generate new insights into the movement behavior of tigers and their interactions within their ecosystem in the Thailand Western Forest Complex, a United Nations World Heritage site, and to develop analytical approaches and simulation models for understanding how tigers interact with their environment. To this end, we are developing an agent-based simulation system that integrates the different components of tiger movement to assess the role the environment (e.g., habitat, prey abundance, co-occurrence of leopards, and human factors) plays in tiger movement and behavioral variability.
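As a rough illustration of what an agent-based movement model involves (not our actual simulation system, whose state variables and environmental layers are far richer), the sketch below steps a single hypothetical tiger agent across toy habitat and prey rasters using a weighted, noisy utility rule; all names, weights, and grid sizes are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

class TigerAgent:
    """Hypothetical agent: holds a grid position and moves one cell per step."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, habitat, prey):
        """Move to the neighbouring cell with the best noisy habitat/prey score."""
        best, best_score = (self.x, self.y), -np.inf
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nx, ny = self.x + dx, self.y + dy
                if 0 <= nx < habitat.shape[0] and 0 <= ny < habitat.shape[1]:
                    # Illustrative weights; the noise keeps movement stochastic.
                    score = 0.6 * habitat[nx, ny] + 0.4 * prey[nx, ny] + rng.normal(0, 0.1)
                    if score > best_score:
                        best, best_score = (nx, ny), score
        self.x, self.y = best

# Toy environment: random habitat-suitability and prey-abundance rasters.
habitat = rng.random((100, 100))
prey = rng.random((100, 100))
tiger = TigerAgent(50, 50)
track = [(tiger.x, tiger.y)]
for _ in range(24):               # simulate 24 hourly steps
    tiger.step(habitat, prey)
    track.append((tiger.x, tiger.y))
```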

This video animates GPS tracking data of a tiger collected between December 2009 and July 2010 at a sampling rate of one fix per hour. The red line represents the tiger's movement track, and the thickness of the track indicates the tiger's movement speed.
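To make the speed-to-thickness mapping concrete, here is a minimal sketch, not the project's actual pipeline, of deriving a per-segment speed from hourly fixes and scaling it into a drawable line width. It assumes a pandas DataFrame with hypothetical timestamp (datetime), lon, and lat columns.

```python
import numpy as np
import pandas as pd

def speed_scaled_segments(fixes: pd.DataFrame, min_width=0.5, max_width=5.0):
    """Add a per-segment line width proportional to movement speed (km/h)."""
    fixes = fixes.sort_values("timestamp").reset_index(drop=True)

    # Approximate step length in metres with an equirectangular projection,
    # adequate for hourly tiger steps of at most a few kilometres.
    earth_r = 6_371_000
    lat_rad = np.radians(fixes["lat"])
    dx = np.radians(fixes["lon"].diff()) * earth_r * np.cos(lat_rad)
    dy = np.radians(fixes["lat"].diff()) * earth_r
    dist_km = np.hypot(dx, dy) / 1000.0

    dt_h = fixes["timestamp"].diff().dt.total_seconds() / 3600.0
    speed = dist_km / dt_h                                 # km/h per segment

    # Scale speed linearly into a drawable line width.
    s = (speed - speed.min()) / (speed.max() - speed.min())
    fixes["width"] = min_width + s * (max_width - min_width)
    return fixes.iloc[1:]                                  # first fix has no segment
```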

DYNAMO: Dynamic Visualization of Movement and the Environment

DYNAMO is an interactive, exploratory visualization tool for creating intuitive, high-quality animations suitable for presenting animal movement data. The visualization serves as a preliminary, exploratory analysis tool that allows biologists and movement ecologists to investigate their data, identify environmental drivers of movement and potential correlations, and generate new hypotheses about the relationships between animals' movement behavior and landscape use.
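DYNAMO itself is an interactive tool; the toy sketch below only illustrates the underlying idea of draping a movement track over an environmental layer so that potential drivers of movement can be inspected visually. The raster and trajectory are randomly generated stand-ins, not real data.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
env_layer = rng.random((50, 50))                   # stand-in for, e.g., an NDVI raster
track_x = np.clip(np.cumsum(rng.normal(0, 1, 200)) + 25, 0, 50)
track_y = np.clip(np.cumsum(rng.normal(0, 1, 200)) + 25, 0, 50)

fig, ax = plt.subplots(figsize=(6, 6))
ax.imshow(env_layer, origin="lower", cmap="Greens", extent=(0, 50, 0, 50))
ax.plot(track_x, track_y, color="red", linewidth=1.5, label="movement track")
ax.legend(loc="upper right")
ax.set_title("Toy movement track over a toy environmental layer")
plt.show()
```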

The initial stage of this project was supported by the Student-Faculty Research Awards of the Dean of the College of Letters, Arts, and Sciences at the University of Colorado, Colorado Springs.

More info: Xavier, G. and Dodge, S. (2014). An Exploratory Visualization Tool for Mapping the Relationships between Animal Movement and the Environment. In Proceedings of the 2nd ACM SIGSPATIAL Workshop on MapInteraction (MapInteraction 2014), pp. 36–42 [pdf].


This video presents a demo of DYNAMO’s features.

Movebank Env-DATA Project

[Figure: Env-DATA word cloud]

As part of the NASA Earth Science Division Ecological Forecasting Program, I participated in a multidisciplinary collaboration between the Ohio State University (USA) and the Max Planck Institute for Ornithology (Germany), directed by Prof. Gil Bohrer (Associate Professor of Ecological Engineering), to develop the Environmental Data Automated Track Annotation System (Env-DATA), a free online tool that allows users to link animal tracking data with ambient atmospheric observations and underlying landscape information. Env-DATA enhances Movebank, an open portal of animal tracking data, by automating access to environmental variables from global remote sensing, weather, and ecosystem products available through open web resources. The general aim of the project is to develop tools and methods for investigating the links between the environment and animal movement patterns. This research involves handling and analyzing big data: diverse, multivariate remote sensing data of the environment together with long-term animal tracking data obtained from GPS, Argos satellite tags, geo-sensors, or RFID tags.
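Env-DATA performs this annotation at scale on the Movebank servers, with many global products and interpolation options; the sketch below only illustrates the core idea of attaching the value of an environmental raster to each GPS fix. The file names, column names, and use of rasterio here are assumptions for illustration, not the system's actual implementation.

```python
import pandas as pd
import rasterio

def annotate_track(track: pd.DataFrame, raster_path: str, column: str) -> pd.DataFrame:
    """Attach the raster value under each (lon, lat) fix as a new column."""
    with rasterio.open(raster_path) as src:
        coords = list(zip(track["lon"], track["lat"]))
        # sample() yields one array of band values per point; keep the first band.
        track[column] = [values[0] for values in src.sample(coords)]
    return track

# Hypothetical usage: annotate fixes with NDVI from a GeoTIFF in lon/lat coordinates.
# track = pd.read_csv("tiger_fixes.csv")                  # columns: timestamp, lon, lat
# track = annotate_track(track, "modis_ndvi.tif", "ndvi")
```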

More info: Dodge, S., Bohrer, G., Weinzierl, R., Davidson, S.C., Kays, R., Douglas, D., Cruz, S., Han, J., Brandes, D., and Wikelski, M. (2013). The Environmental-Data Automated Track Annotation (Env-DATA) System: Linking Animal Tracks with Environmental Data. Movement Ecology, 1:3, BioMed Central. doi:10.1186/2051-3933-1-3.