Sunday, July 27, 2025

Module 4 - Coastal Flooding Lab

 

In this week's lab we used tools to analyze LiDAR data and find areas likely to be flooded by storm surge. Using the Reclassify tool we were able to find cells that would be flooded at 1 meter and at 2 meters of surge. We then used building data to see which buildings would be flooded and what kind of buildings they are. I had a hard time with this lab, but I am proud of myself for getting through it.
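That reclassify step could also be scripted. Below is a minimal arcpy sketch of the idea, assuming a hypothetical LiDAR-derived DEM with elevations in meters; the paths are placeholders, not the lab's actual data:

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")  # Spatial Analyst license

dem = Raster(r"C:\lab4\dem.tif")  # hypothetical LiDAR-derived DEM, in meters

# Flag cells at or below each surge height as 1; everything else stays NoData.
for surge in (1, 2):
    flooded = Con(dem <= surge, 1)
    flooded.save(rf"C:\lab4\flood_{surge}m.tif")
```

Finding the affected buildings would then be a Select Layer By Location of the building footprints against the flooded cells converted to polygons.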

Friday, July 18, 2025

Module 3 - LiDAR Visibility Analysis

This week we took an Esri web course on visibility analysis. It included four modules: Introduction to 3D Visualization, Performing Line of Sight Analysis, Performing Viewshed Analysis in ArcGIS Pro, and Sharing 3D Content Using Scene Layer Packages.

In the first course, Introduction to 3D Visualization, we learned how z-values can represent elevation and allow us to see a 3D image. We used the provided data to navigate and investigate a 3D scene of Crater Lake in Oregon.

In Performing Line of Sight Analysis we learned about line of sight. We used the Construct Sight Lines tool to create sight lines from an observer point, then the Line of Sight tool, which determines visibility along the sight lines we had just created.
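Both tools live in the 3D Analyst toolbox, so the same workflow can be scripted. A minimal sketch with placeholder paths and an assumed elevation surface:

```python
import arcpy

arcpy.CheckOutExtension("3D")  # 3D Analyst license

# Build sight lines from the observer points to the target features.
arcpy.ddd.ConstructSightLines(
    r"C:\lab3\observers.shp",   # hypothetical observer points
    r"C:\lab3\targets.shp",     # hypothetical target features
    r"C:\lab3\sightlines.shp")

# Split each sight line into visible and obstructed stretches on the surface.
arcpy.ddd.LineOfSight(
    r"C:\lab3\elevation.tif",   # assumed elevation surface
    r"C:\lab3\sightlines.shp",
    r"C:\lab3\los_results.shp")
```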

In Performing Viewshed Analysis in ArcGIS Pro we used the Viewshed tool, adjusting the refractivity coefficient to model how light waves bend through the atmosphere. This tool let us see streetlight coverage and how visibility around buildings is affected by terrain.
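The geodesic Viewshed (Viewshed2) tool exposes that coefficient directly, so the streetlight example could be sketched like this; the inputs are placeholders, the 3 m observer offset is an assumed lamp height, and 0.13 is the tool's documented default refractivity:

```python
import arcpy
from arcpy.sa import Viewshed2

arcpy.CheckOutExtension("Spatial")

# Raster counting how many streetlights can see each cell; paths are placeholders.
vis = Viewshed2(
    in_raster=r"C:\lab3\elevation.tif",
    in_observer_features=r"C:\lab3\streetlights.shp",
    analysis_type="FREQUENCY",
    refractivity_coefficient=0.13,  # the tool's default refraction value
    observer_offset=3)              # assumed lamp height above ground, meters
vis.save(r"C:\lab3\streetlight_cover.tif")
```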

In Sharing 3D Content we learned how to upload our data to ArcGIS Online (AGOL) to share it with the public or with organizations.

Sunday, July 13, 2025

Module 2 - LiDAR and Forestry

In this week's lab I had to extract LiDAR data for Virginia from the VGIL website. With the LiDAR data I used several geoprocessing tools to pull out information. In Part 1 of the lab I used tools to separate the LiDAR returns that represent the ground from the returns that are above the ground. Using the Minus tool on these two new datasets, I was able to find the height of the canopy in Virginia.
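Part 1 boils down to "surface model minus ground model = canopy height." A minimal arcpy sketch, assuming a hypothetical LAS dataset path; filtering to class code 2 isolates the ground returns:

```python
import arcpy
from arcpy.sa import Minus

arcpy.CheckOutExtension("Spatial")
lasd = r"C:\lab2\virginia.lasd"  # hypothetical LAS dataset

# Ground returns only (LAS class code 2) -> bare-earth surface (DTM).
arcpy.management.MakeLasDatasetLayer(lasd, "ground_lyr", class_code=[2])
arcpy.conversion.LasDatasetToRaster("ground_lyr", r"C:\lab2\dtm.tif")

# All returns -> surface model including the canopy (DSM).
arcpy.conversion.LasDatasetToRaster(lasd, r"C:\lab2\dsm.tif")

# Canopy height model = DSM - DTM (the Minus tool).
chm = Minus(r"C:\lab2\dsm.tif", r"C:\lab2\dtm.tif")
chm.save(r"C:\lab2\canopy_height.tif")
```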

In Part 2 of the lab I used the point count, Is Null, Plus, Float, and Divide tools to create a density map of the canopy in Virginia. In the density map you can see the areas of dense vegetation, with values near 1 showing up darker. The lighter values (near 0) show ground areas with low vegetation. This is helpful to foresters for spotting high-density areas; you can see in this map that roads have very low density.
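That tool chain is essentially vegetation returns divided by total returns per cell. A minimal sketch of the arithmetic, assuming hypothetical per-cell count rasters (e.g. from LAS Point Statistics As Raster); names and paths are placeholders:

```python
import arcpy
from arcpy.sa import Con, Divide, Float, IsNull, Plus, Raster

arcpy.CheckOutExtension("Spatial")

# Per-cell return counts; placeholder rasters for ground and vegetation points.
ground = Raster(r"C:\lab2\ground_count.tif")
veg = Raster(r"C:\lab2\veg_count.tif")

# Cells with no returns come through as NoData, so convert them to 0 first.
ground0 = Con(IsNull(ground), 0, ground)
veg0 = Con(IsNull(veg), 0, veg)

# Density = vegetation returns / all returns; Float avoids integer division.
# Cells where both counts are 0 divide by zero and come out NoData.
density = Divide(Float(veg0), Float(Plus(veg0, ground0)))
density.save(r"C:\lab2\canopy_density.tif")
```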

The lab this week was very interesting but was made difficult by the slow remote desktop; creating presentable maps was a challenge over the slow network.

Wednesday, July 9, 2025

Module 1 - Crime Analysis

The purpose of this lab was to use ArcGIS Pro to analyze crime data. Different techniques for showing crime hotspots can be extremely helpful to police forces when predicting future crime areas. In this lab I created three different homicide hotspot maps using kernel density, grid-based thematic mapping, and Local Moran's I analysis.

Grid-based: I did a spatial join of the Chicago grid and the 2017 total homicides, keeping all other parameters at their defaults. I opened the table for this feature, selected by attributes where Join_Count is greater than 0, and exported this as a new feature class. To get the top 20 percent, I took the total number of objects (311) and multiplied by 0.2, which gave me 62.2. Rounding to 62, I sorted Join_Count in descending order, selected the top 62, and exported them.
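A rough arcpy sketch of those steps, using the lab's 311 × 0.2 ≈ 62 arithmetic; the layer names and paths are placeholders, and the tool defaults are kept as in the lab:

```python
import arcpy

# Join the 2017 homicide points to the grid, keeping the tool's defaults.
arcpy.analysis.SpatialJoin("chicago_grid", "homicides_2017",
                           r"C:\lab1\grid_join.shp")

# Keep only grid cells that contain at least one homicide.
arcpy.management.MakeFeatureLayer(r"C:\lab1\grid_join.shp", "grid_lyr")
arcpy.management.SelectLayerByAttribute("grid_lyr", "NEW_SELECTION",
                                        "Join_Count > 0")
arcpy.management.CopyFeatures("grid_lyr", r"C:\lab1\grid_gt0.shp")

# Top 20% of 311 cells = 62.2, rounded to 62: find the 62nd-highest count.
counts = sorted((row[0] for row in arcpy.da.SearchCursor(
    r"C:\lab1\grid_gt0.shp", ["Join_Count"])), reverse=True)
cutoff = counts[61]

# Ties at the cutoff value may select slightly more than 62 cells.
arcpy.management.MakeFeatureLayer(r"C:\lab1\grid_gt0.shp", "top_lyr")
arcpy.management.SelectLayerByAttribute("top_lyr", "NEW_SELECTION",
                                        f"Join_Count >= {cutoff}")
arcpy.management.CopyFeatures("top_lyr", r"C:\lab1\grid_top20.shp")
```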


Kernel Density: Using the Kernel Density tool I made the KD raster from the 2017 total homicides, with the Chicago boundary as the barrier. Under Symbology > Statistics, the mean was 1.18 (1.18 × 3 = 3.54) and the max was 38.86; I used those two numbers as the two class breaks. I then used the Reclassify tool, followed by the Raster to Polygon tool. I opened the table for the polygon layer and selected by attribute where gridcode equals 2.
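A sketch of that chain in arcpy with the lab's break values (3.54 and 38.86); the inputs are placeholders, and for simplicity the Chicago-boundary barrier is left out (newer ArcGIS Pro releases expose it as a Kernel Density parameter):

```python
import arcpy
from arcpy.sa import KernelDensity, Reclassify, RemapRange

arcpy.CheckOutExtension("Spatial")

# Kernel density surface from the 2017 homicide points (no population field).
kd = KernelDensity("homicides_2017", "NONE")

# Two breaks from the lab: below 3x the mean -> 1, 3x mean up to the max -> 2.
reclass = Reclassify(kd, "VALUE",
                     RemapRange([[0, 3.54, 1], [3.54, 38.86, 2]]))

# Convert to polygons and pull out the hotspot class (gridcode = 2).
arcpy.conversion.RasterToPolygon(reclass, r"C:\lab1\kd_poly.shp")
arcpy.management.MakeFeatureLayer(r"C:\lab1\kd_poly.shp", "kd_lyr")
arcpy.management.SelectLayerByAttribute("kd_lyr", "NEW_SELECTION",
                                        "gridcode = 2")
```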


Local Moran's I: I used a spatial join between the census tracts and the 2017 total homicides, keeping all other parameters at their defaults. I added the field "Crime_Rate" and used the Field Calculator with the equation Crime_Rate = (!Join_Count! / !total_households!) * 1000. I then ran the Cluster and Outlier Analysis (Anselin Local Moran's I) tool on the new feature class, with the crime-rate field as the input field. In the output I selected by attribute only the high-high crime areas and exported that selection. Finally, I used the Dissolve tool to create my final feature class.
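The Local Moran's I steps sketched the same way; the field and layer names are placeholders mirroring the lab, and the spatial-relationship settings shown are the tool's dialog defaults:

```python
import arcpy

gdb = r"C:\lab1\lab1.gdb"  # hypothetical workspace

# Join homicides to census tracts, then compute a rate per 1,000 households.
arcpy.analysis.SpatialJoin("census_tracts", "homicides_2017",
                           gdb + r"\tracts_join")
arcpy.management.AddField(gdb + r"\tracts_join", "Crime_Rate", "DOUBLE")
arcpy.management.CalculateField(
    gdb + r"\tracts_join", "Crime_Rate",
    "(!Join_Count! / !total_households!) * 1000", "PYTHON3")

# Cluster and Outlier Analysis (Anselin Local Moran's I), dialog defaults.
arcpy.stats.ClustersOutliers(
    gdb + r"\tracts_join", "Crime_Rate", gdb + r"\morans",
    "INVERSE_DISTANCE", "EUCLIDEAN_DISTANCE", "ROW")

# Keep only the High-High clusters (COType = 'HH'), then dissolve them.
arcpy.management.MakeFeatureLayer(gdb + r"\morans", "hh_lyr")
arcpy.management.SelectLayerByAttribute("hh_lyr", "NEW_SELECTION",
                                        "COType = 'HH'")
arcpy.management.Dissolve("hh_lyr", gdb + r"\hh_dissolved")
```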


If I were given a limited policing budget and had to choose which of these three maps to use for allocating my officers, I would choose the grid overlay. According to the crime density, its 11 homicides per square mile is the highest rate of the three. It is also the map with the smallest total area, so it would require covering the least land of all the maps.