Introduction
The goal of this exercise was to become familiar with the whole process of downloading and obtaining data from different internet sources. We then imported the data into ArcGIS, where we joined it. All the data was also projected into one coordinate system and stored in a geodatabase that we designed and built from scratch. This exercise is the beginning step of a larger project: we will be building a suitability and risk model for frac sand mining in Western Wisconsin. Before we can do that, this first step of gathering and exploring the data we will use is crucial.
Methods
The first data that we downloaded was a map of the rail system in and around Trempealeau County, Wisconsin. The Bureau of Transportation Statistics website provides a Railway Network zip file, which we downloaded and from which we extracted the shapefile to use in our final map.
The next data we downloaded was from the USGS National Map Viewer. In the viewer we zoomed in to Wisconsin and clicked on Trempealeau County. Then, from a drop-down list of available data, we selected the land cover data from 2011. This data was emailed to us as a zip file, which we saved and unzipped to use in the exercise. From this website we also downloaded the elevation data set for Trempealeau County.
Another website we used for land cover was the USDA Geospatial Data Gateway. This land cover data focused on cropland: what kinds of crops are planted where. We also downloaded an entire geodatabase from the Trempealeau County Land Records site. This database included a wide variety of datasets that would be used later in the exercise.
We went to the USDA NRCS Web Soil Survey website to download the soil data for Trempealeau County. As with the previous data sets, we unzipped it into our folders because it is delivered as a zip file.
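Since several of the downloads above arrive as zip files, the unzip-and-save step can be scripted. Below is a minimal sketch using Python's standard library; the file and folder names are hypothetical placeholders, not the ones from the exercise:

```python
import os
import zipfile

def extract_download(zip_path, dest_folder):
    """Extract a downloaded zip archive into a project folder and
    return the names of the extracted files."""
    os.makedirs(dest_folder, exist_ok=True)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(dest_folder)
    return os.listdir(dest_folder)

# Hypothetical usage:
# extract_download("wss_Trempealeau_soils.zip", "working/soils")
```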
After we had all of this data downloaded, we performed a series of joins and created some relationship classes to bring together similar data sets and make the working environment easier and more organized for us.
The next step of the exercise was to write a Python script that would take our downloaded data and turn it into rasters that can be mapped for the purposes of the project. Our script took the cropland, elevation, and land cover data that we downloaded and ran some tools on them that you can also run in the ArcMap environment. It projected each of the data sets into the same projection, ran an extract (clip) so that we were left with only Trempealeau County, which is our AOI, and finally saved the modified rasters into a geodatabase.
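The original script is not reproduced here, but based on the steps described, a sketch of what it might look like with arcpy follows. This is a sketch only: it requires ArcGIS with the Spatial Analyst extension, and the raster names, paths, coordinate system, AOI boundary layer, and output geodatabase are all assumptions.

```python
# Sketch only: requires ArcGIS and the Spatial Analyst extension.
import arcpy
from arcpy import env
from arcpy.sa import ExtractByMask

env.workspace = r"C:\Exercise"            # assumed project folder
env.overwriteOutput = True
arcpy.CheckOutExtension("Spatial")

rasters = ["cropland.tif", "elevation.tif", "landcover.tif"]  # assumed names
target_sr = arcpy.SpatialReference(26915)  # NAD 1983 UTM Zone 15N (assumed)
aoi = "trempealeau_boundary"               # county boundary layer (assumed)
out_gdb = r"C:\Exercise\Trempealeau.gdb"   # destination geodatabase (assumed)

for raster in rasters:
    # Project each raster into the common coordinate system
    projected = arcpy.ProjectRaster_management(raster, "proj_" + raster, target_sr)
    # Clip (extract by mask) so only the AOI remains
    clipped = ExtractByMask(projected, aoi)
    # Save the result into the geodatabase
    name = raster.split(".")[0]
    clipped.save(out_gdb + "\\" + name + "_final")
```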
After the script finished running, we were left with the three maps below: one each for elevation, cropland cover, and land cover of our AOI only.
When downloading and working with any data there can always be errors, which is why we need to examine the metadata for each data set to determine whether there are errors, or things that could cause errors, in our data. The table below shows some of the things we check in the metadata.
Sources:
http://soils.usda.gov/survey/geography/ssurgo/description.html |
http://nationalmap.gov/viewer.html |
http://datagateway.nrcs.usda.gov/ |