Sunday, April 30, 2017

Final Project: Do Mobile Homes Attract Tornados?

I chose the red pill...I searched for an interesting and relevant topic, and then found and refined the data myself.  As I was learning this semester, always in the back of my mind was the question of whether I'd leave the course with the tools and knowledge to build a "real map" in support of my job and my research.  It's pretty clear in my mind now that I can do this...albeit quite slowly, and with still a lot to learn.  But I can do it.  Pretty cool.

I found a journal article that I felt was interesting.  The authors were looking for spatial correlation between tornado formation and land use areas.  They found a promising spatial relationship between tornados and land use transition zones (e.g., forested area to farmland, or farmland to urban land use).  No great scientific paper ever survives first contact with a journalist....a reporter from Chicago put the facts together, noticed that mobile homes tended to be located on the outer edge of the urban land use zones (where tornados show the highest frequency), and asked whether the two could be related.  The term 'tornado magnet' comes to mind; in popular lore, mobile home parks attract tornados. In reality, of course, mobile homes show the most damage due to construction materials and methods, so they only appear to 'attract' tornados. So my map--which compared the spatial distribution of tornados with the density of mobile homes at the county level--showed pretty conclusively that any correlation between the two variables is coincidental.

The hardest part of this assignment, by far, was getting the data into a useable format.  My bivariate map used two data sets.  The first was mobile homes as a percentage of total homes, by county, in the state of Indiana. I found that data set on the National Weather Service's Storm Prediction Center (SPC) GIS website.  I checked the data and determined that it was 'good to go'.  My second variable was a bit more complicated--I needed data on the frequency of strong tornados (F2/EF2 or stronger), by county, over a 30-year period of record (the standard used in climatology), for the state of Indiana.  I could not find the data that I needed pre-built into a spreadsheet.  So, roll up sleeves and start extracting the data.  The National Oceanic and Atmospheric Administration (NOAA) has a searchable database of tornado occurrences, but it restricts the user to very small samples.  To keep my manual data extraction within those constraints, I tailored my search to keep the data size manageable (about 500 records).  I then manually counted tornado occurrences by county and entered the counts into a spreadsheet; found county size and population data on a National Association of Counties website; and manually added that data for each of Indiana's 92 counties. Then I could import the data into ArcGIS and join the data table to a layer. Finally, I imported the map into Adobe Illustrator (AI) and hacked around for hours, just trying not to destroy it.  I added drop shadow effects to the state outline and to the graduated symbology.  And then the map was finalized.
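The manual county tally could have been scripted. A minimal sketch, assuming the NOAA export were saved as a CSV with a county column (the field names here are hypothetical, not NOAA's actual schema), using only the Python standard library:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical sample of an exported tornado search result; the real NOAA
# export uses different field names. The search was already limited to
# F2/EF2-or-stronger events, so every row counts toward the tally.
sample = """county,f_scale
Marion,F2
Marion,F3
Allen,F2
Marion,EF2
"""

def count_by_county(csv_text):
    """Tally tornado records per county from CSV text."""
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        counts[row["county"]] += 1
    return counts

counts = count_by_county(sample)
print(counts["Marion"])  # 3 records for Marion County
```

The resulting counts could then be written back out and joined to the county layer in ArcGIS, replacing the hand count of ~500 records.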

Thursday, April 13, 2017

Module 12: Google Earth

A lab that I could complete in under 3 hours....I feel like I won the lottery.  :)  In the final lab of the class, I learned to cross the bridge between the professional's ArcGIS and the people's Google Earth.  This was a great exercise, and I am sure that I will use this function again in my workplace, if not in my Certificate Program.  I learned to convert ArcGIS maps and layers to KML and KMZ files, and how to build a zipped KML map and tour.  As much as I have "played" with GE, I had never actually constructed a tour before this week....that will be very useful as well.  I added a few points to the tour that had meaning to me (I was born and lived in Palm Beach County, I lived and worked on MacDill AFB in Tampa, and I met my wife in Miami Beach), but edited out the additional placemarks before I turned it in--I didn't want to get too silly with the assignment. So this turned into a visit to some dearly loved places...thanks for the memories!
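ArcGIS's conversion tools did the KML export in the lab, but under the hood a KML file is just plain XML, so a simple placemark file can also be assembled by hand. A minimal sketch (the coordinates below are illustrative, not the tour's actual points):

```python
# Minimal KML writer: KML is plain XML, so a placemark document can be
# built with string formatting. Coordinates are illustrative only.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
{placemarks}  </Document>
</kml>
"""

PLACEMARK = """    <Placemark>
      <name>{name}</name>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>
"""

def make_kml(points):
    """points: iterable of (name, lon, lat) tuples -> KML document string."""
    body = "".join(PLACEMARK.format(name=n, lon=lon, lat=lat)
                   for n, lon, lat in points)
    return KML_TEMPLATE.format(placemarks=body)

doc = make_kml([("Tampa", -82.46, 27.95), ("Miami Beach", -80.13, 25.79)])
print(doc)
```

One quirk worth noting: KML stores coordinates as longitude,latitude (x,y), the reverse of the latitude-first order most people speak in--an easy way to put a placemark in the wrong hemisphere.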

My largest challenge on this assignment was finding my original Module 10 dot density map.  The lab instructions directed me to use my original dot density map from Module 10, but I could not find my original .mxd--I had saved only .jpg images.  So I used the dot density map from another student that was loaded into the M12 folder on the R:/ drive.  While making the directed "tweaks" to the dot density map in ArcGIS, I was reminded of how frustratingly slowly the population data processes.  I did find that in order to get the dots on top, the tip given between steps 10 and 11 of the lab instructions worked great--a 50m altitude did the trick.

Building the recorded tour was straightforward with the instructions that were provided.  I can now really appreciate the difference between photo-realistic 3D, extruded (or 3D modeled), and LiDAR-derived 3D models, and I want LiDAR-derived maps everywhere now!  You'll forgive the weak research that I conducted (uhm....Wikipedia)....but since I use remote sensing in my workplace every day, I did a cursory search on LiDAR to try to learn a little more about how it works. It was interesting to see that LiDAR was allegedly first used on a NOAA meteorological satellite to measure cloud heights. When I get more time, I will research further--this is an interesting subject.

So, overall a very straightforward lab, where I learned a few techniques that I will definitely use in my professional life.  One small bug in the lab instructions: page 6, para 4 instructed me to create placemarks for the next three areas (in Tampa), but gave no description of what those areas were.  The three required placemarks do show up on page 7, under "deliverables", but the instructions would be a little easier to follow if the placemarks were also listed in para 4, page 6.

Thursday, April 6, 2017

Module 11: 3D Mapping

This week I learned 3D mapping background, theory and techniques; used the 3D Analyst extension and the ArcScene and ArcGlobe applications for the first time; got to visit my old friend GE; and reviewed Tufte's 3D rendition of Minard's map of the defeat of the once-mighty French Army by logistics, disease, and the famous Russian general known as 'General Winter'.

Shown above is a representative graphic produced during the ESRI training--a 3D map of Crater Lake, Oregon.

I first completed the assigned 3-hour ESRI module, 3D Visualization Techniques Using ArcGIS.  Through its lessons I learned, practiced, applied and was evaluated on 3D visualization of raster and feature data, and was exposed to 2D-to-3D conversion using values derived from LiDAR data. I was able to work functionally in ArcGlobe and ArcScene. I learned about vertical exaggeration (something I actually knew how to do from working with GE).  I learned more about, and now appreciate, the usefulness of the options available in scene properties (backgrounds, skies, solar illumination).  I learned that extrusions are not just for creating buildings--that xy data can be visualized in 3D by adding almost any value as the z component of the database.

Then I moved to a "this is what happens in a non-perfect world" exercise, and manipulated data in order to create and display 3D building footprints.  Finally, I compared and contrasted Minard's original 2D map of Napoleon's invasion of Russia with the 3D map created by Tufte, and explained the comparison in a recorded presentation.  I prefer the 3D map because of the saturation of data in the map, and the ability to "turn it around in my hands" and really study the relationships between the dependent and independent variables--but I am interested in the subject, and I like to deep-dive into data.  If I were a more casual user, I might find the 2D map more appealing, because I could get the main idea very quickly.

Very good lab this week--I enjoyed it!  As was said in the video lecture:  "3D is cool!!"  :) I got to play with GE again, which is second nature to me.  I did note that one can quickly lose control of the scene when rough movements are made to the flight controls during the fly-through function of ArcScene!

Tuesday, March 28, 2017

Module 10--Dot Density Mapping

The laboratory was designed to reinforce this week's instruction on dot density mapping. I used ArcMap to manipulate instructor-provided shapefiles and tabular data in order to build a dot map showing population density in South Florida.  I then finalized the map in Adobe Illustrator and saved it as an image file.

Learning how to join spatial and spreadsheet data was not a challenge, and it seems to me to be a very useful skill for further work in this area.

The instructions for selecting dot sizes and unit values were straightforward; however, more on that process in a second.

The lab instructions adequately prepared me for the ArcMap processing disaster that was to come.  I'm quite sure that I spent 30 minutes trapped in the "masking > exclude (or include) > OK > say a bad word or two > restart ArcMap > repeat" loop.  I was working on my (brand new) home workstation using ArcGIS Desktop software.  The page 6 suggestion to work with two .mxd files finally worked.  But I did find that setting each dot's value to 15,000 people was a good trade-off between dot resolution and the processing power required. Rankin points out in his YouTube video that dot density maps are underused.  I see why--he must have used a mainframe to calculate his racial population density map! My dot density legend, by the way, involved creating three similarly sized squares, creating a red dot, copying it 35 times, and dragging the dots into the squares in a more or less random manner.  Not the method that you were looking for.
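The dot-value trade-off can be reasoned about directly: the number of dots drawn is roughly population divided by the dot value, so a larger dot value means far fewer dots to render. A quick sketch (the county populations below are rough illustrative figures, not the lab's data):

```python
# Each dot represents dot_value people; the renderer draws roughly
# round(pop / dot_value) dots per polygon (exact rounding may differ;
# this is a sketch of the arithmetic, not ArcMap's implementation).
def dots_needed(populations, dot_value):
    return {county: round(pop / dot_value)
            for county, pop in populations.items()}

# Rough, illustrative South Florida county populations.
pops = {"Miami-Dade": 2_700_000, "Broward": 1_900_000, "Palm Beach": 1_450_000}

print(dots_needed(pops, 15_000))  # a few hundred dots total...
print(dots_needed(pops, 1_000))   # ...versus thousands at a small dot value
```

At 15,000 people per dot the three counties need only a few hundred dots; at 1,000 per dot it balloons past six thousand, which is why the small dot value brought the workstation to its knees.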

I exported the map to AI.  I really do not understand AI, and I cannot effectively use its Help functions. I spend hours hacking around in the program attempting to finish my map, and I end up with a map that is not visually appealing and not effectively laid out.  The "marsh" legend went haywire, and it took me a while to manually re-create it.  At some point, I was no longer able to draw a 2D rectangle using the rectangle tool--everything came out 3D.  I applied an AI drop shadow to the map, performed probably 50 subsequent editing actions, then stepped back, looked at the map, and decided that the drop shadow detracted from it visually.  But removing the drop shadow meant clicking "undo x action" 50 times. I was especially frustrated this week when I was unable to figure out how to add a background color inside the neat line (the color that I chose for the large map area was also visible under the map).  I've set a time limit of 8 hours for map manipulation each week--whatever I have at the end of that time will be turned in, and I'll take the grade that comes with it.

I hate to leave this week's blog with a negative attitude...ArcGIS is an intuitive program and I enjoy working with it.  Creating maps in Arc is relatively easy (with your good lab instructions) and rewarding--I really enjoy the process. This week I learned tips and tricks to efficiently process memory-intensive data in ArcGIS--something that will come in handy.  I am slowly getting familiar with AI, and I feel that I have mastered about 5% of its functionality at this time.

Friday, March 24, 2017

Module 9: Flow Line Mapping

The map above shows legal immigration to the U.S. in 2007.  It shows immigrants' regions of origin, and where immigrants tended to take up residence in the U.S. after arriving here.  Immigration data was provided by individual countries, but we were asked to consolidate it to a continental view. It would have been very interesting to draw flow lines for the top ten (non-normalized) source countries.

This week's lab explored the general topic of flow line mapping, but also led me to further learn/practice new styling features in Adobe Illustrator.  Starting with a global map, I built a flow line map of world immigration to America.

The lab began with a compare/contrast section (Adobe vs. Arc), which was very useful for me.  Using the Base Map A option, I storyboarded the map (old school--I sketched it out on a piece of paper) and started to build my flow line map.  Grabbing and moving the continents was easy, once I figured out the lasso tool.  I added a flow line for the largest contributor (Asia), arbitrarily choosing a width of 30 and hoping for the best.  I opened the provided spreadsheet and manipulated the data as instructed to obtain proportional flow line widths, then drew the rest of the flow lines.  I then added the essential map elements.
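The spreadsheet step boils down to scaling each line's width in proportion to its migration count. A common cartographic convention scales by the square root so that perceived line area tracks the data rather than exaggerating big flows. A sketch, anchored to the width-30 line for the largest flow (the counts below are made up for illustration):

```python
import math

def flow_widths(flows, max_width=30.0):
    """Scale line widths by the square root of each flow's share of the max."""
    biggest = max(flows.values())
    return {region: max_width * math.sqrt(value / biggest)
            for region, value in flows.items()}

# Illustrative immigration counts by continent (not the lab's real numbers).
flows = {"Asia": 400_000, "North America": 300_000, "Europe": 100_000}
widths = flow_widths(flows)
print(round(widths["Asia"], 1))    # 30.0 -- the anchor width
print(round(widths["Europe"], 1))  # 15.0 -- one quarter the flow, half the width
```

Linear scaling (width proportional to the count itself) is the other common choice; it reads as more dramatic but makes small flows nearly invisible.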

Then, following instructions, I tried my hand at some cool style tools.  I tried the "changing color schemes" instructions and was able to make that work as advertised (although I lost Bolivia and Uruguay and was never able to restore them--I like Uruguay; my stepdaughter lives there!). In case it is noticed: I purposely "removed" the USA from the North American continent breakout map, because you can't immigrate to your own country.  I tried the "inner glow" effect--it worked poorly on thin lines, but well on wider lines. Next, I tried the drop shadow effect and was very pleased with this style; I added a drop shadow to almost every object on the map. The extrude/bevel tool was definitely "meh" given my flow lines and colors, but I did apply it to the African immigration flow, just to say that I did it.

Overall, this was considerably less frustrating for me than many previous labs. This week felt like a turning point to me....AI is starting to become a little less mysterious!  The excellent lab instructions this week definitely helped.

Monday, March 13, 2017

Module 8: Isarithmic Mapping Lab

This week we explored isarithmic mapping.  In the lecture portion of the class, we were given an overview of two types of data, several methods of interpolation, the six criteria used to select the optimal interpolation technique, and symbolization in isarithmic mapping.  We were briefly exposed to some basics of topographic mapping as well.  The lab reinforced the subject material and gave us practical application of the concepts.
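Of the interpolation methods covered, inverse distance weighting (IDW) is the simplest to sketch: each unknown point is estimated as a distance-weighted average of nearby observations, with closer stations counting more. A minimal version (the power parameter and sample observations are illustrative, not the lab's data):

```python
import math

def idw(x, y, samples, power=2):
    """Inverse distance weighted estimate at (x, y).

    samples: list of (sx, sy, value) observations. If (x, y) coincides
    with a sample location, that sample's value is returned exactly.
    """
    num = den = 0.0
    for sx, sy, value in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return value          # exact hit on an observation
        w = 1.0 / d ** power      # closer samples get larger weights
        num += w * value
        den += w
    return num / den

# Illustrative precipitation observations: (x, y, inches/year).
obs = [(0, 0, 40.0), (10, 0, 60.0)]
print(idw(5, 0, obs))  # midpoint: equal weights, so roughly 50.0
```

PRISM itself goes well beyond plain IDW, weighting stations by elevation, aspect and other terrain factors, which is why it was chosen over simple interpolation for the lab's precipitation surface.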

In the course of completing the lab, we used ArcCatalog to sift through data attributes, and ArcMap to develop a continuous-tone map.  We learned to invert data values on a legend.  And we learned how to build a contour map that made use of hypsometric tint.  We learned several new tools within ArcMap (namely, the Spatial Analyst extension, the INT tool, and the Spatial Analyst toolbar), and the hillshade relief option.

We used precipitation data derived from Oregon State's PRISM database. The precipitation data was obtained from weather observing sites reporting precipitation. The weather database is a 30-year average (1981-2010), which is the standard period of record for climatological research.  The elevation data comes from a Digital Elevation Model (DEM).  Multiple variables (based on known factors that control precipitation--also referred to as a "climate fingerprint") are applied to the elevation database to develop a modeled precipitation dataset of high spatial resolution.

I first followed the lab instructions and produced a continuous-tone map, then inverted the legend to read from left to right.

Then I created a manual ten-class hypsometric tint using the break points given in the lab, added a hillshade effect, and added data contours.  I then added map elements to prepare the map for presentation.
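Assigning each precipitation value to one of the ten tint classes is just a lookup against the break points, which Python's bisect module does directly. The break values below are placeholders, not the lab's actual breaks:

```python
from bisect import bisect_right

# Placeholder upper-bound break points (inches/year), NOT the lab's real
# values: 9 breaks partition the data into 10 classes (0 through 9).
BREAKS = [10, 20, 30, 40, 50, 60, 80, 100, 120]

def classify(value, breaks=BREAKS):
    """Return the 0-based class index for a value given ascending breaks."""
    return bisect_right(breaks, value)

print(classify(5))    # class 0: below the first break
print(classify(35))   # class 3
print(classify(150))  # class 9: above the last break
```

A value exactly on a break falls into the higher class with bisect_right; swapping in bisect_left would put it in the lower class, which is the usual choice cartographic software leaves to the classification settings.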