Mapping Environmental Justice

Environmental justice is a relatively new concept, coined in the 1980s, that combines demographics with environmental factors. GIS can better illuminate environmental justice, or injustice, in an area through the comparison and illustration of data.

 

Above is what could be considered a “story map” of environmental justice in three cities: Atlanta, Charlotte, and Richmond. Story maps are an interesting facet of geography in the digital world. They tell a story through media, in this case video, which helps relate maps and other geographic elements to each other and to an overarching narrative. Story maps can make maps and geographic concepts accessible in ways that may not be apparent outside a guided narrative.

The video above focuses on the environmental comparison of three cities. It’s interesting to see the concept of environmental justice present in all three. In each city there seem to be three types of areas. There is a collection of low-income areas, which may or may not be contiguous, where environmental factors are the worst. There are also areas that don’t classify as “low-income” but serve as spillover areas for undesirable environmental factors. For example, the southeast part of Atlanta isn’t classified as “low-income” but it suffers almost as much from detrimental environmental factors like the amount of diesel particulate matter in the air. These areas could be classified as not low-income but not high-income enough to be exempt from detrimental environmental factors.

And, of course, there are the high-income areas. These areas are spared from all the damaging aspects of environmental pollution. The core of the environmental justice problem arises from this perceived injustice. The argument is that even though high-income areas might mitigate negative environmental factors by paying higher prices for property, both through taxes and the bottom line of a sale, it is unjust to deny the same environmental consciousness to lower- and mid-income areas. This operates under the assumption that a quality environment, especially where negative effects are mitigable, is a human right and that denying it is unjust on a humanitarian level.

Traffic is an interesting factor to consider. At first thought, one might not think this is a factor that could be manipulated to fall within this income/environmental schema. Imagine an airport: it is constructed and the value of the residential properties within earshot plummets because of the noise pollution. Considering this evolution of value, we could assume roads follow a similar pattern. However, noise pollution from traffic has several elements that can be employed to either reduce the associated noise pollution or reduce the volume of traffic. Sound barriers can be used to limit the amount of noise that reaches residential properties. These also serve an aesthetic purpose and, while not being a designer’s first choice, can help dress up a property, further increasing its value and creating a positive feedback loop that is not present in less fortunate properties. You’ll likely find these sound barriers in high-income areas. This is not because the desire isn’t there in low-income residential areas. It’s because the financial incentive isn’t there.

The use of alternative transportation methods alleviates these traffic problems for a price. Implementation of bike-friendly infrastructure reduces the volume of traffic but may require expanding and repaving roads, reducing speed limits, adding sidewalks, and developing areas along roads to accommodate bike paths and bike lanes. This brings us back to our original investment problem. What is the worth of implementing these projects if there is no return? Should bike-friendly infrastructure be legislated as a human right? Pedestrian traffic elements like footpaths, elevated footpaths, trails, and greenways follow the same logic and would provide similar results. If a neighborhood’s local amenities are easily accessible by foot, local traffic is reduced. HOV lanes and carpooling elements follow the same logic as well.

Lead paint is another element that carries some interesting implications. Two important legislative measures in the 1970s and ’80s reduced the amount of lead in paint used in commercial and residential applications. A younger city like Charlotte shows a lower concentration of lead paint compared to a city like Atlanta. Charlotte’s construction boom was in the late ’80s and ’90s, a time when the widespread use of lead paint was discouraged by legislation. Richmond and Atlanta, however, developed before this preventative legislation, and the higher amounts of lead paint corroborate this.

Should these environmental factors be legislated? Should they be retroactive? Should they apply for new developments only? Should the free market continue to dictate the environmental quality of neighborhoods? These are all questions that environmental justice initiatives seek to address. Again, data is the light in the darkness to illuminate these issues and cartography is a medium that can present it to the minds that may one day solve it.

Working with bathymetry, rasters, and environmental models

Working with ArcMap is proving to be rewarding. The more of the program’s capabilities I become familiar with, the more the scale of possible projects becomes apparent. The critical cartographic element is also becoming more obvious: data on a map is only useful if it’s presented in a legible and easily interpretable manner. Avoiding convolution is something I want in all my maps.

lab-7-attempt-2

This map shows elevation both above and below sea level. The sound is a deep blue to represent its depth below sea level. The lighter blues and teals represent waterways that are shallower than the sound. If I could redo the color scheme on the map I would make sea level a more neutral color and exaggerate the blues to make identifying values below sea level easier. An interesting design choice was the use of the map inset, which magnifies the city of Seattle. All of this was done within ArcMap using the draw tool. Although it’s not as functional as something like Photoshop, it still provides some editing and design functionality. If I were to redesign it I would enlarge the inset so the canopy of the city would be more visible, possibly to the point where individual buildings could be picked out.

The use of color really makes this map stand out. Normally a muted palette would be my go-to scheme, but this seems to work fine against the light grey background. I’m starting to enjoy the design element of putting maps together. The inset was a unique way to incorporate the second map of Seattle. I’d like to use Photoshop for post-production to create some design elements that may be beyond what ArcMap can do.

wood-turtle

This project concentrated on identifying habitable areas for the wood turtle in Keene, New Hampshire. Working with this data was interesting because it came as a land use raster of the entire state of New Hampshire. ArcMap’s robust toolkit gives you many options to achieve a given result.

To begin I clipped the raster data to the Keene city limits using a provided shapefile. I then used the “Build Raster Attribute Table” tool to make the raster’s attribute table editable. After starting the editor, a field was added in the attribute table to indicate the suitability of each category of land use. A “0” indicated uninhabitable land and a “1” indicated habitable land. The raster was then exported to a shapefile, the symbology adjusted, and the geometry calculated. The final product is the map above.
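For reference, the same idea could be scripted with arcpy instead of clicking through the ArcMap interface. The sketch below swaps my manual attribute-table edit for Spatial Analyst’s Reclassify tool; the dataset names and the land use class codes are placeholders, so treat it as a rough outline rather than the exact workflow I used.

import arcpy
from arcpy.sa import Reclassify, RemapValue

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\keene"  # hypothetical workspace

# Clip the statewide land use raster to the Keene city limits shapefile
arcpy.Clip_management("nh_landuse", "#", "keene_landuse",
                      "keene_city_limits.shp", "", "ClippingGeometry")

# Make sure the clipped raster has an attribute table to work from
arcpy.BuildRasterAttributeTable_management("keene_landuse", "Overwrite")

# Collapse the land use classes into 0 (uninhabitable) and 1 (habitable);
# the class codes below stand in for the real NH land use codes
suitability = Reclassify("keene_landuse", "VALUE",
                         RemapValue([[1, 0], [2, 0], [3, 1], [4, 1], [5, 1]]))
suitability.save("keene_suitability")

# Convert to polygons so symbology and geometry can be handled like any other shapefile
arcpy.RasterToPolygon_conversion("keene_suitability", "keene_suitability.shp",
                                 "NO_SIMPLIFY", "VALUE")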

The next part of the assignment was to find the amount of land in square feet and square miles that was habitable and uninhabitable. Since the information in the attribute table, labeled “count”, did not have a unit of measurement assigned to it, I felt like getting creative. Wikipedia states Keene, NH is 37.5 square miles. In total, the count for both habitable and uninhabitable land was 118,922 unknown units. Using the statistics option in the attribute table, the count can be separated into uninhabitable (38,244) and habitable (80,678) cells. Using some simple math we can calculate the percentage of the count that is uninhabitable:

38244 / 118922 = 0.32158…

We can then multiply by our earlier measurement of 37.5 square miles for Keene:

0.32158… * 37.5 = 12.05…

This gives us roughly 12.05 square miles of uninhabitable land. Doing the same calculation with the 80,678 count gives the habitable square mileage. Finding the square footage is just a simple conversion from there.
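The same back-of-the-envelope math in a few lines of Python, no GIS libraries required:

# Cell counts pulled from the raster attribute table
uninhabitable_count = 38244
habitable_count = 80678
total_count = uninhabitable_count + habitable_count  # 118922

keene_sq_miles = 37.5  # per Wikipedia

uninhabitable_sq_mi = keene_sq_miles * uninhabitable_count / total_count
habitable_sq_mi = keene_sq_miles * habitable_count / total_count

SQ_FT_PER_SQ_MILE = 5280 ** 2  # 27,878,400

print(round(uninhabitable_sq_mi, 2))  # ~12.06 square miles
print(round(habitable_sq_mi, 2))      # ~25.44 square miles
print(round(uninhabitable_sq_mi * SQ_FT_PER_SQ_MILE))  # same figure in square feet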

While this might be the “long route” to these measurements, I believe it is important to do this kind of exploration to really understand the concepts and the toolkits associated with them. You definitely appreciate the shortcuts more after taking the longer route.

Mapping Matthew

Hurricane Matthew was a Cape-Verde type hurricane that quickly but briefly reached category 5 status in the Caribbean Sea and ran along the eastern United States coast for three days before dissipating off the coast of North Carolina into the Atlantic.

Forecasting the movement of a hurricane is a huge deal in the meteorological community for reasons of public safety and risk assessment. Mapping the track of a hurricane after it has passed can be just as fun and is definitely a good exercise for developing GIS skills.

hurricane-matthew-nicole-weather-underground

Acquiring data is always the first step, and for this project Weather Underground had a robust table of information about the track of the storm, including latitude and longitude data, which alleviates the need to convert the data into something ArcMap can read.

Copying and pasting this information into an Excel document and saving it in either .txt or .csv format allows it to be directly imported into ArcMap. Once it’s in the ArcMap table of contents we can right-click it and select “Display XY Data”. Since latitude and longitude data is present, ArcMap can automatically relate the data and create a shapefile. Of course, the geographic coordinate system will have to match the basemap, in this case WGS 1984, in order to display correctly.
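The “Display XY Data” step can also be scripted. Here’s a rough arcpy sketch; the file path and the “Lon”/“Lat” column names are assumptions on my part, since the Weather Underground table may label them differently.

import arcpy

# WGS 1984, to match the basemap's geographic coordinate system
wgs84 = arcpy.SpatialReference(4326)

# Turn the lat/long table into a point event layer, then save it as a shapefile
arcpy.MakeXYEventLayer_management(r"C:\data\matthew_track.csv",
                                  "Lon", "Lat", "matthew_points", wgs84)
arcpy.CopyFeatures_management("matthew_points", r"C:\data\matthew_track.shp")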

After the data is on the map, the next step is applying graduated symbology to visually present the strength of the storm at the different data points. In the Excel spreadsheet, I used the “replace” command to replace the “tropical storm” status with 0 and the “category 1-5” statuses with their corresponding numbers. This makes it easier for ArcMap to interpret the data.
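The same substitution could be done outside Excel with a few lines of pandas. The “Storm Type” column name below is a guess; whatever the Weather Underground table actually calls that column would go there.

import pandas as pd

track = pd.read_csv("matthew_track.csv")

# Map the text status to a number ArcMap can use for graduated symbols
status_to_number = {
    "Tropical Storm": 0,
    "Category 1 Hurricane": 1,
    "Category 2 Hurricane": 2,
    "Category 3 Hurricane": 3,
    "Category 4 Hurricane": 4,
    "Category 5 Hurricane": 5,
}
track["Intensity"] = track["Storm Type"].map(status_to_number)
track.to_csv("matthew_track_numeric.csv", index=False)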

Hurricane Nicole was added in a similar manner to provide a more complete meteorological picture of the region at the time. The Hurricane Nicole data on Weather Underground didn’t have category information, so I went ahead and converted the wind speed, which is provided, to the 0-5 storm intensity scale. Perhaps the data is incomplete because, at the time of this writing, Hurricane Nicole is still an active storm.
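That wind-speed-to-category conversion is just a lookup against the Saffir-Simpson thresholds (in mph). A small Python sketch, with the wind speed column name assumed:

import pandas as pd

def saffir_simpson(wind_mph):
    # Convert sustained wind speed (mph) to the 0-5 intensity scale used on the map
    if wind_mph >= 157: return 5
    if wind_mph >= 130: return 4
    if wind_mph >= 111: return 3
    if wind_mph >= 96:  return 2
    if wind_mph >= 74:  return 1
    return 0  # tropical storm or weaker

nicole = pd.read_csv("nicole_track.csv")
nicole["Intensity"] = nicole["Wind Speed"].apply(saffir_simpson)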

Assembling the legend is always an interesting part of the design process. I discovered the “convert to graphics” option and the “group” and “ungroup” options by right-clicking on the legend. These make editing the legend and manually adding fields and design elements noticeably easier. However, once you convert to graphics you lose some editing functionality, so it’s best to make sure all legend elements are included and finalized first.

I wanted the labels to be a big part of this map, accurately conveying the movement of the storms by notating the dates of certain points. Adding labels for the “date” field for both tracks produced more labels than were necessary; I only wanted one label per date, and several points shared the same date, cluttering the map. Right-clicking on a layer brings up the context menu, and toward the bottom is an option to “Convert Labels to Annotation”. This conversion is similar to converting the legend to a graphic in that it allows more creative freedom to manipulate individual elements. Once the labels were converted to annotation, I was able to delete the repeated dates until there was an optimal and aesthetically pleasing label distribution.

In the future I’d like to create a similar project but use the XY to Line or Points To Line tool to create a continuous projection of the data. The data must have a unique identifier to use these tools, which this Weather Underground data doesn’t have by default. I enjoyed making this map. If I had all the time in the world I’d like to get a formal meteorological education.

Amazon Order History Reports

Data is a powerful professional asset, but it’s easy to gloss over its applications outside work or school. Data in the home can be just as useful for domestic decision making and personal auditing.

Amazon is quickly becoming the Walmart of the 2010s and it’s no surprise. Online shopping makes buying and selling easier than it’s ever been. The online interface makes advanced data collection possible in ways that aren’t possible in a traditional retail experience.

ksr-3-amazon-graph-1

Amazon makes some of its data available to its users in the form of Order History Reports. Armed with this data, customers have greater insight into their purchases. This is far easier than the manual bookkeeping that would have been associated with a traditional shopping experience. I went ahead and downloaded my entire Amazon history from 2012 to 2016 and graphed it in several different ways.

I started by opening the provided CSV file in Excel and parsing the data for things I thought were relevant. About 12 metrics seemed interesting, so I decided to concentrate on those. I used plot.ly to make the graphs and Excel to curate and parse the data.

amazon-purchases
Purchase data split by month
amazon-purchases-yearly
The same purchase data distributed yearly.
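For anyone who would rather script it, here’s roughly how the monthly and yearly rollups above could be produced with pandas and Plotly. The “Order Date” and “Item Total” column names are my assumption about the report headers, and the dollar signs need stripping before the totals can be summed.

import pandas as pd
import plotly.express as px

orders = pd.read_csv("amazon-order-history.csv", parse_dates=["Order Date"])

# Strip "$" and commas from the totals so they can be summed as numbers
orders["Item Total"] = orders["Item Total"].str.replace(r"[\$,]", "", regex=True).astype(float)

# Total spending by month and by year
monthly = orders.resample("M", on="Order Date")["Item Total"].sum().reset_index()
yearly = orders.resample("Y", on="Order Date")["Item Total"].sum().reset_index()

px.bar(monthly, x="Order Date", y="Item Total", title="Amazon spending by month").show()
px.bar(yearly, x="Order Date", y="Item Total", title="Amazon spending by year").show()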

I definitely switched to Amazon for a lot of things I might have gotten locally or through other online stores. The introduction of Prime free shipping had a big impact on my and millions of others’ shopping preferences. Amazon says on their innovations page: “customers would quickly grasp that they were being offered the best deal in the history of shopping”. The truth of this statement became even more apparent when I realized the impact Amazon was having on the way I shop.

Next I loaded up the order history data in Excel and used the =COUNTIF(range, criteria) function to parse the data and find the count of the different conditions of the items ordered. I used Meta-Chart to create pie charts for presenting the data.
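A pandas one-liner can do the work of a stack of COUNTIF formulas; again, the “Condition” column name is an assumption about the report headers.

import pandas as pd

orders = pd.read_csv("amazon-order-history.csv")

# Count how many items were purchased in each condition (new, used, etc.)
condition_counts = orders["Condition"].value_counts()
print(condition_counts)  # ready to feed into a pie chart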

meta-chart
The condition of the items purchased from Amazon
meta-chart-1
The destination for the purchases made on Amazon

I thought tax data might be interesting. Online transactions are taxed under a variety of conditions. Some products aren’t taxed at all and some are taxed according to specific state and federal legislation.

tax-rate-per-year
The annual percent of taxes paid on purchases

Amazon didn’t collect sales tax from North Carolina residents at all until 2014. North Carolina sales tax is 4.75% statewide and higher in certain municipalities; in Charlotte the combined sales tax is 7.25%. Shopping through Amazon mitigates this tax and, in some circumstances, a smart shopper with a lot of time on their hands might avoid paying sales tax entirely. This is another victory for ecommerce over the traditional shopping experience. This tax differential might be closed as ecommerce and online shopping gain larger shares of the retail market and legislators attempt to recoup lost tax revenue.
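The annual rate in the chart above is just tax collected divided by the item totals within each year. A pandas sketch, with the “Item Total” and “Item Subtotal Tax” column names assumed:

import pandas as pd

orders = pd.read_csv("amazon-order-history.csv", parse_dates=["Order Date"])

# Strip "$" and commas so the money columns can be treated as numbers
for col in ["Item Total", "Item Subtotal Tax"]:
    orders[col] = orders[col].str.replace(r"[\$,]", "", regex=True).astype(float)

# Effective tax rate per year: total tax paid divided by total spent
yearly = orders.groupby(orders["Order Date"].dt.year)[["Item Total", "Item Subtotal Tax"]].sum()
yearly["Effective tax rate (%)"] = 100 * yearly["Item Subtotal Tax"] / yearly["Item Total"]
print(yearly)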

Finally, just for fun, I decided to create a graph of the different categories of items purchased.

meta-chart-2
Categorized purchases

Online retailers will have better ways of analyzing and presenting this data and whole marketing departments dedicated to managing data and its use. Imagine the capabilities of having not only your own data but the data of millions of other customers. These databases are some of the most powerful tools in the modern world and they are constantly changing how we live our lives, both online and offline.

Marble Racing Bot

If you’re an efficient person like me (the untrained eye might perceive it as lazy) you’re always looking for shortcuts to increase productivity. One of the easiest and most effective ways to increase productivity is automation. Introducing automation to any task allows a user to take their eyes off the task and direct their attention elsewhere. You might be familiar with tools like AutoHotkey, which can be used to automate the entry of data into Excel spreadsheets or, in some circumstances, *cough* Neverwinter Online *cough*, be used to play games without the presence of the player.

This type of automation in games has become known as botting. The use of these bots is quite common in games that require a lot of time to receive rewards. The popular MMORPG World of Warcraft has quite a large botting community built around it. Botting is not popular with developers, who claim it is inherently detrimental to the gaming experience. Using a bot is definitely an unfair advantage: someone who plays the game for 8 hours a day should, in theory, reap more rewards than the person who loads the bot and lets it run for 8 hours a day while at work. You could also argue that this “busy work” that is easily botted doesn’t constitute a gaming experience and is just a time sink that rewards players who dedicate the most time (and, in World of Warcraft’s case, subscription money) to the game.

The cat and mouse game botters and developers play is one of the most fun elements of botting and, in some circumstances, can be more fun than the game itself. A historical example is the lawsuit World of Warcraft developer Blizzard brought against an early World of Warcraft bot, WoWGlider. Other programs, like Honorbuddy, have come online to bring botting into the current decade. Whether you feel botting is detrimental to the gaming experience or not, it’s impossible to ignore the science behind it. Looking at this process with our white hats on, we can glean plenty of information about computer science, artificial intelligence, and social engineering.

The game marble racing on twitch.tv is an interactive gaming experience that allows players to assume the role of a marble racing down a hill. As many as 700 players have been seen rolling down these hills with hopes of achieving marble racing fame. You are awarded points and the game keeps track of your score on a leaderboard that is displayed after every race.

marbleracing

Every bot must have a vector it exploits to be useful to the player. In the case of marble racing, a player is rewarded points for every race completed, regardless of whether the player places or not. A player could come in dead last and still be awarded points. This naturally gives an advantage to an individual who can play 24/7 in this game of chance.

The game is simple to play. When given a cue, players enter the command “!marble” in the twitch chat which enters their representative marble into play.

marbleracing-2

This is the only element of the game. The rest is fanfare and trash talk between the users. Easy enough to automate, right? Somewhat. The developers have included countermeasures to prevent this kind of unethical play. This is where the cat and mouse game of botting becomes apparent. On the left side of figure 1 below, highlighted in the red boxes, you see the cue used to tell the users the game is accepting input. The cue “Enter marbles now!” is flanked by several lines of “—————–“. This is for aesthetic reasons so the users know when to input their marbles. When there are 700 users in chat, a message without these borders might be missed. It also serves an anti-botting purpose.

marbleracing-5
Figure 1

By opening an IRC client and connecting to the Twitch servers, the botter can gain access to this chat without actually having the game open on Twitch. For this exercise I used mIRC because it has preferable scripting functionality. Often the botter’s first attempt is to use a simple script to look for a certain line of text (“Enter marbles now”) and return a line in the chat (“!marble”). This can be accomplished by using the following script in mIRC.

on *:TEXT:*Enter marbles now!*:#marbleracing:{

The syntax for the above command is simple enough:

“on” is equivalent to “when an action happens”, “:TEXT:” specifies we’re looking for a specific line in the chat, identified here as “*Enter marbles now!*”, and “:#marbleracing:” specifies the exact channel in which to look for the text, which in this case is the #marbleracing channel of the twitch.tv IRC.

We could then include the following line to send out our message:

msg #marbleracing !marble

“msg” specifies we want the script to send a message, “#marbleracing” specifies the channel the message is to be sent to, and “!marble” is the message we want to send.

On the right side of figure 1 above we can see the user Cody___ using this simplified command. It unfortunately puts the message within the lines of dashes following the cue. This is indicative of bot use because these lines of dashes appear milliseconds after the cue and it’s humanly impossible to type a command that fast. The developer can use this technique to identify botters who use this simple script. Once a user’s command appears within the dashes, a moderator can mute that player or put him or her on a watch list.

The next step our botter would have to take in this cat and mouse scenario would be to include a wait command in the script to prevent the telltale sign of appearing within the dashed lines. By including the “timer” command we can make the bot wait before sending the message:

.timer 1 4 msg #marbleracing !marble

“.timer” lets the bot know we want to start a timer, “1” specifies how many times to run it (just once per cue), and “4” specifies the number of seconds to wait before sending the message.

This script is far less conspicuous and doesn’t immediately flag the user as a bot. However, when dealing with automation, one’s bot only has to be more effective than the next most effective bot. This kind of cutthroat bot vs. bot dynamic is present on Wall Street in the form of high-frequency trading, where the bot that recognizes when it’s walking into a trap has a higher chance of walking away unscathed. Twitch’s bots are far less sophisticated but they do exist, and this channel was using one that can identify automation. In this case a bot in the channel can see a user replying at the exact same interval after the cue several times and automatically mute that player, leaving them unable to play.

The next step would be to add randomness to the script to appear more human. This obfuscation is critical in gaming since players can tell when they’re playing against a robot with movement or logic that is not human.

Luckily mIRC has randomness elements we can take advantage of:

.timer 1 $r(3,14) msg #marbleracing !marble

By replacing the fixed delay with “$r(3,14)” we’ve made our wait time random, so we no longer trigger the other bot. “$r” specifies we want a random value and “(3,14)” is the range in seconds. The bot now waits between 3 and 14 seconds before posting.

At this point we have a fully functioning bot that can play the game indefinitely without being automatically detected. However, we can further increase randomness and obfuscation by introducing more elements to the script.

Players can append what type of marble they want to race with. Options include planets, colors, basketballs, and all sorts of other fun cosmetic twists. Since players normally choose to specify what marble they’d like to use, it only makes sense if our bot chooses a style as well. We can take it one step further and randomize which marble the bot chooses by including a random variable function in the script. Our final script looks like this:

on *:TEXT:*Enter marbles now!*:#marbleracing:{
  ; pick a random number between 1 and 30
  var %random = $rand(1,30)
  ; most rolls enter a marble after a random delay, usually with a style appended
  if (%random == 1) .timer 1 $r(3,14) msg #marbleracing !marble pool9
  if (%random == 2) .timer 1 $r(3,11) msg #marbleracing !marble eyeball
  if (%random == 3) .timer 1 $r(3,11) msg #marbleracing !marble basketball
  if (%random == 4) .timer 1 $r(3,11) msg #marbleracing !marble jupiter
  if (%random == 5) .timer 1 $r(3,11) msg #marbleracing !marble pool9
  if (%random == 6) .timer 1 $r(3,11) msg #marbleracing !marble neptune
  if (%random == 7) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 8) .timer 1 $r(3,11) msg #marbleracing !marble black
  if (%random == 9) .timer 1 $r(3,11) msg #marbleracing !marble pink
  if (%random == 10) .timer 1 $r(3,11) msg #marbleracing !marble imGlitch
  if (%random == 11) .timer 1 $r(3,11) msg #marbleracing !marble earth
  if (%random == 12) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 13) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 14) .timer 1 $r(3,11) msg #marbleracing !marble Kappa
  ; 15 and 30 do nothing: the bot sits this race out
  if (%random == 15) return
  if (%random == 16) .timer 1 $r(3,14) msg #marbleracing !marble pool9
  if (%random == 17) .timer 1 $r(3,11) msg #marbleracing !marble eyeball
  if (%random == 18) .timer 1 $r(3,11) msg #marbleracing !marble basketball
  if (%random == 19) .timer 1 $r(3,11) msg #marbleracing !marble jupiter
  if (%random == 20) .timer 1 $r(3,11) msg #marbleracing !marble pool9
  if (%random == 21) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 22) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 23) .timer 1 $r(3,11) msg #marbleracing !marble black
  if (%random == 24) .timer 1 $r(3,11) msg #marbleracing !marble pink
  if (%random == 25) .timer 1 $r(3,11) msg #marbleracing !marble imGlitch
  if (%random == 26) .timer 1 $r(3,11) msg #marbleracing !marble earth
  if (%random == 27) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 28) .timer 1 $r(3,11) msg #marbleracing !marble
  if (%random == 29) .timer 1 $r(3,11) msg #marbleracing !marble tree
  if (%random == 30) return
}

It isn’t the prettiest script and could be simplified to a degree. What we have now is an array of options the script can choose from when it sees the “Enter marbles now!” cue. When the cue appears, the script randomly chooses a number between 1 and 30, runs the matching option after its random delay, and then waits for the next cue.

The final script randomly chooses a marble style, furthering the appearance of human behavior. There are also two options which do nothing when selected. Human players don’t participate in every single game, so by having our bot sit out roughly 1 out of every 15 races it becomes more reminiscent of human behavior, further disguising the automation.

This is just a small example of the cat and mouse games that go on between developers and the people searching for shortcuts. It is far from finalized, however. The perfect bot would be able to adapt to changes in the game’s operation. This particular script would fail if the game stopped producing the “Enter marbles now” cue, which is one change the developer might make in the future, and then it’s back to square one.

As advances in robotics, artificial intelligence, and behavioral science are made, bots are bound to become more and more advanced. Eventually we might have a bot that is truly able to pass the Turing test. And it might be sooner than we think.

Marble Racing: twitch.tv/marbleracing
Marble Racing in action: https://youtu.be/qhwgkH8OAw8