The 2016 presidential election was fraught with anxiety and fueled by emotional statements and drama. As the public, we witnessed opinion after opinion about who would win and who would lose. The media, the pollsters, and many reputable sites predicting the outcome were just plain wrong.
Many of us visited sites like FiveThirtyEight, which displayed current, data-driven infographics, to stay up to date. Unfortunately, infographics aren't always accurate, especially because the people who create them are only human. And humans, whether they intend to or not, inject bias into data, skewing the results.
A recent Fast Company article explains how that data depends on polling: if the polling source is wrong, so are the infographics that paint the picture. Uncertainty is also a big factor in the discrepancy. “Uncertainty is a problem with which the data media industry continues to grapple. It’s ultimately unfair to pin it on the populace when they don’t understand probability—as all designers know, you can’t change people, you have to change design for them. That’s especially true when we’ve learned that much of the source data was wrong to begin with. In fact, since the election we’ve learned that the difference between polling and voting was beyond the industry standard margin of error of ~3.5% in ten states,” the article stated.
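For context on where a figure like ~3.5% comes from, the sketch below is a minimal, illustrative Python calculation of a poll's sampling margin of error at roughly 95% confidence, assuming a simple random sample. The 800-person sample size is hypothetical, not taken from the article; the point is that random sampling alone leaves a few points of unavoidable uncertainty, so misses larger than that suggest problems in the source data itself.

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5, z: float = 1.96) -> float:
    """Approximate sampling margin of error at ~95% confidence.

    Uses the textbook formula z * sqrt(p * (1 - p) / n). Real election polls
    add weighting and design effects, so treat this as a rough sketch.
    """
    return z * math.sqrt(proportion * (1.0 - proportion) / sample_size)

# Illustrative only: a hypothetical poll of 800 likely voters.
n = 800
moe = margin_of_error(n)
print(f"Sample of {n}: margin of error is about ±{moe * 100:.1f} points")
# Prints roughly ±3.5 points, in line with the industry-standard margin the
# article cites. Note that this number only covers random sampling noise;
# it says nothing about whether the people sampled resemble the people who
# actually vote, which is the kind of bias the article is describing.
```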
Fast Company also quotes Giorgia Lupi, co-founder of the data-driven research firm Accurat: “Let’s just stop thinking data is infallible. It’s not. Data is primarily human-made, and reflects our errors in judgement.”
As designers presenting data, we have to ask ourselves whether we are injecting bias into our visuals. It’s our responsibility to remain as neutral as possible, but when the sources all point to one conclusion, that can be difficult to achieve.
Read Fast Company’s article for more information: “Why We Had No Idea Trump Would Win.”