Claire Tills

Quality Research Part II: Publicizing Research

In my last blog post, I discussed (very generally) what can make research “bad.” Now I want to explore the second step of good research: presenting it to the world accurately. Even methodologically sound research (check the prior blog post for more on this) can be misrepresented.


However, the main focus here is the proliferation of bad research. The biggest issue with this sort of research is how sticky it can be. People will continue to quote the erroneous statistics they've heard, even if they've seen a correction or rebuttal of the statistic. Memory is weird; they might not remember the correction at all, only recalling and repeating that bad statistic.


Hank Green broke the issue with bad statistics down very concisely with a specific example. Two of his points are incredibly important here.

First, he points out why this type of research spreads so effectively. Because it was conducted with a very specific point of view (or preconception) in mind, it often aligns with the audience's biases. It supports beliefs we already hold, making us more likely to share it without any interrogation or fact checking (confirmation bias in action).

Next, he explains the underlying issue: most of us don’t double-check statistics in the news. We don’t seek out the primary source to see how and by whom the research was conducted. We’re even less likely to do so when the research supports our opinions. *A separate issue that I don’t want to explore in detail here is the outrage-sharing of bad statistics (sharing statistics because they are bad).*


For an example specific to information security, this article from UBNow (out of the University at Buffalo) is a decent start. The research isn't quite as bad as the study Green breaks down. Feel free to read the study for yourself and see if you can identify the issues.


Right out of the gate, the article about the research leads with a fairly sensational statement that doesn't adequately represent the findings.


Whether cracking digital security for good or ill, hackers tend to be people who are manipulative, deceitful, exploitative, cynical and insensitive, according to research from the School of Management.


This is a perfect example of the tension between research writing and journalism. The measured language required to talk about research does not lend itself to the type of writing expected of journalists. Qualifying language like "results suggest that," "tend to be," and "score high on [a given] scale" can get lost in the translation from academic article to media article. In the UBNow journalist's defense, they did use "tend to." Many journalists get this right consistently; many others don’t.


The UBNow article was also limited by what was in the original research article, which didn't accurately represent the limitations of the sample. A study of 439 college students will never be representative of "hackers." Convenience sampling is often necessary to get any research done, but it is inherently limited. Researchers and journalists need to understand and admit those limitations.


It's hard in this case to separate the issues with the initial research from the articles written about it. The journalist was working from a flawed source, but given how hard they went in the first sentence, it's difficult not to be frustrated with the article. Journalists need to be more careful when writing about research, and readers need to make sure they're actually checking the source material.


There are so many factors at play here that I'm not actually sure how to solve this. Research standards, the immense pressure on journalists to grab attention, low research literacy, and lack of access to original sources (thanks to paywalls) are just a few that come to mind as I'm writing this.


However, there are some things we, as consumers of research, can do.

  • Learn a little about data/research literacy: Crash Course and Arizona State University developed an excellent video series about data literacy. There’s a video just on Data in the News.

  • Find and evaluate the primary sources for statistics we share: whether a statistic makes sense to us or infuriates us, we should all get into the practice of reading source material. If the source material isn’t available or is questionable, we shouldn’t share it.

  • Share good research with proper citation: Make it a habit to share and talk about good research in a way that others can track down the source material.

  • Engage with researchers and participate in their projects: one way to ensure researchers have representative samples is to be part of the sample (when appropriate). Even just reaching out to researchers to discuss their work is a huge step and will be the highlight of their whole month.

Ideally, everyone would be driven to understand information security phenomena accurately and completely, without bias or preconception, and they would be able to make a living doing that work. Barring that fantasy, we all just need to properly tune our BS detectors and think critically about the statistics we read and share. Even the ones with which we agree.


Go forth and read more research, share more research, and if you're feeling particularly awesome, find research to participate in!
