I got my major credits for both History and Political Science but was a few credits short on my core requirements, so I decided to pick up a minor in Criminal Justice during my last semester. I thought PoliSci was bad in terms of opinions being treated as facts, alchemical postulations being regarded as high science, and a general culture of trendy cults around particular texts/authors being the deciding factor in who and what gets taught and believed; CJ makes PoliSci look like Biology.
Sociology in general is far closer, metaphorically speaking, to alchemy than chemistry. The problem is that Criminology is used to make very important decisions about criminal law, incarceration, sentencing, and community standards. One could claim that PoliSci has a similar relationship with policy making, but it doesn't hold up. Political Scientists spend more time doing interviews and tweeting than they do running countries, or even experiments (case in point: my idea of using North Dakota and South Dakota as a geographical laboratory to run 10-year-long, competing governing-structure experiments was received as a joke). Criminologists are actively involved in policing standards/regs, sentencing, etc.
With the stuff I've come into contact with, the allegations leveled in this article do not seem surprising.
A minor in Criminal Justice is not necessarily a useful point of reference for the field. Most undergraduate Criminal Justice degrees are geared more pragmatically toward training police, parole officers, and such. CJ majors (and certainly minors) can graduate without ever being in shouting distance of actual criminological research.
That's not to say that the field is without problems, but what you're describing is not well representative of what we're talking about when we talk about the journal Criminology.
A fair point, and something I considered before posting. There are a few factors that lead me to believe my experience is quite representative.
1) Every single teacher I have (five of them) is a PhD in criminology.
2) Four out of five classes use exclusively criminological research papers and peer-reviewed journal articles for their readings/assignments. The fifth is exclusively case law and commentary/treatises on it.
3) Each of those four classes covers a rough average of 15 papers, the lowest being 10 and the highest 20 or more.
4) The institution I am at is a leader in the field of criminology. In addition, the CJ program is paired very tightly with the law school.
5) The institution I am at is decidedly anti-LEO in both form and content. Though there are a number of matriculating cops and a smattering of future LEOs, the vast majority of undergrads in the program are sociology students aiming for law school, social work, or criminology.
I understand your point, and I would wager that it is generally true, but not for my case. TBH, it is the only reason I posted.
I also shared this article with every one of my professors, a few of the guest lecturers, and the dean of the school.
I appreciate your taking the time to respond to this idea that somehow because you were only a 'minor' in CJ, your opinion is 'not a useful point of reference'.
"Not necessarily". The following comment was a very useful qualification.
For reference, I was a philosophy major, and at no point would I have considered myself qualified to opine on the quality of current philosophy. Similarly, a close friend who was a physics major has now completed a postdoc in a subfield and still hesitates to make sweeping statements about the state of "physics."
And lest anyone be left with the wrong impression, criminologists are not running much of anything in the administration of criminal justice in the United States.
As someone who has claimed to be a part of the wider world of criminology, I am a bit puzzled how you could postulate that criminologists are not deeply involved in the criminal justice structure in the United States. Criminologists have been structurally incorporated into CJ at every level of policy making and implementation. Sociologists formulate the very rules for conducting studies (via IRB), sociologists gather, collate, shape, and deliver the data, sociologists write the papers, journals, and articles that use the data, and sociologists run the bodies and organizations that advise law enforcement, government, and universities that define values, decide policy, and force conformity (A/B/C C/B/A order).
Probation over incarceration was a criminological theory made policy. Decriminalization of (x) is criminologists postulating about causal relationships. More importantly, criminologists are heavily consulted by the entertainment industry in terms of form and content.
Regardless, I feel completely comfortable commenting on the abysmal state of reasoning I see at my school. I did the same to the PoliSci department, for the same reasons. People who cannot define or explain the mathematical and/or computational logic underpinning their methodology have no place relating conclusions and analysis from said studies. The methodology for encoding life factors, interactions, and ethno-social details alone should be the biggest red flag. There are so many little biases, arguments from consensus, and cognitively dissonant perspectives that it becomes death by a thousand errors.
I hope these "data thugs" rip sociology to shreds. It is a good thing. These are all my opinions and observations, of course, but I truly believe that all of the social sciences are in an alchemical state; whoever can convince the patrons that they can cull prosperity/equality/justice/peace/wellness from the data gets the support and favor. But it is bullshit. Rather, it isn't science. Not yet. There are currently too many taboos, too much common knowledge, too much politicking. Sociology needs to be culled in a compassionately dispassionate way by external forces, with the hope that the dross will be scraped away so the concrete value, however much or little is there, can be recognized and built upon.
If you look at the author's published papers[0], just about every one involves highly sensitive political and social topics. That means they're likely to be quoted outside of the field, where people will say things like "Look, it's scientific; it was published in a peer-reviewed journal!"
Young adults have this guy as a professor and they believe that surely their professor knows what he's talking about.
This story is a few months old, the retraction request looks serious[1], and I'm left thinking that either Pickett has gone off the rails or this entire field looks awful.
The paper in question has 78 citations and Stewart (the one accused of fabricating data) has 5,712 citations according to Google Scholar.
Very interesting. Here is a list of the 5 articles from that discussion:
· Johnson, Brian D., Eric A. Stewart, Justin Pickett, and Marc Gertz. (2011) Ethnic threat and social control: Examining public support for judicial use of ethnicity in punishment. Criminology, 49(2), 401-441.
· Stewart, Eric A., Ramiro Martinez, Jr., Eric P. Baumer, and Marc Gertz. (2015) The social context of Latino threat and punitive Latino sentiment. Social Problems, 62(1), 69-92.
· Mears, Daniel P., Eric A. Stewart, Patricia Y. Warren, Miltonette O. Craig, and Ashley N. Arnio. (2019) A legacy of lynchings: Perceived black criminal threat among whites. Law & Society Review, 53(2), 487-517.
· Stewart, Eric A., Brian D. Johnson, Patricia Y. Warren, Jordyn L. Rosario, and Cresean Hughes. (2019) The social context of criminal threat, victim race, and punitive black and Latino sentiment. Social Problems, 66(2), 194-221.
· Stewart, Eric A., Daniel P. Mears, Patricia Y. Warren, Eric P. Baumer, and Ashley N. Arnio. (2018) Lynchings, racial threat, and whites’ punitive views toward blacks. Criminology, 56(3), 455-480.
All centre around racial issues in the criminal justice context.
From that post: "I do not have any solutions to the systemic problems here, but improvements should be easy. Criminology as a field has to improve in terms of making data available with full documentation and reproducible code. That would make errors detectable sooner."
Sadly, this is a predominant trend in academia. Some would argue it has always been there, but it does seem to be significantly more prevalent and unashamed in its presentation these days.
It's quite shocking that the editor in chief of the 'Criminology' journal seems unconcerned about papers in his journal that rely on fabricated data. It calls into question the veracity of all papers published in his journal. Does anyone have any insight into criminology as an academic field?
Yes. He hasn't been editor in chief for long; the Crim editorship rotates around through top departments every few years. I wouldn't take his position as particularly determinative.
Shawn Bushway's take is probably most representative of the field. If we're going to be taken seriously (insert Arrested Development shot) we need to be able to deal with this.
Criminology has thus far mostly avoided the replication crisis, so there hasn't been much reckoning with sloppy research -- much less with a major figure in the field just making up data. And the co-author network here touches on several of the top programs, so there's plenty of drama.
Yea... Maybe I’m taking it a bit far here, but my spontaneous thought reading this was “How can this McDowall character call himself a scientist...? What the fuck is happening to the proud tradition of scientific research...?”
I haven't taken any Criminology classes, so I cannot comment on the level of rigor in even a subset of the field, but this resembles a stereotypical case of "non-mathematics liberal arts majors who thought they were done with math".
There is a sadly significant, culturally innumerate population who will look at you with scorn and anger for even daring to bring up numbers, even for something straightforward like "does a year of typical cat food or small-breed dog food cost more", instead of trying to guess based on concepts like "beef is more expensive than fish" or "crazy cat ladies are willing to pay more". Their reaction seems more like ignorance than "caught in a lie" bullshitting mode.
It also sounds like the TV news screenshots mocked online for poll percentages that overflow 100%. I did research and replicated the sources back in high school: they often took numbers from multiple related sources and questions and miscombined them. "Like they had the unpaid interns who knew how to use a computer source the data" was the trope explanation, but it may well have been the senior professionals who did it.
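For what it's worth, the overflow case is the kind of thing a trivial script could catch before air time; a toy Python sketch (all numbers and labels invented here) might look like this:

```python
# Toy sanity check for the error described above: percentages for
# mutually exclusive poll responses should sum to roughly 100%.
def check_poll_totals(responses, tolerance=1.0):
    """responses: dict mapping answer -> reported percentage."""
    total = sum(responses.values())
    if abs(total - 100.0) > tolerance:
        print(f"WARNING: percentages sum to {total:.1f}%, not ~100% "
              "- were these pulled from different questions?")
        return False
    return True

# Numbers miscombined from two unrelated questions blow past 100%.
check_poll_totals({"Approve": 62.0, "Disapprove": 47.0, "Unsure": 9.0})
```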
I think this is indicative of the publish-or-perish mentality that universities have instilled among their researchers and professors. It creates an incentive to fake data, to fake papers, anything to publish. Those who get upset at closer inspection of their papers and their data sound more like they have something to hide, or are afraid that the spotlight of closer inspection will turn on them. Then again, this could speak to the larger issue universities are having with aversion to critique and disagreement.
The author collecting the data is a pretty clear conflict of interest that I guess I've never given much thought. Considering how difficult/expensive/time-consuming it is to interview people, it's kind of odd that publications don't require some sort of certification from whichever organization collected and/or funded the collection of the data.
I suspect it hasn't been given much thought because of how logistically difficult it would be, communications-wise, and because of the value of context. Take an example that is simple yet still complex: say Microsoft declares that coding and software design must be kept absolutely separate on a personal level; you can design something or write the code for it, but not both. That would be a major pain for no concrete gain - like trying to produce bottled water by burning hydrogen and oxygen on the grounds that it is "purer" that way, ignoring that the apparatus introduces parts-per-billion contaminants comparable to a reverse osmosis filter.
As for interviews, what would another layer of gatekeeping even add beyond calling one or both parties to verify that the transcript wasn't lifted from a satirical news site?
I am dumbfounded that respectable journals still don't regularly engage in data escrow with third-party adjudication precisely for this reason. A credible question about the veracity or interpretation of the data, in terms of objectively presented statistical heuristics, should automatically grant confidential data access for further peer review. This could be accomplished easily with threshold-type cryptographic tools, even amongst hierarchies of authors. These studies are the basis of our science and of its credibility. It's absurd we don't take precautions against the malefactors who would pervert it.
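To make "threshold-type cryptographic tools" concrete, here is a minimal Python sketch of Shamir secret sharing, one standard way to split a dataset's decryption key so that any k of n escrow parties can jointly reconstruct it while fewer learn nothing; the party count, threshold, and key below are purely illustrative, not a claim about any journal's actual workflow:

```python
# Minimal sketch of a threshold scheme (Shamir secret sharing over a prime
# field): the dataset's encryption key is split into n shares so that any k
# of them reconstruct it, but fewer reveal nothing. Parameters are illustrative.
import secrets

PRIME = 2**127 - 1  # Mersenne prime; large enough to hold a 16-byte key

def split_secret(secret, n, k):
    """Split an integer secret (< PRIME) into n shares; any k recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation of the polynomial at x
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)            # stands in for the data-encryption key
shares = split_secret(key, n=5, k=3)      # e.g. authors, journal, independent adjudicator
assert recover_secret(shares[:3]) == key  # any 3 of the 5 shares suffice
```

Under a scheme like this, a credible challenge could let the adjudicator plus one other party release their shares to a reviewer without the authors being able to block access unilaterally.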
Even without any of the overly complicated tech you describe, some journals maintain statisticians on staff who evaluate everything in every paper and then help correct errors before publication. My stats prof, Stan Glantz, did this for a major cardiology journal. He said (and I agree, having reviewed plenty of papers myself) that statistical errors that affect the conclusion of the paper were found in the majority of papers.
It solves a problem, just like any other engineering solution. There are trade-offs involved, like always. Obviously you understand the problem I'm trying to solve, since you're passing judgement on my suggestion - do you have a better one? Even given all of your verbiage, I'm afraid I couldn't divine one.
I recently saw a tweet by an academic (political) economist who proposed that "you are not a real empirical social scientist unless you have published a null-results paper." Science cannot advance without them, but an individual cannot build a career on them.
The sad truth is that the majority of social sciences likely have similar problems surrounding data integrity. Good data is often expensive or hard to acquire, and once you have it, you often have to apply some sort of ad hoc interpretation in order to make it quantifiable for performing statistics. E.g. in psychology there are entire manuals for how you "encode" interactions between a therapist and a patient along various dimensions.
And then once you've got quantitative data, many social scientists don't actually apply statistics correctly, either out of ignorance or willful desperation to have some sort of positive result.
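To illustrate one common failure mode (a simulated toy example, not any particular study): test enough outcomes from pure noise without correcting for multiple comparisons, and some of them will clear p < 0.05 anyway.

```python
# Simulated demonstration: 40 outcome measures, no true effect anywhere,
# yet uncorrected testing still "finds" significant results.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(size=(50, 40))   # 50 subjects x 40 outcomes, pure noise
group_b = rng.normal(size=(50, 40))

_, pvals = stats.ttest_ind(group_a, group_b)   # one t-test per outcome
print("nominally significant (p < 0.05):", int((pvals < 0.05).sum()), "of 40")
print("survive Bonferroni (p < 0.05/40):", int((pvals < 0.05 / 40).sum()))
# Expect roughly 2 false positives out of 40 uncorrected tests; a simple
# Bonferroni correction typically leaves none, matching the (null) truth.
```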
We probably need to have some sort of standard for what constitutes acceptable-quality data, and a government-funded repository of such data to encourage reproducibility. But that may be way easier said than done.
If you think the problem is limited to the social sciences then you are sadly mistaken. It affects all disciplines that deal with statistical evidence, from comp sci to medicine to criminology. Not pure math, though, but that field has other problems, such as incorrect proofs being published and never corrected.
Anyone who says anything as inane as "The science is settled" deserves the highest form of ridicule and mockery, yet as you can see in this article the wagon circling is more about politics and power than performing good science.
The more that gets revealed about the whole academic journal scene, the more it looks like a scam and a house of cards, credibility-wise, rather than a sincere effort to advance human knowledge.
This isn't just an issue about fabricating a research paper; it also represents an enormous failure of the peer review process. Two or three reviewers is common, and yet none of them pushed back on any of the issues presented here; when others took a look, the issues became apparent. It's hard not to come away from this with the idea that the system is broken. I also don't understand why some level of sharing anonymized data isn't a requirement for all research. If replication is a cornerstone of scientific research, the very first piece of replication should be obtaining the same results from the same set of data.
I really don't understand, especially given the risk of being found out, why someone would go through all of the trouble to fabricate research like this. With a little extra effort it should be possible to produce legitimate research. Heck, even if your results are negative you can write up the research and say "Results here showed no significant difference on the question of X. However there is a need for further research, and these results simply represent the first step along that path," blah blah blah. It's really not that hard!