

Everything Hertz
Dan Quintana
Methodology, scientific life, and bad language. Co-hosted by Dr. Dan Quintana (University of Oslo) and Dr. James Heathers (Cipher Skin)

Jun 2, 2017 • 1h 2min
45: Conferences and conspiracy theories
It’s conference season, so in this episode Dan and James discuss the ins and outs of scientific conferences.
Here’s what they cover:
Research parasite award
How much do you save when you don’t run an fMRI study?
They come up with an even better name than “Research parasite”
Could the GOP weaponise the open science movement?
Conspiracy theories
Attempts to slow down science by taking science out of context
The Black Goat Podcast
The conference backchannel
Contacting people at conferences
Sitting through seminars (and not falling asleep)
Twitter conferences
Good presentations vs. bad presentations
Starting collaborations at conferences
Do conference locations matter?
Periscoping conference presentations
Links
The research parasite award: http://researchparasite.com
The GOP and science reform https://www.theatlantic.com/science/archive/2017/04/reproducibility-science-open-judoflip/521952/
The Crackpot index http://math.ucr.edu/home/baez/crackpot.html
The Brain Twitter conference https://brain.tc
Music credits: Lee Rosevere freemusicarchive.org/music/Lee_Rosevere/
Support Everything Hertz

May 19, 2017 • 1h 9min
44: Who’s afraid of the New Bad People? (with Nick Brown)
James and Dan are joined by Nick Brown (University of Groningen) to discuss how the New Bad People — also known as shameless little bullies, vigilantes, the self-appointed data police, angry nothings, scientific McCarthyites, second-stringers, whiners, the Stasi, destructo-critics, and wackaloons* — are trying to improve science.
Here’s what they cover:
Power imbalances in academia
Publication bias
Euphemisms for people who are publicly critical of science
How to go about questioning the scientific record
Peer reviewed criticism vs. blog posts
Making meta-analysis easier
Data-recycling
Well-being and genomics
Popular science books and conflicts of interest
The ‘typical’ response to a Letter to the Editor
What Dan and James do during the breaks
Why don’t people report descriptive statistics anymore?
Priming studies
Science in the media
What Nick has changed his mind about
Links
Nick on Twitter - @sTeamTraen
Nick’s blog - http://steamtraen.blogspot.no
This list is from one of James’ blog posts https://medium.com/@jamesheathers/meet-the-new-bad-people-4922137949a1
Music credits: Lee Rosevere freemusicarchive.org/music/Lee_Rosevere/
Special Guest: Nick Brown.
Support Everything Hertz

May 5, 2017 • 1h 3min
43: Death, taxes, and publication bias in meta-analysis (with Daniel Lakens)
Daniel Lakens (Eindhoven University of Technology) joins James and Dan to talk meta-analysis.
Here’s what they cover:
Daniel’s opinion on the current state of meta-analysis
The benefit of reporting guidelines (even though hardly anyone actually follows them)
How fixing publication bias can fix science
Meta-analysis before and after that Bem paper
How to correct for publication bias
Whether meta-analyses are just published for the citations
The benefits of pre-registering meta-analysis
How we get people to share their data
How sharing data doesn’t just benefit others - it also helps you replicate your own analyses later
Success is tied to funding, no matter how “cheap” your research is
How people can say “yes” to cumulative science, but “no” to sharing data
Responding to mistakes
How to find errors in your own papers before submission
We ask Daniel: i) If he could show one slide to every introductory psychology lecture in the world, what would it say? ii) What has he changed his mind about in the last few years? iii) The one book/paper he thinks everyone should read
Daniel also gives James and Dan ideas for their 50th episode
Links
Daniel on Twitter - @lakens
Daniel’s course - www.coursera.org/learn/statistical-inferences
Daniel’s blog - daniellakens.blogspot.no
Daniel’s recommended book - Understanding Psychology as a Science https://he.palgrave.com/page/detail/?sf1=barcode&st1=9780230542303
Music credits: Lee Rosevere freemusicarchive.org/music/Lee_Rosevere/
Special Guest: Daniel Lakens.
Support Everything Hertz

Apr 21, 2017 • 1h 7min
42: Some of my best friends are Bayesians (with Daniel Lakens)
Daniel Lakens (Eindhoven University of Technology) drops in to talk statistical inference with James and Dan.
Here’s what they cover:
How did Daniel get into statistical inference?
Are we overdoing the Frequentist vs. Bayes debate?
What situations better suit Bayesian inference?
The over-advertising of Bayesian inference
Study design is underrated
The limits of p-values
Why not report both p-values and Bayes factors?
The “perfect t-test” script and the difference between Student’s and Welch’s t-tests (see the code sketch after this list)
The two one-sided tests (TOST) procedure
Frequentist and Bayesian approaches for stopping procedures
Why James and Dan started the podcast
The worst bits of advice that Daniel has heard about statistical inference
Dan discusses a new preprint on Bayes factors in psychiatry
Statistical power
Excel isn’t all bad…
The importance of accessible software
We ask Daniel about his research workflow - how does he get stuff done?
Using blog posts as a way of gauging interest in a topic
Chris Chambers’ new book: The Seven Deadly Sins of Psychology
Even more names for methodological terrorists
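A minimal Python sketch of the Student’s vs. Welch’s point above (this is not Daniel’s actual “perfect t-test” script, which is an R script linked from his blog; the data here are simulated purely for illustration): the two tests agree when group variances are equal, and can diverge when variances and sample sizes differ.

```python
# Illustrative comparison of Student's and Welch's t-tests (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Unequal variances and unequal sample sizes: the situation where the
# two tests are most likely to disagree.
group_a = rng.normal(loc=0.0, scale=1.0, size=30)
group_b = rng.normal(loc=0.5, scale=2.0, size=60)

# Student's t-test pools the variances (assumes they are equal).
t_student, p_student = stats.ttest_ind(group_a, group_b, equal_var=True)
# Welch's t-test drops that assumption and adjusts the degrees of freedom.
t_welch, p_welch = stats.ttest_ind(group_a, group_b, equal_var=False)

print(f"Student: t = {t_student:.3f}, p = {p_student:.4f}")
print(f"Welch:   t = {t_welch:.3f}, p = {p_welch:.4f}")
```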
Links
Daniel on Twitter - @lakens
Daniel’s course - https://www.coursera.org/learn/statistical-inferences
Daniel’s blog - http://daniellakens.blogspot.no
TOSTER - http://daniellakens.blogspot.no/2016/12/tost-equivalence-testing-r-package.html
Dan’s preprint on Bayesian alternatives for psychiatry research - https://osf.io/sgpe9/
Understanding the new statistics - https://www.amazon.com/Understanding-New-Statistics-Meta-Analysis-Multivariate/dp/041587968X
Daniel’s effect size paper - http://journal.frontiersin.org/article/10.3389/fpsyg.2013.00863/full
The Seven Deadly Sins of Psychology - http://press.princeton.edu/titles/10970.html
Special Guest: Daniel Lakens.
Support Everything Hertz

Apr 7, 2017 • 1h 7min
41: Objecting to published research (with William Gunn)
In this episode, Dan and James are joined by William Gunn (Director of Scholarly Communications at Elsevier) to discuss ways in which you can object to published research.
They also cover:
What differentiates an analytics company from a publishing company?
How scientific journals are one of the last areas to fully adopt the dynamic nature of the internet
Data repositories
How to make a correction in a journal
The benefits of Registered Reports
When everyone asked Elsevier for a journal of negative results but no one submitted to it
How the unit of publication isn’t really indicative of science as a process
Altmetrics and gaming the system
How to appeal to a journal about a paper
Citation cartels: the dumbest crime
William’s switch from research to publishing and his shift in perspective
The crackpot index
James’ flowchart on how to contact an editor
The copyediting process
Elsevier’s approach to open peer review: should junior researchers be worried?
The one thing William thinks that everyone else thinks is crazy
William’s most worthwhile career investment
The one paper that William thinks everyone should read
Links
William’s Twitter account: @mrgunn
William’s blog: http://synthesis.williamgunn.org
The Crackpot index: http://math.ucr.edu/home/baez/crackpot.html
The paper William thinks everyone should read: http://stm.sciencemag.org/content/8/341/341ps12.full
Special Guest: William Gunn.
Support Everything Hertz

Mar 24, 2017 • 49min
40: Meta-research (with Michèle Nuijten)
Dan and James are joined by Michèle Nuijten (Tilburg University) to discuss 'statcheck', an algorithm that automatically scans papers for statistical tests, recomputes p-values, and flags inconsistencies.
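As a rough illustration of the idea behind statcheck (the real tool is an R package that also handles F, r, chi-square, and z tests, rounding conventions, and one-sided p-values; the example values below are hypothetical), here is a minimal Python sketch that recomputes a p-value from a reported t statistic and checks it against the reported p:

```python
# Simplified sketch of the statcheck idea: recompute the p-value implied by
# a reported test statistic and compare it, at the reported precision, with
# the reported p-value.
from scipy import stats

def check_t_result(t_value, df, reported_p, decimals=2):
    """Recompute a two-tailed p-value for t(df) = t_value and check whether
    it rounds to the reported p-value."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    consistent = round(recomputed_p, decimals) == round(reported_p, decimals)
    return recomputed_p, consistent

# Hypothetical reported results:
print(check_t_result(2.20, 28, reported_p=0.04))  # consistent (p ≈ .036 rounds to .04)
print(check_t_result(2.20, 28, reported_p=0.02))  # flagged as inconsistent
```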
They also cover:
How Michèle dealt with statcheck criticisms
Psychological Science’s pilot of statcheck for journal submissions
Detecting data fraud
When should a journal issue a correction?
Future plans for statcheck
The one thing Michèle thinks that everyone else thinks is crazy
Michèle's most worthwhile career investment
The one paper that Michèle thinks everyone should read
Links
Michèle's website: https://mbnuijten.com
Michèle's Twitter account: https://twitter.com/michelenuijten
Statcheck: https://statcheck.io
Tilburg University meta-research center: http://metaresearch.nl
Guardian story on detecting science fraud: https://www.theguardian.com/science/2017/feb/01/high-tech-war-on-science
The paper Michèle thinks everyone should read: http://opim.wharton.upenn.edu/DPlab/papers/publishedPapers/Simmons_2011_False-Positive%20Psychology.pdf
Everything Hertz on Twitter: https://twitter.com/hertzpodcast
Everything Hertz on Facebook: https://www.facebook.com/everythinghertzpodcast
The Startup Scientist, Dan's other podcast on boosting your scientific career: https://soundcloud.com/startup-scientist-podcast
Special Guest: Michèle Nuijten.
Support Everything Hertz

Mar 10, 2017 • 55min
39: Academic hipsters
We all know hipsters. You know, like the guy who rides his penny-farthing to the local cafe to write his memoirs on a typewriter, just because it’s more ‘authentic’. In this episode, James and Dan discuss academic hipsters: people who insist you need to use specific tools in your science, like R, Python, and LaTeX. So should you start using these trendy tools despite the steep learning curve?
Other stuff they cover:
Why James finally jumped onto Twitter
A new segment: the two-minute hate
The senior academic that blamed an uncredited co-author for data anomalies
An infographic ranking science journalism quality that’s mostly wrong
When to learn new tools, and when to stick with what you know
Authorea as a good example of a compromise between "easy" and "reproducible"
Links
The science journalism infographic
http://www.nature.com/news/science-journalism-can-be-evidence-based-compelling-and-wrong-1.21591
Facebook page
www.facebook.com/everythinghertzpodcast/
Twitter account
www.twitter.com/hertzpodcast
Music credits: Lee Rosevere http://freemusicarchive.org/music/Lee_Rosevere/
Support Everything Hertz

Feb 24, 2017 • 1h 2min
38: Work/life balance - Part 2
Dan and James continue their discussion on work/life balance in academia. They also suggest ways to get your work done within a sane number of hours, as well as how to pick the right lab.
Some of the topics covered:
Feedback from our last episode
Why the podcast started in the first place
The "Red Queen" problem
Does the "70 hour lab" produce better work?
Some experiments aren't suited to a 9-5 schedule
More tips for anonymously skiving off at work
What are the cognitive limits of focused work?
Do early career researchers even earn the minimum wage when you factor in the hours worked?
How James gets things done: Work on one thing at a time until it's done and protect your time
How Dan gets things done: Pomodoros (40 mins work, 10 minute break), blocking social/news websites
How do you pick a lab to work in?
Links
Facebook page
https://www.facebook.com/everythinghertzpodcast/
Twitter account
https://www.twitter.com/hertzpodcast
Support Everything Hertz

Feb 17, 2017 • 57min
37: Work/life balance in academia
In this episode, we talk work/life balance for early career researchers. Do you need to work a 70-hour week to be a successful scientist or can you actually have a life outside the lab?
Some of the topics covered:
An update on "the postdoc that didn't say no" story
Brian Wansink's response
De-identifying data in research
The perils of public criticism
Criticising the research vs. criticising the person
Some sage advice from a senior academic on "Making science the centre of your life"
Look for a boss that won't make insane demands of your time
How much good work is really coming out of a 70-hour week?
An old hack Dan used to pretend he was working on data when he was really just on Twitter
Links
GRIM test calculator
http://www.prepubmed.org/grim_test/
Jordan's follow-up post
https://medium.com/@OmnesRes/the-donald-trump-of-food-research-49e2bc7daa41#.me8e97z51
Brian Wansink's response
http://www.brianwansink.com/phd-advice/statistical-heartburn-and-long-term-lessons
The "Making science the centre of your life" slide
https://twitter.com/hertzpodcast/status/832501121893724160
Facebook page
https://www.facebook.com/everythinghertzpodcast/
Twitter account
https://www.twitter.com/hertzpodcast
Support Everything Hertz

Jan 27, 2017 • 51min
36: Statistical inconsistencies in published research
Topics discussed include statistical inconsistencies in published research, a caffeine study mishap, the repercussions of a research scandal, detecting data inconsistencies, and checking research accuracy with statcheck and R Markdown. The hosts share personal anecdotes, reflect on challenges in research integrity, and discuss solutions like the GRIM test and open science practices.
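Since the GRIM test comes up again here, a quick sketch of the underlying idea: when data are whole numbers (counts, single Likert items), a reported mean multiplied by the sample size must land on (or round from) an integer, so some reported means are simply impossible for a given N. The Python below is an illustrative check, not the GRIM calculator linked in the episode 37 notes, and the example figures are hypothetical:

```python
# Illustrative GRIM check for means of whole-number data.

def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Return True if some integer total of n whole-number responses could
    round to the reported mean at the given number of decimal places."""
    approx_total = reported_mean * n
    # Try the integer totals on either side of mean * n.
    for total in (int(approx_total), int(approx_total) + 1):
        if round(total / n, decimals) == round(reported_mean, decimals):
            return True
    return False

# Hypothetical reported means from n = 20 whole-number responses:
print(grim_consistent(3.45, 20))  # True: a total of 69 gives exactly 3.45
print(grim_consistent(3.48, 20))  # False: no integer total of 20 responses gives 3.48
```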


