
If you see someone yawn, you will have trouble suppressing a yawn of your own. If you see someone take a sip from her coffee, you are likely to take a sip as well. If you see someone shift her gaze, your gaze is likely to follow. These are all examples of automatic imitation, or mimicry.

An adorable rhesus monkey imitating tongue protrusion. (Source; license: CC BY 2.5)

Mimicry may serve a social function. If you see someone take a sip from her coffee, some of the same brain regions ("mirror neurons") become active that would also become active when you take a sip yourself. This neural overlap is, in a sense, a direct form of neural empathy: When I see you do something, it's like I'm doing it myself. And this overlap increases the likelihood of automatic imitation, because seeing someone else perform an action primes your brain to perform the same action yourself.

All of this may sound fanciful, and not all researchers buy into the mimicry-as-empathy notion to the same extent. But by and large this is accepted psychological theory. And personally I think that there's a lot to it.

Pupil mimicry is a specific type of automatic imitation, in which your pupils automatically adjust their size to match the pupils of someone you're looking at. So if I looked at you, and your pupils dilated, then my own pupils would dilate as well. In the past years, several high-profile …

Read more »

My (unremarkable) experience with signing peer reviews

When a manuscript is submitted to an academic journal, it generally undergoes peer review. That is, the manuscript is read by other researchers who rate the quality of the manuscript, suggest improvements, etc. Based on these peer reviews, the editor of the journal then either rejects the paper, or accepts it for publication, usually after one or two rounds of revision.

A thorny question in this process is whether reviewers should remain anonymous, or whether they should disclose their identity by signing their reviews. There's a widespread belief that signing reviews is dangerous, because authors may not appreciate your critical comments, and may even retaliate, for example by trashing your manuscript when it's their turn to review. This would be especially dangerous for early-career researchers, who don't have permanent positions and are therefore vulnerable to career damage.

A few days ago, Hilda Bastian voiced this concern in a blog post. I feel that her post is somewhat alarmist, in the sense that it starts from the assumption that signing reviews is indeed dangerous. (Although she also points out that it can in some cases help to build a reputation.)

And this prompted me to share my own experiences here.

Robert De Niro as an easily offended author.

I have signed all of my reviews, starting from the very first one, which I believe was in 2011 when I was still a junior PhD student. Since then I've reviewed about 100 manuscripts and grant proposals, most of them undergoing multiple rounds …

Read more »

Why do the US and the UK dominate the World University Rankings?

Every year, all universities in the world are ranked by academic excellence. (I'm only going to use the term 'academic excellence' once, because every time I write it, I vomit a little in my mouth.) These rankings are created by three self-appointed authorities: QS, Times Higher Education, and the Shanghai Ranking.

Below you can see where the Top 20 universities come from:

The most striking feature of these rankings is that the Top 20 consists almost entirely of US and UK institutes. In fact, only five countries appear across the three Top 20s (three lists of twenty, so 60 slots in total): US (42×), UK (11×), Switzerland (4×), Singapore (2×), and Australia (1×). And the three Top 3s even consist entirely of US (6×) and UK (3×) institutes.
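If you want to sanity-check those numbers yourself, it's a one-liner. (A throwaway snippet; the tallies are just the counts listed above.)

    # Appearances in the three Top 20s; the five countries fill all 60 slots.
    top20 = {"US": 42, "UK": 11, "Switzerland": 4, "Singapore": 2, "Australia": 1}
    assert sum(top20.values()) == 3 * 20
    print(f"US + UK share of Top 20 slots: {(top20['US'] + top20['UK']) / 60:.0%}")
    # -> US + UK share of Top 20 slots: 88%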

Not all universities are equally good, and I have no problem accepting that Oxford (#6 according to QS) is in all respects a better university than my previous academic home of Aix-Marseille Université (#411-420 according to QS). And universities may, on average and by some measures, be a bit better in one country than another. That's fine.

But the suggestion that, to a good approximation, the US and the UK are the only countries in the world where you can find good universities is ridiculous. What about Japan? What about Germany? What about France—how did Emmanuel Macron become president after receiving an education (well, if you can call it that!) at lowly Paris Nanterre (#801-1000 according to QS)? Compare that to Donald Trump, who was educated at …

Read more »

Mind the gap! Income inequality in the European Union

Inequality is a hot topic. Prominent economists, such as Thomas Piketty and Joseph Stiglitz, have raised public awareness of economic inequality: differences between people in income (what you earn) and capital (what you own). They and others have shown that economic inequality is not only huge, but steadily increasing: the proverbial "one percent" own more and more of everything there is to own. Others have focused on social inequality, such as differences in the opportunities given to women and ethnic minorities, compared (generally) to white men.

Social and economic inequality are, of course, related; in a sense, economic inequality is a subtype of social inequality. But it's a subtype that is (relatively) easy to quantify—and that's exactly what I will try to do in this post, using data from Eurostat, a public database curated by the European Commission.
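To give a flavor of what that quantification looks like, here's a minimal sketch of the kind of computation involved. The table below is a toy stand-in (invented numbers, made-up column names), not actual Eurostat data:

    import pandas as pd

    # A toy stand-in for an income table (all values invented for illustration;
    # a real analysis would load the corresponding Eurostat dataset instead).
    df = pd.DataFrame({
        "country": ["NL", "NL", "FR", "FR"],
        "sex": ["M", "F", "M", "F"],
        "median_income": [38000, 31000, 30000, 26000],
    })

    # One row per country, with male and female median incomes side by side.
    pivot = df.pivot(index="country", columns="sex", values="median_income")

    # The gap, expressed as how much more the median man earns, in percent.
    pivot["gap_pct"] = 100 * (pivot["M"] - pivot["F"]) / pivot["F"]
    print(pivot.round(1))

The same three steps (load, pivot, divide) work for any of the comparisons below: between education levels, between countries, or between income percentiles.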

We all have some gut feeling of the kinds of inequalities that exist: You probably don't need to see an income distribution to know that, on average, men earn more than women for the same job. But how much more? Do men make 50% more, or only 5%? And how does this compare to differences between people with different levels of education? Or to differences between countries? Or to differences between the very rich and the very poor within a country?

It's important to have some idea of the magnitude of the different kinds of inequality, because only then can you have an informed debate.

(Disclaimer: This is my best attempt …

Read more »

About (the fact) that

A famous writer (I forgot who or where) once complained about not being able to avoid 'the fact that' in his writing. Such inelegance! Yet he just couldn't bring himself to remove it from every sentence—which he could have done, because reducing 'the fact that' to a plain 'that' always results in a grammatically correct sentence. In fact, I once worked with a copy editor who did just that: She returned my manuscript with every instance of 'the fact that' reduced to 'that'. In all cases, the result was grammatically correct. But in some cases, the result was also atrocious.
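For the curious: the copy editor's rule is mechanical enough to fit in a few lines of code. A toy sketch, and obviously no substitute for her judgment about which results are atrocious:

    import re

    def reduce_the_fact_that(text):
        # Reduce every 'the fact that' to a plain 'that', preserving the
        # capitalization of a sentence-initial 'The fact that'.
        repl = lambda m: "That" if m.group(0).startswith("T") else "that"
        return re.sub(r"\b[Tt]he fact that\b", repl, text)

    print(reduce_the_fact_that(
        "The fact that it rains does not change the fact that I am dry."))
    # -> That it rains does not change that I am dry.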

So why do some sentences just seem to require a 'the fact that', even when it is grammatically redundant and almost universally despised? I have given this matter a disproportionate amount of thought, and arrived at the conclusion that it is all about expectations.

The word 'that' can have several grammatical roles. It can be an adjective, as in: 'that capybara'. (Which capybara? That one!) Or it can be a conjunction, which is a word that introduces a subclause, as in: 'Do you know that capybaras are the largest rodents?' ('That' can also be a pronoun, of course, but let's forget about that for now.)

Now here's the thing: You often don't know which role 'that' has until you've read the entire sentence. And that's confusing. For example, after reading 'I like that …', you still don't know whether 'that' will be an adjective ('I like that capybara') or a …

Read more »