You didn’t build that. – Barack Obama, July 13, 2012
Connectionism originates in psychology, but the “old connectionists” are mostly gone, having largely failed to pass on their ideology to their trainees, and there really aren’t many “young connectionists” to speak of. But I predict that in the next few years we’ll see a bunch of psychologists of language—the ones who define themselves by their opposition to internalism, innateness, and generativism—become some of the biggest cheerleaders for large language models (LLMs). In fact, psychologists have not made substantial contributions to neural network modeling in many years. Virtually all the work on improving neural networks over the last few decades has been done by computer scientists who cared not a whit whether their models had anything to do with human brains or cognitive plausibility.1 (Sometimes they’ll put things like “…inspired by the human brain…” in the press releases, but we all know that’s just fluff.) At this point, psychology as a discipline has no more claim to neural networks than the Irish do to Gaul, and in the rather unlikely case that LLMs do end up furnishing deep truths about cognition, psychology as a discipline will have failed us by not following up on a promising lead.

I think it will be particularly revealing if psychologists who previously worshipped at the Church of Bayes suddenly lose all interest in mathematical rigor and find themselves praying to the great Black Box. I want to say it now: if this happens—and I am starting to see signs that it will—those people will be cynics, haters, and trolls, and you shouldn’t pay them any mind.
Endnotes
1. I am also critical of machine learning pedagogy, and it is therefore interesting to see that those same computer scientists pushing things forward don’t seem to care much for machine learning as an academic discipline either.
Maybe something like https://lingbuzz.net/lingbuzz/007180 ?
Ha, you can’t make me read that 😉 I did just read the abstract though. How could a black box we don’t understand “implement genuine theories of language”? That’s a bit like saying that the mysterious aliens from Alpha Cassiopeiae who just showed up in low Earth orbit “implement genuine theories of faster-than-light travel” or something.
tbh they don’t want a genuine theory. They want Kriegeskorte & Douglas 2018: “understanding brain information processing requires that we build computational models that are capable of performing cognitive tasks.”