You can call me AI

AI models are not very good at generalizing beyond their training data.

I suppose they were going to get there eventually.

FTA (abstract):

Together our results highlight that the impressive ICL abilities of high-capacity sequence models may be more closely tied to the coverage of their pretraining data mixtures than inductive biases that create fundamental generalization capabilities.

How fortunate that they’ve phrased their results in a way that Google’s bosses are likely to overlook. :roll_eyes:

It’s a shame that, in today’s economy, there’s no money to be made in being right… :thinking:

Who, me? Bitter? Nah…
