I didn’t get the impression that the author was alleging an organized conspiracy, but rather that large numbers of anti-vaxxers all exhibiting the same behavior produce a systemic effect naturally, with no coordination required.
From my experience with YouTube, I’d say you’re right: it does work that way, and it happens all the time. That’s largely by design; it’s just how recommendation engines work, surfacing whatever videos people with similar watch histories went on to watch. In fact, that’s a large part of what you’d want the engine to do.
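To illustrate, here’s a minimal sketch of the co-watch logic that produces this kind of clustering. Everything in it (the video IDs, the toy histories, the plain co-occurrence counting) is a made-up assumption for illustration; YouTube’s real system is far more elaborate, with learned embeddings and engagement signals, but the basic effect is the same.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Toy watch histories; all IDs and data are hypothetical.
histories = [
    ["sarkeesian_ep1", "gamergate_rant", "anti_sarkeesian"],
    ["sarkeesian_ep1", "gamergate_rant"],
    ["sharpening_tools", "sharpening_knives", "gun_cleaning"],
    ["sharpening_tools", "gun_cleaning"],
]

# Count how often each pair of videos appears in the same user's history.
co_watch = defaultdict(Counter)
for history in histories:
    for a, b in combinations(set(history), 2):
        co_watch[a][b] += 1
        co_watch[b][a] += 1

def recommend(video_id, k=3):
    """Return the videos most often co-watched with video_id."""
    return [v for v, _ in co_watch[video_id].most_common(k)]

print(recommend("sarkeesian_ep1"))
# -> ['gamergate_rant', 'anti_sarkeesian']
print(recommend("sharpening_tools"))
# -> ['gun_cleaning', 'sharpening_knives']
```

Even with counting this crude, nobody has to curate the links: they fall straight out of overlapping audiences, which is exactly the systemic-effect-without-a-conspiracy point.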
Here are two such patterns I noticed. Watching videos specifically by or about Anita Sarkeesian produced recommendations for GamerGate videos, anti-Sarkeesian videos, and other things that could be accurately described as “white man rants for an hour.” Watching videos by Sarkeesian’s much-less-famous male colleagues, who discuss similar issues from a similar point of view, didn’t produce similar results, as far as I could see. (Sarkeesian doesn’t even allow comments on her videos, so the algorithm must be drawing on more than comment activity; her insane “critics” probably watch her videos over and over, which would link the two audiences.)
The other: videos about sharpening tools produced suggestions for sharpening knives and for cleaning and shooting guns. Sure, you can see how those interests might cluster for some viewers, but you can also see how other people would not be happy to stumble onto that connection.
Over the last year or so, YouTube has announced a lot of changes and crackdowns, and these particular links no longer seem obvious to me.