I don’t think they’re optimising much at all. It’s likely just a modified web crawler, minus the throttling normal search-engine crawlers use: it follows links recursively, then does some basic parsing (or even AI-assisted parsing) to prepare training data for another AI model.
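As a rough sketch of what that kind of unthrottled recursive crawl might look like (purely illustrative: `fetch` and `extract_links` are hypothetical stand-ins for real HTTP fetching and HTML link extraction, and here a toy in-memory "site" replaces the network):

```python
from collections import deque

def crawl(start_url, fetch, extract_links, max_pages=1000):
    """Breadth-first crawl with no politeness delay: every discovered
    link is fetched as soon as it comes off the queue, which is what
    distinguishes this from a throttled search-engine crawler."""
    seen = {start_url}
    queue = deque([start_url])
    pages = {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        html = fetch(url)  # no rate limiting, no robots.txt check
        pages[url] = html
        for link in extract_links(html):
            if link not in seen:  # visit each URL only once
                seen.add(link)
                queue.append(link)
    return pages

# Toy in-memory "site": each page's content is just its outgoing links.
site = {
    "/": ["/a", "/b"],
    "/a": ["/b", "/c"],
    "/b": [],
    "/c": ["/"],
}
pages = crawl("/", fetch=lambda u: site[u], extract_links=lambda links: links)
print(sorted(pages))  # every reachable page fetched exactly once
```

The parsed pages would then feed a cleaning/extraction step to turn raw HTML into training text.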