> (…) split the index, which is to say the part of Google Search that scrapes the web and makes that content searchable, from the search user interface, and manage that index as a public utility that different search services could rely on and pay for, an idea that was suggested in a recent paper.
That would be incredible. I wish regulators would make Commons Enforcement a regular part of their playbook.
Commonify the monopolizers’ complements!
@robin in other words, force Google and the other data barons to participate in crawling the web as a public utility, since it has proven too valuable/powerful to remain the proprietary property of any single company.
There’s also an efficient-resourcing argument to be made: many sites are hit hard by swarms of crawlers all collecting the same data. Frequently crawled websites would be considerably cheaper to run if mass-crawling were more regulated and operated on shared infrastructure.
@erlend Yes, exactly. I don't know if they'll support that, of course, but we're not the only ones to have mentioned it. It would be an easy case to make if we could prove that it's a natural monopoly, but that's not in the conclusions of the case at all, so I don't know how realistic it is, as cool as it would be!
@erlend @robin The EU project @openwebsearcheu is trying to create a web index as a public utility.
@djoerd @erlend @openwebsearcheu Yes, I've heard about it, I hope something works out!