
Another proposal: a standardized search protocol over HTTP that could be implemented by most web servers, plus a DNS-like system of search indexes that query websites' built-in index endpoints (rather than crawling). Essentially, decouple search from advertising by building a search function into internet standards.

Perhaps commercial products could improve on the quality of results, but simple text and keyword search could work this way.
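To make the proposal concrete, here is a minimal sketch of the kind of built-in keyword index a web server could expose at a standardized endpoint. The endpoint path (e.g. something like `/.well-known/search`), the `SiteIndex` class, and the JSON result shape are all assumptions for illustration, not part of any real spec.

```python
# Hypothetical sketch: an in-memory inverted index a web server could
# expose at a standardized search endpoint. Path and response shape
# are assumptions, not an existing standard.
from collections import defaultdict

class SiteIndex:
    """Inverted index over a site's own pages (keyword search only)."""
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of page URLs
        self.titles = {}                   # page URL -> page title

    def add_page(self, url, title, text):
        self.titles[url] = title
        for term in text.lower().split():
            self.postings[term].add(url)

    def query(self, q):
        """AND-match all query terms; return a JSON-serializable list."""
        terms = q.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(self.postings.get(t, set()) for t in terms))
        return [{"url": u, "title": self.titles[u]} for u in sorted(hits)]

idx = SiteIndex()
idx.add_page("/a", "Intro", "decoupling search and advertising")
idx.add_page("/b", "Notes", "simple keyword search over http")
print(idx.query("search http"))  # only /b contains both terms
```

A DNS-like federation layer would then fan a user's query out to each site's endpoint and merge the responses, rather than maintaining its own crawl.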



You can't do that: when you implement search you need a global view of the collection to understand which documents are most relevant. You are describing a spammers' paradise :)


This would make SEO much easier, and I'd guess it would increase the amount of spam on Google, since any website could return whatever search results it liked.

I guess search engines could apply a score to each site so that those who spam get downgraded, but then how would they do the scoring? By crawling? Then you still need a crawler and might as well build your own index.
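The scoring idea above could be sketched like this: a federated engine weights each site's self-reported relevance by an independently maintained trust score, so sites that inflate their own results sink. The trust values, site names, and result shapes here are invented for illustration; how the trust scores are computed is exactly the open problem the comment raises.

```python
# Hypothetical sketch: weight each site's self-reported relevance by a
# trust score the index operator maintains independently. All names and
# numbers below are invented for illustration.

def rerank(site_results, trust):
    """site_results: {site: [(url, self_reported_relevance), ...]}
    trust: {site: score in [0, 1]} maintained by the index operator."""
    merged = []
    for site, results in site_results.items():
        t = trust.get(site, 0.0)  # unknown sites get no weight
        for url, rel in results:
            merged.append((rel * t, url))
    merged.sort(reverse=True)
    return [url for _, url in merged]

results = {
    "honest.example": [("honest.example/a", 0.8)],
    "spam.example":   [("spam.example/x", 1.0)],  # claims max relevance
}
trust = {"honest.example": 0.9, "spam.example": 0.1}
print(rerank(results, trust))  # honest result outranks the spam claim
```

Note that this only relocates the problem: producing the `trust` table still requires some global signal about each site, which is the commenter's point about needing a crawler anyway.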




