
I've pondered this, too. Why not a DMOZ 2.0 type concept, where you seed the index with high-quality sites and grow from there? Freshness won't matter as much for most topics, so crawling X times per day probably won't be necessary. Maybe have user-defined flags to indicate the types of results they want (large sites, small sites, high authority, technical sites; maybe different sort-by features), with backlinks/domain prominence as a small factor, but also using NLP to score the authoritativeness of the content itself, so as to facilitate discovery of less-linked sites.
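A rough sketch of that scoring idea (all names and weights here are hypothetical, just to make the shape of it concrete): keep backlinks as a small, log-dampened factor and let a content-based quality signal carry most of the weight, so a lightly-linked page with strong content can outrank a heavily-linked one.

```python
import math
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    backlinks: int        # inbound link count, deliberately a small factor
    content_score: float  # 0..1, imagined output of some NLP quality model

def rank_score(page: Page, w_links: float = 0.2, w_content: float = 0.8) -> float:
    # Log-dampen backlinks so huge sites can't dominate on links alone;
    # normalize against an arbitrary 1M-backlink ceiling.
    link_signal = min(math.log1p(page.backlinks) / math.log1p(1_000_000), 1.0)
    return w_links * link_signal + w_content * page.content_score

pages = [
    Page("https://big-authority.example", backlinks=500_000, content_score=0.6),
    Page("https://obscure-gem.example", backlinks=12, content_score=0.95),
]
ranked = sorted(pages, key=rank_score, reverse=True)
# With these weights, the well-written but lightly-linked page ranks first.
```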


That already exists: https://curlie.org/


I think something that needs to be actively maintained like DMOZ is bound to burn out. I've been thinking about how you could passively maintain an index of good resources.

Imagine a service that provides you with a personal search engine in exchange for a list of your bookmarks. Those bookmarks provide the signal for what sites to index for the public search engine.
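A minimal sketch of how that bookmark signal could feed the index (everything here is made up for illustration): each user contributes their bookmarks, and any domain bookmarked by several independent users becomes a seed for the public crawl.

```python
from collections import Counter
from urllib.parse import urlparse

def crawl_seeds(user_bookmarks: dict[str, list[str]], min_users: int = 2) -> list[str]:
    """Return domains bookmarked by at least `min_users` distinct users."""
    domain_users = Counter()
    for user, urls in user_bookmarks.items():
        # Dedupe per user so one person bookmarking a site 50 times
        # counts as a single vote.
        domain_users.update({urlparse(u).netloc for u in urls})
    return [d for d, n in domain_users.most_common() if n >= min_users]

bookmarks = {
    "alice": ["https://lobste.rs/", "https://danluu.com/"],
    "bob":   ["https://danluu.com/about/", "https://example.com/"],
}
seeds = crawl_seeds(bookmarks)  # only danluu.com is shared by both users
```

Requiring multiple independent users per domain also gives you some spam resistance for free: one account dumping a thousand links only ever casts one vote per domain.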



