Everything you mentioned is possible but needs a lot of product-management-level refinement. Also, I'd be wary of anything adjustable and adaptive: it creates a bubble that may be difficult to escape should you become curious about things you don't know. Adaptive features discourage curiosity, and that's one of the things I started disliking about Google Search, among others.
There may be some room for ranking unique content above non-unique content. Chit-chat on a myriad of forums is probably not interesting unless you're looking for something very specific, in which case you'd use the query language. It would also de-rank all the non-unique marketing fluff that can be seen everywhere. Of course, this might open an opportunity for spammers to de-rank good sites by copying them, or to boost others by adding artificially unique content, but combined with the human review process it might still yield a net positive.
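To make "uniqueness" concrete: one plausible signal is near-duplicate detection via shingling plus MinHash, so pages whose content largely overlaps already-indexed pages get little or no boost. This is only a sketch of one known technique, not how any real engine does it; all names here are mine.

```python
import hashlib

def shingles(text, k=5):
    """Set of k-word shingles of a document."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def minhash(shingle_set, num_hashes=64):
    """MinHash signature: for each salted hash function, keep the
    minimum hash value over all shingles in the document."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(
            int.from_bytes(
                hashlib.blake2b(f"{seed}:{s}".encode(), digest_size=8).digest(),
                "big")
            for s in shingle_set))
    return sig

def similarity(sig_a, sig_b):
    """Estimated Jaccard similarity: fraction of matching signature slots."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

A page whose signature nearly matches many existing pages would score low on uniqueness. Note this is exactly where the spam angle bites: a copycat site collides with the original too, so the signal alone can't tell victim from attacker, which is why the human review step matters.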
Voting... I don't know, Reddit's /all is a result of popular voting. Do we really want the Internet to become all kittens, memes and showerthoughts? :)
But I'm also thinking the stereotype of what a search engine UI should look like could be reconsidered. Google has made a lot of interesting gradual improvements (Wikipedia and IMDB snippets, maps, calculator, dictionary) while leaving the core UI almost intact. For example, when skimming through the search results, do you read the web page titles or the content with your search terms highlighted? There's some work to do here for UX folks. Let alone that Google's highlighting functionality has declined considerably, to the point of being almost useless most of the time.
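The term highlighting I have in mind is not hard to do decently; here's a toy sketch (function and marker choices are mine) that wraps whole-word query matches in a snippet, case-insensitively:

```python
import re

def highlight(snippet, terms, mark=("**", "**")):
    """Wrap each whole-word occurrence of a query term in markers.

    Whole-word matching (\\b) avoids highlighting substrings inside
    longer words; re.escape protects terms containing regex metachars.
    """
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, terms)) + r")\b",
        re.IGNORECASE)
    return pattern.sub(lambda m: mark[0] + m.group(0) + mark[1], snippet)
```

The hard UX part isn't the wrapping, it's picking which excerpt of the page to show so the highlighted terms appear in a context that actually answers the query, which is where the decline I'm complaining about seems to be.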