Hacker News

I'm interested in this as well.

I have been trying to fine-tune GPT-2 on genre fiction to work as a sort of "fiction replicator". Stylistically it actually does quite well, but it lacks narrative cohesion. As you point out, this problem is corpus-agnostic.

I thought of keeping track of characters and key interactions outside of the model, but I haven't figured out how to make the two components interact reliably -- beyond having the tracker generate prompts for the language model in a kind of cooperative setting.
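To make the "tracker outside the model" idea concrete, here's a minimal sketch of what I mean. Everything here (the `StoryState` class, `to_prompt`, the sentence templates) is made up for illustration; the only assumption is that the tracked state gets serialized into text and prepended to the generation context.

```python
class StoryState:
    """Hypothetical external tracker for characters and interactions."""

    def __init__(self):
        self.characters = {}    # name -> set of attributes
        self.interactions = []  # (subject, verb, object) triples

    def add_character(self, name, *attrs):
        self.characters.setdefault(name, set()).update(attrs)

    def add_interaction(self, subj, verb, obj):
        self.interactions.append((subj, verb, obj))

    def to_prompt(self, recent=3):
        """Serialize the tracked state into a conditioning prefix."""
        lines = []
        for name, attrs in self.characters.items():
            lines.append(f"{name} is {', '.join(sorted(attrs))}.")
        for subj, verb, obj in self.interactions[-recent:]:
            lines.append(f"{subj} {verb} {obj}.")
        return " ".join(lines)


state = StoryState()
state.add_character("Mara", "a smuggler")
state.add_character("Ilen", "a priest")
state.add_interaction("Mara", "betrayed", "Ilen")
prompt = state.to_prompt()
# prompt -> "Mara is a smuggler. Ilen is a priest. Mara betrayed Ilen."
# This string would be prepended to the story-so-far before calling
# the fine-tuned model's generate step.
```

The hard part, of course, is the reverse direction: parsing the model's output back into state updates reliably, which is exactly where my attempts have broken down.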

Is there a known way to set up a transformer to do infix generation? That is: give it a start prompt and an end prompt, plus an estimated number of tokens to fill in between. That seems like it should be doable and could improve things, but I haven't found any work on this problem yet, and I haven't had the time (and possibly don't have the skills) to dig into it myself.
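One formulation I can imagine (a sketch, not something GPT-2 supports out of the box): rearrange each training example so the model conditions on the suffix first, then the prefix, and learns to emit the middle. The sentinel tokens `<PRE>`, `<SUF>`, `<MID>`, `<EOM>` are hypothetical and would have to be added to the tokenizer's vocabulary before fine-tuning.

```python
def make_infill_example(text, start, end):
    """Split text into (prefix, middle, suffix) and reorder it so a
    left-to-right LM can be trained to generate the middle span.
    Sentinel tokens here are hypothetical placeholders."""
    prefix, middle, suffix = text[:start], text[start:end], text[end:]
    # Training target: given <SUF>suffix<PRE>prefix<MID>, produce
    # the middle span followed by an end-of-middle marker.
    return f"<SUF>{suffix}<PRE>{prefix}<MID>{middle}<EOM>"


example = make_infill_example("The knight rode into the dark forest.", 11, 20)
# -> "<SUF> the dark forest.<PRE>The knight <MID>rode into<EOM>"
```

At inference time you'd then feed `<SUF>{suffix}<PRE>{prefix}<MID>` and decode until `<EOM>`; the estimated token count could be handled crudely by rejecting or truncating completions that are far off target. Whether this trains stably on a fine-tuning budget is exactly the open question.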


