There's not enough content in this genre so I gotta get our robot friends to help out haha.
It's unlikely you'll be able to run it locally... Even if you have a high-end gaming computer, most consumer-grade GPUs can't fit models this size in memory. You'd need something like an Nvidia A100 GPU (or several), which has a ballpark cost of around $15,000...
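Quick back-of-envelope math on why gaming cards don't cut it (a sketch; the 2-bytes-per-parameter figure assumes fp16/bf16 weights, and activations plus the KV cache need extra room on top):

```python
# VRAM needed just to hold the model weights; real usage is higher.
def weights_gib(params_billions: float, bytes_per_param: int = 2) -> float:
    """fp16/bf16 weights take 2 bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / 2**30

print(f"MPT-7B:  ~{weights_gib(7):.0f} GiB")   # more than a typical 8-12 GiB gaming card
print(f"OPT-13B: ~{weights_gib(13):.0f} GiB")  # needs a 40 or 80 GiB A100
```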
It'll probably be a few weeks before I actually get around to fine-tuning an "unwilling-at-first-but-then-sees-big-dick-story-generator" lol, but in the meantime:
If you want to play around with the existing MPT-7B-Storywriter-65k+ or OPT-13B-Erebus, check out Google Colab. Colab lets you create a "Jupyter notebook" (basically a document you can write Python code in, which can print out images, tables, generated text, etc.). You can then buy compute credits to rent an Nvidia A100 GPU connected to your notebook in the cloud to run these open-source large language models. Finally, write a little bit of Python to create an inference pipeline in your notebook with your desired existing model.
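The inference-pipeline bit can look something like this. It's a rough sketch, not gospel: it assumes the Hugging Face `transformers`, `torch`, and `accelerate` packages (Colab preinstalls most of these), and the model IDs are the real Hub names. The prompt and sampling settings are just placeholder examples.

```python
def strip_prompt(prompt: str, generated: str) -> str:
    """The pipeline echoes the prompt back; return only the new text."""
    return generated[len(prompt):] if generated.startswith(prompt) else generated

def generate_story():
    # Heavy imports kept here so they only load when you actually run this.
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mosaicml/mpt-7b-storywriter",   # or "KoboldAI/OPT-13B-Erebus"
        torch_dtype=torch.bfloat16,            # 2 bytes/param instead of 4
        trust_remote_code=True,                # MPT ships custom model code
        device_map="auto",                     # place layers on the rented GPU
    )
    prompt = "The letter arrived at midnight, and"  # placeholder prompt
    out = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.8)
    print(strip_prompt(prompt, out[0]["generated_text"]))

# In a Colab cell, just call: generate_story()
```

First call downloads the weights (several GB), so expect a wait before anything generates.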
Make sure you disconnect from the GPU after running your code, or you'll burn through compute credits fast! (And don't activate the GPU until you're ready to run your code.)
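You can also do the disconnect from code instead of clicking around the UI. A small sketch, assuming Colab's `google.colab.runtime` module (only importable inside an actual Colab session, hence the guard):

```python
def disconnect_gpu() -> bool:
    """End the Colab session (freeing the GPU); returns False outside Colab."""
    try:
        from google.colab import runtime  # only exists inside a Colab runtime
    except ImportError:
        return False
    runtime.unassign()  # releases the runtime so credits stop draining
    return True
```

Drop `disconnect_gpu()` in the last cell of your notebook so the session ends itself once generation finishes.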
Happy smut generating