Neural nets have reached a level of comedy previously inaccessible to mankind. Given their inscrutable memes and inedible recipes, neural nets are nowhere near solving society’s problems, but they’ve mastered the art of making humans choke on their own spit laughing. SimLit is already AI-aided storytelling if you incorporate anything game based, so why not add another level of computer interpretation?

This post includes a step-by-step neural net training process starting from the very beginning and requiring the bare minimum of computer skills. The bulk of the work has been done for you already by textgenrnn, a user-friendly Python module that lets anyone train a neural network on a sample of text. So to clarify, I didn’t have to do any real coding for this, but I hope this documentation will inspire people who otherwise wouldn’t experiment with something this technical.

If you end up training a neural network on your own story and publishing a post about it, let me know and I’ll drop a link at the end.

(Click here to skip directly to the results.)

This guide is aimed toward people who aren’t already cozy with a command line. That being said, you do need to open a shell. If you’re on a Mac, open Terminal, which should be in Applications/Utilities/. If you’re on a Windows computer running Windows 10, follow this guide to open Bash. If you’re running Linux, cool, here’s a link to the textgenrnn GitHub repository; you can get all the info you need from there. If none of these things are true, search ‘[your operating system] install bash,’ follow the directions, and open Bash.

If you do this in an airport, people will think you’re hacking.

STEP 1: CREATE TXT FILE OF YOUR STORY

First you’ll have to make a TXT file containing only the written part of your story, no WordPress formatting junk or anything like that. I clicked ‘View’ on each post and copied the text from there. If you copy directly from the editor, you’ll start picking up WordPress block tags.

Paste all the text of your story into a TXT file. Save it as “story0.txt” if you’re uncomfortable with variables, or as anything you like if you’re not.

Now here’s where we get fancy. When you copy-paste from your WordPress site, it leaves blank lines between the paragraphs. The way I trained it, textgenrnn interprets these blank lines as paragraphs, so if you train it on the file as-is, it will keep spitting out blank outputs.

So let’s remove those blank lines. Pull up your shell and navigate to the directory where story0.txt lives using cd. If you’ve never used this command before, it stands for “change directory” and takes paths as input. For example, if you’re on a Mac and saved story0.txt in Documents, you would type

cd Documents/

(Other tips: cd .. takes you to the parent directory of the one you’re in, allowing you to go backwards; if one of the file names has a space in it, cd will get confused unless you put a backslash in front of the space, so if it’s in a folder titled Folder Name With Spaces, the command will look something like cd Folder\ Name\ With\ Spaces; and if you type enough letters to uniquely identify the file from other files in the same directory, you can press Tab and it’ll auto-complete the name for you. More info here.)

Then run

awk 'NF' story0.txt > story.txt

The blank lines should be gone now: ‘NF’ tells awk to print only lines containing at least one field, i.e., the non-blank ones, and > sends the result to a new file, story.txt. Feel free to open story.txt and marvel at the wall of text.

The next couple commands aren’t relevant to textgenrnn, but are fun things you can do with your story’s TXT file. Running

wc story.txt

will tell you three numbers: the file’s line, word, and byte counts. Since every paragraph now sits on its own line, the first number is your paragraph count, and for mostly plain-English text the byte count is close to the number of characters (text characters, not sims). Running

cat story.txt | tr '[:blank:]' '\n' | sed 's/[^a-zA-Z]//g' | sed '/^$/d' | sort -f | uniq -i | wc -l

will tell you how many unique words you’ve used in your story.

STEP 2: INSTALL PYTHON 3 AND TEXTGENRNN

Download and install Python 3 from here. Or use brew if you have that installed and want to be fancy. Once that’s done, go back to your command line and run

pip3 install textgenrnn

With any luck, you’ll have installed textgenrnn.

You can also copy the code directly from GitHub, but if you do that, Python is going to keep complaining about missing packages when you try to run textgenrnn. If you use pip3 instead, it automatically downloads all the packages for you.

STEP 3: TRAIN TEXTGENRNN

To open Python 3, run

python3

That ensures you’re running the right version of Python. Macs in particular come with Python preinstalled, but it’s the wrong version (2.7), so the plain ‘python’ command defaults to 2.7, and textgenrnn will yell at you if you try to run it there. (You can make Python 3 the default by following the directions here.)
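If you’re not sure which versions you have, you can check before launching the interpreter:

python3 --version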

When Python is running, you should see three greater-than signs (>>>). To make sure textgenrnn is working, run

from textgenrnn import textgenrnn

textgen = textgenrnn()
textgen.generate()

If the computer spits out a nonsense sentence, it works! Hooray!

Now you’re ready to train the net on your story. There are several ways to do this, and I’ll go over a couple.

METHOD 1: TRAINING THE BOT ON SEQUENCES OF CHARACTERS

With this method, the bot associates a set of numbers with each character. When it sees a given character, it uses the characters that came before it, plus that set of numbers, to determine which character is likely to come next. Every time you train it on a data set, it adjusts those numbers to make its predictions more accurate. Here’s the training command:

textgen.train_from_file('story.txt', num_epochs=1)

There’s some lingo here. The number of epochs is the number of times the algorithm sees your entire story. While it’s training, you might notice a number marked ‘loss’ next to the progress bar. Loss is the negative log of the probability that the net correctly guesses the next character in the data set, so lower is better; you want it as close to 0 as possible. It’s like golf if golfers started counting at 0.
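To put a number on that, here’s the negative-log formula spelled out in Python (this is just arithmetic, not anything textgenrnn prints):

import math

# loss = -ln(p), where p is the probability the net assigned to the correct next character,
# so a loss around 0.69 means the net gives the right character roughly coin-flip odds
p = math.exp(-0.69)
print(p)  # ~0.50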

When it’s done training (when you see the >>> marker again), you can ask it to generate a paragraph based on what it learned about your story.

textgen.generate()

If you want it to generate more than one paragraph, you can also give this function a number. I’ll mention another variable that affects your output: temperature. ‘Temperature’ is a positive number that tells the bot how creative to be. Think of the predictive text function on a smartphone. Setting the temperature to 0 is like picking the center option every time. Setting it slightly higher, like 0.2, is like mostly sticking to the left, right, or center suggestion, with an occasional veer toward a less likely option. At higher temperatures, it stops distinguishing between high and low probability as much and starts picking wackier options. At 3 it goes nuts. I didn’t set the temperature any higher than 1.

Here’s how to ask the bot to generate 6 paragraphs at once, and to optionally specify a temperature.

textgen.generate(6)
textgen.generate(6, temperature=0.5)
textgen.generate(6, temperature=1)

Neural networks rarely produce good results after seeing the data set only once. You’ll most likely have to train the net for multiple epochs. You can run the train_from_file command at the beginning of this section again, and you can increase num_epochs to do more rounds of training at once.

After the first epoch, I went in increments of two (i.e., num_epochs=2), and then increments of 10 when I got bored of diminishing returns. You might notice the loss going down at first. After a while it may level off, because there’s only so much the bot can do remembering 40-character excerpts over an entire story. You can tell the bot to remember more than 40 characters at a time, but then (a) you’ll have to train it from scratch and (b) it’s going to be very slow. If you want accuracy, the second method is much faster.
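If you do want the longer memory anyway, here’s a minimal sketch using textgenrnn’s max_length option (it defaults to 40, and as far as I can tell it only takes effect when you start a new model):

# start a fresh character-level net that considers 60 characters of context instead of 40
# (new_model=True is also what wipes your previous training)
textgen = textgenrnn()
textgen.train_from_file('story.txt', num_epochs=1, new_model=True, max_length=60)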

METHOD 2: TRAINING THE BOT ON SEQUENCES OF WORDS

Instead of asking the bot to store information about which characters appear together, you can ask it to store information about which words appear together. At the character level (method 1), you’ll see it occasionally make up nonsense words, but using this method, it can only use words that already appear in your story.

This method ignores capitalization and punctuation. In particular, it ignores apostrophes, so if you train it on the story file as-is, it’s going to return don instead of don’t and I instead of I’ll or I’m. You can work around this by creating a version of your story’s TXT file with all the apostrophes removed and filling the punctuation back in by hand later.

This command removes all apostrophes from story.txt and saves the apostrophe-less version to story2.txt.

cat story.txt | tr -d "'’" > story2.txt

Now you can train the new bot on story2.txt. I made a second bot for this, which I named ‘textgenny.’ Because textgenrnn builds a character-level network by default, you have to ask for a new word-level model the first time you train it, then drop the new_model option afterwards so you don’t wipe your progress. Like so.

textgenny = textgenrnn()
textgenny.train_from_file('story2.txt', num_epochs=1, new_model=True, word_level=True)
textgenny.train_from_file('story2.txt', num_epochs=1, word_level=True)

This is usually going to be faster and more accurate for stories. Asking the network to generate paragraphs works the same way it does in Method 1. You do have to add your own punctuation and capitalization, though.
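Generation looks exactly like it did before:

textgenny.generate(6, temperature=0.5)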

SUGGESTION: TRAIN THE NET ON TWO STORIES TO MAKE A CROSSOVER

There’s no rule saying you can only train a bot on one data set. If you have a TXT file containing your story and another containing your friend’s story, you can combine them into one file, or alternate training sets, to create a network that retains elements of both stories. This works with either of the two methods.

You may have better luck pushing both stories into the same TXT file. Here’s what the code to combine story1.txt and story2.txt into crossover.txt would look like.

cat story1.txt story2.txt > crossover.txt
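And if you’d rather alternate training sets than combine the files, the same training call from Step 3 works; just point it at each story in turn:

# one net, two stories, trained alternately (character-level shown; add word_level=True
# and an initial new_model=True for the word-level version)
crossover = textgenrnn()
crossover.train_from_file('story1.txt', num_epochs=1)
crossover.train_from_file('story2.txt', num_epochs=1)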

SUGGESTION: INTERACTIVE MODE, OOOH

Textgenrnn has an incredibly shiny interactive mode where the net offers the user a set of words to choose from. Details here.
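If you want to try it without leaving this page, the call looks like this per the textgenrnn README (top_n sets how many options you get to choose from):

# the net proposes top_n candidates for the next word/character and you pick one
textgen.generate(interactive=True, top_n=5)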

SAVING AND LOADING NEURAL NETS

If you want to save your neural net, running

textgen.save()

will save your progress to a default file, textgenrnn_weights_saved.hdf5. But if you want to save multiple networks, you can give it a filename, and it’ll work as long as it has the extension .hdf5.

textgen.save("story_net.hdf5")

To load a saved neural net, run

textgen.load()

if you’re using the default save file name. Alternatively, if you customized the file name, you can pass that instead.

textgen.load("story_net.hdf5")
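If you’ve quit Python in the meantime, you can also hand the weights file straight to the constructor when you come back. A sketch per the textgenrnn README; note that if you trained a brand-new model (like the word-level one), textgenrnn also saved vocab and config JSON files next to the weights and needs all three back (the file names below assume the default model name):

from textgenrnn import textgenrnn

# reload a net that uses the default architecture
textgen = textgenrnn('story_net.hdf5')

# reload a new_model=True net: the weights alone aren't enough
textgenny = textgenrnn(weights_path='textgenrnn_weights.hdf5',
                       vocab_path='textgenrnn_vocab.json',
                       config_path='textgenrnn_config.json')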

RESULTS

Catastrophe Theory has over 4,500 paragraphs, over 158k words (16k of them unique), and almost a million characters. It’s not the easiest chunk of data for the bot to interpret. First of all, because I’m lazy, the CT TXT file doesn’t distinguish between photo captions and the body of the text, let alone headers. Then there’s the stylistic shift from Book I to Book II. It’s nearly impossible for the bot to distinguish between weird meta-commentary, messing with the English language, and actual plot points, but there’s no need for any of that. I just want a bot to innocently roast my hobby writing.

Here’s how it did.

CHARACTER-LEVEL

EPOCH 1

Epoch 1 textgenrnn isn’t really sure what CT is about, but it’s pretty sure it involves a stage.

“I can stay the stage of the door is the way the simple was a specture and the and the the same time in the stage with the way the statement is a statume and he was a discover and the way it was the stage of the weight was the same door in the stage when it was a post of the stage of the guys is a

“I can see it in the batter when you were in the statement in the way it was a stage and the is the stage of the statement is the stage and the second that was a little same second and the was a parent and the statement is the second and she didn’t see her face of the stage of the statement is the

At low temperatures, it angsted so hard that it almost made sense.

“I don’t know what is you we want to be anyway.”

At higher temperatures, it knew some of the characters’ names and not much else. But it did coin a phrase I’ll use daily from now on, ‘coached fuckering.’

Bernard Swim Jersite in the mind of the day like it sound to the statement than his black status sound says which hadn’t a spectur in the first statement, and the water is in the park of Claudia was the while the ciff is the day and you got her sight in the is what is the she was on the distance of

“Jo Jo, what excepts to you a best is the way the teammate had a couple compute in the first of says in the change is that say he was and you we say I can’s want to hand a controlity in the in the star is the instrumal of the way the content she had and do and stuck and he was in the stage

Shu read her ones of experience and a stop show the point of the time to comment read on the posts of the text of some so her own steps commite has every secret and see her one. It’s a different comment at the back to the sound because the first potent, antic and sims of me actually like they’d hav

The same chess consider gif drawn and womed hilar at her fabor. “Dear & Nirli’s god growed some. He coached fuckering her and miles personal spoud, by of her doctor? Claudia’s hard wouldn’t should that scared That?”

It’s worth mentioning that Shu, with a short name like that, was by far the AI’s favorite character at this point.

EPOCH 3

By now it was getting obsessed with the bar. Also ‘first’ things. Not bad.

“The world was the bar on the reason of the first to the best that we said to the but the face is the face of the first contribution and the most point of the first connection of the first party that he was a bar of the first and she was a communic because the first was a face of the first

“I don’t want to see the first being stop with the bar of the bar on the only story with the mother were something at the bar of the bar with the best with the bar of the bar in the first common and the best weird with the bar of the story to be a bar on the better

“I want to do what you can do what I want to be a good the first being the bar of the bar on the beach of the book when you comment the bar and the conversation of the best thing and the best and the first party and the first story with the stand with the first day of the bar is a story of the bar

It’s also a decent poet. Game recognize game.

“Shu becoming not the mirror as you have to appreciate the partner of realized and the love should be a standoons at the same could have a stail and is the bathroom when she starts the double that her moment couldn’t still take a wall and her drink of her son is a train of pillows

“Which. The words for the drink.” Her home was so many combinations.

“The answer was still reading it on the desire with his own presences were thinking about it.”

EPOCH 5

The net retains its training if you break the epochs up into multiple commands, but seeing Con-Jasicé in the low-temperature paragraphs doesn’t inspire confidence. Poor bot. It only wanted to do the thing.

Con-Jasicé was the start of the table was the only she was still still to see the confirments and won the part of the part of the thing were being the part of the thing that was a bad to the freak of the string of the dad was the fored of the spot of the part of the thing I want to do with the thing

But it just didn’t know. It also has no idea who this chick is or why it should care about her Sims story.

“I don’t know what they were the most story of the first time that was still the stranger.”

“I don’t know what you want to get the sims of the stranger.”

Higher-temperature samples were charmingly bonkers.

Calling can start the patronal time of the Backage. “Coupestary boy to me?” He couldn’t have the way a same of the free mention. The mother did the facal as he his lives of the lives and see with the spend of front of the table moth

“If you want to the demon in a memorial right was on the thing

Cot Má!<… This were collect isle worth. Dare, but with introminess. Ah. She was Eating, best in his anglany, soluther then sook one pimpered he the image, Claudred Mike was a left. HerealR, but which jumps, entyment shouse match she mand looking at anyone at the physist thines; he was able to find

Or in the aspiration of the skill-limit is something an expected brown doctor who had been expressed

“Yeah, where we have a drink of her face and have the posters of dated on the moment she can be a magers with a good expirance to his eyes.

Then, every once in a while, your eyes glaze over and the output looks like something that could reasonably capture the story’s essence. Here, it decided to combine an internet culture reference with some unnecessary lampshading and existential dread, and the result comes alarmingly close to peak CT.

“Her ‘F’ressed and the same door with his moments shown for her characters of a story: Claudia was more alone from the nightmare of the stranger parts.

Press ‘F’ to pay respects to Claudia.

EPOCH 7

You can’t expect the bot to come up with a coherent plot. You can expect it to pick up on proper nouns, major concepts, and possibly some authorial quirks. But while humans can pick up on jarring, self-aware language, I wasn’t expecting the bot to invent its own jarring and self-aware usage. Here, it decides you can’t say the word ‘couple’ only once. You have to say it a couple times. Duh. Humans.

“What is the table of the same state of a couple couple couple body of the course to the computence of the stranger of the same arm with a couple couple couple betters

“It’s not a chance of this statement and she could have been the one of the course who was the one of the straight thing that was the one of the course with her house and a straight and the stranger computer and the courter couple couple couple couple parts and the one of the course.

I dunno who the bot’s calling out for avoiding public toilets but she probably deserved it.

“I pose to think that’s the one of the same new person who takes the concept she was both she was transitionally promised to mean and waiting at home to really a shit.

Could be one of the new characters it invented.

Your school. But now Shu, ready Jasper Josono, so my face lets something? Don’t catch.

“Julid’s mom” didn’t stay broke at compared in front of Kendra funny consuming something through the metast from realization hells, meeting her most judges Xiyuan recomitely was in them from scream.”

Not only did the bot feel confident enough to create its own sims, it decided that seeing the data set seven times made it an expert. So qualified, in fact, that it can now give me unsolicited advice on writing SimLit.

“You had to see them in the water of this creative characters that was that not a little man in the sims at the room and place the first way to pretend”

“Is that think that’s what that there is a different thing and her concept is like the aware of the concept sims could be”

“That’s how to do anything.”

Damn, bot, this isn’t a two-way street. You do what I tell you and I don’t reciprocate. Capiche?

EPOCH I-GIVE-UP

Here’s the point of diminishing returns where I started running 10 epochs at a time, sure that the bot wasn’t going to get any closer. I’d like to say it ended up between 30 and 40? But anyway. At its best, the character-level net had two speeds. It could spit out short, snappy one-liners.

And the glitter was all of the particular ants.

A problem while her parents had a monologue was a post-and-forctive thing in the world.

And it’s a Mike aspectly.

“Bernard read her bed. They’re food with handtop.”

On the flip side, if it tried to generate a long paragraph with higher temperatures, it immediately dropped any pretense of sense and went full Lewis Carroll with the whimsical neologisms. I can see the frumious bandersnatch wearing a worsupuit and gending the greenwalps for the Contravolopualost with a genyever.

“I noticed walk up.” Can is the bed best walkware. Take there no. Jo Shot. But night because the seconds she mushed a receps of a worsupuit pet perhinesef out of non-few came of poinds and mets hot about the word’s died behavion on her.

“They come to understand,” bug in borrow, and so reacing to the kid bigged a look of opinions. The asped not droppeating the heat that started slapped the halfwheres and remind it when she could reveal new shoted bizan, that’s my snapped initiation to invber who kept both nasured was retraced

Xiyuan reminded gending these process with a genyever

Shu’s juebung over aronical extraction for the walls cartars. They’re skullied. Seems to sim-broken perfect of his body right build of bemosmed arch source to retrient I want to ank against the womank

The Contravolopualost’s causes agos some drambers because something follow to use out out degalated, his husband called the lead; Kendra rages, how could you get up with the comepits at desk in his respect of his sreen them the couple greenwalps

Mose like some in husbandic mardics constant of confesuly yes. I put you’re a memoratoryn but everything he’s too look adults of inncon eams why? But 60 wade to the constant bees capable conman-werd Pake faby, the characters any of the hell accent that themself might have like Claudia projections

One of these deserves a special mention. Anyone need a band name? Because we got “Father Header and the 28 You-Espinosas.”

To pictos in a specifics om, it’s donigly going to the baththno, Aileen, Charlie tilted up with his on movine aparts (levely) Shank, Bernard at mourned. “Althous Father Header?” Rescured back and reserve all wasping to the 28 of You-Espinosa. It was on the others to Claudia yes.

And once, just once, it asked for plot spoilers.

Charlie?

It’s not clear whether these bits are a dig at CT or at the characters themselves for angsting, but I choose to believe the bot just got fed up and decided to whine about its job.

“I want to be what I can’t do it.”

“I don’t want to do?”

As the number of epochs went into the double digits, the bot decided its favorite character was Hector. Or, as the cool kids call him, Hectation.

“Hector was being the door, but I stick the day!

Hectation don’t know whose social fying

Hector started something as a reason of the stories and some spirit of the stuff her entire partner and still realized the same streeth was a thing

But, overall, it was confident in its progress.

“Now I’m smooth.”

And that’s around the point where I gave up.

WORD-LEVEL

Okay, remember when I thought the ‘couple’ thing was repetitive? I was wrong. The word-level bot is the Katie Ledecky of getting stuck in loops. After Epoch 1, the bot admitted it wasn’t sure what it just read and doubted its ability to interpret it, the thing it read, but eventually decided to share its thoughts. After claiming that CT was written by a potentially volatile humorist AND person, it waxed on about how humans and AI aren’t so different after all and reassured DINKs that their life choices were valid.

I don’t know what I say to. Do you know what I say? I don’t know what you did.

“I don’t know what I saying.”

I don’t know what I say. I don’t know. I don’t know what I was hoping to do. When you think it was a man who been there a good husband, and I don’t want to know what have a have to have a to to a a a potentially potentially potentially potentially potentially potentially volatile potentially volatile volatile volatile volatile volatile volatile volatile volatile volatile volatile volatile humorist and person. Volatile century, volatile person. Shrink, shrink and show—and in and shrink of one to a of shrink a reality shrink: shared shared shared reality reality shared shared shared of shared a shared shared a shared shared of a a potentially potentially potentially potentially potentially volatile. Volatile volatile where conversation—where a conversation, a, a conversation where a volatile volatile where and book a book volatile person would potentially potentially volatile and show show and and and and get show show tell show in. I a I I don’t don’t don’t don’t don’t don’t don’t have don’t don’t have don’t don’t know don’t don’t have have have have to to have have to have children.

Two epochs later, its low-temperature predictions were already picking up on the main themes and hinting at “Guide Me, North Star.”

I’m not sure why do you want to do with my own death.

I’m not sure why not he has no choice but in the birthday boy.

I’m not sure why do you want to do with my hands.

Once the bot hit its stride, it either started generating verbatim passages from the text or spewing out nonsense paragraphs with the same word repeated 18 times. Then there were some occasional moments where the bot let its real feelings slip, and man is it emo.

“I mean, I don’t know how to feel real.

“Tell me exactly what would have been.”

“I don’t know how to feel honestly other than confused, sad. Begin.”

“I just think I am the only one who could be here.”

“I don’t know what’s wrong.”

And unfortunately, it’s not conscious or corporeal enough to appear on The Tonight Show, so that dream has to remain crushed. No Fallon for you.

“Oh, of course I am I ready to go to talk show after.”

Even though I’ve since abandoned the first-person narration, there was something reassuring about hearing the bot talk about itself. It was like having someone new to chat with, and I’ll take what I can get during quarantine.

I at making the random!

No, I’m not going to start with a tough question to stop it.

I don’t know what it means to ask, “What the hell?” And at the least we’re just people.

“Well, you have a point—where the word was for the good art to put a douche finger in what would. Do you want to be friends?”

I mean, not after you insulted my finger like that.

It nailed some of the characters. Bernard is perceptive, Shu is sitting like an idiot, Charlie has to make decisions, and even the external AI thinks Kendra’s being dangerously strange. Strangerous.

Of course Bernard realized she was weird water in the kitchen instead of her stand in her horror movie.

Only invited, Shu sat face down. Charlie has some options.

Kendra, I think you believe bad shit.

And I think it knows about my plans to incorporate digital art, but how? Did you look at my Creative Cloud files, bot? I think it expects me to figure it out for myself, so we’re not going to get any answers here.

And so you made a picture. You already know what I thinking.

A friend once told me, “that’s the best thing I ever heard today.” I wasn’t expecting to hear anything that delightful ever again, but the bot got pretty close to those vibes.

Oh, that’s the first thing I always wanted to say.

CONCLUSION

I’ll call it: the DIY guide is going to outshine the CT bots pretty much immediately. Because CT is weird by nature, combining its disparate elements without context muddied the whole thing. A neural net may have an easier time with a story where the main focus is the plot and the purpose of language is to enhance the plot. So I’m hoping this picks up.

Privately, I’m collaborating with 1esk19 to make Somewhere Among Catastrophe Theory. The SAtS bot is really, really into anything drinkable, hot, and caffeinated, so I’m looking forward to the results.

To wrap up this post, I set the temperature to 0 and asked each bot to generate the paragraph it felt best represented Catastrophe Theory. Here’s what the character-level bot said:

I don’t know what I was the one of the same thing we can care to the same thing that was the one of the same thing it was a side of the same thing that was the one of the same thing it was a side of the same thing that was the one of the same thing it was a side of the same thing that was the one

So CT is about sides of the same thing. And with the word-level bot:

I don’t know what it hard for there to go an animal what your mother was for the past the book of life and the kind of strong spirit the won be just that but it was the there how a one being sense to that the the the one the one one her the the one one the building of building one person of her person person that of person first person building who had the in was final who final was to to to final dad dad to dad to dad have to to to the a the the the the start work start the one work work start they work a work figure start on work around a start start work work work a on a work book book book happy place work work and a of of … place little book little feeling feeling and that someone feeling how one work to a while person of of person to the person to person to to have person to that person her person person that know person person feel feel feel feel the the the i have have someone spoke spoke spoke the spoke to to to to to have her caught have a person of feel of person have person taking feel feel the a to the a building a a song song to song song song in in her they were they they were about and there they they were they that a that there that there there there me should me should a my me someone me me i have and one at was were to that my that person that please my dad my dad dad dad my dad dad dad dad dad work was work at have was one to

There you have it. It’s a story about people, feelings, locations, work, and dads. But both nets agree on the opening:

I don’t know.

And nothing sums up CT better than that.

COMMENTS

  • September 27, 2020 at 10:18 am

    YES.

    My word-level SAtS bot thinks SAtS is about coffee, tea, and—to my surprise—okra. (To the uninitiated: Somewhere Among the Stars is a space opera. It’s a Mass Effect fanfiction. But, I do seem to talk about caffeinated beverages and food a lot.)

    My character-level SAtS bot thinks SAtS is about shoulders.

    Really looking forward to the monstrosity that will be SAtS-CT bot.

    • September 27, 2020 at 1:33 pm

      Mmmmmmmmm. Hot, drinkable, caffeinated okra.

      Here are some of the things SACT-bot (CaSAtStrophe-Theory-bot?) spat out:

      “Charlie is born in half a moment. Neala finished. Xiyuan may have asked a stack of self-serving if his dialogue’s coming from screaming.”

      “This is one of these randomly generated woman he keeps at his father’s.”

      “Liara glanced down at the preserved moth, the sunlight reflecting brightly off the smooth surface of her mind.”

      “No.

      ¿Qué? ¿Qué? Claudia projected her ex onto high fear as she lives at her own coffee, unable to prevent this particular good news article. And kids dance dead into Strangerville. She will never forget.”

      “Liara looked like her tea kettle.”

      “Sitting but no longer sitting under, she no upon her. Her option, her careful reality no at days, her her still spoke flow her as describing eyes. Anything meager, Petra Shepard. Pizza. She, as Shepard, hates her.”

      “No one can accuse Xiyuan of being adorable.” (FUCK YOU BOT, YOU ARE WRONG WRONG WRONG WRONG WRONG)

      And I know this isn’t a space opera, but… Zydrate comes in a little glass vial.

  • September 27, 2020 at 10:58 am

    Glad to know that even neural nets don’t know what to do with their hands!

    This was super fun.

