Thursday, December 12, 2019

Goblin Mung, A Neural Net Exposé

I recently listened to an excellent episode of the 99% Invisible podcast that discussed modern conversational AI, its 20th-century origins, and its evolution into contemporary neural networks. One of the top-performing "most human" neural nets today is GPT-2, which famously produced a very convincing news-story-style write-up about the discovery of unicorns in the Andes (scroll to the bottom). I decided to spin up a test of this model using the open-source version available at Talk to Transformer, focusing on my favorite fantasy RPG McGuffin, goblin mung. Here's the prompt: Goblin mung is one of the most hotly sought-after hallucinogens available today.
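For readers who want to try the same experiment locally rather than through the Talk to Transformer site, here is a minimal sketch of prompting the publicly released GPT-2 model with the Hugging Face `transformers` library. This is an assumption on my part about tooling, not what Talk to Transformer runs behind the scenes:

```python
# Prompt the small public GPT-2 checkpoint and print its continuation.
# Assumes the `transformers` library (and a backend like PyTorch) is installed;
# this is NOT the Talk to Transformer backend, just an equivalent local setup.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = ("Goblin mung is one of the most hotly sought-after "
          "hallucinogens available today.")

# Sampling makes each run different, much like refreshing the web demo.
result = generator(prompt, max_new_tokens=80, do_sample=True)
print(result[0]["generated_text"])
```

Each run yields a different continuation, since sampling is enabled; setting `do_sample=False` would instead give a single deterministic (greedy) completion.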