Maybe next season we’ll get some actual new ones, and a new armor set to boot. Those are the two reprised weapons, and the old god rolls of the other weapons are still alive and kicking, if you want to go for those as well. Overall, I’m not too excited about this one.

This thing can also roll Incandescent, but I fail to see how that is useful on a rocket launcher that will probably just be blowing everything up within Incandescent range anyway.

- Tracking Module or Impulse Amplifier/Vorpal – For your basic boss-damage needs, I suppose.
- Tracking Module/Lasting Impression – Legendary rocket launchers aren’t exactly the most meta right now outside of maybe a crafted Palmyra-B, but you could do worse than this combo after the high-impact damage buff a while back.
- Field Prep/Auto-Loading – This is a kind of interesting combo, considering you usually see these two in the same slot. More reserves in total plus autoloading isn’t bad, though you are negating the crouch reload speed, of course, and you don’t have a damage perk.

“‘It Was The Best Of Times, It Was The Blurst Of Times’‽”

I continue my AI poetry generation experiments with OpenAI’s 2020 GPT-3, which is 116× larger, and much more powerful, than the 2019 GPT-2. GPT-3, however, is not merely a quantitative tweak yielding “GPT-2 but better”: it is qualitatively different, exhibiting eerie runtime learning capabilities that allow even the raw model, with zero finetuning, to “meta-learn” many textual tasks purely by example or instruction. (Along the way, I document instances of how the BPE text encoding unnecessarily damages GPT-3’s performance on a variety of tasks, how best to elicit the highest-quality responses, common errors people make in using GPT-3, and test GPT-3’s improvements in NN weak points like logic or commonsense knowledge.)
One does not train or program GPT-3 in the normal way; instead, one engages in dialogue and writes prompts to teach GPT-3 what one wants.

Experimenting through the OpenAI Beta API in June 2020, I find that GPT-3 does not just match my finetuned GPT-2-1.5b-poetry for poem-writing quality, but exceeds it, while being versatile in handling poetry, Tom Swifty puns, science fiction, dialogue like Turing’s Turing-test dialogue, literary style parodies… As the pièce de résistance, I recreate Stanislaw Lem’s Cyberiad’s “Trurl’s Electronic Bard” poetry using GPT-3. GPT-3’s samples are not just close to human level: they are creative, witty, deep, meta, and often beautiful. They demonstrate an ability to handle abstractions, like style parodies, that I have not seen in GPT-2 at all. Chatting with GPT-3 feels uncannily like chatting with a human. I was impressed by the results reported in the GPT-3 paper, and after spending a week trying it out, I remain impressed.

This page records GPT-3 samples I generated in my explorations, along with thoughts on how to use GPT-3 and on its remaining weaknesses. I hope you enjoy them even a tenth as much as I enjoyed testing GPT-3 and watching the completions scroll across my screen.

#IMPORTANT SIDENOTES AND BUGS DESTINY IRON BANNER HOW TO#

The latest and greatest neural network for unrestricted natural language generation is OpenAI’s GPT-3. GPT-3 is like GPT-1 and the GPT-2 I’ve used extensively before, only much more so, and then going beyond them in a fascinating new way. Scaling works: quantity is a quality all its own. The scaling of GPT-2-1.5b by 116× to GPT-3-175b has worked surprisingly well, unlocking remarkable flexibility in the form of meta-learning: GPT-3 can infer new patterns or tasks and follow instructions purely from text fed into it. What can we do with GPT-3? Here, we’re all about having fun while probing GPT-3’s abilities for creative writing tasks, primarily (but far from limited to) poetry.
Naturally, I’d like to write poetry with it: but GPT-3 is too big to finetune like I did GPT-2, and OA doesn’t (yet) support any kind of training through their API. Must we content ourselves with mediocre generic poetry, at best, deprived of finetuning directly on chosen poetry corpuses or authors we might like to parody? How much does GPT-3 improve, and what can it do? Fortunately, OpenAI granted me access to their Beta API service, which provides a hosted GPT-3 model, letting me spend a great deal of time interacting with GPT-3 and writing things. Turns out: a lot! Below, I walk through first impressions of using GPT-3, and countless samples. In the latest twist on Moravec’s paradox, GPT-3 still struggles with commonsense reasoning & factual knowledge of the sort a human finds effortless after childhood, but handles well things like satire & fiction writing & poetry, which we humans find so difficult & impressive even as adults.
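Since all the “programming” lives in the prompt, using the hosted model boils down to composing a text prompt plus a handful of sampling parameters and sending them as a completion request. A minimal sketch of what such a 2020-era Beta API request body might look like; the endpoint path, engine name, and parameter values here are my illustrative assumptions, not details taken from this page:

```python
import json

# Assumed 2020-era Beta API completions endpoint for the "davinci" engine.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_completion_request(prompt: str, max_tokens: int = 64,
                             temperature: float = 0.9, top_p: float = 0.98) -> str:
    """Return the JSON body for a completion call. There is no finetuning
    step: the prompt itself teaches the model by example or instruction."""
    return json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,  # higher temperature = more creative sampling
        "top_p": top_p,              # nucleus sampling cutoff
    })

# A few-shot "prompt program": the in-context examples are the training.
body = build_completion_request(
    "Roses are red, violets are blue,\n"
    "Honey is sweet, and so are you.\n\n"
    "Write another short rhyming couplet:\n"
)
```

The body would then be POSTed to `API_URL` with an `Authorization: Bearer <key>` header; the point is that every knob available to the user is either the prompt text or a sampling parameter.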