It’s been just over a week since Microsoft made such a big deal about an enhancement to its Bing search engine that adds OpenAI’s GPT chat capability to it. In the process, Microsoft declared that it expected this new capability to help drive significant market share growth in search.
But it’s become clear in the last week that this kind of chat capability is much less about enhancing search (particularly if you want accurate answers) and much more about generating creative text useful for other purposes. This behavior has become known as hallucination – where a chatbot simply strings text together. As Cade Metz writes, “hallucinate” is just a catchy term for “they make stuff up.”
I’ve had some great fun playing around with this to get help writing, for example, short stories. I’ll provide a few sentences to seed the chatbot’s “thinking” and then ask it to complete a 1,000-word short story based on that input. It’s been fun to see what results.
Even more fun has been asking the chatbot to adjust the style to mimic famous authors such as Hemingway, Steinbeck, Twain, and George Saunders. It’s fascinating to see the stylistic changes it makes to the same basic story depending on which writer I specify.
None of this work has anything to do with search – nor does it help in any way by enhancing search results. It’s something completely different, strikingly interesting, and a heck of a lot of fun to play with. Whether it’ll be really useful in helping me do any real writing remains an open question – but meanwhile I’m enjoying sparring with a seemingly smart creative “mind” on the other end of my computer screen!