Sunday, May 14, 2023

Google, how do I ask your AI the right questions?

[Illustration of a woman typing on a keyboard, her face replaced with lines of code.]
Live footage of me thinking of what to ask AI bots. | Image: The Verge

A few weeks ago, my spouse and I made a bet. I said there was no way ChatGPT could believably mimic my writing style for a smartwatch review. I’d already asked the bot to do that months ago, and the results were laughable. My spouse bet that they could ask ChatGPT the exact same thing but get a much better result. My problem, they said, was I didn’t know the right queries to ask to get the answer I wanted.

To my chagrin, they were right. ChatGPT wrote much better reviews as me when my spouse did the asking.

That memory flashed through my mind while I was liveblogging Google I/O. This year’s keynote was essentially a two-hour thesis on AI, how it’ll impact Search, and all the ways it could boldly and responsibly make our lives better. A lot of it was neat. But I felt a shiver run down my spine when Google openly acknowledged that it’s hard to ask AI the right questions.

During its demo of Duet AI, a series of tools that will live inside Gmail, Docs, and more, Google showed off a feature called Sidekick that can proactively offer you prompts that change based on the Workspace document you’re working on. In other words, it’s prompting you on how to prompt it by telling you what it can do.

That showed up again later in the keynote when Google demoed its new AI search results, called Search Generative Experience (SGE). SGE takes any question you type into the search bar and generates a mini report, or a “snapshot,” at the top of the page. At the bottom of that snapshot are follow-up questions.

As a person whose job is to ask questions, I found both demos unsettling. The queries and prompts Google used on stage looked nothing like the questions I type into my search bar. My search queries often read like a toddler talking. (They’re also usually followed by “Reddit” so I get answers from a non-SEO content mill.) Things like “Bald Dennis BlackBerry movie actor name.” When I’m searching for something I wrote about Peloton’s 2022 earnings, I pop in “Site:theverge.com Peloton McCarthy ship metaphors.” Rarely do I search for things like “What should I do in Paris for a weekend?” I don’t even think to ask Google stuff like that.

I’ll admit that when staring at any kind of generative AI, I don’t know what I’m supposed to do. I can watch a zillion demos, and still, the blank window taunts me. It’s like I’m back in second grade and my grumpy teacher just called on me for a question I don’t know the answer to. When I do ask something, the results I get are laughably bad — things that would take me more time to make presentable than if I just did it myself.

On the other hand, my spouse has taken to AI like a fish to water. After our bet, I watched them play around with ChatGPT for a solid hour. What struck me most was how different our prompts and queries were. Mine were short, open-ended, and broad. My spouse left the AI very little room for interpretation. “You have to hand-hold it,” they said. “You have to feed it exactly everything you need.” Their commands and queries are hyper-specific, long, and often include reference links or data sets. But even they have to rephrase prompts and queries over and over again to get exactly what they’re looking for.

[Screenshot of an AI snapshot about Bryce Canyon. Image: Google]
The SGE snapshots also prompt you on what to ask it next.

This is just ChatGPT. What Google’s pitching goes a step further. Duet AI is meant to pull contextual data from your emails and documents and intuit what you need (which is hilarious since I don’t even know what I need half the time). SGE is designed to answer your questions — even those that don’t have a “right” answer — and then anticipate what you might ask next. For this more intuitive AI to work, programmers have to make it so the AI knows what questions to ask users so that users, in turn, can ask it the right questions. This means that programmers have to know what questions users want answered before they’ve even asked them. It gives me a headache thinking about it.

Not to get too philosophical, but you could say all of life is about figuring out the right questions to ask. For me, the most uncomfortable thing about the AI era is I don’t think any of us know what we really want from AI. Google says it’s whatever it showed on stage at I/O. OpenAI thinks it’s chatbots. Microsoft thinks it’s a really horny chatbot. But whenever I talk to the average person about AI these days, the question everybody wants answered is simple: how will AI change my life?

The problem is nobody, not even the bots, has a good answer for that yet. And I don’t think we’ll get any satisfactory answer until everyone takes the time to rewire their brains to speak with AI more fluently.

