Essay · 3 min read

Search Habits in the 'Just Ask GPT' Era

In an age where we ask AI instead of Stack Overflow, how are our search habits changing?

What I Said

A colleague was stuck on a build error.

A year ago, I would've said "copy the error message and Google it." But what came out of my mouth was: "Just ask GPT."

And then it hit me. I'd been reaching for the chat window before the search bar for a while now. I can't even remember the last time I opened Stack Overflow.

Search Has Changed

The search routine a decade ago was clear: error happens, Google it, land on Stack Overflow, read the answers, copy-paste, solved. The whole thing took 5 to 30 minutes.

Now? Paste the error message into AI and you get an answer in 10 seconds. It explains the context. You can follow up. "But why does this happen?" "Any other approaches?" It's convenient. Overwhelmingly convenient.

But sometimes it makes me uneasy.

You Get Answers Fast, But...

The process of Googling had its own kind of learning built in.

Analyzing the error message, extracting keywords, scanning ten results, separating relevant from irrelevant, evaluating the reliability of each answer. That entire process was training for problem-solving skills.

With AI, all of that gets skipped. You get the answer fast, but you don't walk the path to the answer.

Back when we drove without GPS, we memorized routes. Now I can't navigate streets near my own house without it. Something similar is happening with search. (I wanted to dismiss this, but I can't really argue against it.)

AI Is Confidently Wrong

What's more dangerous is that AI's answers always come with full conviction.

It delivers wrong answers just as confidently as right ones. Recommends libraries that don't exist. References APIs that were never real. The code looks plausible but throws errors when you run it.

Stack Overflow had a voting system. Community-vetted answers rose to the top. Wrong answers got "this is incorrect" comments. Imperfect, but there was a collective intelligence filter.

AI has no such filter. The filter is entirely on you now.

I Worry About Juniors

Honestly, what concerns me most is the next generation of developers. People who start with AI from day one, never going through the Google-and-Stack-Overflow crucible. Without the experience of struggling, you never build the ability to struggle. And there will inevitably come a moment when AI can't give you the answer. Will they be able to figure it out on their own?

Then again, maybe every generation worries about the next one like this. When calculators came along, people said "math skills will atrophy." Whether that worry was right or wrong, I still don't think we have a definitive answer.

My Own Rules

I'm not saying we should stop using AI. That ship has sailed.

But I've set a few rules for myself. I don't copy-paste AI answers blindly. I think at least once about why the answer might be correct. When learning something new, I read the official docs first. AI-generated code only gets used after I understand it.

It's a pain. But that inconvenience is what keeps a minimum gap between my skills and AI's skills. If that gap disappears, I'm just a copy-paste machine for AI output.

"Just ask GPT" has become a completely natural thing to say. But before you ask, you should at least know what you don't know. That part seems kind of important.

Anyway, that's where I'm at.
