The only real tragedy here is that Google really did have best-of-industry semantic search integrated into their code searching tools, something that nobody has been able to replicate.
GitHub is great, but it's absolute ass for search. To the point where for any nontrivial question I have to pull down the repo and use command-line tooling on it.
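For what it's worth, the "pull down the repo" workflow is just recursive grep (or ripgrep) over a local clone. A minimal sketch, with a throwaway directory standing in for the cloned repo (the repo name and function name here are made up for illustration):

```shell
# Stand-in for a freshly cloned repo; in practice this would be `git clone ...`
mkdir -p demo-repo/src
printf 'def parse_config(path):\n    return open(path).read()\n' > demo-repo/src/config.py

# Recursive, line-numbered search across the whole tree --
# the kind of "where is this symbol used" query being discussed
grep -rn "parse_config" demo-repo
```

Swapping `grep -rn` for `rg` (ripgrep) gives the same results faster on large repos, with `.gitignore` respected by default.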
New GitHub full text search [1] is amazing. It is so good that for me it often replaces StackOverflow - I just use it to see how some API function is being used. Especially useful if you're searching for an example with a specific argument value.
have you used google's internal code search though? the link you posted is amazing in its performance, for sure. but once you are in some repo and doing what most of us call "code search", github drops off in utility vs google's internal tooling pretty quickly.
i'm only remarking on this because of the context in the parent you are replying to, whom i agree with. local tooling is better than what github provides. as a standalone comment i would simply upvote you.
Chances are that a random stranger on the Internet has not used Google's internal code search. Even if that person has, it would be useful to provide the context for others to understand.
Local code search is great in JetBrains products. I use PyCharm, and even on large codebases search is nearly instantaneous, with enough filters and options to nail down what you need. While JetBrains often drops the ball on the responsiveness of their products, search has stayed fast for as long (and as recently) as I can remember.
I've used both. Google's code search is usually better if you know what you're looking for. It's less so if you need to do something that involves cross-language references (e.g. code file that references a translation string).
Do you mean the non-semantic indexing, which covered most of Google Code? Grep-style support, but no real semantic data?
Or are you talking about the few repos that had semantic indexing via Kythe (chromium, android, etc.)? We never got that working for generic open repos, primarily because it requires so much integration with the build system. Three or four of us on Kythe, one after another, experimented with ways to hook it into arbitrary open repos cheaply enough, but we all failed.
I'm talking about Kythe, and learning that it ran into issues generalizing it for non-Google-controlled APIs explains a lot of the history I thought I knew!
Yea, we never had it even for all Google-controlled repos - just the ones that would work with us to get compilation units out of their build system.
I was the last one to try (and fail) to get arbitrary repos to extract and index in Kythe. We never found a good way to make Kythe's particular brand of extraction insanity work with random repos, each with its own separately insane build config.
It almost makes me wonder if the right approach (had Google been willing to invest in it) would have been to wed Kythe and Bazel to solve that "insane build configs" problem.
"Okay, you want generic search? Great. Here's the specific build toolchain that works with it. We'll get around to other build toolchains... Eventually maybe."
Would have been a great synergy opportunity to widen adoption of Bazel.