Google politics and history aside, for software it's much better to link research with products. Unlike physics and biology, software is basically whatever we say it is, so there isn't a natural ordering to research (and it can wander forever, all too much like literary criticism).
What both Google research and Google product missed, and what ChatGPT provided almost accidentally, is that people need a way to get answers to ill-formed questions, and to iteratively refine those questions. (The results are hit-or-miss, but far better than traditional search.)
What OpenAI, Bing, and now Google all realize is that the race is not to a bigger model but to capturing the feedback loop of users querying your model, so you can learn to better understand their queries. If Microsoft gets all that traffic, Google never even gets the opportunity to catch up.
If Google were really smart, they would take another step: break the mold of harvesting free users and instead pay representative users to interact with their products, in order to catch up. Just the process of operationalizing the notion of "representative" would vastly improve both product and research, and it would build goodwill in communities everywhere - goodwill they'll need to remain the default.
Progressive queries are just the leading edge of entire worlds of behavior that are still ill-suited to computers but could be accommodated via AI. And if your engineers frame the problem as "fuzzy" search or "prompt engineering" or realism, you need to get people with more empathy, a minimal understanding of phenomenology, and enough experience with multiple cultures and discourses to be able to relate and translate.