Google is to stop giving answers to deliberately misleading questions, as it improves the AI behind its Featured Snippets tool.

For all its trustworthiness and near-ubiquity, Google can still be out-foxed by simple semantic tricks – providing answers to trick questions that a human would see straight through. One example provided by Google itself was that, when asked ‘When did Snoopy assassinate Abraham Lincoln’, it previously gave the answer ‘1865’. That may indeed be the year of Lincoln’s assassination, but Snoopy may quite understandably feel defamed.

Such scenarios will soon become a thing of the past, though, as Google sets about improving how it detects false premises, so answers like these are not returned in future.

Google’s main issue lies in these answers being included within Featured Snippets – the boxes that provide an answer to your search query directly within the results, without requiring you to click through to a different site.

Though the information in these snippets is pulled from other pages, Google is perceived (rightly or wrongly) to have some ownership of the content, or at the very least to be lending it validity and authority. This is why the search giant has spent many years (and surely no small amount of money) trying to ensure Featured Snippets are as accurate as possible.

Google already uses sophisticated quality-ranking systems to identify correct answers to user queries – though, as the Snoopy example attests, they are far from perfect. The new changes will rely on consensus: Google will first identify reputable, high-quality websites as source material, then only present Featured Snippet information when it finds agreement across them.
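
To picture what a consensus check of this kind might look like in the abstract, here is a minimal, purely illustrative sketch. The function name, the quality threshold and the agreement ratio are all assumptions made for the example – Google has not published how its own system works.

```python
from collections import Counter


def consensus_answer(candidate_answers, min_sources=3, min_agreement=1.0):
    """Return an answer only if enough high-quality sources agree on it.

    candidate_answers: list of (source_quality_score, answer_text) tuples,
    e.g. answers extracted from pages that rank well for the query.
    This is an illustrative sketch, not Google's actual implementation.
    """
    # Keep only answers from sources judged high quality (the 0.8 cut-off is arbitrary here).
    trusted = [answer.strip().lower()
               for quality, answer in candidate_answers
               if quality >= 0.8]
    if len(trusted) < min_sources:
        return None  # too few reputable sources: show no snippet at all

    answer, votes = Counter(trusted).most_common(1)[0]
    # Require (near-)unanimous agreement across trusted sources before surfacing a snippet.
    if votes / len(trusted) >= min_agreement:
        return answer
    return None  # sources disagree: fall back to ordinary search results


# Example: three reputable sources agree, so a snippet could be shown.
sources = [(0.9, "1865"), (0.85, "1865"), (0.95, "1865"), (0.3, "1870")]
print(consensus_answer(sources))  # -> "1865"
```

The point of the sketch is the gating behaviour: when the trusted sources disagree, or there simply are not enough of them, no snippet is returned rather than a confidently wrong one.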

If there is no consensus, or a user is searching for information on a breaking news story that cannot yet be accurately verified, Google will also get better at flagging what it calls “information gaps”. In such scenarios, it will still present users with the most relevant results for their search, but preface them with a warning that quality, factual information isn’t yet available.

Explaining why the company was so keen to make these changes, Google Fellow and Vice President of Search, Pandu Nayak, said: “We’re determined to keep doing our part to help people everywhere find what they’re looking for and give them the context they need to make informed decisions about what they see online.”