The GIGO principle is a useful idea from the world of mathematics and computing. It stands for Garbage In, Garbage Out, and it means that a machine (or a mathematical process) will only give you useful results if you give it the correct data. You cannot, in other words, get good results from bad data. A calculator or an Excel spreadsheet will perform the calculations perfectly, but if you’ve mistyped the figures, the results will be meaningless. As my dear old nan used to say, you cannot make a silk purse out of a sow’s ear.
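The calculator example above can be sketched in a few lines of Python. This is a minimal illustration only, and the figures are invented: the `average` function works perfectly, but one mistyped number still produces a meaningless result.

```python
def average(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

correct_figures = [12.5, 11.0, 13.5]    # the real data
mistyped_figures = [12.5, 110.0, 13.5]  # one slipped decimal point

# The same, flawless calculation runs on both lists...
print(average(correct_figures))
# ...but the mistyped data gives a wildly inflated mean: Garbage In, Garbage Out.
print(average(mistyped_figures))
```

The code never goes wrong; only the data does. That is the whole point of GIGO.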
The GIGO principle is useful in other areas of our lives, too. We aren’t machines, but we also need good data to make good decisions. If you’re looking to move house, you need to find out as much information as possible about the areas you’re thinking about moving to. Are the buses reliable? Does the neighbour play the trombone at 11 o’clock at night? Does the local shop charge £3 for a pot noodle? We might not be able to answer all these questions, but we should at least consider them. Incomplete data can be as bad as garbage data. And, as we’ve seen, Garbage In, Garbage Out.
We can think about our assessments in a similar way. We all know that referencing is important for maintaining our academic integrity, but we should also consider what type of sources we rely on. Integrity isn’t just about giving credit to your sources; you also have to make sure that the sources you’ve relied on deserve your credit and your trust. Your work is only as good as the work that has informed it. Garbage In, Garbage Out.
Here’s an example. In January 2026, an investigation by The Guardian found that Google’s AI summary feature frequently gave inaccurate and potentially harmful information in response to medical questions. One summary flagged by the investigation advised that people with pancreatic cancer should avoid high-fat foods. Experts described this advice as “really dangerous”, as it is “the exact opposite of what should be recommended.” The summaries also gave incorrect answers to questions about liver function tests and women’s cancer, which the investigation said could “lead to people dismissing genuine symptoms” (Gregory, 2026a).
The AI overviews were based on various sources from the internet, including some that contained inaccurate information. This is the GIGO principle in action: bad information poisons good information. There may well be good medical advice in the AI summaries too, but the bad advice, the bad data, makes it impossible to trust the results. It doesn’t take much Garbage In to ruin a whole batch of results. In fact, Google seems to agree; in March 2026, The Guardian reported that Google had stopped providing AI summaries for medical questions (Gregory, 2026b).
So, what do we do? Well, for all your assessments I recommend you start with your module’s reading list. Teaching staff and librarians have put together the reading lists to give you a bank of interesting, high-quality and trusted resources for each stage of your study. Search for your module in the reading lists area of the Learning Success Hub to bring up the relevant texts.
We understand that you probably have limited reading time, and dozens of other responsibilities to fit in alongside your studies. We want you to make the most of the time you have, and that means reading sources that are worth your attention. A good essay starts with good reading, and good reading starts in the library. If you want to get started with the Arden Library Portal, there are some top tips on our Getting Started page.
References
Gregory, A. (2026a) Google AI Overviews put people at risk of harm with misleading health advice. [online] London: The Guardian. Available from: https://www.theguardian.com/technology/2026/jan/02/google-ai-overviews-risk-harm-misleading-health-information [Accessed 27th March 2026]
Gregory, A. (2026b) Google scraps AI search feature that crowdsourced amateur medical advice. [online] London: The Guardian. Available from: https://www.theguardian.com/technology/2026/mar/16/google-scraps-ai-search-feature-that-crowdsourced-amateur-medical-advice [Accessed 27th March 2026]