Plant blindness in smartphone identification applications – are we passing on our biases to our helpful apps?

It is well known that people are better at identifying animals than plants, and this relative inability to identify plants is increasingly termed “plant blindness”. Recent research has identified links between plant blindness, the undervaluing of nature, and mental health. Ask anyone to identify common animals and most will easily name badgers, foxes, blackbirds and otters. But ask the same people (assuming you have not asked a botanist in the first place) to identify rowan, lords & ladies, horse chestnut or wood anemone and most will be stumped. There are many useful guides to help the more nature-minded of us identify common (and less common) species, but it is the advent of smartphones (a computer in almost every pocket) that perhaps holds the most promise for helping people overcome plant blindness.

There is now an abundance of applications aimed at identifying plants and animals. For plants, these applications are often berated on Twitter and other social media sites for not being particularly helpful. One app that aims to help users identify plants, animals, objects and locations, that seemed to offer a user-friendly option, and that has the benefit of being installed on almost every phone, is Google Lens. Users simply take a photograph and ask the app to tell them what is in it. An initial trial on easy species (e.g. oak) was promising, but we quickly hit a problem more worrying than the vague answers often associated with plant-specific identification applications: if an animal was in the photograph, the app did not notice the plant (Figures 1 – 3). At first we thought this was the result of a bad photograph, but a test at a few different locations on different plants quickly revealed that the app appears to favour animals over plants as the “subject” of the photograph. This is perhaps not surprising, but it is concerning.

Figure 1: Examples of Google Lens identification of plants. A) and B) show photographs in which the plant is clearly the subject, and the app makes a reasonable attempt to identify it.
Figure 2: A) the original photograph and B) the Google Lens effort to identify what is in the photograph. The software clearly identifies a dragonfly or damselfly but ignores the plant, which makes up the majority of the photograph.
Figure 3: A) the original photograph; B) the Google Lens effort to identify what is in the photograph, which selects the out-of-focus bee rather than the in-focus plant; C) scrolling across the Google Lens suggestions, it finally suggests the plant.

Research has already reported that existing biases are being programmed into AI, so it is perhaps not surprising that we may also be programming plant blindness into generic identification apps. However, given that plant blindness has repercussions for everything from conservation to mental health, this is something we should be trying to avoid. Technology is becoming an increasingly important part of teaching and training future ecologists; we need to train plant blindness out of our students, and this will be a lot easier if it is not embedded in our technology.

We have only investigated this briefly, and, to be fair, Google Lens is not billed as a plant identification app, but it does highlight its ability to identify plants and animals on its website. Clearly more work needs to be done to determine how much of a problem this is, both for this app and for similar ones, and how to address it. For now, we tell our students that these apps may favour animals, so they need to take care with how they frame photographs if they want help from an ID app. But this seems indicative of a wider problem.

We are forgetting about plants in so many aspects of life and now we may be subconsciously causing our technology to do the same. Understanding and identifying plants should not be a niche interest – we need to encourage greater engagement with plants and apps like Google Lens could be a fantastic way to do this, but only if they are not plant blind themselves.

References

Adams, R. (2019) Artificial intelligence has a gender bias problem – just ask Siri. The Conversation. https://theconversation.com/artificial-intelligence-has-a-gender-bias-problem-just-ask-siri-123937

Adams, R. & Ní Loideáin, N. (2019) Addressing indirect discrimination and gender stereotypes in AI virtual personal assistants: the role of international human rights law. Cambridge International Law Journal. pp. 241 – 257. https://dx.doi.org/10.2139/ssrn.3392243

Astell-Burt, T. & Feng, X. (2019) Association of urban green space with mental health and general health among adults in Australia. JAMA Network Open. 2: e198209. https://doi.org/10.1001/jamanetworkopen.2019.8209

Balding, M. & Williams, KJH. (2016) Plant blindness and the implications for plant conservation. Conservation Biology. 30: 1192 – 1199. https://doi.org/10.1111/cobi.12738

Jose, SB., Wu, C-H., & Kamoun, S. (2019) Overcoming plant blindness in science, education, and society. Plants People Planet. 1: 169 – 172. https://doi.org/10.1002/ppp3.51

Pearson, DG. & Craig, T. (2014) The great outdoors? Exploring the mental health benefits of natural environments. Frontiers in Psychology. 5: 1 – 4. https://doi.org/10.3389/fpsyg.2014.01178

Thompson, CW., Roe, J., Aspinall, P., Mitchell, R., Clow, A & Miller, D. (2012). More green space is linked to less stress in deprived communities: evidence from salivary cortisol patterns. Landscape & Urban Planning. 105: 221 – 229. https://doi.org/10.1016/j.landurbplan.2011.12.015

Wandersee, JH. & Schussler, EE. (1998) Preventing plant blindness. The American Biology Teacher. 61: 82 – 86. https://doi.org/10.2307/4450624

About the Authors

Karen Bacon

Karen Bacon is a Lecturer in Plant Ecology at the National University of Ireland, Galway with interests spanning palaeobotany and modern ecology. She is particularly interested in plant–environment interactions, including plant responses to climate change, invasive species and mass extinction events. She is also interested in how students and the general public engage with plants and in identifying ways of reducing plant blindness through education.

Julie Peacock

Julie Peacock is an Associate Professor of Ecology in the School of Geography, University of Leeds. Her disciplinary interests have focused on plant life histories and how plants have responded to global climate change, with fieldwork in both the UK and the tropics. She is also interested in the value of stately home gardens as ecological resources and living libraries of plants. Her research also focuses on how students learn outside the classroom, both through fieldwork and work-based learning. Plant blindness is an issue among students, and she is keen to explore good ways to teach non-specialists plant identification skills.
