

Supporting Scotland's vibrant voluntary sector

Scottish Council for Voluntary Organisations

The Scottish Council for Voluntary Organisations is the membership organisation for Scotland's charities, voluntary organisations and social enterprises. Charity registered in Scotland SC003558. Registered office Mansfield Traquair Centre, 15 Mansfield Place, Edinburgh EH3 6BB.

Guide to generative AI for Scotland's voluntary sector

We’re producing this guide in 2023, as more and more people start to use and discuss generative AI. It’s clear that many people would like to know more about how these tools work, and would appreciate ideas and guidance to help them tackle the practical and ethical issues that come up. 

Artificial Intelligence (AI) covers a wide range of areas. It can be summarised as 'getting computers and machines to perform tasks we normally associate with human intelligence'. In this guide, we look at one specific area: generative AI.

This guide is for you if: 

  • you work or volunteer in the charity sector 
  • you’re not a specialist in AI  
  • you want to find out more about what generative AI can do 
  • you’re worried about AI and how it might be used 
  • you’re exploring possibilities for using generative AI in your work. 

In this guide, we’ll outline scenarios where generative AI could be or has been used, then summarise some of the ethical issues that come up, along with possible responses.

If there are any terms you don't understand, try looking for them in our AI glossary.

What is generative AI? 

When we talk about generative AI, we mean digital tools that people can use to generate new content, often with minimal human input. AI in general covers a much broader range of tools, for example algorithms to spot patterns and make predictions by reviewing large amounts of data. In this guide, we’ll be focussing on generative AI as it has developed rapidly in recent months and is accessible to a wide range of users.  

ChatGPT and other generative AI tools such as Google Bard are based on Large Language Models (LLMs). LLMs identify probabilities of associations between words and use those probabilities to predict how best to respond to text inputs. Rather than providing the 'right' answer, they produce the most likely output for a given input. They work by combining a vast dataset of text (a very large amount of text copied from the internet) with a mathematical algorithm called a transformer function. The algorithm and dataset are run through a complex computer system called a neural network, which over time develops into a 'large language model'. Other tools, such as Stable Diffusion, apply similar techniques to turn text prompts into images and illustrations.
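To make the idea of 'predicting the most likely output' concrete, here is a deliberately simplified sketch in Python. It counts which word tends to follow which in a tiny made-up sample of text, then predicts the most common next word. Real LLMs use transformer neural networks trained on vastly more data, but the underlying principle of choosing a probable continuation is similar. The sample text and function names here are purely illustrative.

```python
from collections import Counter, defaultdict

# Tiny made-up training text, split into individual words
training_text = (
    "the charity helps people . the charity supports volunteers . "
    "volunteers support the charity ."
).split()

# Count how often each word follows each other word (bigram counts)
follows = defaultdict(Counter)
for current_word, next_word in zip(training_text, training_text[1:]):
    follows[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen following `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "charity" follows "the" most often in the sample
```

In this toy example, the model doesn't 'know' anything about charities; it simply reproduces the statistical patterns in its training text, which is why LLM outputs are plausible rather than guaranteed to be correct.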

A large language model can then make very plausible guesses about how to respond to text inputs. So when a user types a prompt into a generative AI tool, the system is able to respond with an output that typically matches the user’s expectations. Generative AI tools have hit the headlines in 2023 because they are now capable of generating very plausible continuous text from fairly minimal prompts. Many people, including computer specialists, did not anticipate this level of performance this soon; lots of people assumed that computing power would need to reach a much more advanced stage before generative AI could become a convincing tool. Generative AI tools have also rapidly developed into simple, user-friendly products that are as easy to access as search engines.

Although these tools are easy to use, and are becoming widely adopted, they throw up a number of questions around ethics and risk, including issues around reliability, bias, privacy and intellectual property. We cover these issues in this guide.
