Using GenAI for effective learning

Generative AI—including large language models (LLMs) such as ChatGPT, image generators such as DALL-E and Midjourney, and other tools used for specific tasks like transcription, paraphrasing, and grammar checking—is becoming increasingly common. If you wish to use Generative AI in your studies, it is important to learn how to do so in a way that aligns with UOW policy and enhances your academic skills.

Generative AI (‘GenAI’) is a term for a range of computer tools that use large datasets of text or images to generate material in response to prompts from a user. While called ‘artificial intelligence’, Generative AI tools are not actually intelligent. Large Language Models such as ChatGPT simply predict the most likely next words in a sequence based on which words appear together often within their training data. Generative AI can give the impression it is ‘thinking’ in response to prompts or responding with human-level logic and reason. While this makes the technology very exciting, it’s worth remembering that at no point in an interaction with Generative AI is the tool actually thinking or verifying information.

There is no universal policy on the use of Generative AI at UOW. Guidance on whether GenAI is permitted, and for what purpose, is provided in your Subject Outline and may vary between subjects. Make sure you familiarise yourself with the GenAI policy for each of your subjects before beginning work on assessments. If you do use GenAI, Microsoft Copilot is UOW’s recommended tool.

If you are permitted to use GenAI in your subject, the decision whether or not to use these tools is up to you. Before choosing to use GenAI for assessment, it is important to consider some of the risks and costs associated with the technology. The costs—relating to the social and environmental impacts of GenAI—cannot be mitigated, and it is up to you to decide whether using the technology aligns with your values. The risks—which relate more to you as a user—can be managed with thoughtful and appropriate use.

Some of the costs of widespread GenAI use highlighted by experts include:

  • The environmental cost of the large data centres needed to power Generative AI, which require unsustainable levels of water and energy consumption
  • The training of tools—including ChatGPT and most image generators—on plagiarised material scraped from the internet
  • Unethical labour practices used in the training and content moderation of GenAI tools
  • Misinformation and disinformation, including ‘deepfake’ content
  • The contribution of Generative AI investment and development to surveillance and military technology

For more information on these topics, see References and further reading.

Some of the risks associated with GenAI include:

  • Data harvesting: when user-inputted data is retained and reused by AI tools. Uploading private, confidential, or identifying information into AI tools may put this information at risk of being leaked or used for undisclosed purposes. Choosing AI tools with clear restrictions on how your data is used, such as Microsoft Copilot, can help manage this risk. UOW recommends Copilot due to its Enterprise Data Protection policy, which means that your prompts and responses are encrypted and not retained by Microsoft to train its products. For more information on using Copilot at UOW, see the Library’s guide.
  • Academic misconduct: inappropriate use of GenAI in assessment, or use where it is not permitted, may constitute academic misconduct. This risk can be managed by familiarising yourself with relevant policies, including your Subject Outline and the guidance from UOW Academic Integrity on GenAI in assessment, making sure you know how to apply them, and asking for help (e.g. from your Subject Coordinator) if you are unsure.
  • Hallucinations: when GenAI presents misleading or false information as though it were real. The risk of hallucination can be managed by always carefully fact-checking outputs from GenAI tools, including suggested academic readings. The Library’s guide to Copilot provides some helpful advice on verifying sources provided by AI.
  • Dependence: relying too much on GenAI tools for some tasks, especially those related to reading, writing, and finding or processing information, can mean you are less capable of doing these tasks yourself over time. If you’re new to university or haven’t studied in a long time, reliance on GenAI can prevent you from developing the academic skills you need to succeed in your degree. This resource provides guidance on how you can use GenAI effectively to support your learning.

Generative AI is most effective when its use is limited and clearly defined. Since it is not actually intelligent, it requires effective prompts to generate the outputs you are looking for. It is best used for objective tasks that require no judgement.

Uses for Generative AI may include:

  • Generating practice problems for Maths study
  • Evaluating code and identifying bugs in Computer Science and related fields (if permitted)
  • Proofreading or identifying grammatical errors in writing
  • Improving accessibility, such as automatic captioning or transcription tools

Generative AI is often used by students for other purposes, such as summarising notes or readings, paraphrasing material, or generating written assignments. It may seem that Generative AI makes these tasks easier, saves time and effort, or does a better job than you can. However, because these tasks require critical thinking and judgement, Generative AI is not a suitable tool: it may misjudge what is important in a summary and what is not; it may drop crucial information or terms when paraphrasing; and it is not capable of the critical thinking, analysis, or reflection required in academic writing. Generating written material for assessment—unless such use is clearly permitted in assignment instructions—is also likely to breach academic integrity.

You will see when browsing the suggested prompts in Microsoft Copilot that the tool has a wide range of possible uses. However, it is important to keep in mind that just because Microsoft says AI can do something does not mean it can do it effectively to the standard expected at university. A good example of this is summarising.

Summarising research papers or articles is a suggested use of Copilot for university students. But summarising an article is not an objective task—it requires critical thinking and judgement by a human to determine which aspects of the article are most important for your needs. Remember, too, that journal articles are already summarised by the author(s) in the abstract. The authors have a better understanding of their own research, key findings, and contribution to the field than AI tools do. This also applies to the automatic AI summaries provided by some journals, as the authors have no control over these outputs.

If you are worried about the time it takes to read articles when researching for an assignment, read the abstracts first to determine whether an article will be useful for you, and how it might help answer your question or address your problem. Reading academic texts is a skill that can only be developed through regular practice, but doing so will deepen your understanding of your area of study, help you retain the information long-term for your career, and improve your academic writing. See further strategies for effective reading.

Generative AI tools can be helpful, but the ways they can be effectively used to support learning are more limited than the companies promoting them might suggest. A useful concept to think about is ‘cognitive offloading’, which refers to any use of technology to replace thought processes. Research has shown that GenAI use is a form of cognitive offloading which can reduce critical thinking skills over time (Gerlich, 2025). Staying in control of the process and applying your own judgement when evaluating GenAI outputs can help mitigate these effects.

How can I use AI to…

Improve my academic writing?

Academic writing may seem daunting if you’re new to university, returning to study after a long time, or simply feel you’ve never gotten the hang of it. Remember that nobody is “naturally” good at academic writing—it is a skill that needs to be learned and practised like any other.

Writing is an important part of the learning and thinking process, so it’s important that you—and not AI—are in control of both the process and the outcome. The best way to do this is not to accept Generative AI outputs wholesale, but to use them as a guide alongside your own work.

For example, you may draft an essay or report yourself, and then upload the document to Microsoft Copilot and ask it to review your work or suggest ways to make it more academic in tone. You can incorporate common feedback comments into your prompts—for example, if you often get feedback from your marker saying that you need to be more concise, you can ask Copilot to suggest ways to do this.

Don’t ask the AI to make the edits for you—go through the suggestions and make any changes to your own working document if you agree with them. This is because AI can erase your individual voice, weaken arguments, or remove important evidence such as quotes, statistics and references.

Study for tests or exams?

Microsoft Copilot can help you prepare for tests or exams by suggesting practice questions. The more specific you are with your prompts, the more relevant the questions will be. For example, compare the prompts below:

  • Give me some practice questions about child development; or
  • I am a first-year Primary Education student preparing for an exam in a subject about child development. Give me some practice short essay questions that address XYZ learning outcomes.

 

  • Help me practice Maths for nursing; or
  • Create a bank of questions for converting weight units of measurement for nursing using kilograms, grams, milligrams and micrograms.

 

  • I need practice problems for engineering; or
  • Provide a bank of questions asking to find the determinant of a 4 x 4 matrix.

Keep in mind that any material from your subject—such as learning outcomes—is not your data to share and belongs to the Subject Coordinator. If you use this material in prompts, you must be signed into Microsoft Copilot so that your prompt data is protected and cannot be used by anyone else.

For questions with straightforward answers such as equations, you can also check the answers with AI—but remember that it does not have the ability to identify errors in your process or explain where you went wrong. Since AI can make mistakes, it’s also important that you understand the content well enough to recognise if the tool has gotten the answer wrong.
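For instance, rather than asking the AI to mark your working, you can verify an answer independently with a few lines of your own code. Below is a minimal sketch in Python, using the 4 x 4 determinant practice mentioned above (the matrix and the `det` helper here are purely illustrative, not part of any subject material):

```python
# Independent check of a hand-computed determinant, so you are not
# relying on an AI tool to mark your practice answers.

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j, a in enumerate(m[0]):
        # Minor: delete the first row and column j.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * a * det(minor)
    return total

# An example 4 x 4 practice matrix (illustrative only).
A = [
    [2, 0, 1, 3],
    [1, 4, 0, 2],
    [0, 1, 3, 1],
    [2, 2, 1, 0],
]
print(det(A))  # compare against your hand-worked answer
```

Writing a small check like this keeps you in control: the AI generates the practice questions, but you verify the answers yourself.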

If you find you’re consistently getting things wrong but aren’t sure why, reach out to your Subject Coordinator or tutor for help, or book in with a Learning or Maths Skills Advisor.

Proofread my work and check for errors?

Proofreading your written work (unlike revising or editing) is an objective task, which means AI tools can be used effectively. Be clear about limitations in your prompts—you can instruct the tool to identify spelling mistakes or grammatical errors only, without making any suggestions about style or tone. Other software can also help with proofreading, such as the spellcheckers built into Microsoft Word and other word processing programs.

Referencing: AI can often get referencing wrong, so always double-check any citations generated by AI or referencing software against the UOW guide for the style you are required to use. Generative AI is well known for making up references that look legitimate when prompted to recommend sources, so never rely on it for research purposes.

Coding: Generative AI is widely used for identifying errors in code. Before you use it to check your code, make sure such use is permitted by your Subject Coordinator. Any code submitted as part of an assessment should be your own work—so don’t use AI to generate the code for you unless you have been told to do so.
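As a purely hypothetical illustration (the function names and the bug here are invented for this example, not drawn from any subject), the kind of small logic error that AI tools are often asked to spot might look like this:

```python
# Hypothetical illustration: an off-by-one error of the kind a reviewer
# (human or AI) should flag during a code check.

def sum_first_n_buggy(values, n):
    """Intended to sum the first n items, but range(1, n) skips index 0."""
    total = 0
    for i in range(1, n):  # bug: should be range(n)
        total += values[i]
    return total

def sum_first_n(values, n):
    """Corrected version: range(n) covers indices 0 to n - 1."""
    total = 0
    for i in range(n):
        total += values[i]
    return total

data = [10, 20, 30, 40]
print(sum_first_n_buggy(data, 3))  # 50 — misses the first item
print(sum_first_n(data, 3))        # 60 — correct
```

Notice that the AI can point out the faulty line, but it is still up to you to understand why `range(1, n)` is wrong and to make the fix in your own code.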

Support my access needs?

Generative AI can be helpful for planning or breaking down the steps of an assignment or task, especially for neurodivergent students or those who otherwise struggle with executive function. You can ask Microsoft Copilot to do this, or use other online tools designed for this purpose. Remember that AI tools outside Microsoft Copilot do not have Enterprise Data Protection, so keep prompts general and do not provide any confidential, identifying, or protected information.

For other uses, UOW’s Student Accessibility and Inclusion team can advise on assistive technology best suited for you, along with any accommodations or adjustments you may need for your studies. AI tools designed for general public use may not be as reliable or functional as dedicated assistive technology.

 

Gerlich, M. (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies, 15(1).

Further reading for ethics and AI