
Teaching with AI

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Full Disclosure: Material on this website was copy-edited by, or is based on suggestions from, a GPT.

Have you integrated AI into your coursework yet?

No, and that's OKAY!

A thoughtful, wait-and-see approach to incorporating AI into your educational material is rational and reasonable.

Deciding to incorporate AI into your education strategy, student assessment, and grading is no easy task.

Further, cost and accessibility issues around AI persist across academia. Underserved institutions and colleges with small budgets or little investment in IT may not be able to meet the security requirements for safe handling of students' coursework or personal data (protected by FERPA).

An important fact to consider, though, is that most of your students are already using AI for their assignments and studying (ref: Source).

Read about OpenAI Educator Considerations

Yes, but go ahead and read on

That's great!

Read through the rest of this section to make sure that you're using GPTs in ways that keep your students' data and personal information safe.

Also, ensure that you're using approved AI software that has been vetted by university security and IT staff.

GPTs can compose essays, pass advanced tests, and pose a threat to academic integrity (Eke 2023). As a result, online education, a recent and lucrative innovation in academia, now faces extreme challenges regarding effective remote student assessment (Susnjak & McIntosh 2024).

Attempts to modify coursework to avoid assessment techniques where GPTs excel, or to use bots to detect GPT-generated content, are increasingly proving futile (Sloan MIT 2024).

"Instead of engaging in a cheating arms race, why not embrace AI?"

Proponents of integrating AI into educational curricula argue that by adapting and integrating GPTs into the curriculum, we also develop a modern workforce empowered by AI assistants.

Cain (2023) explores ways in which prompt engineering can be brought into the classroom and "transition [students] from passive recipients to active co-creators of their learning experiences."

Pros vs. Cons of AI in Economics Classrooms

At the 2024 EconEd conference Professor Justin Wolfers examined how AI is revolutionizing economics education by enhancing student learning and easing educators' workloads. In response, Professor Jon Meer discussed how educators are navigating AI integration in the classroom. Meer’s session provides a practical and valuable roadmap for effective implementation.

https://www.macmillanlearning.com/college/us/events/econed

Justin's Pros presentation: https://www.youtube.com/watch?v=sTeOLgMN4UM

Jon's Cons presentation: https://www.youtube.com/watch?v=NXbEvLd1vVk

Security and FERPA Considerations for GPTs in Higher Education

The integration of GPTs into classrooms introduces challenges, particularly in terms of data security and compliance with the Family Educational Rights and Privacy Act (FERPA).

This section outlines key considerations for educators and administrators.

FERPA Protections

FERPA protects the privacy of student education records. It gives parents certain rights regarding their children's education records; these rights transfer to the student when the student turns 18 or attends a school beyond the high school level.

  • Education Records: Files, documents, or other materials that contain information directly related to a student and are maintained by an educational agency or institution.
  • Directory Information: Information contained in an education record that would not generally be considered harmful or an invasion of privacy if disclosed.
  • Rights Under FERPA: Parents and eligible students have the right to inspect and review the student's education records, request the amendment of records they believe are inaccurate or misleading, and have some control over disclosing personally identifiable information from education records.

Generative AI and Compliance with FERPA

Commercial GPTs, such as those used for creating educational content, chatbots, or data analysis tools, can potentially handle personal or sensitive information.

Faculty members must ensure that using these technologies complies with FERPA regulations before using them in the classroom.

FERPA mandates the protection of student education records.

Before using GPTs in educational settings, remember:

  • Do not use student education records with commercial or external AI tools unless the data falls under directory information, and even then make certain you are compliant with university policy.
  • When student data must be used, apply data minimization and anonymize the information to avoid releasing personally identifiable information (PII); a brief sketch of this idea follows this list.
  • Be extremely cautious when inputting student data into GPTs, as this can lead to unintended data leaks or exposure of PII.
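
To make the data-minimization point concrete, here is a minimal Python sketch of dropping unneeded columns and pseudonymizing identifiers before any text is shared with an approved AI platform. The column names, the salted-hash scheme, and the example record are illustrative assumptions, not a university-approved procedure; always follow your institution's policy and consult IT before handling real student data.

```python
# Minimal sketch of data minimization before sending text to an AI tool.
# Column names ("name", "student_id", "email", "comment") are illustrative only.
import hashlib

import pandas as pd


def pseudonymize(value: str, salt: str) -> str:
    """Replace an identifier with a salted, non-reversible token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:10]


def minimize(df: pd.DataFrame, salt: str) -> pd.DataFrame:
    """Keep only the columns needed for the task and strip direct identifiers."""
    out = df[["student_id", "comment"]].copy()  # drop name, email, and other PII columns
    out["student_id"] = out["student_id"].map(lambda v: pseudonymize(str(v), salt))
    return out


# Hypothetical roster used only to illustrate the transformation.
roster = pd.DataFrame(
    {
        "name": ["A. Student"],
        "student_id": ["S123456"],
        "email": ["astudent@example.edu"],
        "comment": ["Struggled with problem set 3."],
    }
)

# Only the minimized frame would ever be shared with an approved AI platform.
print(minimize(roster, salt="keep-this-secret"))
```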

Identifying and Securing Student Data

To ensure FERPA compliance:

  • Consult with your university's information technology and information security units before using any AI software. Use only secure, vetted platforms that are approved by your university.
  • Do not use third-party software (plugins or extensions) to analyze student data or include it in prompts.

Security Risks and Mitigation Strategies

Data Leakage and Exposure

  • Avoid copying sensitive emails, video/audio transcripts, or student information into GPT platforms for summarization or analysis.
  • Educate all staff and teaching assistants on the risks of sharing personal or confidential information with AI systems.

Academic Integrity

  • Develop clear policies on the appropriate use of AI tools for all assignments and exams. The existing Code of Academic Integrity already explains how to deal with cases of plagiarism.
  • Implement detection mechanisms to identify AI-generated content in student submissions.

Technical Security Measures

  • Implement zero-trust security solutions, such as secure web gateways, to control access to GPT tools.
  • Use URL and content filtering to prevent unauthorized data uploads and limit access to AI platforms; a minimal sketch of the allowlist idea appears after this list.
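
The sketch below illustrates the allowlist idea behind URL filtering. In practice this control lives in a secure web gateway or proxy managed by campus IT; the Python code and the example domains are illustrative assumptions, not an endorsement of specific vendors or a statement of university policy.

```python
# Minimal sketch of an allowlist check for AI platform URLs.
# Real deployments enforce this in a secure web gateway, not in application code.
from urllib.parse import urlparse

# Hypothetical set of platforms vetted and approved by campus IT.
APPROVED_AI_HOSTS = {"chat.openai.com", "gemini.google.com"}


def is_allowed(url: str) -> bool:
    """Return True only if the URL points at an approved AI platform."""
    host = urlparse(url).hostname or ""
    return host in APPROVED_AI_HOSTS


for url in ["https://chat.openai.com/", "https://unvetted-ai.example.com/upload"]:
    print(url, "->", "allow" if is_allowed(url) else "block")
```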

Ethical Considerations

  • Address potential equity issues arising from unequal access to AI tools among students.
  • Consider the impact of AI on critical thinking skills and social interactions in the learning environment.

Guiding Graduate Students and Postdoctoral Researchers in AI Usage

Training the next generation of researchers to use AI effectively and ethically is a crucial aspect of graduate mentorship. As an advisor, it is important to ensure that students have appropriate access to these platforms and a comprehensive understanding of the ethical implications for their education, research, and software engineering.

Platforms like ChatGPT could become a primary mentor for graduate students and postdoctoral researchers. Unlike human advisors, these AI systems are available 24/7 to address virtually any question or problem. However, it is essential to strike a balance between AI assistance and independent learning.

To achieve this balance, advisors should:

  • Encourage AI literacy: Provide students with resources and opportunities to learn about AI technologies, their applications, and their limitations.
  • Teach responsible AI usage: Emphasize the importance of using AI as a tool to support research, not replace critical thinking and problem-solving skills.
  • Discuss ethical considerations: Foster open discussions about the ethical implications of AI in research, including issues of bias, fairness, transparency, and accountability.
  • Promote collaboration: Encourage students to collaborate with AI, leveraging its strengths to overcome their weaknesses and vice versa.
  • Stay updated: As AI technologies continue to evolve, ensure that both advisors and students stay informed about the latest developments, best practices, and potential pitfalls.

By incorporating AI into graduate and postdoctoral training while maintaining a focus on ethics and responsibility, the next generation of researchers can harness the power of AI to advance their fields while upholding the highest standards of academic integrity.

Teaching with Chatbots

Chatbots such as ChatGPT and Gemini can improve teaching and learning by generating and assessing information, and they can be used as standalone tools or integrated into other systems. They can perform simple or technical tasks, and the examples below show how they can augment teaching and learning.

Gemini LearnLM is available in Google's AI Studio and has advanced features for teaching or tutoring.

Potential role-playing examples for chatbots in teaching and tutoring (each entry gives the role, what the AI does, and an example of implementation):

  • Possibility engine: AI can suggest alternative ways to express an idea. Example: Students can write queries in ChatGPT/Gemini and use the "Regenerate" response function to explore alternative responses.
  • Socratic opponent: AI can act as an opponent to develop an argument. Example: Students can enter prompts into ChatGPT/Gemini using the structure of a conversation or debate. Teachers can ask their students to use ChatGPT/Gemini to prepare for discussions.
  • Collaboration coach: AI helps groups to research and solve problems together. Example: When completing tasks and assignments, students can use ChatGPT/Gemini to find information while working in groups.
  • Guide on the side: AI acts as a guide to navigating physical and conceptual spaces. Example: Teachers use ChatGPT/Gemini to generate content for their classes or courses, such as discussion questions, and to seek advice on how to support students in learning specific concepts.
  • Personal tutor: AI tutors each student and gives immediate feedback on progress. Example: ChatGPT/Gemini provides personalized feedback to students based on information provided by students or teachers (e.g., test scores).
  • Co-designer: AI assists throughout the design process. Example: Teachers can seek ideas from ChatGPT/Gemini for designing or updating a curriculum, including rubrics for assessment, or focus on specific goals such as making the curriculum more accessible; ChatGPT/Gemini can provide recommendations and suggestions to help achieve these objectives.
  • Exploratorium: AI provides tools to play with, explore, and interpret data. Example: Teachers provide basic information to students, who write different queries in ChatGPT/Gemini to find out more. ChatGPT/Gemini can also be used to support language learning.
  • Study buddy: AI helps the student reflect on learning material. Example: Students explain their current level of understanding to ChatGPT/Gemini and ask for ways to help them study the material. ChatGPT/Gemini could also be used to help students prepare for other tasks (e.g., job interviews).
  • Motivator: AI offers games and challenges to extend learning. Example: Teachers or students ask ChatGPT/Gemini for ideas about how to extend students' learning after providing a summary of the current level of knowledge (e.g., quizzes, exercises).
  • Dynamic assessment: AI provides educators with a profile of each student's current knowledge. Example: Students engage in a tutorial-style dialogue with ChatGPT/Gemini and then request a summary of their current knowledge for sharing with their teacher or for assessment purposes.
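
As a concrete illustration of one role above (the Socratic opponent), the sketch below uses the google-generativeai Python client with a system instruction that sets the role. This is a minimal sketch under stated assumptions: the model name, API key placeholder, and prompt wording are illustrative, and you should substitute whichever campus-approved platform and current Gemini/LearnLM model your institution supports. No student data is sent.

```python
# Sketch of the "Socratic opponent" role from the list above, using the
# google-generativeai client. The model name is an assumption; check AI Studio
# for the current Gemini/LearnLM identifiers available to your account.
import google.generativeai as genai

genai.configure(api_key="YOUR_AI_STUDIO_KEY")  # placeholder; never hard-code real keys

model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",  # assumed; substitute a LearnLM model if available
    system_instruction=(
        "You are a Socratic opponent in an undergraduate economics course. "
        "Challenge the student's argument with probing questions, one at a time, "
        "and never simply give away the answer."
    ),
)

chat = model.start_chat()
reply = chat.send_message("I think raising the minimum wage always reduces employment.")
print(reply.text)
```

The same pattern fits any of the other roles; only the system instruction changes.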

More Resources on AI at University of Arizona

University of Arizona Artificial Intelligence

University of Arizona Library Student Guide to AI

University of Arizona Data Lab AI Workshop Series

Google for Education

Google offers self-paced courses on generative AI.

Register with your @arizona.edu Google account and enroll in this 2-hour workshop: Generative AI For Educators

References