Guidance on Artificial Intelligence (AI) and Its Application at CGU
Artificial Intelligence (AI) is becoming more pervasive in our daily lives, most notably through generative AI tools such as OpenAI's ChatGPT. Many other commonly used products incorporate readily accessible AI components, from features on the smartphone in your pocket to the search tools in websites like Google, YouTube, and Amazon. Closer to everyday academic life, writing-assistance tools such as Grammarly have incorporated AI features for years. Ignoring the growing reach of AI is not only impractical; at this point it is likely a mistake. While the AI landscape is ever-changing, the goal of this page is to provide guidance and considerations on how to approach AI's use and its applicability to your work.
- Access the AI @ CGU SharePoint site for more information and additional resources about AI and CGU.
Safety/Privacy
Think about the information you are considering submitting in terms of safety, privacy, and ownership of the data. Unless you are using a university-sponsored and supported tool, personally identifiable information, intellectual property, and proprietary university information should not be submitted to a generative AI tool. Be certain that your activities in AI environments are consistent with the university's commitment to maintaining the privacy of student and other information.
Integrity/Transparency
If, when, and how you use AI in your class (or your research or work) should be clear to all participants and audiences, as should your expectations for your students, colleagues, and staff. Think about ways to make your use of AI explainable and inspectable. Keep a “human in the loop” and provide a human alternative when requested or appropriate.
Equity/Inclusion
Be mindful that AI learns from the data on which it is trained, so inclusion, non-discrimination and diversity are critical. If the data are biased by non-inclusion or disproportionate representation, the system will reflect those biases. Aiming for diversity and inclusion means bringing different perspectives and demographics to AI use and development.
ChatGPT EDU licenses will be distributed to all faculty and staff who express interest. We encourage you to engage with ChatGPT EDU, explore its capabilities, and become an early adopter of this technology.
Until a more permanent process is developed at CGU, please email the Help Desk at helpdesk@cgu.edu with your request, and someone from the AI working group will be in contact.
Or, try one of the free versions. (Please do not use free or personal versions of AI tools with non-public CGU data and documents.)
If you have not seen or used one of these tools, why not try them?
Click here to go to OpenAI’s site and sign up for ChatGPT.
Click here to go to Microsoft’s site and sign up for Copilot.
Click here to go to Google’s site and sign up for Google Gemini.
Or, if you are not feeling quite ready for a generative AI model yet, you can always try a tool like Grammarly here and see how it may help you with your writing.
IT and Data Policies: A Note of Caution
Before using AI tools yourself, or encouraging or asking others, such as students, to use them in their work, please investigate how these tools collect and use personal information and data, including reading any privacy policy linked with a particular AI tool and noting log-in data, tracking, and other metrics (Center for New Designs in Learning and Scholarship, 2023). Be mindful of your own data privacy when you enter prompts into an AI tool to see the output.

Security: Read through privacy agreements and make sure you are personally comfortable using different AI-based tools. Consider using more thoroughly vetted tools within university-licensed programs. Never put private information (especially student data or your own personal data) into consumer tools.

What about FERPA and other privacy compliance considerations? The Family Educational Rights and Privacy Act (FERPA) is the federal law designed to protect student information in higher education. Instructors who want to make use of these tools should note that use is considered FERPA compliant as long as users have logged in with their CGU accounts when signing up for these AI tools for CGU coursework. If you are considering requiring students to use an AI platform to complete coursework, please ask students to use their CGU email addresses. Note that with any online service that is not contracted by the university, FERPA regulations mean students must not be required to identify themselves to third parties. Academic records, including assessments such as examinations and assignments, are considered student records and are protected by FERPA. For example, the free version of ChatGPT should not be used to draft initial feedback on a student’s submitted essay that includes their identifying information. Asking the free version of ChatGPT to respond to general question prompts is not a FERPA violation, because no student information is provided to ChatGPT.
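To make the FERPA caution above concrete, the short sketch below shows one way identifying details could be stripped from text before it is ever pasted into a consumer AI tool. It is a minimal, hypothetical Python example; the redact helper, the placeholder labels, and the regular expressions are illustrative assumptions, not a CGU-approved or exhaustive de-identification method.

import re

# Hypothetical, minimal de-identification sketch: replace obvious identifiers
# (email addresses, ID-like digit runs, and known names) with placeholders
# BEFORE any text is sent to a consumer AI tool. Illustrative only; this is
# not an exhaustive or university-approved FERPA control.
def redact(text: str, known_names: list[str]) -> str:
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)   # email addresses
    text = re.sub(r"\b\d{7,10}\b", "[ID]", text)                  # student-ID-like numbers
    for name in known_names:                                      # names you know appear in the text
        text = re.sub(re.escape(name), "[STUDENT]", text, flags=re.IGNORECASE)
    return text

essay_excerpt = "Jane Doe (ID 30012345, jane.doe@cgu.edu) argues that..."
print(redact(essay_excerpt, known_names=["Jane Doe"]))
# -> [STUDENT] (ID [ID], [EMAIL]) argues that...

Even with redaction, the safest practice remains the one described above: keep student records and other non-public CGU information out of free or personal AI tools entirely.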
Intellectual Property
Most companies that own generative AI tools reserve the right to retain and use user data for various production purposes, as well as to share it with third-party stakeholders. For many tools, the company also retains rights in the outputs the tool generates, which has far-reaching implications for the definition of authorship and the public knowledge domain. In addition, small batches of user data may undergo human review within the tool-owning company as part of its production process. For these reasons, do not upload unpublished work or share personal or sensitive information in generative AI tools of any kind, and teach your students to do the same.
Teaching and Learning: Start at the Course Level
For Instructors: Develop a course policy on AI use that addresses not only plagiarism concerns and ground rules, but also issues of AI ethics, authorship, privacy, and online safety. Discuss it with students in the first days of class and continue discussing the relevant aspects of the policy throughout the semester. Keep in mind that students are likely encountering major differences in faculty opinions about generative AI. Whether or not you allow generative AI in your classroom, be very clear about your expectations.

Write a syllabus statement that clarifies your expectations for AI use. Instructors have the discretion to allow the use of AI or any other tools, but are encouraged to say so explicitly if they want to allow it. Stating what you expect students to use (or not use) in their work helps answer questions about what extra support is permissible. Depending on your pedagogical values and course expectations, consider adopting or revising one of the statements below for your syllabus. (All statements adapted from Artificial Intelligence Tools and Teaching by the University of Iowa’s Office of Teaching, Learning, & Technology.)
For Students: Make sure to read the course policy on AI use, which addresses not only plagiarism concerns and ground rules, but also issues of AI ethics, authorship, privacy, and online safety. Discuss it with your faculty and fellow students in the first days of class and continue discussing the relevant aspects of the policy throughout the semester. Keep in mind that faculty opinions about generative AI differ widely from course to course. Whether or not generative AI is allowed in a given course, be very clear about your understanding of that course’s expectations for AI use, and read the syllabus statement that clarifies them.

Citing AI

APA Style
If ChatGPT or other AI tools have been used in research, a clear description of the tool and its use should appear in the Method section of the paper (or a comparable section). For literature reviews, essays, and reflective/response papers, APA suggests describing the use of the tool in the introduction. You should always include the prompting language you used as well as identify the relevant text that was generated in response. Because these tools do not create content that can be replicated or retrieved by other readers, and because there is no person on the other side of the exchange, the output is not cited as a personal communication; instead, the author of the tool/algorithm is treated as the source and cited in APA Style. In-text citations and references follow the standard for software citation, shown in Section 10.10 of the Publication Manual.

Reference Format
OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat

Parenthetical Citation Format
(OpenAI, 2023)

Narrative Citation Format
OpenAI (2023)
Example Provided by APA Style Online
When prompted with “Is the left brain right brain divide real or a metaphor?” the ChatGPT-generated text indicated that although the two brain hemispheres are somewhat specialized, “the notion that people can be characterized as ‘left-brained’ or ‘right-brained’ is considered to be an oversimplification and a popular myth” (OpenAI, 2023).

Reference
OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat

MLA Style
If ChatGPT or another generative AI tool has been paraphrased, quoted, or incorporated into your work in any capacity, the source must be cited. MLA expects writers and researchers to acknowledge all functional uses of the tool in the text of the work, in a note, or otherwise clearly stated. MLA does not recommend treating the AI tool as an author, but it does recommend a number of strategies for acknowledging AI tools through its existing standards and recommendations at https://style.mla.org/citing-generative-ai/.

Works Cited Format
“Text of the prompt” prompt. ChatGPT, version date, OpenAI, date of the chat, chat.openai.com/chat.
Parenthetical Citation Format
A shortened version of the prompt, e.g., (“Describe the symbolism”).
Narrative Citation Format
Refer to the tool and the prompt in your prose, keyed to the shortened prompt used in the works-cited-list entry (see the quoted-in-prose example below).
Examples Provided by MLA Style Online

Quoted in Your Prose
When asked to describe the symbolism of the green light in The Great Gatsby, ChatGPT provided a summary about optimism, the unattainability of the American dream, greed, and covetousness. However, when further prompted to cite the source on which that summary was based, it noted that it lacked “the ability to conduct research or cite sources independently” but that it could “provide a list of scholarly sources related to the symbolism of the green light in The Great Gatsby” (“In 200 words”).

Works-Cited-List Entry
“In 200 words, describe the symbolism of the green light in The Great Gatsby” follow-up prompt to list sources. ChatGPT, 13 Feb. version, OpenAI, 9 Mar. 2023, chat.openai.com/chat.

Paraphrased in Your Prose
While the green light in The Great Gatsby might be said to chiefly symbolize four main things: optimism, the unattainability of the American dream, greed, and covetousness (“Describe the symbolism”), arguably the most important—the one that ties all four themes together—is greed.

Works-Cited-List Entry
“Describe the symbolism of the green light in the book The Great Gatsby by F. Scott Fitzgerald” prompt. ChatGPT, 13 Feb. version, OpenAI, 8 Mar. 2023, chat.openai.com/chat.

Talk About It: Additional Considerations about AI Use to Discuss and Communicate
For Faculty: In addition to syllabus statements, consider talking with your students about AI tools.
For Students: Using generative AI without proper acknowledgement to create or enhance submissions when an assignment does not explicitly call for it is academically dishonest. It is not, in fact, fundamentally different from having another person write your paper, take your test, or complete your assignment.

Faculty can make courses more creative and mitigate risks
While it may be difficult to come up with assignments that are completely AI-proof, the tips below can help you make your classroom more resistant to plagiarism.
Adapt assessments
AI tools are emerging quickly, and it can be incredibly difficult to make any assessment completely free from AI interference. Beyond a syllabus statement, you may also consider adapting your assessments to reduce the usefulness of AI products. Before revising any assignment, it is helpful to reflect on what exactly you want students to get out of the experience and to share those expectations with your students. Is it just the end product, or does the process of creating the product play a significant role?
Some questions to consider as you begin planning:
Potential assignment ideas:
Looking for more? Here are 101 crowd-sourced ideas for using AI in education and five things to think about as you begin teaching with AI.
CGU AI Tools: AI Detection Tools
CGU OIT makes Turnitin available as a tool for plagiarism detection. Turnitin offers an AI-detection feature, but that add-on is not currently part of our purchase. More importantly, neither plagiarism detection nor AI detection tools are reliably accurate; any results from these tools should be treated as nothing more than a starting point for a conversation between faculty and the student whose work is in question. Suspected use of excessive uncited content and/or generative AI in coursework is not sufficient evidence to begin a formal Academic Integrity investigation. Instead, we recommend that faculty document their expectations early and often and have open dialogues with students about the implications and responsible use of generative AI in coursework and academia. We recommend using Turnitin, or any other plagiarism detection tool, only after training on how to properly use such tools and on what they can and cannot provide.
Support and Training: How should you ask AI?
How you interact with AI matters. As discussed in the previous section, whether you are a faculty member, a student, or a staff member, the result is only as good as the data the tool is given, both how the tool is trained and how it is prompted; in short, what you ask AI determines what AI will give you. This practice is called prompt engineering. Prompt engineering has become so important that it has sparked an entire field of research and a profession, and OpenAI has published a page on the topic with key considerations for prompting AI to give you useful feedback. *Note: OpenAI has also published a page on safety best practices cautioning about the output of AI tools, including having a ‘human in the loop,’ that is, having a human verify the results of any AI interaction. Prof. Robert Klitgaard has likewise published some work on the topic drawn from his experience and has made it available to CGU constituents upon request.
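As a concrete illustration of prompt engineering and of keeping a human in the loop, here is a minimal sketch using OpenAI’s Python library (version 1 or later). The model name, the prompt wording, and the review step are illustrative assumptions rather than a CGU-endorsed workflow, and the call assumes an OPENAI_API_KEY environment variable tied to an appropriately licensed account.

from openai import OpenAI  # assumes the openai package, v1+, and an OPENAI_API_KEY environment variable

client = OpenAI()

# A structured prompt states the role, the task, the constraints, and the
# desired output format, rather than asking a single vague question.
prompt = (
    "You are a writing tutor for graduate students. "
    "Suggest three ways to sharpen the thesis statement below. "
    "Do not rewrite it; respond as a bulleted list.\n\n"
    "Thesis: AI tools are changing how universities teach writing."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # model name is an assumption; use whatever model your license provides
    messages=[{"role": "user", "content": prompt}],
)

draft = response.choices[0].message.content

# Human in the loop: a person reviews the output before it is used or shared.
print("AI draft for human review:\n", draft)

The difference between a vague request (“help with my thesis”) and a structured prompt like the one above is, in practice, the difference between generic output and usable feedback, which is the core of the prompt engineering guidance referenced above.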
Center for Teaching & Learning. (2024). How Do I Consider the Impact of AI Tools in My Courses? UMass Amherst. https://www.umass.edu/ctl/how-do-i-consider-impact-ai-tools-my-courses
Center for New Designs in Learning and Scholarship. (2023). ChatGPT and Artificial Intelligence Tools. Georgetown University. https://cndls.georgetown.edu/ai-composition-tools/#privacy-and-data-collection
Instruction Junction at The College. (2024). Academic Integrity and AI/ChatGPT. Arizona State University. https://instruction.thecollege.asu.edu/academicintegrityAIChatGPT
OpenAI Developer Platform. (2025). Prompt Engineering. OpenAI. https://platform.openai.com/docs/guides/prompt-engineering
OpenAI Developer Platform. (2025). Safety Best Practices. OpenAI. https://platform.openai.com/docs/guides/safety-best-practices
Office for Faculty Excellence. (2024). Practical Responses to Generative AI. Montclair State University. https://www.montclair.edu/faculty-excellence/teaching-resources/clear-course-design/practical-responses-to-chat-gpt/
Teaching & Learning Transformation Center. (2024). Artificial Intelligence (AI). University of Maryland. https://tltc.umd.edu/artificial-intelligence-ai