Generative AI (GenAI) has received considerable attention recently with the launch of several end-user services, including ChatGPT from OpenAI and Bard from Google. The technology generates new, seemingly original content, including prose and images, by learning patterns from existing data during pre-training.
In this way, GenAI can author well-composed, seemingly polished papers from only the slightest prompt, which might appear to make it the perfect companion in higher education, and indeed in education everywhere, especially during peak periods when the workload feels overwhelming.
There are, however, many problems associated with using GenAI in academia and elsewhere.
The classroom is specifically a space for learning and practicing invaluable writing and research processes that cannot be replicated by generative AI.
While the ever-changing (and exciting!) new developments with AI will find their place in our workforces and personal lives, in the realm of education, this kind of technology can counteract learning.
This is because the use of AI diminishes opportunities to learn from our experiences and from each other, to play with our creative freedoms, to problem-solve, and to contribute our ideas in authentic ways.
In a nutshell, a college is a place for learning, and generative AI (e.g., ChatGPT) cannot do that learning for us.
Academic integrity plays a vital role in the learning that takes place in your classes at Lewis University, and submitting work as your own that was generated by AI is plagiarism.
For all of these reasons, any work written, developed, created, or inspired by generative AI does not serve our learning goals, and submitting it is a breach of ethical engagement.