
The Wheel

St. Catherine University’s official student news, since 1935.

Fall Special Edition 2023: Opinion: Tech-savvy Katies: Reexamining AI use

By KC Meredyk

Earlier this semester, I wrote an article about the university’s policy on the use of AI. Now, I’m returning to the subject to explore the best ways to use AI for class work (when it’s allowed). As a humanities major, I don’t use AI for any assignment. If I did use it to write an assignment, I’d feel like I was burning my tuition money. Being a humanities major is all about developing critical thinking skills, and that skill is stripped away when AI does the critical thinking for you. But as a child of two people in tech, I’m always curious about how people use technology.

Naomi S. Baron argues in “Why Human Writing Is Worth Defending in the Age of ChatGPT” for Stanford University Press that, since writing helps us think, giving up that task to AI disempowers us cognitively and expressively.

Sometimes, AI writing advice is a matter of stylistic choice, and its “corrections” are unneeded, Baron adds. Other times, the “corrections” are flat-out wrong. Take the Oxford comma (also called the serial comma): if you write a sentence like “Economic stressors, internal political instability and the looming war …,” tools like Grammarly will tell you to put a comma after “instability.” You don’t need one, though; it’s a matter of choice.

Mindlessly following the suggestions of AI can damage your writing and take away your “voice” in what you have written. As English professor Dr. Kristen Lillvis said, “Every writer — even if they don’t recognize themselves as writers — [has] a unique voice. I would hate for a writer to not recognize and not develop that.”

Language is power. While you do not have to agree with those who argue you should never use AI writing tools, you do have to be conscious of how the tools’ suggestions may alter what you think. As Lillvis noted in both interviews for my articles on AI this semester, it is crucial to develop critical AI literacy skills: learn how AI works, recognize its flaws and figure out how to use it responsibly.

This article is only a starting point if you want to try ChatGPT (or another AI tool) but don’t know where to start, or haven’t been able to get anything good out of the bot. ChatGPT and other generative AI models are trained on enormous amounts of example text and use the patterns in that text to predict what words should come next. Because of this, AI is generally considered to be decent at the more formulaic elements of writing.
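To make that prediction idea concrete, here is a minimal sketch of my own (not something drawn from Baron or the university) that uses the small, open-source GPT-2 model from the Hugging Face transformers library to print the words the model thinks are most likely to come next. ChatGPT and Bing’s AI are far larger and tuned to follow instructions, but underneath they generate text the same way, one predicted word at a time.

```python
# Illustrative sketch only: next-word prediction with the open-source GPT-2 model.
# Assumes the "transformers" and "torch" packages are installed.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The student opened her laptop and began to"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The model assigns a probability to every possible next token; printing the
# top few shows that "writing" here is really word-by-word prediction.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)
for p, token_id in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{tokenizer.decode([token_id])!r}: {p:.3f}")
```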

Here are some general recommendations for using tools like ChatGPT, drawn from “6 ChatGPT Mistakes to Avoid” by Maxwell Timothy for MakeUseOf. Make sure to provide appropriate levels of context. Keep the chatbot going for a while so it develops its responses using the earlier parts of the conversation. Keep prompts simple; excessive detail can cause issues. And if you want to change topics, start a new conversation. Otherwise, the bot might “hallucinate,” which means it starts making up information.
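For the curious, here is a minimal sketch of what “keeping the conversation going” looks like under the hood, using OpenAI’s Python library (the model name and prompts here are placeholders of my own, not recommendations). The bot has no memory of its own: every reply is generated from the list of messages you send it, which is why earlier turns shape later answers and why a new topic deserves a new conversation.

```python
# Illustrative sketch with the OpenAI Python library (v1+).
# Assumes an API key is set in the OPENAI_API_KEY environment variable;
# the model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()

# The whole "conversation" is just this list; the context lives here, not in the bot.
messages = [
    {"role": "system", "content": "You are helping a student brainstorm essay topics."},
    {"role": "user", "content": "Suggest three angles for an essay on AI in college writing."},
]

response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
reply = response.choices[0].message.content
print(reply)

# To keep the chatbot going, append its answer and your follow-up,
# then send the whole history back.
messages.append({"role": "assistant", "content": reply})
messages.append({"role": "user", "content": "Narrow the second angle into a thesis statement."})
response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(response.choices[0].message.content)

# Changing topics? Start over with a fresh, empty messages list instead of
# piling unrelated requests into the same history.
```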

In Dr. Lillvis’s English class, “Writing for Digital Age,” an assignment required students to use ChatGPT, and almost everyone noticed that the bot created fake quotes. One student asked why it created a false quote, to which it responded, “It was an important point.” Hallucinating is a common problem for all chatbots, but changing topics mid-conversation or providing excessive detail makes a bot more likely to start spouting falsehoods. Remember, “Garbage in, garbage out.”

We should also remember that AI is trained on data sets by people, and people have biases. As Lillvis pointed out, it is vital to ensure you stand behind what the bot has created.


This is an example of a bot “hallucinating,” having decided that AI were students or faculty at the university despite that being nowhere in my prompts. Credit: OpenAI

Earlier this semester, in my exploration of ChatGPT, I mostly got garbage when I tried to get the bot to create titles for a news article. Now armed with these new tips, I attempted to go back and get better results. 

While it did hallucinate, I got a number of results that I thought were pretty good, and I did not get any repeats this time. Some of the best were, “AI-Enhanced Essays: Are Students Gaming the System or Gaining an Edge?” “The New Best Friend of College Students” and “From Procrastination to Perfection.”

Matt Ellis wrote an article for the Grammarly blog (a familiar friend to many of us) suggesting ways to use AI effectively by playing to its strengths. AI is decent at tasks like generating ideas, writing outlines and basic proofreading. It is also good at generating citations, though you should double-check them against a template.
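That last caution is easy to act on. As one illustration (a sketch of my own, not a tool Ellis recommends), a few lines of Python can flag an AI-generated, APA-style reference that is obviously missing a piece, such as its year. It is no substitute for confirming the source actually exists, but it catches the formulaic slips.

```python
import re

# Rough APA-ish shape: Author, A. A. (Year). Title. Publisher or journal.
# A sanity check for obvious formatting gaps, not a real citation validator.
APA_PATTERN = re.compile(r"^[A-Z][\w'\-]+, (?:[A-Z]\.\s?)+\(\d{4}\)\. .+?\. .+\.$")

citations = [
    # Baron's 2023 book from Stanford University Press.
    "Baron, N. S. (2023). Who wrote this? How AI and the lure of efficiency threaten human writing. Stanford University Press.",
    # Hypothetical reference, deliberately missing its year.
    "Smith, J. The future of AI writing. Tech Journal.",
]

for citation in citations:
    status = "looks complete" if APA_PATTERN.match(citation) else "check this one"
    print(f"{status}: {citation}")
```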

Lillvis and the students of “Writing for Digital Age” found that AI was most useful for the brainstorming phases of writing. Lillvis noted, “If you’re somebody who doesn’t like staring at a blank page, it can be a way to alleviate that fear because there’s something there that you can run with.”

Of the students who responded to the survey and had used ChatGPT, Madeline Maurer ‘25 (Elementary Education) viewed it as a good jumping-off point after brainstorming for assignments. Fern Schiffer ‘25 (English) said, “I found the results pretty underwhelming: It made up quotes from the sources, it wasn’t consistent in what position it was arguing for and the prose was overall kinda bad. I don’t see myself using GPT in the future, as correcting the errors in the text that it produces would be just as much work as writing something from scratch.” 

Currently, a number of tools exist that use AI to do research. They can be used to find sources, cite sources or both. One of the easiest to use is Bing’s AI, an extension of ChatGPT. It is more cautious than most AI tools because it searches the internet and curates its results. The main advantage of using generative AI through Bing is that it provides citations. Once again, you should not take everything the AI says at face value, in case the sources it pulls from are incorrect. Think of it like opening a Wikipedia page: You shouldn’t cite Wikipedia, but you can use it to get an overview of a topic and to find sources.

Here is an example of what Bing’s Knowledge Cards look like. There are no citations. Credit: Bing

I asked Bing’s AI to explain to me how Knowledge Cards and Featured Snippets differed from each other. It explained that Knowledge Cards are not pulled from specific web pages and display general information, while Featured Snippets are a quick answer at the top of the results page. Credit: Bing

Above is a Featured Snippet about Rasputin. As you can see, it gives a short overview of who Rasputin was and the major events in his life; below the summary are links to the sources. Credit: Bing

The important thing about these suggestions is that none of them recommend having an AI tool write your paper for you. The ability to express your own thoughts is a crucial one to develop. If you use AI to write something for you, it is no longer yours.
