Appendix
How to Use Artificial Intelligence
Unless you’ve been living under a rock since 2023, you’ve witnessed the deluge of commentary about Artificial Intelligence (AI), especially ChatGPT, the “generative” chatbot released by OpenAI. There’s already a tendency to criminalize the use of AI. That’s appropriate if people are using it to cheat. But not every use of AI is cheating. AI can play a positive role in research and writing, if we let it.
Writing is thinking. It’s not just a list of thoughts that you already had. Because writing is thinking, you’ll often find yourself changing some of your ideas as you write.
Writing is also a process. Your first draft won’t usually be your final one. As a process, writing often involves collaboration. Writers benefit from feedback, whether from peers or teachers. I certainly did while writing my book, Academic Writing as if Readers Matter.
Artificial Intelligence, particularly generative “Large Language Models” like ChatGPT, can be a collaborator of sorts, if you recognize its limitations and work within them. You would do this with any collaborator. You might show your work to one colleague because you know she’s great at the sentence level, but not at assessing your whole argument. With another colleague, it might be the other way around.
AI likewise has strengths and weaknesses. Large Language Models are good at generating a lot of basic information about well-known subjects very quickly. They’re also adept at summarizing. These skills can be useful, especially in the early stages of research and writing. But generative AI presents its findings in generic, mediocre prose that is less than ideal for any writer.
Before I go further, let me acknowledge the AI-spotting problems that have attracted so much public attention. Even in these early days of discursive AI, many people are willing to let second-rate machine-authored prose substitute for their own work. The problems with detection and assessment have already ignited a firestorm of hopes, suggestions, hand-wringing, and apocalyptic warnings. These legitimate concerns require collective problem-solving and adaptation. But they center on dishonest writers, not you.
Using AI doesn’t equal “getting away with something.” Nor is AI new; the internet has long relied on it. You use it when you do a Google search, for example. And there’s no clear line between “my intelligence” and “other intelligence,” either. It’s an academic truism that no idea exists in an intellectual vacuum. We already use other people’s words and ideas when we quote and paraphrase, for instance.
I’ve suggested in this book that you not quote unless the words of your source are demonstrably better than yours. AI writes credibly, but not with emotion or personal voice. AI is not better than you. If you think it is, then read this book again and practice a lot.
The question, then, is how to use AI to help your writing. It’s a powerful tool, and you may find it useful. One professor compares it to “a high-end intern.” You may decide to use it to start a writerly brainstorm, or to finish one. Or you may want to compare AI’s ideas to your own after you write a draft. These are just a couple of possibilities.
But if you do any of these things, keep these cautions in mind:
ChatGPT and other AI tools do their best work on subjects that are widely written about. The reason is simple: AI works by scouring information pulled from the internet. The more information that’s available about a given subject, the more knowledgeable the AI will be. If you ask for an AI boost on an obscure topic, one of two things will happen: either the AI will come up empty, or it will make shit up. (Yes, it really does that. How very human.)
Given that the machine may cheat: Don’t rely on AI to know things instead of knowing them yourself. AI can lend a helping hand, but it’s an artificial intelligence that isn’t the same as your intelligence. The educational world is rapidly filling with stories of students who submit AI-written papers containing errors that the students don’t catch because they never bothered to learn the material themselves. Those transgressions will receive their just deserts from teachers, or at the Final Judgment. My point is simply that, as a writer, you have to know the stuff you’re writing about.
If you rely on AI to do the thinking, you become the curator, not the author, of the writing that results. And without an author, the writing will be bloodless. Flat, affectless writing might be okay for a user’s manual for your new air conditioner. But scholarly writing across disciplines needs sensibility.
Luckily, sensibility is something that humans have plenty of. Keep these cautions in mind, and go ahead and add AI to your toolkit. Just remember: use it to help you, not be you.