One Tech Tip: Do's and don'ts of using AI to help with schoolwork

Photo credit AP News/Kiichiro Sato

The rapid rise of ChatGPT and other generative AI systems has disrupted education, transforming how students learn and study.

Students everywhere have turned to chatbots to help with their homework, but artificial intelligence's capabilities have blurred the lines about what it should — and shouldn't — be used for.

The technology's widespread adoption in many other parts of life also adds to the confusion about what constitutes academic dishonesty.

Here are some do's and don'ts on using AI for schoolwork:

Don't just copy and paste

Chatbots are so good at answering questions with detailed written responses that it's tempting to just take their work and pass it off as your own.

But in case it isn't already obvious, AI should not be used as a substitute for putting in the work. And it can't replace our ability to think critically.

You wouldn't copy and paste information from a textbook or someone else's essay and pass it off as your own. The same principle applies to chatbot replies.

“AI can help you understand concepts or generate ideas, but it should never replace your own thinking and effort,” the University of Chicago says in its guidance on using generative AI. “Always produce original work, and use AI tools for guidance and clarity, not for doing the work for you.”

So don't shy away from putting pen to paper — or your fingers to the keyboard — to do your own writing.

“If you use an AI chatbot to write for you — whether explanations, summaries, topic ideas, or even initial outlines — you will learn less and perform more poorly on subsequent exams and attempts to use that knowledge,” Yale University's Poorvu Center for Teaching and Learning says.

Do use AI as a study aid

Experts say AI shines when it's used like a tutor or a study buddy. So try using a chatbot to explain difficult concepts or brainstorm ideas, such as essay topics.

California high school English teacher Casey Cuny advises his students to use ChatGPT to quiz themselves ahead of tests.

He tells them to upload class notes, study guides and any other materials used in class, such as slideshows, to the chatbot, and then tell it which textbook and chapter the test will focus on.

Then, students should prompt the chatbot to: “Quiz me one question at a time based on all the material cited, and after that create a teaching plan for everything I got wrong.”

Cuny posts AI guidance in the form of a traffic light on a classroom screen. Green-lighted uses include brainstorming, asking for feedback on a presentation or doing research. Red-lighted, or prohibited, uses include asking an AI tool to write a thesis statement or a rough draft, or to revise an essay. A yellow light is for when a student is unsure whether AI use is allowed, in which case he tells them to come and ask him.

Or try using ChatGPT’s voice dictation function, said Sohan Choudhury, CEO of Flint, an AI-powered education platform.

“I’ll just brain dump exactly what I get, what I don’t get” about a subject, he said. “I can go on a ramble for five minutes about exactly what I do and don’t understand about a topic. I can throw random analogies at it, and I know it’s going to be able to give me something back tailored based on that.”

Do check your school's AI policy

As AI has shaken up the academic world, educators have been forced to set out their policies on the technology.

In the U.S., about two dozen states have issued state-level AI guidance for schools, but the guidance is unevenly applied.

It's worth checking what your school, college or university says about AI. Some might have a broad institutionwide policy.

The University of Toronto's stance is that “students are not allowed to use generative AI in a course unless the instructor explicitly permits it” and students should check course descriptions for do's and don'ts.

Many others don't have a blanket rule.

The State University of New York at Buffalo “has no universal policy,” according to its online guidance for instructors. “Instructors have the academic freedom to determine what tools students can and cannot use in pursuit of meeting course learning objectives. This includes artificial intelligence tools such as ChatGPT.”

Don't hide AI use from teachers

AI is not the educational bogeyman it used to be.

There’s growing understanding that AI is here to stay and the next generation of workers will have to learn how to use the technology, which has the potential to disrupt many industries and occupations.

So students shouldn't shy away from discussing its use with teachers, because transparency prevents misunderstandings, said Choudhury.

“Two years ago, many teachers were just blanket against it. Like, don’t bring AI up in this class at all, period, end of story,” he said. But three years after ChatGPT's debut, “many teachers understand that the kids are using it. So they’re much more open to having a conversation as opposed to setting a blanket policy.”

Teachers say they’re aware that students are wary of asking if AI use is allowed for fear they’ll be flagged as cheaters. But clarity is key because it’s so easy to cross a line without knowing it, says Rebekah Fitzsimmons, chair of the AI faculty advising committee at Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.

“Often, students don’t realize when they’re crossing a line between a tool that is helping them fix content that they’ve created and when it is generating content for them,” says Fitzsimmons, who helped draft detailed new guidelines for students and faculty that strive to create clarity.

The University of Chicago says students should cite AI if it was used to come up with ideas, summarize texts, or help with drafting a paper.

“Acknowledge this in your work when appropriate,” the university says. “Just as you would cite a book or a website, giving credit to AI where applicable helps maintain transparency.”

And don't forget ethics

Educators want students to use AI in a way that's consistent with their school's values and principles.

The University of Florida says students should familiarize themselves with the school's honor code and academic integrity policies “to ensure your use of AI aligns with ethical standards.”

Oxford University says AI tools must be used “responsibly and ethically” and in line with its academic standards.

“You should always use AI tools with integrity, honesty, and transparency, and maintain a critical approach to using any output generated by these tools,” it says.

____

Is there a tech topic that you think needs explaining? Write to us at onetechtip@ap.org with your suggestions for future editions of One Tech Tip.
