Teaching Responsible Use of AI Tools in the Social Sciences

Like many teachers, I’ve been wrestling with how to handle AI in my classroom. Banning it is neither realistic nor the best option. I use plenty of AI tools myself, so taking the most extreme stance of banning AI would be hypocritical. Although I am actively increasing tech-free activities in my classes this year to decrease screen time, I also need to explicitly teach students how to use AI appropriately.

An early step I took was a search of existing AI policies, starting with academic contests like NHD and the John Locke essay contest and ending with international schools and college/university syllabi. My favorite wording of an AI policy came from a Georgia Institute of Technology syllabus:

“We treat AI-based assistance, such as ChatGPT and Github Copilot, the same way we treat collaboration with other people: you are welcome to talk about your ideas and work with other people, both inside and outside the class, as well as with AI-based assistants. However, all work you submit must be your own.”

If you’d like to see the full policy, it is syllabus number 39 in the linked Google Doc. This collection of over 200 policies from college syllabi was a great resource for seeing the range of college-level responses to the AI challenge. The one I quoted stood out to me because it framed AI within the context of collaboration. We already teach students how to collaborate with their peers, so it makes sense that we also need to teach them how to collaborate with AI.

Since my 9th and 10th graders complete National History Day projects, that felt like the perfect context to try a discipline-specific approach. Thanks to some great collaboration with my school librarian, I developed a lesson organized around AI scenarios, which I have provided below. Students will do a simple gallery walk, having discussions at each scenario about whether or not they think the student actions described are appropriate or ethical. Afterward, they will place these scenarios in the various levels of my school’s AI policy and develop a list of acceptable AI tasks for research in the social sciences.

It’s not a world-shattering lesson idea, but I hope it pushes students to think a little more carefully about how they use AI. As an added layer, my 9th graders are being assigned a short article titled “AI is Making You Dumber” and having a mini-Harkness discussion on AI’s ramifications, academic dishonesty, and plagiarism.


My Student AI Scenarios

1. Brainstorming a Topic

A student is beginning their NHD project and feels overwhelmed by the broad annual theme. They ask an AI tool to generate a list of possible topics, and the tool provides a wide range of suggestions. Some of the topics are directly connected to the theme, while others are more loosely related. The student writes down a short list of the most interesting ideas and plans to discuss them with their teacher before choosing a final topic.

2. Background Summaries

A student selects a topic on medieval trade but realizes they don’t know enough about the period to start. They use an AI tool to request a plain-language summary of the time period and its main events. The tool generates an overview that introduces key people, places, and concepts. The student copies some of the background information into their notes as a starting point before moving on to books and databases.

3. Primary Source “Shortcuts”

While researching, a student finds a long primary source, a political speech from the 1800s. They paste the entire document into an AI tool and ask it to explain the meaning in simpler terms. The tool produces a summary of the main points and identifies a few quotes that seem important. The student saves both the summary and the quotes in their notes to use later in their project.

4. Drafting a Thesis Statement

A student has collected several sources but struggles to express their argument clearly. They prompt an AI tool to generate sample thesis statements related to their topic. The tool responds with three possible versions, each with a slightly different angle. The student copies the suggestions into their notes and uses them to think through how they might structure their own claim.

5. Citation Help

As the deadline approaches, a student is assembling their bibliography and isn’t confident in using Chicago style. They paste a list of their sources into an AI tool and ask it to format them correctly. The tool produces citations with titles, dates, and page numbers arranged into a bibliography. The student then pastes these into their project and adjusts them where needed.

6. Writing Entire Sections

A student feels behind on their paper and wants to finish quickly. They ask an AI tool to “write 500 words on the causes of the Opium War.” The tool produces a polished section with clear paragraphs and transitions. The student copies the response into their project draft and marks it as a completed portion of their work.

7. Feedback on Writing

After drafting an essay for their NHD project, a student wants to know how to improve it. They paste their draft into an AI tool and ask for feedback. The tool highlights sentences that could be clearer, points out places where more evidence could be added, and suggests smoother transitions. The student copies the suggestions into their notes and makes revisions to their draft.

8. Data Fabrication

A student is writing about women in the French Revolution and wants direct quotes to support their points. They ask an AI tool to provide quotes from primary sources on the topic. The tool produces three different quotes, each with an attribution to a historical figure. The student writes them down in their notes and plans to include them in their final project.


The dangers of AI are not entirely new

Ancient and classical histories are full of exaggerations, fabricated speeches, and questionable data. Part of this stems from history’s immaturity as a discipline in those eras; the realities of source availability and the intellectual values of the times also contributed.

As historical methodology has evolved, it has not just been the classical historians who have had charges of exaggeration or inaccuracy leveled at them. George Bancroft, a giant of early American historiography, has been charged with overly nationalistic interpretations and, more seriously, with “manipulating sources” through his “tendency to quote material rather loosely.” Historical negationism, of which Holocaust denial and “Lost Cause” Confederacy narratives are a part, has been used to suppress certain sources or perspectives, manipulate the narrative, and distort or even falsify the historical record.

On a less serious note, popular history is full of little white lies that grew from legend or parable and represent truth in the minds of many people. Whether it’s George Washington chopping down a cherry tree, Marie Antoinette saying “Let them eat cake,” or Nero fiddling while Rome burned, these stories misrepresent history even if their original creators believed they captured the “spirit” of historical events.

AI tools introduce strikingly similar problems when students lean on them uncritically. AI has generated sources that don’t exist, misquoted sources or missed their full context, fabricated data, and made countless other mistakes that distort understanding. The pitfalls of AI echo those in the evolution of historical practice. AI may be a potentially dangerous new tool, but the problems it threatens us with are not new. Engaging with students around these pitfalls and teaching them how to overcome them, paired with building their academic self-confidence, is more valuable than simply banning AI.