
Educational Technology Office presents on AI use at FSU


Kevin Kennedy speaking.
Alexis Schlesinger / THE GATEPOST

By Raena Hunter Doty
Arts & Features Editor

Instructional Technologist Kevin Kennedy, who works in the Educational Technology Office (ETO), hosted a panel discussion titled “How are FSU Faculty Using GenAI?” Oct. 9.

Kennedy said he originally gave a different version of the presentation to English professors, and the opening slide of the presentation is an artificial intelligence (AI)-generated illustration of “English faculty discussing AI.” The illustration showed a group of all-white faculty members sitting around a table with a robot in the center.

Kennedy said, “We kept [the illustration] because I think it demonstrates some of the issues we see when it comes to AI,” namely the literal interpretation of “AI” as represented by the robot and the fact that all the faculty were white.

Rachel Avard, professor of biology, presented the ways she has integrated AI into her classroom.

“I’ve been trying to incorporate AI into my classroom a lot because I think that it is both helpful to students, but also it is going to quickly grow into a requirement for students to have in the workforce,” Avard said.

She said the first AI assignment she gives her students is to write Python code using ChatGPT, an AI model. Coding is an increasingly important skill in the biology field, she said, but many biologists aren’t taught how to code, so if ChatGPT can do it for them, it gives them an edge in the workforce.

Avard added that though ChatGPT can be helpful, it can also generate incorrect output, and students may not always recognize what about the output isn’t working.

“It really, I think, demonstrates to the students how wrong it is, and how quick it is for me to look at the code that ChatGPT made and to say, ‘Nope, that was wrong because it’s missing a comma,’ something that they would have never caught,” she said.

She said this demonstrates to students the errors that can come from using an AI to generate a product, and added ChatGPT can often get the output correct on the second or third try, even with no alterations to the prompt.

Avard said she also encourages students to use AI to generate study guides and practice problems so they can continue learning without giving away any of the work to someone else.

She added the last major way she uses AI in her classroom is to discuss bioethical issues. She asks her students to discuss a major bioethical issue with an AI program like ChatGPT, taking turns and asking 12-15 questions so ChatGPT can help the students consider the other side of the debate.

“It really gives students this opportunity to learn how to use AI in the right ways and how it doesn’t always fit into our classrooms,” Avard said.
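The missing-comma error Avard described is easy to picture with a short, hypothetical example of the kind of script a biology student might ask ChatGPT to write - counting nucleotides in a DNA sequence. The function name and sequence below are illustrative assumptions, not code from her assignment.

    # Hypothetical example of a small script a biology student might request
    # from ChatGPT. A draft that drops a single comma in the dictionary below
    # (e.g. {"A": 0 "T": 0, ...}) fails with a SyntaxError - the kind of tiny
    # mistake an instructor catches quickly but a first-time coder may not.
    def nucleotide_counts(sequence):
        counts = {"A": 0, "T": 0, "G": 0, "C": 0}
        for base in sequence.upper():
            if base in counts:
                counts[base] += 1
        return counts

    print(nucleotide_counts("atgcgtacgttagc"))  # {'A': 3, 'T': 4, 'G': 4, 'C': 3}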
Kennedy presented next. He said he teaches first-year writing classes at Bridgewater State University, where he tries to integrate AI into the writing process.

He said his first step is to ask students to use AI to help generate thesis concepts and ideas for where to go next, then to reflect on how that conversation helped them.

Kennedy said in his Intro to Technical Writing class at Bridgewater State, he has students create a video tutorial for a process they know how to do well and then use an AI program to generate a list of instructions for the same process. They then compare and contrast the video and the AI-generated instructions.

Kennedy then presented on what some professors who were absent from the panel are doing with AI in their classrooms.

He said Trinidad Morales, professor of Sociology & Criminology, assigns his students to ask AI programs a series of questions about racism in the U.S. and write journal responses to the questions.

He added Yumi Park Huntington, professor of art history, uses AI programs to generate analyses of different art pieces. The AI often does well analyzing Western or American art but struggles to name styles and movements in other parts of the world.

Steve Courchesne, co-coordinator and professor of Education, presented his research on using AI to generate lesson plans, conducted with Education Professor Wardell Powell.

He said they started by experimenting with having it design lesson plans for younger, elementary-aged children, but have since shifted their efforts to high school lesson plans.

Courchesne displayed a diagram of how he and Powell have used AI to generate lesson plans, where the bulk of the work was in the “iterative” steps - “ChatGPT output,” “Reflection,” and “Follow-up question” - which could be repeated until it created a solid output.

He said ChatGPT can generally create a fairly viable lesson plan, “but there’s usually stuff that’s missing - details that are just not there that you would need.”

Courchesne said one example of this is that ChatGPT might outline a space for group discussion, but it wouldn’t give prompts for questions to ask during that discussion.

“The takeaway I want to share is the idea of iteration and the importance of iteration - the importance of not necessarily just taking the first version you get to any question you asked but thinking about, ‘What is missing? What do I need to add? What do I not trust from this?’” he said.

He added ChatGPT and other AI models can also struggle with “hallucinated” information - information generated by an AI model that can sound authoritative but is untrue.

Courchesne said in one attempt to create an elementary school lesson plan, ChatGPT recommended a children’s book about tiger biology, but when fact-checked, the book didn’t actually exist.

He said the way teachers prepare lesson plans with AI can also be applied to how students are expected to interact with AI - the same principles of questioning, fact-checking, and engaging in original research all apply.

Courchesne recommended putting more emphasis on grading how students create their products than on what the final products look like when AI is used, which may mean altering how grades are weighted.

To wrap up the presentation, Kennedy gave the teachers in the room a few reflective questions. The first asked them to consider what they believe are ethical uses of AI, which Kennedy said he thinks is “the biggest one.”

He also suggested anyone interested in integrating AI into their pedagogy consult with the ETO on designing assignments and getting feedback on implementing AI in their courses.
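Courchesne’s loop of output, reflection, and follow-up can be sketched in a few lines of Python. The ask_model helper below is a hypothetical placeholder for whatever chat tool a teacher might use - it is not the setup he and Powell described - and the sketch is only meant to illustrate iterating on a draft rather than accepting the first version.

    # Minimal sketch of the iterative loop Courchesne described:
    # get a draft, reflect on what is missing, ask a follow-up, repeat.
    def ask_model(prompt):
        """Hypothetical placeholder: send `prompt` to a chat model and return its reply."""
        raise NotImplementedError("Connect this to the chat tool of your choice.")

    def draft_lesson_plan(topic, max_rounds=3):
        draft = ask_model(f"Draft a high school lesson plan about {topic}.")
        for _ in range(max_rounds):
            # Reflection: the teacher, not the model, decides what is missing,
            # e.g. discussion prompts the first draft never included.
            gaps = input("What is missing from this draft? (leave blank to accept) ")
            if not gaps.strip():
                break
            # Follow-up question: feed the reflection back and ask for a revision.
            draft = ask_model(f"Revise this lesson plan. It is missing: {gaps}\n\n{draft}")
        return draft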
