Automated Collaboration
Harnessing the potential of generative AI requires critical thinking from both students and faculty
by Autumn Thatcher (MSC '15)
If popular culture is any indication, we, as a society, are fascinated by artificial intelligence (AI). The world of science fiction has allowed us to imagine that anything is possible when it comes to AI—even the undoing of the human race by our very inventions. In real life, we aren’t likely to be fighting for our survival against cyborgs, but the evolution of AI and its endless capabilities have had a ripple effect in the workplace.
The 2022 emergence of the generative AI platform ChatGPT has been received with a lot of enthusiasm. In the workplace, a 2023 survey conducted by Resume Builder revealed that 49 percent of companies use ChatGPT and that 90 percent of business leaders believe experience using it enhances the value of job seekers. But it has also been met with uncertainty and a bit of anxiety over the unknown. Will generative AI replace humans in the workplace? Do creators across the professional spectrum stand a chance against the otherworldly pace of AI-generated content?
According to some of Westminster’s faculty, the outlook for humans is good.
Exploring possibilities in and out of the classroom
Professor Christopher LeCluyse says that when it comes to thinking about whether to incorporate AI into the learning experience for students, that ship has sailed. Statistics reveal that nearly half of students engage with AI platforms: Nerdynav’s 2023 ChatGPT Cheating Statistics and Impact on Education reports 43 percent of college students have used ChatGPT or similar AI tools.
“Asking students not to use AI as part of assignments is like asking them not to use Wikipedia, google things, or use citation software,” Chris explains—“all of which, incidentally, were also originally reacted against with anxiety and a kind of moral panic. Some of this is just what we do when a new technology comes along.”
A professor of literature, media, and writing—as well as director of Westminster’s Writing Center and associate provost for curriculum and assessment—Chris has been closely following the evolution of generative AI and how students are engaging with it. He is also one of several faculty members and students who came together to present to Westminster’s board of trustees about AI, its benefits in the classroom, and the policies that should be put in place around it to protect the students and teachers who want to use it—and those who don’t.
Chris says that keeping up with evolving technology in higher education is an exciting exercise. It is one that brings with it plenty of opportunity for critical thinking—a key component of the classroom experience at Westminster University.
“I have quite a few colleagues who are productively using AI as parts of assignments and trying to design these in a way that students think critically about it. It’s effective to the extent that you engage students in talking about it—and asking, ‘What does AI do well, and what does it not do well?’” Chris says. “‘How do you use it ethically, and how do you use it unethically?’—so they’re making conscious choices in the open rather than it being this dirty little thing you do in secret.”
Chris says critical-thinking exercises and conversations—along with assignments that allow students to look at the pros and cons of engaging with generative AI—help both students and faculty feel more comfortable with the technology, as they find ways to let it help rather than hurt them.
“It can help with idea generation, planning, and outlining—and those would all be ethical uses of it,” Chris says. “It can also be used to help students understand passages. This might be useful particularly for multilingual students.”
And the cons? It seems as though many of the cons stem from a lack of understanding of what generative AI actually is or can do.
“ChatGPT cannot search the internet. It cannot search for real sources. If you ask it to write a paper that uses sources, it will fabricate them. If you ask it to quote sources, it will make up the quotations,” Chris explains. “I don’t think that students who are using generative AI without this kind of explicit instruction and discussion are aware of that.”
As with most things, the choice to use AI ethically or unethically comes down to the individual engaging with it. Those who learn to work with the technology to enhance their productivity and inspire professional and personal development are setting themselves up for greater success—and for entering their careers prepared to lead.
Preparing for success in the workplace
Computer science professor Jingsai Liang is no stranger to AI technology. When working with students, Jingsai and his colleagues are constantly exploring ways to ensure ethical use of AI in students’ work. Like the evolution of technology, it’s ongoing. But AI technology, unsurprisingly, is not a new, scary thing in the world of computer science.
Jingsai says that, while generative platforms such as ChatGPT can be used to write code at an entry level, students should learn to work with it in a way that enhances their work rather than replaces it because, at the end of the day, AI is not foolproof.
“You never can trust it. If you use ChatGPT, you know all the answers generated from it look correct—look perfect—but they could be totally wrong. They could be totally made up by some false information,” Jingsai says. “You cannot distinguish between a true message and a false message. And those answers probably are biased. It is not real understanding; it just summarizes the patterns from the existing knowledge. Even though ChatGPT can generate answers and solutions, it's still our job to review the work to make the final decision. It's our job, not the machine’s job.”
Jingsai—along with colleagues in Westminster’s computer science program and others around campus—is taking measures to ensure that students are equipped to make those final decisions. There are myriad AI programs that computer science students can use for coding, and their availability can present challenges in the classroom when students lean too heavily on the platforms to do their coursework. But to know how to enhance productivity through AI is to know how to function without it. That knowledge can be particularly useful when thinking about leading in a workplace where AI might begin occupying positions.
“Those machines are helping us, not destroying us,” Jingsai says. “They always need humans. They might replace some human jobs. But then they will create new human jobs—so it will help us to develop new skills. For example, the machine replaces the job, but the company will then need to hire someone to manage the machine, to maintain it.”
To ensure students are prepared, Jingsai and his colleagues are constantly thinking about how their coursework needs to evolve to help students keep up with technology—to be the humans equipped with the skills needed to manage and maintain the machines in the workplace.
Partnering with AI
“What I ended up learning once I started working after college was that you can’t prepare for a specific job,” says Calvin Golas (’21), a software engineer at Amazon. “Any of them could use any number of different tools. Being able to quickly adapt and catch on to the tools you’re going to be using—and use them at a running start—is what’s most helpful. That was definitely an aspect of what I learned at Westminster.”
In his role at Amazon, Calvin writes code for the company’s seller-notification platform. In addition to working on the software, he is responsible for its design and maintenance.
“Artificial intelligence, at least in my work context, tends to be used like a pair programmer: automating away things that would tend to take a lot of manual effort by a person looking over a large section of data,” Calvin says.
Engaging with AI has helped Calvin and his colleagues analyze data more rapidly—summarizing large amounts of customer feedback, for example, by running it through a large language model such as ChatGPT or Meta AI. At work, Calvin has used CodeWhisperer, but he says the foremost tool is GitHub Copilot. These tools fit into day-to-day work because they assist programmers with code completion, generating suggestions that can serve as inspiration. In Calvin’s experience, AI is a supplement to the work humans are doing—a way to help individuals produce more and free up time and space to expand their capabilities.
As roles in the workplace morph, professionals have the opportunity to be one step ahead of the game by experimenting with AI and finding ways to make their lives easier, while elevating their roles in the process.
“AI is such a big deal right now. Companies that are hiring want people with AI skills. Understanding its possibilities and limitations can help settle unreasonable fear around it,” Calvin says. “ChatGPT is free to use. It’s worth trying out: talk to it a little bit and see what it’s capable of. Learn to understand it in terms of what you can and can’t do with it.”
As professionals feel more comfortable partnering with AI, the creative ways it can be used in the workplace expand. Coupled with critical thinking, its potential is limitless.
“It is still our job to explore the new knowledge,” Jingsai says. “We need to master the skills of interacting effectively with AI to do a better job for a better life. You must use it. You must understand it. That’s the future.”
Sidebar: Alums weigh in anonymously on AI in the workplace
“It’s absolutely critical that students understand the ethical implications of using AI. Passing off AI-generated content as original will be a career killer, but showing resourcefulness and ingenuity through using AI will be a career-maker—especially in the marketing, communication, and design industries. A decent familiarity with AI will be expected; proficiency in using AI tools will make them a better candidate than their peers in the hiring process.”
“If it can benefit the people who you or your company tries to help, why would you not take advantage of that?”
“Accept that it will be the new reality and start learning to use it or how it works now.”
“Learn it, since it undoubtedly will become bigger as time goes on. In the same way Microsoft Paint was replaced by Adobe Creative Cloud, you should be learning the next thing, not settling for the tools that exist today.”
About the Westminster Review
The Westminster Review is Westminster University’s biannual alumni magazine, distributed to alumni and community members. Each issue aims to keep alumni updated on current campus events and to highlight the accomplishments of current students, professors, and Westminster alumni.