General Discussion

Study finds 89% of students have used ChatGPT, but 72% want it banned
Story from Futurism last week about the study, then excerpts from the Study.com website.
https://futurism.com/the-byte/students-admit-chatgpt-homework
The responses were surprising. A full 89 percent said they'd used it on homework. Some 48 percent confessed they'd already made use of it to complete an at-home test or quiz. Over 50 percent said they used ChatGPT to write an essay, while 22 percent admitted to having asked ChatGPT for a paper outline.
-snip-
At the same time, according to the study, almost three-quarters of students said they wanted ChatGPT to be banned, indicating students are equally worried about cheating becoming the norm.
Educators are also understandably worried about AI having a major impact on their students' education, and are resorting to AI-detecting apps that attempt to suss out whether a student used ChatGPT.
-snip-
From Study.com, which is interesting because fewer teachers than students think ChatGPT should be banned...though that seems to be partly because they want to use it themselves, and partly out of simple naivete about whether it will be possible to detect student cheating.
https://study.com/resources/perceptions-of-chatgpt-in-schools
Over a third (34%) of all educators believe that ChatGPT should be banned in schools and universities, while 66% support students having access to it.
Clearly, the perceived value outweighs the risks, so how do teachers plan on using it? Out of the 21% of educators getting a jump start on this technology:
7% use ChatGPT to provide writing prompts.
5% use ChatGPT to help teach a class.
4% use ChatGPT to create lesson plans.
4% use ChatGPT to teach writing styles.
3% use ChatGPT as a digital tutor.
-snip-
Study.com suggests that students' greater belief that ChatGPT should be banned because of the risk of cheating is probably due to "the 15% of educators who have used ChatGPT to discuss the moral implications of technology."
Good for the 15%...but it's also very disappointing that 100% of teachers didn't discuss the ethical implications.
As for the teachers' naivete... Judging by the comments, some think it's really no different from using Google (wrong). Others believe anything generated by AI has "fingerprints" that can always be detected (also wrong - and I wish those teachers had said whether they'd actually tried to find detection tools, since they clearly hadn't bothered).

And one wasn't worried about cheating at all, quoting: "I teach in a very underserved community, and the vast majority of my students don't have internet access at home (other than on their phones), so I'm not worried about the cheating aspect." So that teacher isn't concerned because his or her students supposedly can't afford to cheat - which ignores not only the basic ethical issue about AI but other basic ethical issues as well.

And the advantage that cheating with AI gives students who can afford it will only increase now that OpenAI is offering a better version, ChatGPT Plus, for $20 a month, with another tier on top of that and presumably more expensive tiers to come - the pricier versions having access to more data, newer data, and probably being harder to detect.
What I found most revealing, though, was how many students felt it should be banned.
They're probably much more aware than their teachers of just how much ChatGPT will encourage cheating and laziness, and how much it will discourage students from bothering to learn at all - including learning how to reason and communicate - if they can get AI to do the work for them while other students using AI get better grades.
brush
(53,922 posts)We never had these types of "study aids" when I was in college. Imagine not having to do the college course work - just open up the ChatGPT app, put your feet up, and wait for it to generate your essay/work for you.
There is the downside though. Students don't learn anything if their work is done for them.
Doesn't augur well for a nation if most of its college grads cheated their way through school and don't know what their degree says they do.
NJCher
(35,758 posts)I'm a teacher. In a classroom discussion, someone brought up a legal challenge about student loans. Some student and his attorney had an angle they were trying out in the courtroom. The newspaper article said it could have resulted in students having an out of some sort on their loans.
Practically everyone in the classroom thought that was wrong. They said, "We took out those loans; we should have to pay them back."
I didn't comment on this because they lacked the perspective that the institutions had been hiking up the tuition because student loan aid was available. It's not my place to proselytize, anyway. Regardless, I think their opinion shows that students have a moral and ethical sense.
Consequently, as a higher ed teacher of many decades, this does not surprise me.
=======
On another topic on this issue: seasoned teachers are going to know whether a student wrote an essay or not. And AI detection will get there eventually, to warn those who don't have that sense.
Students are quite naive about what they think their teacher will accept. They don't have the years of lived experience that we have to be able to make a judgment on this.
Consequently, I have not been involved in the freakout over ChatGPT.
I just shrug it off. One more new item to consider when grading an assignment.
VMA131Marine
(4,149 posts)ChatGPT doesn't know what the assignment was, so it's still up to the student to add all the relevant details. I'm totally fine with using it as a starting point for an outline, because the writer is still going to need to make sure everything needed is included. Where I would draw the line is in a creative writing class: the AI is competent, but not artistic, and that's pretty obvious.
highplainsdem
(49,044 posts)

Sympthsical
(9,127 posts)I've been using it the past two weeks or so as a study/homework aid. Mainly out of curiosity about its capabilities - I'd never use it to write a paper.
I'll go through the materials and my notes as usual. Then, as I get ready to wrap everything up into a block for studying, I run various questions through it just to see what it spits out.
What I've found is that it's really good at synthesizing information. Where I would manually go through, pick out the important points, and assemble the information in a way that makes logical, cohesive sense, I find it can do that organizing for me really, really well. If I know what to ask and how to ask it to explain a topic, it'll spit out a really decent CliffsNotes version of the subject matter.
I wouldn't rely on it totally, because there are things in places like my lecture notes and textbooks the AI doesn't touch on in its briefer digestion of the questions.
But I could see students easily trying to rely on it and being able to do decently in the class. And it does remove a critical component of the thinking skills that are almost the entire point of attending college. Knowing how to gather and interpret information is pretty much 80% of college right there.
And the AI can short circuit a lot of this.
Polybius
(15,507 posts)Ban it for everyone! Wait, can it help me? Ban it for everyone else!
fescuerescue
(4,448 posts)This reminds me of the math teacher calculator panic of 1973.