ChatGPT arrives in the academic world

Originally published at: ChatGPT arrives in the academic world | Boing Boing


This technology might bring about a “paradigm shift” in education, like the one a couple of thousand years ago from memorization to writing things down, because hey, now we have an alphabet! The skill of memorization fell by the proverbial wayside, replaced more or less by the skill of writing.

Maybe teachers who use writing assignments should shift more to teaching how to detect and refine good writing, and focus less on how to produce it.

But writing and learning aren’t really about whether someone has plagiarized or not. As educators we’re trying to help students read and interpret the world around them.

Yep. And if students will mostly be using AI to write during their future jobs, instead of writing themselves, shouldn’t teachers adapt to that by helping them practice getting an AI to produce good writing?


The point of college writing courses is to develop a bundle of interrelated skills and habits. “Writing”–that is, the production of coherent, readable texts–may be the focal point, but behind that short essay or longer term paper or bluebook essay exam stand not only control of grammar, rhetoric (argument, exposition, narrative), and semantics/semiotics (awareness of how language means) but command of a body of information or analytical technique (thus research and reading/comprehension/analytical skills). So any writing assignment should be designed to engage all of those areas, and the “product” should be evaluated accordingly. A teacher who settles for the kind of bland blather generated by this AI isn’t doing the job. (And teaching the skill-set and especially evaluating the papers is an endless slog, made worse when the students are unwilling or underprepared or both.)

I worked at that job for 19 years and have watched my wife do it for 50. It’s not hard to spot the students who are avoiding the intellectual work and generating AI-like cheese-food product. I’ve spent my post-teaching decades as a professional writer, and I seriously doubt that any AI could replicate my output. Not saying it’s impossible, but the organic intelligence (OI?) that produces my copy might be too engaged with the particularities of my experience and my internal semiotic network to be easily mimicked–except, perhaps, by another OI. (I recall that Iain M. Banks addresses the simulation problem in one of his later novels.)


Why is ChatGPT suddenly making such a big impact when GPT-3 has been just as easily available to the public for half a year or so now, and is much more powerful?


And it produces “references” in its papers: references in the same way that visual AIs produce meaningful text, or non-horror-adjacent fingers.


Because the writing per se is just a vehicle for showing that the student knows the material. It’s more akin to using calculators in maths exams. Can you solve the problem, even if you need help with multiplying? The work needed to write coherently is a complex set of tasks, and you can hardly be expected to recognize good writing if you’ve never been asked to create some.
People use these “services” to avoid having to do the work of learning the material. If you don’t know the material, how can you be expected to double-check what an AI has to say about it?


If they’re students, yes, that is mostly the case at the moment.

But as I wrote, it’s easy to imagine this kind of AI also being used to produce many forms of writing in many different jobs. If that’s the case, and if AI really will get better and better at writing for specific situations, why not teach students how to use it well to produce, or at least help them produce, good writing?

Is producing one’s own writing about the material the only way to come to know it? Of course it isn’t.

You can hardly be expected to recognize good music if you’ve never learned to compose some. You can hardly be expected to recognize good food if you’ve never learned to cook some. You can hardly be expected to recognize a good painting if you’ve never learned to paint. (See where I’m going with this?)


How long before we no longer have educators doing the job of educating and we just hand that over to AI… after all, we’re replaceable cogs, right? We have nothing to actually offer young people other than job training…

[Tired Bette Davis reaction GIF]


In order to write about the material you have to know it. If I’m tasked with giving a presentation about a topic and I go to the AI to create it, and the AI gets half of it wrong, but I don’t know the topic, how will I know? The writing isn’t the only point; there are many ways to learn. But learning how to think and communicate coherently can’t be done by having an AI write for you.
And people do learn to recognize what makes good, or at least effective, art through doing art. People sing, write lyrics, doodle, and all of that helps them see the art made by others.

Sure, but again, writing about the material isn’t the only way to know it.

The issue at hand is an AI that can already write in ways that pretty closely resemble student writing. You say that “learning how to think and communicate coherently can’t be done by having an AI write for you,” and I say, I’d bet it can, and educators may well have no choice in the future BUT to let students use an AI to write for them. What teachers can do, already, is teach students how to convert text written by an AI into even better text. And to do that, surely they’d need to know the material that the text is about.

Seems to me that if text written by AIs is to become the norm in the future, in all sorts of fields (a future I find all too easy to imagine), then we’d better hope that we also have good, well-trained human revisers and editors to make it accurate, coherent, and so on before it gets published.

You might enjoy shifting goal posts, but I’m not going to help you move that thing.


“Converting” text is what is usually called rewriting or editing. And those are writing skills, which is to say, they depend on the same machinery that generates original text. (A pedagogical cliché: Writing is rewriting.) A student who can’t write a grammatical sentence or a coherent paragraph is unlikely to be able to turn mediocre copy into something better. The process of learning how to write decent copy is iterative and imitative. And it helps a lot to have what used to be a middle-school grasp of grammar and sentence structure, if only so that there’s a shared terminology for explaining what’s going on in a text. (There’s a parallel issue in discussions of how much “theory” is needed to become a competent musician.)

My wife has students who do not know what a declarative (let alone a compound or compound-complex) sentence is. So did I, thirty-some years ago. That kind of ignorance makes coaching** developing writers even harder than it is to begin with.

** I use this term deliberately, since teaching writing is not much like teaching math or physics.


GPT-3 takes experience to use well: it is a completion engine, so you need to phrase your queries in the form it expects, that is, as the beginning of the text you want, which it then completes. ChatGPT may be less powerful in a sense, but it allows you to “talk” to it as if it were human. That just feels more impressive, even if it isn’t, technically.
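To make that concrete, here’s a minimal sketch of the difference using the legacy (pre-1.0) openai Python package. The model names, prompt, and key here are illustrative assumptions, and the ChatCompletion endpoint only appeared some months after ChatGPT’s web launch:

```python
import openai  # legacy pre-1.0 SDK, e.g. pip install "openai<1.0"

openai.api_key = "sk-..."  # placeholder; substitute a real key

# GPT-3 is a completion engine: you supply the *beginning* of the text
# you want, and it continues it. The shape of the prompt matters a lot.
prompt = (
    "The following is a short essay on how the printing press "
    "changed education.\n\nEssay:\n"
)
completion = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3-family completion model
    prompt=prompt,
    max_tokens=256,
    temperature=0.7,
)
print(completion["choices"][0]["text"])

# ChatGPT-style interaction takes the question directly, as a turn in a
# conversation, with no prompt-shaping needed:
chat = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user",
         "content": "How did the printing press change education?"},
    ],
)
print(chat["choices"][0]["message"]["content"])
```

The difference is the whole point: the first call only works well if you phrase the prompt as the opening of the document you want back, while the second lets you just ask.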


Yep. I didn’t mean all or nothing. Editing well certainly still comes from writing at least fairly well.
