Idk, I think we're back to "it depends on how you use it". Once upon a time, the same was said of the internet in general, because people could just go online and copy and paste shit and share answers and stuff, but the Internet can also just be a really great educational resource in general. I think that using LLMs in non-load-bearing, "trust but verify" type roles (study buddies, brainstorming, very high-level information searching) is actually really useful. One of my favorite uses of ChatGPT is when I have a concept so loose that I don't even know the right question to Google; I can just kind of chat with the LLM and gradually refine it into a narrower, more googleable question.
It’s funny how everyone is against students using AI to get summaries of texts, PDFs, etc., which I totally get.
But during my time through med school, I never got my exam papers back (ever!). The exam was a test where I needed to prove that I had enough knowledge, but the exam should also be allowed to show me where my weaknesses are so I could work on them. But no, we never got our papers back. And this extends beyond med school: exams like the USMLE are long and tiring, and at the end of the day we just want a pass, another hurdle to jump over.
We criticize students a lot (rightfully so), but we don’t criticize the system where students only study because there is an exam, not because they are particularly interested in the topic at hand.
A lot of topics that I found interesting in medicine got dropped because I had to sit for other examinations.
Even setting aside all of those things, the whole point of school is that you learn how to do shit; not pass it off to someone or something else to do for you.
If you are just gonna use AI to do your job, why should I hire you instead of using AI myself?
The issue as I see it is that college is a barometer for success in life, which for the sake of brevity I'll just say means economic success. It's not just a place of learning, it's the barrier to entry - and any metric that becomes a goal is prone to corruption.
A student won't necessarily think of using AI as cheating themselves out of an education because we don't teach the value of education except as a tool for economic success.
If the tool is education, the barrier to success is college, and the actual goal is to be economically successful, why wouldn't a student start using a tool that breaks open that barrier with as little effort as possible?
How people think I use AI
"Please write my essay and cite your sources."
How I use it
"please make the autistic word slop that I wrote already into something readable for the neurotypical folk, use simple words, make it tonally neutral. stop using emdashes, headers, and lists, and don't mess with the quotes"
I've said it before and I'll say it again. The only thing AI can, or should, be used for in the current era is templating... I suppose things that don't require truth or accuracy are fine too, but yeah.
You can build the framework of an article, report, story, publication, assignment, etc using AI to get some words on paper to start from. Every fact, declaration, or reference needs to be handled as false information unless otherwise proven, and most of the work will need to be rewritten. It's there to provide, more or less, a structure to start from and you do the rest.
When I did essays and the like in school, I didn't have AI to lean on, and the hardest part of doing any essay was.... How the fuck do I start this thing? I knew what I wanted to say, I knew how I wanted to say it, but the initial declarations and wording to "break the ice" so-to-speak, always gave me issues.
It's shit like that where AI can help.
Take everything AI gives you with a gigantic asterisk, that any/all information is liable to be false. Do your own research.
Given how fast things are moving in terms of knowledge and developments in science, technology, medicine, etc., and how that's transforming the way we work, now more than ever what you know is less important than what you can figure out. That's what the youth need to be taught: how to figure that shit out for themselves, do the research, and verify their findings. Once you know how to do that, you'll be able to adapt to almost any job that you can comprehend from a high level; it's just a matter of time, patience, research, and learning.
With that being said, some occupations have little to no margin for error, which is where my thought process inverts. Train long and hard before you start doing the job.... Stuff like doctors, who can literally kill patients if they don't know what they don't know.... Or nuclear power plant techs... Stuff like that.
This reasoning applies to everything. For example, the tariff rates that the Trump admin imposed on each country and territory were very likely based on a response from ChatGPT.
Okay but I use AI with great concern for truth, evidence, and verification. In fact, I think it has sharpened my ability to double-check things.
My philosophy: use AI in situations where a high error-rate is tolerable, or if it's easier to validate an answer than to posit one.
There is a much better reason not to use AI -- it weakens one's ability to posit an answer to a query in the first place. It's hard to think critically if you're not thinking at all to begin with.
My hot take on students graduating college using AI is this: if a subject can be passed using ChatGPT, then it's a trash subject. If a whole course can be passed using ChatGPT, then it's a trash course.
It's not that difficult to put together a course that cannot be completed using AI. All you need is to give a sh!t about the subject you're teaching. What if the teacher, instead of assignments, had everyone sit down at the end of the semester in a room, and had them put together the essay on the spot, based on what they've learned so far? No phones, no internet, just the paper, pencil, and you. Those using ChatGPT will never pass that course.
As damaging as AI can be, I think it also exposes a lot of systemic issues with education. Students feeling the need to complete assignments using AI could do so for a number of reasons:
students feel like the task is pointless busywork, in which case
a) they are correct, or
b) the teacher did not properly explain the task's benefit to them.
students just aren't interested in learning, either because
a) the subject is pointless filler (I've been there before), or
b) the course is badly designed, to the point where even a rote algorithm can complete it, or
c) said students shouldn't be in college in the first place.
Higher education should be a place of learning for those who want to further their knowledge, profession, and so on. However, right now college is treated as this mandatory rite of passage to the world of work for most people. It doesn't matter how meaningless the course, or how little you've actually learned, for many people having a degree is absolutely necessary to find a job. I think that's bullcrap.
If you don't want students graduating with ChatGPT, then design your courses properly, cut the filler from the curriculum, and make sure only those are enrolled who are actually interested in what is being taught.
We weren't verifying things with our own eyes before AI came along either, we were reading Wikipedia, text books, journals, attending lectures, etc, and accepting what we were told as facts (through the lens of critical thinking and applying what we're told as best we can against other hopefully true facts, etc etc).
I'm a Relaxed Empiricist, I suppose :P Bill Bailey knew what he was talking about.
I literally just can't wrap my AuDHD brain around professional formatting. I'll probably use AI to take the paper I wrote while ignoring archaic and pointless rules about formatting, and force it into APA or whatever. Feels fine to me, but I'm not going to have it write the actual paper or anything.
Real studying is knowing that no source is perfect, but being able to craft a true picture of the world using the most efficient tools at hand. And like it or not, LLMs are objectively pretty good already.
This is fair if you're just copy-pasting answers, but what if you use the AI to teach yourself concepts and learn things? There are plenty of ways to avoid hallucinations, data-poisoning and obtain scientifically accurate information from LLMs. Should that be off the table as well?
And yet once they graduate, if the patients are female and/or not white all concerns for those standards are optional at best, unless the patients bring a (preferably white) man in with them to vouch for their symptoms.
A good use I've seen for AI (or particularly ChatGPT) is employee reviews and awards (military). A lot of my coworkers (and subordinates) have used it, and it's generally a good way to fluff up the wording for people who don't write fluffy things for a living (we work on helicopters, our writing is very technical, specific, and generally with a pre-established template).
I prefer reading the specifics and can fill out the fluff myself, but higher-ups tend to want "how it benefitted the service" and fitting in the terminology from the rubric.
I don't use it because I'm good at writing that stuff. Not because it's my job, but because I've always been into writing. I don't expect every mechanic to do the same, though, so having things like ChatGPT can make an otherwise onerous (albeit necessary) task more palatable.
So they believe 90% of college is shit; they're on the right track, but not there yet. College is nothing but learning a required sack of cow shit. University isn't supposed to be. Everyone who goes to university for a "track" to learn a "substance" is wasting university time in my mind. That's a bloody trade school. Fuck everyone who thinks business is a university degree. If you're not teaching something you couldn't have published 5 years ago, you're a fn sham. University is about progress and growth. If you just want to know something we already know today, you should stop going to university and find a college that's paid for by your state. AND LET'S FUCKING PAY FOR IT. That's just grades 13-15 at that point, at most. We pay more in charges yearly trying to arrest kids for drugs and hold them back than we do just directing people who "aren't sure" what they want.
Edit: sorry for sounding like an ass, I'm just being an ass these days. Nothing personal to anyone