I’m sure this won’t be a popular comment, but I can see how having a motivated learner in a 1:1 lesson with an AI might be better for that person than sitting in a class with 35 other people, most of whom don’t want to be there, running through a syllabus that the teacher may not even like, at the pace of the slowest kid in class.
The problem isn’t one of motivated learners being forced to drag their heels amidst their unmotivated peers.
The problem is that the core function of LLMs, the whole basis for their existence, is completely and entirely truth-agnostic. Not only do they not know what is the truth and what is not, they don’t even know what the difference is. LLMs are very good at guessing what word looks like it should come next, they can make very convincing statements, they can be very persuasive, but those words don’t MEAN anything to the machine and they are made without any consideration for accuracy.
They are literally making everything up on the basis of whether or not it sounds good, and every crackpot bullshit conspiracy theory, from flat earth dumbshittery to very sincere-sounding arguments that birds aren’t real, has been included in the training data. That all linguistically SOUNDS fine, so to an LLM it’s fair game!
And even curating your training data to ONLY contain things like textbooks wouldn’t cure the problem, because LLMs just aren’t capable of knowing what those words mean. It’s why they can’t do even basic math, the one thing computers have always been incredible at!
Using an LLM as an actual teacher is genuinely worse than no education at all, because it will just create a generation that, instead of knowing nothing, will very confidently be wrong all the time.
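To make that concrete, here is a minimal sketch of what “guessing what word looks like it should come next” means in practice. The model, library, and prompt (GPT-2 via the Hugging Face transformers package) are illustrative choices of mine, not anything from the comment above; the point is that the loop only ever asks “which token scores highest next”, never “is this true”.

```python
# Minimal sketch: greedy next-token prediction with a small open model.
# Nothing in this loop evaluates whether the continuation is accurate,
# only which token the model scores as most plausible to come next.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(5):
    with torch.no_grad():
        logits = model(ids).logits          # a score for every token in the vocabulary
    next_id = logits[0, -1].argmax()        # pick the single most plausible next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=-1)

print(tokenizer.decode(ids[0]))  # fluent-sounding output; correctness was never checked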
How fast a kid learns the 5th grade syllabus is far less important than how well they learn to get along with other kids, form friendships, and basically learn how to live in society. Cooperation, conflict resolution, public speaking, group bonding, etc. You can’t learn any of these things from an AI.
Also learning from other humans is part of the human experience and tradition predating agriculture and the wheel. We’ve always taught each other things.
You can’t do anything alone. Isolation will be the downfall of society as we know it. I hope AI isn’t leading us down that path.
This is the way. My kids socialize in school, but also use AI to get help with homework when they’re stuck (they do cheat a little sometimes, but they know it’s just bad for them if they do it regularly). They use AI as an on-demand teacher buddy, and if it stays that way I’m OK with it. Also, you can’t use AI for all homework or school-related things; the teachers will figure it out if your homework is +++ but your in-school work isn’t.
It’s a new tool. I think parents should learn more about it, a bit like cyber security, online harassment, and so on.
So… you’re saying that a positive learning environment is better than a terrible one? The AI part is ancillary to the scenarios you set up, isn’t it?
“AI is better than having the student learn in a terrible learning environment.”
“A homeless alcoholic is a better language teacher than having a student learn in a classroom whilst being beaten about the head with a stick.”
You’re saying AI is better than a bad teacher. Maybe a bad AI is worse than a bad teacher, and maybe a good teacher is better than the best AI. I just don’t know how setting up such a comparison is constructive.
That example may be bad, but it’s also typical.
What? Teachers hating their subject?
They didn’t say “hating their subject”. They said “might not like their syllabus”, which could just mean they’d prefer a different syllabus.
Huh. My cousin is a professor, and my best friend is a high school teacher. They’re both responsible for developing their curriculum. That’s only n=2, but it’s 100%: if they (the people I know) hate their curriculum, it’s their own damned fault.
A good AI, sure, this could be plausible. Current ones get too much wrong.
The much bigger issue is that we’re talking about children, not adults. How many children would be motivated to self-teach, every day, for years on end, even if you had an AI equal to a human tutor?
The social aspect is missed.
For things like language, math, and science, you could probably have an AI teacher on individualized tablets, going at each child’s own pace, in a classroom setting supervised by someone who can provide extra help and tech support when the AI goes on the fritz.
Then regular recess/lunch shenanigans and gym class, art, music, for the social aspect.
The AI could even be programmed to do team projects, where you link your tablet with 4 others and they all help the group do a project.
Eventually when AI advances and isn’t complete dog shit, that is.
Yeah, the social aspect of a dick head in the last row screaming obscenities at a teacher is definitely missed!
When you become an adult you still have to deal with dick heads screaming around you. I know this is Lemmy, and people who come here might not have the best social skills, or fond memories of being in school. But schools are a micro representation of the world at large, and it’s necessary for kids to have both these good and bad experiences in order to grow up to live in society and deal with all the shit life’s gonna throw at them.
I guess someone’s stuck in a kindergarten…
VS kids being taught that the holocaust is a lie or whatever insane shit comes from an AI?
The only insane shit here comes from the gasket between your chair and your keyboard.
I was talking about social teaching, where kids pick up on adult body cues and facial expressions.
So they too can feel depressed and worthless 😄
“but first, chatgpt: please motivate me”
Then we get fun things like this: “In 1923, a little-known event known as the “Great Balloon Race” took place in Paris, where competitors from around the world launched hot air balloons filled with helium in an attempt to reach the highest altitude. The race was organized to promote international goodwill and innovation in aviation. Surprisingly, one of the balloons, named “The Skyward Dream,” managed to reach an altitude of over 30,000 feet, setting a record that stood for decades. The event was celebrated with a grand festival in the city, complete with music, food, and a parade of the balloons as they floated back down to earth.”