ChatGPT: Lazy or Purposeful?
- Secil Uluderya
- Sep 15, 2025
- 4 min read
Updated: Apr 5

The use of AI chatbots like ChatGPT has become the new hot topic for high schools like my own. Over the past few years, we students have seen many changes––often upsetting ones––in assignments, tests, and grading due to the booming use of AI. My school, for instance, recently underwent a significant transformation in essay writing, essentially replacing many out-of-school writing assignments with their in-class counterparts. Our education system seems to believe that it's better to completely rewire many aspects of the curriculum, especially those in the humanities departments, than to give students a chance to use AI. However, these changes aren't without tradeoffs.
When my classmates and I learned that we would start having more in-class essay writing in place of the traditional take-it-home style, we weren't exactly thrilled. After all, nobody particularly favors cramped fingers, time limits, and memorization over the freedom that bringing an essay home offers. Essays, especially in our history classes, used to be oriented around skills like writing flow, specific and well-sourced evidence, and organization. With the new changes, however, we started to see an overall decrease in coherence and information quality. Our teacher insisted that spelling and punctuation were of minimal importance and that the piece didn't need to be polished at all, so long as we were able to conjure valid evidence and connect it to our main point. As someone who writes slowly by hand and likes to think through the flow of each sentence, this switch-up did not work in my favor. I found that, over time, my writing started to lose its quality of sounding presentable in favor of finishing before time ran out––a trade that was rewarded by our grading rubric. Now, when I look back at some essays that I wrote in class, I feel a sense of disappointment at the visible decline from the ones I wrote in previous years.
Even as someone who prefers their own voice to AI and would never consider using it to cheat, I was negatively affected by our school's choice. However, it is certainly true that it prevented many cases of cheating in the long run. I used to hear of students using ChatGPT to write their essays all the time––only some of whom got in trouble for it. By being forced to write timed essays away from screens, students had no choice but to produce their own writing. I can't say I agree with what my school did, but I understand where they were coming from.
Recently, however, I started questioning the restriction of AI in schools.
It all started when I attended a research lab at MIT over the summer. There, I worked on writing code to analyze RNA sequencing data alongside an undergraduate student, who was responsible for making visuals for certain research papers. In the time we worked together, we often ran into small, annoying bugs that were difficult to trace. I was surprised to see that sometimes, if we spent longer than five minutes trying to find the source of a bug, my mentor would simply open up a ChatGPT tab and ask the AI to fix it for us. It worked flawlessly almost every time. I realized that if ChatGPT could work expertly with a very specific coding library in R and solve problems as niche as ours, it would have no trouble with almost any other task, especially ones that might be given to us in a school setting. It also made me wonder: if ChatGPT could be useful, even embraced, in an actual work setting, why was it so forbidden in school? Even further, what if we are hindering our own educational progress by neglecting a tool that might shape our very future?
The question, I have come to understand, ties into a deeper purpose of our education system.
A complaint that I've heard from many kids my age is that most of what we learn in school is "useless information." It doesn't sound entirely unreasonable. After all, how many people currently in the workforce would say that it was absolutely crucial that they knew the Bill of Rights was ratified in 1791? Not many, I'd assume. After turning this thought over in my head for a little while, I concluded that it's much less about what we learn than about how we learn it. History teaches us critical thinking and memorization. Math teaches us logic and problem-solving. Science teaches us observation and analysis. Our grades do not signify that we know our entire chemistry course, but rather that we are able to use the appropriate cognitive skills to complete it.
Well, how does this connect to the use of AI? It seems that the job market is starting to see a shift in those very same "life skills" that were once valuable in schools, driven by the emergence of newer technology like ChatGPT. Why should people bother learning basic code if AI can create almost any simple program you'd like? Why should we strive to improve memorization skills if a bot can instantly conjure all the facts, connections, and analysis you could possibly need? Nowadays, it seems like everybody is trying to predict where the job market will lean because of these developments, but it is widely recognized that we may start seeing higher demand for skills like communication, innovation, and creativity––all of which are severely undervalued in many schools.
I'd like to envision a future where we build on AI instead of sticking to traditional academic practices that are becoming less and less relevant as time goes on. The world is changing, so why aren't our expectations for newer generations?
