Learning as Regeneration
The ChatGPT debate should not be about cheating; the real danger is failing to teach students how to learn through questioning and connecting.
I became a college professor because I never wanted to stop learning. I often think of my work as an educator as regenerative design. Each semester I have the opportunity to influence a group of students. Some, hopefully most, will take what they’ve learned and spread it into the world through whatever jobs and roles they take on, passing the knowledge on to others. One of my biggest joys is when students come back years after graduating and tell me how they still think about the issues discussed in my class. Sometimes they will even share how an assignment never fully made sense to them until years later, or how seeing the theory we studied in the context of their job finally made it click. This is the power of education - when learning goes beyond the 15 weeks of the semester, it fractals, grows, and regenerates beyond me. Trees have long been a metaphor for learning - whether it’s students who branch from a school, professor, or class, or ideas that branch from the trunk of a field. A tree must not only grow up and out; it must also establish deep roots to sustain itself.
As the new semester began across higher ed this past month, there was a flurry of articles and posts about concerns over students using ChatGPT. AI policies began appearing on syllabi, and discussion boards filled with questions and comments about how professors should handle the potential for cheating with this new form of AI. The truth is, students have always found ways to cheat - copying, stealing, plagiarizing, and paying for others' work. This new AI interface simply makes cheating easier to do and harder for professors to detect. ChatGPT, which launched at the end of 2022, can produce a seemingly flawless essay with a click from any prompt you might feed it. It’s generated a frenzy of discussion - ban it, embrace it, switch to all oral exams or in-class writing only, and how will this upend not just higher ed but all of education? Missing from the discussion, however, are larger underlying questions - what does it mean to learn? How do you teach a student to learn? And how do you teach a student to keep learning once your class is over?
Learning how to find questions instead of answers.
I would guess most students using ChatGPT are not seeking to learn or to understand knowledge and reasoning; they simply want a faster, easier way to arrive at the answer, to get the grade. They want a branch of the tree without establishing any roots. But the act of writing an essay or solving a problem set is not just about the answer or end result; it’s also about the process of knowing how to get there. Done well, this process builds the muscles of learning to ask questions and knowing which questions to ask. Learning generates an understanding of how the parts make up a whole, relating thoughts and building them on top of one another. It’s developing the roots while also growing a branch. The danger of ChatGPT is not that a student cheats. The danger is that they don’t learn how to learn. If we don’t teach how to learn, we don’t teach how to ask questions or how to genuinely solve problems.
Feeding a question into ChatGPT will give you an answer (which may or may not be accurate), but once conditions change - as is certain to happen in our uncertain world - how will you know what questions to ask? Our world is changing rapidly - through climate change, geopolitical wars, technological advances, global pandemics, and more. Learning to ask the right questions allows us to adapt to changing situations. Software, tools, jobs, and entire industries will change. The only thing we can teach for certain is uncertainty.
ChatGPT generates its answers from text that has already been written, documented, and recorded. While it may be a useful tool for gathering known resources, it cannot be used for writing the unknown, asking new questions, or making new discoveries.
To teach students resilience in the unknown world ahead, we must teach regeneratively - learning how to write a new question and make new connections each time we analyze data in the context of humanity, inclusivity, and systems.
Humanizing connections and systems
Knowing what questions to ask, and continuing to ask the right questions, is a core tenet of the Design Thinking process. Design Thinking usually follows a flow of cultivating empathy through ethnography and stakeholder investigation. Problems are defined collectively so that teams can ask questions, ideate, prototype, and test an idea. This is repeated until an answer is discovered and implemented. This process, which requires feedback loops and constant redefinition of the question and the interactions, is impossible to automate. While it’s true that AI can create systems that iterate, change, and learn, AI lacks true empathy and human connection. To have empathy is to consider humanity and our relationships with other people, their experiences, and their interactions.
A successful design process includes stakeholder engagement at every step - understanding who uses what products and systems, how they can be improved, who's included, who's not, who needs to be at the table, how, when, and why. AI and ChatGPT will never seek out new stakeholders or find those who have been excluded. Because they pull only from the voices already represented in massive internet archives, they lack the voices and experiences of those who have been marginalized or left out. To learn solely through AI misses the understanding that to learn means to engage with others, to be in community with others, to feel, and to relate.
Stakeholder engagement connects us to other people, but critical thinking also requires connecting the dots of larger systems. Going beyond surface-level answers to understand the impacts of broader issues is where we will find future solutions to the climate crisis. Systems Thinking allows us to address root causes, solve for multiple problems at once, and understand intersectional dynamics. This type of work also requires time and investment. So while some might want a quick answer from a chatbot, understanding the answer requires an investment of time.
Fears and Fakes
When we practice a learning process with critical thinking, active questioning, and connecting with other individuals and larger systems, we establish roots. When we don’t, we grow without roots - without a foundation. And a lack of foundation can lead to fear. The rise of standardized tests and the decline of critical thinking have helped perpetuate a culture of fear in the United States. This fear-based culture has grown and regenerated in its own way - and not for the good. Without roots, without the ability to question, organizations like Fox News have become toxic fertilizer, perpetuating fears that lead to seemingly irrational decisions - the fear of learning about African American history, the fear of books with LGBTQ+ characters, and the fear of science. How have we elected a Representative to Congress who might as well be AI-generated, given the number of lies and untruths that construct his facade of a leader in costume? Real human connection is gone, replaced with images of humans on screens leading others to fear.
As a culture, we lack the awareness to examine deeper questions. The ability to ask questions, to inquire, to learn - these are exactly the skills we need to tackle the climate crisis. While some facades are built on fear, others are built on ignorance; greenwashing is one example. Many consumers easily fall into thinking companies or products are “eco-friendly” and “green” because they lack the critical questioning to go beyond buzzwords and packaging claims. Labeling something “chemical free” is sold as environmental - forgetting, of course, that water is a chemical too. We see this in false claims on pretty packages, but we also see it in the perpetuation of ideas we are sold as making a difference when they are really a drop in the bucket, like reusable metal straws or silverware sets.
When we continue to measure learning in grades, and put into place systems that prioritize and incentivize getting the grade over understanding, we perpetuate this culture. I’ve argued in the past that much of the loss of critical thinking in the United States stems from George Bush’s No Child Left Behind policy - when standardized testing became the norm and teaching was no longer about inquiry and thought but about memorizing and test-taking. Teaching to the test has now led to a generation that cheats to the test. Student stress about grades has been exacerbated by parents, admissions, scholarships, tiered lists, and rankings, and is reflected in the significant rise in anxiety and depression across higher ed. As a professor, I often worry about the generation of students we “graduate.” If we are graduating students based on grades and numbers instead of understanding, human connection, and developing their roots, where will they lead us in the future? ChatGPT isn’t a danger because we’re making copies of trees. It’s dangerous because we’re copying fake trees without roots - trees that will easily get blown over and knocked down in a massive domino-like collapse.
Roots, branches, blossoms, fruits
Regeneration means to continually grow, while also restoring and renourishing. In the tree analogy, regenerative learning allows us to grow our trunks and branches through the process of establishing roots, continually learning in new ways with new tools. When the tree is ready, it will blossom - perhaps providing fruits for people, dropping new seeds to grow more trees, or shedding leaves to become compost, nourishing the soil. It may become a home for other creatures in our ecosystem, help provide shade and cooling, and live in community with the natural systems surrounding it. Each season it will go through this cycle again and again. This is the process of regeneration, the process of regenerative learning - finding what’s not yet in the database, responding to and interacting with the environment around it.
The climate is changing. We must change as well. To know how to change, we need to know what questions to ask and how we build together. I do not want a chatbot doing this for the future - but rather designers, scientists, educators, planners, organizers, and engineers who work together, as humans, across platforms, tools and systems.