Everywhere you turn in the news these days, you’re hearing about artificial intelligence, more commonly referred to as AI. What is it that prompts both excitement and apprehension, and just how does this tech news affect authors, illustrators, and readers of middle-grade works?
If you find yourself asking these questions, read on. There’s plenty of AI news on the middle-grade front, and multiple organizations are speaking out.
Back in December of 2022, the official blog of the Society of Children’s Book Writers and Illustrators published a two-part series titled “The Troubling Ethics of Artificial Intelligence and How It Impacts Children’s Book Creators.”
Part I covered issues for illustrators, like the potential for AI systems to study a wealth of existing artwork, learn patterns within them, and “create” something new from what the system has learned.
Natural questions arise. Can this be termed “stealing” from existing creators? On one hand, the output looks new. What’s the difference between a tech tool borrowing from existing patterns and an artist being inspired by them? On the other hand, the product is not truly new and original: the “creation” is generated from the images the system was trained on.
The same issues exist for authors, as detailed in Part II of SCBWI’s series. At issue is the fact that AI systems like the highly popular ChatGPT and a growing number of similar tools are producing text at a wildly accelerated rate.
These systems can generate a story or a nonfiction piece about a particular topic for a target demographic at a specified word length. Just give it the specs, and watch it “create.” But can you really call it “creating”?
How Does AI Work?
Text generated by AI systems pulls from the language patterns, information, and ideas of existing works. Technically speaking, it’s not creating new work – it’s regenerating data points from established work in a whole new way.
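As a drastically simplified illustration of that idea, consider a toy Markov-chain text generator. This is an analogy only, not how modern AI systems actually work (they use neural networks trained on billions of words), but it shows the core point: a pattern-based generator can only ever emit words it has already seen, in sequences resembling ones it has already seen.

```python
import random
from collections import defaultdict

def build_model(text):
    """Record, for each word in the source text, which words follow it."""
    words = text.split()
    model = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        model[current].append(nxt)
    return model

def generate(model, start, length, seed=0):
    """Walk the model, repeatedly picking a recorded follower word."""
    random.seed(seed)  # fixed seed keeps the toy example reproducible
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break  # dead end: the last word never had a follower
        out.append(random.choice(followers))
    return " ".join(out)

# Every word the generator produces comes straight from its "training" text.
source = "the cat sat on the mat and the cat ran"
model = build_model(source)
print(generate(model, "the", 6))
```

Everything this sketch outputs is a recombination of its source: nothing it “writes” originates with it, which is the concern the SCBWI posts raise about far more sophisticated systems.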
This capability is sending up red flags for authors and illustrators. What qualifies as copyright infringement in this strange new world? How likely is it that the publishing industry might succumb to the potential for big returns on small investments?
And what might all this mean for readers? When text and illustrations are being generated from patterns and data points gleaned from existing works, there is no creativity. No human perspective. No potential for something new and wonderful that speaks to the soul and enlightens the mind.
The SCBWI blog posts refer readers to the questions that are already on the minds of the folks at The Authors Guild, so let’s go there next.
The Authors Guild
One major concern is fair compensation for creators. The Authors Guild is actively advocating for changes to copyright law that would prevent AI from taking over the market for written works.
The Guild’s CEO, Mary Rasenberger, presents a list of ways AI has already been used in journalism, corporate texts, and literature. She cites two examples of AI achievements that are more than moderately concerning: an AI-generated novel was a finalist for a Japanese literary award, and The Guardian published an AI-generated article on the harmless nature of AI.
Rasenberger goes on to explain that AI is not actually “creating”; it is auto-generating text and images from the existing works it was trained on. She argues that copyright law needs to adjust for AI infringements, and she details a list of concerns that need to be addressed.
Rasenberger concludes, however, that she does not foresee AI being able to replace true art. Human art reflects the very real experiences and emotions of its time and place, and that cannot be generated from existing works. In her words, “I think we can all agree that a world without the arts, which help move us forward as a society, is not one that we aspire to.”
MLA and NCTE
In July 2023, a joint task force of the Modern Language Association and the Conference on College Composition and Communication (a chartered conference of the National Council of Teachers of English) issued a statement about writing and AI in which they discuss both the risks and the benefits of AI. It’s a working paper, so comments are open and a final version is forthcoming.
This working statement “makes principle-driven recommendations for how educators, administrators, and policy makers can work together to develop ethical, mission-driven policies and support broad development of critical AI literacy.” In other words, there may be dangers, but AI isn’t going anywhere, so how can we make this work?
The introduction notes that “writing describes a process as well as a product.” This is an important premise to consider. The labor involved in creating should be acknowledged and compensated appropriately, and students of writing need to learn their craft by going through the process of writing.
The paper goes on to define “broad risks and potential benefits of artificial intelligence to language, literary, and writing scholarship and instruction.” For example, while we need to guard against AI resources infringing on copyright and supplanting actual authors and illustrators, we can safely acknowledge the benefit of AI in brainstorming and gathering ideas.
It’s hard to draw firm conclusions from all the AI information out there right now because this is just the dawn of the AI age. However, it’s safe to say that AI is here to stay, and creators as well as consumers of middle-grade literature need to be aware of both its positive and its negative potential.
Does AI have the potential to eliminate human creators from the equation? No, it does not. Regenerating text and images from what already exists does not move the world forward. AI will never have the capacity to think, feel, empathize, and imagine.
AI can help us see with new eyes what is already in existence. But it cannot truly create. In the wise words of Albert Einstein, “Creativity is seeing what others see and thinking what no one else ever thought.”
Keep seeing. Keep thinking. Keep creating.