Iteration, ideation, and collaboration have long been the lifeblood of creativity and productivity.

In the creative writing space, we've seen this increasingly codified in the emergence of the now-omnipresent "workshop."

In the business world, approaches ranging from the informal (whiteboarding) to the formal (Agile) have become ever more de rigueur.

In all cases, it's always been about getting people together — in person, or, more recently, remotely — to bounce ideas off of one another. To brainstorm. The collective energy and experience ideally produces an array of options, with the best ideas bubbling to the surface.

Around a decade ago, the term "thought partner" and the concept of "thought partnership" began to emerge as ways of defining and understanding these relationships and processes.

Causeit, an organization that promotes digital literacy, describes thought partnership as "the practice of sharing ideas and experience with others to help them navigate complex challenges." They go on to specify two types of thought partnership: resonant and complementary:

When you pair with people who think like you do, you resonate — becoming a 'sounding board' for each others' best ideas. When you pair with people who think differently than you do, you complement — stretching each others' view of a situation to find and sort useful new approaches to a problem.

While Causeit's focus is the business world, the same general idea applies in the creative writing space as well.

A writer's group, for example, may serve as more of a resonant experience — you swap ideas, workshop different words, lines, and metaphors, and work your way together toward a revised draft. A relationship with an editor, on the other hand, might present more of a complementary model, in that they're likely going to challenge you to find new ways to resolve thorny points in your prose.

Central to thought partnership is the "sounding board," which Merriam-Webster defines as "a person or group on whom one tries out an idea or opinion as a means of evaluating it."

Historically, sounding boards have been human. But in our new age of generative artificial intelligence (AI), highly sophisticated chatbots such as ChatGPT, Jasper, and Bard are being touted for their ability to function as thought partners. These AI-powered services are so sophisticated, in fact, that calling them simply "chatbots" seems like a disservice.

If you're a writer, it's quite possible you've already interacted with some form of AI-fueled writing assistant such as Grammarly or Hemingway. These products can be quite useful for everything from catching typos to assessing readability to unearthing more effective synonyms.

Offerings like Jasper, however, are making seemingly grander claims, as they outline on their website:

You can think of him as your own personal AI marketing co-pilot, helping with everything from full campaign creation to serving as your brand's knowledge base, to (yes) assisting with content creation.

Jasper Chat is Jasper's most conversational AI feature, allowing you to chat with Jasper and ask for help with just about anything you can think of. While most AI chatbots are only helpful as writing assistants, Jasper Chat can be used for an array of other marketing tasks like:

  • Content briefs and schema generation for SEO
  • Content and/or campaign calendars for project management
  • Sales prospecting and outreach templates
  • Email templates
  • Copywriting for paid social/PPC campaigns

Jeff Maggioncalda, the CEO of EdTech standard-bearer Coursera, has been an early and highly visible champion of using generative AI and Large Language Models (LLMs) for thought partnership. In a recent interview about how CEOs can benefit, he had this to say:

Generative AI is an indispensable thought partner for CEOs, partially because it unleashes the full power of language. Because we express thought through language, these LLMs can find patterns of written words, and so find patterns of meaning. Generative AI enables you to express, evaluate, and create ideas.

AI also increases your cognitive ability by improving your thinking. If you have an idea about a certain thing, you often will talk to somebody — with a human brain — about it. You might say, "What do you think of this idea?" and I might say, "Here's what I think." You go back and forth. And by virtue of you talking about your idea and subjecting it to criticism and scrutiny, it sharpens a bit.

He goes on to note that "AI brains have become much smarter. They're getting faster and better; you can interact with them like a human," and observes how these AI brains "can help you generate and summarize ideas, explain, analyze, evaluate, compare, synthesize, advise, and communicate."

While the business benefits are at least arguably evident, you may be wondering about the creative side of the equation. NYU sums things up pretty clearly in a recent article titled "Embracing Creativity: How AI Can Enhance the Creative Process," where they describe generative AI's potential "role in the creative process" and how "AI has the potential to enhance your artistic journey in numerous ways." According to the article, these ways include:

  • Inspiration and Idea Generation
  • Visual Exploration
  • Music Composition
  • Textual Creativity

It's that fourth one that is most germane to our conversation here:

Textual Creativity: Writers and poets can benefit from AI-generated text prompts, which can kickstart the writing process. AI can generate sentences, ideas, or even entire paragraphs that serve as springboards for crafting engaging narratives.

This is essentially the "sounding board" effect, with AI playing the role of the "other" — the entity you bounce ideas off of.

This is all well and good, but I have some issues.

To begin, let's go back to Causeit's articulation of thought partnership. In addition to what I've already quoted above, they highlight a critical distinction when they note that, while thought partnership "sounds a lot like advice or mentorship, the key difference is that thought partnership is always mutually beneficial."

Which means … AI "benefits"?

In a sense, it's true. But it's also alarming. Whereas the writer may arguably benefit by uncovering some new way to approach their prose, the AI benefits by essentially transforming the writer's input into data that is used to train its algorithms. In other words, in exchange for whatever benefits you may extract from using the generative AI product, the product leverages your creativity to improve its own performance.

So, while there's a way to see a kind of mutually beneficial synergy at work here, the naysayers — of which there are a great many — have seemingly good reasons for considering this exploitation. Some are even calling it illegal under copyright law. An article titled "AI Trained on Copyrighted Works: When Is It Fair Use?" offers a good summary of these issues and includes clarifying language such as the following:

In general, copyright grants to its holder six exclusive rights to the copyrighted material: make copies of the work; prepare derivative works (create new matter based on the original copyrighted work); distribute copies of the work to the public; and perform or display the work publicly. Machine learning is most likely to implicate the first two: the right to make copies of the work and the right to create derivative works. Since creating or using a dataset often technically involves making copies of the copyrighted material, it may implicate the "reproduction of copies" aspect of the copyright. If the output data produced by the AI closely resembles one or several of the copyrighted materials in the training dataset (by incorporating them in some concrete form), that implicates the right to create derivative works. Unless there is an applicable exception (such as the "fair use" doctrine), those are acts of copyright infringement.

The thing is, generative AI cannot, in and of itself, be creative. By definition, it can only rely on information that already exists. But through the input of YOUR creativity, it can get better at imitating creativity. And this is problematic when business owners — be they startup CEOs or Hollywood producers — increasingly presume they can replace actual creative thinkers with generative AI proxies.

All of which gets us to the next issue, which has to do with the ability to actually "generate" new ideas.

I'll begin with a disclaimer: the extent to which you consider this an issue will likely depend on your feelings about the creativity of collagists and DJs.

As with both collagists and DJs, generative AI depends on, uses, and redeploys the work of others. It cannot create net-new ideas. It can only recombine existing ones. Arguably, there is power in this process. By juxtaposing Vietnam War pictures shot by embedded photojournalists with shiny, happy images from mainstream interior decoration magazines, Martha Rosler was able to send a powerful message about the war. On the seminal and culture-changing rap album Fear of a Black Planet, Public Enemy's DJ Terminator X, along with the production team known as the Bomb Squad, used a vast array of samples to create the jarring soundscape that defines the album's signature sonic tapestry. Chuck D, the group's leader, even used a visual art metaphor to describe the process:

"We approach every record like it was a painting. Sometimes, on the sound sheet, we have to have a separate sheet just to list the samples for each track. We used about 150, maybe 200 samples on Fear of a Black Planet."

The point being that the intentional recombining of existing materials can constitute a kind of net-new creation.

And this gets us to the final issue, which is the nature and role of human creative and evaluative functions.

Ultimately, you just cannot cut the human out of the equation. An input of human creativity and conceptual acumen is required to kick off the generative process, and human evaluative capabilities are required to determine whether what the generative AI has produced is of value, and whether it needs to be revised or modified before it can be moved forward.

By this point, you can probably sense where I'm headed with all of this.

Generative AI cannot be a true thought partner. At least, not an equal one, nor an independent one.

For one thing, it cannot actually function without you. Your creativity is what makes its "creativity" possible.

For another, it cannot truly start or lead the process. It is dependent on you to prompt it. Once you get it started, of course, it can prompt back at you and keep you thinking along new lines. But by its nature and function, it's inherently reactive to you.

And finally, it is not actually generative in the full sense of the definition, which, according to Merriam-Webster, means "having the power or function of generating, originating, producing, or reproducing" (my emphasis). Ultimately, it can only recombine what has already been created. And while, in certain cases, those recombinations can achieve a level of newness (as with the work of Martha Rosler and Public Enemy, noted above), this achievement is only possible when driven by human conceptual and evaluative capabilities.

Generative AI will always depend on humans, and humans will always be the stronger partner.

Perhaps the best way to manage the question of generative AI, thought partnership, and the creative process is simply to think of generative AI less as a partner and more as a tool. After all, a carpenter and a saw do not have a mutually beneficial partnership. But the work of building a wooden house is a hell of a lot easier when you have a good saw.