ChatGPT Sucks! Here are 5 Great Reasons Why


I was all set to write this post about a lesson I relearned, one a ghostwriter always needs to remember: it's not my story, it's theirs. Instead, I was reminded of why, when writing a book, you should skip ChatGPT and work with a human rather than artificial intelligence.

I recently had a conversation with a client about a chapter I submitted that I really liked, and he didn't. He thought I went too deep into his rich family history and not deep enough into what he calls his hero's journey. I argued that the history builds the foundation for him embarking on that journey.

He disagreed. I countered that his family history is relevant. He agreed, but said that at this point in his story he wasn't aware of it yet, so it didn't fit here.

That talk reminded me that a ghostwriter is one half of the team that successfully writes the book: the storyteller. The client is the other half: the story expert. Here, the story expert wins out. It's his story to tell, and he decides where the details fit. My job is to make sure they're presented in the best way possible.

I was certain that other ghostwriters have had similar experiences, so I typed “examples of ghostwriters realizing it’s not their story but the client’s story to tell” into a search engine and got nothing.

So, I typed the same thing into ChatGPT and got five examples: J.R. Moehringer ghostwriting for Prince Harry (“Spare”), Daniel Paisner writing for Denzel Washington (“A Hand to Guide Me”), Hilary Liftin writing for Tori Spelling (“sTORI Telling”), David Ritz writing for Marvin Gaye (“Divided Soul”), and Laurence Leamer doing an early collaboration with Arnold Schwarzenegger (“Total Recall”).

Thanks to two New York lawyers getting sanctioned for citing fake cases they “found” using ChatGPT, I already know never to trust AI without first verifying everything it spits out. And it’s a good thing I verified, because I found loads of errors or couldn’t find any corroboration.

About the only thing ChatGPT got right was that these ghostwriters really did work with the famous people on the named books.

Breaking down the five examples:

1. J.R. Moehringer—ChatGPT referred to a 2023 essay Moehringer wrote in The New Yorker about an argument he had with Prince Harry over including a dramatic comeback Harry made during a military hazing ritual. According to the AI, Moehringer wanted to include it and Harry wanted it cut.

The AI quoted Moehringer as saying, “It hit me; my job was to help him tell his story, not mine.”

I had previously read this article, so I called it up again. The incident in question is the story’s lead. But ChatGPT got it backwards: Harry wanted to include the quote, Moehringer didn’t. He found it “unnecessary, and somewhat inane.” And he ultimately convinced Harry to leave it out.

The quote attributed to Moehringer? Not in the article.

Verdict: False. AI is wrong.

2. Daniel Paisner—ChatGPT referred to the ghostwriter talking in interviews about how Washington challenged his usual process: rather than inserting drama or “lessons,” Washington insisted on letting the mentors’ stories speak plainly.

“He didn’t want to shine light on himself,” Paisner said. “He wanted to pass the mic.”

Paisner realized his instinct to center the celebrity voice had to yield to Washington’s focus on community.

Although I found several interviews with Paisner, I could not find any in which he said that quote or described that realization. Maybe it’s true that he thought he was going to write a more standard memoir (he has written several) and that Washington wanted the mentors to speak for themselves, but I can’t verify it. I can only infer it, because the book is a collection of stories by more than seventy famous people in sports, theater, business, and politics, and because I found a review that said Paisner conducted all the interviews and that “there was a certain sameness to the narratives style from entry to entry.”

Verdict: Possible.

3. sTORI Telling—ChatGPT referred to Liftin’s own memoir, “Ghosted,” in which she supposedly wanted to tone down some of Spelling’s more chaotic or emotional stories, but Spelling wanted her imperfections on full display. “It wasn’t the story I would tell, but it wasn’t mine to tell,” Liftin wrote (allegedly).

I immediately went looking for Liftin’s book—and didn’t find it. Not on Amazon, not on her own website. There is a book called “Ghosted,” but it’s by Rosie Walsh, an entirely different person.

I also found an episode of the podcast “As Told To” (hosted, incidentally, by Daniel Paisner) and heard Liftin say this:

“I don’t want to do a book that kind of lightly touches on all the movies that were a great success and all the wonderful people they’ve worked with. I’m not wired that way. I want it to be internal and reflective and to, ideally, have a higher purpose and it’s hard to do that when you have so much story to tell.”

This sounds more like the Moehringer example, in which the AI reversed the roles. I believe what Liftin said on the podcast.

Verdict: Most likely false.

4. Divided Soul—ChatGPT mentioned how Gaye’s vulnerability surprised Ritz, who had envisioned a straightforward success story, only to have Gaye talk about addiction, trauma, and faith.

I found several interviews and articles in which Ritz alluded to meeting with Gaye in Belgium, but instead of discussions about addiction, trauma, and faith, the talk was always about the song “Sexual Healing”: how Ritz wrote the lyrics, was not credited, brought suit against Gaye once he realized the singer wasn’t going to follow through on his promise to pay him, and tried to reconcile with Gaye but never got the chance because Gaye’s father murdered the singer.

What ChatGPT failed to mention was that “Divided Soul” isn’t a ghostwritten book. It’s a biography Ritz wrote using information Gaye had told him about his life, including his addiction, trauma, and faith.

ChatGPT quoted Ritz: “I had to throw out my blueprint. He was leading, not me.” I can’t find that quote anywhere.

Verdict: Misleading, bordering on false.

5. Total Recall—ChatGPT correctly mentioned that Leamer did not end up ghostwriting Schwarzenegger’s autobiography. It said Leamer attempted an early draft but failed because Schwarzenegger wanted an accessible story and Leamer over-intellectualized it.

According to ChatGPT, Leamer admitted: “I realized I was writing my version of Arnold. He wanted his own.”

Though Leamer was not the final ghostwriter for “Total Recall,” ChatGPT said, this shift influenced his approach in future collaborations.

It may be true that this happened, but I can’t find anything supporting it. What I found instead were several reviews of a Schwarzenegger biography Leamer did write, “Fantastic: The Life of Arnold Schwarzenegger.”

Richard Schickel called it a “dully dutiful biography.” An unattributed review said “At one point Leamer describes his subject as ‘like his classic Excalibur convertible, a complex piece of machinery that had to be perfectly tuned, or it did not run properly.’”

Neither of those sounds overly intellectualized.

I also found Leamer writing in Newsweek about “Total Recall,” in which he recounts meeting Schwarzenegger for the first time in eight years (“He is devoid of his once-bulging muscles, and his face looks as if a master taxidermist has been at work.”). Leamer also describes Maria Shriver finding out about her husband’s liaison with the housekeeper and confronting her (“As Maria asks the fateful question, the housekeeper falls on her knees before her mistress. Maria asks her to rise, and the two women hold each other…”).

Again, not sounding overly intellectualized.

Verdict: Mostly, if not entirely, false.

These five examples should give ghostwriters and prospective clients pause. Humans remain the best, most effective, most real, and most honest way to tell a story. Go hire a human ghostwriter.

Feel free to check out my other posts related to ghostwriting at leebarnathan.com/blog.

  

Let's Start A New Project Together

Contact me and we can explore how a ghostwriter or editor can benefit you.