“In some cases, although the company wants to say it’s ethical, it doesn’t work out that way. The stakes involved are extremely—”

Marcus Mason stood and objected.

“Your Honor, by talking in generalities, the witness is insinuating that unethical behavior occurred at Tidalwaiv on Project Clair,” he said. “There has been absolutely no evidence of that presented at trial, because it doesn’t exist. I ask that the question and answer be stricken and the jury be so instructed.”

Judge Ruhlin looked at me for a response.

“Judge, first of all, I would ask the court to instruct counsel not to incorporate his closing argument into his objection. Second, I am laying the groundwork so that the jury understands what this witness’s job was at Tidalwaiv and, more specifically, on Project Clair.”

“I’m going to sustain the objection,” Ruhlin said. “Mr. Haller, let’s move on to testimony directly related to the cause of action.”

“Yes, Your Honor,” I said. “A moment, please.”

I looked down at my legal pad and flipped to the next page, skipping several questions that I now knew would not get past the defense’s objections.

“Okay, Naomi, let’s talk about Project Clair,” I said. “When were you assigned to it?”

“I was hired by Tidalwaiv in late 2021,” Kitchens said. “After some training I was assigned to Project Clair in January of ’22.”

“Was that the starting point of the project?”

“No, the project was well down the road. I reviewed code and company directives that were three years old when I was getting up to speed on it.”

“So they brought the ethicist in late to the project.”

Marcus jumped up with an objection, arguing that my statement assumed facts not in evidence. The judge sustained the objection without asking me to respond. I knew the objection was valid. I just wanted the jury to put the question in a back pocket for later. I moved on.

“Dr. Kitchens, you—”

“Naomi.”

“Right, Naomi. Earlier you called Project Clair a generative AI program. Can you tell the jury what generative AI means?”

“Of course. Gen AI simply means that these models, like the Clair app, for example, generate new data, whether it be video images or text, from the underlying data they were trained with.”

I liked how she turned to look at the jury as she spoke. I had said to her at lunch, “You’re a teacher. Be a teacher on the witness stand.” She was doing it now, and I believed it was being received well by her pupils, the jurors.

“So, then, would it be fair to say that it is not simply data in, data out?” I asked.

“Correct,” Naomi said. “That is the generative part of the equation. The training is ongoing. These large language models are constantly bringing data in and, from that, learning more.”

“‘Large language model’? Can you explain that?”

“It’s a machine-learning model designed for natural language generation. It’s trained on vast amounts of data and text, and then analyzes and sifts it all for patterns and relationships when prompted to have a conversation or answer a question. These models acquire predictive power in terms of human language. The ongoing downside, however, is they also acquire any biases or inaccuracies contained in the training data.”

“You’re saying ‘garbage in, garbage out.’”

“Exactly. And that’s where the ethicist comes in. To make sure there are guardrails that keep the garbage from ever getting in.”

I paused for a moment as I made a shift back toward my case.

“You testified earlier that you came onto Project Clair three years after it began, correct?”

“About thirty months after.”

“Okay, and did you replace the original ethicist on the project?”

“No, they did not have one before me. Usually an ethicist is brought in when a project reaches a certain level of investment and viability.”