Expert testimony tends to follow rituals. Direct examination often consists of little more than a lecture by the expert, punctuated by an occasional question from counsel. Cross-examination hews to a host of time-honored rules, including these: “Never ask a non-leading question” and “Don’t ask any question unless you asked it at the deposition first.” These and similar rules have attained such sacred status that the late, great Professor Irving Younger dubbed them (somewhat tongue in cheek) “The Ten Commandments of Cross-Examination.”
I offer some courtroom techniques that stray from the standard rituals, to add to your tool kit when examining experts. Let’s start with two stories about cross—where I encourage you to break the “Ten Commandments” from time to time—before turning to direct.
The scores of silicone breast implant cases tried in the 1990s were classic battles of the experts. Droves of medical specialists debated whether silicone caused autoimmune disease (a question ultimately answered “no” by an independent science panel convened under Federal Rule of Evidence 706). Another set of experts, mostly chemists and biomaterials engineers, addressed the implants’ durability: Did they remain intact in the body and “last a lifetime,” as a leading manufacturer contended, or fall apart and leak toxic silicone fluid, as plaintiffs claimed?
For many of these cases, the plaintiffs’ star biomaterials expert was Dr. Pierre Blais, a chemistry professor with a decade of experience evaluating silicone breast implants at the Canadian Bureau of Medical Devices. Dr. Blais’s testimony that implants degrade and fall apart in the body was key to the 1990 verdict in Hopkins v. Dow Corning, in which a jury awarded $7.1 million to a woman even though she was diagnosed with autoimmune disease before receiving breast implants. Recognizing that Blais’s testimony had inflamed the Hopkins jury, the plaintiffs’ bar put him on the stand in trial after trial, many of which resulted in seven- and eight-figure verdicts.
Cross-examining Blais was a challenge because of his robust credentials, credibility as a public health advocate (he would limit his fee to the price of a plane ticket), and unique experience having examined more than 6,000 explanted silicone breast implants. But Blais had vulnerabilities. He used no scientific protocols and generated no published or peer-reviewed scientific data for the implants he examined in his laboratory. His “methodology” consisted of looking at and touching the explants. The “laboratory,” it turned out, was his garage. It contained no spectrometers, X-ray microscopes, or other scientific machinery. “I am the machine,” Blais proudly testified on one occasion. On another occasion, in 1993, Blais testified that he intended “in a matter of days” to complete a paper for peer review and publication summarizing data from the 6,000 explants. However, as Blais later acknowledged under oath, he never wrote the paper.
In 1997, Houston plaintiffs’ lawyer John O’Quinn called Blais as his first witness in Spitzfaden v. Dow, the only breast implant class action ever certified (albeit later voided by an appellate court). I was charged with cross-examining Blais on behalf of the defense. By this time, Blais had testified in like cases nearly a dozen times, and he had his approach down pat. Conventional wisdom said I should pursue a Daubert-type cross, so I prepared to confront Blais with prior admissions about his lack of scientific method, lack of data, and failure to publish. But would these abstract concepts resonate with a lay jury? I worried that they wouldn’t.
Then I got a gift. On direct, O’Quinn gave Blais a sample breast implant and asked him to describe it to the jury. “This implant,” Blais testified, “has a quarter-inch hole” that “was built in” by the manufacturer. The hole “is still open,” he added, and “is leaking oil, which is going outside of the envelope.” Blais did not hand the implant to the jury so they could see the “hole” for themselves. As with all of Blais’s scientific observations, the jury was expected to take his word for it.
On cross, I violated Professor Younger’s Seventh Commandment—“Do not permit the witness to repeat what the witness said on direct”—and allowed Blais to volunteer the same testimony repeatedly, growing more confident and animated with each re-telling. “They leaked like sieves,” he testified. “It was obvious” and “couldn’t be hidden. Even a layman [could see it].” At one point, Blais even chided me for seeming to forget what he’d said on direct: “Remember, yesterday my testimony was that Dow Corning implants had a quarter-inch hole manufactured in them. . . . I call it leakage.”
I did not cut him off or ask the judge for help. Rather, I locked Blais in:
Q: Did I just understand you to testify under oath, as a fact to this jury, that there was a quarter-inch hole in the Dow Corning silicone breast implants? Did you just say that?
A: I did, and not only that, it’s manufactured in and it is specified in, and that hole . . . was never designed to be capped. Built in.
Q: Built in, okay.
Q: That’s your testimony under oath?
I borrowed the plaintiffs’ exemplar implant and, with the court’s permission, handed it to the jury. The jurors passed the implant down the line, holding it up to the light, turning it, peering in vain for any hole or leak. When the jurors finished their inspection, I took in vain yet another commandment—Number 8: “Never permit the witness to explain anything”:
Q: Dr. Blais, I think it was yesterday [you testified] the gel over time inevitably goes back to oil and becomes a soup. You told us there is a quarter-inch hole in all these implants. I didn’t see any quarter-inch hole with soup pouring out of this.
A: [After Blais held the implant up for nearly a minute, manipulated it at various angles, and squinted closely] You’re correct. There isn’t.
Blais then tried to salvage his original testimony, insisting that the implant actually “does have a hole,” which only he could see. “[Y]ou don’t see it because it’s hidden by what looks like fabric. If you pinch it, however, or you look at it from the reverse side, you will see that it has a hole.”
The jurors were buying none of it. Several visibly shook their heads or glared at the witness in disgust. They had just examined the implant closely, and they could tell there was no hole or leak.
At the end of cross, Blais conceded that the only semblance of scientific data he had generated came from a “tea time experiment” or “parlor trick” using a so-called “chicken gun.” In the 1990s, the Royal Canadian Air Force was testing the impact of bird strikes on aircraft by shooting a store-bought chicken out of a small cannon at an airplane windshield. Blais had borrowed the chicken gun, then loaded silicone breast implants in lieu of chickens in an attempt to illustrate his theory that implants are weak and fall apart easily.
Q: [Y]ou thought that the implant would shatter to a mass of gel and scatter throughout the test site, correct?
A: [After being impeached] I will accept that.
Q: Dr. Blais, what happened when the implant was shot through the gun is that the Dow Corning implant went flying through the window, it shattered the armored, reinforced aircraft windshield and came through intact, correct?
Q: It didn’t have a quarter-inch hole, didn’t have a one-inch hole, didn’t have any hole, correct?
A: It went through without gross damage. It was not exploded. It was not shredded.
While Blais’s expert testimony had, perhaps, been shredded, the plaintiffs had wisely designated a second biomaterials expert in Spitzfaden: Dr. Eugene Goldberg, a chemist and chair of the Biomedical Engineering Department at the University of Florida. They promptly put him on the stand to clean up. Unlike Blais, Goldberg had generated scientific data supporting his opinions: a published 1993 study finding that miniature silicone breast implants triggered chronic inflammation in rabbits, and a then in-press study purporting to show that silicone implants ruptured within a few years of implantation.
But Goldberg had an issue of his own. Just before his deposition, my client located a series of letters Goldberg had written to Dow Corning in 1992, praising the company as a biomaterials leader and asking for $2.5 million in funding. In one letter, Goldberg dangled the fact that he had just finished his rabbit study and was about to write up the findings for publication. Goldberg’s letters could certainly be read in an innocent light, but they could also be read as suggesting a quid pro quo in which the company was being invited to weigh in on the study write-up in exchange for a large donation.
The discovery of these letters presented me with a choice at Goldberg’s deposition. Goldberg should have produced the letters himself in response to our subpoena duces tecum requesting all his communications with Dow Corning; I, by contrast, was under no obligation to produce them, because the plaintiffs had not propounded any discovery requests that reached them. I could follow the strict commandment I’d always been taught for expert depositions: exhaust every possible question at the deposition, to eliminate surprise at trial. Or I could break that rule and save the letters for trial, risking that the judge might sustain an objection to their surprise use or that the witness might have a reasonable explanation. I decided to take the risk and left the letters untouched at the deposition.
At trial, Goldberg opined on direct that Dow Corning breast implants deteriorated and ruptured once implanted, spreading harmful chemicals throughout the body. He denied any bias, having dedicated his life to the honest, objective study of biomaterials. According to Goldberg, his sole purpose in testifying was “[b]ecause I believe women have been harmed by a device which is flawed. That’s an opinion I hold passionately, if you will, and it’s based upon the best available scientific information that we have.”
My cross-examination went straight to bias:
Q: The type of scientific research you do is expensive, correct?
A: I beg your pardon?
Q: The type of research you do is expensive; it costs a lot of money to do?
A: Most research is expensive.
Q: As the head of the silicone team at the University of Florida, and the head of the Biomedical Engineering Department, . . . part of your job is going out and procuring and soliciting research funds for these various expensive scientific projects, correct?
A: That’s correct.
* * *
Q: Isn’t it true that you met with Dow Corning and sought out funding from them for scientific research?
A: Not to my knowledge.
Q: Well, isn’t it true that you specifically met with them regarding breast implants?
A: Not to my knowledge.
Q: Isn’t it true that you actually sought into the millions of dollars from Dow Corning to do research relating to silicone breast implants as of ?
A: No, sir. Not to my knowledge.
Q: Dr. Goldberg, I would like you to take a look at Defense Exhibits 181 through 184, and tell me if you can identify them as correspondence between you and Dow Corning regarding research funding in mid to late ’91 and into January of 1992, sir.
A: . . . Yes.
I immediately offered the exhibits (which were admitted without objection), displayed them on the courtroom screen, and marched Goldberg through his letterhead, signature, and verbatim words. In one letter, Goldberg lauded Dow Corning “as a leader in sophisticated biomaterials and device technology” that made “high added value” medical products, including breast implants. Goldberg had “pretty much finished” a rabbit study evaluating whether silicone breast implants are safe for long-term implantation and—without divulging whether the data were favorable or unfavorable—was preparing to draft the results for publication. He wondered if Dow Corning was curious to learn more. Goldberg then proposed “establishing a mutually beneficial relationship” in which Dow Corning would provide $2.5 million of funding to his lab.
Dow Corning declined to pay the $2.5 million or any other amount. Whereupon, Goldberg testified, he promptly solicited and received $70,000 of funding for his breast implant research from Houston plaintiffs’ lawyer John O’Quinn. The trial day ended with Goldberg’s admission that he had agreed to testify against Dow Corning at this very trial only after receiving the funding from O’Quinn.
Just before cross resumed the next morning, Goldberg strode up to me, stuck his index finger a few inches from my face, and angrily declared: “You . . . cold-cocked me.” At long last, I actually complied with one of the Ten Commandments—Number 6: “Do not quarrel with the witness”—and politely replied, “Dr. Goldberg, I understand your position, but everything I did was fair, and if you have an issue, you should take it up with your own counsel.”
Turning to direct examination of experts, allow me to offer a “commandment” of my own: Thou shalt not bore the jury. This advice is routinely neglected when we put an expert on the stand and incant the ponderous ritual of qualifying the expert. “Could you summarize your education for us, Professor Bigshot?” “What, if any, awards have you received?” “Please tell the jury, have you written any peer-reviewed papers for publication?” The tedium is only compounded if we bring a lengthy curriculum vitae into the mix and zoom in on the fine print. It’s a miracle any juror is still awake when we finally get to the good stuff.
What’s the cure? To imagine we’re on the jury. As a judge once cautioned me, “You know, you lawyers have seen this movie many times over and over again. Don’t forget the jury is seeing the movie for the first time.”
When a witness takes the stand, jurors are anxiously waiting to find out, Who are you? Why are you here? What do you know that matters in this case? Why should I listen to you? All of which remind me of the five Ws (who, what, when, where, and why) that journalists are taught to put in the lead of a news story. Good trial lawyers get to the point as quickly as good journalists do in the headline and lead paragraph.
The best way to answer the jury’s questions up front is to use a format I plagiarized (with permission!) from my partner Andy McGaan when we tried an oil refinery fire case together in 2006. Quite simply, give the jury headlines on four crisp, clean PowerPoint pages:

No. 1: Questions—Short, plain-English bullet points of the questions the expert was asked to answer.

No. 2: Education and Experience—Again, distilled to a few short bullet points. You can add detail during the direct as needed.

No. 3: Work Done in This Case—In keeping with the principle of “Keep it simple, stupid,” this headline page must be short and sweet. It can then serve as the jumping-off point for the expert to teach a mini-class to the jury, replete with demonstrations, live or videotaped experiments, photographs, calculations, and whatever else the expert did to answer the questions posed on the first slide.

No. 4: Conclusions—If you and the expert have done your jobs well, the jury should know what’s coming by this point in the direct examination and will almost nod in recognition (perhaps even agreement) as the opinions are given.
Is this approach leading? Perhaps, but just a little. As a veteran Texas lawyer once responded to my leading objection, “Judge, I know I was leading, but at least I wasn’t draggin’ him all across the room.” And remember, you are allowed to lead an expert on foundational and introductory matters. As long as your headlining slides are derived from the expert report and you shared them with opposing counsel in advance, there is no valid objection.
A mentor told me a story when I was preparing my first expert witness. The defendant in a big antitrust case retained an exquisitely credentialed, Nobel Prize–winning economist from Harvard. The expert’s deposition lasted three days, at the end of which the plaintiff’s counsel said, “Professor, I have to hand it to you, I came at you every day with my toughest questions, and you had a brilliant answer ready for each one.” The witness bantered back, “You have to admit, I’m one hell of a good salesman.” Guess which deposition answer got plenty of airtime at trial?
One takeaway from this story is that credentials alone don’t make a good expert. Far more important are credibility and the ability to teach. If an expert knows his or her stuff, can break down complex technical information into layman’s terms (“put the hay down where the goats can eat it,” as my local counsel once put it), and is equally happy fielding questions on both direct and cross, the jury will be rapt.
In the silicone implant litigation, plaintiffs’ experts attacked the implants’ two main components as grossly defective. The outer envelope, a sealed pouch of clear silicone rubber, supposedly fell apart in the body. The inner filling, represented by the manufacturer to be a cohesive silicone gel, supposedly turned into runny liquid coursing throughout the body. Worse, plaintiffs’ experts opined, Dow Corning had cut corners during product development, rejecting more costly designs that would have been durable and safe in favor of an implant design that was “rushed to market” and put “profits over safety.”
We needed an expert who could teach the jury about the thorough testing Dow Corning performed when designing the implants, the integrity of the cohesive gel, and the strength of the silicone envelope material. We found one in Sue Peters, the Dow Corning chemist who had conducted the research and design for breast implants some 20 years earlier. Our goal was to make the jury feel as though they were standing alongside Peters at the lab bench—to have her “show” the process step by step, not just “tell” the jury her opinion as an expert.
Using blowups from her original lab books, Peters walked the jury through the diagrams, formulas, and sketches from her key experiments, translating scientific terminology into plain English. The gel filling had to meet two strict criteria: to mimic natural breast tissue and to stay cohesive enough that silicone fluid did not leak out of the gel matrix. Peters experimented with dozens of formulations over months of trial and error, rejected gels that were runny, and ultimately succeeded in making a cohesive gel that met the safety criteria. Peters then handed the jury an implant containing this gel, and also demonstrated the same gel inside a beaker, where it remained an intact solid even when turned upside down and shaken.
For the implant’s silicone envelope, Peters started by showing the jury the many medical applications where the identical silicone rubber had been used safely for years before being adapted to breast implants: hydrocephalus shunts implanted in the brain to drain fluid, tubing used in open-heart surgery, artificial joints. Sketching the chemical backbone of silicone rubber on a flip chart, Peters explained how this particular chemical network imparted an unusual degree of strength and flexibility, such that the silicone envelope could be twisted, crimped, and stretched many times its original size without breaking, and could withstand extreme chemical environments without degrading.
Peters then showed a videotaped experiment, performed by Stanford biomaterials engineers at her direction: a three-inch-long coupon of silicone rubber was cut from an implant envelope, clamped at either end, then slowly stretched by an Instron tensile-strength machine past its breaking point. Most materials snap when elongated slightly beyond their original length. But the silicone coupon stretched to 2, 3, 5, and ultimately 10 times its original length.
Effective expert testimony can be complex, so long as it is thoughtful and guided. Did every juror understand every detail of the science lesson this Dow Corning chemist taught? No. But they got the gist, and they appreciated being treated as intelligent people who could follow a well-taught science class, instead of being lectured at or talked down to.
It’s tempting to delegate and defer to the expert on the content of the direct examination because the expert has such deep, even intimidating, knowledge. But the lawyer cannot be a proverbial “potted plant.” We need to roll up our sleeves, master the subject matter and data, and have an interactive give-and-take with the expert throughout the case to develop the trial testimony and supporting data. This includes tasks as mundane as double-checking the expert’s math—a lesson opposing counsel learned the hard way at a 2003 federal bankruptcy court trial about the confirmability of the plan of reorganization of my client Babcock & Wilcox.
To emerge from Chapter 11, Babcock had agreed to contribute $1.85 billion to a trust that would pay asbestos personal injury claims for certain approved diseases. The company’s insurers objected that certain types of cancer claims should not be compensable because, the insurers’ medical experts opined, asbestos does not cause those types of cancer.
For lung cancer claims, the payment criteria were largely based on a study published in the prestigious medical journal Lancet, which the insurers’ first expert dismissed outright on grounds that seemed ridiculously nitpicky: The Lancet article incorrectly cited a set of 1980 International Labour Organization guidelines as having been published in 1989, and therefore the expert could not trust anything in the entire article.
The insurers next called Dr. Gary Epler, an eminent pulmonologist and epidemiologist from Harvard Medical School, who testified on direct that “there is no causal link between asbestos exposures and stomach cancer” based on a study of millions of people finding that the rate of stomach cancer in a group exposed to asbestos was virtually the same as the rate in an unexposed control group. I noticed that Epler had made two math errors on his PowerPoint slides. Since the immediately prior witness had trashed an entire study due to one inadvertent typo, I decided to call out these errors on cross.
I first got Epler to agree that if the ratio of stomach cancer for exposed persons compared with expected cases was “anything above 1.02 [it] would suggest a causal relationship” between asbestos and stomach cancer. Next, I divided the number on his PowerPoint for the exposed group (4,007) by the number for the control group (3,886), and asked whether I had correctly computed the ratio as 1.03, thus exceeding the critical 1.02 threshold and contradicting his testimony of no causal link.
Perplexed, the witness could only repeat, “I see what arithmetic you did. I see what arithmetic you did.” At the witness’s request, I gave him a copy of the study to peruse. But he could only repeat that he’d used “the wrong number, that’s the wrong number,” and was unable to identify and correct the math errors on the spot.
There was no point rubbing it in. So I calmly showed the witness the two numbers he had entered incorrectly. Then I pivoted back to Lancet, since the whole point was to make the previous expert look silly for trashing the entirety of the Lancet study over a single typo, testimony Epler had just sat through minutes earlier.
Q: If . . . your PowerPoint has an inadvertent error, would you accept that I forgive you for an honest mistake?
A: Well, thank you. That’s very nice of you.
Q: All right. Just as I assume you would be equally charitable in forgiving a typographical error in The Lancet magazine. If someone had the year of a citation [as] 1989 instead of 1980, and they got one number wrong, you’d be charitable and forgive that as well, wouldn’t you?
A: I was curious about that. . . .
Q: Okay. But you might give somebody a break on that, right?
A: [Court reporter indicates: “Laughter.”]
Here is another commandment that I think should be added to that golden list: Thou shalt get expert summary exhibits admitted. Jurors are drowning in information, especially in long trials steeped in science. A good way to help jurors recall expert testimony is to mark and offer into evidence the key documents used during direct. Pure demonstrative aids, of course, are not admissible. But a chart prepared by the expert summarizing voluminous records, a PowerPoint, or even a calculation made by the expert on a flip chart may come in as a summary of voluminous data under Federal Rule of Evidence 1006.
The jury will have your admitted expert exhibits in the jury room when they deliberate. Your expert’s direct may have been dramatic and effective, but if it took place days or weeks earlier in a long trial, it may also have been forgotten. Getting the key exhibits back in front of the jury while they deliberate can bring the testimony alive and make all the difference in the result.
These anecdotes are a tiny sample of the countless ways you can add punch to expert testimony. On cross, while it’s usually a safe bet to follow the venerated Commandments, I encourage you to break them from time to time—as long as you keep a tight grip on the opposing expert. Direct is different: You’ll be most effective using a light touch, supplying just enough structure for your expert to be the star teacher and for the jury to embrace your story.