Green v. GEICO Gen. Ins. Co.

Decision Date: 24 March 2021
Docket Number: C.A. No.: N17C-03-242 EMD CCLD
Parties: YVONNE GREEN, WILMINGTON PAIN & REHABILITATION CENTER, and REHABILITATION ASSOCIATES, P.A., on behalf of themselves and all others similarly situated, Plaintiffs, v. GEICO GENERAL INSURANCE COMPANY, Defendant.
Court: Superior Court of Delaware

Upon Defendant's Motion for Summary Judgment

GRANTED in part and DENIED in part

Upon Plaintiffs' Motion for Summary Judgment

GRANTED in part and DENIED in part

Richard H. Cross, Jr., Esquire, Christopher P. Simon, Esquire, Cross & Simon, LLC, Wilmington, Delaware, Attorneys for Plaintiffs Yvonne Green, Wilmington Pain & Rehabilitation Center, and Rehabilitation Associates, P.A.

Paul A. Bradley, Esquire, Stephanie A. Fox, Esquire, Maron Marvel Bradley Anderson & Tardy, LLC, Wilmington, Delaware, George M. Church, Esquire, Joshua Kahn, Esquire, Miles & Stockbridge P.C., Baltimore, Maryland, Meloney Perry, Esquire, Perry Law, P.C., Dallas, Texas, Attorneys for Defendant GEICO General Insurance Company



This is a class action assigned to the Complex Commercial Litigation Division of the Court. Yvonne Green, Wilmington Pain & Rehabilitation Center ("WPRC"), and Rehabilitation Associates, P.A. ("RA") sued on behalf of themselves and all others similarly situated (collectively, "Plaintiffs"). Plaintiffs filed suit against Geico General Insurance Company ("GEICO"). Plaintiffs allege that GEICO uses two computerized rules, the Geographic Reduction Rule ("GRR") and the Passive Modality Rule ("PMR") (collectively, the "Rules"), to evaluate insurance claims submitted by insureds or their assignees to GEICO. Plaintiffs argue that the Rules improperly analyze and make determinations for these claims without evaluating the substantive facts underlying the claim.

Before the Court are Plaintiffs' claims for Breach of Contract ("Count I"), Bad Faith Breach of Contract ("Count II"), and Declaratory Judgment ("Count III"). Both parties submitted motions for summary judgment—hereafter referred to as the "Plaintiffs' Motion" and the "GEICO Motion." The main issue in the cross-motions for summary judgment is whether the method by which GEICO processes PIP claims constitutes a violation of the parties' contract and/or a violation of Delaware law.

For the reasons set forth below, the Court DENIES the Plaintiffs' Motion as to Counts I and II. The Court GRANTS the Plaintiffs' Motion as to Count III. In addition, the Court GRANTS the GEICO Motion as to Counts I and II and DENIES the GEICO Motion as to Count III.


Given the issues in this civil action, the Court believes that some background regarding policy issues is appropriate. Automobile insurers typically promise to pay the "reasonable" cost of medically necessary services for injuries their insureds suffer in covered accidents.2

For many years, insurers have used automated systems to perform an initial evaluation of the reasonableness of medical bills.3 The systems typically consult databases with information about millions of bills submitted by healthcare providers.4 For example, by comparing one provider's prices with those charged in the same geographic area, the Rules—GEICO's system—can determine whether the submitted claim exceeds the prices charged by 80 percent of relevant professionals in that geographic region. This is a simple machine learning function known as classification.5 If the system determines that a claim is in the 81st percentile or higher, the claim is reduced, and the insurer will only pay the 80th percentile amount.6
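The percentile comparison the opinion describes can be sketched in a few lines of code. The following is a minimal, hypothetical illustration of a threshold rule of this kind; the data, the nearest-rank percentile method, and all function names are illustrative assumptions, not a representation of GEICO's actual Rules.

```python
# Hypothetical sketch of a percentile-based reduction rule of the kind
# described in the opinion. All names and data here are illustrative
# assumptions, not GEICO's actual system.

def percentile(charges, pct):
    """Return the pct-th percentile of a list of charges,
    using the simple nearest-rank method."""
    ordered = sorted(charges)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def evaluate_claim(submitted_charge, area_charges, threshold_pct=80):
    """Classify a claim against charges in the same geographic area:
    pay in full if at or below the threshold percentile, otherwise
    reduce the payment to the threshold-percentile amount."""
    cap = percentile(area_charges, threshold_pct)
    if submitted_charge <= cap:
        return submitted_charge, "paid in full"
    return cap, "reduced to percentile amount"
```

On this sketch, a $200 charge measured against area charges of $100 through $190 would be cut to the 80th-percentile amount, while a $150 charge would be paid in full; the decision turns entirely on the comparison, with no human review of the underlying facts of the claim.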

The Court recognizes that automation has the potential to eliminate persistent errors in human-based systems and to produce consistent decisions.7 The Court also recognizes that the systems could fail to take advantage of the potential for error correction and could become devices for error propagation themselves.8 For over a decade, policyholders and health providers have been filing lawsuits, most of them class actions, that challenge the validity of these automated systems. Most have argued that the insurers' approach to paying claims is inherently unreasonable because, for instance, a charge over the 80th percentile amount might be valid in some circumstance and because the insurers' approach does not allow a human being to exercise judgment in those conditions.9 Some courts have ruled in favor of plaintiffs where machines acted alone in making decisions.10

Other courts report having felt "strong pressures to discourage ... insurers from taking advantage of their superior bargaining position to ... force insureds to accept less than they are entitled to."11 These courts declare that insurers "may not obtain any advantage over the insured by ... threat or adverse pressure of any kind."12 The laws of some states specifically prohibit certain tactics, such as: "[m]aking known to insureds ... a practice ... of appealing from arbitration awards ... for the purpose of compelling [claimants] to accept settlements ... less than the amount awarded in arbitration,"13 and delaying payment or settlement under one form of coverage, "in order to influence settlements under other portions of the insurance policy."14

The articulated concern seems to be the importance of the sound exercise of human judgment and of ensuring that technology supports, rather than obscures, that goal. Professor Kenneth A. Bamberger, when discussing the use of analytics in making decisions, recommends:

But, as the level of judgment required increases--from decisions governing how to sort and characterize data, to rules constraining its use, to analytics deriving meaning and predictions, to rules automating decisions accordingly--accountability measures must increasingly promote its exercise.15

Professor Danielle Citron states that "[p]rogrammers routinely change the substance of rules when translating them from human language into computer code [...] The resulting distorted rules effectively constitute new policy that can affect large numbers of people."16 Professor Citron argues that decisions best addressed by standards should not be automated:

Policies that explicitly or implicitly require the exercise of human discretion cannot be automated. For instance, agencies should not automate policies that allow individuals to plead extenuating circumstances that software cannot anticipate. Legal materials providing that a "decision maker may" take a given action explicitly signal that automation is inappropriate. Others implicitly do so by including indeterminate terms that require decision makers to consider conflicting norms that resist precise weighting.17

Robert Helfand, Esquire, notes that a court could decide that the duty of good faith, as a matter of law, prohibits exclusive, or even excessive, reliance on "secret algorithms."18 Undisclosed algorithms in the operation of insurance company functions seem to raise accountability concerns that run counter to the policy goals of insurance law. Technology can make the claims process more efficient and effective. Similar to issues with closed-source code present in technology-based compliance systems, however, the Rules—undisclosed to insureds—can leave insureds "unable to discern how a system operates and protects itself"19 and could shield unintended errors that distort even clear legal and managerial goals. "Programming and mathematical idiom can shield layers of embedded assumptions from high-level firm decisionmakers charged with meaningful oversight and can mask important concerns with a veneer of transparency."20

Similar to automated decision systems used in the agency context, these systems also seem to jeopardize the right to be given notice of reasons for denial.21 Clear notice decreases the likelihood a decision will rest upon "incorrect or misleading factual premises or on the misapplication of rules."22 As a result, affected individuals could lack the information they would need to effectively respond.23 Mr. Helfand advises:

The solution to the problems outlined here cannot be simply to avoid those models. Rather, it lies in how those models should be developed and deployed [...] At the time when it first puts an automated tool to use in claims handling, the insurer also should prepare a way to demonstrate that the tool performs a well-defined task in a reasonable way.24

From these sources, the Court takes guidance. The Court realizes that there is no per se rule on whether automated rules can be employed in handling insurance claims. Moreover, the Court must examine the particular facts before it without inappropriately shifting the burden of proof.25 The Court recognizes that these sources are merely persuasive and not controlling here. Delaware law, not law review articles, will govern the resolution of Plaintiffs' claims against GEICO.


GEICO sells Delaware automobile insurance policies that provide no-fault personal injury protection ("PIP") coverage.26 PIP coverage is mandatory in Delaware.27 The purpose of PIP coverage is "to ensure reasonably prompt processing and payment of sums owed by insurers . . . and to prevent the financial hardship and damage to personal credit ratings that can result from the unjustifiable delays of such payments."28 Delaware regulations mandate that PIP claims "shall be payable within 30 days of the demand thereof by the claimants provided that reasonable proof of loss for which the benefits as demanded has been submitted to the PIP carrier."29

Specifically, 21 Del. C. § 2118(a)30 provides:

The purpose of this section is to ensure reasonably prompt processing and payment of sums owed by insurers to their policyholders and other persons covered by their policies pursuant to § 2118 of this title, and to prevent the financial hardship and damage to personal credit ratings that can result from the unjustifiable delays of such payments.31

Further, Section...
