©2018. Published in Landslide, Vol. 11, No. 1, September/October 2018, by the American Bar Association. Reproduced with permission. All rights reserved. This information or any portion thereof may not be copied or disseminated in any form or by any means or stored in an electronic database or retrieval system without the express written consent of the American Bar Association or the copyright holder.
It takes a lot of data to creatively survive an Alice1 101 rejection.2 Simply applying knowledge of the relevant case law and the U.S. Patent and Trademark Office’s (USPTO’s) official examination guidance is often not enough. A patent practitioner3 needs to understand how each particular patent examiner views what is or is not “an abstract idea” or “something more.” The examiner’s personal views must be interpreted in light of a technology center’s particular policies.4 Neither an examiner’s views nor a technology center’s policies are published. Creative use of prosecution data from the USPTO, however, can often reveal what these views and policies are and how they have changed over time.
Patent practitioners now have access to complete sets of searchable data for published patent applications since 2001. The data is available from patent analytic software services.5 This data includes the contents of each application’s file wrapper and the dates of each file wrapper event. Data for up to a million applications can be downloaded and graphically viewed in a conventional spreadsheet. It is difficult, however, for a practitioner to understand changes in behavior simply from columns of data, summary statistics such as allowance rates, or even trend graphics such as bar charts. What is needed is an improved visualization graphic that shows all of the data and how that data varies over time.
To address this need, I employ my own visualization graphic that I refer to as an allowance cloud. This graphic is a scatter plot of the dates various applications were filed versus the corresponding dates said applications were allowed. Data sets for a particular examiner or technology center can be viewed to see how said examiner’s or technology center’s allowances have changed. Practitioners can see at a glance just how difficult an examiner might be, how that examiner compares to other examiners in the same technology center, and what recently allowed cases in that technology center need to be reviewed in order to draft an effective response to an Alice 101 rejection.
Policymakers and USPTO management can benefit from such allowance cloud graphics by seeing the impact of various policy changes and management changes. When all of the data is presented for large groups of applications, subtle changes in behavior can be seen.
Figure 1 shows an allowance cloud graphic for a selected “Examiner A” currently in the USPTO’s technology center 3600BM (i.e., computer-implemented business methods).6 This technology center has been the most heavily impacted by the Alice decision.7 To create the graphic, a spreadsheet of all of the published8 patent applications assigned to Examiner A was downloaded from PatentAdvisor. The spreadsheet included the filing dates of the applications as well as the most recent allowance dates. Each data point on the graph is for one allowed application. The data points are semitransparent so that overlapping points can be seen.
The diagonal line on the graph is the “ceiling line.” The ceiling line is where a data point would be if the allowance date of an application was the same as the filing date. This is the highest level any point can be on the graph. The downward scatter in the data shows how long applications were pending before being allowed. For cases allowed in 2009, for example, the shortest pendency for Examiner A was about one and a half years and the longest pendency was more than eight years. Data prior to 2001 is not shown since this data is incomplete.
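The construction described above can be sketched in a few lines of Python. The miniature data set, the transparency setting, and the output file name are all hypothetical stand-ins; in practice the filing and allowance dates would come from a PatentAdvisor-style export.

```python
# Sketch of an allowance cloud: filing date vs. allowance date, with a
# diagonal "ceiling line" marking where allowance date equals filing date.
# The three sample applications below are hypothetical placeholders.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import pandas as pd

data = pd.DataFrame({
    "filing_date":    pd.to_datetime(["2003-04-01", "2005-06-15", "2008-01-10"]),
    "allowance_date": pd.to_datetime(["2009-02-01", "2009-08-20", "2011-03-05"]),
})

fig, ax = plt.subplots()
# Semitransparent points so overlapping allowances remain visible
ax.scatter(data["filing_date"], data["allowance_date"], alpha=0.3, s=12)

# Ceiling line: no point can sit above it, since allowance cannot precede filing
lims = [data["filing_date"].min(), data["allowance_date"].max()]
ax.plot(lims, lims, color="gray", linewidth=1)
ax.set_xlabel("Filing date")
ax.set_ylabel("Allowance date")
fig.savefig("allowance_cloud.png")

# Vertical distance below the ceiling line is pendency, computed in years here
pendency_years = (data["allowance_date"] - data["filing_date"]).dt.days / 365.25
```

The vertical drop of each point below the ceiling line is that application's pendency, which is what the downward scatter in the figures visualizes.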
The allowance cloud graphic has labels above the ceiling line that show when certain events occurred that might have impacted what an examiner’s standards for allowance were at that time. Yellow labels show the dates of different Supreme Court decisions. Purple labels show the publication dates of different USPTO examination guidelines. White labels show when different USPTO Directors took office.
Examiner A has been with the USPTO since before 2001. This examiner only allowed a small number of cases per year prior to 2009, and those cases had very long pendency times. This may have been due to the relative inexperience of the examiner at that time. It might also have been due to the “Second Pair of Eyes” program in effect at the USPTO. Second Pair of Eyes required a second examiner to sign off on each allowance. The goal was to improve the quality of patent examination. For business methods, Second Pair of Eyes started in 2000.9
The 2007 KSR10 Supreme Court decision occurred during Second Pair of Eyes. It does not appear to have had an impact on this examiner’s allowances. There is no statistically significant change in the pattern of data points before and after KSR. There was a lot of anxiety in the patent bar that KSR would slow allowances.11 That would have been reflected in a reduced density of data points after KSR. If anything, however, this particular examiner seemed to allow cases more rapidly as reflected in the gradual increase in density of data points over the next few years.
When David Kappos became Director of the USPTO in 2009, he made it clear that “[p]atent quality does not equal rejection.”12 Second Pair of Eyes was ended shortly thereafter in September 2009.13 The pace of Examiner A’s allowances increased as evidenced by movement of the data points up toward the ceiling line. Many of the data points just below the ceiling line are for continuing applications.14 These applications were picked up and examined within a few months of filing with a high probability of allowance on first office action.
The Supreme Court’s Bilski15 decision in 2010 caused a great deal of anxiety among applicants for business method patents because it held that the claims of a business method patent application failed to recite statutory subject matter. This examiner, however, does not appear to have been affected by the decision. There is no significant drop in the density of data points.
In January 2014, Michelle Lee took over as head of the USPTO. The Supreme Court’s Alice decision was handed down on June 19, 2014. Six days later, the USPTO issued four pages of preliminary guidance to the examiners.16 In response, the business method technology center did a massive recall of cases that were allowed but still pending.17 Examiner A had three allowances recalled. Having an allowance recalled is a serious blow to an examiner. It normally indicates a failure to properly examine a patent application. Not surprisingly, Examiner A allowed almost nothing for about a year afterward. There is only one data point between mid-2014 and early 2015.
In December 2014, more comprehensive subject matter eligibility guidance was published by the USPTO.18 Examiner A started allowing cases a few months after that. The examiner’s allowances per year were fewer than before Alice, but still reasonable. An applicant might have a good chance of finding common ground with this examiner on statutory subject matter for its claims. The examiner’s allowances per year increased still further after the May 2016 subject matter eligibility guidance came out.19 The rate is now almost back to where it was before Alice.
Unfortunately, Examiner A is not a typical business method patent examiner. A more typical examiner is “Examiner B.” Examiner B’s allowance cloud graphic is shown in figure 2.
Examiner B has the same supervisor as Examiner A, has about the same experience, examines applications in the same class/subclasses, and examines applications from largely the same applicants. During Second Pair of Eyes, Examiner B allowed almost no cases.20 After Second Pair of Eyes ended, Examiner A and Examiner B had similar allowance clouds. After Alice, however, Examiner B allowed nothing for almost two years. Even after the May 2016 guidance, Examiner B still only had a handful of allowances per year. Any practitioner negotiating with Examiner B should review that handful of allowances carefully to see if his or her case is similar. If not, then the practitioner might want to prepare for an appeal right away. The track record for post-Alice appeals has not been good,21 but it still might be better than going through multiple rounds of rejections and responses with Examiner B until this examiner starts allowing cases again, if that ever happens.
It might be easy to criticize Examiner B’s response to Alice. An examiner’s behavior, however, needs to be evaluated in the context of a technology center’s policies. Changes in those policies can be inferred from changes in the technology center’s allowance cloud graphic. Figure 3 is an allowance cloud graphic for the business method technology center 3600BM.
There are about 40,000 allowances shown in this graphic. Changes in policy are revealed in changes in the density of data points since they reflect changes in the technology center’s examiner corps behavior as a whole. The long pendencies and low allowance rates during Second Pair of Eyes can be seen in the low density of data points during that time. The nonimpact of KSR can be seen in the uniform density of data points before and after KSR. The increase in allowances and reduced pendencies after Second Pair of Eyes ended can be seen in the increased density of data points and the shift of data points up toward the ceiling line. The preference for examining continuing applications can be seen in the dense band of data points just below the ceiling line. The impact of Alice can be seen in the sharp drop in density of data points immediately after Alice.
With the large number of data points in this allowance cloud graphic, more subtle impacts of various actions by the USPTO can be seen. There is a small but sharp increase in allowances after the December 2014 guidance came out. This indicates that the December 2014 guidance may have helped at least some examiners and practitioners reach agreement on statutory subject matter. There is also a gradual but more substantial increase in allowances starting in the beginning of 2017. The cause of this increase is not clear. There were a number of management changes at the USPTO, including the naming of a new director for this technology center. Perhaps the influence of the new director has led to the gradual increase in allowances. In any event, practitioners who have been frustrated by apparent lack of progress in negotiating with examiners might do well to interview said examiners again to get a sense of how things have changed.
An interesting feature revealed in this graphic is the horizontal lines for cases filed in early 2013 and early 2014. These are labeled FTF I and FTF II. FTF I corresponds to the America Invents Act (AIA) transition from first-to-invent to first-inventor-to-file.22 This occurred on March 16, 2013. FTF II corresponds to the one-year anniversary of that transition. This is when nonprovisional patent applications claiming the benefit of provisional applications filed just before FTF I had to be filed in order to preserve their pre-AIA, first-to-invent status. Thus, allowance clouds reveal not only changes in examiner behavior but also changes in applicant behavior.
Another big data graphic that can reveal changes in applicant behavior is what I refer to as an abandon cloud. An abandon cloud is a scatter plot of application date versus abandon date for patent applications in a data set. Figure 4 shows an abandon cloud graphic for patent applications in the business method technology center. About 70,000 data points are shown. The fact that there are about 40,000 data points in the allowance cloud and about 70,000 data points in the abandon cloud gives an indication that the overall historical allowance rate for this technology center is about 36 percent. This has always been a difficult technology center to get patents allowed in. The details of the abandon cloud graphic, however, reveal other interesting behavior changes for both applicants and examiners.
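The 36 percent figure follows from simple arithmetic on the two approximate point counts quoted above:

```python
# Rough historical allowance rate implied by the two cloud sizes
allowed = 40_000    # approximate data points in the allowance cloud
abandoned = 70_000  # approximate data points in the abandon cloud
allowance_rate = allowed / (allowed + abandoned)
print(f"{allowance_rate:.0%}")  # prints "36%"
```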
This graphic shows the ebb and flow of when cases get abandoned and how various changes in management, court decisions, and examiner guidance impact applicants’ decisions to abandon. Most of the abandons follow a dense serpentine band. The top edge of the serpentine band is offset about six months from when cases are getting a first office action. These are the cases where applicants did not respond to their first office actions. The thickness of the band shows how long cases were prosecuted before most applicants decided to abandon them. A wide band indicates that examiners and applicants are having difficulty in reaching agreement on what is or is not patentable. A narrow band indicates they are reaching agreement quickly.
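One way to make the top edge of the serpentine band concrete is to measure, for each abandoned case, the gap between its first office action and its abandon date; abandons landing roughly six months after the first action are the likely non-responses. This is a hypothetical sketch: the column names, the sample records, and the six-month window are assumptions, not an actual PatentAdvisor field layout.

```python
import pandas as pd

# Hypothetical abandoned-case records: first office action date and abandon date
cases = pd.DataFrame({
    "first_oa_date": pd.to_datetime(["2010-03-01", "2011-05-10", "2012-07-04"]),
    "abandon_date":  pd.to_datetime(["2010-09-02", "2013-01-15", "2014-01-10"]),
})

# Gap in months between the first office action and abandonment
gap_months = (cases["abandon_date"] - cases["first_oa_date"]).dt.days / 30.44

# Cases abandoned about six months after the first office action: these are
# the likely no-response abandons that form the band's top edge
no_response = cases[gap_months.between(5.5, 6.5)]
```

Cases prosecuted through multiple rounds of responses fall lower in the band, which is what gives the band its thickness.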
During Second Pair of Eyes, there was a steady increase in delays to first office actions in the business method technology center.23 This is reflected in the top edge of the serpentine band diverging away from the ceiling line. In 2008, this trend changed as the serpentine band shifted upward toward the ceiling line. This suggests there was a concerted effort to reduce pendencies to first office action. Continued steady improvement in pendency to first office action is seen until about 2016. Since then, the top edge of the serpentine band has been diverging away from the ceiling line. This suggests that delays to first office actions are once again becoming a problem.
The abandon cloud graphic shows no immediate impact from Alice. The technology center interpreted the Alice decision as essentially a ban on business method patents and stopped allowing cases immediately. The patent bar, however, continued to respond to rejections in hopes of reaching common ground, or at least waiting until more clarity emerged from the USPTO or the courts. Some of that clarity appears to have arrived with the publication of the December 2014 guidance. There was a small but clear increase in abandons that looks like a descending curtain. As indicated above, there was also a small but clear increase in allowances at the same time. In that sense, the December 2014 guidance appears to have served its purpose of clarifying the requirements of statutory subject matter for at least some applicants and examiners.
There was a small but sharp decrease in abandons after the May 2016 guidance, forming the trailing edge of the descending curtain. The May 2016 guidance set forth the USPTO’s official requirements for making a prima facie24 case of nonstatutory subject matter. With these official requirements in hand, at least some applicants apparently found that examiners were not making proper prima facie cases, and thus said applicants did not have to abandon their applications.
Visualization graphics are a powerful tool for revealing the impact of various changes in USPTO management, court decisions, and examination guidance. They can also be a tremendous boon to individual practitioners in their negotiations with particular examiners. The data is readily available from online services, and the tools for making the graphics are on most practitioners’ desktops.
Ideally, patent office practice should be uniform across examiners and technology centers. Data analysis should not be necessary. The practical reality, however, is that there are substantial differences between examiners and technology centers especially relating to the requirements of statutory subject matter. Taking these differences into account is essential for effective patent prosecution.
1. Alice Corp. Pty. Ltd. v. CLS Bank Int’l, 134 S. Ct. 2347, 2355 (2014).
2. That is, a rejection under 35 U.S.C. § 101 that cites the Alice decision.
3. That is, a registered patent agent or attorney.
4. The USPTO patent examiner corps is organized into technology centers (e.g., 3600). Technology centers are organized into work groups (e.g., 3620). Work groups are organized into art units (e.g., 3621). Technology centers, work groups, and art units are differentiated by the types of technologies they examine (e.g., 3600 = computer-implemented business methods; 3620 = e-commerce; 3625 = electronic shopping).
6. Technology center 3600BM includes work groups 3620, 3680, and 3690.
7. Robert R. Sachs, Alicestorm Update February 2017, BilskiBlog (Mar. 16, 2017), http://www.bilskiblog.com/blog/2017/03/alicestorm-update-february-2017.html.
8. About 15 percent of the applications in this technology center are not published. The allowance data becomes available to the public when and if the patent application issues as a patent.
9. Terry Carter, A Patent on Problems, A.B.A. J., Mar. 2010, http://www.abajournal.com/magazine/article/a_patent_on_problems.
10. KSR Int’l Co. v. Teleflex Inc., 127 S. Ct. 1727, 1740–41 (2007).
11. Mark Nowotarski, Using KSR to Overcome an Obviousness Rejection, Intell. Prop. Today, Sept. 2007, https://web.archive.org/web/20070928093635/http://www.iptoday.com/articles/2007-09-nowotarski.asp.
12. Carter, supra note 9.
13. Luis Alexander Gonzalez & Elizabeth Mukhanov, History and Analysis of Patent-Related Events and Metrics at USPTO (Dec. 13, 2012) (unpublished B.S. thesis, Worcester Polytechnic Institute), https://web.wpi.edu/Pubs/E-project/Available/E-project-121712-104759/unrestricted/WPI_IQP_2012_Washington,_DC_USPTO_2.pdf.
14. Continuing applications are refilings of previously filed patent applications with different claims. There may also be new matter added to their specifications.
15. Bilski v. Kappos, 561 U.S. 593 (2010).
16. Memorandum from Andrew H. Hirshfeld, Deputy Comm’r for Patent Examination Policy, USPTO, to Patent Examining Corps, Preliminary Examination Instructions in view of the Supreme Court Decision in Alice Corporation Pty. Ltd. v. CLS Bank International, et al. (June 25, 2014), https://www.uspto.gov/sites/default/files/patents/announce/alice_pec_25jun2014.pdf.
17. Mark Nowotarski, Surviving Alice in the Finance Arts, CIPA J., July/Aug. 2017, at 20, https://issuu.com/cipa-journal/docs/cipa-2017-07-july-august-2017?e=2508290; Mark Nowotarski, Surviving Alice in the E-Commerce Arts, BilskiBlog (May 17, 2017), http://www.bilskiblog.com/blog/2017/05/surviving-alice-in-the-e-commerce-arts.html.
18. 2014 Interim Guidance on Patent Subject Matter Eligibility, 79 Fed. Reg. 74,618 (Dec. 16, 2014).
19. Memorandum from Robert W. Bahr, Deputy Comm’r for Patent Examination Policy, USPTO, to Patent Examining Corps, Formulating a Subject Matter Eligibility Rejection and Evaluating the Applicant’s Response to a Subject Matter Eligibility Rejection (May 4, 2016), https://www.uspto.gov/sites/default/files/documents/ieg-may-2016-memo.pdf.
20. There are about eight allowances with pre-2001 filing dates not visible in this allowance cloud.
21. Mark Nowotarski, Surviving Alice with an Appeal, BilskiBlog (Sept. 21, 2017), http://www.bilskiblog.com/blog/2017/09/surviving-alice-with-an-appeal.html.
22. First Inventor to File (FITF) Resources, USPTO, https://www.uspto.gov/patent/first-inventor-file-fitf-resources (last modified Feb. 5, 2016).
23. Mark Nowotarski, How Long Will It Take to Get a Patent Reviewed?, Ins. IP Bull. (Feb. 15, 2008), http://www.bakosenterprises.com/IP/B-02152008/IPB-02152008.html.
24. The initial burden for rejecting a patent claim for failing to recite nonstatutory subject matter falls on the examiner. The examiner has to make a prima facie case. Once the examiner has made a prima facie case, then the burden shifts to the applicant to either show that the examiner has erred or amend the claim so that it does recite statutory subject matter.