Yesterday brought the issuance of dozens of papers about the importance – or not – of p-values. That’s important news for trial lawyers, corporate officers, corporate directors, and risk managers. Persons in those roles (and others) are in for surprises if they are not aware of, and ready for, debates about – and a likely reduction in – the litigation-related power of p-values. Indeed, the many papers issued yesterday on the use of statistics deserve attention when thinking about opportunities and risks in litigation.

For yesterday’s editorial from the American Statistical Association, go here to read more: “Moving to a World Beyond ‘p < 0.05.’”

For the paper shown in the cartoon, go here to Nature.

Today, more and more studies involve gathering real-world data as exposures occur, and then monitoring the health conditions of the exposed group. Studies of this sort sometimes are referred to as “exposomic studies.” New studies of this sort are underway in New Hampshire involving possible exposures to PFAS and possible health effects. Interestingly, these studies include efforts to learn whether exposures to PFAS produce observable changes in the immune systems of children. To that end, one study looks at before-and-after data for kindergarten children receiving their booster shots, as reported in a March 18, 2019 post at DES Daughters Network.

In Portsmouth, Amico’s organization is working as a community liaison for a $2.6 million federally funded study that will examine the effects of PFAS on the immune systems of kindergarteners exposed to contaminated drinking water at Pease International Tradeport and on Massachusetts’ Cape Cod.

Shaina Kasper, New Hampshire state director for Toxics Action Center, said they hope this research will add to the body of knowledge on PFAS and the effects of exposure in utero and as young children. Researchers from Silent Spring Institute and Northeastern University will examine the children’s immune response before and after their kindergarten booster shots, she said.


Massive, cheap computer power – often combined with AI – has facilitated much of the recent progress in understanding and working against cancer. Accordingly, it’s good to see that research against cancer will be one of the uses for the world’s most powerful supercomputer, which is headed for the federal government’s Argonne Labs in Chicago’s western suburbs. To be operational in 2021, the supercomputer will operate at exaFLOP scale. What the heck does that mean? It means a quintillion calculations per second. What is a quintillion? It is a thousand raised to the power of six. It is a million raised to the power of three. In other words, it’s a really big number. It is a 1, followed by 18 zeros. See the chart for more. Amazing.
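For readers who like to check the arithmetic, the equivalences above are easy to verify in a few lines of Python:

```python
# A quintillion is 10**18: a 1 followed by 18 zeros.
quintillion = 10**18

assert quintillion == 1000**6        # a thousand raised to the power of six
assert quintillion == 1_000_000**3   # a million raised to the power of three

# Printed with separators, the scale is easier to appreciate:
print(f"{quintillion:,}")  # 1,000,000,000,000,000,000
```

An exaFLOP machine performs that many floating-point operations every second.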

Hopefully this incredible machine will help researchers against cancer fulfill Jeff Huber’s call to “find a better way” to take on cancer. Who is Jeff Huber? He is a University of Illinois computer science grad who, among other things, led the teams that developed Google Ads, Google Maps and Google Earth. He also is highly motivated against cancer because his wife died in her 40s of a colon cancer no one saw coming. He gave the commencement talk for the 2016 graduating class at the U of I, and drew a standing ovation after he offered the “find a better way” theme as a mantra for many problems facing societies, including cancer.

Mr. Huber presently serves as a member of the Board of Directors for Grail. Created in 2016, the company’s mission, as in seeking the holy grail, is to make it possible to find cancer early on by sifting through cells in blood to find and identify cancer cells long before a tumor manifests itself.  Grail  is one of a small handful of companies working to bring this type of “liquid biopsy” to mass markets.  Taking advantage of relatively cheap and massive computing power is part of the equation for getting things done against cancer. I’m looking forward to Chicago taking on a larger role in research against cancer, building on decades of marvelous computing work at the University of Illinois. See generally Transforming Science – “Petascale Day” – Celebrating “In Silico” Research and the Blue Waters Supercomputing Project at the National Center for Supercomputing Applications at the University of Illinois.

The March 18, 2019 press release from the Department of Energy is pasted below.


CHICAGO, ILLINOIS – Intel Corporation and the U.S. Department of Energy (DOE) will build the first supercomputer with a performance of one exaFLOP in the United States. The system being developed at DOE’s Argonne National Laboratory in Chicago, named “Aurora”, will be used to dramatically advance scientific research and discovery. The contract is valued at over $500 million and will be delivered to Argonne National Laboratory by Intel and sub-contractor Cray Computing in 2021.

The Aurora systems’ exaFLOP of performance – equal to a “quintillion” floating point computations per second – combined with an ability to handle both traditional high performance computing (HPC) and artificial intelligence (AI) – will give researchers an unprecedented set of tools to address scientific problems at exascale. These breakthrough research projects range from developing extreme-scale cosmological simulations, discovering new approaches for drug response prediction, and discovering materials for the creation of more efficient organic solar cells. The Aurora system will foster new scientific innovation and usher in new technological capabilities, furthering the United States’ scientific leadership position globally.

“Achieving Exascale is imperative not only to better the scientific community, but also to better the lives of everyday Americans,” said U.S. Secretary of Energy Rick Perry. “Aurora and the next-generation of Exascale supercomputers will apply HPC and AI technologies to areas such as cancer research, climate modeling, and veterans’ health treatments. The innovative advancements that will be made with Exascale will have an incredibly significant impact on our society.”

Aurora is expected to be completed by 2021. | Photo: Argonne National Laboratory

“Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer – but also for all of us who are committed to American innovation and manufacturing,” said Bob Swan, Intel CEO.  “The convergence of AI and high-performance computing is an enormous opportunity to address some of the world’s biggest challenges and an important catalyst for economic opportunity.”

“There is tremendous scientific benefit to our nation that comes from collaborations like this one with the Department of Energy, Argonne National Laboratory, and industry partners Intel and Cray,” said Argonne National Laboratory Director, Paul Kearns.  “Argonne’s Aurora system is built for next-generation Artificial Intelligence and will accelerate scientific discovery by combining high-performance computing and artificial intelligence to address real world problems, such as improving extreme weather forecasting, accelerating medical treatments, mapping the human brain, developing new materials, and further understanding the universe – and that is just the beginning.”

The foundation of the Aurora supercomputer will be new Intel technologies designed specifically for the convergence of artificial intelligence and high performance computing at extreme computing scale. These include a future generation of Intel® Xeon® Scalable processor, a future generation of Intel® Optane™ DC Persistent Memory, Intel’s Xcompute architecture and Intel’s One API software.   Aurora will use Cray’s next-generation Shasta family which includes Cray’s high performance, scalable switch fabric codenamed “Slingshot”.

“Intel and Cray have a longstanding, successful partnership in building advanced supercomputers, and we are excited to partner with Intel to reach exascale with the Aurora system,” said Pete Ungaro, president and CEO, Cray. “Cray brings industry leading expertise in scalable designs with the new Shasta system and Slingshot interconnect. Combined with Intel’s technology innovations across compute, memory and storage, we are able to deliver to Argonne an unprecedented system for simulation, analytics, and AI.”

For more information about the work being done at DOE’s Argonne National Laboratory visit their website HERE.

The times, they are a-changing. The other day, I stumbled across the Clinical Robotics Law Journal after being intrigued by a February 20, 2019 article titled: The Healing Touch: Haptic Feedback and the State-of-the-Art Defense. Lots of other interesting article titles too, but I resisted the temptation for all but that one. As a teaser, here’s a paragraph from the article:

“Today, private companies and institutions like UCLA are developing technologies that will one day give surgeons the ability to not only see but also feel during robotic surgeries. One such technology accomplishes this “through  [a] sensor that, when placed on the tips of surgical instruments, would provide feedback [in the form of vibrations, forces and buzzes] on the various forces exerted on body tissues to better guide surgery.”  Given the numerous obvious benefits of physicians feeling the tissue on which they operate, it’s not a stretch of the imagination to assume that in the not too distant future, haptic feedback will be considered a state-of-the-art function in surgical robots.

A tobacco company is the latest entity to use bankruptcy to try to limit its obligations for a mass tort. This instance, however, is different: the bankruptcy is in Canada, ownership of the tobacco entities ties back to Japan, and the bankruptcy follows restructuring efforts that a trial judge viewed as probably illegal fraudulent structures and transfers. This new bankruptcy follows the affirmance of the roughly $15 billion class action verdict against multiple tobacco companies, as previously mentioned here. The story is told in more detail in a March 8, 2019 post at Eye on the Trials.

So many issues lie ahead for litigation involving AI. With that in mind, here’s the abstract from a new paper by the indefatigable Dan Schwarcz and Anya Prince. This is the link to the paper at SSRN.


Big data and artificial intelligence are revolutionizing the ways in which financial firms, governments, and employers classify individuals. Surprisingly, however, one of the most important threats to anti-discrimination regimes posed by this revolution is largely unexplored or misunderstood in the extant literature. This is the risk that modern algorithms will result in “proxy discrimination.” Proxy discrimination is a specific type of practice producing a disparate impact. It occurs when two conditions are met. The first is widely recognized: a facially-neutral characteristic that is relevant to achieving a discriminator’s objectives must be correlated with membership in a protected class. By contrast, the second defining feature of proxy discrimination is generally overlooked: in addition to producing a disparate impact, proxy discrimination requires that the predictive power of a facially-neutral characteristic is at least partially attributable to its correlation with a suspect classifier. For this to happen, the suspect classifier must itself have some predictive power, making it ‘rational’ for an insurer, employer, or other actor to take it into consideration. As AIs become even smarter and big data becomes even bigger, proxy discrimination will represent an increasingly fundamental challenge to many anti-discrimination regimes. This is because AIs are inherently structured to engage in proxy discrimination whenever they are deprived of predictive data. Simply denying AIs access to the most intuitive proxies for predictive variables does nothing to alter this process; instead it simply causes AIs to locate less intuitive proxies. The proxy discrimination produced by AIs therefore has the potential to cause substantial social and economic harms by undermining many of the central goals of existing anti-discrimination regimes. For these reasons, anti-discrimination law must adapt to combat proxy discrimination in the age of AI and big data. 
This Article offers a menu of potential responses to the risk of proxy discrimination by AI. These include prohibiting the use of non-approved types of discrimination, requiring the collection and disclosure of data about impacted individuals’ membership in legally protected classes, and requiring firms to eliminate proxy discrimination by employing statistical models that isolate only the predictive power of non-suspect variables.

Keywords: Proxy Discrimination, Artificial Intelligence, Insurance, Big Data, GINA

Schwarcz, Daniel B. and Prince, Anya, Proxy Discrimination in the Age of Artificial Intelligence and Big Data (March 6, 2019). Available at SSRN:
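The mechanism the abstract describes – a model denied the protected attribute still exploiting a correlated stand-in – can be illustrated with a toy simulation. Everything below is a hypothetical sketch of my own: the variable names (“zip_group”), the probabilities, and the data are illustrative assumptions, not anything from the paper.

```python
import random

random.seed(0)

n = 10_000
rows = []
for _ in range(n):
    # Membership in a (hypothetical) protected class.
    protected = random.random() < 0.5
    # A facially-neutral feature that correlates with the protected class
    # 80 percent of the time (e.g., a coarse geographic grouping).
    zip_group = protected if random.random() < 0.8 else not protected
    # An outcome driven -- in this toy example, entirely -- by the
    # protected class itself.
    outcome = protected if random.random() < 0.7 else not protected
    rows.append((protected, zip_group, outcome))

# A model stripped of the protected attribute can still find it "rational"
# to use the proxy: the proxy alone beats chance at predicting the outcome.
acc_proxy = sum(zip_group == outcome for _, zip_group, outcome in rows) / n
print(f"accuracy predicting outcome from the proxy alone: {acc_proxy:.2f}")

# Because the outcome here depends only on the protected class, all of the
# proxy's predictive power flows through that correlation -- the defining
# feature of proxy discrimination in the authors' sense.
```

Removing the proxy itself would not end the problem; as the abstract notes, a sufficiently flexible model would simply locate the next-best correlated feature.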


The $15 billion or so Quebec class action verdict in Canada against tobacco companies was upheld late on Friday in a 440-page opinion, in French. It will be interesting to watch the reactions of stock markets, and to learn more as commentaries and English-language translations are distributed. The Eye on the Trials blog is an excellent source of information about the case.



What if Alexa went to law school? That’s the interesting headline used to tee off some exchanges about AI and changes to Lexis/Nexis products, including legal research and court dockets. This February 11, 2019 post at Dewey B Strategic is worth reading for some glimpses into the past and what’s ahead; it is titled: Lexis Prepares to Launch a Research Bot – And a CourtLink Makeover.

“Protection gaps” are one of the results of the failure of state regulation of most forms of insurance. The gap problem – as it exists for property insurance – will be discussed in detail at an upcoming conference at Rutgers Law School, with an interesting and economically diverse set of business community speakers, and truly astute professors and lawyers. Wish I could be there. See the agenda pasted below. To register, go to


“A conference on

The Protection Gap in Property Insurance

Friday, March 29, 2019

Rutgers Law School, Camden, New Jersey

The protection gap is the difference between losses that are insured and losses that could or should be insured. The Rutgers Center for Risk and Responsibility at Rutgers Law School Conference on The Protection Gap in Property Insurance will address the protection gap in residential and commercial property losses and related types of losses in the United States.

The property insurance protection gap can have significant impact on individuals and communities; a property owner who does not have flood insurance may lack the resources to rebuild after a hurricane, for example, and if many property owners lack insurance, an entire community may be hard-pressed to recover.

The concept of a protection gap raises several issues:

What is a protection gap? What protection gaps exist in property insurance and what causes them? Some examples:

An entity is entirely uninsured or insurance is unavailable. This is rare in property insurance in the US, with the notable and high-profile exception of Puerto Rico, which came to light after Hurricane Maria.

Insured, but certain perils not covered. Homeowners insurance policies exclude coverage for losses caused by natural disasters such as flood or earthquake, and many homeowners fail to purchase available catastrophe insurance.

Under-insured. Three of every five homes in America are underinsured, by an average of 20 percent of full value, according to analytics firm CoreLogic.

Other exclusions or restrictions on coverage. Many homeowners and commercial property policies contain hurricane deductibles or windstorm deductibles, restrictive loss settlement provisions, or other limitations of which policyholders may be unaware.

What solutions are there for protection gaps?

Some examples:

Legislators and regulators can require information disclosures and prescribe policy terms to ensure adequate coverage. In the wake of the California wildfires, the legislature enacted a series of reforms aimed at improving consumer understanding and securing better coverage for homeowners.

Insurers and intermediaries can innovate products and marketing and can reduce costs to increase availability of coverage and consumer awareness. Insurtech, on-demand insurance, and parametric insurance are being offered as solutions to protection gaps.

Speakers include:

Michael Childress, Childress Loucks & Plunkett

Tom Considine, National Conference of Insurance Legislators

Jay Feinman, Rutgers Law School

Laura Foggan, Crowell & Moring

Nicholas Insua, Anderson Kill

Peter Kochenburger, University of Connecticut School of Law

R.J. Lehmann, R Street Institute

William F. “Chip” Merlin, Jr., Merlin Law Group

Sherilyn Pastor, McCarter & English

Michael Saltzman, Goldberg Segalla

Adam Scales, Rutgers Law School

Daniel Schwarcz, University of Minnesota Law School

Robert Schindler, Rutgers School of Business-Camden

Rick Swedloff, Rutgers Law School

Paul Tetrault, Insurance Library Association of Boston

Sandy Watts, United Policyholders

Harold Weston, Georgia State University

CLE credit available.

To register: