Can Robots Be Sued? Shook's AI Report Surveys In-House Counsel on Liability Concerns

It’s a brave new world for lawyers navigating the potential uses and risks of artificial intelligence. The phrase is, of course, a nod to Aldous Huxley’s 1932 dystopian novel set in a futuristic world. Today, scientific discoveries are unfolding in real time, making life easier through devices on our phones and in our homes. For consumers and businesses, the future is arriving quickly; for attorneys, the lack of legal precedent represents both a challenge and an opportunity.

Shook, Hardy & Bacon’s AI Task Force sheds some light on this changing landscape in a new white paper co-authored with ALM Media. Unlike other efforts focusing on AI-driven legal applications, Shook’s report is based on a survey of in-house counsel in industries directly affected by consumer-facing AI applications, including the automotive, life science and manufacturing sectors.

“As the business needs of a company push for greater efficiency and effectiveness resulting from implementation of artificial intelligence, the in-house counsel role will need to grow,” stated Shook Intellectual Property & AI Partner Cory Fisher. “Not only to understand the underlying technology, but also the new avenues of potential liability from supplanting traditional actors with AI.”

Shook’s AI Task Force takes a unique approach to helping companies anticipate and resolve liability challenges related to AI application. Unlike a traditional law firm practice group, the task force strategically aligns attorneys across a variety of disciplines, including intellectual property, privacy and data security, product liability, class action litigation, and public policy.

In a February survey of general counsel, 58 percent of respondents said their company plans to expand its use of AI in the near term.

The survey also revealed respondents’ top concerns:

  • Privacy and security was the top concern, cited by 60 percent of respondents.
  • Lack of regulation and standards followed at 45 percent.
  • Lack of legal precedent was cited by 33 percent.
  • IP and patent prosecution was cited by 29 percent.
  • Product liability was cited by 24 percent.

Fisher and his colleague, Shook Public Policy Partner Cary Silverman, speak and write frequently on the legal ramifications of artificial intelligence and its impact on business and general counsel. Silverman notes that the first lawsuit against an automaker over an incident involving an autonomous test vehicle was merely a four-page complaint alleging negligence, subjecting the manufacturer to a reasonable-person standard. While AI legal challenges are relatively new, there is an abundance of case law for evaluating the liability a manufacturer or owner could face if an injury occurs.

“They can look not only to the principles of product liability law, but also to agency law and even the law of pets for models of how the law imposes responsibility and places constraints on when a person or business is responsible for the actions of a third party who makes its own decisions,” said Silverman.

Silverman points out that AI might be more appropriately considered “augmented intelligence” than “artificial intelligence,” because the technology is a tool that helps people analyze data and make faster decisions. So, can robots be sued?

“Nobody can predict the future,” said Silverman. “But we definitely live in interesting times.”