I’ve been on a hiatus from blogging. To those of you who missed having quizzes today and last Friday, I apologize. The #fiveforfriday quiz will return next week.
Also next week, I intend to explore the distinction between conduct that should be considered “unethical” and conduct that “violates the rules as they are written.” I’ll do so through the lens of New Jersey’s recent advisory opinion on Avvo’s Legal Services Program. For an excellent primer on the topic, check out this post from Above The Law.
Now, on to chatbots.
A few months ago, I asked whether robots are nonlawyer assistants. Referring to the idea that lawyers have a duty to ensure that their nonlawyer assistants comply with the rules, I wrote:
- “As I’ve often said, Rule 1.1’s duty of competence includes tech competence. Read together, do Rules 1.1 and 5.3 require lawyers who use robots to have some sort of understanding of the coder’s qualifications? Perhaps we will eventually treat the purchase of robots as we do the selection of a cloud vendor and hold that ‘a lawyer must take reasonable precautions in choosing a robot that will perform mundane legal tasks.'”
It probably seemed far-fetched.
Today, Robert Ambrogi’s LawSites blog posted DoNotPay Launches Service to Let Anyone Create a Legal Bot. Essentially, the service allows a lawyer to create a robot assistant.
As I’ve often said, do not fear technology. Advances in technology are not inherently unethical. Think of the ways you use technology today that were not imaginable, or available, 20 years ago.
For example, compare (1) a secure email with a password-protected attachment and a read receipt to (2) a stamped letter, dropped off at 5:27 p.m. on a Friday, with return receipt requested.
Would you rather that the rules be interpreted so as to require use of the U.S. Mail?
Technological advances create opportunities for lawyers and firms to operate more efficiently and to provide wider access to more affordable legal services.