How Stetson Leads the Way in AI & the Legal Profession

FLORIDA RECORD

Wednesday, December 18, 2024



On campus and beyond, the Stetson Law community leads in this new frontier.

From Crummer Courtyard to the gym to the Dolly & Homer Hand Law Library, the Stetson Law campus in Gulfport is buzzing with conversations about how artificial intelligence affects what law students – and legal professionals – do.

Kristen Moore ’09

Students and faculty alike have sentiments ranging from skepticism to excitement. Yet all agree that AI is here to stay and it is having a transformative impact on higher education and the legal field – especially on those who teach or study law, for whom research and writing are integral.

“As lawyers are using it, it becomes part of our job to teach how to use it responsibly and to help our patrons, our students, and our faculty use it,” said Kristen Moore ’09, instructor in law and associate director of the Dolly and Homer Hand Law Library.

Starting in 2025, Moore will co-teach a course called Advanced Legal Research: Artificial Intelligence and Legal Research with Reference and Student Services Librarian Angelina Vigliotti ’22.

It is one of many Stetson Law courses that directly address AI, its ethical implications, and how to assess the writing, research, and other skills that are critical to lawyering in a world where generative AI is ubiquitous. In spring of 2025, the Stetson Business Law Review will devote its entire symposium to the topic.

New tech, swift action

Dr. Kirsten Davis, Stetson Law’s faculty director of online legal education strategies and a nationally recognized expert in legal communication, jumped at the chance to explore AI’s implications for law schools.

“This was the moment I’ve been waiting on my whole career: the opportunity to study technology that might represent a paradigmatic shift for writing and researching the law,” Davis said.

Law Professor Kirsten Davis

Realizing the likely impact generative AI was about to have on legal education – especially legal writing – Davis engaged colleagues in what became a national conversation. She began teaching her first AI & the Law class in fall of 2023. In spring of 2024, Stetson University named her Provost Faculty Fellow for Generative Artificial Intelligence and Legal Education. She has also been recognized by the American Bar Association and the Association of American Law Schools as a thought leader in the space. Her work has just begun.

“We are developing foundational policies to keep our community members safe with AI use,” Davis said. “We are starting training for faculty on sensible AI use in the classroom.”

A team of Stetson Law faculty is also participating with the American Association of Colleges and Universities in a year-long institute to explore ways to approach generative AI in the college’s curricula and courses, she said.

Preparing students for a brave new world

As alumni who come into daily contact with students immersing themselves in legal texts, Moore and Vigliotti know the Stetson Law student experience well – particularly the emphasis on ensuring students are practice-ready upon graduating and passing the bar. This is why they find it encouraging that the college is engaging proactively with AI.

“As these new research tools come up, we’re thinking about when the students go into practice, whether they’re soon to be alumni or might be working for alumni,” Vigliotti said. “Stetson’s really trying to respond.”

Student Services Librarian Angelina Vigliotti ’22

While acknowledging that many students are already using it, Moore and Vigliotti stress that generative AI is a tool – by no means a way to evade the deep cognitive work at the heart of lawyering. They want to invite students to see how it compares and contrasts with traditional research methods – and how the two can work synergistically.

“As librarians, we’re lovers of information and finding information,” Moore said. “It’s another tool to use in that process.”

An obvious challenge is determining whether an assignment turned in by a student was entirely the product of ChatGPT.

Law professors have long used legal writing assignments to evaluate not just communication skills, but also important underlying lawyering skills like legal reasoning. Students’ use of generative AI to write papers may upend that assumption, Davis said, which could mean that new ways of assessing those skills need to be developed.

“As I tell my students, ‘You need lawyer intelligence to use artificial intelligence,’” she said. “The challenge now is to know how to assess those intelligences.”

While it’s possible that a graduate-level student would try to pass off an AI-generated paper as their own work, Vigliotti said students are aware of the consequences.

“Students know their name is attached to that work product,” Vigliotti said. “They’re not willing to jeopardize their career, their quality of work, or their integrity.”

In her work as V.P. and Deputy General Counsel for Turnitin, which develops software to help educators detect AI writing and plagiarism in student work, Kristen Chittenden ’10 became so fascinated by the question of how AI affects legal education that she developed a course around it. She began teaching it at Stetson Law as an adjunct in the fall of 2024.

Kristen Chittenden ’10

“In my class, I want them to use AI,” Chittenden said. “I want to see how they’re using it because I would rather learn how they’re thinking about it in a transparent way and what their process is in using it.”

Exploring the technology in an educational setting allows students to fully grasp its benefits and drawbacks, with no consequences for a client.

“Obviously AI doesn’t have genuine understanding and empathy or ethical judgment, so there is the potential for harm or unfair outcomes as a result of that,” she said. “There can also be bias and fairness issues that can be amplified if you don’t have oversight in how you’re using it.”

As the technology – and its use in law firms, courtrooms, and beyond – evolves, Chittenden said she can see classes in the future that are fully devoted to ethics and professional responsibility in the use of AI, to AI rules and regulations, and classes that branch off into specific areas of law, like criminal law or human resources law. “Educating the lawyers of tomorrow in AI and what it can do is critical,” Chittenden said. “Every law school should be thinking about a class in AI.”

How legal professionals use it

AI is now commonly used for tasks such as document drafting, legal research, and even predictive analytics for case outcomes, said Judge Michael Bagge-Hernandez ’07 of Florida’s 13th Judicial Circuit in Hillsborough County. Bagge-Hernandez has been writing and speaking on AI’s uses within the legal profession and its ethical implications.

Some practitioners also use it in discovery responses to quickly identify relevant documents and information, and even in client communications, he added. Applied thoughtfully, it can free up lawyers’ time to focus on the duties at the core of what it means to be a lawyer.

Hon. Michael Baggé-Hernández ’07, Florida’s 13th Judicial Circuit

“The advent of AI in legal practice is comparable to other major technological shifts, as it allows attorneys to allocate more time to strategic decision-making and client engagement—areas that require the human touch and deep legal expertise,” Bagge-Hernandez said.

Matt Hitchcock ’07, associate general counsel for the insurance company Vault, said he uses it for tasks like brainstorming, producing rough drafts, evaluating documents, and reviewing and summarizing data.

“As an in-house attorney, you often don’t have a support staff, so having a digital assistant that can draft a letter while you do other work is very helpful,” he said.

Although AI is getting better every day, Hitchcock recommends reviewing its output through the same lens one would apply to a junior attorney’s work.

Stetson Law 2007 alumnus Matt Hitchcock

“You should have a base level of trust (if not, there is a problem that needs to be addressed) but you should also review the output with a critical eye because sometimes the AI will get it wrong,” he said. “It’s critical that the attorney using the AI tool has the legal knowledge and skill to know when the AI is getting the law incorrect because at the end of the day, AI is just a tool and the attorney has a professional obligation to their client to ensure they’re providing good legal counsel.”

Knowing where AI can help – and its limits – is critical. It would not be wise, Chittenden said, to let it take over the Shepardizing process completely or generate an entire document, given its tendency to hallucinate. “Can it help you brainstorm ideas that you can then evaluate? Absolutely. Can it help you outline ideas and get organized? One hundred percent. Can it help you with counterpoints? Absolutely,” Chittenden said.

The risks of reticence

While it’s still possible for lawyers to conduct research using traditional methods, like consulting physical law books, the reality is that tools like Westlaw, LexisNexis, and Fastcase have made this approach nearly obsolete.

“Practitioners who resist adopting AI risk being left behind in a rapidly evolving legal landscape,” Bagge-Hernandez said. “It’s no longer practical – or billable – for an attorney to spend hours sifting through books when these advanced platforms can deliver precise results in a fraction of the time.”

He added that in cases of malpractice or billing disputes, the standard practices expert witnesses describe for legal research may soon include using AI for certain tasks – just as they currently assert that a competent attorney uses tools like Westlaw or LexisNexis rather than physical books.

Those who hesitate to adopt these tools may end up at a disadvantage that could jeopardize their competitiveness in the field.

Bagge-Hernandez cautions that lawyers “must balance this efficiency with ethical considerations, ensuring that AI is used as a tool to enhance, rather than replace, the critical thinking and judgment that define the legal profession.”

He recommends establishing clear policies for AI use that address transparency with clients about its role in their cases – and stress that AI is a tool that supports but does not replace a lawyer’s expertise and judgment.

One question Davis is frequently asked by reluctant legal professionals is whether artificial intelligence will eliminate the need for lawyers. 

“The discussion of the end of lawyering has been going on for decades,” Davis said. “Generative AI changes the calculus, but I think clients and the legal system will always need careful, smart lawyers with good judgment and even better ideas who can make good use of technology.”

Start here: Recommended resources

Information about AI may seem nebulous and ever-evolving, but legal professionals who are thought leaders in the AI space recommend keeping up with its implications for lawyering via several trusted sources.

Professional organizations including the Florida Bar, the New York Bar, and the American Bar Association (ABA) have offered guidance on AI use for legal professionals – and entities like Lexis and Westlaw are also producing timely articles on the matter.

Hitchcock and Bagge-Hernandez both recommend reviewing the Florida Bar’s Ethics Opinion 24-1 in particular. The ABA Model Rules concerning the duty of competence (Rule 1.1) and responsibilities regarding nonlawyer assistance (Rule 5.3) also offer guidance on the sound use of AI.

“Lawyers must establish clear policies that include transparency with clients about the use of AI, rigorous oversight of AI-generated work, and ongoing education to stay informed about the evolving capabilities and ethical implications of AI,” Bagge-Hernandez said.

