AI Sandbox offers a testing ground for judges and court staff to explore technology

January 15, 2025 -- Courts curious about generative AI now have a secure platform to test-drive this emerging technology.

Like ChatGPT and other tools built on large language models, NCSC's new AI Sandbox allows users to explore how generative AI works and how courts can use the technology to improve productivity, efficiency, and customer service.

The AI Sandbox, a product of the Thomson Reuters Institute (TRI)/NCSC AI Policy Consortium for Law and Courts, is a secure and private cloud environment exclusively for judges and court staff. Accessible only via the NCSC portal, this dedicated space ensures the privacy and security of all AI experiments.

"All the data input and information generated stays within the system," explained Andre Assumpcao, an NCSC data scientist and sandbox engineer. "Some courts are hesitant to use public AI tools or aren’t sure where or how to start. Our private tool is a safe alternative to use to experiment with ideas and explore AI possibilities."

How it works

The AI Sandbox uses the same large language model technology as other public AI tools. Once logged into the sandbox, users can start a new chat or use a templated statement or question from the prompt library.

All chat interactions are automatically saved and available in the chat history, allowing users to return to a previous conversation and keep chatting. Users can also share their chats with others.

Additionally, the AI Sandbox can summarize or answer questions about uploaded documents—including PDFs, Word files, and images of text—and provide citations for easy reference.

Users should enter only publicly available information and avoid sensitive case details when testing in the private tool. Assumpcao said this is good practice in both testing and real-world environments.

AI use cases in courts

Courts are currently using AI for various internal procedures, including:

  • Creating job descriptions;
  • Drafting court orders;
  • Providing access to policies and procedures through internal-use chatbots;
  • Delivering information to litigants through public-facing chatbots;
  • Retrieving and docketing information from filed documents; and
  • Processing invoices.

Assumpcao said courts considering AI should look for opportunities for low-risk improvements to internal procedures and always keep a human in the loop to check the work and verify the accuracy of information generated by AI technologies.

To learn more about some of these use cases, watch the Dec. 18 AI in Action: Current Applications in State Courts webinar and register for the Jan. 29 webinar, Tech for All: Applications of AI to Increase Access to Justice.

For additional information on NCSC's AI-related work, visit ncsc.org/ai.