Callidus is the most advanced legal AI platform. Offering a wide range of support for both litigation and transactional workflows, Callidus helps legal professionals drive better outcomes with increased efficiency. Callidus keeps the lawyer in the loop with interactive and highly visual workflows, and none of our solutions require more than 5 minutes of setup time.
The AI-driven platform transforming mass tort case evaluations and settlements. Our software analyzes thousands of medical records in minutes, swiftly categorizing documents and surfacing key information to streamline case reviews and preparation for MDL settlements.
LAER AI is driven by a singular vision of radically transforming the experience of search and helping organizations find meaning and patterns hidden behind volumes of disparate and unstructured data. Its current product provides a significantly more accurate, faster, and cost-effective solution to the expensive document review process during litigation, investigations, and compliance.
descrybe.ai is a singular way to search for, and understand, caselaw. Our unique process leverages generative AI to make complex legal information more accessible to professionals and laypeople alike. We are laser-focused on easy access to caselaw research by lowering cost (it's free!), increasing ease of use, and employing natural language search and summarization capabilities.
Responsiv is a legal and compliance automation platform that takes routine administrative tasks and delivers ready-to-review first drafts of work product. By integrating with your organization's systems, Responsiv analyzes regulatory changes and performs a gap analysis on your existing controls, policies, and procedures, suggesting necessary changes.
Lexlink AI revolutionizes legal document analysis by automating the identification of inconsistencies and discrepancies, enhancing litigation strategies. Key features include advanced inconsistency detection and automated discovery drafting, streamlining processes and saving time. Committed to transforming the legal industry, Lexlink AI ensures data privacy and champions innovation, making proceedings more efficient and fair.
The new GenAI Profile reflects NIST's recommendations for implementing the risk management principles of the AI RMF specifically with respect to generative AI. This guidance is intended to assist organizations with implementing comprehensive risk management techniques for specific known risks that are unique to or exacerbated by the deployment and use of generative AI applications and systems.
Image-generating technology is accelerating quickly, making it much more likely that you will see "digital replicas" (sometimes referred to as "deepfakes") of celebrities and non-celebrities alike across film, television, documentaries, marketing, advertising, and election materials. Meanwhile, legislators are advocating for protections against the exploitation of name, image, and likeness while attempting to balance those protections against the First Amendment rights creatives enjoy.
Aescape offers AI-powered robotic massages via its “Aertable,” which uses real-time feedback and a body scan system to deliver personalized experiences. With features like “Aerpoints” simulating therapist touch and “Aerwear” enhancing accuracy, Aescape addresses the massage industry’s challenges, such as inconsistency and therapist shortages. While expanding rapidly, it raises legal issues including liability, privacy, licensing, regulation, and IP concerns.
Large language models rely on vast internet-scraped data, raising legal concerns, especially around intellectual property. Many U.S. lawsuits allege IP violations tied to data scraping. An OECD report, Intellectual Property Issues in AI Trained on Scraped Data, examines these challenges and offers guidance for policymakers on addressing legal and policy concerns in AI training.
The integration of artificial intelligence (AI) tools in healthcare is revolutionizing the industry, bringing efficiencies to the practice of medicine and benefits to patients. However, negotiating agreements for third-party AI tools requires a nuanced understanding of each tool’s application, implementation, risks, and contractual pressure points.
Parents of two Texas children have sued Character Technologies, claiming its chatbot, Character.AI, exposed their kids (ages 17 and 11) to self-harm, violence, and sexual content. Filed by the Social Media Victims Law Center and Tech Justice Law Project, the suit seeks to shut down the platform until safety issues are addressed. It also names the company’s founders, Google, and Alphabet Inc. as defendants.