
It’s imperative to also apply your own expertise and critical thinking skills to AI results.
Jill Blood, a lawyer who serves as vice president and deputy general counsel at Maritz, jokingly calls herself “the resident wet blanket” when it comes to AI, she told Digital Media Editor Magdalina Atanassova during a Convene podcast conversation about AI and legal issues. It’s not that Blood doesn’t appreciate or use AI. “I think it’s going to change the world once we really know how to harness its power,” she said. “But there are some legal and ethical risks.”
Among them are concerns about using AI in a way that protects your own intellectual property and data and that of others, she said. “We’re not going to use AI to plagiarize somebody else’s work. And even if it’s helpful, we’re not going to use AI in a way that compromises guest data.” It’s important to understand how specific AI tools use information to prevent mishandling sensitive data, like housing lists, Blood said. “If you are using an enterprise tool that’s locked down, one that doesn’t use your data to train [an AI model], it’s probably pretty similar to using that data within your own internal systems. But if you want to put something into a public-facing generative AI tool that you’re using for free, you have to be really thoughtful about what that looks like.” In general, “the more sensitive the information, the more proprietary the information, the more careful you want to be about what tool you’re using, how safe it is, and how you’re using it,” she said.
Planners also should be “very, very aware” of inherent bias in gen AI tools, which reflect and amplify the biases found on the internet, Blood said. Laws in some states regulate use of AI in HR functions, including screening job applications, because of the potential for discrimination, she said, and AI also can generate biased risk assessments. Meeting professionals also should be aware of the European Union Artificial Intelligence Act, Blood said. The comprehensive law — which will come fully into effect in 2026 — has applicability outside of the EU if you have EU attendees or exhibitors, and “it’s pretty restrictive,” Blood said.
With both the technology and regulations around its use changing so fast, rather than creating hard and fast rules around the use of AI at Maritz, “what we’ve done is establish a series of guidelines,” Blood said. “They include things like: We want to have fair employment practices, we’re going to respect confidentiality, we’re going to make sure information is secure.”
In the events industry, Blood said she hasn’t yet seen much detail on how event organizers will communicate to participants that AI is in use and how it will be used. “And I think we’re going to see that come down in the next few years — especially as more and more tools for things like engagement emerge. I think as an industry we’re going to have to land on how are we disclosing that, how are we managing it, and what does that look like? I don’t think we’ve quite figured that out yet.”
Another bit of advice: “Don’t be afraid of AI, but be a little bit afraid of AI,” Blood said. “Don’t be so afraid that you’re not exploring it, that you’re not thinking about the possibilities, that you’re not tapping into what it can do.”
Use AI — and Your Brain
Maritz’s Jill Blood is a lawyer, which, she said, makes her “sort of personally obligated to say that AI is not something that you should use to replace the need to consult a lawyer for your initial contract templates for important contracts” — meeting professionals “definitely want to be bringing in your lawyer.” That said, she added, “I think that AI can be really powerful for idea generation and for helping event planners think through the implications of different scenarios.” For example, ask AI to suggest clauses you might want to include for an event held during hurricane season or for attrition, she suggested. But it’s imperative to also apply your own expertise and critical thinking skills to the results, she said — “use your brain.” And check AI for accuracy and bias. “I would never take an emergency preparedness plan straight from a generative AI tool and just adopt it.”
Barbara Palmer is deputy editor at Convene.
On the Web
- Listen to the Convene podcast episode, “AI and Event Contracts: Legal Risks, Compliance, and Ethical Best Practices with Jill Blood,” and more conversations in our series about AI and business events.
- Learn about Spark, the AI tool developed for event professionals by PCMA and Gevme, at pcma.org/spark.