
AI disclosure is quickly becoming a brand risk problem, not just a legal one. When J. Crew used AI-generated imagery in its collaboration announcement with Vans, audience backlash was swift. Examples like this show that brand trust in the AI age is fragile, and the stakes are high when reputation directly affects the bottom line.
Navigating these new community dynamics requires a mix of traditional brand strategy and practical knowledge of digital rights. Jaime Schwarz, Founder of brand consultancy Brand Therapy and an adjunct professor of marketing and PR at the City College of New York, is actively studying and advising around these changes. After senior creative roles at agencies including BBDO and gyro, he founded MRKD.dj, an automated licensing and rights-management platform that recently partnered with DigiRamp to simplify music rights and royalties. As the author of The IP Economy, Schwarz doesn't see AI as a crisis. It's simply a structural reality that requires new operational guardrails.
"A brand is the value of trust between a consumer and a company. Breaking that trust is really, really easy to do, but really hard to build back up. It's way more of a threat than any competition ever could be," Schwarz says. The sudden rollout of generative tools, for one, forces brands to rethink how they protect their assets now that fans can easily remix brand content. Just look at Disney ending its relationship with OpenAI's Sora. Even carefully guarded intellectual property is difficult to manage in open AI environments. Some corporate teams are taking a wait-and-see approach as the first waves of AI copyright lawsuits move through the courts. Schwarz advises leaders not to wait for every legal question to be answered. Instead, build around what is already settled, like court rulings that purely AI-generated works cannot be copyrighted, while staying informed on ongoing litigation.
Dropping the gavel: "When people ask what to do because nothing has been settled and it feels like a world of chaos, I have to ground everybody by saying the Supreme Court ruled that AI cannot own IP. Imagine what the world would look like if that was not settled. Let's live within that space and focus on what we can build where there is value." In other words, AI-generated content cannot be copyrighted, meaning no tool or company can claim ownership over what its model produces. For brands, that's a meaningful safeguard in an otherwise amorphous legal landscape.
To maintain that protection in practice, brands need clear internal guardrails around how AI-generated materials are created and when use is disclosed. In Schwarz's view, one of the most pressing risks for modern brands is not a specific platform, but hypocrisy that derails consumer trust. Moving too fast with new tools invites mistakes, like releasing AI-generated work without properly acknowledging it. To avoid breaking that trust, agencies need clear internal disclosure practices that signal when and how AI is involved.
The hypocrisy tax: "Social media made hypocrisy the top problem for brands. In the age of AI, we are not telling the truth to each other about what we are making. That is going to catch up with us," Schwarz warns. The brands that come out ahead, he says, will be the ones building clear protocols now for when and how to disclose AI involvement before regulators or consumers force the conversation.
The speed of AI adoption is also reshaping how brand teams are structured internally. Schwarz describes a new internal role he calls "brand stewardship." As organizations adopt new AI advertising production tools, stewards are tasked with maintaining a consistent internal compass when AI is part of the content pipeline. They vet AI tools not just for technical bugs, but also for outputs that might warp brand voice, safety, or cultural perception.
The voice of conscience: Schwarz says brand stewardship goes beyond compliance. It's a dedicated internal function designed to keep brands authentic as AI reshapes the content pipeline. "Brand stewardship is a new kind of role inside a company where you are really tasked with keeping the brand on track because it keeps shifting. It's there internally for the company to keep everybody on brand. It becomes that voice in the back of your head, the voice of conscience."
Schwarz argues that the strongest defense against AI-related brand risk is clarity around what the brand stands for in the first place. That reduces the need to police how employees interact with AI tools. In his Brand Therapy practice, he works with leadership teams to clarify why the brand exists and what it stands for, then uses that internally as a way to build culture. If teams understand and believe in the brand’s purpose, they are more likely to use new technologies in ways that reinforce that identity.
Passing the baton: Making mission and identity crystal clear changes how leaders lead, too. Schwarz says the best results come not from oversight, but from building genuine alignment within your team. "As a leader, you know you are not the boss. You work for the brand and the company. You are a steward holding the baton for now. And when it comes to your team, you are going to get the maximum out of somebody with intrinsic motivation. Are we aligned? Then there is no more oversight. It is trust."
The question of who controls brand identity is also shifting as fans increasingly dictate how a brand is perceived. This means maintaining credibility requires directly engaging communities rather than simply pushing product to them. Brands used to compete mainly on product features, but with the rise of social media, many have shifted toward purpose-driven identities. Now, some campaigns are entering a "tribal" stage, where brands plug into communities that already exist, like gaming, fandoms, or local scenes, and play a supporting role rather than trying to own the space outright. As Gen Alpha pushes for more in-person and spatial experiences, this often means partnering with creators who already have credibility.
As AI gives fans more power to remix and co-create brand content, the brands best positioned to benefit are the ones who aren't trying to own the conversation. Schwarz points to Tastiez's collaboration around tabletop gaming, which features a sweatshirt with tear-off sleeves that doubles as napkins. The idea is a simple, playful acknowledgement of the fact that on game night, people want snacks without greasy hands on their cards. The idea works because it maps to a real behavior inside a culture rather than inserting the brand at the center of it. As fan-driven AI film projects and automated video spots multiply, that same instinct applies: give creative freedom to your audience and build around complementary brands and shared ecosystems rather than competing for the center of the frame.
Main character syndrome: "The first lesson to every client is that you are not the big deal. Your product is not in service of itself. It is in service of the night overall, and you have one role within that. If you recognize it is a role of a greater service that all of your complementary brands are working on together, the thing you can tap into is so much stronger."
Building IP infrastructure, internal culture, and tribal participation may feel like separate challenges, but Schwarz sees a common thread: all three demand better operational discipline as the relationship between brand and AI evolves. Right now, rules around AI generation and IP co-creation remain loose. Schwarz expects the market will eventually mature and standards will solidify. Building good transparency habits early is simply cheaper and easier than doing damage control later. For marketers and agencies, AI can't be viewed as just a shortcut to more content; it's a crucial place where brand trust is earned or lost.
"We are in a lovely wild west right now where everybody gets a free pass, but that window is going to close," Schwarz says. "So be true now. You do not know when the door is going to close. You do not want to be caught on the other side. Learn how to build trust now."