AI agents increasingly act as autonomous creators and distributors of content. Garandor's infrastructure is relevant to the agent ecosystem because it provides a method for agents to sign their outputs, ensuring that synthetic media can be traced back to its source. This is vital for maintaining trust in agentic systems and preventing the unchecked spread of unattributed AI content.
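As a minimal illustration of what signing agent outputs can look like (a generic keyed-hash sketch, not Garandor's actual scheme), an agent could attach an HMAC tag to each piece of content so a downstream verifier can confirm which agent produced it and that it has not been altered. All names and the shared-key setup here are hypothetical; production systems would use managed keys or public-key signatures:

```python
import hashlib
import hmac

def sign_output(content: bytes, agent_id: str, key: bytes) -> dict:
    """Attach a keyed hash binding the content to the producing agent."""
    tag = hmac.new(key, agent_id.encode() + content, hashlib.sha256).hexdigest()
    return {"agent_id": agent_id, "content": content.decode(), "signature": tag}

def verify_output(record: dict, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(
        key,
        record["agent_id"].encode() + record["content"].encode(),
        hashlib.sha256,
    ).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

key = b"shared-secret"  # placeholder; real deployments would rotate managed keys
record = sign_output(b"a generated paragraph", "agent-42", key)
assert verify_output(record, key)
assert not verify_output(dict(record, content="tampered text"), key)
```

Any edit to the content or the claimed agent identity invalidates the tag, which is the basic property a provenance layer needs.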
Furthermore, for agents operating in enterprise environments, Garandor acts as a guardrail. By screening agent outputs for intellectual property violations, the platform allows developers to deploy agents that interact with sensitive or copyrighted materials without increasing legal exposure. As agents move toward multi-modal capabilities, Garandor's focus on robust, invisible watermarking for different model types becomes a foundational part of the agent safety stack.
Garandor is an infrastructure startup focused on the legal and technical risks of generative AI. Based in Edinburgh, the company builds tools to track content ownership and detect intellectual property infringements. Their core offering is a suite of digital watermarking models and IP screening services designed for the current era of high-volume synthetic media production.
The problem Garandor addresses is the tension between generative models and copyright holders. As large language models and image generators proliferate, the risk of accidental IP leakage or intentional counterfeit production has become a primary concern for enterprise adoption. Garandor provides a way to mark AI-generated content invisibly, allowing creators to prove ownership and companies to monitor how their brands are represented in the generative ecosystem.
The company emerged from a research-heavy background. It maintains a close relationship with the University of Edinburgh's Generative AI Lab. Founders Matyas K. Zsoldos and Bence Szilagyi have structured the team around researchers and engineers with backgrounds in distributed systems and AI safety. This research-first approach is reflected in their participation in workshops with Google DeepMind and INRIA.
Garandor's platform includes capabilities for training data de-risking and output IP screening. For organizations training their own models, the company helps identify and remove copyrighted material from datasets before training begins. For those deploying models, they offer real-time screening to ensure the model does not generate outputs that violate existing trademarks or copyrights. This is increasingly a requirement for compliance with the EU AI Act, which mandates transparency and risk management for foundation models.
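To make the screening idea concrete, here is a deliberately simplified sketch of output IP screening, assuming a hypothetical list of protected terms. Real systems would rely on fuzzy matching, perceptual similarity for images, and legal metadata rather than a plain word list:

```python
import re

# Hypothetical protected-term registry; a real deployment would load this
# from a trademark database, not hard-code it.
PROTECTED_TERMS = {"AcmeCorp", "SuperWidget"}

def screen_output(text: str) -> list:
    """Return protected terms found in a model's output.

    Matches whole words, case-insensitively, so 'superwidget' is flagged
    but 'widgets' is not.
    """
    hits = []
    for term in PROTECTED_TERMS:
        if re.search(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            hits.append(term)
    return sorted(hits)

flagged = screen_output("Try the new superwidget today!")
# flagged == ["SuperWidget"]
```

A deployment would run a check like this (or a far richer one) between model generation and delivery, blocking or rewriting flagged outputs before they reach users.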
Rather than building a proprietary silo, Garandor aligns with emerging industry standards. They are members of the Coalition for Content Provenance and Authenticity (C2PA) and the Content Authenticity Initiative (CAI). These groups work on open standards for digital provenance, and Garandor’s tools act as an implementation layer for these standards within AI workflows.
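To show roughly what such an implementation layer carries, the sketch below builds a simplified, C2PA-inspired provenance record. Real C2PA manifests are cryptographically signed JUMBF binary structures embedded in the asset; this JSON-shaped version only illustrates the kinds of fields involved (a claim generator, a content-hash assertion, and an actions assertion):

```python
import hashlib
import json

def make_manifest(asset: bytes, generator: str) -> dict:
    """Build a simplified, C2PA-style provenance record (illustrative only)."""
    return {
        "claim_generator": generator,
        "assertions": [
            {
                "label": "c2pa.hash.data",
                "data": {"alg": "sha256",
                         "hash": hashlib.sha256(asset).hexdigest()},
            },
            {
                "label": "c2pa.actions",
                "data": {"actions": [{"action": "c2pa.created"}]},
            },
        ],
    }

manifest = make_manifest(b"\x89PNG...", "example-generator/1.0")
print(json.dumps(manifest, indent=2))
```

The hash assertion lets a verifier detect whether the asset was modified after the manifest was issued, while the actions assertion records how the asset came to exist.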
The company is part of the NVIDIA Inception program and the Google for Startups Cloud Program. These partnerships suggest a focus on high-performance infrastructure, since watermarking and screening at scale require significant compute resources. Their client base typically consists of AI startups looking to integrate into enterprise environments where legal departments require strict IP guarantees.
The emergence of companies like Garandor signals a shift in the AI market from raw capability to managed risk. While early AI development focused on the quality of generation, the current phase is defined by accountability. Garandor is positioned as a middle-layer provider that lets developers focus on model performance while outsourcing compliance and content protection.
By providing watermark localization and multi-model support, they aim to keep watermarks detectable even when content is cropped, compressed, or edited. This technical resilience is a differentiator in a market where basic metadata tags are easily stripped. As the legal framework for AI matures, infrastructure that treats IP as a first-class citizen becomes a necessity for commercial deployment.
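One standard reason watermarks survive editing is redundancy: the payload is spread across many locations so that a majority vote over the surviving copies still recovers it. The toy example below (a generic illustration, not Garandor's method) tiles an 8-bit payload across a signal, simulates lossy editing by erasing 40% of samples and flipping some of the rest, and still decodes the payload:

```python
import random

PAYLOAD = [1, 0, 1, 1, 0, 0, 1, 0]  # 8-bit watermark payload
REPS = 64                            # redundancy factor

def embed(payload, reps):
    """Tile the payload: bit i occupies positions i, i+L, i+2L, ... (L = len(payload))."""
    return payload * reps

def extract(signal, payload_len):
    """Majority-vote each bit over its surviving copies (None = erased sample)."""
    out = []
    for i in range(payload_len):
        copies = [b for b in signal[i::payload_len] if b is not None]
        out.append(1 if sum(copies) * 2 >= len(copies) else 0)
    return out

random.seed(0)
marked = embed(PAYLOAD, REPS)

# Simulate lossy editing: erase 40% of samples, flip roughly 10% of the total.
damaged = []
for b in marked:
    r = random.random()
    damaged.append(None if r < 0.4 else (1 - b if r < 0.5 else b))

assert extract(damaged, len(PAYLOAD)) == PAYLOAD
```

Production watermarks use far more sophisticated spreading (frequency-domain embedding, error-correcting codes), but the principle is the same: enough redundant copies survive an edit that the original payload remains recoverable, which is exactly what naive metadata tags cannot offer.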
Invisible watermarking and automated brand protection for AI-generated content.