Documentation Index
Fetch the complete documentation index at: https://docs.agentsystems.ai/llms.txt
Use this file to discover all available pages before exploring further.
1. Set Up Your S3 Bucket
The SDK writes raw logs directly to your S3 bucket — you maintain full control.
- Create an S3 bucket (names are globally unique, so include your company name, e.g., acme-corp-raw-ai-logs)
- Create the following IAM policy:
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
    "Resource": [
      "arn:aws:s3:::acme-corp-raw-ai-logs",
      "arn:aws:s3:::acme-corp-raw-ai-logs/*"
    ]
  }]
}
- Create an IAM user and attach the policy
- Generate access keys for the user (select “Application running outside AWS”) and save them
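Before attaching the policy, it can help to sanity-check the JSON locally. This sketch (not part of the SDK, just a convenience for this guide) parses the policy document from the step above and asserts that the permissions the SDK needs are present:

```python
import json

# The example IAM policy from the step above.
policy_json = """
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
    "Resource": [
      "arn:aws:s3:::acme-corp-raw-ai-logs",
      "arn:aws:s3:::acme-corp-raw-ai-logs/*"
    ]
  }]
}
"""

policy = json.loads(policy_json)
statement = policy["Statement"][0]

# The SDK needs to write objects, read them back, and list the bucket.
assert statement["Effect"] == "Allow"
assert {"s3:PutObject", "s3:GetObject", "s3:ListBucket"} <= set(statement["Action"])
# ListBucket applies to the bucket ARN; object actions need the /* ARN.
assert any(r.endswith("/*") for r in statement["Resource"])
print("policy OK")
```

Note the two Resource ARNs: `s3:ListBucket` is evaluated against the bucket itself, while `s3:PutObject` and `s3:GetObject` are evaluated against object paths, which is why both forms must appear.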
2. Install the SDK
For LangChain:
mkdir notary-demo && cd notary-demo
python3 -m venv .venv
source .venv/bin/activate
pip install agentsystems-notary langchain-anthropic python-dotenv
For CrewAI:
mkdir notary-demo && cd notary-demo
python3 -m venv .venv
source .venv/bin/activate
pip install agentsystems-notary crewai python-dotenv
3. Configure Environment Variables
Create a .env file:
# AWS credentials for your S3 bucket
ORG_AWS_S3_BUCKET_NAME=acme-corp-raw-ai-logs
ORG_AWS_S3_ACCESS_KEY_ID=...
ORG_AWS_S3_SECRET_ACCESS_KEY=...
ORG_AWS_S3_REGION=us-east-1
# Arweave hash storage
ARWEAVE_BUNDLER_URL=https://node2.bundlr.network
ARWEAVE_PRIVATE_KEY_PATH=path/to/rsa-4096-private.pem
# Your LLM provider
ANTHROPIC_API_KEY=...
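The test scripts below read these values with os.environ[...], so any missing name fails with a bare KeyError at startup. A small pre-flight check gives a friendlier error; this is a sketch for this guide only (the REQUIRED list mirrors the .env above, and missing_vars is a hypothetical helper, not an SDK function):

```python
import os

# The variables the quickstart's .env file defines.
REQUIRED = [
    "ORG_AWS_S3_BUCKET_NAME",
    "ORG_AWS_S3_ACCESS_KEY_ID",
    "ORG_AWS_S3_SECRET_ACCESS_KEY",
    "ORG_AWS_S3_REGION",
    "ARWEAVE_BUNDLER_URL",
    "ARWEAVE_PRIVATE_KEY_PATH",
    "ANTHROPIC_API_KEY",
]

def missing_vars(env) -> list[str]:
    """Return the required names that are absent or empty in the given mapping."""
    return [name for name in REQUIRED if not env.get(name)]

# Example: a partially populated environment flags everything still unset.
example_env = {"ORG_AWS_S3_BUCKET_NAME": "acme-corp-raw-ai-logs"}
print(missing_vars(example_env))
```

In a real script you would call missing_vars(os.environ) right after load_dotenv() and exit with a clear message if the list is non-empty.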
4. Integrate the SDK
Create test_langchain.py:
import os
from dotenv import load_dotenv
from agentsystems_notary import (
    LangChainNotary,
    RawPayloadStorage,
    ArweaveHashStorage,
    LocalKeySignerConfig,
    AwsS3StorageConfig,
)
from langchain_anthropic import ChatAnthropic

load_dotenv()

# Where full audit payloads are stored (your S3 bucket)
raw_payload_storage = RawPayloadStorage(
    storage=AwsS3StorageConfig(
        bucket_name=os.environ["ORG_AWS_S3_BUCKET_NAME"],
        aws_access_key_id=os.environ["ORG_AWS_S3_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["ORG_AWS_S3_SECRET_ACCESS_KEY"],
        aws_region=os.environ["ORG_AWS_S3_REGION"],
    ),
)

# Where hashes are stored — Arweave for independent verification
hash_storage = [
    ArweaveHashStorage(
        namespace="my_namespace",
        signer=LocalKeySignerConfig(
            private_key_path=os.environ["ARWEAVE_PRIVATE_KEY_PATH"],
        ),
        bundler_url=os.environ["ARWEAVE_BUNDLER_URL"],
    ),
]

# Initialize notary
notary = LangChainNotary(
    raw_payload_storage=raw_payload_storage,
    hash_storage=hash_storage,
    debug=True,
)

# Add to any LangChain model
model = ChatAnthropic(
    model="claude-sonnet-4-5-20250929",
    api_key=os.environ["ANTHROPIC_API_KEY"],
    callbacks=[notary],
)

# All LLM calls are now logged
response = model.invoke("What is 2 + 2?")
print(response.content)
Run it:
python test_langchain.py
Create test_crewai.py:
import os
from dotenv import load_dotenv
from agentsystems_notary import (
    CrewAINotary,
    RawPayloadStorage,
    ArweaveHashStorage,
    LocalKeySignerConfig,
    AwsS3StorageConfig,
)
from crewai import Agent, Task, Crew, LLM

load_dotenv()

# Where full audit payloads are stored (your S3 bucket)
raw_payload_storage = RawPayloadStorage(
    storage=AwsS3StorageConfig(
        bucket_name=os.environ["ORG_AWS_S3_BUCKET_NAME"],
        aws_access_key_id=os.environ["ORG_AWS_S3_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["ORG_AWS_S3_SECRET_ACCESS_KEY"],
        aws_region=os.environ["ORG_AWS_S3_REGION"],
    ),
)

# Where hashes are stored — Arweave for independent verification
hash_storage = [
    ArweaveHashStorage(
        namespace="my_namespace",
        signer=LocalKeySignerConfig(
            private_key_path=os.environ["ARWEAVE_PRIVATE_KEY_PATH"],
        ),
        bundler_url=os.environ["ARWEAVE_BUNDLER_URL"],
    ),
]

# Initialize notary (hooks register automatically)
notary = CrewAINotary(
    raw_payload_storage=raw_payload_storage,
    hash_storage=hash_storage,
    debug=True,
)

# Create LLM
llm = LLM(
    model="anthropic/claude-sonnet-4-5-20250929",
    api_key=os.environ["ANTHROPIC_API_KEY"],
)

# Create agent and task
agent = Agent(
    role="Research Analyst",
    goal="Answer questions accurately",
    backstory="You are an expert analyst.",
    llm=llm,
)
task = Task(
    description="What is 2 + 2?",
    expected_output="The answer to the math question",
    agent=agent,
)
crew = Crew(agents=[agent], tasks=[task])

# All LLM calls are now logged
result = crew.kickoff()
print(result)
Run it:
python test_crewai.py
With debug=True, you’ll see confirmation that the log was written to S3 and the hash was uploaded to Arweave.
5. Verify
Anyone can verify your logs using the open-source CLI — no account required:
npm install -g agentsystems-verify
agentsystems-verify \
  --namespace my_namespace \
  --start 2026-01-01 \
  --end 2026-01-31 \
  --logs logs.zip
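Verification works by recomputing hashes from the raw logs you export from S3 and comparing them with the hashes recorded on Arweave. The exact payload format and canonicalization are handled by the SDK and CLI; purely as an illustration of the principle, a reproducible SHA-256 digest of a payload can be computed like this (the payload shape here is invented for the example):

```python
import hashlib
import json

# A stand-in audit payload; real payloads are written by the SDK.
payload = {"prompt": "What is 2 + 2?", "completion": "4"}

# Serialize deterministically (sorted keys, fixed separators) before
# hashing, so the same payload always yields the same digest.
canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
print(digest)
```

Because the serialization is deterministic, anyone holding the same raw log can independently arrive at the same digest and check it against the on-chain record.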
See Independent Verification for full CLI documentation.
Signer Options
The examples above use a local RSA-4096 private key. You can also use managed key services:
LocalKeySignerConfig — Local RSA-4096 private key (PEM file or env var)
AwsKmsSignerConfig — AWS KMS managed key
GcpKmsSignerConfig — GCP Cloud KMS managed key
AzureKeyVaultSignerConfig — Azure Key Vault managed key
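Swapping the local key for a managed service means replacing the signer passed to ArweaveHashStorage. As a non-authoritative sketch only: the key_id parameter name below is an assumption for illustration, so check the SDK reference for the actual AwsKmsSignerConfig signature.

```python
from agentsystems_notary import AwsKmsSignerConfig

hash_storage = [
    ArweaveHashStorage(
        namespace="my_namespace",
        # key_id is an assumed parameter name; consult the SDK docs.
        signer=AwsKmsSignerConfig(
            key_id="arn:aws:kms:us-east-1:111122223333:key/...",
        ),
        bundler_url=os.environ["ARWEAVE_BUNDLER_URL"],
    ),
]
```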
Using Custodied Storage (Optional)
If you prefer AgentSystems to manage hash storage, you can use our custodied option instead of (or in addition to) Arweave.
Create Account
Go to notary.agentsystems.ai and sign up.
Generate an API Key
- From the Dashboard, click Add under API Keys
- Name your key and select environment
- Copy and save the key — it’s only shown once
Add to Environment
AGENTSYSTEMS_NOTARY_API_KEY=sk_asn_test_...
Add Custodied Hash Storage
from agentsystems_notary import CustodiedHashStorage

hash_storage = [
    ArweaveHashStorage(...),  # Keep Arweave for independent verification
    CustodiedHashStorage(
        api_key=os.environ["AGENTSYSTEMS_NOTARY_API_KEY"],
        slug="customer-123",
    ),
]
Verify via Portal
- Go to notary.agentsystems.ai → Tenants → Generate Verification Ticket
- Export logs from your S3 bucket
- Upload both to verify.agentsystems.ai
See Verification Guide for full portal documentation.