
AWS S3 Object Store

List, read, write, head, and delete S3 objects.

s3_object

Overview

Works with objects stored in Amazon S3, a common place for logs, reports, and shared data. Agents can list what's there, read small files, upload results, or clean up old items.

How it works

Wraps the AWS SDK S3 client with five operations:

'list' enumerates objects under a prefix
'read' returns a text object's body, capped at 1 MiB to protect the agent's context
'write' uploads a text object
'head' returns metadata without transferring the body
'delete' removes an object

Credentials resolve through the default AWS credential chain, so the tool works with env vars, profiles, or IAM roles.
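The 1 MiB read cap can be sketched in plain Java. This is a minimal illustration of the idea, not the tool's actual code; the `ReadCap` class and `cap` method names are hypothetical.

```java
// Illustrative sketch of the 1 MiB read cap described above.
public class ReadCap {
    static final int MAX_READ_BYTES = 1024 * 1024; // 1 MiB

    // Truncate an object's body so an oversized object cannot
    // flood the agent's context window.
    public static byte[] cap(byte[] body) {
        if (body.length <= MAX_READ_BYTES) {
            return body;
        }
        return java.util.Arrays.copyOf(body, MAX_READ_BYTES);
    }

    public static void main(String[] args) {
        System.out.println(cap(new byte[2 * 1024 * 1024]).length); // prints 1048576
        System.out.println(cap(new byte[10]).length);              // prints 10
    }
}
```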

Example

When a user asks:

Upload this generated report to s3://reports/2026/.

the agent calls the tool:

s3_object(operation="write", bucket="reports", key="2026/q4.md", content="…")

and gets back: the full s3:// URL of the uploaded object.
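The remaining operations follow the same call shape. A hedged sketch: only operation, bucket, key, and content appear in the example above, so the prefix parameter name is an assumption drawn from the 'list' description — check the tool's schema for exact names.

```
s3_object(operation="list",   bucket="reports", prefix="2026/")
s3_object(operation="read",   bucket="reports", key="2026/q4.md")
s3_object(operation="head",   bucket="reports", key="2026/q4.md")
s3_object(operation="delete", bucket="reports", key="2026/q4.md")
```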

Configuration

Set these before calling the tool. Values marked required must be present or the tool call will fail.

AWS_REGION required

AWS region (also accepts AWS_DEFAULT_REGION). Used when no Spring override is set.

AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY required

AWS credentials via the default chain — or use AWS_PROFILE / IAM role / SSO. Any supported AWS auth works.
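For local development, the quickest path is plain environment variables. A sketch — the region is an arbitrary example and the key values are placeholders, not real credentials:

```shell
export AWS_REGION=eu-west-1                  # or AWS_DEFAULT_REGION
export AWS_ACCESS_KEY_ID=AKIAEXAMPLE         # placeholder
export AWS_SECRET_ACCESS_KEY=secretexample   # placeholder
```

In production, prefer an IAM role or SSO so no long-lived keys are stored on disk.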

swarmai.tools.s3.endpoint-override optional

Optional custom endpoint URL for LocalStack, MinIO, or other S3-compatible stores.
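For example, pointing the tool at a local LocalStack container (a sketch; the URL assumes LocalStack's default edge port 4566, and the file name assumes a standard Spring Boot layout):

```yaml
# application.yaml
swarmai:
  tools:
    s3:
      endpoint-override: http://localhost:4566  # LocalStack default edge port
```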

Use it in a workflow

Wire this tool into a SwarmAI crew. Use the YAML DSL for declarative workflows, or the Java builder API when you want full programmatic control.

YAML DSL

# storage.yaml
name: storage-crew
process: SEQUENTIAL

agents:
  - id: storage
    role: Storage Manager
    goal: Manage generated artifacts in S3
    tools:
      - s3_object

tasks:
  - id: storage-task
    agent: storage
    description: Upload the generated report to s3://reports/2026/q4.md.

Java

import ai.intelliswarm.swarmai.agent.Agent;
import ai.intelliswarm.swarmai.task.Task;
import ai.intelliswarm.swarmai.swarm.Swarm;
import ai.intelliswarm.swarmai.swarm.SwarmOutput;
import ai.intelliswarm.swarmai.process.ProcessType;
import ai.intelliswarm.swarmai.tool.cloud.S3Tool;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

// Illustrative component; the Spring-managed ChatClient and S3Tool beans
// are injected and the crew is assembled on demand.
@Component
public class StorageCrew {

    @Autowired ChatClient chatClient;
    @Autowired S3Tool s3Tool;

    public SwarmOutput run() {
        Agent storage = Agent.builder()
            .role("Storage Manager")
            .goal("Manage generated artifacts in S3")
            .chatClient(chatClient)
            .tool(s3Tool)
            .build();

        Task storageTask = Task.builder()
            .description("Upload the generated report to s3://reports/2026/q4.md.")
            .agent(storage)
            .build();

        return Swarm.builder()
            .agent(storage)
            .task(storageTask)
            .process(ProcessType.SEQUENTIAL)
            .build()
            .kickoff();
    }
}

What it's good for

Real scenarios where agents put this tool to work.

Audit deploy buckets — list the last N uploaded artifacts under a prefix
Persist agent-generated reports to s3://reports/<date>/<slug>.md
Fetch prompt templates or small configs stored in S3
Clean up stale test fixtures by deleting a known key after a run

Source

Implementation lives at swarmai-tools/src/main/java/ai/intelliswarm/swarmai/tool/cloud/S3Tool.java in the swarm-ai repository.

Open s3_object on GitHub →