AI summary
Amazon Bedrock AgentCore now makes it possible to embed AI agents directly in Slack, removing the need to switch between applications while handling conversation memory, security, and response timeouts. The solution is built with AWS CDK around three Lambda functions, Amazon API Gateway, Amazon SQS, and AWS Secrets Manager, while the agent itself is containerized and hosted in the AgentCore Runtime via the Strands Agents SDK. The architecture uses the Model Context Protocol (MCP) for tool execution, and although the example is a weather agent, the integration layer is fully reusable for any business use case.
Integrating Amazon Bedrock AgentCore with Slack brings AI agents directly into your workspace. Your teams can interact with agents without jumping between applications, losing conversation history, or re-authenticating. The integration handles three technical requirements: validating Slack event requests for security, maintaining conversation context across threads, and managing responses that exceed Slack's timeout limits.

Developers typically spend time building custom webhook handlers for Slack integrations. AgentCore removes much of this work by providing built-in conversation memory, secure access to agents and their tools, and identity management that tracks agent usage, all from within Slack.

In this post, we demonstrate how to build a Slack integration using the AWS Cloud Development Kit (AWS CDK). You will learn how to deploy the infrastructure with three specialized AWS Lambda functions, configure event subscriptions to meet Slack's security requirements, and implement conversation management patterns that work for many agent use cases. We use a weather agent as our example, but the integration layer you build is completely reusable: you can customize the runtime and tools for your specific business needs without changing how Slack communicates with your agent.

Solution overview

This solution consists of two main components: the Slack integration infrastructure and the AgentCore Runtime with tools. The integration infrastructure routes and manages communication between Slack and the agent, while the runtime processes and responds to queries. The integration infrastructure uses Amazon API Gateway, AWS Lambda, AWS Secrets Manager, and Amazon Simple Queue Service (Amazon SQS) for serverless integration. The agent is containerized and hosted in AgentCore Runtime.
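One of the Slack event subscription requirements mentioned above is answering Slack's one-time url_verification challenge when you save the Request URL. The following is a minimal sketch of that handshake, not the deployed verification function; the handler name and response shape follow standard Lambda proxy-integration conventions.

```python
import json

def lambda_handler(event, context):
    """Minimal Slack event endpoint: answer the one-time url_verification
    challenge Slack sends when the Request URL is saved."""
    body = json.loads(event.get("body") or "{}")
    if body.get("type") == "url_verification":
        # Echo the challenge string back so Slack marks the endpoint as verified.
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "text/plain"},
            "body": body["challenge"],
        }
    # Real event callbacks would be signature-verified and queued here.
    return {"statusCode": 200, "body": ""}
```

Slack retries the challenge if the endpoint responds slowly or with the wrong body, so this branch should run before any heavier processing.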
It's built with the Strands Agents SDK, which integrates with Amazon Bedrock AgentCore Gateway for tool access and AgentCore Memory for conversation history. The runtime maintains context throughout conversations and invokes tools using the Model Context Protocol (MCP), a standardized protocol for tool execution and communication. With these components in place, the following section examines how they work together in the architecture.

Architecture diagram

The following diagram represents the solution architecture, which contains three key sections:

Section A – Image build infrastructure – First, the WeatherAgentImageStack CDK stack deploys the container image build pipeline: an Amazon Simple Storage Service (Amazon S3) bucket, an AWS CodeBuild project, and an Amazon Elastic Container Registry (Amazon ECR) repository. CodeBuild creates AWS Graviton (ARM64) container images that are stored in the ECR repository for use by the AgentCore Runtime.

Section B – AgentCore components – Next, the WeatherAgentCoreStack CDK stack deploys the AgentCore Runtime, Gateway, Memory, and an AWS Lambda function. The Runtime uses the Strands Agents framework, an open source AI agents SDK, to orchestrate model invocations, tool calls, and conversation memory.

Section C – Slack integration infrastructure – Lastly, the WeatherAgentSlackStack stack deploys the integration infrastructure (API Gateway, Secrets Manager, Lambda functions, and Amazon SQS). This layer handles webhook verification, SQS queuing, and message processing through three Lambda functions, and is reusable across AgentCore use cases.

The request flow consists of the following steps:

1. A user sends a message in Slack through a direct message or @appname in a channel.
2. Slack sends a webhook POST request to API Gateway.
3. The request is forwarded to the verification Lambda function.
4. The Lambda function retrieves the Slack signing secret and bot token from Secrets Manager to verify authenticity.
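The authenticity check in the verification Lambda function relies on Slack's v0 request-signing scheme. A minimal sketch, assuming the raw request body and the X-Slack-Request-Timestamp and X-Slack-Signature header values are passed in:

```python
import hashlib
import hmac
import time

def verify_slack_signature(signing_secret: str, timestamp: str,
                           body: str, signature: str) -> bool:
    """Check Slack's v0 signature: HMAC-SHA256 over 'v0:<timestamp>:<raw body>'
    keyed with the app's signing secret."""
    # Reject stale requests to limit replay attacks (Slack suggests 5 minutes).
    if abs(time.time() - int(timestamp)) > 60 * 5:
        return False
    basestring = f"v0:{timestamp}:{body}".encode()
    expected = "v0=" + hmac.new(signing_secret.encode(), basestring,
                                hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, signature)
```

The comparison must use the raw, unparsed request body exactly as received; re-serializing the JSON before hashing will produce a different signature.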
5. After verification, the Lambda function asynchronously invokes the SQS integration Lambda function.
6. The SQS integration Lambda function sends a "Processing your request…" message to the user in a Slack thread.
7. The SQS integration Lambda function sends the message to the SQS FIFO queue.
8. The queue triggers the agent integration Lambda function.
9. The Lambda function invokes AgentCore Runtime with the user's query and a session ID derived from the Slack thread timestamp.
10. AgentCoreMemorySessionManager retrieves conversation history from AgentCore Memory using the session ID (thread timestamp) and actor ID (user ID).
11. The Strands framework retrieves tools from AgentCore Gateway using MCP.
12. The Strands framework invokes the Amazon Bedrock model (Amazon Nova Pro) with the message, context, and tools.
13. The model determines which tools to invoke and generates tool requests.
14. The Gateway routes tool invocations to the MCP server on Lambda, which executes the weather tools.
15. Tool results return to the Strands framework, which can invoke the model again if needed.
16. The Strands framework stores the conversation turn in AgentCore Memory.
17. The agent integration Lambda function updates the "Processing your request…" message with the agent's response.
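The session handling in the agent integration Lambda function can be sketched as follows. The helper derives a deterministic session ID from the Slack thread timestamp so every reply in the same thread shares conversation memory. The invoke_agent_runtime call, the payload shape, and the 33-character minimum session ID length are assumptions to verify against your boto3 version and your runtime's input contract; the function and variable names are illustrative.

```python
import json

def session_id_from_thread(thread_ts: str) -> str:
    """Derive a stable AgentCore session ID from the Slack thread timestamp,
    so messages in one thread map to one conversation. Pads to 33 characters,
    an assumed minimum length for runtime session IDs."""
    base = f"slack-thread-{thread_ts.replace('.', '-')}"
    return base.ljust(33, "0")

def invoke_agent(runtime_arn: str, user_id: str, text: str, thread_ts: str) -> str:
    """Send the user's message to AgentCore Runtime and return the reply text."""
    import boto3  # deferred import so the helper above is testable offline

    # Assumed client and operation names for the AgentCore data plane.
    client = boto3.client("bedrock-agentcore")
    response = client.invoke_agent_runtime(
        agentRuntimeArn=runtime_arn,
        runtimeSessionId=session_id_from_thread(thread_ts),
        payload=json.dumps({"prompt": text, "actor_id": user_id}).encode(),
    )
    return response["response"].read().decode()
```

Because the session ID is a pure function of the thread timestamp, a user can leave a thread and return later, and the agent still resumes with the same memory.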