Software Supply Chain Intelligence Guardian
A multi-agent intelligence network that continuously monitors your entire dependency ecosystem for supply chain threats - detecting compromised packages, tracking vulnerabilities, and generating actionable remediation plans before a breach reaches production.
The Software Supply Chain Intelligence Guardian demonstrates a continuously running multi-agent threat intelligence pipeline with persistent shared state - a living SBOM that all agents read from and write to, enabling emergent capabilities no single agent could achieve alone.
Software supply chain attacks have become one of the most dangerous and fastest-growing threat vectors entering 2026. The Shai-Hulud worm compromised 187+ npm packages, SANDWORM_MODE hit in February 2026, and PackageGate exposed simultaneous zero-days in npm, pnpm, vlt, and Bun. Traditional security tooling is reactive - scanners flag known CVEs after the fact, by which time production systems may already be compromised.
This blueprint deploys six specialised agents that form an autonomous intelligence pipeline:
- Dependency Scout (hourly) - Reads project manifests, builds full dependency trees including transitive dependencies, and writes a normalised SBOM JSON to the shared workspace. The living SBOM is the foundation every other agent depends on.
- Threat Harvester (hourly, offset by 30 minutes) - Fetches live intelligence from GitHub Security Advisories, the CISA Known Exploited Vulnerabilities feed, the npm audit API, and web search for emerging attack reports. Maintains a structured threat registry with deduplication.
- Risk Analyst (hourly, runs after the harvester) - The correlation engine. Cross-references the SBOM against the threat registry to calculate per-service exposure scores. When critical or high findings are detected, it immediately calls the Remediation Planner and Alert Coordinator to trigger the response chain.
- Ecosystem Health Monitor (daily) - Catches the decay signals that precede supply chain attacks: unmaintained packages are prime targets for malicious takeover. Tracks maintainer activity, commit recency, download trends, and issue response rates. Flags "zombie packages" heading toward abandonment before they become the next attack vector.
- Remediation Planner (triggered by high-risk findings) - Produces concrete, tested migration plans. Uses shell execution to verify that proposed alternative packages actually work before recommending them. Includes effort estimates and an SPDX-lite SBOM export for EU Cyber Resilience Act compliance.
- Alert Coordinator (triggered) - Translates raw intelligence into tiered Slack messages: P0 critical alerts to #security-alerts, daily risk digests to #security-daily, and weekly ecosystem health summaries to #engineering.
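To make the pipeline concrete, here is a minimal sketch of what the Dependency Scout's normalisation step might look like. The function name, the SBOM field names (`service`, `packages`, `dev`), and the `package.json`-style input are illustrative assumptions - the blueprint does not prescribe an exact schema.

```python
import json


def build_sbom(manifest: dict, service: str) -> dict:
    """Normalise a package.json-style manifest into a flat SBOM record.

    Hypothetical schema: each entry records the package name, the pinned
    version, and whether it came from devDependencies.
    """
    packages = []
    for name, version in manifest.get("dependencies", {}).items():
        packages.append({"name": name, "version": version, "dev": False})
    for name, version in manifest.get("devDependencies", {}).items():
        packages.append({"name": name, "version": version, "dev": True})
    return {"service": service, "packages": packages}


# Example: a tiny manifest normalised into the shared SBOM format.
sbom = build_sbom(
    {"dependencies": {"express": "4.18.2"}, "devDependencies": {"jest": "29.7.0"}},
    service="checkout-api",
)
print(json.dumps(sbom, indent=2))
```

In the blueprint the resulting JSON would be written to the shared workspace file, where the Risk Analyst and Ecosystem Health Monitor read it back on their own schedules.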
Novel architectural patterns in this blueprint:
- Living SBOM - Persistent file storage maintains an always-current, machine-readable inventory. All agents share this single source of truth without direct coupling, enabling a clean data-pipeline architecture.
- Threat correlation - The Risk Analyst joins two independently maintained data sources (SBOM + threat registry) to produce intelligence neither agent could generate alone.
- Proactive abandonment detection - The Ecosystem Health Monitor catches decay signals months before a package becomes unmaintained and gets targeted by attackers, turning a reactive security posture into a proactive one.
- Autonomous compatibility testing - The Remediation Planner uses sandboxed shell execution to verify that proposed replacements actually work before recommending them, eliminating the guesswork from emergency migrations.
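The threat-correlation pattern can be sketched as a simple join between the two shared files. The severity weights and field names below are assumptions for illustration; the blueprint leaves the exact scoring model to the Risk Analyst's instructions.

```python
# Hypothetical severity weights - not prescribed by the blueprint.
SEVERITY_WEIGHT = {"critical": 10, "high": 7, "medium": 4, "low": 1}


def exposure_score(sbom: dict, threats: list) -> dict:
    """Join the SBOM against the threat registry and score one service.

    A finding is any registry entry whose package appears in the SBOM;
    the score is the sum of the matched severity weights.
    """
    installed = {p["name"] for p in sbom["packages"]}
    findings = [t for t in threats if t["package"] in installed]
    score = sum(SEVERITY_WEIGHT.get(t["severity"], 0) for t in findings)
    return {"service": sbom["service"], "score": score, "findings": findings}


report = exposure_score(
    {"service": "checkout-api", "packages": [{"name": "lodash"}, {"name": "left-pad"}]},
    [
        {"package": "lodash", "severity": "high"},
        {"package": "minimist", "severity": "critical"},  # not installed, ignored
    ],
)
print(report["score"])  # only the lodash advisory matches
```

Neither input file alone says anything about this service's risk; the joined result is what triggers the Remediation Planner and Alert Coordinator.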
The six scheduled and event-triggered agents run autonomously around the clock. During active supply chain events - which happen regularly in 2026 - the alerts deliver genuinely actionable intelligence. The persistent files fill up with structured data over time, demonstrating how shared state enables emergent multi-agent capabilities.
Backstory
Common information about the bot's experience, skills and personality. For more information, see the Backstory documentation.
Skillset
This example uses a dedicated Skillset. Skillsets are collections of abilities that give a bot a specific set of functions it can perform.
- bash - Execute a shell command or script
- rw - Read or write content to a file in the space storage
- Fetch Web Page - Fetch the content of a web page using a URL and convert it to text
- Search Web - Search the web for specific keywords
- Fetch Web Page - Fetch the content of a web page using a URL and convert it to text
- rw - Read or write content to a file in the space storage
- Read/Write Space Storage File - Read or write content to a file in the space storage
- Call Bot - Call the Remediation Planner agent when high-risk findings are detected
- Call Alert Coordinator - Call the Alert Coordinator agent to dispatch tiered Slack notifications
- Fetch Web Page - Fetch the content of a web page using a URL and convert it to text
- Search Web - Search the web for specific keywords
- Read/Write Space Storage File - Read or write content to a file in the space storage
- bash - Execute a shell command or script
- Search Web - Search the web for specific keywords
- Fetch Web Page - Fetch the content of a web page using a URL and convert it to text
- Read/Write Space Storage File - Read or write content to a file in the space storage
- Read/Write Space Storage File - Read or write content to a file in the space storage
- Send Slack Message - Send a message to a specific channel in Slack
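The Alert Coordinator's tiering logic can be sketched in a few lines. The channel names come from the blueprint description above; the finding fields and the severity-to-tier rule are illustrative assumptions.

```python
def format_alert(finding: dict) -> tuple:
    """Turn a raw finding into a (channel, message) pair.

    Hypothetical rule: critical/high findings become P0 alerts for
    #security-alerts; everything else goes into the daily digest channel.
    """
    if finding["severity"] in ("critical", "high"):
        channel = "#security-alerts"
        message = f":rotating_light: P0: {finding['package']} - {finding['summary']}"
    else:
        channel = "#security-daily"
        message = f"{finding['package']}: {finding['summary']}"
    return channel, message


channel, message = format_alert(
    {"package": "lodash", "severity": "high", "summary": "prototype pollution advisory"}
)
print(channel)
```

The weekly ecosystem health summary for #engineering would follow the same shape, sourced from the Ecosystem Health Monitor's output rather than the threat registry.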
Secrets
This example uses Secrets to store sensitive information such as API keys, passwords, and other credentials.
Slack - A secret without a description
Terraform Code
This blueprint can be deployed using Terraform, enabling infrastructure-as-code management of your ChatBotKit resources. Use the code below to recreate this example in your own environment.
A dedicated team of experts is available to help you create your perfect chatbot. Reach out via chat for more information.