Most organisations are already running AI without realising it. ChatGPT in marketing, Copilot in engineering, Salesforce Einstein in sales, hiring tools doing CV screening. Each one may be limited, high-risk, or prohibited under the EU AI Act. This pillar maps the audit, classification, and vendor-assessment work that turns shadow AI into a controlled tool register.
Tools are where the Article 4 literacy obligation meets the Annex III high-risk regime. The two questions every organisation needs to answer in 2026 are: what AI is running in our environment, and what is each instance classified as under the EU AI Act? This page indexes our writing on Tools, plus the calculator, classifier, and audit tools that turn those questions into a register and a procurement standard. Most procurement teams discover 3 to 5 high-risk AI systems they didn't know they had.
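The register described above can be sketched as a simple data structure. This is a minimal illustration only, assuming fields implied by the audit workflow; the class name, field names, and sample entries are hypothetical, not a prescribed schema.

```python
# Hypothetical tool-register entry; field names are illustrative.
from dataclasses import dataclass

@dataclass
class RegisterEntry:
    tool: str            # e.g. "ChatGPT", "Copilot"
    department: str      # where it is in use
    risk_tier: str       # "prohibited" | "high-risk" | "limited" | "minimal"
    vendor_assessed: bool

register = [
    RegisterEntry("Copilot", "Engineering", "limited", True),
    RegisterEntry("CV screener", "HR", "high-risk", False),
]

# Surface unassessed high-risk systems first:
gaps = [e.tool for e in register if e.risk_tier == "high-risk" and not e.vendor_assessed]
print(gaps)
```

Even a two-field spreadsheet version of this (tool name plus risk tier) already answers the two questions the audit starts from.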

Find out if your AI tools are high-risk under the EU AI Act. Annex III categories explained with real examples: ChatGPT, Copilot, Salesforce, and more.

The definitive guide to OpenClaw for business. Security risks, EU AI Act compliance, real use cases, and an 8-step deployment checklist for doing it right.

How AI is transforming business intelligence. From predictive analytics to automated reporting: real use cases, tool evaluation criteria, and the compliance angle European businesses must know.
Answer three questions to place any AI system in its EU AI Act tier: Article 5 prohibited, Annex III high-risk, Article 50 limited, or minimal risk. Includes the obligations for your tier.
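The tier check works as a decision cascade: prohibited uses are ruled out first, then Annex III categories, then transparency-only cases. A minimal sketch, assuming boolean answers to the three questions; the function name and inputs are illustrative, not from the classifier itself.

```python
# Hypothetical sketch of the three-question cascade; inputs are illustrative.
def classify_ai_system(prohibited_use: bool,
                       annex_iii_category: bool,
                       interacts_with_people: bool) -> str:
    """Return the EU AI Act risk tier for one AI system."""
    if prohibited_use:           # Article 5: e.g. social scoring
        return "prohibited"
    if annex_iii_category:       # Annex III: e.g. CV screening in hiring
        return "high-risk"
    if interacts_with_people:    # Article 50: chatbots, generated content
        return "limited"
    return "minimal"

# A CV-screening tool falls under Annex III (employment):
print(classify_ai_system(False, True, False))  # high-risk
```

The ordering matters: a system that is both an Annex III category and a prohibited use is classified as prohibited, since Article 5 takes precedence.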
Lookup tool. Search 40+ common business software products for AI features that trigger EU AI Act obligations.
Build a branded Article 4 + Article 26 AI Usage Policy in 6 steps. PDF + a public hosted copy you can share with auditors.
EU AI Act compliance lives across four pillars. Tools is one of them. Browse the others for the full picture.