Philips scales AI literacy to 70k employees
OpenAI
12 Nov 2025
Philips is scaling AI literacy to ~70,000 employees, training leaders first, inviting bottom‑up use‑case proposals, and rolling out governed tools. The result is safer, everyday AI use that reduces administrative burden and supports better patient care through transparent, permission‑aware workflows with human oversight.
Philips is turning AI literacy into a company‑wide capability for roughly 70,000 employees. The aim is simple but ambitious: give teams the confidence to use governed AI tools in their real work so they can spend less time on admin and more time on improving patient care.
Why this matters now
Health systems are under pressure from staff shortages, rising demand, and widening trust gaps around new technology. Philips’ programme targets those realities head‑on. Instead of limiting AI to specialist teams, the company is building a shared baseline of skills and behaviours — from executives to frontline staff — so each role knows when AI helps, when it doesn’t, and how to keep humans in the loop. That shift from isolated pilots to everyday literacy is what unlocks compounding benefits: faster information discovery, clearer documentation, and more consistent service — all with the auditability clinicians and regulators expect.
How the programme works
Philips starts at the top. Executives receive hands‑on training so they model responsible use rather than merely mandate it. Momentum then spreads through structured challenges that invite employees to propose and test use cases in low‑risk environments. Access to enterprise‑grade tooling (with permissions, logging, and data controls) channels curiosity into safe productivity gains. As confidence grows, the focus moves from individual tasks to workflow improvements — for example, summarising long documents for a specific role, or turning meeting notes into patient‑safe action lists with provenance. The cultural message is consistent: transparency, fairness, and human oversight come first; automation follows when evidence supports it.
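The "governed access" idea above can be sketched as a thin gateway that checks a user's role before running an AI task and leaves an audit trail either way. This is an illustrative sketch only; the role names, task names, and policy table are hypothetical, not Philips' actual configuration, and a real deployment would delegate to the organisation's identity provider and policy engine.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai-gateway")

# Hypothetical role -> allowed-task mapping for illustration only.
PERMISSIONS = {
    "clinician": {"summarise_policy", "draft_notes"},
    "service": {"structure_notes"},
    "it": {"draft_postmortem"},
}

@dataclass
class Request:
    user: str
    role: str
    task: str
    payload: str

def handle(request: Request) -> str:
    """Gate an AI task behind role permissions and log the decision."""
    allowed = PERMISSIONS.get(request.role, set())
    if request.task not in allowed:
        log.warning("denied user=%s role=%s task=%s",
                    request.user, request.role, request.task)
        raise PermissionError(f"{request.role!r} may not run {request.task!r}")
    log.info("allowed user=%s role=%s task=%s",
             request.user, request.role, request.task)
    # Placeholder for the actual model call; outputs are drafts for
    # human review, never auto-applied.
    return f"[draft for review] {request.task} on {len(request.payload)} chars"
```

Every call is logged whether it succeeds or not, which is what gives compliance teams the auditability the article describes.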
What’s new in 2026
What’s different this year is the scale and the emphasis on trust. Philips has long embedded AI inside products; now the goal is to make day‑to‑day AI use a normal, teachable skill across the enterprise. By starting with low‑risk, internal workflows and measuring faithfulness and user satisfaction, the company builds the foundations to move responsibly into regulated use cases. Crucially, this approach prioritises giving clinicians time back — reducing administrative burden so that more time is spent with patients, not paperwork.
Practical examples
A clinician might ask for a summary of policy updates relevant to their ward, with citations to the exact clauses and effective dates. A service team can convert free‑text notes into structured action items and next‑step checklists that follow hospital protocols. Programme managers turn meeting transcripts into approval-ready plans, while IT teams use AI to draft incident post‑mortems that include links to the original logs. Each scenario demonstrates literacy applied to real work, not just conceptual knowledge.
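The provenance requirement running through these examples can be made concrete with a small guardrail: AI-generated action items that lack a source reference or an owner are routed back for human follow-up rather than silently accepted. This is a minimal sketch under assumed field names (`source_ref`, `owner`); it is not a description of Philips' tooling.

```python
from dataclasses import dataclass

@dataclass
class ActionItem:
    text: str
    source_ref: str = ""  # e.g. a transcript timestamp or policy clause
    owner: str = ""

def review_ready(items: list[ActionItem]) -> tuple[list[ActionItem], list[ActionItem]]:
    """Split items into provenance-complete ones and ones needing human review."""
    accepted = [i for i in items if i.source_ref and i.owner]
    flagged = [i for i in items if not (i.source_ref and i.owner)]
    return accepted, flagged
```

In use, model output would be parsed into `ActionItem` records first; anything the model could not ground in a source stays out of the approved list, keeping a human in the loop.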
FAQs
Why is AI literacy important for Philips?
It turns experimentation into operational value. With shared skills and clear guardrails, teams can use AI to speed documentation, surface knowledge with citations, and support patient‑safe workflows — all while meeting compliance expectations.
How does AI literacy benefit employees?
People spend less time searching and summarising and more time applying expertise. Literacy also builds confidence to question outputs, escalate when needed, and contribute new, safe use cases.
How does Philips teach AI at scale?
Through leadership training, company‑wide challenges, and governed access to enterprise tools. Employees practise on low‑risk tasks, measure quality, and gradually adopt AI in workflows that matter.
Next steps
Scaling literacy is how a global company turns isolated successes into a system. Philips’ approach — leadership first, grassroots momentum, and governed tooling — shows how to pair innovation with responsibility. If you’re planning a similar programme, start with low‑risk tasks, measure faithfulness and satisfaction, and grow into regulated workflows once trust has been earned.
Want a playbook and training path for your teams? Generation Digital can help design curricula, governance, and evaluation aligned to UK/EU requirements.