OpenAI invests in Merge Labs to advance brain–computer interfaces

OpenAI

14 January 2026

Three people in a modern office interact with advanced technology, including a large monitor displaying AI data, a woman wearing a sensor-equipped headset, and electronic equipment on the desks, highlighting innovation in tech research and development.


OpenAI has invested in Merge Labs, a neurotech startup developing brain–computer interfaces that bridge biological and artificial intelligence. The collaboration aims to create human-centred, non-invasive interfaces so people can interact with AI more directly, enhancing human ability and agency, with future applications spanning assistive technology and human–AI collaboration.

At a glance

  • What’s new: OpenAI has participated in Merge Labs’ seed round to accelerate brain–computer interface (BCI) research and development.

  • Who’s Merge: A neurotechnology startup aiming to bridge biological and artificial intelligence with non-invasive approaches and AI models.

  • Why it matters: Aims to create natural, human-centred interfaces for interacting with AI—beyond keyboards, touch and voice.

  • Stage: Early-stage; research-first with applications expected across assistive tech, learning/communication, and human–AI collaboration.

The story

OpenAI says progress in interfaces unlocks progress in computing—and that BCIs are the next frontier. By backing Merge Labs, OpenAI positions itself closer to hardware and biosensing breakthroughs that could enable people to express intent more directly to AI systems. Merge frames its mission as “maximising human ability” by combining biology, devices and AI in accessible form factors. While timelines are not detailed, the company emphasises non-invasive or minimally invasive modalities and expects near-term work on robust data acquisition, interpretation models and safety.

What’s new

  • Seed investment & collaboration: OpenAI contributes capital and AI expertise; Merge focuses on sensing and modulation of brain activity and pairing that data with foundation models tuned for neural signals.

  • Non-invasive focus: Early signals point to ultrasound-based or other deep-reaching, non-implant modalities rather than surgical electrodes.

  • Human-centred design: Target form factors people will actually wear/use; privacy, consent and safety are core design constraints.
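To make the "sensing paired with models tuned for neural signals" idea above concrete, here is a deliberately toy sketch: it simulates a one-second non-invasive recording, extracts band-power features, and maps them to an intent label with a simple threshold rule. Every name, band choice, and rule here is hypothetical and illustrative only; nothing reflects Merge Labs' actual pipeline or hardware.

```python
import numpy as np

# Toy, purely illustrative "neural decoder": maps band-power features
# extracted from a simulated recording to an intent label.
RNG = np.random.default_rng(0)
FS = 256  # sampling rate in Hz, plausible for a consumer headset
BANDS = {"alpha": (8, 12), "beta": (13, 30)}  # frequency bands in Hz

def band_power(signal: np.ndarray, fs: int, lo: float, hi: float) -> float:
    """Mean spectral power of `signal` between `lo` and `hi` Hz."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

def features(signal: np.ndarray, fs: int = FS) -> np.ndarray:
    """One feature per band: its mean power in the recording."""
    return np.array([band_power(signal, fs, lo, hi) for lo, hi in BANDS.values()])

def decode_intent(feats: np.ndarray) -> str:
    """Threshold 'model': alpha-dominant -> 'rest', beta-dominant -> 'select'."""
    return "rest" if feats[0] > feats[1] else "select"

# Simulate one second of recording with a strong 10 Hz (alpha) rhythm.
t = np.arange(FS) / FS
rest_signal = np.sin(2 * np.pi * 10 * t) + 0.1 * RNG.standard_normal(FS)
print(decode_intent(features(rest_signal)))  # prints "rest"
```

A real system would replace the threshold rule with a learned model and add calibration per user, but the pipeline shape (acquire, featurise, decode) is the part worth noting.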

Why this matters

  • Better human–AI I/O: Interfaces that read intent and provide feedback could make AI more useful, accessible and controllable.

  • Assistive potential: Long-term, BCIs may improve communication for people with motor or speech impairments, and aid neurorehabilitation.

  • Competitive landscape: Activity in neurotech is accelerating (non-invasive and implant approaches). OpenAI’s move signals intent to help shape standards and models for neural data.

  • Governance stakes: Data sensitivity is high; expect rigorous privacy, safety and ethics frameworks as part of productisation.

What to watch

  • Research milestones: papers/prototypes on signal quality, decoding accuracy, closed-loop control, and safety.

  • Partnerships: hospitals, universities, and device partners for trials and validation.

  • Policy & standards: guidance on neural data handling, consent, and medical/consumer compliance.
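"Decoding accuracy", one of the milestones listed above, is usually reported as the fraction of trials where the decoded label matches ground truth. A minimal sketch of that metric (the trial labels are invented for illustration):

```python
def decoding_accuracy(decoded: list[str], truth: list[str]) -> float:
    """Fraction of trials where the decoded intent matches ground truth."""
    if len(decoded) != len(truth):
        raise ValueError("mismatched trial counts")
    hits = sum(d == t for d, t in zip(decoded, truth))
    return hits / len(truth)

# Four trials, one decoding error -> 75% accuracy.
print(decoding_accuracy(["rest", "select", "rest", "rest"],
                        ["rest", "select", "select", "rest"]))  # prints 0.75
```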

FAQ

What is the primary goal of the investment?
To advance non-invasive BCIs and the AI models that interpret neural signals, enabling more natural interaction with AI.

How could this benefit human experience?
By improving how we communicate intent to machines—potentially increasing ability, agency and accessibility in everyday tools and assistive technologies.

Who are the key players?
OpenAI as strategic investor/collaborator; Merge Labs as the neurotech startup focusing on the interface layer.

Is this a medical device effort?
Too early to say. Expect research and prototypes first; any clinical applications would follow appropriate regulatory pathways.


Generation
Digital

UK Office
33 Queen St,
London
EC4R 1AP
United Kingdom

Canada Office
1 University Ave,
Toronto,
ON M5J 1T1,
Canada

NAMER Office
77 Sands St,
Brooklyn,
NY 11201,
United States

EMEA Office
Charlemont Street, Saint Kevin's, Dublin,
D02 VN88,
Ireland

Middle East Office
6994 Alsharq 3890,
An Narjis,
Riyadh 13343,
Saudi Arabia


Company number: 256 9431 77 | Copyright 2026 | Terms and Conditions | Privacy Policy
