Imagine being paralysed from the neck down and unable to move or speak—yet still fully capable of thought. Today’s brain-computer interfaces (BCIs) can translate specific mental commands, such as ‘move my right finger’, into actions, allowing patients to control robotic limbs or type messages using only their thoughts. But what if we could bypass these mental steps entirely?
This is the focus of the cutting-edge research of Michiel Spapé, associate professor in the Centre for Cognitive and Brain Sciences (CCBS) at the University of Macau (UM). Unlike traditional BCIs, which require users to mentally simulate step-by-step movements, Prof Spapé’s team investigates how the brain encodes motivations—for example, the desire to drink from a cup—without focusing on the muscle movements needed to accomplish the task. Their work holds the potential to create a future where even healthy individuals can control technology effortlessly through pure intention, blurring the line between mind and machine.
Tackling the Fundamental Challenges in BCI Technology
Prof Spapé has long been driven by a profound question: How can we translate subjective human consciousness into a form that machines can understand? Current BCI technology works by detecting electrical signals generated by brain activity and converting them into commands that computers can interpret, enabling the brain to control external devices. However, despite these advancements, the technology often feels unnatural in real-world use.
Prof Spapé uses a simple example to explain this challenge. When we pick up a teacup, our brain naturally and effortlessly generates a smooth sequence of motor commands, much like breathing. Existing BCI systems, however, require users to break this action into a series of deliberate mental steps. To lift a teacup, a person must first imagine raising their arm, then moving it forward, and finally gripping the handle with precise finger movements. This rigid, step-by-step process makes interactions feel awkward and counterintuitive. Prof Spapé asks: ‘How can we make these commands as natural as breathing, without requiring conscious effort?’ At CCBS, he is leading his team in answering this question, working to overcome this bottleneck by creating BCIs that enable seamless and intuitive interaction between the brain and external devices.
Through relentless effort, Prof Spapé’s team has developed a bidirectional BCI system powered by neuroadaptive modelling. The system analyses brain activity in real time to infer the user’s ongoing motivations and emotions, and uses this information to carry out actions or support higher-level interactions. For example, by integrating the learning and training capabilities of generative AI, it can interpret how we respond intuitively to different stimuli, enabling far more advanced forms of brain control than movement commands alone.
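To give a rough sense of what such real-time decoding involves, the sketch below classifies short EEG windows into a coarse engagement state using spectral band-power features. It is a minimal illustration under assumed parameters (sampling rate, channel count, window length, and two toy state labels), running on simulated data; it is not the team’s actual neuroadaptive pipeline.

```python
# Minimal sketch of a neuroadaptive decoding loop: classify short EEG windows
# into a coarse affective/motivational state and hand the result to an
# application. All data are simulated; the channel count, window length, and
# two-state labelling are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 250            # sampling rate in Hz (assumed)
N_CHANNELS = 32     # number of EEG channels (assumed)
WINDOW_S = 2.0      # decoding window length in seconds

def bandpower_features(window: np.ndarray) -> np.ndarray:
    """Log band power in theta/alpha/beta bands per channel (simple FFT estimate)."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    psd = np.abs(np.fft.rfft(window, axis=1)) ** 2
    bands = [(4, 8), (8, 13), (13, 30)]
    feats = [np.log(psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) + 1e-12)
             for lo, hi in bands]
    return np.concatenate(feats)

# --- offline calibration on labelled windows (simulated here) ---
rng = np.random.default_rng(0)
n_calib = 200
X = np.stack([bandpower_features(rng.standard_normal((N_CHANNELS, int(FS * WINDOW_S))))
              for _ in range(n_calib)])
y = rng.integers(0, 2, size=n_calib)        # 0 = disengaged, 1 = engaged (toy labels)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# --- online loop: decode each incoming window and report the estimated state ---
for _ in range(5):                           # stand-in for a continuous acquisition loop
    window = rng.standard_normal((N_CHANNELS, int(FS * WINDOW_S)))  # would come from the amplifier
    p_engaged = clf.predict_proba(bandpower_features(window)[None, :])[0, 1]
    print(f"estimated engagement: {p_engaged:.2f}")
```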
A notable application of this technology is the ‘AI Tutor’ system, which Prof Spapé’s team is currently developing. This system combines neuroadaptive modelling with multimodal emotional AI, and features a DeepSeek-based artificial agent that acts as a tutor. The AI Tutor assesses students’ emotional states by analysing their facial microexpressions, speech prosody, and real-time electroencephalography (EEG) data. Using this information, it can adapt its teaching strategies to better suit individual learners. Specifically, the AI Tutor identifies emotional indicators that are essential for motivation and learning, including frustration intensity, attentional engagement, and moments of sudden insight. For example, if the system detects a student’s frustration with the learning materials or a decline in interest, it signals the need to adjust the teaching strategy, such as by simplifying the material or repeating key concepts. This technology has the potential to significantly enhance learning effectiveness, particularly in distance learning and self-directed study.
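As a concrete, if simplified, illustration of the adaptive-teaching idea, the hypothetical sketch below maps estimated learner-state indicators to teaching adjustments. The thresholds, state names, and actions are placeholders for illustration only; the actual AI Tutor fuses facial, prosodic, and EEG signals with a generative model rather than fixed rules.

```python
# Illustrative sketch: decoded learner-state indicators drive tutoring decisions.
# All thresholds, fields, and action names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class LearnerState:
    frustration: float   # 0..1, estimated frustration intensity
    engagement: float    # 0..1, estimated attentional engagement
    insight: bool        # True if a sudden-insight event was detected

def choose_strategy(state: LearnerState) -> str:
    """Pick the next teaching move from the current (estimated) learner state."""
    if state.frustration > 0.7:
        return "simplify_material"      # step back and re-explain with an easier example
    if state.engagement < 0.3:
        return "repeat_key_concept"     # re-engage by revisiting the core idea
    if state.insight:
        return "advance_to_next_topic"  # capitalise on the breakthrough
    return "continue_current_plan"

# Example: a frustrated but attentive learner triggers a simplification.
print(choose_strategy(LearnerState(frustration=0.8, engagement=0.6, insight=False)))
```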
‘It’s like the difference between a choreographed routine and an improvised dance,’ explains Prof Spapé. ‘Current BCIs force the brain to follow pre-set commands, whereas we’re developing technology that enables genuine neural dialogue—where both systems adapt to each other in real time.’
Capturing the Brain’s ‘First-Person Experience’
A central focus of Prof Spapé’s research involves advancing BCI systems beyond simple movement decoding to capturing the motivational dimension of actions—what a person truly desires and how it feels to perform an action. This work builds on a fundamental insight from cognitive science: natural behaviour emerges from integrated perception-action-emotion loops, not just isolated motor commands. Traditional BCIs often treat the brain as a ‘biological remote control’, translating neural signals into one-way commands, such as ‘move robotic arm to cup’. However, such systems fail to capture the first-person experience—the emotional feedback that shapes intentions, like the urge to withdraw from a hot cup or the satisfaction of a firm grip. Prof Spapé’s approach prioritises decoding these affective states to create BCIs that understand not just what someone is doing, but why they want to do it, closing the loop between desire, action, and emotional consequence.
To truly replicate this loop, Prof Spapé emphasises that BCI technology must advance beyond its current paradigm. Next-generation BCIs must not only decode motor commands but also capture the brain’s first-person experience—the subjective sensations and perceptions that form the foundation of human consciousness. Achieving this requires neurophenomenological decoding, which links neural activity to subjective experiences such as the vividness of seeing the colour red, the sharp sting of a pinprick, or the nostalgic memory evoked by the smell of coffee.
Prof Spapé explains the complexity of this challenge: ‘Imagine trying to describe the colour red to someone born blind. You can outline its wavelengths and associations, but you can never truly convey the experience of seeing red.’ This highlights the irreplaceable nature of first-person experience and poses a fundamental challenge in consciousness science: how to bridge the explanatory gap between subjective phenomenology (what it feels like) and objective neurobiology (what physically occurs) to truly understand how consciousness emerges from neural processes. To tackle this issue, Prof Spapé and his team are conducting experiments to translate subjective first-person experiences into quantifiable data. By correlating these experiences with brain activity, they aim to identify the neural mechanisms responsible for generating specific conscious experiences.
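In its simplest form, this kind of analysis asks whether a neural measure tracks a first-person report on a trial-by-trial basis. The sketch below correlates simulated subjective ratings with an assumed ERP-amplitude feature; both the feature and the rating scale are illustrative assumptions rather than the team’s actual protocol.

```python
# Minimal illustration: correlate trial-by-trial subjective ratings (e.g. reported
# vividness or pain intensity) with a neural measure from the same trials.
# The data are simulated and the ERP-amplitude feature is an assumption.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_trials = 120

# Simulated per-trial neural feature, e.g. mean ERP amplitude 300-500 ms post-stimulus (µV)
erp_amplitude = rng.normal(loc=5.0, scale=2.0, size=n_trials)

# Simulated subjective ratings (1-7 scale) loosely coupled to the neural feature
ratings = np.clip(np.round(1 + 0.5 * erp_amplitude + rng.normal(0, 1, n_trials)), 1, 7)

# The core question: does the neural measure track the first-person report?
r, p = pearsonr(erp_amplitude, ratings)
print(f"correlation between ERP amplitude and subjective rating: r={r:.2f}, p={p:.3f}")
```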
Injecting New Momentum into Neuroscience Research
Prof Spapé joined UM in 2023 as an associate professor at the Centre for Cognitive and Brain Sciences. Along with bringing cutting-edge perspectives to brain science research, his unique interdisciplinary background has also infused innovative momentum into the UM master’s programme in cognitive neuroscience, where he integrates his expertise in cognitive psychology, neurotechnology, and computer science.
Prof Spapé’s multidisciplinary expertise comes from an insatiable intellectual curiosity and a determination to push beyond traditional academic boundaries. Originally specialising in cognitive psychology, he quickly became captivated by the deeper complexities of the human mind. Reflecting on his educational journey, he shares, ‘I started as just a psychology student fascinated by the human mind. But during my PhD at Leiden University in the Netherlands, the first time I saw and understood EEG brainwaves—those messy, jiggly lines—it felt like the brain was whispering its secrets to me.’
This revelation reshaped Prof Spapé’s academic journey and deepened his commitment to neuroscience. After completing his studies at Leiden University, he pursued postdoctoral research at the University of Nottingham in the UK, where he focused on the electrophysiology of motor control. There, he delved into signal processing and immersed himself in the world of neurons and circuits. He then spent four years at the Helsinki Institute for Information Technology in Finland, studying computer-mediated touch while collaborating with leading experts in computer science. However, along the way, Prof Spapé had a profound realisation. ‘At heart, I’ve always been a psychologist. I want to understand how the mind works, not just the brain,’ he reflects. ‘That’s why I eventually returned to psychology departments, first in Liverpool and later in Helsinki. To me, neuroscience and computer science are critical tools to uncover the mysteries of the mind. But to make a global impact, there needs to be an equal partnership between psychology and technology.’
Throughout his career, Prof Spapé has been dedicated to interdisciplinary collaboration, working closely with computer scientists and engineers to achieve research breakthroughs. Now at UM, he leverages his European academic network to broaden students’ international research perspectives. One of his initiatives—which he humorously refers to as his ‘spy-vs-spy operation’—involves arranging regular online meetings between his students and their counterparts in Helsinki, fostering meaningful cross-border collaboration.
‘It’s about keeping communication channels open,’ Prof Spapé explains. ‘At the very least, students learn how research works in other countries and improve their English communication skills. But more often, these exchanges spark new ideas and lead to collaborative projects. Just last month, we held a productive online session with Finnish scholars to optimise some joint research initiatives.’
Making Technology an Extension of the Body
Prof Spapé and his team are working to translate the brain’s subjective consciousness into controllable actions. More specifically, their work focuses on teaching machines to truly understand the nuances of tactile feedback. ‘This foundational research will open up new possibilities for human-machine interaction and has game-changing implications for the future of smart prosthetic limbs and remote collaboration,’ Prof Spapé explains. ‘Perhaps in the near future, technology will no longer feel like a separate, external device, but instead function as a natural extension of the human body.’
Deciphering the brain’s consciousness to redefine brain-computer interaction is an extremely challenging pursuit. Inspired by the cognitive psychologist Prof Geoffrey Hinton, the 2024 Nobel laureate in physics, whom he deeply admires, Prof Spapé remains passionate about artificial neural networks and machine learning. He is committed to unravelling the mysteries of consciousness to fundamentally reshape the ways humans and machines interact. ‘I hope to leave my mark on this journey. Through groundbreaking discoveries, my goal is to establish the theoretical and technological foundations for a new era of human-machine symbiosis.’
About Prof Michiel Spapé
Michiel Spapé is an associate professor in the Centre for Cognitive and Brain Sciences and the Faculty of Science and Technology at the University of Macau. He obtained his PhD in psychology from Leiden University in the Netherlands and went on to conduct research on the brain and mind at the University of Nottingham in the UK, the Helsinki Institute for Information Technology in Finland, Liverpool Hope University in the UK, and the University of Helsinki in Finland. He has published 74 peer-reviewed articles, authored two textbooks, and holds a patent. Prof Spapé also serves as an associate editor of Psychological Research and Frontiers in Psychology: Cognition.
Source: UMagazine ISSUE 31