Immigration Corner | When AI gets it wrong
Dear Miss Powell,
I recently saw that you posted a news story about someone whose permanent residence application was refused because IRCC used AI. I didn’t know they could do such a thing. Should I be worried about this when applying to Canada?
RS
Dear RS,
You are right to pause and ask this question. The story you are referring to is real, it is recent, and it raises important concerns for anyone who has an immigration application before Immigration, Refugees and Citizenship Canada.
WHAT HAPPENED
In February 2026, a French health scientist named Kémy Adé received a refusal of her permanent residence application. Dr. Adé holds a PhD from Sorbonne University in the immunology of ageing and was working as a post-doctoral research fellow and guest teacher at McMaster University in Canada. The refusal letter stated that her job duties did not match the Canadian work experience she was claiming.
The problem was that the duties cited in the refusal letter were not her duties at all. According to the Toronto Star, which broke the story, the refusal described her work as wiring and assembling control circuits, building control and robot panels, and programming and troubleshooting. Dr. Adé is a biomedical researcher. She has never performed any of those tasks, and she never described them in her application.
What made this case particularly significant is that the refusal letter contained a disclaimer disclosing the use of generative artificial intelligence in the application review. It is believed to be the first time IRCC explicitly acknowledged using generative AI in the context of an immigration refusal. The disclaimer noted that all generated content was verified by a human officer. Dr. Adé’s lawyer responded to that statement with one question: how could any human being have read that description and believed it matched his client?
THE SYSTEM BEHIND THE STORY
IRCC has been using various forms of automation and data analytics in immigration processing for years, and it published its first formal AI strategy in February 2026. The strategy sets out principles including human oversight, transparency, and fairness. It states clearly that AI tools do not refuse or recommend refusing applications and that all refusals are made by human officers based on their own review. However, this case highlights a troubling gap between policy and practice. While AI is not supposed to make decisions, it may still influence them.
WHY JOB DUTIES MATTER SO MUCH
If your duties are mischaracterised, whether by a human officer, an automated tool, or a combination of both, the result can be a refusal that has nothing to do with your actual qualifications. That appears to be exactly what happened here.
Your reference letters and employment documents must describe what you actually did, clearly and specifically. Any inconsistency between your documents creates room for misinterpretation, and in a system under pressure, that room can be costly.
WHAT YOU SHOULD DO IF THIS HAPPENS TO YOU
Read your refusal letter carefully. Pay specific attention to how your job duties are described. If you see duties attributed to you that you never performed and never claimed, do not ignore this.
In Dr. Adé’s case, her lawyer requested reconsideration, and IRCC reopened the file. This is a reminder that a refusal is not always final, particularly where the decision is based on facts that are plainly not supported by the application on record.
Where a decision is based on inaccurate information, you may also be able to seek judicial review at the Federal Court of Canada. The court will assess whether the decision was reasonable and whether the process was fair. A refusal grounded in facts that do not exist in your file can be set aside. The timelines for seeking judicial review are short, generally 15 days for decisions made inside Canada and 60 days for decisions made outside Canada, so you must act quickly.
WHAT THIS MEANS FOR APPLICANTS GOING FORWARD
Immigration processing is changing. AI is now embedded in parts of the system, and that is unlikely to reverse.
For applicants, this means that clarity and consistency in your application are more important than ever. Your file must be precise, your duties must be described accurately and in detail, and your documents must tell a coherent story that leaves as little room as possible for misinterpretation by any reviewer, human or automated.
If you are preparing an application and are unsure whether your work experience is properly aligned with the relevant occupation, seek proper legal advice before you submit. A well-prepared application is your best protection in a system that is still learning how to use the tools it has adopted.
Deidre S. Powell is a Canadian lawyer, mediator and author of “Tell me a story Grandma”. Connect with her via www.deidrepowell.com or via Facebook, Instagram, and Twitter. Telephone/WhatsApp 613-695-8777