
If the technology outpaces the regulations…

What do cars and mortgages have in common?

They are expensive, they can crash, and we are increasingly arguing about the extent to which they can, or should, be automated.

Reading the FCA’s mortgage rules review consultation document, what fascinates me most is what it doesn’t say. It doesn’t mention artificial intelligence (AI) or automation anywhere.

But I can’t help feeling that unleashing AI is one of the main ways regulators hope to answer Chancellor Rachel Reeves’ call for less risk aversion and more growth.

We should not resist change. But we must be clear-eyed about the changes we are being tempted into

For those who have not read the paper, there are three core components. Two are simple and uncontroversial. One could have far-reaching consequences for our industry.

The FCA sensibly proposes retiring outdated guidance on interest-only customers and cost-of-living support, and removing the need for affordability checks for borrowers remortgaging to a cheaper deal without increasing their debt. These are wise adjustments to existing rules that should help widen access to the remortgage market.

Now for the meat. The FCA hopes to “simplify and clarify” the mortgage advice perimeter by removing “interactive dialogue” as a trigger for advice.

I firmly believe that the proposed changes to the perimeter are intended to pave the way for the widespread use of AI – but through the back door

As Seb Murphy wrote in Mortgage Strategy in May, shifting away from the existing clear trigger into regulated advice at the first sign of customer interaction would have a whole host of tricky implications – for the advice process, for adviser responsibilities, for the consumer and, ultimately, for client outcomes. I won’t repeat those arguments here.

High risk

So what does all this have to do with cars? Why should we be vigilant?

We heard last month that Uber is preparing to launch AI-guided driverless cars on UK roads – but the regulations will not be ready until the second half of 2027. I see striking parallels with the potential arrival of AI mortgage advice – and why it needs to be approached with great caution.

The similarities are obvious. Both are currently dominated by human interaction, with no track record of full automation or digital delivery. Both are high-stakes: the potential consequences of failure have led to highly regulated environments with sensible checks and balances to keep people safe.

We had better be sure who is responsible when the crash happens

Both rely on the availability and interpretation of quality data to deliver “safe” outcomes. Both face doubts about reliability, fairness and accountability. Both are subject to scrutiny over how decisions are made and how transparency can be ensured. Both raise serious questions about liability when things go wrong, especially if the failure is systemic.

And, ultimately, both will require a seismic shift in culture and consumer comfort before mass adoption becomes mainstream.

Back door

Back to mortgages. I firmly believe that the proposed changes to the perimeter are intended to pave the way for the widespread use of AI, but through the back door.

Currently, there is no way to provide “robo” advice that complies with MCOB and the Consumer Duty. Quietly removing the “interactive dialogue” trigger means that advice delivered through non-human channels such as AI chatbots would be treated the same as advice delivered by humans, with the regulatory focus shifting to whether a personal recommendation has been made.

We have seen in other sectors what happens when technology outpaces regulation

Likewise, the proposed “clarification” of the advice boundary (i.e. focusing on whether a personal recommendation exists) means firms could design AI tools that guide customers or filter product options with greater confidence that they are not straying into regulated advice – unless they want to (and, from a cost standpoint, they certainly will).

The shift away from “interactive dialogue” as the threshold for advice is not just technical. It is a potential green light for firms to build digital journeys that look and feel like advice without triggering the regulatory obligations that come with it. In a world where AI becomes ever smarter and more human, we risk a future in which consumers do not know whether they are being advised, guided or merely given recommendations.

This matters because, unlike a self-driving car taking a wrong turn, bad mortgage advice can take years to surface and be difficult to reverse. If we automate badly (or regulate too loosely), consumers will not merely be inconvenienced. They could suffer long-term financial harm.

Currently, there is no way to provide “robo” advice that complies with MCOB and the Consumer Duty

The FCA talks about simplifying rules to enable innovation. But innovation without clear accountability is dangerous. If an AI tool gives a personal recommendation that turns out to be wrong, who is liable? The software developer? The lender? The compliance officer? The algorithm itself?

These are not just hypothetical questions. If they are not resolved in the consultation now, they are questions regulators will be forced to answer in court.

We have seen in other sectors what happens when technology outpaces regulation. From social media to cryptocurrency to generative AI, the pattern is always the same: disruption, adoption, backlash, reform.

Innovation without accountability is dangerous

The mortgage market has a chance to get ahead of that curve. But only if we ask the right questions now. We should not resist change. But we must be clear-eyed about the changes we are being tempted into.

Removing the advice trigger may look like a tidying-up exercise, but it could be the equivalent of handing the keys to an AI driver before the seatbelt regulations have been written.

And if we do that, we had better be sure who is responsible when the crash happens.

Alex Beavis is the founder of Beavis Advisory


This article featured in the June 2025 edition of Mortgage Strategy.

