Machine learning and some types of AI were already widely used for fraud and money laundering detection, but applications in other areas were only at an exploratory stage, Woods said.

"We want to be technology-agnostic in all of our regulation," Woods said at a press conference after the BoE published its latest Financial Stability Report.

"AI-specific financial regulation may not be the right way forward," he added.

He said the Bank's Financial Policy Committee would take a look at AI and machine learning next year to consider their risks to financial stability, working alongside other authorities.

AI and machine learning could deliver significant benefits to the financial sector by driving greater operational efficiency, improving risk management and providing new products and services, the committee said.

But wider adoption could pose system-wide risks, such as amplifying herd behaviour or increasing the risk of cyber attacks, it added.

BoE Governor Andrew Bailey said everyone was learning about AI at a fast pace, and should approach it with "eyes wide open".

"It has, I think quite profound implications, potentially for economic growth and how economies are shaped going forward," Bailey said.

Firms using AI needed to understand exactly how it worked, he added.

Woods said the BoE could consider what steps senior managers in the financial sector, who are directly accountable to regulators, should take to assure themselves that "what's coming out of the black box is actually reasonable".

The BoE has just been given new powers to oversee how banks use third parties, such as cloud providers, for critical services.

"So if it became the case the financial system became significantly dependent on a provider of AI technology, it might well be that that will be a candidate for the use of those new powers," Woods said.

(Writing by Huw Jones; additional reporting by Andy Bruce and Suban Abdulla; editing by David Milliken and Christina Fincher)
