I know what he means. As the other commenter pointed out, the question is whether ARM will be as popular 10+ years down the road. The chip Oberon originally ran on was very popular back then. It isn't any more. So, they had to rewrite significant portions of the system to support new hardware.
Wirth decided it would be better to make a simple, custom processor that follows the spirit of the system. That by itself would be a fun project. He could put it on any current or future FPGA, something very unlikely to disappear. He then ported Oberon to it. Now it's future-proof.
Maybe I worded it unclearly. The original version of Oberon ran on a popular embedded chip. Those don't exist any more, even though people at the time thought they'd be around forever. The same risk exists with ARM, just to a lesser degree.
Oberon-related tech is also training for ETH students. They get plenty of experience trying to extend the OS, port the compilers, etc. By comparison, they get little hands-on hardware experience. FPGAs have gotten cheap, and their nature means they'll stay around in some form forever. Verilog also goes back decades. So, there's the added benefit of teaching students to build a processor that can also run the system.
So, combining the future-proofing, simplicity, OS-education, and HW-education requirements all into one meant a custom, Wirth-style processor on an FPGA was the best option. It also creates opportunities for things like TCP/IP, graphics, or security engines he probably didn't think about at the time. His students or FOSS volunteers might pick that up given how simple the HW is.
So, ARM is a bad idea. Those SoCs are too complex. Extending or re-implementing it requires a ridiculously expensive license. They also sue people over even using the ISA, which makes me boycott them where possible. The FPGA doesn't have these problems, plus it allowed him and his students to attempt an ideal replacement that could last decades. So, that was his choice and one of the better options.
Note: I'd have used it as an excuse to get some grant money to do a full, low-cost, RISC-V implementation to put A2 Bluebottle on. More practical. That's just me, though. ;)
It would mean compromising the project's theme of consistency-all-the-way-down.
What I would like to see instead is RISC-V rather than their own ISA. At least then some of the lessons learned could be shared with other academics and tinkerers.
Tell that to the companies trying to push MIPS SoCs for Android smartphones. If it isn't ARM, they get told to get lost. Likewise for x86 + Windows. As ARM and Intel themselves say, being able to leverage the ecosystem of existing code, tools, talent, and so on is a huge differentiator. ARM on mobile (esp. iOS & Android) has a massive ecosystem and many tool vendors, whereas RISC-V or Wirth's CPUs don't have jack. So, manufacturers explicitly require ARM for high-end smartphones, and SoC makers default to it despite the ridiculous licensing costs.
Note: Areas like embedded in general, with smaller, custom jobs, don't worry about ISAs as much. Plenty of ARM, MIPS, PPC, x86, SPARC, SuperH, 16-bitters, 8-bitters... you name it. It's a much more fun space to be a programmer in if you get to pick the CPU/board. :)
I'm in the GPU business. Here, ISA compatibility does not even exist, and nobody cares. As long as you own an entire toolchain - a GLES/OpenCL driver, or, as in this case, an Oberon OS and a compiler - nobody gives the tiniest crap about the underlying ISA.
So, I do not see any reason for the Oberon machine to rely on any "standard" ISA. It could even be compiled down to some kind of NISC (no-instruction-set computer), and nobody would notice any difference.
Oh, I agree about Oberon and GPUs. My argument is about the mainstream software market on desktop and mobile, where the ISA usually does matter.
Now, if we're talking GPUs, the effective ISA would be DirectX, OpenGL, OpenCL, etc. They're the standards that software and tooling target. So, does your product get by without supporting any of those? Or do you have to comply with your niche's standard interfaces and ecosystems, too?