ARM has the Thumb opcodes. They aren't your "PISC" concept (which is a limited form of loadable microcode, another thing that's been done); they're special, shorter encodings of a subset of ARM opcodes, which the CPU recognizes once an instruction flips an internal state bit. There's also Thumb-2, which mixes short (16-bit) and full-size (32-bit) opcodes to fix some performance problems of the original Thumb concept.
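To make the density difference concrete, here's a toy Python sketch that hand-assembles the same `ADDS r0, r1, r2` in both encodings. The bit layouts follow the classic ARM data-processing format and the Thumb register-ADD format; this is illustrative only, not an assembler:

```python
# Toy encoder for one instruction, ADDS Rd, Rn, Rm, in both encodings.
# Bit layouts are hand-derived from the classic ARM/Thumb formats;
# treat this as an illustration, not a reference assembler.

def arm_adds(rd, rn, rm):
    """32-bit ARM: cond=AL(1110) | opcode=0100 (ADD) | S=1 | Rn | Rd | Rm."""
    return (0b1110 << 28) | (0b0100 << 21) | (1 << 20) | (rn << 16) | (rd << 12) | rm

def thumb_adds(rd, rn, rm):
    """16-bit Thumb: 0001100 | Rm | Rn | Rd (only low registers r0-r7 fit)."""
    return (0b0001100 << 9) | (rm << 6) | (rn << 3) | rd

print(hex(arm_adds(0, 1, 2)))    # 0xe0910002 -- 4 bytes
print(hex(thumb_adds(0, 1, 2)))  # 0x1888     -- 2 bytes
```

Note the trade-off the quotes below describe: the Thumb form is half the size, but only reaches the low registers and a reduced set of operand shapes.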
And for a description of how to build a computer with loadable microcode, you could start with Mick and Brick [1] (PDF). It describes using AMD 2900-series [2] components; the main alternative at the time was to use the TI 74181 ALU [3] and build your own microcode engine.
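As a rough illustration of the idea (a sketch only, not a model of the actual AMD parts), a microcoded engine is just a writable control store whose words drive a generic datapath and select the next microaddress:

```python
# Minimal sketch of a microcoded engine: a writable control store whose
# words select an ALU function and the next microaddress. Loosely inspired
# by the Am2901 (ALU slice) / Am2910 (sequencer) split, but not a model of them.

ALU_OPS = {
    0: lambda a, b: a + b,
    1: lambda a, b: a - b,
    2: lambda a, b: a & b,
    3: lambda a, b: a | b,
}

def run(control_store, a, b):
    """Each microword: (alu_op, dest_is_a, next_addr or None to halt)."""
    upc = 0  # micro program counter
    while True:
        alu_op, dest_is_a, next_addr = control_store[upc]
        result = ALU_OPS[alu_op](a, b) & 0xFFFF
        if dest_is_a:
            a = result
        else:
            b = result
        if next_addr is None:
            return a, b
        upc = next_addr

# "Loading microcode" is just writing a new control store:
# a := a + b, then b := a & b, then halt.
microcode = [
    (0, True, 1),     # a := a + b
    (2, False, None), # b := a & b; halt
]
print(run(microcode, 5, 3))  # -> (8, 0)
```

The point is that the instruction set is defined by the contents of the control store, so swapping it changes what the machine "is", which is what the PERQ exploited.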
> On execution, 16-bit Thumb instructions are transparently decompressed to full 32-bit ARM instructions in real time, without performance loss.
That quote is from the manual for the ARM7TDMI, the CPU used in the Game Boy Advance, for example. I believe later processors contained entirely separate ARM and Thumb decoders.
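The "decompression" in that quote amounts to a fixed rewiring of bit fields. A hedged Python sketch, handling only the register-form ADDS (encodings hand-derived, illustrative only):

```python
def decompress_thumb_adds(t):
    """Expand the 16-bit Thumb 'ADDS Rd, Rn, Rm' (format 0001100...) into
    the equivalent 32-bit ARM instruction. Only this one format is handled;
    the hardware does the same kind of field rewiring for every format."""
    assert (t >> 9) == 0b0001100, "not a register-form ADDS Thumb instruction"
    rm = (t >> 6) & 0b111
    rn = (t >> 3) & 0b111
    rd = t & 0b111
    # ARM ADDS: cond=AL | opcode=ADD(0100) | S=1 | Rn | Rd | Rm
    return (0b1110 << 28) | (0b0100 << 21) | (1 << 20) | (rn << 16) | (rd << 12) | rm

print(hex(decompress_thumb_adds(0x1888)))  # -> 0xe0910002 (ADDS r0, r1, r2)
```

Because the mapping is a pure function of the 16-bit word, it can sit in the decode stage with no extra pipeline latency, which is why the manual can claim "without performance loss".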
https://developer.arm.com/documentation/dui0473/m/overview-o...
> ARMv4T and later define a 16-bit instruction set called Thumb. Most of the functionality of the 32-bit ARM instruction set is available, but some operations require more instructions. The Thumb instruction set provides better code density, at the expense of performance.
> ARMv6T2 introduces Thumb-2 technology. This is a major enhancement to the Thumb instruction set by providing 32-bit Thumb instructions. The 32-bit and 16-bit Thumb instructions together provide almost exactly the same functionality as the ARM instruction set. This version of the Thumb instruction set achieves the high performance of ARM code along with the benefits of better code density.
So, in summary:
https://developer.arm.com/documentation/ddi0210/c/CACBCAAE
> The Thumb instruction set is a subset of the most commonly used 32-bit ARM instructions. Thumb instructions are each 16 bits long, and have a corresponding 32-bit ARM instruction that has the same effect on the processor model. Thumb instructions operate with the standard ARM register configuration, allowing excellent interoperability between ARM and Thumb states.
> On execution, 16-bit Thumb instructions are transparently decompressed to full 32-bit ARM instructions in real time, without performance loss.
For an example of how loadable microcode worked in practice, look up the Three Rivers PERQ:
https://en.wikipedia.org/wiki/PERQ
> The name "PERQ" was chosen both as an acronym of "Pascal Engine that Runs Quicker," and to evoke the word perquisite commonly called a perk, that is an additional employee benefit