We've only tested on personal systems so far, just a few TB.
The theoretical limit is around 9 PB, but we don't know what performance would look like at that scale.
Scanning a 1 TB spinning-disk drive takes about 30 minutes, at several thousand files per second, and query performance is very good. Most people scan project directories on demand and the whole disk once a week or so. SSDs are much faster, of course.
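As a rough sanity check on those figures (taking 5,000 files/sec as a midpoint of "several thousand" — an assumed value, not a measured one):

```python
# Back-of-envelope check: how many files a 30-minute scan covers
# at an assumed rate of 5,000 files/sec.
files_per_sec = 5_000
scan_seconds = 30 * 60  # 30 minutes

total_files = files_per_sec * scan_seconds
print(total_files)  # 9000000
```

So a full-disk scan at that rate covers on the order of 9 million files, which is consistent with a well-populated 1 TB drive.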
There is a restriction on the maximum number of files acted on by a single query (e.g. moved, deleted, renamed), because the exec() function caches the list of matched files in memory. On macOS we're fine with tens of millions of files, but the first Windows release, due any day now, is 32-bit (we're fighting compiler issues), so the limit there is around a million files.
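The 32-bit ceiling follows from address-space arithmetic. A minimal sketch, assuming roughly 200 bytes per cached entry (path plus metadata) and about 2 GiB of usable address space in a 32-bit process — both figures are illustrative assumptions, not measurements from the tool:

```python
# Rough estimate of why a 32-bit process caps the per-query file list.
# Assumed values: ~200 bytes per cached path/metadata entry, and
# ~2 GiB of usable address space in a 32-bit process.
bytes_per_entry = 200
usable_address_space = 2 * 1024**3  # ~2 GiB

max_entries = usable_address_space // bytes_per_entry
print(f"{max_entries:,}")  # 10,737,418
```

That is a theoretical ceiling of roughly 10 million entries; in practice, heap fragmentation, per-allocation overhead, and the rest of the application's memory use push the workable limit down to around a million files.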