If you are iterating over a lot of files, a while-read loop can be a major bottleneck. As long as you use the null-delimiter options (find's -print0 and xargs' -0) and pipe into xargs, you should be safe with any filename.
I've found it can reduce minutes down to seconds for large operations.
If you have to process a large number of files, you can let xargs minimize the number of times a program is run, instead of running it once per file.
Something like:
# Set the setgid bit for owner and group of all folders
find . -type d -print0 | xargs -0 chmod g+s
# Make the targets of symlinks immutable
find . -type l -print0 | xargs -0 readlink -z | xargs -0 chattr +i
Way faster.
But there are caveats. Make sure the programs you call can accept multiple file arguments in one invocation. Maybe read the xargs man page.
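To see the batching concretely, here is a toy demo (the names are placeholders; -n just caps the arguments per invocation so the grouping is visible):

```shell
# Three NUL-separated names, at most two per invocation:
# echo runs twice instead of three times.
printf '%s\0' one two three | xargs -0 -n 2 echo
```

Without -n, xargs packs as many arguments as the system's command-line length limit allows, which is where the speedup comes from.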
Personally I skip the middleman when I can with "find ... -exec cmd {} +"
find . -type d -exec chmod g+s {} +
Or even minimise the argument list by adding a test for whether the chmod is needed at all:
find . -type d \! -perm -g=s -exec chmod g+s {} +
I actually have a script that fixes up permissions, and I was delighted to fit it into a single find invocation which performs only one stat() per file in the traversal, and executes chown/chmod only for the files that actually need a change.
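A minimal sketch of that kind of single-pass invocation, assuming GNU find (the target directory and the chosen permission rules here are made up for illustration, not the original script):

```shell
# Demo tree; substitute your own directory.
dir=$(mktemp -d)
mkdir -p "$dir/sub"
touch "$dir/sub/file"
chmod 666 "$dir/sub/file"

# One traversal, one stat() per file: each -exec is guarded by a
# permission test, so chmod only runs on entries that need a change.
# The comma operator makes find evaluate both groups for every file.
find "$dir" \
  \( -type d ! -perm -g=s -exec chmod g+s {} + \) , \
  \( -type f -perm /o=w -exec chmod o-w {} + \)
```

Because the tests reuse the stat() data find already gathered for the traversal, files that are already correct cost nothing beyond the walk itself.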