Light is considered a particle - or, more accurately, it has particle-like behaviour in certain contexts - because you can count individual photons. The energy scales with frequency, and it arrives in discrete lumps at discrete locations. You can't have half a photon. It's all of a photon or no photon.
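For concreteness, the quantitative statement behind "energy scales with frequency" is the Planck-Einstein relation:

    E = h * f    (h ~ 6.626e-34 J*s is Planck's constant, f the frequency)

i.e. light of frequency f can only deliver energy in whole lumps of h*f.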
It's one of Einstein's key insights, one of the foundations of quantum theory.
In Quantum Field Theory you have quantum fields. There are no 'waves' and no 'particles'; there's only a 3D probability density field that defines the probability of particle-like events at specific locations. It's the probability field that is 'wave-like', and particles are considered excitations of the field.
So you know you're quite likely to see an electron-like or photon-like event in one location, and not at all likely in another.
Calling this a 'particle' is just an analogy. All you can really say is that a particle-like measurement is likely to happen, or did happen, in one region.
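As a loose illustration of that picture (a toy 1D density with invented numbers, nothing to do with real field dynamics): treat the field as a probability density over positions and sample discrete detection 'clicks' from it.

    # Toy sketch: discrete particle-like events drawn from a wave-like
    # probability density. The density here is made up for illustration.
    import numpy as np

    x = np.linspace(-5, 5, 1000)               # 1D grid of positions
    density = np.exp(-x**2) * np.cos(3 * x)**2  # wave-like density (>= 0)
    density /= density.sum()                    # normalize to a distribution

    rng = np.random.default_rng(0)
    clicks = rng.choice(x, size=10, p=density)  # each "click" is one event
    print(clicks)  # every detection lands at one definite location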
What this really means - whether it's an observer artefact, or an exchange of information, or the output of some kind of computational and/or causal substrate, or something else entirely - is still a mystery.
While seemingly intuitive, the idea of a QFT field as a probability field over space is not correct. It usually comes as a shock to anybody entering QFT when they finally realize the actual "probability space" is infinitely larger.
As a programmer's example: you might imagine that to simulate a QFT you could allocate an array of floats over space to describe your "electron field", fill in values to reflect a probability distribution, and write some update rules to describe how this evolves in time. But this won't work, because it's not how nature works.
What you need is that electron field array (and a photon field array, to make any kind of non-trivial observations), but in 4D (space + time), and then you need to iterate over all possible values in all array bins for all fields, calculate a magic number for each configuration (in the real theory this is the complex amplitude exp(i*S), where S is the action of that configuration), and weigh all these magic numbers together to figure out the actual probabilities for any configuration.
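A deliberately tiny, brute-force caricature of that, with one field instead of two, a 2x2x2x2 lattice, only 2 allowed levels per bin, and a made-up stand-in for the action - every number and name here is illustrative, not real QFT:

    # Brute-force "sum over all field configurations" on a toy lattice.
    # 2x2x2x2 = 16 bins, 2 allowed field values per bin -> 2^16 configs.
    import itertools, cmath

    BINS = 2 * 2 * 2 * 2     # tiny 4D lattice, flattened to one tuple
    LEVELS = (0.0, 1.0)      # only 2 allowed field values per bin

    def action(cfg):
        # crude stand-in "kinetic" term: differences between adjacent bins
        return sum((a - b) ** 2 for a, b in zip(cfg, cfg[1:]))

    Z = 0j    # sum of all the "magic numbers" (amplitudes)
    num = 0j  # same sum, weighted by an observable (field value at bin 0)
    for cfg in itertools.product(LEVELS, repeat=BINS):  # all 65536 configs
        w = cmath.exp(1j * action(cfg))  # one magic number per configuration
        Z += w
        num += cfg[0] * w
    print(num / Z)  # probabilities/expectations come from ratios of such sums

Even this throwaway version already has to visit 65536 configurations; the point is how fast that count explodes.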
Even for a 16x16x16x16 array - that's 65536 bins - with only 2 levels in each bin this is already 2^65536 combinations.
All of practical QFT - whether theoretical, perturbative, or non-perturbative lattice methods - is about doing this calculation with (obviously radical) simplifications.
That we can make these enormous simplifications and still get usable results may be a very good sign that the underlying reality is in fact simpler than what QFT implies - but nobody has figured this out yet.
> Light is considered a particle - or more accurately has particle-like behaviours in certain contexts - because you can count individual photons
So how do they produce an interference pattern in the double-slit experiment? I know you don't have an answer to that, but that's rather the point - neither does QM. Probability density does not explain an interference pattern. Only one thing does: that light is in fact a wave.
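The fringe pattern itself is easy to compute from superposed, wave-like complex amplitudes - which is exactly the point: densities alone don't interfere, amplitudes do. A quick sketch with toy units (slit separation, wavelength, screen distance all made up):

    # Two-slit fringes: adding amplitudes first, |psi1 + psi2|^2, gives
    # fringes; adding the per-slit densities |psi1|^2 + |psi2|^2 does not.
    import numpy as np

    wavelength = 0.1
    k = 2 * np.pi / wavelength
    d, L = 5.0, 100.0                 # slit separation, screen distance

    x = np.linspace(-10, 10, 21)      # points on the screen
    r1 = np.hypot(L, x - d / 2)       # path length from slit 1
    r2 = np.hypot(L, x + d / 2)       # path length from slit 2
    psi1 = np.exp(1j * k * r1)
    psi2 = np.exp(1j * k * r2)

    print(np.round(np.abs(psi1 + psi2) ** 2, 2))           # oscillates: fringes
    print(np.round(np.abs(psi1)**2 + np.abs(psi2)**2, 2))  # flat 2.0: no fringes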