It's both. The system or process has its actual entropy, and the sequence of observations we make has a certain entropy of its own. We can say "this sequence of numbers has this entropy", which is slightly different from the entropy of the process that created the numbers. For example, as we make more coin tosses, the entropy of our sequence of observations gets closer and closer to the actual entropy of the coin.
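A minimal sketch of this convergence, assuming a biased coin with P(heads) = 0.7 (the bias is just for illustration): we compute the empirical Shannon entropy of the observed sequence for growing sample sizes and compare it to the true entropy of the process.

```python
import math
import random
from collections import Counter

def empirical_entropy(seq):
    """Shannon entropy (in bits) of the observed symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
p_heads = 0.7  # an assumed bias, for illustration

# True entropy of the process: H(p) = -p*log2(p) - (1-p)*log2(1-p)
true_entropy = -(p_heads * math.log2(p_heads)
                 + (1 - p_heads) * math.log2(1 - p_heads))

# Entropy of the observed sequence for increasing numbers of tosses
for n in (10, 100, 10_000):
    tosses = ["H" if random.random() < p_heads else "T" for _ in range(n)]
    print(f"n={n:>6}: empirical entropy = {empirical_entropy(tosses):.4f}")

print(f"true entropy of the coin:       {true_entropy:.4f}")  # ≈ 0.8813
```

Each printed empirical value is a property of that particular sequence of observations; only as n grows does it settle toward the 0.8813 bits that characterize the coin itself.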