Two things: first, the data is used to improve our algorithms. Second, spam reports get 4x the weighting when we prioritize sites for manual spam investigation. That does mean that if you file a spam report on a page/site that only gets seen (say) once every six months, it might not get looked at on the manual side, simply because there are other sites that are hurting users more.
Giving spam reports 4x the weight compared to the sites that we autodetect is how we try to balance responding to outside feedback with spending our review cycles on the spam that impacts users most.
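To make that tradeoff concrete, here's a rough sketch of what a weighted triage queue like this could look like. This is purely illustrative: the names, the use of daily page views as the user-impact signal, and applying the 4x as a simple multiplier are all assumptions for the example, not the actual pipeline.

```python
from dataclasses import dataclass

# Hypothetical weights: user spam reports count 4x vs. autodetected sites.
REPORT_WEIGHT = 4
AUTO_WEIGHT = 1

@dataclass
class SpamCandidate:
    url: str
    daily_views: float   # assumed proxy for how many users the page reaches
    user_reported: bool  # True if flagged via a spam report

def triage_score(c: SpamCandidate) -> float:
    """Higher score = reviewed sooner: estimated impact times source weight."""
    weight = REPORT_WEIGHT if c.user_reported else AUTO_WEIGHT
    return c.daily_views * weight

candidates = [
    # ~once every six months is roughly 0.005 views/day
    SpamCandidate("reported-but-obscure.example", daily_views=0.005, user_reported=True),
    SpamCandidate("autodetected-popular.example", daily_views=50_000, user_reported=False),
    SpamCandidate("reported-popular.example", daily_views=20_000, user_reported=True),
]

# Sort the review queue by descending score.
for c in sorted(candidates, key=triage_score, reverse=True):
    print(f"{triage_score(c):>12.2f}  {c.url}")
```

Note how the obscure reported page still lands at the bottom of the queue: a 4x multiplier boosts a report, but it doesn't let a page almost nobody sees outrank spam that's reaching lots of users.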