Browsers sometimes add things to URLs to try and make them work. Firefox tends to retry with https:// if http:// fails; perhaps some browsers are adding www. as well.
Funny, if true (doing some research...). I'll add this to the war chest for that classic interview question about what happens when you type a URL into your browser.
Definitely was true. Browsers (at least FF) also used to add .com at the end. I think these days they all just send you to their ad-laden funding source (the default search engine) instead if there is no TLD.(*)
I just don't get how the browser gets a response like this (below) and then figures out what to do next. A sister comment said it might just try the common "www." prefix.
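Something like this, I'd guess. A rough sketch of the fix-up heuristic, not what any browser actually ships; the candidate order and the wixsite.com example below are my assumptions:

    import socket

    # Sketch of the old Firefox-style fix-up: if the typed host doesn't
    # resolve, retry with a "www." prefix and a ".com" suffix before
    # falling back to search. Candidate order is a guess on my part.
    def fixup_candidates(host):
        yield host
        if not host.startswith("www."):
            yield "www." + host
        if "." not in host:
            yield host + ".com"
            yield "www." + host + ".com"

    def resolve_with_fixup(host):
        for candidate in fixup_candidates(host):
            try:
                socket.getaddrinfo(candidate, 80)  # raises gaierror on NXDOMAIN
                return candidate
            except socket.gaierror:
                continue  # host not found, try the next guess
        return None  # give up; a browser would hand this off to search

    # e.g. resolve_with_fixup("wixsite.com") would return "www.wixsite.com"
    # if only the www host has an address record.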
I think there isn't an inbound link: the crawler chooses to hit http://wix.com/ when links to wix.com subdomains are common enough. It might be that there are millions of links to www.wix.com and docs.wix.com and user.api.wix.com and not a single (broken) link to wix.com itself, and they will crawl http://wix.com/ anyway and decide that "the site is dead". This is a problem with their methodology.
Yes, this goes to “what is a site?” and “who/what controls which subdomains?”. Especially with things like github.io, and indeed Wix. I think ignoring dead apex domains when a subdomain worked would have been a good extra pass for the methodology.
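A minimal sketch of that extra pass, assuming the crawl results boil down to a hostname -> responded map (the data shape and names here are made up, not from the actual study):

    # Hypothetical shape of the crawl results: hostname -> did it respond.
    results = {
        "wix.com": False,      # apex gives nothing
        "www.wix.com": True,   # but the canonical site is fine
        "docs.wix.com": True,
    }

    def apex_effectively_alive(results, apex):
        if results.get(apex):
            return True
        # Count the apex as alive if any crawled subdomain of it responded.
        suffix = "." + apex
        return any(alive for host, alive in results.items() if host.endswith(suffix))

    print(apex_effectively_alive(results, "wix.com"))  # True, so don't report it dead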
Perhaps that is the reason for the apex domain to be dead in the first place: to communicate that the subdomains are the real roots of separate sites. Similarly, TLDs themselves are not supposed to have any A records (although there are some that do).
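That's easy to check from a stub resolver, for what it's worth. The trailing dot makes the name fully qualified, so no search domain gets appended; .ai is one TLD that historically had an A record, if I remember right:

    import socket

    # Note the trailing dot: "ai." is the bare TLD, fully qualified.
    for tld in ("ai.", "com.", "io."):
        try:
            addrs = {info[4][0] for info in socket.getaddrinfo(tld, None)}
            print(tld, "resolves to", addrs)
        except socket.gaierror:
            print(tld, "has no address records (the normal case)")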
I'm not sure if it's because I have my browsers set to generally "do what I say" or because I'm using a filtering proxy, but Firefox doesn't seem to try again if I just put "wixsite.com" in the address bar --- it gets a Host Not Found from the proxy and stops.