
SEO consultant here (yeah, yeah).

Please, even in 2019, be careful with JavaScript unless you're willing to deal with a lot of uncertainty.

Google routinely says how things are supposed to work, but there are plenty of examples where their crawlers act inconsistently with those public statements. To be clear, it's probably not Googlebot's fault; we as a species do a lot of weird things when building websites.

Just finished a big project with a client who had a client-side-rendered React app for their customer-facing website. Googlebot was VERY sporadic about indexing certain parts of the site, and in general, the site lost out to inferior sites in the more competitive SERPs. Server-side rendering fixed a lot of this and rankings/traffic jumped.

It's also worth noting that the Bing/Yahoo crawlers (still a notable chunk of web traffic!) can't execute JS. You can ignore this chunk of market share if you want, but someone is going to happily take it.

As a general rule, my advice is always this: make it as easy as possible for bots to crawl and index your site's content. The more hoops a crawler has to jump through, the worse your site will perform in search engines.




Bing Mobile Friendliness Test correctly renders JS pages. Fetch as BingBot does not currently. I logged a support request through Bing Webmaster and got a reply that their engineering team is working on it, so I would expect Bing to crawl JS sites just like Google in the near future.


Executing JS is expensive, so you should expect Bing to limit its willingness to index such sites, the same as Google does.


> Server-side rendering fixed a lot of this and rankings/traffic jumped.

Good, they should just delist client-side-rendered web pages.


While I love to rag on SPAs as much as the next person, doing that is ripe for antitrust action: "They're trying to force the web to look like what they want."


“You didn’t go to huge extra expense to index my website, so I am going to sue you for antitrust” - good luck. Google has been picky about executing JS all along.


Why? I enjoy writing client-side SPAs, and I can pump out a better UI/UX much faster than I could otherwise.

My users don't care how I build it, just that the site works and is enjoyable to use.

If I need server-side rendering, I'll still write the site as a client-side-rendered app (e.g. with React), but tack server-side rendering on at the end. A rough sketch of what I mean is below.
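
For what it's worth, here is a minimal sketch of "tack on server-side rendering at the end" with React. The Express server, port, App component, and bundle path are my own illustrative assumptions, not a prescription:

    // server.tsx - render the same component tree to HTML on the server,
    // then let the client bundle hydrate it. Crawlers get real markup
    // instead of an empty <div id="root">.
    import express from "express";
    import * as React from "react";
    import { renderToString } from "react-dom/server";

    const App = ({ title }: { title: string }) => <h1>{title}</h1>;

    const server = express();

    server.get("/", (_req, res) => {
      const markup = renderToString(<App title="Hello" />);
      res.send(`<!doctype html><html><body><div id="root">${markup}</div><script src="/client.js"></script></body></html>`);
    });

    server.listen(3000);

On the client, ReactDOM.hydrate(<App title="Hello" />, document.getElementById("root")) attaches event handlers to the server-rendered markup instead of re-rendering from scratch.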


I like to build my websites in COBOL, but the browsers won't support it even though I'm more productive.

You are not building this for yourself if you need to be indexed by Google. You need to build it in a supported way.

The bigger question is why you think an SPA should rank well in general. It's only one page.


Yes, I agree - server-side rendering is likely better for SEO in most scenarios, so I should consider it in a business context where SEO is essential.

This isn't everyone.

The comment I replied to said that Google should "delist client-side-rendered web pages", which is a terrible idea. Maybe I'm OK with the SEO hit? Maybe I've architected my SPA so that it can be indexed? (At least to some degree.)

> The bigger question is why you think an SPA should rank well in general. It's only one page.

SPAs can have multiple pages (despite the name). Check out react-router [0]; a minimal sketch is below. Browser history can be manipulated to give the same "back button/forward button" functionality between logical pages. This is done for you by whatever framework you use.

For my use case, having the browser reload every time a page is changed would severely interrupt UX.

[0] https://reacttraining.com/react-router/web/guides/quick-star...
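
As promised, a rough sketch of a multi-"page" SPA with react-router (v4/v5-style API; the routes and components are made up for illustration):

    // Each logical page gets its own URL; BrowserRouter uses the History
    // API, so back/forward work without full page reloads.
    import * as React from "react";
    import ReactDOM from "react-dom";
    import { BrowserRouter, Link, Route, Switch } from "react-router-dom";

    const Home = () => <h1>Home</h1>;
    const About = () => <h1>About</h1>;

    ReactDOM.render(
      <BrowserRouter>
        <nav>
          <Link to="/">Home</Link> <Link to="/about">About</Link>
        </nav>
        <Switch>
          <Route exact path="/" component={Home} />
          <Route path="/about" component={About} />
        </Switch>
      </BrowserRouter>,
      document.getElementById("root")
    );

Because /about is a real URL, a crawler can GET it directly - whether it sees any content then depends on whether it executes JS, which is the whole debate above.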


But to a crawler it's one "page". To humans it's different, but not to a robot. For a crawler, a "page" is still (mostly) one GET to a distinct URL.


Yet here we are discussing TFA, which says otherwise, and crawlers are only going to become better at it.

This is a beautiful part of the new web: indexable applications.


Crawlers were supposed to have this down in 2012 (at least Google's); now they recommend a sitemap.


> The bigger question is why you think an SPA should rank well in general. It's only one page.

Spoken like someone who hasn't used the web in 5 years.


I hope my mailed letter will reach HN soon so I can reply.

My work has powered some of the sites you've visited over the last 5 years. I hope you've enjoyed the experience.

Still doesn't make an SPA worthy of ranking well in general... unless it's really a useful app with actual functionality instead of a shallow one-page website.

Reply with some sites you made that you feel should rank well. Let's see how they hold up.


You can’t just say something completely false, wait until someone corrects you, and then go off about how you’ve made websites that they’ve used in the last 5 years so you know more than them.

That’s called an appeal to authority. It’s a logical fallacy. Your statement was false, regardless of how “authoritative” you are on the subject. If you meant something else by your statement, you should have expressed it better.


No one is saying they should get a boost in ranking, or rank well, just by being an SPA. You're the one suggesting they should be penalized just for the fact that they use client-side rendering.

You seem to suggest that an SPA can only consist of a single `view`, when in fact there's often no discernible difference apart from the underlying architecture. Maybe we need a new word, so people stop getting confused. SPA has nothing to do with the amount of content or views a website has.

It's just as simple to create a shallow, useless website in PHP as it is in React.


> My users don't care how I build it, just that the site works and is enjoyable to use.

Filter out poor users by making the site inoperable on older hardware. If they cannot afford a current-gen i9, they won't afford whatever my ad network is pushing. It increases the conversion rate, so all is good.


My laptop from 2011/2012 loads everything perfectly, albeit a bit slowly. I assume most people have a computer produced in the last decade.

Even my grandparents use an old iPad or cheap smartphone.

My machine isn't particularly powerful either - I use an old MacBook.


Even if it loads and performs OK, I still don't like it. Client-side rendering is making the web even more fragile - nothing is going to last anymore.


It is a website. If it requires an i9 to run, something went seriously wrong. Was this sarcasm?


Exaggeration.


I have also had a purely client-side React site fail to index properly - some pages were not picked up, and others were indexed weirdly (some of the page content ended up in the title on Google somehow!). I migrated the site to react-static (which was not too painful a migration) and all is good now. This was before the change Google announced to using the latest Chromium for Googlebot, so I'm not sure whether that would have solved the issue.
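
If it helps anyone making the same migration: the heart of a react-static setup is a static.config.js whose getRoutes declares every route up front, so each one is prerendered to plain HTML at build time. Roughly like this (the paths and data shape are illustrative assumptions; check the react-static docs for the real options):

    // static.config.js - every route listed here becomes static HTML at
    // build time, so crawlers never need to execute JS to see content.
    export default {
      getRoutes: async () => [
        { path: "/" },
        {
          path: "/blog",
          // getData runs at build time; its result is baked into the page
          getData: async () => ({
            posts: [{ id: 1, title: "Hello" }], // stand-in for a real fetch
          }),
        },
      ],
    };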



