Show HN: In-view.js – Get notified when DOM elements enter or exit the viewport (github.com/camwiegert)
118 points by Heqx on Aug 27, 2016 | 62 comments



There is an emerging standard called Intersection Observer that addresses the same use case: https://github.com/WICG/IntersectionObserver/blob/gh-pages/e...

This is a really useful problem to solve. But, I would personally prefer to solve it with a polyfill for a standardized approach that will eventually receive native implementation.


The repository you linked to actually contains the official polyfill: https://github.com/WICG/IntersectionObserver/tree/gh-pages/p... (it should be about 6 KB minified and gzipped)


And the polyfill defers to the native implementation where available, like Chrome and Opera, for much better performance and responsiveness.
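For anyone who hasn't tried it, the standard API looks roughly like this; a minimal sketch, where the '.watched' selector and the threshold values are placeholders of my own, not anything taken from in-view.js or the polyfill:

  // Minimal IntersectionObserver sketch; '.watched' and the thresholds are
  // placeholder values.
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.intersectionRatio > 0) {
        console.log('entered viewport:', entry.target);
      } else {
        console.log('left viewport:', entry.target);
      }
    });
  }, { root: null, rootMargin: '0px', threshold: [0, 0.25] });

  Array.prototype.forEach.call(
    document.querySelectorAll('.watched'),
    function (el) { observer.observe(el); }
  );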


Came here to say this. Recently spent a day hand-rolling my own, much like the author here has, only to rip it out at the end of the day and replace it with a lightweight wrapper around IntersectionObserver, using the polyfill. The API is just great, really well-thought out and much better than what I had come up with. The only downside is that I'd gone to some pain to get everything working with RAF, and the polyfill doesn't bother with that.


I hear you. But, isn't this just an editor's draft spec? Only Chrome and Android have done any implementation at all. So, as far as I can tell, there's a chance this will never be fully implemented? And, because it's a draft, the spec could change significantly. Is that right?


A spec being draft seems like an odd reason to ignore it and do a completely different API.

Intersection Observer has been implemented in Chrome since 51, and Opera since 38. It's currently being implemented in Firefox ( https://bugzilla.mozilla.org/show_bug.cgi?id=1243846 ), and is "likely" from Edge (https://developer.microsoft.com/en-us/microsoft-edge/platfor... ).

in-view.js's API is certainly less likely to be the cross-platform API than Intersection Observer. Also, plenty of APIs are implemented by browsers at the editor's draft stage.


I was sincerely asking all of those questions. I'm not very familiar with the process these specs go through to reach adoption.


All the more reason to experiment with the proposed standard API and contribute feedback.

The advantages of standardization and eventual native implementation outweigh the immediacy of a JS-based API designed in relative isolation. Why should I invest time learning this micro-library, when the API is certain to be different from the native implementation? The documentation doesn't seem to even acknowledge the existence of the standard (did the author do any research before implementing a one-off library? I have no way of knowing.), much less explain why it differs from the proposed standard.


I know that this type of functionality is useful, but please consider the performance implications before doing anything like this. Non-native implementations (i.e. not IntersectionObserver) _always_ use properties and methods that cause style and layout calculations which have a significant performance impact on low-powered devices.

A more comprehensive list is here[1], but the main culprits are HTMLElement.offset{Width,Height}, Element.client{Width,Height}, and Window.getComputedStyle(). Avoid these, _especially_ on events like scroll. You will ruin the experience for many of your users.

[1] https://gist.github.com/paulirish/5d52fb081b3570c81e3a
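To make that concrete, here's a hedged sketch of the anti-pattern next to one common fix: measure outside the hot path and reuse a cached value on scroll (the reads shown are examples from the list above):

  // Anti-pattern: these reads force synchronous style/layout work when
  // called inside a scroll handler.
  window.addEventListener('scroll', function () {
    var width = document.body.clientWidth;               // forces layout
    var style = window.getComputedStyle(document.body);  // forces style calc
    console.log(width, style.position);
  });

  // Cheaper: measure once up front (and again on resize), and reuse the
  // cached value inside the scroll handler.
  var bodyWidth = document.body.clientWidth;
  window.addEventListener('resize', function () {
    bodyWidth = document.body.clientWidth;
  });
  window.addEventListener('scroll', function () {
    console.log(bodyWidth); // no layout-forcing reads in the hot path
  });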


And it's impossible to achieve this without probing those attributes.

Some advertising libraries solve that by being smart about when to do it, since the advertising industry has been dealing in in-view ad impressions for a few years now. The best one is called safeFrames. Others, like Moat, detect whether the user has a fast computer, abuse it to no end, and then extrapolate the results to the whole audience.

I expect all of them to move to the native APIs soon (but Android fragmentation will make it slow).


There are a lot of edge cases to handle when creating a library like that and I cannot see them dealt with in this library.

More than three years ago I created in-viewport (https://github.com/vvo/in-viewport) and have perfected it since. It's used on Fortune 500 websites along with my lazyloader (https://github.com/vvo/lazyload).

https://github.com/camwiegert/in-view/issues/7


I hope this gets used for good, not evil. Lots of sites (you know, those sites; they crop up on HN from time to time) flash up "give me your email address so I can send you spam" boxes before I can read the blog post. Or when the site thinks I've gone somewhere else when in fact I just opened it in a new tab, or if I momentarily switched away. I mostly close blogs that aggressively try to sell me things, but sometimes I want to read the content.

Conversations about what's an app and what's a document aside, I'd much rather documents didn't know anything about how they were being displayed.


It's unfortunate. However, it's been shown over and over to have the best conversion rate for getting people's emails, so it's a trend that's only going to become more popular from here (until the next high conversion pattern is developed).


Not necessarily. Google just announced that, in the name of accessibility, they will start penalizing sites that use them.

https://webmasters.googleblog.com/2016/08/helping-users-easi...


I think it's only for mobile versions of a website:

"To improve the mobile search experience, after January 10, 2017, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly."

They define what is and isn't acceptable right after that. I like the move, though!


Sure, it's great in that one metric. But annoying me with the offer to get a bunch of emails I don't want isn't making me like your site.

Even if I were the odd one out and nobody else minded those pop-ups: in the context of multiple experiments showing that a few hundred milliseconds of additional page load time have a very real impact on engagement rates, it stands to reason that interrupting the user experience with unrelated email sign-ups should have a similar effect.


Adblock fixed that for me.


I am using this technique to lazy load and unload videos on my site[1]. Since there are many videos that autoplay on scroll, some mechanism is needed to stop loading a video that has been scrolled over.

I am using onScreen[2], which I found more efficient than alternative solutions.

[1] http://rybakov.com/blog/

[2] https://github.com/silvestreh/onScreen


Babel and webpack are massive overkill for this and account for a good chunk of the 2k gzipped size. The actual library code is barely 140 SLOC. There's a lot of room for improvement if this is intended to be a real standalone library (vs. a webpack/babel test).


I've found rollup, https://rollupjs.org/, to be really useful for this.


This looks nice. Is it as battle-tested as browserify?


It's fairly new but The Guardian uses it for all their JavaScript (the creator works there).


tl;dr—don't use rollup with large untested dependencies

I've had projects that produced really strange error messages when using rollup; the errors completely went away when I switched back to Babel.

One "issue" with rollup is that it is not 100% semantically correct. Neither is Babel in all cases, but the creator argues that if you're not following exact semantics now, you're relying on your tests to ensure proper behavior, so use something that is at least more minimal and keeps your bundle size down.

So when you're writing code and bundling with rollup, you can pretty much ensure everything works fine, but as soon as you pull in an extensive third-party library you have no assurances that it has been tested with rollup and will work correctly in all cases. In the worst case, it will seem to work fine but in weird situations will actually error out. This was my experience with rollup.


I should clarify my comment above: with rollup the semantics don't matter as much, but if you're using Bublé instead of Babel, the semantics may very well come into play. In either case, I didn't have luck on a project until I moved fully to webpack+Babel.


Webpack's runtime overhead is tiny; Babel's varies based on what polyfills you need, but very little is included here.

Most of the code here is actually from lodash due to the use of `throttle`, not either of the projects you mentioned.


I believe Waypoints also does this same thing, it's a fairly mature lib. https://github.com/imakewebthings/waypoints

Kudos for in-view's demo page, though.


Why is Babel overkill? It lets you write your code in ES6, which is a huge benefit for many reasons.


The fact that it's delivering 140 lines in 5.5kb. That's a lot of bloat! There are ways to write code in ES6 without that (see sibling responses).


Hey, author here. I'm very open to ideas to reduce bloat. It's 140 SLOC and ~1.1kb gzipped without lodash/throttle. I considered writing my own throttle, but wanted something more battle-tested.

Does Rollup produce a more efficient bundle in your experience?


Not sure why I was downvoted; I think it's a valid point. To prove it, here's the functionality of your library in ES5 with a naive throttle function: https://gist.github.com/nathancahill/f7ea239306737f2075a94de...

Minified (1.49kb) and gzipped (677b).

Whether lodash functions should be used instead of naive functions is up for debate. My opinion is that if the lodash functions are 5x the size of the entire library, it's probably best not to include them, or to include them as a build option (if the user's project already includes lodash, for example).
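For readers who don't want to click through, a naive throttle is roughly this sort of thing; a sketch of the general idea, not the gist's exact code:

  // Naive throttle sketch: call fn at most once per `wait` ms and drop the
  // calls in between (no trailing invocation, unlike lodash's throttle).
  function throttle(fn, wait) {
    var last = 0;
    return function () {
      var now = Date.now();
      if (now - last >= wait) {
        last = now;
        fn.apply(this, arguments);
      }
    };
  }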


No worries, lodash/debounce can be reduced further with babel+webpack plugins.

Assuming the current babel+webpack setup, the difference from your naive version is just ~0.5 kB.


Which still doubles the size of the library: ~0.6kb to ~1.1kb. So while it's just half a kilobyte, do that over and over, with nested dependencies, and it really adds up.



Can you point me to an explanation of the differences between the 'lodash.throttle' module and importing 'lodash/throttle'?


lodash.throttle is a standalone, zero-dependency package containing just the throttle module. The `lodash` package is a collection of modules, one of which is `lodash/throttle`.

You can generally get smaller bundles using `lodash/xyz` modules over the `lodash.xyz` packages because of plugins like:

https://github.com/lodash/babel-plugin-lodash

https://github.com/lodash/lodash-webpack-plugin

Though in the future lodash-webpack-plugin may support `lodash.xyz` packages too.
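For illustration, wiring both plugins together looks roughly like this; a webpack config sketch that assumes babel-loader and both plugins are installed, with placeholder entry/output paths and presets:

  // Sketch: babel-plugin-lodash (via babel-loader's `plugins`) plus
  // lodash-webpack-plugin; paths and presets here are placeholders.
  var LodashModuleReplacementPlugin = require('lodash-webpack-plugin');

  module.exports = {
    entry: './src/index.js',
    output: { filename: 'bundle.js' },
    module: {
      loaders: [{
        test: /\.js$/,
        loader: 'babel-loader',
        query: { presets: ['es2015'], plugins: ['lodash'] }
      }]
    },
    plugins: [new LodashModuleReplacementPlugin()]
  };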


You should get about the same results using a bundler. Perhaps the lodash.throttle package is for reducing `npm install` time or reducing bloat if you check in your node modules?


OP is using that already.


Rollup can load only the lodash functions you actually use into your bundle.


So can lodash. That's what the author is doing here by using

  import throttle from 'lodash/throttle';
rather than

  import { throttle } from 'lodash';


The only sibling response is rollupjs, which replaces webpack with ES6 import syntax. You still need Babel for the rest of the ES6 syntax.


> You still need Babel for the rest of the ES6 syntax.

Or Bublé[1] ;) which I've found to be much faster than Babel, even for relatively small codebases (~ a couple thousand LoC).

[1]: https://gitlab.com/Rich-Harris/buble


But if your entire project uses Babel, those 140 lines will already be in your code and thus won't be added.


Whenever you attach an event handler on scroll, it can impact page performance because the JavaScript needs to be evaluated before the scroll event can continue, AFAIK... going from some old Paul Irish talk on performance.

The #1 killer he found responsible for "jankiness" (that horrible feeling where the page scroll is unresponsive and not 60 fps smooth) is people attaching event handlers to the scroll event that take more than a few ms to complete, thus reducing the frame rate of the page.

It seems like this library is throttling the evaluation of the event handler, but you still need to be careful not to solve one performance problem and create another. I know I personally hate slow-scrolling web pages.

This seems like something that should be implemented in some as yet unreleased CSS selector. But that still won't stop people from abusing it.


Whenever I rely on scroll information, I store the info I need from the scroll event in a variable, and the logic that would normally go in the callback runs in a separate requestAnimationFrame loop (sketched below). This way, even if it takes a few ms, it doesn't impact the scroll speed of the page. I wish more developers did this.
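Roughly this shape, with variable names of my own:

  // The scroll handler only records state; the heavier logic runs in a
  // requestAnimationFrame loop.
  var lastScrollY = window.pageYOffset;
  var dirty = false;

  window.addEventListener('scroll', function () {
    lastScrollY = window.pageYOffset;
    dirty = true;
  });

  (function tick() {
    if (dirty) {
      dirty = false;
      // placeholder for the work that would otherwise sit in the handler
      console.log('scroll position is now', lastScrollY);
    }
    requestAnimationFrame(tick);
  })();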


Ah the good old "GET OFF THE UI THREAD!"


"Whenever you attach an event handler on scroll, it can impact page performance because the javascript needs to be evaluated before the scroll event can continue AFAIK...going from some old Paul Irish talk on performance."

This isn't true, no. Scrolling is an asynchronous event; in this case that means it will happen regardless, without waiting on any JS event handlers, and the JS event fires after it has happened. This is the reason you can't prevent scrolling from JavaScript.

Unlike clicking links or submitting a form, the browser doesn't wait for any JS event handlers to return before performing this action. This was a conscious decision to keep scrolling from becoming too heavy or slow, or worse... crashing the page.


That's not entirely true in all cases. It's still quite easy to cause performance problems listening to scroll events, because even if scroll is partially asynchronous, the webpage itself is not; the browser will wait to render new content if the page is blocked.


Only yesterday I implemented this in an application, although a bit simpler (is the user fewer than n pixels from the bottom of the document? Then render more heavy stuff), so yes, that's a pretty recurring use case for a library like this.
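That simpler check is roughly the following; a sketch where `margin` stands in for the pixel threshold n:

  // "Near the bottom?" check; `margin` is a placeholder for the threshold n.
  function nearBottom(margin) {
    return window.innerHeight + window.pageYOffset >=
           document.documentElement.scrollHeight - margin;
  }

  window.addEventListener('scroll', function () {
    if (nearBottom(300)) {
      // render the heavier content here
    }
  });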


That only works for content that is statically (absolute) positioned?


I made something similar for use with Knockout.js. Love your clear code and efficient implementation! I will probably swap out my core logic for yours.

Some here have mentioned the webpack overhead, and suggested rollup.js. I've had great results with rollup for these teeny-tiny browser-focused projects.


Definitely a problem that needs solving, but we run the risk of implementing too many 'bite-size' libraries with code that could probably serve better as helper extensions to already existing toolkits, e.g. jQuery, Underscore, etc.


Similar to https://github.com/customd/jquery-visible which I use a lot.


Won't this be a performance killer?


Depends on how you use it and on how many elements. I'm not sure this is even worth being a library. Sometimes I wonder why people want to use a lib for everything.


I agree, many times a library isn't necessary. In Backbone.js I solved this with an interval triggering a global event that sends the current scroll position to all listening views.


If it's not my own code, it's worth being a library.

And I don't want to spend my time learning to write it correctly so it handles all cases and is performant.


Any time you're performing an operation on a scroll event, that is a risk. But this uses a single "scroll" event listener throttled to fire at most once every 100ms.

https://github.com/camwiegert/in-view/blob/master/src/in-vie...
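The general shape of that approach, sketched; this is not the library's actual source, just the pattern of one shared throttled listener (lodash's throttle assumed, as in the project):

  import throttle from 'lodash/throttle';

  // One shared, throttled scroll listener drives every registered check,
  // rather than one listener per watched element.
  const checks = [];
  export function register(check) { checks.push(check); }

  const run = throttle(() => checks.forEach(check => check()), 100);
  window.addEventListener('scroll', run);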


Interesting project. I wonder if you might be willing to outline a few use cases?


One of the best use cases is lazy loading expensive data until the div is visible. Image loading, or even fetching a resource. You essentially get this for free in native apps (iOS/Android with recycled views) but there hasn't been a great way to accomplish this on the web.

Note: this is definitely not the same as a recycled view, but it can help accomplish one aspect of it.


Yes, lazy loading is a good use case.

Echo.js (https://github.com/toddmotto/echo) is another small no-dependency library (1.89 KB minified), which detects elements appearing on screen, and goes a step further to swap the "src" attribute with a placeholder for lazy loading and unloading.

Disclosure: I've submitted a pull request to it.


For example, store the img's src in a data-src attribute and then set src when the image comes into view. That is, lazy loading of images.
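A sketch of that swap using IntersectionObserver (discussed earlier in the thread); the only convention assumed is the data-src attribute described above:

  // Copy data-src into src the first time an image intersects the
  // viewport, then stop observing it.
  var imgObserver = new IntersectionObserver(function (entries, observer) {
    entries.forEach(function (entry) {
      if (entry.intersectionRatio > 0) {
        var img = entry.target;
        img.src = img.getAttribute('data-src');
        img.removeAttribute('data-src');
        observer.unobserve(img);
      }
    });
  });

  Array.prototype.forEach.call(
    document.querySelectorAll('img[data-src]'),
    function (img) { imgObserver.observe(img); }
  );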


Fancy animations like fading in/out content when scroll has reached a given element.



