
Parallel in that you can invoke many Lambda functions at the same time and have them run independently of each other
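For example (a rough sketch using the remote option from the README; the URL list and the title check are just placeholders):

    import { Chromeless } from 'chromeless'

    // One Lambda-backed session per automation; remote: true picks up the
    // Chromeless Proxy endpoint and API key from the environment.
    async function runOne(url: string): Promise<string> {
      const chromeless = new Chromeless({ remote: true })
      const title = await chromeless
        .goto(url)
        .evaluate(() => document.title)
      await chromeless.end()
      return title
    }

    // Kick everything off at once: total wall-clock time is roughly the
    // slowest single run, not the sum of all runs.
    const urls = ['https://example.com' /* , ...more pages... */]
    Promise.all(urls.map(runOne)).then(titles => console.log(titles))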



So basically, if I had 200 automations I could run them on 200 Lambdas and have them all finish by the time the slowest one finishes? That's pretty awesome, especially for testing. For many cases this would also fall under the free tier, since it's not that many requests/usage... it kinda seems too good to be true. Am I missing something?


> For many cases this would also fall under the free tier, since it's not that many requests/usage... it kinda seems too good to be true

A heads-up (since I ran into the same trap a couple of weeks ago and ended up with ~USD 650 of unanticipated charges): the free Lambda tier includes

* 1M requests

* 400k GB-seconds

--> The GB-seconds can be a serious bottleneck. Imagine you're running each Lambda instance with 512 MB RAM and each instance takes 2 minutes to complete your test. That's ~61.5 GB-seconds per instance, meaning you can execute ~6,500 of these instances per month and stay within the free tier.
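Spelling the numbers out (same rounding as above, 512 MB ≈ 0.512 GB):

    const memoryGb = 512 / 1000                      // ≈ 0.512 GB per instance
    const durationSec = 2 * 60                       // 120 s per test run
    const gbSecPerRun = memoryGb * durationSec       // ≈ 61.4 GB-seconds
    const freeTierGbSec = 400_000                    // monthly free tier
    const runsPerMonth = Math.floor(freeTierGbSec / gbSecPerRun)  // ≈ 6,510 runs before charges start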

Depending on how extensive your tests are and how often you run them, you might run out of free GB-seconds well before you run out of the requests quota.


Granted, that means running ~216 instances per day, or ~9 instances per hour (18 minutes of total runtime per hour). Now you're right that if you're running a screenshot service, this will kill you real fast.

However, assuming an 8-hour work day, that works out to ~27 instances per hour. Each test takes two minutes to run, so a single user on a code-test-code-test routine could keep that up nearly continuously, 8 hours a day, every day of the month (no weekends or days off), before exceeding the free tier. Seems safe to assume that won't happen.


Yep. That's one of the main reasons why we're so excited about this project!


Haven't had time to read the source yet; how are the Lambda headless Chrome reliability issues dealt with? Is it a different Chrome build than serverless-chrome?


For now, we're using the workaround proposed here: https://github.com/adieuadieu/serverless-chrome/issues/41#is...

The Chromeless Proxy service uses the @serverless-chrome/lambda package as is. Same build.


Nope, welcome to the serverless future! :)


You guys should put up an example of how to convert a (small) test suite to use it, like you said you did in your top comment. The examples you have are cool, but they don't really help visualize the parallelism gain.


Sounds like a great idea. Would you mind creating an issue here so we can track this? https://github.com/graphcool/chromeless/issues


Done.


There's a section in the readme I'm not sure about: "Running integration Tests for example is much faster." I guess that gain comes from the parallel execution. But for integration testing, there is still work to be done to build a central controller/runner that distributes the various tests to different Lambda instances, correct? If that's the case, would load testing be another use case for Chromeless? Just initiate it in parallel and blast the app infrastructure. It won't be as cost-efficient as using non-real browsers, but it's probably as realistic as you can get at simulating real user load through real browsers.
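Something like this is what I have in mind for the runner (a rough sketch; the test names and assertions are made up, and each test gets its own remote session fanned out with Promise.all):

    import { Chromeless } from 'chromeless'

    // Hypothetical test cases, for illustration only.
    type Test = { name: string; run: (c: Chromeless<any>) => Promise<void> }

    const tests: Test[] = [
      {
        name: 'homepage has a title',
        run: async c => {
          const title = await c.goto('https://example.com').evaluate(() => document.title)
          if (!title) throw new Error('empty title')
        },
      },
      // ...more tests...
    ]

    // The "central runner": one Lambda-backed session per test, all in flight at once.
    async function runAll() {
      const results = await Promise.all(
        tests.map(async t => {
          const c = new Chromeless({ remote: true })
          try {
            await t.run(c)
            return { name: t.name, ok: true }
          } catch (e) {
            return { name: t.name, ok: false, error: String(e) }
          } finally {
            await c.end()
          }
        })
      )
      console.log(results)
    }

    runAll()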



