chocks's comments

The direct link in this post works, but it looks like the one linked from the careers page is broken. On https://supabase.com/careers#positions, the listing titled "Site Reliability Engineer: Postgres" goes to a broken link: https://jobs.ashbyhq.com/supabase/ba5e5848-0ef7-430d-b773-94...

Hey Fred, just a heads up - the careers page links to a Greenhouse board that seems inactive. https://tandemai.com/careers/#currentopening goes to https://boards.greenhouse.io/tandemailimited, which says "The board you are looking for is no longer open."

I also use Safari on iOS. I started typing my time zone (New York) in the search box, it pulled it up, and I selected it.


Heads up - seems like there's a typo in the URL; it has a %7C appended to it, so the link can't be opened.


Fixed now. Thanks!


There’s Amazon, Instacart, Snowflake, Doordash, Shopify


Perfectly cooked eggs every single time. Bought this in 2014 and it's still going strong: KRUPS F23070 Egg Cooker with Water Level Indicator, 7-Eggs Capacity, White https://www.amazon.ca/dp/B00005KIRS/


Snowboard looks great. Congrats on the launch! At my work we currently use the open source tool https://www.amundsen.io/amundsen/ mostly to search schemas and tables. Curious to see how Snowboard differs.


Thank you!

To give an opinionated take on Amundsen:

Amundsen is a great open source project with many cool ideas. However, we found the experience on Snowflake quite shallow, and much richer insights and context on data assets are possible: data profiles, lineage, and usage information, for example, are not readily available in Amundsen. You can customize and extend Amundsen to do more, but most data teams should spend that effort building and innovating in areas closer to the business. ;) Most companies choose Snowflake over the open source Apache Spark stack for exactly that reason.

If you are interested in learning more, please reach out: rick at snowboard.software


I’m genuinely curious why companies don’t do this. It seems like a great deal: many US cities are great, with a decent cost of living. I used to live in Kansas, where home ownership is cheaper and the time zones are usually better. Maybe the total cost of a hire is still higher? idk


Because the cost of living delta is mostly just home ownership. Yes, that trickles down to somewhat higher prices for contractors, child care, etc., but it's mostly rents and houses. So $50K/year, say, is probably a pretty reasonable cost of living adjustment between the Bay Area and most reasonable low-CoL areas. Cut more than that and you're also cutting savings and spending power for things that are priced similarly across the continental US.


Just looking at annual pay: let's say you pay devs a reasonable 75K USD in Chattanooga. That's about 56 lakh INR a year in India (at roughly 75 INR per USD). You can get a top engineer for that kind of pay.

This doesn't even account for the difficulty of hiring an engineer of the same caliber in Chattanooga, or all the associated overhead like health insurance, which is much lower in India.

If you already have experience building a high-quality engineering team in India (and thus know how to overcome the cultural barriers), it's pretty much a no-brainer.


Hey, I’ve felt the same about missing the social aspect of the office during these WFH days. One thing that has worked for me: I do virtual coffee chats every week with a couple of folks on my team and other friends at work, and we talk about things of mutual interest, work-related or not.


Pretty cool! We use Airflow heavily here at Instacart. Some of our teams use Google's managed service, https://cloud.google.com/composer/, for deployment and orchestration. For companies wanting a standard structure for their DAGs and self-hosting their Airflow deployments, your tool would be super helpful for getting started. One suggestion: it would be cool to add separate deployments for the different components of Airflow (webserver, workers, scheduler, etc.). Reading through the README, it looks like you deploy a single image to the Qubole cloud? Oftentimes deploying code to Airflow just means updating the DAG files in Airflow's file system.


Thanks for the feedback.

The main motivation behind building this tool was to make onboarding onto Apache Airflow easier. There was no standard structure for an Airflow project, and setting it up locally can sometimes be a nightmare. The simple CLI tool makes it very easy to create and test your project locally before deploying it to your production or staging environment via your CI/CD.

Right now we are using a docker-compose file which brings up all the Airflow services, but we are also working on providing a command to control individual processes.

Qubole is not a cloud but a self-managed data platform. Deploying on Qubole just means putting all the DAG files on the machine (AWS/GCP/Azure) where Airflow is running. Qubole provides an out-of-the-box solution for running Airflow on your cloud with the click of a button. We offer a bunch of other things too (Spark, Presto, notebooks, etc.) and have a great ecosystem built around Airflow.
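
(For reference, a "DAG file" here is just a Python script dropped into the dags/ folder that Airflow watches, which is why deployment is largely a file copy. A minimal sketch, not specific to this tool, assuming an Airflow 1.10-style install; the DAG name, schedule, and command are made up for illustration:)

    # Minimal DAG sketch -- dag_id, schedule, and command are illustrative only.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator  # Airflow 1.10-style import

    with DAG(
        dag_id="example_hello",            # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        BashOperator(
            task_id="say_hello",
            bash_command="echo hello",
        )

Deploying a change then amounts to copying the updated file into the dags/ folder on whatever machine runs the scheduler.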

