Hacker News
Algo Deck: an open-source collection of 200 cards on algorithms (github.com/teivah)
214 points by teivah on Feb 1, 2020 | 27 comments



Unpopular opinion: Programmer time is more valuable than machine time. Maintenance is 90% of the life of a system. Therefore, just know the concepts for each .md file, be able to describe their general properties, and be able to work with existing implementations in the real world. Formal complexity is rarely useful, because on cheap hardware very inefficient code scales very well. Certainly, don't re-implement basics like sort, graph traversal or queuing algorithms unless there is a pressing need.

For example, it is infinitely more useful (faster implementations, more proven code, less maintenance overhead, free upgrades) to be familiar with iconv and the data available in the Unicode database than to memorize random facts about encodings.


> Unpopular opinion: Programmer time is more valuable than machine time. Maintenance is 90% of the life of a system.

That's actually the popular opinion.

Unpopular opinion: programmer time is more valuable than machine time up until the product starts to scale and/or outlives the savings. A programmer spending a week longer on doing something right might cost you a couple thousand dollars up-front, and would repay itself only in a decade. Then you scale 200x, and suddenly that suboptimal code is bleeding you those thousands of dollars a week.

Very unpopular opinion: the above example is a good case; if your AWS bill gets too large, somebody in the company will start asking questions. What usually happens is that performance costs are externalized to end-users. It's their CPUs that burn through fossil-fuel-generated electricity that much faster and can no longer run more than a few apps at the same time; it's their mobile phones that need to be replaced a year early because of the aggregate bloat. It's their sanity that's being assaulted by laggy, constantly acting-up devices. But none of that manifests as a separate line item on the company's books, so nobody gives a damn. Instead, laziness and shoddy craftsmanship are explained away as economic self-interest.


Friends don't let friends use cloud hosting.


There's a lot of craftsmanship nuance between lazy obliviousness to real-world impact and prematurely optimizing everything. Build something that works and is useful first, make it not suck, and then try to make it awesome with whatever time is left. IOW, a pull model of development driven by the real world, rather than pushing through arbitrary/unprioritized development obsessed with particular technological fashions. Yes, reuse optimized libraries that are best at solving common problems, but don't deploy Hadoop to make an Arduino toaster; i.e., apply engineering discretion learned from much trial and error, then verify it works and is correct with proper testing and benchmarking.


This is the gap between theory and practice.

When you study, you learn a lot of stuff that seems useless for a software developer in their daily job.

When you only know practice, you will be able to use the tools but probably won't understand what most of them do, how they do it, and why. This may seem fine in many cases, but on certain occasions, such as optimization, more theoretical knowledge may help you dramatically improve your code or architecture (it could be the other way around, of course).

In the end I think that whether it is practical or theoretical know-how, you may gain something from it in the future, even if it is not a (typical) topic of your field of work.

A very important insight for me is that you can learn as long as you breathe and it helps you to age more gracefully while being better at what you do.


When I studied there were various courses I thought useless, only to realize years later how important they were. I'm glad I learned them.


As somebody that interviews a lot of candidates, I almost never ask rote memorization questions like these. I have to wonder, are there people who do? What sort of signal do you hope to gather by doing so?


It's probably better to think of it as the building blocks for answering more complicated questions. E.g. you write an algorithm and one of the follow-ups the interviewer asks is its runtime. Well, you used a hash table, so lookups are constant time, but you iterate over the keys, which is linear in the size of the table. So when you do some work linear in the size of the input for each item in the table, you can work out that it's a quadratic algo. That sort of thing.
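A minimal Python sketch of the kind of analysis described (a hypothetical function, not from the deck): constant-time lookups inside a per-key loop still add up to quadratic work overall.

```python
def pair_sums(table):
    # `table` is a dict, i.e. a hash table: a single lookup is O(1) on average.
    results = []
    for a in table:                        # O(n): iterates every key
        for b in table:                    # O(n) work per key -> O(n^2) total
            results.append(table[a] + table[b])  # each lookup is O(1)
    return results
```

The point isn't the function itself but the reasoning: two nested O(n) traversals dominate the constant-time lookups, so the whole thing is quadratic in the table size.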


That's how I thought of it when I was studying leetcode, and that's the advice I give to other people when they ask how I got my job. Each leetcode solution was another tool in my toolbox. The more tools you have, the easier it will be to put two things together for a larger solution, or to modify a tool to fit a new problem.

I understand the resistance to flash cards as rote memorization without being able to build upon them, I think that's a cogent criticism. At the same time I think some people will be able to take advantage of this to build their toolbox.

And as a minor note, I personally love when people can name an algorithm or concept. There was one interview that I think I completely failed, but I was able to name (though not write out) a concept. I actually incorrectly name-dropped Hamming distance when it was actually Levenshtein distance, but I was able to diagram the concept behind it. I passed that interview (to my tremendous surprise), and I think the name drop and the simple diagram helped.
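For anyone curious about the mix-up, here's a rough sketch of the two distances (the standard definitions, not the interview's actual question): Hamming counts differing positions in equal-length strings, while Levenshtein counts edits and is computed with the classic dynamic-programming table.

```python
def hamming(a, b):
    # Hamming distance: number of positions where the strings differ.
    # Only defined for strings of equal length.
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    # Levenshtein distance: minimum number of single-character insertions,
    # deletions, and substitutions turning `a` into `b`.
    # Classic DP over a (len(a)+1) x (len(b)+1) table, kept one row at a time.
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution (or match)
        prev = cur
    return prev[-1]
```

The two agree on equal-length strings that only need substitutions, which is probably why they're easy to confuse on a whiteboard.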


Unfortunately there are a lot of grifters out there providing tech interview “coaching” to people with no prior experience. Fortunately, as someone who also interviews SW candidates, it’s really easy to spot who just memorized Cracking the Coding Interview and GeeksforGeeks answers.

I literally had someone write down the exact answer for an algorithm question that appears on GeeksforGeeks down to the small bugs which the post had originally.


> I literally had someone write down the exact answer for an algorithm question that appears on GeeksforGeeks down to the small bugs which the post had originally.

how did you know/verify this?

Wondering why an interviewer would go to GeeksforGeeks and other interview sites and match a candidate's answer line by line.


When I was writing the interview question, I wondered if some site had already covered it. Turns out they had, so I ran their solution and discovered it had some bugs. Whenever I ask the question, I refer to the "answer" on GeeksforGeeks. If they copy it, they get a no-hire from me.


It seems pretty reasonable to me, as an interviewer, to look up the question I am thinking about asking to see what is available online. Is there a reason you wouldn't do this?


> to look up the question I am thinking about asking to see what is available online.

OK, so you find that your question is a very common leetcode problem. What do you do with that information?

Looks like GP decided to use that common question anyways in the interview.

Write some sort of automated plagiarism test that checks against all the common sites and answers (leetcode can have hundreds of user-posted solutions)?


I interview a lot of candidates as well, and I agree. My philosophy is that an interview is an opportunity to talk about a problem together and sketch out some code. The only reason we choose to talk about algorithms and data structures is because they're a domain that's pretty similar no matter what sort of programming you've been doing.

"Memorizing" these algorithms isn't the approach I'd use at all. No interview of mine will ever be of the form "implement such-and-such well-known algorithm".

However, my questions do begin with "imagine we're on the x team and we have y technical problem - what should we do?", and it's helpful to have a toolchest of coding techniques that includes hash tables, trees, etc.


I haven't used hashtables or trees since I was in university, 15 years ago. I'd look them up if I needed them, but I never really have. Do people really keep these things in active memory?


If you're coding in a modern dynamic language, you're probably using hashtables every day without realizing it (that's one popular way dictionaries/associative arrays are implemented). As for trees, your code is a tree. Every time you're working with a nested data structure, you're working with a tree. Trees are fundamental to programming and data representation in a way hardly anything else is.
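To make the point concrete, here's a small illustrative sketch (the nested `config` structure is made up for the example): a dict literal is backed by a hash table, and walking any nested dict/list structure is literally a tree traversal.

```python
# A hypothetical nested configuration: dicts are hash tables under the hood,
# and the nesting forms a tree with the scalar values as leaves.
config = {"server": {"host": "localhost", "ports": [80, 443]}}

def leaves(node):
    # Depth-first traversal of the nested structure, yielding leaf values.
    if isinstance(node, dict):
        for child in node.values():
            yield from leaves(child)
    elif isinstance(node, list):
        for child in node:
            yield from leaves(child)
    else:
        yield node
```

Every JSON document, parse tree, and DOM works the same way, which is roughly the parent comment's point: you use these structures constantly even when nothing calls them "trees".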


> If you're coding in a modern dynamic language

I see. I don't do that very much.

> As for trees, your code is a tree. Every time you're working with a nested data structure, you're working with a tree. Trees are fundamental to programming and data representation in a way hardly anything else is.

I haven't really needed to write a program to traverse my code as a tree, but I guess you're right that I must have used trees in C at some point.


  boolean checkExactlyOneBitSet(int num) {
    return (num & (num - 1)) == 0;
  }

That looks like it'll fail when num == 0. I think it should be:

  return num != 0 && ((num & (num - 1)) == 0);
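A quick Python translation of the corrected check (hypothetical function name), just to show why the extra guard matters: `num & (num - 1)` clears the lowest set bit, so it is zero both for powers of two and for zero itself.

```python
def exactly_one_bit_set(num):
    # num & (num - 1) clears the lowest set bit; the result is 0 exactly
    # when num had at most one bit set. The `num != 0` guard excludes zero,
    # which has no bits set at all.
    return num != 0 and (num & (num - 1)) == 0
```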


maybe posting a github issue would be helpful


I followed the link for Anki on the iPhone and it’s $25. No other options?


I think you can use the web UI. In my experience it works pretty well. It also syncs with the free desktop app.

I'm still having a hard time understanding the use for this after the job interview. The few times I've had to deal with data large enough for big O to matter and needed to optimize, I just spent some time researching it. Plenty of answers online for those cases. Good for someone who actually enjoys thinking about and researching this subject, but I'm not going to grind hackerrank.

It actually stopped me from even considering applying at a company that uses this as a bar. A developer is generally someone who solves business problems, sometimes with code.

I’m happy that I work in a country where this grinding is not the norm.


The explanation that iOS users are subsidizing all other Anki users with that price somehow made it easier for me to swallow than if I viewed it as mere price-gouging.


It's free on Android. Works fine on Windows and Ubuntu too.


Free on Safari


The author's (imho valid) argument (paraphrased) is that if you have $1k to spend on a phone when there are equally functional and more open $100 phones, then you can pay $25 for an app.

Otherwise buy a $100 Android and use the free app. Or use the Windows and Linux versions (also free).


That’s a strange argument. There are many Android phones that cost $1k. Also, my iPhone didn’t cost $1k



