
This sounds similar to what The Asylum does (http://tvtropes.org/pmwiki/pmwiki.php/Main/TheAsylum); they make extremely cheap knockoffs of big-name movies, like "Transmorphers", and rely on inattentive people buying or renting their movies by mistake. It's apparently incredibly profitable.


Or, less cynically, they make funny knockoff movies that a lot of people watch on purpose. The book thing is pure scam.


he probably thinks that she in some way caused the stalker to spam her project.

No matter what happened here, this entire concept is horribly, horribly wrong. It's blaming the victim -- saying that it's somehow her fault for "causing" the stalker to do something... as opposed to the stalker's fault, for stalking.


Except that Forrest's point is that they probably didn't know the spammer was a stalker. This is kind of relevant. If a random person walks up to you and asks if you've seen this person and shows you a picture, you'd probably help them out, completely unaware they're a serial killer. That doesn't make you a bad person.

If Kickstarter does respond to this by maintaining their stance, then there's a huge problem. Until then, it's more a case of an incompetent employee.


I don't think that changes anything? Again, replace the word "stalk" with "spam": it's the spammer's fault for spamming, not her fault for hypothetically "causing" the spammer to make the decision to spam her.


I agree there.


I was in no way saying that it was the victim's fault. I just don't think that Kickstarter should be on the receiving end of a lot of criticism until we get the full story from them. It may very well be that they are under the impression that it is her fault; I am not saying it is, just that it's plausible and would account for their decision to ban her.


Banning IPs works well enough even for vastly bigger sites like Wikipedia. Combined with an open proxy checker, it's surprisingly effective.

It's not perfect, but for ANY site of sufficient size with unmoderated user content, this problem eventually has to be dealt with. Threatening to ban the victims is not a solution.

Kickstarter earns 5% from every single successful project: they can afford to put some effort into moderation and protecting their customers.


I tried this, but Thunderbird simply locked up due to the sheer volume of emails (I subscribe to LKML and other high-volume mailing lists). I haven't been able to find any good backup solution anywhere, and articles like this really scare me.


Thunderbird 11 is much better than Thunderbird 3.x was at handling a GMail backup. I've just finished backing up ~120,000 messages using Thunderbird 11 (archiving them into monthly folders about once an hour during the backup), and it's doing fine. (I tried the same thing a year ago with Thunderbird and gave up after 45 minutes.)


Good to hear. I had tried Thunderbird in the past but it would just crash. I just started it today for the first time in a year and upgraded from v7 to v11. Working well so far.


Try getmail. I've used it to suck down a very large amount of mail.
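
A minimal getmailrc for a Gmail backup looks roughly like this (from memory, so treat the option names as a sketch to check against the getmail docs; the account details are placeholders):

    [retriever]
    type = SimpleIMAPSSLRetriever
    server = imap.gmail.com
    username = you@example.com
    password = your-password-here
    # Gmail keeps everything under "All Mail"; only INBOX is fetched if this is omitted.
    mailboxes = ("[Gmail]/All Mail",)

    [destination]
    type = Maildir
    path = ~/Mail/gmail-backup/

    [options]
    # Fetch only messages not already retrieved, and leave everything on the server.
    read_all = false
    delete = false

Run it from cron and new mail keeps landing in the Maildir; Thunderbird (or mutt, or anything else) can then index it locally at its leisure.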


I prefer fdm[1], but I agree that cli mail fetch tools are great for this.

[1]: http://fdm.sourceforge.net/


My laptop doesn't have a middle mouse button, and it's a modern Dell. Naturally I use a real mouse when I'm on a desk, but remember that not everyone has a middle mouse button.


Try clicking both the right and left touchpad buttons on your laptop. On most (all?) laptops it will open the link just as if you had middle-clicked it.


That works in X, but I've never seen it work in Windows. And when I have used it, I only got the timing right about 9 times out of 10; the other times I'd get a left-right click combo, or a right-middle combo, or something else equally disastrous considering how the X clipboard works.


I use Windows 7 and it works for me.


So do I. I just tried it. I get a left and a right click. So the menu shows up for a little while, then it follows the link anyway. Wherever the Emulate3Buttons setting is in Windows, it's not on for me. I open new tabs by right clicking then left clicking for a reason, and it's not because I don't know about the dozen other ways to open a tab.

I'm surprised nobody's suggested "hit tab until the link is highlighted, then hit the menu key" yet. There's another great workaround.


I know that's the standard, but it doesn't appear to work on this one (Studio XPS 16).


"Only utilizing one core" would mean that the software only uses one, but the hardware has two.

"Binning parts" means they're either actually hardware-disabling ("diking out") the extra core, or the extra core was bad to begin with (making use of otherwise bad chips).

If those images are real, both cores are definitely on the chip, but whether one is hardware-disabled or not is not certain.


Given how many A5s they produce and the low cost of the Apple TV, my money's on the latter.


Agree.

It does seem like the images are real, because apparently the site is in the business of selling very high-res versions of such chips:

http://www.chipworks.com/en/technical-competitive-analysis/r...

"Die photos are available in the Chipworks Report Store at top metal and lower metal/poly"


Chipworks are in the business of all manner of reverse-engineering silicon. It's a pretty interesting process.

Sadly, it's one of the few semiconductor segments in which Ottawa, where I live, is still something of a leader.

And, weirdly, the owner of the company used to live in the condo below me.


Wouldn't it make more sense to software-disable it? Then when new hardware comes out, the older hardware can be upgraded to support a new OS and utilise the second core rather than being left behind.


As mentioned, if they have switched all their 4S production over to this chip, they may be able to supply their Apple TV line primarily with dies that have a borked core (which would otherwise get thrown out). In that case, a software update that only enabled the second core for a portion of Apple TV buyers might be counterproductive.


iOS is already a pre-emptive, multi-processing OS. It would have no problem using a second core.


Wouldn't it make more sense to enable it from the get go if it is really available and not locked out for yield purposes?


Yeah.

The simplest and most plausible explanation here is that they're using the Apple TV's relatively low performance demands as a way to get some use out of A5s with defective cores.


Not if you're planning to advertise the next version of your product as "now it has a dual-core processor".


Is there any reason why Windows or at least Cygwin isn't supported? Much of the time, the whole point of SSH for me is to connect to a Unix machine from a non-Unix machine.


The code compiles under Cygwin with not too much effort -- the current master may compile. The big problem there is you don't really have a good UTF-8 terminal, but you're welcome to use it. If you know how we can best package the software for Windows or Cygwin users, we're happy to take a patch. (github.com/keithw/mosh.git)


Cygwin has a built-in UTF-8 terminal that I find quite decent. (It has only had one for a few months; it did not have it before.)


Just posted some more terminal geekery on http://mosh.mit.edu -- I'd be interested how the Cygwin terminal does on the test cases shown there.


Mintty: badly. The first test gives no hat (or I just can't see it). The second causes it to get stuck in hieroglyphs. The third doesn't work correctly either. It looks like it prints xyz correctly, then jumps to the second line on screen and then continues from there.

The "cygwin bash shell" which uses cmd.exe does the first test correctly but similarly fails on the others.


Those test cases look useful -- Do you know of any more sets?

I'm writing a terminal emulator out of interest, in hopefully a cross-platform way. I'm personally frustrated by the existing ones.

Got any wish-list items for one?


Start with the putty source and integrate the mosh protocol?


In case anyone’s interested, here’s how I managed to get Mosh to compile under Cygwin:

https://gist.github.com/gists/2349067/


mintty?


If only it was that easy: change programming languages, change programming models, and poof! Magical parallelism.

But parallelism is harder than that. It's an algorithm problem, a design problem, not a language or code problem. While OpenCL might be harder to write than plain C, for anything except the most embarrassingly parallel problems, that difficulty pales in comparison to making the solution parallel to begin with.

For every problem where you can simply split into masses of shared-nothing tasks, there's a dozen others where you can't. Rate-constrained video compression. Minimax AI search. Emulation. All of these can be parallelized, but it requires parallelizing the algorithm and making sacrifices that are light-years beyond what a dumb compiler that isn't even allowed to change program output (let alone rewrite half the program) could do.

Modifying an algorithm -- possibly even changing its structure entirely, changing its output, or even accepting nondeterminism -- is inherent complexity, to use the terminology of Mythical Man Month. No programming language or tool can eliminate this complexity. Good tools can make it easier to implement an algorithm efficiently, but they really can't take a complicated application and figure out how to change its design, structure, and behavior. Until we have AI-complete compilers that take program specs as "code" to be compiled, that's a human job.


One significant issue is that most programming languages are inherently temporally inexpressive, having either full order (imperative/impure functional) or no order (purely functional). Ideally the language would make it easy to organize statements in such a way that a programmer could indicate a partial ordering relation without having to use function composition. Additionally, a big problem with function composition in purely functional programming is that multiple long function compositions can only be ordered as monolithic units; however, in many cases being able to order at the individual function level is desirable. If a language had an explicit notion of relative temporal index for statements (which could be adjusted), it would be a nice win for writing concurrent code. That would also let compiler writers lift a lot of the stuff they do up to the program source level as macros (which would be a HUGE win).


A futures library solves the case where, in an imperative programming language, the programmer writes a block to compute A, a block to compute B, then combines A and B. Pure FP works fine in that case, since writing (+ (compute-A) (compute-B)) does the same ordering.
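
To make that concrete, here is roughly what the futures version looks like in Python (a toy sketch, not taken from either of our comments): the only ordering imposed is at the point where A and B are combined.

    # compute_a and compute_b stand in for the two independent blocks.
    from concurrent.futures import ThreadPoolExecutor

    def compute_a():
        return 2

    def compute_b():
        return 3

    with ThreadPoolExecutor() as pool:
        fut_a = pool.submit(compute_a)  # may run concurrently with fut_b
        fut_b = pool.submit(compute_b)
        result = fut_a.result() + fut_b.result()  # order matters only here
    print(result)  # 5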

It seems to me pure FP is fine unless the computations for A and B have interdependencies, which would result in duplicate computation in pure FP unless you can refactor to a different algorithm. I'm not sure I understand what you mean by wanting to control ordering at "the individual function level", unless you mean interdependencies between computations done in "monolithic units".

    A->(loop B until done)->C
    A->(loop D until done)->E
    
    C-\
      +-> F
    E-/
Something like that. Am I on the right track?

No, because I can still express that in pure FP:

    (let [(A (compute-A))]
      (let [(C (compute-C A))
            (E (compute-E A))]
         (compute-F C E)))
Trying again ...

    A->(loop B until (valid B D))->C
    A->(loop D until (valid D B))->E
    
    C-\
      +-> F
    E-/
Is that better? Care to help me understand what you're getting at?


I was thinking more along the lines of working around data-locality issues. For instance, imagine that you have code that requires very high latency fetches, or you are working with a data set that doesn't fit in memory. Typically, you have to develop modified algorithms for these scenarios, but there is no reason that a minimal fiber scheduler couldn't adapt seamlessly. Even better, if your conditions change (for instance, mobile devices moving from low throughput to high throughput links) a scheduler can adapt, but the hand-rolled algorithm must be re-coded.


Did you mean a data-flow language -- a language that is driven by the stream of data? Mozart/Oz implements some of that. Also see here: http://en.wikipedia.org/wiki/Dataflow_programming


I am a big fan of synchronous dataflow as a way to organize programs. It meshes well with the idea of modelling knowledge rather than "programming".


> No programming language or tool can eliminate this complexity.

But some languages can help get you there. Programming languages are tools and these tools come with idioms and paradigms of their own.

Certain paradigms help, often by constraining the developer to think and write in a way that makes the final result more parallelizable.

For example functional languages with immutable data structures help guide the implementation towards that. Actor models also do that.

Take a program in Erlang that someone wrote 5 years ago. They split it into 10 concurrent, CPU-bound actors. Back then it ran on single-CPU hardware, so the code was robust and benefited from process isolation but didn't actually execute in parallel. It might have even been slower than the equivalent C++ or Java code.

Now all of a sudden there is a performance requirement. The site goes "viral", as they say. Now there are millions of requests per day, not hundreds. If the program was written idiomatically, all they have to do is get a big hunkin' box with 16 CPUs and lots of RAM, and all of a sudden those actors are working in parallel. No change to the code needed.

Next month the site goes "pandemic", so now they have to think about scaling beyond a single machine. No problem: they just move some of the processes (actors) to another node running on another machine. The way to send a message to an actor is still the same Pid ! Msg, just like it was in 2006 when running on a single-core CPU. Still minimal code changes needed.
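
A rough analogue in Python, for anyone who doesn't read Erlang (a toy sketch, not the program described above, and Python's multiprocessing lacks Erlang's location transparency, but the shape is the same): isolated workers that communicate only by messages, where the sending code doesn't change whether they run on one core or sixteen.

    from multiprocessing import Process, Queue

    def actor(inbox, outbox):
        # An isolated worker: no shared state, only messages in and out.
        while True:
            msg = inbox.get()          # loosely analogous to an Erlang receive
            if msg is None:
                break
            outbox.put(msg * msg)      # some CPU-bound work

    if __name__ == "__main__":
        inbox, outbox = Queue(), Queue()
        workers = [Process(target=actor, args=(inbox, outbox)) for _ in range(10)]
        for w in workers:
            w.start()
        for n in range(100):
            inbox.put(n)               # the send looks the same on 1 core or 16
        results = [outbox.get() for _ in range(100)]
        for _ in workers:
            inbox.put(None)            # tell each worker to shut down
        for w in workers:
            w.join()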

This is an example where languages and toolsets help guide developers in a certain way.

You brought up the example of algorithms. The thing is, I believe, that not many programmers these days design algorithms, unless they are involved in research or are in graduate school. Most programmers engineer systems by stitching together components and APIs. I think with the right frame of mind and after some practice, it is not that hard to see how to split a lot of these components into concurrent pieces that can then be parallelized if the correct language and toolsets are used. It is often not the inherent problem but the way the mind of the programmer has been molded by years of applying another paradigm (e.g. imperative, OO, mutable state).


You seem to be of the opinion that designing a parallel architecture or algorithm is really the difficult bit, and that what we need is the magical paralleliferizing compiler which, as you say, doesn't exist.

Designing a parallel approach to a problem, where such a solution is possible, is always going to be easier than trying to implement it correctly using threads and locks and mutable state.

FP can make some of the basically-impossible possible, with e.g. STM to avoid thread deadlocks and starvation, or even better, by supporting deterministic parallelism so you don't have to mess with threads and concurrency at all.


Designing a parallel approach to a problem, where such a solution is possible, is always going to be easier than trying to implement it correctly using threads and locks and mutable state.

I haven't found this to be true in my (limited) experience. In writing sliced-threading support for x264, threading issues took up only a small percentage of my time. Admittedly, this is a rather simple application: the frame is split up into independent threads which never wait on each other, communicate asynchronously without any attempt at determinism, and all finish before the program can continue.

My experience is limited, but a generalization like "always going to be easier" seems rather at odds with reality. I have never found dealing with mutexes to be difficult in any situation. What I have found to be difficult is trying to design lock-free code, using memory barriers, and other trickiness to avoid slow locks in situations where the overhead of locking is just intolerable. This is definitely an application where things like pre-made lockless datastructures and the like come in handy.


I've been doing some real-time audio programming lately, which requires the same techniques. You can't take a lock or even malloc in this kind of code, so you're forced to implement things like lock-free ring buffers to communicate with the rest of the world. In my experience this is 10x harder than conventional lock-based parallelism and, unfortunately, I don't see any FP lang coming to the rescue in this domain any time soon.


> My experience is limited, but a generalization like "always going to be easier" seems rather at odds with reality.

You're probably better at reasoning about nondeterminism than I am. In my also limited experience, even trivial parallel algorithms are tricky to get right. So it seems to follow that an algorithm that took more thought (say, something using speculative parallelism) would likewise be even trickier to implement correctly.


What you say is true (http://en.wikipedia.org/wiki/Amdahls_law), but it's not really the point. Most programming is not that hard: it's just moving data from one place to another, operating on that data, and then moving it somewhere else. Functional languages trivially allow you to parallelize these types of operations, and force you to think about problems in a way that will allow you to parallelize other things more easily.

So you're right, good design is a human's job, but when you're just doing the grunt work that most of software development is, having tools that enforce The Right Thing is very helpful.


This isn't about Amdahl's law; this is about the fact that many problems are simply hard. Let's look at some examples.

One common example is the need to split a larger task into smaller subcomponents, but where all of the components combined need to obey some global constraints. A specific example: we're compressing a video frame, the result must fit within a certain size, and we can't parallelize over multiple frames for latency reasons. This means we need to split the frame into chunks, but somehow all the chunks have to communicate with each other in real-time, as they work, in order to ensure they obey the global constraint. Do you have a "master" thread that manages them all and makes decisions? Do you use some sort of algorithm where they act as separate agents, asking each other questions? Suddenly this is a lot more complicated than what you started with.

Another example is a search algorithm. Whether you're performing a minimax search of a game tree or simplex optimization, you're implementing algorithms that are normally not parallel. Parallelizing minimax is actually incredibly difficult and requires making a lot of hard decisions about where you branch threads, where a thread gives up, how to avoid duplication, and so forth. Fancy programming tools can give you features like thread-safe hash tables that help you, but they don't solve the actual problem. See any multithreaded chess engine for an example of this problem. Note particularly that the engines don't get perfect speedup from multithreading -- but it's NOT because of Amdahl's Law! It's because the searches between threads unavoidably duplicate at least some work.

"Moving data, operating on it" would be grossly oversimplifying real-world, complex programs like these, and there's nothing a functional language would do to "trivially parallelize" them. Dependencies in calculations are often so tangling that you cannot naively parallelize them without making dramatic, possibly sacrificial, changes.

Tools like FP can be useful, but they don't solve problems of inherent complexity. There is no silver bullet.


I have a little hunch that any parallelism "answer" is going to be derived primarily from ever-more-grand exploitation of its strengths, not patching of its weaknesses.

We know that in a formal reasoning sense, an asynchronous system is less powerful than a synchronous one, because you can implement any async design in a sync system, but not the other way around. Yet in practical use, asynchronous, "hard" decoupled systems are the ones that scale better and are easier to maintain. So we keep telling ourselves, "decouple everything" and have invented a gross array of patterns and techniques that add "decoupling pixie dust."

We know this, but usually don't extrapolate it to its natural conclusions: we're using brute-force engineering to attack our needs from the wrong direction - trying to break arbitrary sequential computations into parallel ones via sufficient smartness, instead of folding parallel ones back into sequential ordering where it produces wins, and breaking out the "sequential toolbox" only if the problem really taxes our abilities of algorithm design, as in the cases you have outlined.

Of course, adhering to parallel designs from the beginning is hardly a silver bullet either, but there are real wins to be had by exploring the possibility, and a good number of them can be experienced today simply by working that style into "ordinary" imperative or functional code with abstractions like actors and message-passing.


Correct me if I'm wrong, but in a formal sense you get the synchronous system inside the asynchronous one trivially. So the two are of equivalent power.

If you imagine that in the case of threading systems, you can implement a multi-threaded system with your own green threads library, which is async on top of sync. You can get sync on top of async with native threads by simply using one thread. Or, if you want to be complicated, by using multiple threads with locks that serialize to a single synchronous ordering.


> We know that in a formal reasoning sense, an asynchronous system is less powerful than a synchronous one, because you can implement any async design in a sync system, but not the other way around.

As someone else pointed out as well, I just don't see it. If components of an asynchronous system can wait on an event, you can make it synchronous by driving it with deterministic synchronous events. Imagine a bunch of independent actors that always wait for a clock signal, then process one piece of data, then wait for the next, and so on.

So I actually see a synchronous system as a restricted case of an asynchronous system.

If you think about it, the world is inherently asynchronous. If you have 2 agents in the real world, they process and do things asynchronously; there is no global event or clock system that drives everything.


When you start talking about latency, you are inherently talking about Amdahl's Law, because the question that arises is: "can we make amount of work X complete in less time with more processing power Y?" And in your problem domain of video coding, especially if it is for real-time face-to-face communication or telematics, latency matters. So Amdahl's Law matters, because it dictates the bound on latency.

Doing more work X (making the pie bigger) in about the same time with Y is Gustafson's Law (http://en.wikipedia.org/wiki/Gustafson%27s_Law). Latency, like the speed of light, is one of those universal physical properties we won't be able to tech our way out of. Latency is inherently a serial problem. What would change if we could try every solution at the same time?
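
For reference, Amdahl's bound is simple enough to state inline (illustrative numbers, not from your comment):

    # Amdahl's law: p = parallelizable fraction of the work, n = processors.
    # The serial fraction (1 - p) sets a hard ceiling on the latency win.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    print(amdahl_speedup(0.95, 16))      # ~9.1x on 16 cores
    print(amdahl_speedup(0.95, 10**6))   # ~20x: the limit is 1/(1 - p)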

Nearly all the hard problems we are solving right now don't need an exact solution. They can be time-bounded or quality-bounded or both. What we lack is clarity in being able to create algorithms that are probabilistic and progressive: algorithms that refine and converge towards a solution over time. Better languages can help with that.

Here are some examples of probabilistic parallel minimax algorithms.

http://www.top-5000.nl/ps/Parallel%20randomized%20best-first... http://www.math.md/files/basm/y2010-n1/y2010-n1-(pp33-46).pd...


I don't think the value of FP is that it solves the complexity of parallelism. Rather, it is that once one has solved the problem (designed the algorithm), FP can help assure the solution is implemented correctly. This in and of itself is a large advantage.


I use functional programming because I lack the mental ability to hold so much state in my head. A friend of mine is a badass at programming in C++ using threads. I don't and will not have the ability he does. So I use FP and my programs are correct and by being functional the options to speed up my code are also easily proved correct.

Being able to handle large amounts of complexity and being ok with it is not necessarily a good thing. If I could reason like a compiler I would probably just use a flat memory model and address each byte individually.


I preferred the parable, personally -- I think it's just a preference thing. Viewing things as an interaction, even just a sort of Socratic dialogue, works well at least for my mind.


> Socratic dialogue

There are more references to culture as well. See below for a discussion of the Sesame Street (or was that "It's a Wonderful Life") names.


This article is superb.

“We tried placing ads for ninjas, rock stars, and so on, but I discovered this was the cultural equivalent of advertising for white males who drink dry martinis. Not that white males who drink dry martinis can’t do the job, but there’s no real difference between advertising for a Ninja and throwing half your resumés away because you don’t like unlucky people. Either way, you end up with fewer resumés.”

This is so true, so important, and so many startups (and even bigger companies!) miss this. Job ads provide cues, conscious and subconscious, to the people reading them. Not everyone reading the ad is identical to the person writing it, and a badly written job ad can easily send the message "this company isn't for you" to a large number of skilled potential applicants. This applies not just to categories like gender or race, but even to personality types and personal interests. Unless you really want a company of only extroverts, for example, don't write a job ad that scares off introverts.

In the canonical example, if you constantly ask for "rock stars", you will turn off people to whom that doesn't appeal, including tons of good programmers. But it goes beyond that: don't assume that all your applicants are any particular kind of person with certain interests. A job ad should focus on what the job actually is, and things that are important to the job.

The best programmers often have a lot of choice in where they work, and as many HNers know from experience, if they see a job ad that turns them off in some fashion, they will probably not even bother reading further: they know they have better options, so yours probably isn't worth their time. If the vast majority of skilled programmers skip over your ad, it's no wonder you only receive resumes from unqualified applicants.

In short, when writing a job ad, you need to think from the perspective of people applying. Use your empathy, put yourself in their shoes, rather than just writing what you think looks cool.


The article is almost completely useless.

It claims you shouldn't throw out resumes based on simple heuristics and stuff that only correlates with programming ability because there are so few people applying who are any good, but has exactly one sentence to say about how to then actually find those few people: "I grill them, hard, on actual programming and actual software development."

Right. Sure. Every single one of the 500 applicants. Good luck with that. Get back to me in 6 months.

It's nice in theory to be super-inclusive about everything except "actual programming", but almost completely divorced from reality.

It sounds like someone who, hearing about heuristic solutions to the traveling salesman problem, responds: "I cannot possibly accept a solution that may not be 100% correct! It's the computer's job to give me a correct answer, and I'm damn well going to wait for it, no matter how long it takes! Exponential, schmexponential!"

Heuristics have their place. Something that strongly correlates with what you're looking for but is much easier to test for can be very, very useful.


The article criticises the unquestioning use of bad heuristics. One needs to think very carefully about how accurate the heuristics employed are at describing the unseen (but directly desired) attribute(s), as well as what undesirable attributes they may be silently signalling.

As someone who worked at an investment bank, I'd go ballistic when managers would turn away candidates because their resumes weren't properly aligned or formatted. The irony was amplified by the fact that we were not a client-facing trading desk. Here the heuristic used (resume formatting) did not correlate with the desired attribute ([trading skills]) but did correlate well with some undesirable attributes (insecure children more concerned with form than function).


I agree with everything you are all saying.

What I was trying to claim is that there’s a relationship between heuristics and the ham/spam ratio of the documents being classified. When there is very little ham and a lot of spam, my claim is that false negatives are extremely expensive. When the benefit of hiring is very high and the cost of interviewing is low, my claim is that false positives are relatively cheap.

Given these two, my claim is that given current market realities, when hiring programmers for a mission-critical role the best strategy is to go light on spam-filtering heuristics and heavy on direct inspection of the ham content.
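
To put toy numbers on it (the one-in-two-hundred figure is the one quoted elsewhere in this thread; the filter's error rate is made up):

    applicants = 500
    good_rate = 1 / 200.0        # "one in two hundred resumes worth considering"
    false_negative_rate = 0.3    # hypothetical: filter wrongly rejects 30% of the good ones

    expected_good = applicants * good_rate            # 2.5 good candidates in the whole pile
    expected_lost = expected_good * false_negative_rate
    print(expected_good, expected_lost)               # 2.5 vs ~0.75 of them thrown away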

But obviously, some heuristics have their place. To a certain extent, everything except “Start work on Monday, there will be three month’s probation” is a heuristic.

And “This article is almost completely useless” is also true: given that it says in many paragraphs, using a contrived format, what I just said in two paragraphs, it obviously contains a lot of redundancy and extraneous noise!


That "extraneous noise" actually stresses your point beautifully. While we don't want to generalize from fictional evidence, fictions can be great at making the reader pay attention to your case (and yours certainly is).

Now that I'm a bit more aware of the gazillion ways my resume can be mistakenly filtered, I can (i) work on defeating those filters, and (ii) not take it personally when it still doesn't pass through. I know it sounds obvious, but I didn't get it on a gut level until your story put me in the shoes of the recruiter, and showed me that one can be both reasonably competent, not evil, and still miss many relevant resumes.


Yar, I guess that noise is just to give the picture of a seemingly successful person and make people fall for its crap.


I agree. If there were a TL;DR, it'd have been nice :).


You know what's kind of funny, though? I bet that a not-so-small percentage of people who are nodding their heads up-and-down wildly in response to this essay think nothing of filtering their applicant pool based on github profile.

After all, it's a "coding" metric, right?


Ha, I always love seeing a job application that demands a Github profile, with no place for Bitbucket or Google Code (not to mention Sourceforge!).


... or work primarily covered by NDA.

Yes, I've contributed to open source projects, but that was before they migrated to github (or even to their current mailing list).

Still, I've been bitten by this demand.


A couple of years ago I saw a programming job ad with language along the lines of "don't bother applying if you have a gmail address on your resume." Wonder how that panned out for them.


They would probably get applicants with a hotmail address...


They probably didn't get anyone who uses a Gmail address as primary, which is what they asked for.


As with every skill, you need to choose your heuristics carefully, and as in most jobs, HR and CTOs often aren't the best at it. That doesn't mean the whole idea is crap.

The thing about formatting: I can't say whether you are right or "most HR people" are right. The argument they bring seems as legitimate as yours: people who care about the looks of their result are more disciplined, probably faster (because they get the content and the formatting done in the same time frame), and care more about working at that company. Does that sound unreasonable? Which one is right can only be told with data. And because of that I tend to support the HR guys. It's more likely (of course not 100%) that they have more data and experience than you and me.

(So indirectly I also agree that work experience can be a good discriminator concerning future work quality expectations)


> It's more likely (of course not 100%) they have more data and experience than you and me.

Having data does not imply usefully analyzing it, and "experience" is nothing more than a plurality of anecdote. Without actually seeing the data, I think your best guess here should be 50/50, rather than the very biased-toward-the-high-end "more likely (of course not 100%)" that you have. Appeal to authority is especially dangerous when you are making decisions that affect your company's bottom line based on peoples' ability to format text.


I confess, after spending a few years as an inexperienced coder applying for anything I could get, that I take a perverse pleasure in updating my shitty resume only occasionally now, and not giving a wet fish about the formatting. (a) because it's cathartic, (b) because I don't need to suck up to assholes any more, and (c) because one of the smartest, most reliable people I ever hired submitted her CV to me entirely in Comic Sans.


*insecure children more concerned with form than function*

I thought that was what investment banks wanted to hire?


Not to mention the importance of cultural fit, too.

People dismiss cultural fit because it seems superficial. In reality, programming ability is part of cultural fit - who wants to work with an incompetent? A good programmer does not exist in isolation, a good programmer exists in context of the people s/he works with.

So if you find that rockstar/ninja advertising attracts the type of people you want - why not? Personally I think it's tacky, and we make fun of it where I work. But there are plenty of places with no such qualms, and it seems dishonest to not embrace that.


Yes, cultural fit is important. Programmers need to be able to work together ... BUT ...

... I once hired a guy precisely because of his strong stance against some of the norms in the local culture (and he passed the minimum programming threshold and was willing to work for the peanuts we were offering).

I inferred from his political/cultural views—which he was not shy about expressing—that he would have the will to speak up and question my decisions if he felt strongly about it. As a young manager I knew I needed someone working for me who would challenge my thinking on occasion. Whether he knows it or not, he taught me a lot about how I needed to improve as a manager (mostly by giving me opportunity to recognize my flaws).

Maybe that's "cultural fit" ... I had a niche I needed filled, and he filled it. Yet, the role I wanted him for was to be disruptive on occasion. And it was a good thing.


This is the only use-case/meaning of cultural fit that makes sense to me. But I do think we ought to find a more specific name (http://lesswrong.com/lw/bc3/sotw_be_specific/). Perhaps something along the lines of balancing the team.


Agree, you just need to know which type you're targeting. The opposite of "rockstar" might be mentioning a "family friendly atmosphere". It will appeal to a certain audience which might or might not be your target.


Where did it mention cultural fit? It seems that he skipped that, too. The one place I thought at first glance was referring to cultural fit:

> quite a few people are going to be really nice people that nevertheless aren't right for us

was actually just talking about being incapable of doing the job:

> If there's only one in two hundred resumes worth considering, quite a few people are going to be really nice people that nevertheless aren't right for us.

The mention of "ninja" programmers wasn't in respect to culture, either, but simply optimization.


We're currently hiring at Startup Weekend, and at the end of the job ad we said something like "Ninjas, rockstars, gurus, and jedi masters need not apply."

In hindsight, I think that actually says a fair bit about who we're looking for AND our culture. "Do a good job. Don't get distracted by buzzwords."


I think that comes across as a bit smug.


I second that...it does sound a bit smug


I'm with DarkShikari - this is a great article.

Hiring is one of the most important things for any company, at any stage, so it is worth doing right. If interviewing 100 people instead of 50 will increase overall talent, it's worth bending over backwards to do this, even if it means sacrificing in other areas.

Ultimately, everyone ends up using heuristics at some level. But it is worth being very careful what heuristics you use, and the article does a great job of demonstrating this. Do you really want to exclude excellent developers who didn't major in CS? Do you really only want to hire engineers who self-identify as rockstars? Maybe - but be conscious about what you're doing.


Just because someone points out a problem doesn't mean the onus is on them to find the perfect solution.


The 500 applicants get whittled down quickly after you look at someone's Github or code examples. Suddenly you're looking at 10 or 15 at most. You can do that.


And now you're using a heuristic even less correlated with programming ability. My list of awesome programmers I would hire again in a heartbeat has, to my knowledge, a combined 0 lines of code on github.


It's one heuristic out of many, and it's a fast one. If you go through all 10-15 Github profiles and you find an awesome programmer, then you're done for that point in time. If you don't find any, you can use a more time-consuming heuristic.


Throwing out resumes with spelling mistakes is even faster, and if you don't find anyone, you can always go back. The same can be said for any filtering technique.


The problem here is that a resume with no spelling errors says nothing about a person's coding ability. The resumes without errors merely lack the negative indicator of poor spelling/formatting. A strong Github profile, however, consists of many code samples and is a positive indicator of coding ability, making it a much more useful metric. This doesn't imply a knock against programmers with no Github profile, simply a leg up for those who have one.


I generally prefer using Wikipedia edits or Linux kernel commits.


I'm a semi-introvert graduating with my BS in CS and am looking for jobs right now.

And this is all I ever feel. I read dozens of job ads across the country (I look through all of them because I really want to get out and move some place new, see the world - being fresh out of college is the best time for me to do that and all) and in the process I see only two kinds of jobs for someone like me.

The ads either come across as wanting rock star geniuses who could develop the entirety of the next Facebook or Google in a month in their sleep, or they come off as grossly incompetent in that they don't know what they want from an employee.

When an ad lists skill sets from assembly to Rails to genetic algorithms I just sigh, because the company obviously doesn't know what it wants, and I want to work somewhere where I can not only get better at my trade and create great things but also have confidence in the business not going under in a few months.

Simultaneously, the other set of job ads want 5+ years of experience for a startup and use the rock star vocabulary, and I get turned off by that because I am not the second coming of John Carmack or Bill Gates. I wish I was, but I just am not that smart.

Compound that with the reality that I have a passion for software, and as a result only want to work on things I find interesting and useful myself (eating my own dog food), and I might consider one ad in a hundred. And I'm not even location-limited!

It just seems to me like there is no middle ground, either you are a genius rock star or the employer appears clueless about what they are after in a developer. It really grinds my gears with all this job hunting shenanigans.


Ugh. I know the feeling.

I once answered a job ad - on HN no less! - asking if I had ever wanted to create a programming language and saying I should contact this company to show them if I had.

Well, I was a PL researcher by the end of undergrad. So of course I sent them my research paper, my senior thesis, and my project link.

Turned out they make a mobile furniture-sale application. They've got really cool guys working on it who've done things like compilers or languages before, but when it came right down to it, they stopped calling me back when they realized I hadn't done mobile development before.

<frustration>For fuck's sake, if you wanted an experienced mobile dev, ASK FOR ONE! And if you asked for a PL researcher, WHAT ELSE DID YOU EXPECT? AND WHY ARE YOU EMPLOYING COMPILER EXPERTS TO WORK ON SELLING FURNITURE?!</frustration>

It would be really nice if certain companies stopped trying to make themselves sound cool or important, and just let job-seekers know what they are hiring for. Obscuring your intent, in hiring, is equivalent to wanting a bad match.


This is in no way meant to be condescending, but realize that your options will open up massively after 2 years of work experience.

I graduated in 2010, I got 3 job offers out of the 3 companies I applied to. However, none were exactly my dream job and the few companies I really wanted to work for didn't even talk to new grads. Today I can absolutely get those jobs.

My girlfriend went through the exact same evolution. It was brutal finding anything in 2010, and she just got her dream job with 2 years of work experience under her belt.

Those 2 years of work experience really open doors.

It's an enormous mistake to hold out for the 'perfect job'.

One of my roommates was a finance/econ/accounting triple major and wanted a Goldman Sachs job (not our cup of tea, but it's instructive). He didn't get that job, but saw the 'regional' banks as below his level, so he didn't take any job and brushed up his resume and applied again. Meanwhile another classmate with a less stellar but similar resume took a job at the regional bank. Two years later he switched into JPMorgan. My roommate who passed has basically been unable to get any job - even the regional bank jobs he once considered beneath him.

So don't let the frustration make you do something stupid and pass on the dumb job offers. You have to get a job right away. You can switch from that job after 6 months or 1 year or whatever, but do not make yourself unemployed. You only become less employable as your graduation date slips because it signals other employers that some have passed on you.


I see where you're coming from, but software is a lot different from banking.

"You have to get a job right away." definitely isn't true or good advice for a programmer. You can learn a lot more a lot faster and add more impressive pieces to your portfolio by working on your own projects, open source, or contracting than by accepting a bad job at some dysfunctional company.


Nonsense.

Getting a programming job at a company that uses the technologies you want to work with will seriously help you move into another role in the near future. Even if the company is crap.

It's nice that some people will care about your personal projects, but A LOT of good companies won't see these in the same light as real professional experience dealing with real production, team, and stakeholder issues.

Also, how can anyone support themselves for months whilst they wait for the perfect job which may never come because it gets harder to find a job the longer you are unemployed?

As for contracting; to land a good contracting role (I'm talking stuff outside of rent-a-coder) is very difficult. Most companies wouldn't even consider you as a contractor unless you have several years of solid experience. I can only assume you mean the kind of 'contracting' that happens at places like rentacoder.com. If you think anyone will care about this then you are mistaken, plus, it's a waste of your time as you'll get paid peanuts and end up providing free support to the idiots that gave you the work in the first place.

If you're fresh out of education and will not be landing a role with Google (or the likes of) and you're struggling to get work at a good company then you should take the first job that comes along that works with the technology you want to specialise in.


Speaking from experience, most of your assumptions are completely wrong. It's much easier to gain valuable skills in personal, open source, and consulting projects, because you can target your work towards sought after languages, tools, and frameworks, and clients you contract with will often let you choose the technologies if you make a good case for them. Bad companies will tend to use bad and outdated technologies, and they won't be open to the suggestions from a new hire.

It's also extremely difficult to build a strong portfolio working at a bad company, because bad companies by their nature rarely get anything done, and even if they do, they'll pigeonhole you into the most monotonous tasks and you'll get 0 experience making architectural decisions or working on anything outside whatever little gutter they throw you in.

On the contrary, doing your own projects or contracting (ideally with startups or small companies) will force you to learn every level of the tech-business stack, from feature planning to architecture to coding to design to deployment to launching and promotion. You can work on many projects in a short timespan, and you'll be left with numerous portfolio pieces that are all your own.

"Most companies wouldn't even consider you as a contractor unless you have several years of solid experience. I can only assume you mean the kind of 'contracting' that happens at places like rentacoder.com."

This simply isn't true. You just need to sell yourself and show that you have good ideas and get things done. I got paid $60 per hour on my first rails project with no experience whatsoever in rails. The client never asked. She just saw that I had several strong pieces of work in my portfolio (half of them personal projects), and told me to use whatever tools I thought best for the job. If you're considering rent-a-coder you're definitely doing it wrong.

Don't settle for mediocrity. Take the initiative and start building a portfolio any way you can. The sky's the limit once you have a solid body of work to point to. You can make good money contracting or you can have your pick of awesome jobs. Having experience in some crappy job isn't what matters, it's showing that you can build great software.


Speaking from experience, most of what you say is wrong.

Most of the contracting positions available in London (where I live) are only available to experienced professionals. There's no way anyone is going to hire a nobody for £500 per day. Good luck finding much work available below this level, I used to contract myself at £200 per day but this only because I personally knew the manager on the project and I had worked with him as an engineer as an intern one year previously.

Showing you have good ideas and that you get things done means NOTHING. The other guy applying for the same job who has 6 years of experience will beat you every time. Maybe things are different wherever you are from, but where I am from you don't get to work on anything that anyone cares about unless you have solid experience.

If you want to land a role at some small web dev shop where you will be stuck working on crappy rails or other nonsense things then go make your portfolio out of $60 a day "contracts" and have a great life. This does not exist in the UK and when people say 'contracting' they mean real work earning real money.

The best thing you can do is get an awesome internship then leverage that to get a good perm role somewhere else. This is exactly what I did. I didn't return to the place where I did my internship as I landed a better package elsewhere.

If you can afford to work for $60 a day and you are lucky to live in the Valley or somewhere where these roles are available, then lucky you. I prefer something a little more rigorous where I'm not a slave working on crap.


You're coming across as a really arrogant and unpleasant person. Frankly, you sound like someone who couldn't hack it on your own and is now bitter. I'm sorry it didn't work out, but you shouldn't generalize your own weaknesses onto everyone else. If someone isn't much of a developer or is afraid to talk to others and look for clients, then yeah he should take whatever he can get, but someone who has brains will be selling himself short and possibly destroying his longterm creativity and drive by accepting the corporate coding drone role.


Take a look at the London contracting market. There is no room for anyone who isn't experienced. London doesn't have little cheap contracts where front-end devs can come and work for no money. All the contracts are for serious companies paying serious money. You have absolutely no idea what you're talking about. You're typical HN. Contracting is seen as something that you enter after years in industry. Not everyone lives in the Valley, where companies come and go quicker than the local bus. Your attitude towards our industry is that of an inexperienced child.


My advice to you is that a lot of the qualifications and requirements listed in job ads are pulled out of the air, and many of them are absurd. I've seen plenty of job ads listing "5 years of iPad experience" or "3-5 years JSON experience". Obviously those are not very closely related to real job requirements. The thing to do in those cases, I think, is to look at the company, try to get a sense of what the job would be, and apply if you are interested and if it approximately matches your skill level.

The web startups with rockstar job ads may wish they had John Carmack or Bill Gates sending them resumes but I doubt it happens much. Bill Gates doesn't need to work at a little company somewhere -- he made his own company and is now a billionaire. A skilled, motivated recent CS grad is probably a great candidate for most companies.


How do you feel about Python and machine learning? Numenta is hiring in Redwood City, CA, and this post has motivated me to leave no stone unturned. :) (Contact info in profile)


To the grandparent: I got a job off of an HN comment once, and I learned more in my time there than in 4 years of schooling, and it was a blast.

Contact this man.


Even more so when the company founder is: http://en.wikipedia.org/wiki/Jeff_Hawkins


Consider loosening your requirements for companies you would interview with. Get your foot in the door somewhere - it will open up more opportunities later.

Straight out of college, I got accepted by an outsourcing sweatshop early on, with whom I interviewed because I was bored and not necessarily because I wanted the job. I went ahead and took it anyway, thinking I'd do something for 6 months that I could put on my resume and then move on.

During my spare time I made minor contributions to a couple of open source projects, then got an offer to hack on open-source software at Novell. I eventually turned it down to go work for a smaller company, which is where I am now, for the last 7 years.


I feel the same way. When I hear about this talent shortage, I can't help but assume they mean there are only so many people at the top of Stanford/MIT's classes.


I'm perhaps more of an extrovert in the same situation, but just apply to everything --- sure you'll fail at some stuff but it's always instructive. The key is realising the difference between a judgement of you and a judgement of how you interview :)

Hell, I got an internship last year by cold-emailing someone from the HN Who's Hiring thread and asking what they were running this year. Just put yourself out there and you'll have no problem.


This is wrong. I don't know a single Rails developer (with or without a degree) who doesn't have more work than they can handle available to them. Sometimes the jobs are more concentrated in certain areas than others, but I personally get several cold calls from recruiters every week, and I have no degree, and I am no rockstar.


When I do eventually move to SF / a tech hub, applying for everything is my plan, if I don't find something before moving.

I also don't know rails, and I'm just now getting into web technologies, trying to decide what to really dig into (between node and rails really) . Is rails really that hot? Just starting out with rails now I feel like I'm really late to the party. I'm mostly interested in Clojure, but that's not exactly the hottest job market.

It's nice to hear that it really is that good, though. It's hard to really get a feel for it living so far from any tech hubs, without many developer friends.


Rails developer <: programmer. Contrary to Bay Area belief, most programmers are not, in fact, Rails developers.


> I'm a semi-introvert graduating with my BS in CS and am looking for jobs right now.

BTW, your HN email isn't visible unless you put it in the about section. You might get some more interest if you put it there with a webpage url.


You might want to give your email ID in your profile.


zanny, if you are interested in a Python/Django job in San Francisco, email me. My email address is in my profile.


Indeed. My ad scoring program deducts 5 points and transparently kills off ads that contain terms like "ninja" and "rock star" because to my mind they indicate a propensity for either abuse ("a rock star should be able to code this in three days!!") or fad-following ineptitude ("rock stars are the coolest kind of programmers, I must have them! also, we write EVERYTHING in Ruby, it's like the Mac of programming").
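
For the curious, the filter is about as dumb as it sounds -- something like this toy version (not the actual program):

    # Dock points for buzzwords; ads that fall to zero or below never get shown.
    BUZZWORDS = ("ninja", "rock star", "rockstar", "guru")

    def score_ad(text, base=10, penalty=5):
        text = text.lower()
        return base - penalty * sum(word in text for word in BUZZWORDS)

    ads = ["Senior backend engineer, Python/Postgres",
           "Rock star code ninja wanted!!!"]
    print([ad for ad in ads if score_ad(ad) > 0])   # only the first ad survives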


Personally I think those terms indicate feeble attempts at psychological manipulation. I feel like responding to job ads like that would be like admitting you're a gullible fool. They're trying to recruit and stroke egos, not people.


Or they think of themselves as ninjas, which is also a big red flag.


I dunno. Think about Steve Jobs recruiting for Mac engineers with a pirate flag flying from the building. If 'pirate' then == ninja/rockstar now, you might be missing a great job.


Was that part of Steve's recruiting schtick? IIRC, that was just internal company culture and not something that Apple HR (or Steve if he was interviewing) would have mentioned.

I do believe he used words like "changing the world", etc. These are more about results than ego. Huge difference.


You are on to something here. If people want to make it a hipster contest, they're wasting the recruiting op. It simply does not add up unless you are stuck in a mindset that is not conducive to anything productive. "Nihilists Are Us" should never be a place one "wants" to work at unless there are no other ops available.


My typical response to ads for rockstars is pointing out that rock stars tend to earn 10 to 100 times more than a great studio musician and generally have to be the prima donna all the time. Then I ask if forcing myself to be the center of attention is part of the job description, and just how much better the position pays compared to the typical Valley salary.


Along the same lines... you never see a good ninja. So if I'm hired under the guise of being a ninja, you can expect to never see me in the office. So telecommuting would be expected?


This is great.

I'm going to apply for and get a job at a company advertising for "ninjas," then not let my boss know when I'm taking vacation or a sick day - just so when he calls to ask where I am, I can reply with "Well, you did say you wanted a ninja programmer..."


if you don't mind, I'm going to borrow that line from you...


I don't mind at all.


if you were really a ninja however, you would have stolen that line under cover of night, replacing it with a forgery that looked identical, before using a selection of grappling hooks to make a stealthy escape across the rooftops...


I gather you read Perfect Crime Party (Bakuman around chapter 90) then?


nope, just looked it up just now, looks fun

will add it to my reading list, needed some new manga to get into anyways, thanks for that :)


how do you know that i didn't, and this is just a diversion?


Because I'm wearing lucky pants, of course.


Your program would do the same for an advert that said something along the lines of "We don't care whether you are a ninja or a rock star"?


It would be great if you made your software available as a web app for all of us to use.


The argument made in the article applies in both directions.


When I was open to new job opportunities, I immediately turned one down if the job description included words like rock stars, ninjas, etc. I am not against the mindset. It is just not a good fit for my personality.

I want to work on products that I consider meaningful and work with good and serious engineers. It is nice to hang out with colleagues, have drinks, tell jokes, etc. I am a big fan of classic rock. But calling engineers rock stars, ninjas, etc., or engineers labeling themselves as such, really conveys a somewhat negative impression to me.


I think another over-abused phrase is "smart and gets things done". Thanks for regurgitating Spolsky in your ad. Tell me, do you have individual offices for your engineers? No... it's the standard cube farm? Then you don't get to quote Spolsky.


Job descriptions just shouldn't have "rock star" or "ninja" unless candidates are being hired for their ability to perform music in front of an audience or for their katana-wielding skills.


Rock Star: that guy who shows up late and worked on a hit project 15 years ago that he's always bringing up.

Ninja: that guy who's never at his desk and never says anything in meetings.


You forgot to mention "brings his katana to meetings" (the sword, not the Suzuki motorcycle).

It's kind of "friend-of-a-friend", but I had a boss who told me about working with a developer who actually did that, I think at a game company in the late 80s.


By any chance was it *puts on sunglasses* John Romero?

YEAHHH!


wow, i never knew i was a rock star


The more useful data point is that any company whose ad uses such language doesn't actually employ any ninjas or rockstars.

At best it has an HR person who never managed to stop being 17 years old; typically it's a bunch of trend-following losers; at worst it's a bunch of creeps who think they can under-pay and over-work you in return for having a pinball table in the corner.


Ironically, a job seeker rejecting an ad based on a keyword or two is engaging in exactly the shallow reasoning this article criticizes; it's just the other side of the same coin. I'm pretty sure there are actually good startups that put out "cool" (lame) ads, just like there are good programmers who let a typo slip by.


> Ironically, a job seeker that rejects an ad based on a keyword or two is exactly the shallow reasoning this article criticizes; it's just the other side of the same coin.

Not really. The situations are asymmetric -- an employer seeking a talented programmer needs to be rejecting potential employees on as few arbitrary criteria as possible, to maximize their chance at finding just a single qualified candidate.

A talented programmer, on the other hand, has Bertie Wooster's time-management problem; there are far more people desperate to hire than he or she could possibly hope to give attention to. You need to winnow out any opportunity that is even the slightest bit unappealing just to get the flood down to a manageable level.


> You need to winnow out any opportunity that is even the slightest bit unappealing just to get the flood down to a manageable level.

This is precisely what the article was talking about, but in reverse, like the GP said. You're implying that job ads that rub you the wrong way on a minor point are ads at the bottom of the desirability list -- this is exactly the correlation/causation fallacy the article discussed.

In other words, that vast list of companies looking to hire the talented programmer is comprised of 1) companies that suck 2) companies that are OK, and 3) companies that rock. The occurrence of the word "ninja" or "rockstar" in their ad isn't an indicator of which group the company falls into. What you're advocating is no more scientific than throwing away unlucky resumes.


> You're implying that job ads that rub you the wrong way on a minor point are ads at the bottom of the desirability list -- this is exactly the correlation/causation fallacy the article discussed.

You've misunderstood.

The employer faces the problem of there being too few really good programmers, and so must do nothing that would prevent them from possibly seeing one in their candidate pool. It's like panning for gold; you need to get as much crap as possible going through your pan to find anything of value.

The good programmer has an entirely different problem. There are a ton of jobs out there that I would find stimulating, engaging, and fun. Finding them isn't hard at all. They show up on HN, they show up in my inbox, they show up in conversations at conferences and events, they show up in unsolicited offers from friends, acquaintances, headhunters, etc. They show up on a weekly basis.

If I'm in the market for a job, I don't need to carefully grovel through every opportunity to find the one good one. I couldn't possibly, even if I wanted to, because my attention isn't infinite. Talking with every potentially interesting employer would take more hours than I have in the day.

Instead I need to, like Bertie Wooster, somehow winnow down the 100 equally plausible opportunities to something small enough that I can reasonably give my time and attention to picking between. That the heuristics for doing so are arbitrary is irrelevant; my only concern is grossly reducing the numbers I'm dealing with, because I have no reason to fear I won't find a good one among those that are left.


Honestly, there are a lot of crap jobs posing as dream jobs as well. I remember talking to my (female) cousin once and she said to me, "People act like it's only good men that are hard to find. The sword cuts both ways." Just because there's a higher ratio of women to men doesn't mean that the ratio of good women to not-so-good women is higher than it is for men. I didn't get what she was saying until I started dating heavily.

In the same regard, just because there are more jobs than candidates doesn't mean there isn't a high ratio of crap jobs. So the same 100 job ads will likely have 1 or 2 great opportunities and 98-99 posers. The candidate's job is just as tough as the recruiter's.


This reminds me of that job advert which specifically asked for programmers with MacBooks only. No way in a million years would I apply there! I have an iMac, but only because I'm interested in different sorts of computers. I mean seriously, not everyone can afford a Mac.

I've had this on my chest since I saw that advert. Just needed to vent :)



Reminds me of this steaming pile from last year http://blog.expensify.com/2011/03/25/ceo-friday-why-we-dont-...


Ahh, I can't find it. I just remember that it looked like a normal HN story but you couldn't comment on it, and that it didn't ever actually mention the company name.


What's worse:

Looking for: Leonardo da Vinci 2.0, Einstein reincarnated

Compensation: We cannot afford to pay you now but we promise equity


What's worst:

Looking for: the lovechild of Richard Feynman and the second coming of Jesus

Compensation: small salary with some equity

Actual work: it's like Craigslist, but for social mobile


Something this article doesn't mention is evaluating how successful an individual will be in the culture of the organisation they're applying to. Roles and job descriptions change, in startups and MNCs alike. Sure, do technical evaluations, but do the following explicitly rather than assuming it:

1. Analyse the factors that make someone in your organisation successful, and be critical; an example of being critical might be "manages explicitly conflicting goals".

2. Map candidates both to the "now" of what is needed and to the factors of success your organisation will need in the short, medium and long term.

It is quite easy to find great people through technical questions. It is bad for everyone when great people don't fit. This is not behavioural-interviewing stuff or psychological surveys; simply match proven success and/or evidenced motivation.


What exactly is a ninja or a rockstar in terms of programming? If you're not a rockstar, what are you? Would anyone ever actually describe themselves as a rockstar programmer?

I was pretty glad to see "anyone who uses rockstar as an adjective" on the list of people taken out in the God Bless America trailer.


> What exactly is a ninja or a rockstar in terms of programming?

Someone who enthusiastically engages in RDD (resume driven development). Someone who knows whatever is currently fashionable in the blogowhatever...

