
A postscript on this piece: unsurprisingly, it has especially resonated with parents of NCAA athletes, and I have since heard from parents in a wide range of sports (lacrosse, soccer, gymnastics) saying that it reflected their own experiences. It also resonated with athletes themselves, and anyone who liked the piece should check out the discussion that we had about it with a former colleague of mine, Robert Bogart, who also happens to have been an NCAA champion swimmer.[0]

[0] https://oxide-and-friends.transistor.fm/episodes/diving-in-w...


For whatever it's worth, baseball is actually headed in the opposite direction: cutting the MLB draft to 20 rounds has made college baseball more important at the elite levels, not less. That, coupled with the fact that schools can't relocate or threaten to relocate (Rays, White Sox, and especially my fellow A's fans know this pain!), makes being a college baseball fan easier.


For whatever it's worth, details of my findings are in [0].

[0] https://bcantrill.dtrace.org/2018/09/28/the-relative-perform...


pretty cool! in isolation it looks awesome! i'm still a little curious about the impact of increased executable image size, especially in a complete system.

if all the binaries are big, do they start to crowd out cache space? does static linking still make sense for full systems?


The kernel will only load the parts of the binary you actually run, and can drop the disk cache for those parts that haven't been run in a while.

So more than the absolute size of the binary, you should worry about how much is actually in the 'active set'.
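To make the "active set" point concrete, here is a minimal, Linux-flavored sketch (mine, not from the thread; the /bin/ls path is just an example): it maps a file read-only, touches a few pages, and uses mincore(2) to show that only the touched pages have actually been faulted in.

    /* Sketch only: demonstrates demand paging of a file mapping on Linux.
     * The path is an arbitrary example; error handling is minimal. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/mman.h>
    #include <sys/stat.h>

    /* Count how many pages of the mapping are currently resident. */
    static size_t resident_pages(void *map, size_t len, long pagesz)
    {
        size_t npages = (len + pagesz - 1) / pagesz;
        unsigned char *vec = calloc(npages, 1);
        size_t n = 0;

        if (vec == NULL || mincore(map, len, vec) < 0) {
            perror("mincore");
            exit(1);
        }
        for (size_t i = 0; i < npages; i++)
            n += vec[i] & 1;
        free(vec);
        return n;
    }

    int main(int argc, char **argv)
    {
        const char *path = argc > 1 ? argv[1] : "/bin/ls"; /* example binary */
        long pagesz = sysconf(_SC_PAGESIZE);
        struct stat st;
        int fd = open(path, O_RDONLY);

        if (fd < 0 || fstat(fd, &st) < 0) { perror(path); return 1; }

        /* mmap() establishes the mapping but reads nothing in yet. */
        volatile char *map = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (map == MAP_FAILED) { perror("mmap"); return 1; }

        printf("after mmap: %zu resident pages\n",
            resident_pages((void *)map, st.st_size, pagesz));

        /* Touch the first few pages; only these get faulted in. */
        for (off_t off = 0; off < st.st_size && off < 4 * pagesz; off += pagesz)
            (void)map[off];

        printf("after touching 4 pages: %zu resident pages\n",
            resident_pages((void *)map, st.st_size, pagesz));
        return 0;
    }

The same demand-paging behavior is what loads an executable's text: pages come in as they are executed, and the page cache can evict the ones that go cold.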


Yes, that's all (unfortunately) correct. Part of the reason that we have been supportive of the openSIL effort[0] is to make our approach more generally attainable -- and of course we have opened our own work[1] and we will continue to be outspoken advocates for transparency at the hardware/software interface[2].

[0] https://github.com/openSIL/openSIL

[1] https://github.com/oxidecomputer/illumos-gate

[2] https://rfd.shared.oxide.computer/rfd/0552


Your (absurdly) broad claim is false -- and it's very easily disproven.[0]

[0] https://ahl.dtrace.org/dtrace-conf-2024/


You made this comment twice, so I guess I'll reply to it twice, but the reality is that the acceptance rate of ATC was very, very low (again, under 13% in 2004). Low acceptance rates actually do indicate a shortage of good publishing venues (certainly relative to the number of submissions), but it would be interesting to look at the ATC acceptance rate over time; if it was much (much) higher in its final years, it would be easier to accept your assertion.


https://github.com/emeryberger/csconferences indicates that ATC's acceptance rate has averaged 20% over the last five years, so higher than ~13%, but not by much.

However, at least speaking as an academic, I wouldn't say that a ~20% acceptance rate is necessarily indicative of a shortage of good venues. There is plenty of not-so-good research submitted to top places that has no hope of getting in. (My experience is from computer security research, where the acceptance rate of security conferences has gone down, but the fraction of good submissions has also gone down, so the fraction of good papers getting in has roughly stayed the same.)

That being said, ATC does indeed seem to have been a high-prestige conference back in the day, and hence competitive. My experience is from recent years, when it was viewed as a good-but-not-tier-1 conference in systems research.


17% in 2024, with more submissions than other conferences. Attendance numbers are probably skewed by cross-registration with OSDI.


I really don't care how prestigious academics consider (or considered) ATC to be, but for whatever it's worth, when our paper was accepted in 2004, the acceptance rate was just under 13%. That is bluntly too oversubscribed for the practitioner (whose job is to ship systems, not win paper popularity contests -- or adjudicate those popularity contests by serving on program committees). In some ways, if ATC wasn't thought to be prestigious, it's even worse -- it means that practitioners lost our conference for nothing at all.

A final note: given that OSDI is on your list of prestigious conferences, the way the program committee was conducted in 2010 (which I outlined in my ATC keynote[0]) should be particularly galling.

[0] https://speakerdeck.com/bcantrill/a-wardrobe-for-the-emperor...


Request for clarification: 13% of what?


13% of submissions to the conference were accepted to present.


Oh no! I had honestly never heard of EE380 when they asked me to present[0], but I really appreciated the fact that it was (either implicitly or explicitly?) open to the public. I wish I had heard of it earlier, and I'm sorry to see it go!

[0] https://www.youtube.com/watch?v=vvZA9n3e5pc


I absolutely agree, and I got at as much in my USENIX ATC 2016 keynote[0]: there are so many vectors now for sharing new ideas that it's almost hard to remember that in the heyday of USENIX ATC, technical conferences were really one of the only ways (along with USENET) for practitioners to broadcast a new idea. While there were perhaps upsides to having such limited vectors with respect to high signal, there were of course many more downsides; systems research has been well-served by information connectedness!

[0] https://speakerdeck.com/bcantrill/a-wardrobe-for-the-emperor...


I definitely agree with you, and apologies if that didn't come through in the piece! (Perhaps the rare case where I undershot on the metaphors?)

One thing I didn't mention but certainly believe: academics were attracted to USENIX ATC because of its attendance count (especially at the height of the Dot Com boom, when it had nearly 2,000 attendees!) -- but no one really took apart who was attending or what they were looking for. So the conference became more academic because of the attendee count -- but its becoming more academic in turn drove the attendee count down. (I heard from a lot of practitioners who attended that they struggled to find sessions that were relevant to their work in even the broadest sense.) I know I link to it in the piece, but I think Rik Farrow's piece[0] really got right to the heart of all of this.

[0] https://www.usenix.org/system/files/login/articles/login_fal...


I didn't mean to imply that I disagreed with anything other than that specific point (which perhaps is more nostalgia for the halcyon days of in-person conferences than anything else).

I do think that you correctly identified a major source of the problem back in 2004: economic factors. I was at Mozilla when Rust was built (though not directly involved), and the sum of Mozilla's investment in Rust over the years easily broke into eight digits. Google's investment in Go is, I'm sure, an order of magnitude or two more than that. This is simply beyond the capacity of any academic institution or grant process. The only academic efforts that get to this level require Acts of Congress (e.g., LIGO, the Human Genome Project, the James Webb Space Telescope).

And anyone who can afford to drop $10M+ into development can find channels for distributing and publicizing their work that don't go through program committees. Open source is definitely a big part of that but I don't think that's the whole story. I'd certainly count CUDA as a major advance in systems software since 2004, for instance.

