Hacker News | QuackingTheQ's comments

I would also go to bat for Diaspora; it single-handedly re-invigorated my interest in science fiction.


I loooove Diaspora! Egan was inventing neopronouns 25 years ago, and it totally works in the context of the story; the reader doesn't even bat an eye seeing ve/ver/vis used so consistently and casually after a few pages. Just one of the many forward-thinking aspects of that story.


When this book came out I'd previously seen ze/zir which sounds less jarring in legacy English. I think this was from a few people in the sf world in online discussions rather than in fiction, though I can't really remember anymore.

Either way, it scales better than having everyone publish an individual pronoun policy and everyone else remember it: O(1) vs. O(N^2).

Diaspora is excellent.


I went to a small rural school on the east coast, and circa 2005 or so I recall getting a mass email from an acquaintance explaining their new pronouns of ze/zir. That was the first I had ever heard of someone preferring different pronouns, and it was probably close to a decade before I heard those particular ones in any other context. All that is to say that if it had made its way out to a rural college in 2005, it was probably being used a bit more widely in the SF world a few years prior.


Also chalk up Tim Leary for "hir" back in the 1970s (maybe 1960s).


Diaspora is one of the most interesting and captivating things I have ever read. I love it dearly.


I loved Permutation City and his short story collection Axiomatic, and really liked Distress. However, I stopped reading Diaspora 30% in. If it hasn't clicked yet, should I still keep reading?


> I loved Permutation City and his short story collection Axiomatic, and really liked Distress. However, I stopped reading Diaspora 30% in. If it hasn't clicked yet, should I still keep reading?

It depends on what doesn't click. If you're waiting for it to get more down to earth, it doesn't (either literally or figuratively). But I do remember it as starting out very dry, and getting, while not less dry, considerably more absorbing as it went along.


I am totally fine with dry. What usually pulls me in with Egan, and hard sci-fi in general, is cool technological ideas and seeing how they play out societally (example: the floating island state in Distress). Egan stands out to me in that he will also apply or combine existing concepts in mind-warping ways (an example would be the infinite cellular automaton in Permutation City). So far none of that has really happened for me in Diaspora. AIs existing at a faster speed than physical reality is cool, but feels like table stakes.


I'm not sure how far into Diaspora you made it, but the further you go, the more mathematical and physics-based it gets. Computability/complexity theory shows up later, there's a large portion of the book dedicated to an alternative physics model, etc.


Oh, that sounds wonderful. None of that has really come up yet. Thank you! I'll give it another shot.


"Knowledge base" is buzzwordy. It's a very elaborate (and good!) note-taking app.


This recent breed of note-taking tools (Obsidian, Notion, Logseq, Roam Research) is aimed at enabling a style of personal knowledge base known as networked thinking [1], which attempts to facilitate the emergent creation of new knowledge by connecting related ideas in your notes.

[1] https://en.wikipedia.org/wiki/Frederic_Vester#Networked_Thin...


Of those, Roam and Logseq definitely are, but Obsidian and Notion are much more open-ended in how they can be used, although I think the Obsidian community online does tend to overindex on that kind of usage.


Would you call a private wiki a note-taking app?


A pet peeve of mine is that differentiable programming has been co-opted almost entirely by deep learning and neural networks. The idea of differentiable programming is much bigger than SGD, and in fact neural networks are typically a simple program to differentiate. Full differentiable programming requires solving much more involved problems around control flow than just implementing numerical forward/reverse mode for math operations with well-defined and understood gradients.


Julia's AD is compatible with control flow. It has its own issues, but Zygote + ChainRules actually work pretty well.
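To make the control-flow point concrete: here's a minimal forward-mode AD sketch using dual numbers, in Python for brevity (the same idea underlies ForwardDiff-style operator overloading). Branches and loops "just work" because they execute on the primal value; all names here are illustrative, not any library's API:

```python
class Dual:
    """Dual number a + b*eps with eps^2 = 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u v' + u' v
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

    def __gt__(self, other):
        # Comparisons branch on the primal value only.
        return self.val > (other.val if isinstance(other, Dual) else other)

def f(x):
    # Branching: AD simply follows whichever path the primal takes.
    if x > 0:
        y = x * x
    else:
        y = 3 * x
    # Loops are equally transparent to the dual numbers.
    for _ in range(2):
        y = y * x
    return y

d = f(Dual(2.0, 1.0))   # for x > 0, f(x) = x^4, so f'(2) = 32
print(d.val, d.dot)     # 16.0 32.0
```

Note this also illustrates the catch raised elsewhere in the thread: the derivative is only correct for the branch actually taken, so value-dependent branches (`x > 0`) are exactly where subtle correctness issues creep in near the decision boundary.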


I've spent a lot of time developing large computational codebases in Julia, and I think the most insidious of these issues is a product of no formal way of enforcing interfaces. Using one of the common packages to build a trait system and add some sort of guarantee that all the right methods are implemented for a given trait simplifies maintenance dramatically.

This doesn't catch mathematical bugs, which crop up everywhere. Instead, having the interfaces formally specified so you know what your implementation must satisfy is crucial, and being able to tell when an interface is invalidated is invaluable.

I've had a few awful bugs involving some of the larger projects in this language, and a proper interface/trait system would simplify things enormously. There are some coding-style habits that need to change to address this, like using `eachindex` instead of `1:length(A)` for array iteration, as the example in the article points out. However, these should be one-off lessons, and a good code linter should be able to catch potential errors like this.

Between a good code linter (or some static analysis, I'm pulling for JET.jl) and a formal interface spec, I really think most of Julia's development-side issues could be quelled.


I agree with the kernel of your point here, but also with the author of the article when he says "But systemic problems like this can rarely be solved from the bottom up, and my sense is that the project leadership does not agree that there is a serious correctness problem. They accept the existence of individual isolated issues, but not the pattern that those issues imply."

My impression is that the Julia core devs are more focused on functionality and being able to construct new, more powerful, faster capabilities than on reflecting on how the foundations could or should be made more rigorous. For this, I think the devs have to philosophically agree that soundness in the large should be a first-tier guiding principle, and that the language should have mechanisms whereby correctness-by-construction can be encouraged, if not enforced. Presently, notions of soundness seem to only be considered in the small, such as the behavior of specific floating-point ops. Basically, I don't think the core devs are as concerned with soundness, rigor, and consistency as they are with being able to build more impressive capabilities.

I don't want this to sound like I'm ungrateful for the awesomeness that Julia and its ecosystem do bring to the table. For numerical computing, I don't see any alternatives whose tradeoffs are more favorable. But it is disappointing that the project doesn't seem to take on board the lessons of rigorous language design, and the language-level distinction between engineering and craftsmanship, appropriate for a twenty-first-century language.


Sounds like Julia needs a Snow Leopard/Mountain Lion/High Sierra release - no new features, just cleaning things up...


Could some of the need for interfaces be addressed by providing an extensive test battery for each kind of object? It seems like if something claims to be an implementation of a floating-point number, it should be possible to smash that type into every error ever found to uncover implementation mistakes.


It's possible to hack interface verification into place at test-time, but that has a couple of problems:

1. Running the whole testing framework to determine whether you implemented an interface is a high overhead while you're developing.

2. You have a lot of tests to write to really check every error. Perhaps a package which defines an interface could provide a tester for this purpose.

3. Interfaces should be attached to the types, and that should be sufficient for verifying the interface.

I would settle for something like checking for the implementation of methods a la BinaryTraits.jl over what we have now, which is nothing. A huge step would be documentation and automated testing that proper interface methods are implemented, not even verifying if they're "correct". This drastically reduces the surface area you need to write and check to confirm compatibility with outside code.

This simple interface specification does produce design issues of its own, but correctness is much easier to handle if you know what needs to be correct in the first place.
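The "just check the methods exist" idea is language-agnostic, so here's a toy sketch of it in Python (not BinaryTraits.jl's actual API; the interface name and classes are made up for illustration). It only verifies presence, not correctness, which is exactly the cheap first step described above:

```python
def missing_methods(cls, interface):
    """Return the names in `interface` that `cls` does not implement."""
    return [name for name in interface
            if not callable(getattr(cls, name, None))]

# A hypothetical "container" interface: just a list of required method names.
CONTAINER_IFACE = ["__len__", "__getitem__", "__iter__"]

class GoodBag:
    def __init__(self, items): self.items = list(items)
    def __len__(self): return len(self.items)
    def __getitem__(self, i): return self.items[i]
    def __iter__(self): return iter(self.items)

class BadBag:                      # forgot indexing and iteration
    def __len__(self): return 0

print(missing_methods(GoodBag, CONTAINER_IFACE))  # []
print(missing_methods(BadBag, CONTAINER_IFACE))   # ['__getitem__', '__iter__']
```

A check like this can run at type-definition time rather than inside a test suite, which addresses the overhead complaint in point 1 above.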


Yes, although that seems like the easy half of this: making sure `struct NewNum <: AbstractFloat` defines everything. There aren't yet tools for this, but they are easy to imagine. And missing methods do give errors.

The hard half seems to be correctness of functions which accept quite generic objects. For example, writing `f(x::Number)` in order to allow units means you also allow quaternions, but many functions doing that will incorrectly assume numbers commute. (And for 99% of these, not caring is the intention. But it's not encoded anywhere.) Less obviously, we can differentiate many things by passing dual numbers through `f(x::Real)`, but this tends to find edge cases nobody thought of. Right now, if your algorithm branches on `if det(X) == 0` (or, say, a check that X is upper triangular) then it will sometimes give wrong answers. This one should be fixed soon, but I am sure there are other subtleties.
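The commutativity trap is easy to demonstrate. A minimal quaternion sketch (Hamilton product), in Python rather than Julia just to keep it self-contained; the class is illustrative, not any library's:

```python
class Quat:
    """Quaternion w + xi + yj + zk."""
    def __init__(self, w, x, y, z):
        self.w, self.x, self.y, self.z = w, x, y, z

    def __mul__(self, q):
        # Hamilton product: not commutative in general.
        w, x, y, z = self.w, self.x, self.y, self.z
        return Quat(w*q.w - x*q.x - y*q.y - z*q.z,
                    w*q.x + x*q.w + y*q.z - z*q.y,
                    w*q.y - x*q.z + y*q.w + z*q.x,
                    w*q.z + x*q.y - y*q.x + z*q.w)

    def as_tuple(self):
        return (self.w, self.x, self.y, self.z)

i = Quat(0, 1, 0, 0)
j = Quat(0, 0, 1, 0)
print((i * j).as_tuple())  # (0, 0, 0, 1)   i*j =  k
print((j * i).as_tuple())  # (0, 0, 0, -1)  j*i = -k
```

So any generic numeric routine that silently rewrites `a*b` as `b*a` (say, expanding `(a+b)^2` as `a^2 + 2ab + b^2`) is wrong for this perfectly legitimate `Number`, and nothing in the type signature warns you.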

