
You may know this already, but the SQLite CLI can actually read and query data directly from a csv file, with the right flags
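A quick sketch of what I mean (file, table, and column names made up):

    $ sqlite3 :memory:
    sqlite> .mode csv
    sqlite> .import people.csv people
    sqlite> SELECT city, count(*) FROM people GROUP BY city;

With .mode csv set, .import parses the file as CSV, and because the table doesn't exist yet it takes the column names from the first row. Everything comes in as TEXT, so cast if you need real numeric comparisons.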

They are Jedis at jq, though.

This gets to an essential point about LLMs - they are the ultimate intern. Anything you wouldn't ask an intern to do, you probably don't want to ask the LLM to do either. And you certainly want to at least spot-check the results. But for army-of-interns problems like this one, they are revolutionary.

With the exception that an intern is (hopefully) going to learn from their mistakes and improve.

If you have a reviewed output dataset from an LLM, you could use it for RLHF.

It's an electron app, so, yes, kinda. It has to load and parse all of its own JS at boot time (barring any lazy loading within the app)


VSCode is quite a bit snappier for me, though; I don't think Electron is necessarily the problem.


It almost never is. While HN contrarians endlessly complain about it, the rest of the world happily ships Electron. Discontent with Electron is so comically overrepresented here that I'm beginning to think it's an insecurity or a complex or something.


It's simple:

Electron is great for developers because it allows them to easily build a cross platform app. Businesses love it for this reason.

Most users don't care. They barely realize what software is to begin with, and the mere suggestion that things could be different is baffling to them.

Then you have the highly technical users who know the software could be better, but isn't, because of the previous points.


In my experience, some regular users also dislike the UX of websites as applications, and that's what Electron apps are. They feel heavier than native software, the UI elements tend to redraw and jump around unexpectedly, and the keyboard response is noticeably slower.

You can recognize them because they dislike the feel without even knowing what an Electron app is. I can vouch that they're not insecure about Electron.


Only if using the DOM, right? If someone ported a WebGL game to Electron, the same 3D engine would be at play. Or 2D canvas, for that matter. I think you're specifically referring to the overhead of the browser DOM and how elements therein load, focus, apply styling, etc.


I hate every Electron app I use, and that includes vscode.

There is a unique type of jank associated with it.

Would love to live in the apparently "bizarro" world where desktop-class apps developed using tools that try to be more native to the OS were still successful in 2024.

Every update to Adobe Creative Suite makes it closer and closer to an Electron app and further and further away from its roots as one of the premier high-end, desktop-caliber programs. I interpret the rise of Electron as mostly correlated with the lack of knowledge of how to do old-school desktop GUI development among the folks who teach CS.

I expect that FL Studio, Photoshop, Word, and every other remaining good desktop-caliber app will eventually be lost to the "make everything run in a web browser" trend. I'm so over it, and it hasn't even happened yet.

This is also the reason GenAI STILL has no good prosumer tools. ComfyUI/Automatic1111/Forge/lm-studio are the closest we have, and they're Gradio or Electron web apps, which indicates that AI folks are not down with leaving the Python or JS ecosystem for even a minute.

This means we need good desktop caliber GUI developers who understand AI. Unfortunately, they basically don't exist.

Thus, the world runs SOTA models on janky shit Gradio or Electron frontends instead of the desktop GUIs we had for similar UIs 10-20 years ago.

Word 2003 was peak software, and I'll never change my mind.


For me the problem with Electron is the “uncanny valley” effect - every app implements its own UI elements, and nothing is quite the same as the OS’s native behaviour.

Different apps have different unique quirks/odd behaviour, almost none of them do the right thing on platforms like macOS.


I do find it sad that native apps were quick and snappy back in the day, but now everything is effectively a browser, and has a DOM that needs to be parsed and dealt with, thus negating the hardware Moore's Law improvements that have happened since.

There must be a better way.


I used to feel the same way until Apple started dogfooding their ill-performing SwiftUI. The insane hardware packed into M1-3 chips compensates for it a bit (not always), but especially on my Intel MBP it can be jankier than most of the Electron apps I interact with regularly. If VSCode were as janky as the "native" Settings app on macOS, I sure as hell wouldn't use it.


Yeah man, I think it's stupid for a text editor like Atom to be 200 MB and take up 1 GB of RAM because I have a "complex". I'm annoyed at the 120 independent copies of CEF on my computer using 20 GB of disk space because I'm "insecure".


Any chance you could release the source code some day? I'm very curious how you pulled this off!


ITT: People who have not read or seen The Big Short


The 2008 mortgage crisis is now 16 years old, and the stock market had recovered by 2011. So presumably anyone who entered the work force afterwards might have little sense of it, which would be anyone under the age of 31 (or 35 if you count college).


The Rust team prioritized keeping the standard library small, and allowing important but inessential functionality to live outside the stdlib and keep their own release schedule.

MS has massive resources and can align the many, many modules of .NET into major releases. The Rust project does not.


>> The Rust team prioritized keeping the standard library small, and allowing important but inessential functionality to live outside the stdlib and keep their own release schedule.

There are crate "recommendation lists" like this: https://blessed.rs/crates

While these help, Rust's small standard library has slowed Rust adoption where I work because our IT security team must approve all third-party packages before we can use them. (Who blindly downloads code from the Internet and incorporates it in their software?)

This means that unless there is a compelling need to use Rust, most teams will use C# because .NET is approved and has almost everything. I prefer Rust to C#, but we are not using it that widely yet.

It's not a fair comparison, but it is how things currently are.


blessed.rs is fairly new. And if you look for YAML on that page, it links to serde; and serde_yaml being deprecated (with no blessed fork yet) is precisely part of the issue discussed in the parent article: https://lib.rs/crates/serde_yaml . And that's perfectly fine for the maintainers to declare; it was all open source, with no warranty whatsoever.


> our IT security team must approve all third-party packages before we can use them

Do they also review the original tooling? Why would one single out third-party packages?


>> Do they also review the original tooling? Why would one single out third-party packages?

Everything used for software development is explicitly approved. This includes programming languages, compilers, debuggers, IDEs, etc. We are primarily a Microsoft shop, so the majority of our development tools follow that direction.

For FLOSS libraries, the approval process covers an IT security review, a source-code scan / static analysis, and a legal review of the package's license to ensure it is not on the prohibited list.

This makes management happy since it prevents potential security and legal issues. It keeps our customers happy since they get quality software made from fully traceable components.


It sounds like you download software blindly from the Internet with the extra preliminary steps that make your customers happy.


It's less blind than other places I have worked.

When a CVE is announced, we know immediately if we are impacted and what will need to be fixed.

Some places have no idea what their dependencies are. I am sure there are lots of log4j horror stories from Java shops that were not so careful.


I wonder if there is scope for an "opt" / "extras" type tier where libraries for super commonly used stuff, like JSON/YAML/SSL, etc., can be placed. It doesn't have to cover everything the de facto third-party libraries do (that would be too much and not practical); just covering the happy paths should be enough for most people to include the meta package and get the most generally used features instead of pulling in a ton of third-party libraries.

My knowledge about Rust's library system isn't great so there are probably several gaps in my argument.
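For illustration only, the shape I have in mind is roughly a facade crate that just re-exports feature-gated dependencies (the crate name and feature names here are hypothetical):

    // lib.rs of a hypothetical "extras" facade crate: no code of its own,
    // just re-exports of commonly used crates behind feature flags, so a
    // team approves one package and opts into only the pieces it needs.
    #[cfg(feature = "json")]
    pub use serde_json as json;

    #[cfg(feature = "tls")]
    pub use rustls as tls;

Consumers would then depend on the one facade crate and turn on only the features they want, e.g. features = ["json", "tls"] in Cargo.toml.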


Several people have tried to create such a thing over the years, but the ecosystem never adopts it. People objectively prefer to pick their own set of packages.


There are several waffle houses in PA, DE, and MD. Isn't that the northeast?


Sorry, I should've said there are "essentially" none.

The two PA towns with Waffle Houses, Lancaster and Allentown, are kind of exceptions that prove the rule. They are special rarities: the northern Waffle House, like the jumbo shrimp.

But Waffle House in its true environment is no more a rarity than an orange in Florida. Its commonness is an inherent part of its charm.

Technically, it's possible a northerner could drive a few hours to get to a Waffle House. But the whole point and charm of Waffle House is its ubiquity. Anytime, anyplace: just past the next offramp and open 24 hours.

It is not a planned destination, it's ... just where y'all end up. And so no, while technically there are a couple of Waffle Houses north of the Mason-Dixon, those in the know would agree there isn't really Waffle House in the north, regrettably.


Maryland is usually referred to as the mid-Atlantic. Technically speaking, MD is south of the Mason-Dixon line, so it's in the South. I say this as a Marylander myself.


There may be a couple in Northeast PA (and others in Pennsyltucky), but those parts of PA aren't usually thought of as part of the Northeast US.


I am not getting a good sense from the website of what is special or unique about Shen. The top-line feature (above the actual list of features) is the presence of an "S series kernel", which as far as I can tell is Shen-specific. After that, the top feature is pattern matching, which has become common in mainstream languages lately (Java and Ruby come to mind).

A little further digging shows that this language is a Lisp. Great! I love lisps and functional programming, and I have a particular soft spot for Clojure. Are there any domains where this language would excel and CLJ would not?


Shen is a unique language, and one of the ways in which it is unique is that so much of its marketing, documentation, etc. is non-obvious and less accessible than you might want.

I think the main thing that I find compelling about shen is its type system, especially its sequent calculus system (for defining types in a way that would not be possible for most languages).

The other thing about it that is compelling is how portable it is. The main language is implemented in a simple kernel language; someone who wanted to port the language to a new environment would only need to implement a relatively small set of primitives, and then the entire Shen environment runs on top of that.

It's worth looking into, though I do caution that it has plenty of rough edges, etc.

For me personally, I think of it as an inspiration for programming languages I wish to develop someday. Additionally, if you've ever worked in a certain environment and really disliked that its language is a bit weak, Shen might be something you could port to that language and use. For example, I recently updated https://github.com/deech/shen-elisp to smooth down some of its rough edges and make it more usable; I haven't actually written any Shen yet that runs in Emacs. That's still a ways away.


> I think the main thing that I find compelling about shen is its type system, especially its sequent calculus system (for defining types in a way that would not be possible for most languages).

That sounds interesting. Can you give an example of a type defined this way that would not be possible in most languages?


One quick example is that the language allows for (nearly) arbitrary computation in the definitions of types, e.g. defining an enumeration/sum type of the employees of a company as:

    (datatype employee

      if (element? Emp (read-file "employee_list.txt"))
      _________________________________________________
      Emp : employee;)

where the enumeration type contains a case for each employee listed in the text file.

The concept is similar to algebraic datatypes, where user-defined types are created from sums and products in combination. However, in Shen the sequent-calculus-based datatypes are definable and constructible using any well-formed sequent. Add the computational content of the sequent clauses on top of that and you get a system that is wildly expressive.


Note: this is all extremely interesting, extremely powerful, and extremely cursed! But I think it is certainly food for thought.


Why cursed? Why can't a neat CSV file serve as part of the source tree, while a much more cursed pass with `cpp` to handle `#include`, `#ifdef`, and such is seen as okay, even e.g. in the Haskell community?


Well, first off, I don't think those things in Haskell are good, really. They are/were necessary because of various _other_ issues, but they are not "good" features. And Template Haskell doing arbitrary IO at compile time, for example, is famously a cause of a variety of unintended problems (e.g. you can't cache correctly if you never know what a TH module depends upon, and it's very hard to cross-compile if you can't be sure the arbitrary TH logic isn't depending on the compilation host architecture to make some decision for the compilation target, etc.).

Specifically for Shen, though, what gives me pause is how frequently this kind of "call an arbitrary term-level function in the type system" is done; it is so pervasive that all my mental alarm bells sound. Debugging issues that arise in things like this can be quite challenging.

Sharp/dangerous tools are often necessary, but what I find unfortunate is that you need to use those sharp tools constantly, when a safer tool could do just as well!


I agree in general, and with the idea that the effectfulness of compile-time calculations should be strictly limited.

OTOH reading an input file is one effect that a compiler inevitably has anyway. A clean way to load and interpret an input file as a part of comptime computation would be very helpful, without the problems of arbitrary effects.


Yeah, IMO the way this would ideally be handled is that some other program generates a source file from the CSV. That source file can then be included normally and works with the existing caching infrastructure, etc.

But ya know, back when make was the way of doing things, this would have been easy to set up. Now, in the age of each tool having its own build system, it's different.
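For what it's worth, some of those per-tool build systems do make this pattern easy. Here's a rough, untested sketch of what it could look like with a Cargo build script (the file names, the constant, and the CSV layout are all made up):

    // build.rs (hypothetical): turn employees.csv into generated Rust source
    // that the crate pulls in with include!, instead of reading the file at
    // type-checking / macro-expansion time.
    use std::{env, fs, path::Path};

    fn main() {
        let csv = fs::read_to_string("employees.csv").expect("read employees.csv");
        let out = Path::new(&env::var("OUT_DIR").unwrap()).join("employees.rs");

        // Emit one string literal per non-empty CSV line.
        let mut src = String::from("pub const EMPLOYEES: &[&str] = &[\n");
        for line in csv.lines().filter(|l| !l.trim().is_empty()) {
            src.push_str(&format!("    {:?},\n", line.trim()));
        }
        src.push_str("];\n");

        fs::write(out, src).expect("write employees.rs");
        // Tell the build system when to regenerate, so caching still works.
        println!("cargo:rerun-if-changed=employees.csv");
    }

The crate then pulls the generated file in with include!(concat!(env!("OUT_DIR"), "/employees.rs")).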


That's pretty cool. Any other examples? An actor framework equivalent?


Check out https://shenlanguage.org/learn.html, especially

"Coding a Lisp Interpreter in Shen" by Mark Tarver, "Shen Tutorial: Sequent Calculus" by Neal Alexander, and "Defining Types in Shen" by Chris Double - these all illustrate what is going on in different ways.


It's a hybrid of Lisp and Prolog.


Won't work on my Intel MacBook :-(


This is disappointing. Anything similar available for Intel Macs?


I'm not sure there's one single zeitgeist for them to align with. They could be in tune with the cultural attitude of the majority and still run afoul of millions of people

