For a look at some of the amazing output from an "ancient" EOS, check out Magic Lantern's Discord. It's rather shocking how far this little camera could be pushed. It's definitely fun to fool around with these things as a hobby project. After a while I stopped having the time and moved over to Sony APS-C with vintage lenses. I was able to maintain some of the aesthetic without getting frustrated by stuttering video. Still, it's really a cool project.
We do React Router 7 (previously Remix) with Fastify for a couple of our apps, both external and internal. I'm really happy with the dev experience. We also have plenty of NextJS floating around here too. Most teams use it as a "better Create React App"; there's little use of any of its features. I'm the only one that's used any of the server-side stuff.
For my team, React Router 7 just gets out of the way. If you use it in framework mode (SSR) you get loaders and actions, which are things that run on your server. I find it SO much less convoluted. The logger example from the article above is child's play in React Router: either import your logger and run it in your loader (you may want the .server.ts suffix to make it explicitly server-only), or inject the logger via context.
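A minimal sketch of the first option, with hypothetical file and route names (the .server suffix keeps the module out of the client bundle):

    // app/logger.server.ts -- never bundled into the client build
    export function log(message: string): void {
      console.log(`[app] ${new Date().toISOString()} ${message}`);
    }

    // app/routes/dashboard.tsx -- a hypothetical route module
    import type { LoaderFunctionArgs } from "react-router";
    import { log } from "../logger.server";

    export async function loader({ request }: LoaderFunctionArgs) {
      // Runs only on the server in framework mode (SSR)
      log(`GET ${new URL(request.url).pathname}`);
      return { ok: true };
    }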
I had a project slated to use this framework, and the pilot went fairly well. Fresh has the right ideas on static vs dynamic islands. We ended up deploying with Astro, which has similar ideas. In the end, I just wasn't able to get full buy-in on Deno.
That’s a shame, I’ve also been through much the same process.
Astro is pretty good too, though. I’m not 100% sure on some of the decisions it’s made, and personally don’t enjoy the need for new file formats and domain specific languages, but it does a half decent job of being framework-agnostic despite a few pain points.
Obviously some will find this a silly opinion, but the one thing that turned me off the most about the Nim programming language was its use of significant whitespace. The same is true of F# (and of course Python). Having had apps with YAML for config, and having had nightmares trying to copy/paste config directives from various sources, I just find whitespace to be unwieldy.
Now that's a strong opinion (weakly held, as a language can't be judged on this design decision alone). But it does sour my interest a bit.
> Having had apps with YAML for config, and having had nightmares trying to copy/paste config directives from various sources, I just find whitespace to be unwieldy.
Convert your YAML into JSON and save it in your YAML file. There is probably an online converter, but writing one in your language of choice should be less than ten lines of code.
Do the same YAML→JSON for the "source" configuration you want to copy from, and copy-paste the parts you want. Leave them as JSON.
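For what it's worth, the converter really is only a few lines. A sketch in TypeScript using the js-yaml package (assumed installed; and since every valid JSON document is also valid YAML, the output can be pasted straight back into a YAML file):

    // yaml2json.ts -- usage: npx tsx yaml2json.ts config.yaml > config.json
    // Caveat: JSON.stringify flattens YAML-only types such as timestamps
    // into plain strings.
    import { readFileSync } from "node:fs";
    import yaml from "js-yaml";

    const file = process.argv[2] ?? "config.yaml";
    const doc = yaml.load(readFileSync(file, "utf8"));
    console.log(JSON.stringify(doc, null, 2));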
Complaining about Python's significant whitespace, I get it. I don't mind it personally, but it's obligatory and you can't overcome it (unless you do `coding: with_braces` tricks, of course). But why would one complain about YAML's whitespace? It is not obligatory.
> But why would one complain about YAML's whitespace? It is not obligatory.
The problem (as felt by me, and as identified by the person you replied to) is that you can't copy-paste/munge some stuff into the right spot and then just let the formatter fix the indentation. It's not a problem that the format "at rest" requires a certain indentation to be correct; it's that while you're actively editing, your formatter cannot automatically set the correct indentation.
The flow you're describing, of converting YAML to JSON and then putting it back into YAML, could work in some cases, but it's very much a kludge. It unavoidably has numerous bad side effects: it discards any comments in the middle, since JSON doesn't allow comments at all; JSON has no timestamps; it has no octal numbers; etc.
> The problem (as felt by me, and as identified by the person you replied to) is that you can't copy-paste/munge some stuff into the right spot and then just let the formatter fix the indentation.
That problem I understand, and that is why I suggested converting both into JSON (or YAML with default_flow_style=True, which would preserve datetimes and other non-JSON stuff) and copy-pasting without the hassle of having to indent/unindent correctly. Of course that doesn't help with copying comments; that would need extra copy-paste operations, but one still avoids the hassle of significant whitespace. The following is also valid YAML:
{"some_key": {
"attr1":
# an intermittent comment
"val1", "attr2": 12312 # more comments!
}
}
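A sketch of that round trip using js-yaml in TypeScript, where the flowLevel option plays the role of PyYAML's default_flow_style=True (assuming js-yaml's default schema, which parses and re-emits YAML timestamps; the exact output formatting may differ):

    // Round-trip YAML through js-yaml, emitting flow (brace) style.
    // flowLevel: 0 forces flow style from the root node down.
    import yaml from "js-yaml";

    const src = "created: 2001-12-14 21:59:43.10 -5\nname: demo\n";
    const doc = yaml.load(src); // the timestamp becomes a Date, not a string
    console.log(yaml.dump(doc, { flowLevel: 0 }));
    // e.g. {created: 2001-12-15T02:59:43.100Z, name: demo}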
My gripe with json is the lack of support for comments. Whenever I come across a config file that has comments about what the config line(s) mean, I am so grateful.
Whenever I come across a json config file, I kind of despair a little and start poking at the code in hopes there are comments about what the config means.
I totally agree with your gripe about JSON's lack of comments. AFAIK there were people who tried to write a spec with comments (and maybe trailing commas? was it called JSON5?), but by then it was probably too late.
My biggest issue with JSON5 is that, as far as I'm aware, if you update settings programmatically you tend to lose comments... I'm not aware of any implementation that preserves them.
I never understood how putting up roadblocks for developers trying to copy-paste code was deemed acceptable, or why GvR (and others) thought the solution to poorly formatted code was making formatting carry semantics instead of just writing an auto-formatter.
I agree. Autoformatters are everywhere and easy to use. I'd far rather do that (plus maybe a pre-commit hook) than have to deal with whitespace in the language.
Well of course that's completely ahistorical. The motivator was eliminating block terminators. Python's predecessor, ABC, was a teaching language; programs were small and refactoring was not a thing. Heck, copy-paste was hardly a thing, and there certainly were no auto formatters. Indented code was a step up from BASIC, the teaching language prevalent at the time.
Yea, in the 90s significant whitespace seemed great because it meant that you got readable code. The amount of code that you might see copy/pasted with terrible formatting/indentation in other languages could make you want to scream.
Now, when you paste code and things are wrong, an auto formatter cleans it up for you. Before, you'd just end up with an unreadable codebase.
If you grab that version, unpack it, and look at /OChangelog, it seems to date back to at least 1989, same as Python itself.
That was for C source, of course. I expect there were pre-GNU indent variants, perhaps posted on comp.sources.unix and maybe some commercial things as part of very expensive compiler packages.
I would say that running autoformatters in any kind of routine way was pretty rare. EDIT: but I think ascribing the language design to the commonality (or not) of autoformatters is probably ahistorical. Even today it's a rather passionate debate. And even at the time, Lisp, the poster child of copy-paste-friendly languages, was routinely autoformatted within Emacs, but that was not enough to stop people from finding Lisp code "ugly".
I get everyone has their thing, but I've been writing Python professionally for years and I can't even remember the last time significant white space was an issue. You just get used to it, like everything else.
I'm automatically going to be interested in any language with significant white space because there are very few mainstream ones and I hate the visual clutter that block delimiters create. Pretty much there's just Python. Scala 3 can happily do both.
I think we'd be better off if text editors just had the option of representing braces and such as consistent indentation. Block-delimiting tokens should optionally have the semantics of non-directly-printable characters like newline or tab.
You'd love Haskell, which uses curly braces for many constructs, but also has rules by which they are implied by indentation -- so in practice you only ever see them on records.
I love Python syntax overall, and absolutely despise Haskell's. It wastes my time constantly and gives me incomprehensible compiler errors when I screw it up. Expression-oriented languages are really poorly suited to significant whitespace IMO, unless they're hyper-regular like s-expressions: I could imagine a decent whitespace-based version of those.
A language that can do both the Python and C "styles" is Ring, so it is possible. But the issue is that people have such a strong preference for one or the other that they force languages and developers to permanently choose.
Even Allman versus K&R or tabs versus spaces are huge battles, without even going into significant white space.
Seeing tests written by Claude/Gemini really frustrates me sometimes. They'll go to great lengths to make tests pass, sometimes even just testing console/stdout logging. On the other hand, when I've written failing tests and told the tools to write implementations to make the tests pass, I've had some really good results. Sadly, that's the part I enjoy the most, so I really don't want to delegate it to a tool.
I spent the last half of the 90's and the first part of the 2000's building computers. Like the author, it started with a massively thick Computer Shopper catalog. A motherboard from TC Computers, a 1.3GB drive from Dirt Cheap Drives, 16MB of 72-pin SIMMs from my dad's old Compaq, a 486DX4 from some other seller. Man, that was such a rush cobbling that thing together. But the bug stuck with me and eventually got me a job, which got me installing Novell, Windows NT, and eventually Linux! Then my boss sprang the big one on me: "you know, the real money is in software development". What a great trip down memory lane.
Perhaps I'm misreading the person to whom you're replying, but usefulness, while subjective, isn't typically based on one person's opinion. If enough people agree on the usefulness of something, we as a collective call it "useful".
Take the example of a blender: there's enough need to blend/puree/chop food-like items that a large group of people agree on the usefulness of a blender. A salad shooter, while a novel idea, might not be seen as "useful".
Creating software that most folks wouldn't find useful still might be considered "neat" or "cool". But it may not be adding anything to the industry. The fact that someone shipped something quickly doesn't make it any better.
Ultimately, or at least in this discussion, we should decouple the software’s end use from the question of whether it satisfies the creator’s requirements and vision in a safe and robust way. How you get there and what happens after are two different problems.
Well, every browser engine that is part of WHATWG. That's how working groups... work. The current crop of "not Chrome/Firefox/Webkit" aren't typically building their own browser engines though. They're re-skinning Chromium/Gecko/Webkit.
If that's what you "mostly" remember, your memory is awfully selective. It's totally fine for you to have a bias, but you're overlooking decades of massively successful products and services.
Having owned plenty of ThinkPads (Linux), Dells (Windows and Linux), and plenty of MacBook Pros, I can say Apple's hardware is so far beyond the rest. Having an OS with a BSD-ish experience is really nice as well. I've spent 27 years in engineering, and during most of that time I've gotten the random "Linux is far superior" and "I like Windows better" folks... but by and large, yes, Apple's tech has a ton of goodwill.
I don't get your comment: superiority in what? Are you comparing operating systems, or hardware, or the combined experience?
If you had asked me two years ago, I would have said something different about Linux than I would today, because I'm running a different distribution with a different desktop environment, and that changed my experience completely, even though I'm on basically the same hardware.
I run Linux in Apple hardware too, how does that rank in your comparison?
Hardware: Apple announced an ARM-based CPU and started shipping. It was _mostly_ a seamless experience thanks to Rosetta 2. The performance of these well-built machines was outstanding, and even the Intel-based machines before them had really strong performance. The machines themselves were, on average, among the most well-built. Yes, there were outliers with the butterfly keyboards. Yes, there were outliers with silly features like the Touch Bar. We're talking on average.
Software: Apple's OS is just a boring Unix that works. Yes, I realize the Unix is in name only, but on top of its Mach microkernel core, XNU really is a lot of BSD. Having the GNU tools available AND sound/fingerprint reader/a HiRes display that actually scales... that is still not the reality in Linux. (I still love Linux, btw; I keep multiple machines around the house running it.) So not having to spend a great deal of time fiddling with config files when I plug in an external monitor actually is a big deal. Most folks don't want the hassle of messing with pavucontrol just because they switched to their external audio setup. Most folks will appreciate that when they drag a window to that external monitor, HiDPI doesn't make the text go wonky.
So those are the areas where Apple is just massively superior. They nailed it in the "it just works" department. They've nailed it in the "quality hardware" department.
Windows also does fairly well in a lot of these areas.
As far as running Linux on Apple hardware? I had a buddy come into a meeting running Gnome+Ubuntu on his MacBook Pro back around 2017... as soon as he plugged into the projector, it was a mess. I'm sure it's gotten better since then.
Of course it does in the US tech bubble; if you talk to people who haven't been using Macs for 30 years, you might hear a different story. While Apple makes good hardware, they also have plenty of blunders, especially in recent years, much like Microsoft in its domain. Both are coasting on their past successes and familiarity. I get it, many of my coworkers watch their announcement streams like they're video game announcements. From my standpoint they haven't put out anything exciting since the iPhone/iPod Touch, but I don't have the money for toys that cost thousands of dollars apiece, like the Mac Studios or their VR headset, so maybe I'm missing out.
The VR headset was such a flop that I think it might paradoxically have not hurt their reputation. Like nobody is saying “wow, this Apple vision thing really sucks,” because nobody has seen one.
Also, no one cares about it, positive or negative, because it's such a nothingburger. No one even thinks Apple thought it would be big; it was an experiment, that's all.
But I've yet to meet a person who said, "Oh, Rachel and Chandler from Friends... maybe Windows IS cool!" It wasn't cool; it wasn't anything. Apple was trendy with the designers and creative types, and Windows was what you probably used at your humdrum day job. The only place MS has ever been "cool" is with gamers. I think your "Walmart" analogy is a perfect one.