
Recently it's got really bad though. The taskbar is badly broken.

* If you use auto-hide, it won't show when some applications are open. Edge in particular is bad.

* Some applications simply don't show on the taskbar at all. Teams is one. It's in the alt-tab list.

* Sometimes it stops working entirely.

The testing and QA of this stuff appears to be largely absent.


And also in QtXmlPatterns (now also retired).

Just for the record, Xalan-C is even less maintained than libxslt. It had no releases for over a decade, and I made a final 1.12 release in 2020 that added CMake support (since the existing builds had bitrotted significantly) along with a number of outstanding bugfixes.

I initiated its removal to the Apache attic in 2022 in https://marc.info/?l=xalan-c-users&m=165593638018553&w=2 and the vote to do this was in https://marc.info/?t=166514497300001&r=1&w=2&n=20. It has now gone nearly four years without any commits being made.

It's a great shame we are now in a situation where there is only a single proprietary implementation of the very latest version of the standard, but even the open-source 1.x implementations are fading fast. These technologies have fallen out of favour, and the size and complexity of the standards is such that it's a non-trivial undertaking to keep them maintained or create a modern reimplementation.


It's more than generous. You can run it with much less resource utilisation than this. It only needs a few tens of kilobytes of flash (and you can cut it right back if you drop bits you don't need in the library code). 32 KiB is in the ballpark of what you need. As for RAM, the amount you need depends upon what your application requires, but it can be as little as 4-8 KiB, with needs growing as you add more library code and application logic and data.

If you compare this with what MicroPython uses, its requirements are well over an order of magnitude larger.


The main source of complexity isn't the .deb format, but the tooling and infrastructure around the format. It's mired in overcomplexity, and it's very much still in a '90s mindset of building locally with multiple layers of Perl-based tools. If it were rethought to be git-native, using Docker images or equivalent, it could be of equivalent simplicity to other contemporary systems. When I look at what you can do with the FreeBSD ports and Poudriere, or with Homebrew and other systems, I see how much of the complexity has been added incidentally and incrementally, with good intentions, but a radical rethink of the basic workflows is necessary to consolidate and simplify them.
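
To make that idea a bit more concrete, here's a rough, hypothetical sketch (not an existing Debian tool; the image name, paths and package name are made up) of what a git-plus-container build could look like, driving a clean throwaway environment from a checked-out source tree:

    #!/usr/bin/env python3
    # Hypothetical sketch of a git-plus-container package build; not an
    # existing Debian tool.  The image name and paths are assumptions.
    import subprocess
    import sys

    IMAGE = "debian:stable"              # assumed base image
    WORKSPACE = "/path/to/workspace"     # directory containing the git checkout
    PACKAGE_DIR = "mypackage"            # name of the checkout inside WORKSPACE

    def build_in_container():
        # Run the standard Debian build inside a throwaway container.
        # dpkg-buildpackage writes its artefacts to the parent of the source
        # directory, i.e. back into the mounted workspace on the host.
        # (Assumes the image has deb-src entries enabled for build-dep.)
        cmd = [
            "docker", "run", "--rm",
            "-v", f"{WORKSPACE}:/build",
            "-w", f"/build/{PACKAGE_DIR}",
            IMAGE,
            "sh", "-c",
            "apt-get update && apt-get -y build-dep . && "
            "dpkg-buildpackage -us -uc",
        ]
        return subprocess.call(cmd)

    if __name__ == "__main__":
        sys.exit(build_in_container())

The point isn't the specific commands, which would need refinement, but that the clean-environment, dependency-installation and build steps collapse into something a CI system can run directly from a git checkout.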

[I used to maintain sbuild and was the author of schroot back in the day]


Honestly, I found that one of the most user-hostile workflows they implemented to date. It's really obnoxious.

The number of times I've wanted to save in their native XCF file format is... zero. But I always want to save in a standard image format, and I don't really consider that to be exporting, just saving.

I understand why they wanted this, but I don't think many of their actual users did.


They do that to preserve data. If you’re making a complex image with all sorts of layers and masks and then you save to a JPEG, you lose all that information as the image is flattened and compressed. Saving in the native format lets you be able to open the file again at a later time and resume working without losing any data.

Users would be seriously upset if they made JPEG the default and the native format a buried option. People would be losing data left and right.


Saving as XCF still loses the undo history, so it's really a question of which information is lost and how much. Meanwhile, if you have a single-layer image and export it to PNG, which preserves just as much relevant information as saving it as XCF, GIMP will still complain about unsaved data if you try to close it. Absolutely infuriating behavior that no real user ever asked for.


Affinity does the same thing; I don't remember about Photoshop.

The obnoxious thing is separating "save" and "export" into different menu items. Much (most?) software lets you choose "save as" (including saving as a different format) from the regular File/Save dialog. But Affinity Photo (and apparently GIMP) forces you to cancel out of the Save dialog for the millionth time and go back to the File menu and choose "Export." It's annoying and unnecessary.


I don’t know, pretty much all production software I’ve ever used has made a distinction between export and save. Export takes compute and can change the output; not all formats are created equal.

Saving in the internal format is probably rare if you’re just a user, but if this is a 40 hour a week job, then the compute time savings and potential disk space saving from doing that might be worth it.


The problem is not being able to make the save/export decision from the same dialog. A lot of software lets you do "save as" and pick a different format AFTER you go down the File/Save path.

Having to cancel out of File/Save and go back to the File menu and choose File/Export, over and over and over in software that defies this convention, is incredibly irritating.


It really depends upon the target market. That's fine for hobbyists. But I use the Bambu X1 for small-scale prototyping in a company, and it has to be usable out of the box. We can't justify an entire week of labour for each printer we buy.

The Bambu has been ideal for that reason. Every material pretty much just works, and the quality is excellent. The cloud integration and janky LAN mode are the downsides, and this current topic even more so.


Yeah, I've got an A1 that I bought on sale. It's sitting next to a Prusa MK3S. I was doing prints for my nephews for Halloween and the A1 would do a print in 2h and PrusaSlicer estimated 9h for the MK3S. And I have, so far, not had a single failed print on the A1. They're rare on the MK3S too. But... the MK3S is "start the print and it'll be ready in the morning" and the A1 is "start the print and it'll be ready by lunch, and if you need to iterate you can have another one done by 3pm"


sbuild is at least 25 years old; it might be nearer to 30 at this point. I did a lot of cleanup of it during the mid-2000s, including adding the schroot support that is being removed here, and that included abstracting the virtualisation backends, which allowed newer solutions to replace it. The original had to run as root to do a chroot(2) call, and it reused the same environment between builds. schroot removed the need to run as root, and it could provide a clean snapshot for each build. But it too is now rather dated--though the replacement tools often lack some of the features it has, like being able to build from ZFS cloned snapshots or Btrfs snapshots, or even LVM snapshots. We wanted it to be fast as well as safe, and the time-to-build and time-to-clean-up using these was and is excellent.
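
For anyone unfamiliar with the privilege problem mentioned above, here's a minimal hypothetical sketch (not sbuild or schroot code; the path and command are made up) of why a naive chroot-based build wrapper has to run as root:

    #!/usr/bin/env python3
    # Minimal hypothetical sketch: why a chroot-based build wrapper needs
    # root.  Not sbuild/schroot code; the path below is made up.
    import os
    import subprocess
    import sys

    BUILD_ENV = "/srv/build-env"    # hypothetical unpacked build chroot

    def run_build(command):
        # chroot(2) is only permitted for privileged processes (root or
        # CAP_SYS_CHROOT); as an ordinary user this raises PermissionError,
        # which is the limitation schroot was written to work around.
        try:
            os.chroot(BUILD_ENV)
        except PermissionError:
            sys.exit("must be root to enter the build chroot")
        os.chdir("/")
        # From here the build sees only the clean environment.
        return subprocess.call(command)

    if __name__ == "__main__":
        sys.exit(run_build(["dpkg-buildpackage", "-us", "-uc"]))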

Replacing key infrastructure is hard, particularly when it's working well and the cost of replacement is high, especially so when it's volunteer effort. It was cutting edge back when Perl was all the rage, but while Perl might no longer be the fashionable choice, tools written in Perl continue to work just as they always have. I found it dense and impenetrable even 20 years back, and I used to have the whole thing printed out on fan-fold paper from a dot matrix; it was about 3/4" thick and covered in annotations! Understanding several thousand lines of obscure regexes is a cautionary tale in how to write unmaintainable Perl, and was one of the motivations to clean it up, refactor it into meaningful functions, and document what it was doing so that it could be maintained with a little less effort.


Good to see schroot is being replaced in sbuild, given that it's nearly 20 years old and no longer a cutting-edge solution to the problem of creating build environments. Can't believe time flies that fast since I wrote it in 2005!

Has an outright replacement for sbuild itself been considered? Given the ubiquity of modern CI systems, it seems quite possible to replace it entirely. I would have thought GitLab runners or Jenkins or any other standard CI system would work well. While I once enjoyed writing Perl, sbuild is not a great piece of code, and the problems it was trying to solve now have better and more widely-used solutions which don't require ongoing maintenance of an even older piece of Perl code.


And Siemens SolidEdge is totally free for hobbyist use.


It's because these tools are used to deliver high quality physical products on time and on budget with no scope or interest in messing around with software defects. These tools are treated like physical tools. It's worth paying for a high quality one which won't cause unexpected grief, by people who will provide immediate support because we pay them.

I think if there were good quality open source equivalents they would be considered, but they pose a huge risk, possibly even an existential risk, if they derail our development plans unexpectedly. Paying a lot of money for seriously good quality tools reduces that risk dramatically.

I've had a brief look at FreeCAD, and it's got a lot of potential. But when you compare it with SolidWorks, OnShape or SolidEdge, there's clearly a huge gap in usability and capability which needs closing before a lot of people will be able to consider it seriously. I'm sure it will eventually get there, like KiCAD did, but it will take many years and a lot of investment to get the usability, polish and featureset up to parity. It looks like Ondsel did a really good job to make some progress along that path.


> when you compare it with SolidWorks, OnShape or SolidEdge, there's clearly a huge gap in usability and capability

All three of these use the same geometry kernel: Siemens Parasolid.

Most open-source CAD software uses OCCT (Open CASCADE Technology).

It’s the kernel that brings a lot of the capability. Check out Plasticity (https://www.plasticity.xyz/) for an example of a single developer's implementation built on the Parasolid kernel.


I tried plasticity and found it unusable. I couldn't fathom how to design anything without dimensioning!


It's not even a question of 'good' so much as 'the same as everyone else'. CAD drawings pass through a huge number of hands/companies in the process of getting physical goods made, and any slight compatibility problem can turn into huge costs and lots of blame to go around.

It's very much the case that everyone in the supply chain switches over, or nobody does.


I agree, although I feel like one could easily make the exact same comments regarding compilers. I can't quite pin down why there would be many free (for various values of free) industry-standard compilers but not CAD programs.


Because those compilers are mostly sponsored by OS companies, in part to outsource their software development costs.

Also notice that in scenarios where liability is actually imposed, as with physical goods, most compilers are closed source, proprietary, and certified.


Open Source works best for building blocks of software, not for end products. Companies have incentives to share their libraries and tooling, not so much for the final product like a CAD program.


Software engineers find compilers interesting and useful.

