Funny, I thought the whole point was to hold on to the bag as long as you can. Think back to the first time you heard about BTC or ETH, and how much return a modest investment would have made. It's the people who sold early that lost out.
You're not wrong, but has there ever been a time when buying & holding the 3 largest market cap coins wasn't a winning strategy?
Most of the dead coins were very low cap, and low-cap investments are inherently risky. Most of the rug pulls were schemes around small coins, or individuals not actually holding the private keys to their wallets.
I think I took the PRAM battery and hard drive out because both were dead. But the RAM I can't remember what happened to; either I or someone else might've scavenged it for another project
Hi, retro computing person here. I've had a similar debate with Rust evangelists in the past
Something the Rust community doesn't understand when they shout "REWRITE IT IN RUST!" is that, at a certain point, that's simply not possible
Those mainframes your bank runs? I'm sure they'd love to see all that "awful" FORTRAN or C or whatever other language rewritten in Rust. But if Rust as a platform doesn't support the architecture? Well, then that's a non-starter
Worse still, Rust seems to basically leave anything that isn't i686/x86_64 or ARM64 as "Tier 2" or worse
This specific line in Tier 2 would send most project managers running for the hills: "Tier 2 target-specific code is not closely scrutinized by Rust team(s) when modifications are made. Bugs are possible in all code, but the level of quality control for these targets is likely to be lower"
Lower level of quality control when you're trying to upgrade or refactor a legacy code base? And the target is a nuclear power plant? Or an air traffic control system? Or a bank?
The usual response from the Rust evangelists is "well then they should sponsor it to run better!" but the economics simply don't stack up. Why hire 50 Rust programmers to whip rust-m68k into shape when you can just hire 10 senior C programmers for 20% of the cost?
EDIT: Architecture, not language. I need my morning coffee
>Those mainframes your bank runs? I'm sure they'd love to see all that "awful" FORTRAN or C or whatever other language rewritten in Rust. But if Rust as a platform doesn't support the architecture? Well, then that's a non-starter
But Rust does support S390x?
>Worse still, Rust seems to basically leave anything that isn't i686/x86_64 or ARM64 as "Tier 2" or worse
Rust has an explicit documented support tier list with guarantees laid out for each level of support. Point me to a document where GCC or Clang lists out their own explicit guarantees on a platform-by-platform basis.
Because I strongly suspect that the actual "guarantees" which GCC, Clang and so forth provide for most obscure architectures are not much better than Rust's, if at all - just more ambiguous. And I find it very likely that the level of quality control for C compilers on m68k or alpha or s390x is, in practice, at least a bit lower than that provided for x86 and ARM.
How extensive is GCC's testing on s390x, and do they hard-block merging of all patches on s390x support being 100% working, verified by that test suite in a CI that runs on every submitted patchset? Or do they at least hard-block releases over failing tests on s390x? Do they guarantee this in a written document somewhere?
If they do, then that's great, they can legitimately claim to have something over Rust here. But if they don't, and I cannot find any reference to such a policy despite searching fairly extensively, then GCC isn't providing "tier 1"-equivalent support either.
I work for Red Hat, so I'm well aware that there are people out there who care a lot about s390x support and are willing to pay for it. But I suspect that the upstreams are much looser in what they promise, if they make any promises at all.
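For anyone who wants to kick the tires, the support level for a given architecture is easy to poke at from the toolchain itself. A quick sketch, assuming a rustup-managed install (s390x-unknown-linux-gnu is a Tier 2 target with prebuilt artifacts at the time of writing):

    rustc --print target-list | grep s390x        # the triple exists: s390x-unknown-linux-gnu
    rustup target add s390x-unknown-linux-gnu     # fetch the prebuilt standard library
    cargo build --target s390x-unknown-linux-gnu  # cross-compile (you still need an s390x cross-linker)

Tier 3 targets like m68k-unknown-linux-gnu ship no prebuilt artifacts, so that rustup command fails and you're into build-std territory - which is exactly the distinction the tier document makes explicit.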
> This specific line in Tier 2 would send most project managers running for the hills: "Tier 2 target-specific code is not closely scrutinized by Rust team(s) when modifications are made. Bugs are possible in all code, but the level of quality control for these targets is likely to be lower"
Are you operating under the assumption that the largely implicit support tiers in other compilers are better? In other words: do you think GCC’s m68k backend (to pick an arbitrary one) has been as battle-tested as their AArch64 one?
(I think the comment about evangelists is a red herring here: what Rust does is offer precision in what it guarantees, while C as an ecosystem has historically been permissive of mystery-meat compilers. This IMO doesn't scale well in a world where project maintainers are trivially accessible, since they now have to field bug reports for platforms they can't reproduce on and never intended to support to begin with.)
> do you think GCC’s m68k backend (to pick an arbitrary one) has been as battle-tested as their AArch64 one
m68k might be a bad example to pick. I was using gcc to target m68k on NetBSD in the mid-1990s. It's very battle-tested.
Also, don't forget that m68k used to be in every Mac Apple sold, before they switched to PowerPC (and later x86 and the current ARM chips). You could use gcc (with MPW's libs and headers) on pre-OS X (e.g. System 7) m68k Macs.
> m68k might be a bad example to pick. I was using gcc to target m68k on NetBSD in the mid-1990s. It's very battle-tested.
That was 30 years ago! Having worked on LLVM: it's very easy for optimizing compilers to regress on smaller targets. I imagine the situation is similar in GCC.
(The underlying point is simpler: explicit is better than implicit, and all Rust is doing is front-loading the frustration from "this project was never tested on this platform but we pretend like it was" to "this platform is not well tested." That's a good thing.)
Is... this an AI-generated summary of a legal case between OpenAI and Musk?
The famous "IANAL" (I Am Not A Lawyer) applies so much harder in this instance. Can we get an actual lawyer to lay out the facts? Not an AI-generated slurry
Coreboot seems like such a great project. Until you actually try to use the damn thing
I wanted to set up some old OptiPlexes with Coreboot for a Windows XP LAN party. I ended up going through several motherboards with bad flashes, of both Coreboot and Libreboot
Eventually I rolled up in their IRC and basically asked what the hell I was doing wrong.
"oh yeah we disabled [some feature]. You need to dump a bunch of firmware blobs first and then add those to the build"
Even after doing that, though, I found I was only able to boot SeaBIOS (and nothing else!), and Windows would immediately crash and burn trying to load in any capacity. Linux worked "fine"
More annoyingly, it was slower than the vendor firmware, and SeaBIOS would hang on an LSI SAS card I had inserted (I guess while trying to run the card's option ROM?). That was a problem because I wanted to use the machine to flash the card to "IT mode" (aka JBOD mode)
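In case anyone else goes down the same road: the blob-dumping step, as far as I can reconstruct it, looked roughly like this on those Intel boards (commands from memory, so double-check against the coreboot docs):

    flashrom -p internal -r backup.rom   # dump the vendor image while the board still boots
    ifdtool -x backup.rom                # split out flashregion_*.bin (flash descriptor, Intel ME, etc.)

Then you point the blob options in coreboot's menuconfig at the extracted descriptor and ME files before building.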
Does anything like this exist for the original Age of Empires games?
Battle.net has "PVPGN", which covers Diablo II up through Warcraft III and (sort of) the Westwood Online games, but searching for an AoE equivalent turned up nothing
I LANed them "online" many moons ago using software that created a virtual LAN; we all installed it and managed to find each other in-game. "Hamachi", I think?
Check out Voobly. It seems to be a pretty big hub for people playing the original games - I think mostly patched versions with some nice features added.
I believe the originals just supported offline LAN play. I remember getting my friend to have his modem dial my modem so we could play together, no Internet connection required.
From memory, there was a patch that added a new-fangled option "internet" as well as LAN play. I remember the days of fiddling with port forwarding and then calling a friend over the telephone to read out my IP address for them to type in.