
Early on, the process of building physically accurate "fake humans" was not really possible.

Bear tests were probably more an early proof of viability than anything else. The bears survived, proving that humans wouldn't be completely annihilated.

Animal testing cruelty wasn't really being thought about heavily in post-WWII military aerospace development. It's unfortunate, but it stands as a historical footnote reminding us of the importance of proper testing.


>Animal testing cruelty wasn't really being thought about heavily in post WWII military aerospace development

Considering that the endpoint is machines killing actual people, it would be the height of hypocrisy, in my book, for them to care about "animal testing cruelty".


I don't (looking at the modern past and the foreseeable future), but if you subscribe to the idea of a 'just war'[0] it would make sense.

0: https://en.wikipedia.org/wiki/Just_war_theory


By "actual people" do you mean "who cares about animals dying in horrendous ways when people will be dying?"


More precisely: if a project is about building machines to kill people, then the fact that it might kill some animals during testing is the least of its moral problems.


Logging the max acceleration in x, y, z would have been better than throwing a bear out of a plane. Knowing that a bear survives is a ridiculous data point, and they knew it. Ethics are orthogonal; cadavers would have been a better choice.


What can you compare that acceleration data to? Put a human on a shaker table and measure his brain damage? At some point you have to damage humans or animals to find out their limits.


You would now have numerical values to compare, rather than the MILSPEC 'dead_bear' units. Nowhere did I make an ethical argument against animal trials or say that we should do experimentation on humans. But launching bears out of an ejection seat was stupid even then.


Logging what with what, in 1962?


A pencil on a spring is enough to measure maximum acceleration; I'm sure that by '62 they had something at least as sophisticated.


Imagine if they had the advanced capability of moving the paper under the pencil; they might even get acceleration over time.
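
To put numbers on the joke: a pencil on a spring is just a crude seismograph-style accelerometer, and Hooke's law turns pen deflection into acceleration (a = k*x/m). A toy sketch in Python; the spring constant, proof mass, and deflection trace below are all made-up illustrative values, not data from any real test:

    # Toy spring-and-pencil accelerometer: convert pen deflections into g's.
    # K, M, and the deflection trace are assumptions for illustration only.
    K = 500.0   # spring constant, N/m (assumed)
    M = 0.05    # proof mass driving the pencil, kg (assumed)
    G = 9.81    # standard gravity, m/s^2

    # Pen deflections in metres, one sample per tick of the moving paper.
    deflections = [0.0, 0.002, 0.011, 0.025, 0.018, 0.006, 0.001]

    # Hooke's law: F = K*x, and F = M*a, so a = K*x / M.
    accel_g = [K * x / M / G for x in deflections]

    for t, a in enumerate(accel_g):
        print(f"sample {t}: {a:5.1f} g")
    print(f"peak: {max(accel_g):.1f} g")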


In 1962 they would have had multi-channel magnetic tape (analog) recording. The first flight data recorders date to the early 1940s.

Fitting it in an ejection seat might be another challenge, but I'm sure they would have developed something before moving to live subjects.


Might these proto-engineers have also harnessed radio waves to transport signals across the aether, for a form of telemetry?


He meant actually putting a wooden log inside the capsule and observing the marks on it after it returned to earth.


As someone with long-term health conditions: it hangs in the back of your mind. I'm fortunate enough not to have to call mine "chronic", and I may (probably will) be blessed by the results of research from over a decade...

But it haunts me in every relationship I have, even if it's irrational.


No, it's a bug. A home/consumer router isn't targeted at power users, and should not require the average user to understand what they're doing.

I also think it's unreasonable to expect people to patch bugs in consumer gear they've purchased.


I don't think those goals are actually contradictory: cryptographically verified auto-updates and manually acknowledged unsigned-file-upload updates can coexist peacefully.
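
As a minimal sketch of how the two paths could coexist, assuming an Ed25519 vendor key and Python's cryptography package; the function names and the flash() stand-in are hypothetical:

    # Sketch: auto-apply only vendor-signed firmware; unsigned images need an
    # explicit manual acknowledgement. All names here are illustrative.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # In reality the private key stays with the vendor and only the public key
    # ships in the router; we generate both here so the sketch is runnable.
    vendor_key = Ed25519PrivateKey.generate()
    vendor_pub = vendor_key.public_key()

    firmware = b"firmware image bytes"
    signature = vendor_key.sign(firmware)  # produced on the vendor's build server

    def flash(image: bytes) -> None:
        print(f"flashing {len(image)} bytes")  # stand-in for the real write

    def auto_update(image: bytes, sig: bytes) -> bool:
        """Auto-apply path: refuse anything the vendor didn't sign."""
        try:
            vendor_pub.verify(sig, image)
        except InvalidSignature:
            return False  # never auto-apply an unsigned or tampered image
        flash(image)
        return True

    def manual_update(image: bytes, user_confirmed: bool) -> bool:
        """Unsigned upload path: requires an explicit user acknowledgement."""
        if not user_confirmed:
            return False
        flash(image)
        return True

    assert auto_update(firmware, signature)            # signed image: applied
    assert not auto_update(b"evil image", signature)   # tampered image: refused
    assert manual_update(b"my custom build", user_confirmed=True)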


Or, you're using Snapchat, or Instagram, or Signal.


Or any other IM program (including text messages). Or getting directions. Or checking the departure time of a bus. Or switching the song that's playing. Or turning my Hue lights on/off. Or checking my account balance. Or paying with the phone.

Each of those actions should take no more than a couple of seconds (occasionally, a couple dozen). And they can, if your phone isn't lagging out after the lockscreen or when trying to load the on-screen keyboard.


All these things work just fine on these older devices. The pauses the original message mentioned are mostly encountered when starting or switching between applications; once an application is running, things work as intended. I use Telegram on that Motorola Defy I mentioned, no problems. It runs navigation apps (Navigon, OsmAnd~) without problems. It takes photos of reasonable quality, and those photos can be edited on the device. I use it to play music on the device itself using Dsub (a Subsonic client) or Apollo, and to control remote players using MPDroid (which controls mpd (music player daemon) on remote devices). It plays video from the likes of Youtube and Vimeo just fine. I use it to read books and publications, no problems. I even use it as a telephone every now and then...

It can take a few seconds to switch between any of these apps, especially when there are several of them running in the background. Would a new device be faster? Sure it would. Will I buy a new device sometime in the future? Sure, when this device kicks the bucket or another device shows up which offers the same feature set (good performance (compared to current devices), good battery, waterproof and sorta-shockproof yet still looks like a normal device instead of some prop from a B-movie). Do I feel like I'm missing out on something by using a 6 year old phone? No, I do not.


Well, if you use a 6-year-old phone and it works for you, then by all means stick with it. I would, too.

In my case it's not about chasing the newest features and highest resolutions - it's about the certainty of getting a quality product. I no longer want to risk getting a shitty, laggy phone, or a phone that turns into one after a few months. I consider it not worth the frustration it causes in daily usage.


I disagree - I've used a $300 stock Android phone for just over a year now, and I haven't really run into any "cruft" issues. But poor performance is a very valid problem with Android.


Agree with this - cheap Android phones are what they are, but mid-range hardware is astonishing and if you don't install clutter, you don't get clutter.

(I even remove/disable any Facebook clients etc., and only use Chrome to access those services. Perhaps this is why I'm happy.)


I have a Nexus 5X, and then I bought my wife a Moto G4 Plus; I was really impressed with her phone, which was a good amount cheaper than mine.

We've both had them for a while now, and they are both good phones, but I think I'll spend less on my next phone, since I can do that and still get something that works very well.


Yes, $250K counts as rich. You can "get by" on a combined income of $150K extremely easily. Maybe you don't buy a new BMW every year, or you live in an untrendy neighborhood. Oh, no.


Signature-based anti-virus is a must-have on any widely deployed platform that doesn't have code-signing requirements by default. So, basically, Windows and the FOSS desktop.

But it's become so drastically commoditized that there's no reason for the average user to run anything but the built-in MSE (on Windows, at least).

It doesn't stop new attacks, but it does help raise the bar against malware.
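
For a sense of what "signature-based" means at its simplest, here is a toy sketch; the known-bad set below is a placeholder (real engines match byte patterns and heuristics, not just whole-file hashes):

    # Toy signature scanner: flag files whose SHA-256 is in a known-bad set.
    # The set below holds one placeholder digest (the hash of an empty file);
    # a real engine would ship millions of byte-pattern signatures instead.
    import hashlib
    from pathlib import Path

    KNOWN_BAD = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def scan(directory: str) -> list:
        hits = []
        for path in Path(directory).rglob("*"):
            if not path.is_file():
                continue
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_BAD:
                hits.append(path)
        return hits

    for hit in scan("."):
        print(f"signature match: {hit}")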


I believe "Widely deployed platform" is the issue here.

A yummy target for virus and ransomware authors is a widely used piece of software: OS, browser, crypto library, word processor, spreadsheet, PDF reader, ...

Part of the problem is that in each of these categories, a single vendor often holds over 50% market share. As soon as a bug in one of those allows an RCE, that means millions of users at risk.

It's also true even if you don't run the software directly but use a service: memes that infect social media (see: Facebook and fake news) are basically viruses too.

Species avoid extinction from viruses thanks to diversity. Software users that want to stay safe should consider using the less popular alternatives.


> Part of the problem is that in each of these categories, a single vendor often holds over 50% market share. As soon as a bug in one of those allows an RCE, that means millions of users at risk.

> Species avoid extinction from viruses thanks to diversity. Software users that want to stay safe should consider using the less popular alternatives.

This point is badly under-discussed whenever this AV debate comes up.

Yeah, the variety of vendors, products, and methods in the third-party AV arena makes production less predictable for software developers. That's the point. It makes it exactly as unpredictable for attackers.

Should AV vendors work harder to make their software easier to develop around? Arguably, yes. Security through obscurity is no security at all. But that should be the target of the argument, not the homogenization of security systems. I don't care how big or small the Defender attack surface is if every single desktop computer in the world has the exact same attack surface.


> It doesn't stop new attacks, but it does help raise the bar against malware.

It also provides a giant new attack surface.


I disagree. He's been transparent about his goals, and people (besides a loud but relatively small number of hard dissenters) don't seem to object. And by that, I mean: they use Facebook.


Sudoroom and Noisebridge are still alive.

The fact of the matter is that, for all of the innovation you hear about there, it's really just romanticizing the margin-thinning of existing products. Nothing wrong with that, but it's easy to romanticize.


Part of the problem is this: who'd fork it? Who'd put in the effort? Sure, they alienated key members, but the people doing the bulk of the work were at the core, including Leah.

So, it's not so simple as forking and coming back when the dust has settled.


There were actually enough pissed off core members to pull off a fork - what was missing was a leader.

We'll see how all this ends up - but given Leah's track record - the whole "leaving the GNU project" episode, the allegations, her aggressive behavior towards competitors - it doesn't instill a lot of confidence in me that the dust has settled for good.

Look at this, for example - censorship happened just 3 days ago, and they only reverted it after community backlash: https://trisquel.info/en/forum/libreboot-issues-open-letter-...


If someone was missing, then there weren't enough people.


It's a bit nuanced, but I guess you're right. There's the librecore thing - but AFAIK they didn't reach out to co-opt the other developers, they didn't publish their own builds, and the inertia was lost. The straw that would break the camel's back is for Leah to piss off Paul Kocialkowski, as he's the guy who wrote a big chunk of libreboot (especially the newer parts of the system, with the Chromebooks). But time will tell.

