
> So while doctors need to be able to talk to laymen, the chemists working in medicine factories don't.

But they for sure need to talk to lawyers, accountants and doctors occasionally. All of those (especially the doctors /s) are laymen when it comes to chemistry.


You have frontline and backline chemists as well. Some chemists need to be able to talk to people in less technical roles, while other chemists specialize in improving the process or other technical skills.

So it isn't laymen/specialists; there are many, many layers, with people dumbing it down a bit at every step. Telling the specialists at the bottom of those layers that they need to be able to talk to the top of the layers is just nonsense. You need to be able to talk to people who are less technical than you and to people who are more technical than you (the layer above and the layer below you), but that is it. It can help to be able to bridge more layers, but it isn't that important.

The problem with programming is that almost all those layers have the same name: software engineer.


You are missing one of the great untold truths of engineering: non-technical people can be just as brilliantly intelligent.

They just don't speak your language or have your experience.

You do not particularly need to dumb it down. You do need to think about which things they actually need to know, and provide some backstory that helps them contextualise it.

Get good at this and your life will be enormously better. Keep this attitude and you will find your world shrinking.


I agree wholeheartedly with this. I have lost count of the number of times I have found a solution to some problem after explaining some technical detail to a layman who then suggested something I wouldn't have thought of.


This was my telephone life with my Dad for thirty years, all of his later life and almost two thirds of my life, even well into his dementia (because his memories of his professional career were really untouched by it).

After only one month I already miss it enormously.

I am very glad you find these calls rewarding and I am certain she does too. I am going to have to find someone to fill this role in my own life again.


I'm sorry for your loss, and I am glad that you had a fulfilling relationship with your dad.

It sounds like you are steadily strolling the road of healthy grieving, I'm sure he would be proud.

Keep your head above the water my friend.


Thank you.

And yes -- navigating the line between the endings/beginnings bit, the loss (which it is), and tragedy (which it isn't) is difficult, but this time around I am finding it easier.

One of the things I have already realised is that explaining-stuff-to-my-Dad is portable. I can do it in my head. And when I can do that without tears, I'll be able to add my Dad to any audience in the future, and hear his questions as well as theirs.


I’m terribly sorry for your loss. I very much appreciate your reply, and I hope you find the person you want/need in this role.

If I may be presumptuous as a stranger who’s grieved several special relationships to offer advice, please try to catch yourself if that pursuit/search feels like it’s looking for a substitute for your Dad the person. Whichever relationship like that comes next will be both familiar and unusual. It might still be worth pursuing even if it doesn’t feel right at first.


All very deeply true.

Many moons ago I worked in a briefly-successful UK "dot com" integrator, and then took a break from work for personal reasons.

When I came back, I rejoined in the design department, rather than return to a role in engineering, where I had been mostly front-end. (I described myself whimsically as a "pet engineer".)

What we realised then is that the design team needed an engineer on their side of the divide to act as a go-between with implementation, but also to translate requirements in both directions, explaining what each side thinks (as well as urging some respect for the designers' craft).

Nowadays this is reasonably commonplace but at that time it was pretty radical.

Technical teams often make the mistake of thinking that their knowledge, their language, and their problems are supersets of or at least the essence of the problems of the business. They are quite wrong.


And they are not good at certain kinds of colour subtlety -- even the Foveon sensors struggle with the colours of dim, diffuse light (sunsets etc.) and deep muted colours (the grey-green of ivy leaves for example).


Amstrad's last great PC series (the 3x86 series made with very standard components, unlike the unusual 2x86 series) used this strategy for the 3386, I think.


Enjoyably deadpan :-)


Well... you are eliding the one, rather decisive way TSMC could indeed end up in Chinese hands.

We just have to hope that the Ukraine conflict has convinced China not to try to pull it off. We certainly would not be able to find economic sanctions that could possibly work if they did.


> Well... you are eliding the one, rather decisive way TSMC could indeed end up in Chinese hands.

The only thing that can end up in Chinese hands is the rubble of what used to be TSMC facilities.


You need three things to keep TSMC operational: facilities, supply chain and people.

Ukraine has shown that those can evaporate in mere days.


True, but that doesn't imply that TSMC will be taken over by China.

In Ukraine, the territory is what Russia wants (minerals, crops) but in Taiwan it's the "human capital" that's desirable, and the latter isn't so susceptible to invasion.

It's likely that Russia, with ten times the armed forces of Ukraine, will devastate Ukraine, unfortunately. But China is more likely to hold off devastating Taiwan.


If things go any worse in Shanghai, that might be a more realistic demotivator.


AMD does ARM stuff and has had plenty of time to get it right.

Intel has had, what, three, four bites of the RISC cherry? They've owned big chunks of the ecosystem Apple now exists in. Evidently one of their pitches involves RISC-V and specifically SiFive.

I think it's a mistake to see the future as Apple vs Intel or AMD; that is the battle of the first -- nearly over -- quarter of this century.

The future is Apple vs Samsung and some Chinese megacorps we don't necessarily recognise. Because the battle is now moving on from large-scale off-the-shelf products and chips for motherboards to SoCs and large-scale bespoke design.

It will be the speed and performance of native apps on Apple and Samsung ARM silicon vs the freedom of choice and universal portability of WASM on RISC-V and the-long-tail-of-every-other-architecture.


I think Intel is too late. They can play catch-up with the ARM market, but it will take time, because there are enough other players that they're not relevant enough to be on top again. That is unless they can out-innovate the competition, which I doubt, but who knows.

Over the past few years I've read articles and watched videos on Intel making some kind of comeback, yet the journalists creating those pieces seem either oblivious to RISC or to be ignoring it; I don't know which. That tells me all I need to know, and so far it isn't looking like Intel is the future for consumer electronics.


Well. I think x86 is dead and Intel just doesn't know it yet. Might take a decade for them to realise.

Intel as a business? Maybe as a design house they will struggle to catch up. As a foundry? If they do not catch up with the amount of government support they are going to be able to claim, then there is something very wrong.

Most chip designers have at least one way forward, which is to produce chips that run WASM well on Linux or a BSD. This is actually a good time for minority architectures, because a truly portable architecture is on the horizon, and everyone has an incentive to see it work; that incentive being not losing all their business to Apple, Samsung and a few Chinese firms whose names we don't recognise yet!


>Well. I think x86 is dead and Intel just doesn't know it yet.

Let me toss that into my list of sentences that will age poorly, alongside "This is the year of gaming on the Linux desktop" and "surely nothing will go wrong if I plug this SCADA system into the Internet".


There was a small measure of satire/hyperbole in what I said. But you have separated that one sentence from the second sentence that provided context, so I don't think I particularly need to defend it on its own.

Perhaps I should have used a semicolon to defend against you.

The point was that in ten years' time, I think people will look back and say: 2022 was the year x86 really ended as a forward-looking architecture. Ergo: dead, they just don't know it yet.

But to expand: one of the things that is very striking about tech stuff is how quickly the energy in any one system can deflate.

Looking at the Apple competition, do you think buyers will think:

1) Intel simply need to make x86 a bit better to compete, or

2) It is about time we started looking at architectures that can scale up to compete with Apple Silicon on all levels (absolute CPU power, power consumption, mobile to server)

Architecturally, x86 is in serious trouble, and this isn't just something I -- a random non-chip-designer -- think. It's what Intel think, or their entire recent push would not be to become a fabricator for other chip designs, built on an unhealthy amount of anti-Asian supply chain FUD.


Once again, no. For the price of a Mac Pro, you get x86 hardware that simply performs better. Despite HN having a hardon for "b-b-but it's only 85W", it doesn't matter to a very large portion of users. I don't give a single crap if my workstation uses up 600W to run a 5900X and an RTX3070 (or, well, if I could find one, but shhh). I merely care about raw power. My electricity bill is a non-factor.

And that's just with something of "equivalent power" (not even taking into account the fact that people don't give a shit about OSX if it doesn't run their software/games/etc). I can get some x86 hardware that runs an M1 into the ground easily, and given that all the current gen is on old processes, upcoming x86 hardware will, once again, be not just competitive, but better.

Then Apple will come out with the M2, reap all the benefits that come from being a non-upgradable SoC with soldered RAM and soldered CPU, maybe be on top of very select benchmarks for a while, etc. As with every time Apple has come out with "revolutionary new hardware", what it mostly meant is "we slapped TSMC with a load of cash and got their latest and greatest process".

So, to respond, buyers will:

1/ Not give a crap about whether it's x86 or Apple Flavored ARM (read: an ARM with undocumented extensions that is pretty much CISC), but merely look at price, in which case Apple buyers are an extreme minority. A $500 laptop with a mobile Ryzen is more than enough for the average user, and most people cannot afford a $1200+ MacBook Air anyway.

2/ Not care about power consumption, because it does not matter to a boatload of people. Get out of your bubble: no one outside of software developer nerds cares if they have to plug in their laptop once every four hours. Most people have lunch breaks. Is it a great thing to have in a laptop? Absolutely, and Apple has done great in that regard. Does it matter anywhere else? Absolutely not. Once again, it's HN jerking itself off about pErFoRmaNcE pEr WaTt when Apple put out a CPU that can't even draw the 105W it's rated for, and has no proof that it can improve in single-threaded performance with more power. But sure, toss out more cores; it's what x86 has been doing for years, but when Apple comes out with the M1 Ultra it's a stroke of genius to slap two of these bad boys together.

3/ People actually upgrade their hardware and don't rebuy a macbook pro every two years like half of this forum does.

ARM is not a forward-looking architecture, and neither is x86. They are both architectures, with their advantages and flaws (and the Apple flaws are fucking massive: unupgradable-except-if-you-buy-next-year's-version is the very definition of non-forward thinking, especially for end users).


Not going to respond to the unnecessarily rude tone here generally, except to say that any argument you want to have taken seriously is better not couched in terms of "hardons" and "jerking itself off". If you are a teenager, consider trying not to sound like one. If you're not a teenager, consider trying not to sound like one. Either way, this kind of communication is why you're likely still not taken as seriously as you could be.

Also, you're going off on an angry tangent about the HN audience when I am talking about buyers (I meant including institutional buyers, not end users; I didn't clarify that enough, but I would have thought the focus on architecture made it clearer), but:

> I don't give a single crap if my workstation uses up 600W to run a 5900X and an RTX3070 (or, well, if I could find one, but shhh).

I would. Between this April and next April my energy bill will double, and it is unlikely to ever fall back to where it is now. Power consumption matters A LOT in Europe right now. Corporate buyers will have to care about that, for the long term.

> 3/ People actually upgrade their hardware and don't rebuy a macbook pro every two years like half of this forum does.

No, they really don't upgrade their hardware, in fact. Almost no computer buyer upgrades through anything other than replacement.

But I'm using a seven year old, secondhand MacBook Pro, so I'm not the target of your comment.


For what it's worth, I recently went and traded in a pile of old Macs because I was travelling with a newer Macbook Pro (16" with M1 Pro) and needed to be able to effectively replace not only my first early M1 laptop, but also the iMac Pro I'd got as a flagship machine before the M1s came out.

Between those and an Intel Mac Mini and an older iMac, I got over $3000 in trade-in value. Made me feel quite foolish for not having tried that before. The oldest iMac still got me a couple hundred bucks, as would your seven-year-old MacBook Pro most likely. A two-year-old high-specced computer?

I get that the list price for any of these is still way higher than putting together a PC from parts… but again, I belatedly traded a bunch of stuff in, and got more than $3000 in gift card value from Apple doing it. That makes 'rebuying a macbook pro every two years' a whole other state of affairs. I'm kicking myself for not having tried that earlier: I'm sure I left over $1000 on the table simply by letting recent computers sit around unused rather than immediately trading them in for Apple credit.


> Power consumption matters A LOT in Europe right now.

Should have gone with nuclear years ago, our prices are stable.


Well, I was going to spend this money on upgrading my laptop, but I guess I'll just invest in a nuclear power plant instead... /s


At the cheap end, it's ARM too: Chromebooks if you need a keyboard, tablets if not. It's only enterprise corporate office drones that are tied to Windows. I've got a gaming rig too, but we are also a minority. Most people don't need a $1200 MacBook Air, but most people also don't need an $800 RTX3070. PC gamers are already a niche. Mobile gaming is where the money is, then console, then PC games, and mobile is 10x both of them combined.

So to the point under discussion: Apple uses their common tech across their entire market. The M1 chips are the result of making the fastest mobile phones. The M1 is going into tablets. They are just printing money and pumping that money into chip design. If your DIY PC building is what is going to keep Intel/x86 alive, then I agree with GP: Intel is the walking dead. And it really is just us gamers.

Who needs raw CPU/GPU at their desk? I had one of the first Nehalems under my desk (NDA and everything). Then I heard about AWS and started running my shit on that. Back then, that was Intel. Now it's Graviton ARM: more processing per $.


In fairness, are you not merely implying the opposite prediction? I'm sure there were people who thought pagers/beepers wouldn't instantaneously disappear from use, and there were plenty of developers who thought Java was going to be the language for the web. I'm not sure how pointing to the year of the Linux desktop is a good argument.

I do think x86/x86_64 will inevitably outlive its usefulness for general-purpose computing, but it's not literally "dead". There are too many Intel-architecture devices out there still. To me, it seems more like a dead architecture walking, because I really can't think of anything x86 does better for most use cases. I'd love to be shown wrong, though.


> They won't act on it for your regular employees because it costs them money to do so and developers _routinely_ jump ship, though they want you to know that they technically have the power to because you signed that contract.

This kind of thinking is exactly why these clauses get worse and worse and worse over time, I'm afraid.

People should be prepared to argue the toss over every single line of an employment contract, and seek advice for every single line.


Sure, but it's a prisoner's dilemma. In practice, the company almost always denies the request to remove the clause, and if I then refuse the offer, I'm worse off, because few others are refusing offers over this clause. The only way we win here is if most people walk away, which they should, but they're not.

Anecdata: at a different FAANG, I requested a less important clause to be removed and the lawyers denied even that request. Lawyers will tell hiring managers to pass on a candidate in the name of protecting the company. They won't pass on everyone, but it's a long way to get there.

I think that having access to specialized legal counsel would have helped, but attorneys have a reputation for being expensive. I wish levels.fyi offered a legal service as well, in addition to their negotiation coaching service.


This is a fair assessment.

There is another strategy than asking them to remove it, though. Ask them to qualify it. I've done this in the past (a long time ago, mind you, and in the UK, where employers are still somewhat bound by decency as well as law). I was asked to sign an extensive NDA for a short period of employment, and I agreed, on the condition that I could decline to join any meeting that wasn't strictly related to the project I was hired on, and that the same NDA bound them when discussing my personal projects.



