
The Bafang BBS series had only two wires (positive and ground) connecting the battery to the motor.

I think other motors by Bafang have similar setups.


From my POV, Rust is already a bad decision for the same reason Linus was against C++. There's a continuous trade-off against complexity, and this is just such a huge addition to what was already a behemoth. I think Linus nowadays is a lot less strict in imposing his own rigor and engineering esthetics than he used to be.

It's probably been a long time since he was trying to steer the project to his standard of perfection; he gave up on that. Like he grew apart from it, life got in the way, and steering Linux is now more of a job than anything else.

What I'm saying is, 2004's Linus would have never allowed Rust into the kernel.


Interesting point, but I don’t agree.

To begin with, there wasn’t really anything like Rust back in 2004, so it’s a very hypothetical scenario.

And since then, the world of tech has changed a lot. So I don’t think it’s strange that Linux has also changed. Not all change is good, and that applies to Linux as well.

The Linux kernel has been a sprawling project from the start. AFAICT perfection and philosophical purity were never aims of the project. This is sometimes a liability, but it’s also one of its greatest strengths.

For contrast, compare to FreeBSD. Much more coherent, and from my POV better designed. But the pace of development is much, much slower. Many FreeBSD users seem content with that. Which is fine! You can’t be all things to all people. But the user base is also a fraction of Linux’s user base.

So the fact that Linus is open to adding another language alongside the 50-year-old C language just seems to me like a good thing that can help modernise Linux and Unix.


No. Various manifestations of mental illness and diseases of the mammalian nervous system are deeply linked to the whole body and not just the conscious part of the brain. They were arbitrarily sculpted by evolution and encode a lot of information from the ancestral environment of all ancestors. You can't replicate that by simulating a process of evolution over various AI implementations. The fact that vagal nerve activation influences facial muscles and human interaction in general is not something you'd expect in what most people think of as an AI. And yet it plays a crucial role in mental health.

Would you expect an AI to suffer from ADHD? PTSD? Almost certainly not, because most of these conditions result from an interaction between brain systems of different evolutionary ages.

Unless of course you're trying to replicate a mammalian nervous system in its entirety. But then your goal is not the singularity and you're definitely not optimizing for intelligence.


> Would you expect an AI to suffer from ADHD?

Yes, I would.

ADHD is the condition where there are many problems around and you cannot decide which ones are really worth solving, so you spend some effort on most of them but solve none. I really see no reason why an AI that is presented with multiple problems would not show the same behavior. It might even notice this problem by itself, but it already has so many other problems to solve.


> ADHD is the condition where there are many problems around and you cannot decide which ones are really worth solving, so you spend some effort on most of them but solve none.

What you described here has nothing to do with ADHD. You described a situation, a problem-solving strategy, and an outcome. That says nothing about the brain involved, or its functioning.

ADHD is a particular state of a nervous system, one of over-stimulation/over-excitation at one end and under-stimulation at the other. You can see it on an EEG as excess beta waves (among other things). You can see it as a neurotransmitter imbalance, with neuroinhibitors not working properly.

But besides, the functioning of a brain and the environment it finds itself in are two different things, even if of course there is some influence between them.


No, I described externally observable behavior that would be classified as ADHD. What I said is that that behavior could also be observed in an AI, and I see no reason why it could not.

If you claim that an AI cannot have ADHD, because it lacks a particular state of a nervous system, you might as well claim that AI cannot exist because it seems impossible to build a machine with a nervous system.


I'm personally both impressed and surprised by how well this M1 chip performs. But one thing most reviews omit to mention is that it has a fab process advantage over the competition, especially Intel. Going from TSMC 7nm (which the latest line of AMD CPUs is using) to 5nm can bring improvements of a few tens of percent in power consumption and/or performance.

This of course doesn't take anything away from the design, as Apple has clearly had a hefty lead over other SoC designers in the smartphone space for years.


I suspect they don't want to mention it because a fab process advantage isn't easy to explain, since the node names aren't all comparable. For example, TSMC's 7nm process is more comparable to Intel's 10nm/14nm processes [1]. It's not clear which of Intel's nodes TSMC's 5nm process is comparable to.

As you said, the design itself is also key. Look at how well AMD was able to optimize Zen 3 vs. Zen 2: it's on the same TSMC 7nm process and still delivers a ~20% IPC gain without increasing power usage too much.

It's going to be interesting to see what Apple can do for the next generation, which I suspect will be on 5nm again.

And Ars did mention it in this specific Ars review:

> Lastly, let's remember that the M1 is built on TSMC's 5nm process—a smaller process than either AMD or Intel is currently using in production. Intel is probably out of the race for now, with its upcoming Rocket Lake desktop CPU expected to run on 14nm—but AMD's 5nm Zen 4 architecture should ship in 2021, and it's at least possible that Team Red will take back some performance crowns when it does.

[1]: https://www.techpowerup.com/272489/intel-14-nm-node-compared...


*At all. AMD didn't increase power draw from the 3XXX to the 5XXX series.


> But one thing most reviews omit to mention is that it has a fab process advantage compared to the competitors, especially against Intel.

This is mentioned in the article.


If you listen to interviews with Apple’s chip team, they talk about optimizing performance of the M1 based on how macOS works. An example is how they handle allocation and deallocation of objects. Objective-C and Swift use this a lot in loops and for their reference-counting memory management. The M1 is 5x faster than Intel for those operations.

That kind of optimization between hardware and software can have a big impact and is part of Apple's secret sauce.
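To make it concrete what kind of code that optimization targets, here's a minimal Swift sketch (the Node type and loop are made up for illustration, not taken from Apple). Under ARC, a loop like this generates retain/release traffic on every iteration, which is the sort of operation being described:

    // Hypothetical example: ARC retain/release traffic in a hot loop.
    final class Node {
        var value: Int
        init(value: Int) { self.value = value }
    }

    func churn(_ nodes: [Node]) -> Int {
        var total = 0
        for node in nodes {       // iterating class references retains/releases each element
            let alias = node      // another retain/release pair under ARC
            total &+= alias.value
        }
        return total
    }

    let nodes = (0..<1_000_000).map { Node(value: $0) }
    print(churn(nodes))

If retain/release really is several times cheaper per operation, loops like this are exactly where that shows up.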


Exactly! This is why I am holding out for the next generation of CPUs and GPUs, which should be 5nm.


You might be waiting a long time. Hope you have a decent setup already.


One year? AMD moves to 5nm in 2021 according to leaked memos.


At best it will be late 2021, and widely available in 2022. AMD normally uses high-performance node and mainstream node pricing. And that is assuming there is that much capacity left; as far as I can tell, ASML hasn't been keeping up with their shipment numbers. That's possibly one reason why rumours are pointing to Apple using 4nm instead of 3nm in 2022.


What does moving to 5nm mean here? Starting to build stuff with 5 nm or releasing 5nm CPUs AND GPUs? Point is we don't know... and waiting for next gen is always an option -- what about next next gen?


There has been talk of splitting up Google and other huge companies due to their size (https://arstechnica.com/tech-policy/2020/10/house-amazon-fac...).

Wouldn't it be awesome if Apple's CPU/IC design segment became a separate company and sold these CPUs and maybe SoCs by themselves?

I think that would make a big dent in AMD's and Intel's market share. Since Apple's part/die should be quite a bit smaller than most x86 dies, the fab costs should also be smaller, and so should the final price.


No, it wouldn't be awesome.

This thing ONLY EXISTS in the first place because of Apple's continual vertical integration push, and because other parts of the business were able to massively subsidise the R&D costs necessary to come up with a competitive SOC in an established market that's otherwise a duopoly. If their CPU/IC design segment were its own company, the M1 would never have seen the light of day. Period.

Furthermore, this chip is not meant to be a retail product. It's optimised for the exact requirements that Apple's products have. The whole reason why they're able to beat Intel/AMD is because they don't have to cater to the exact same generic market that the established players do, but instead massively optimise for their exact needs.

I genuinely don't understand how anyone who wishes to break up Apple cannot see these things.


>This thing ONLY EXISTS in the first place because of Apple's continual vertical integration push, and because other parts of the business were able to massively subsidise the R&D costs necessary to come up with a competitive SOC in an established market that's otherwise a duopoly. If their CPU/IC design segment were its own company, the M1 would never have seen the light of day. Period.

Eh, that's some very biased thinking.

In the real hardware world, both mechanical and electrical, I can approach a company and say "we want something with these specs, we'll buy 10 million pieces per month, what can you do for us?", and that kicks off R&D efforts after some basic contractual agreements to commit both parties.

You know, exactly what Microsoft did with AMD when they commissioned unique SoC designs for their consoles, with a host of never-before-implemented features such as direct GPU <-> SSD IO.


> This thing ONLY EXISTS in the first place because of Apple's continual vertical integration push

This seems pretty grounded.

> The whole reason why they're able to beat Intel/AMD is because they don't have to cater to the exact same generic market that the established players do, but instead massively optimise for their exact needs.

I'm less convinced of this. Their exact needs seem to be making laptops... and so these chips would make interesting candidates for other laptops, if split off from Apple.

It's never going to happen, and an independent company might struggle for R&D money, but if these prove to be better laptop CPUs there is a market there.


But they're not just making generic laptops. They're making Macs.

Everything from the memory model, to the secure enclave for TouchID/FaceID, to countless other custom features, are parts that other SOCs do not need to have present on the die, and cannot optimise for.

For good or bad, this is truly a piece of engineering that could only have come out of Apple.


Stirling engines built into wood stoves are a mature technology capable of delivering lots of power (kWs for a regular wood stove found in a home). They're simple to build.
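As a rough, hedged back-of-envelope (both figures below are assumptions for illustration, not numbers from the linked projects):

    // Back-of-envelope sketch in Swift; assumed figures, not measured data.
    let stoveThermalOutputKW = 10.0      // assumed heat output of a household wood stove
    let assumedOverallEfficiency = 0.10  // assumed thermal-to-electric conversion efficiency
    let electricOutputKW = stoveThermalOutputKW * assumedOverallEfficiency
    print("Estimated electrical output: \(electricOutputKW) kW")  // ~1 kW

Under those assumptions you land at roughly a kilowatt of electrical output for a typical stove, more for a larger one.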

As for pollution, no one can deny that, but the rocket stove bunch claim much reduced emissions compared to even modern wood stoves (which I personally find hard to believe). Still, billions of people are using wood to heat their homes, and that's not going to change soon. Reusing that heat to produce electricity as a byproduct is something some people call 100% efficient.

[1] http://volodesigns-sterlingproject.blogspot.com/

[2] http://www.oekofen-e.com/en/engine/


Isn't that true of any building though? Where I live most houses have ceramic roof tiles, and even though they are supposedly long-lasting, after 60-80 years they do need replacing (most home-owners replace them much more frequently, every 10-20 years). I've replaced tiles on my grandparents' house which was around 60 years old, because they became brittle and started cracking after hundreds of temperature cycles, hailstorms etc.


Here's Torvalds' view on the matter: https://lwn.net/ml/linux-fsdevel/CAHk-=wg2JvjXfdZ8K5Tv3vm6+b...

I also side with this view, namely that this is something that would be better placed in userspace rather than in the kernel, which really doesn't need more complexity for things that add so little value (negative value to some).


He must have changed his view because he ultimately allowed the feature. Does anyone know what changed his mind?


While Linus kicks off about things, he doesn't tend to outright refuse them. I don't think he sees himself as the gatekeeper of the kernel, and that is evident in the way the kernel has developed right from the beginning. That has attracted criticism from the likes of Ken Thompson, who thought that too much crappy code was allowed into Linux.

> I've looked at the source and there are pieces that are good and pieces that are not. A whole bunch of random people have contributed to this source, and the quality varies drastically


The question is almost tautological, because an AGI is a fully general problem solver. Every human being has needs and wants that they are working to fulfill, because otherwise they would simply stop living. An AGI can be used to solve those problems for them.

The most common reason people are hesitant about rushing to build an AGI is the issue of AI safety (at least that's the general consensus in the community).


So people by definition want AGI because they have problems and AGI can solve them?

A lot of assumptions are embedded here, making it pretty remote from a tautology.

(1) People want their problems solved

(2) People are indifferent to how their problems are solved

(3) The resolutions of people's problems do not conflict with one another

(4) People will have their own AGI

(5) AGI will not cause problems of its own, etc.

Beyond that, I think the question is more an aesthetic choice of "What kind of universe do you want to live in?"


No. The biggest value that Wikipedia provides is the content. None of the donations go to the content creators, and very little goes to covering IT costs. A lot of it is used for paying big Bay Area wages to people in the foundation whose work I regard as highly peripheral to the core value provided.


What work do they do that you consider peripheral? I'm asking from a place of ignorance.

