Welcome Chris Lattner (tesla.com)
879 points by nil_is_me on Jan 10, 2017 | 275 comments



What's really interesting about this is not just that Lattner is brilliant and liked, but that it highlights just how critical software correctness and reliability are to autonomous vehicles. Naively one might have expected some machine learning expert to take over the reins at Tesla. But fast-moving Silicon Valley needs a fundamental shift in quality standards when it comes to safety-critical software, and if you look closely, Lattner has been leading this charge at the language and compiler level.

His work has been distinguished by the melding of language safety, reliability and clarity; that is, not merely having sophisticated constructs that help guarantee correctness, but also making code simple, beautiful and easy to read. Ultimately, writing safe code depends on the ability of the programmer to comprehend it, so creating a programming environment that's successful on all fronts is a foundational achievement.

A notable example: LLVM enabled ARC, a beautifully simple approach to memory management that removed much (not all) of the need for the developer to implement details in code, while providing high efficiency and, perhaps even more importantly, predictable performance (no garbage collection pauses). These are all essential for safety-critical realtime software.


> LLVM enabled ARC, a beautifully simple approach to memory management that removed much (not all) of the need for the developer to implement details in code, while providing high efficiency and, perhaps even more importantly, predictable performance (no garbage collection pauses). These are all essential for safety-critical realtime software.

A GC algorithm known since the early days of Lisp GC research and used in languages like Mesa/Cedar in the late 70's.

I can think of lots of other examples, like how VB, Delphi and C++ Builder interfaced with COM in the mid-90's.

Yes, the PR around ARC made automatic memory management easier to accept for those without a background in compiler research, especially if they weren't aware of the stability issues that came from trying to bolt a tracing GC onto Objective-C's semantics.

However, it goes without saying that Lattner is really brilliant, and his work on LLVM as a compiler-building toolkit, as well as on Clang in improving the status quo of C static analysis tooling and compiler error messages, is really notable.

In regards to ARC, maybe also notable in terms of PR, as there is now a whole generation of developers who think reference counting isn't garbage collection.


> I can think of lots of other examples, like how VB, Delphi and C++ Builder interfaced with COM in the mid-90's.

One curious thing is how similar Swift is to modern Pascal, ie Delphi. It has many other influences, and it is certainly not Pascal with another syntax, but reading the Apple guide when Swift was released gave frequent moments of deja-vu.

On the subject of COM, yes. If you think COM means ATL you should try it in a language designed to work with interfaces and reference counting inbuilt - Delphi has COM interfaces as first-class language primitives, and a heap of classes and other code making COM quite straightforward. Much of this even spills into C++Builder, though since it's a different language it's not as clean as in Delphi. Still miles past ATL though.

Re ARC: I think the thing that makes it appealing is it's conceptually simple, completely deterministic, and can be traced by reading code rather than understanding an environment's implementation. Delphi does ARC now too, and if you've ever wanted ARC in C++, C++Builder optionally supports it for some classes. ARC is not yet on Windows for either language, we're talking just iOS, Android and soon Linux here.

(Disclosure: I recently started working at Embarcadero on C++Builder. This is a personal comment only. But liking the languages Embarcadero makes was one main reason I joined.)


By PR you mean public relations? Note, I did not claim that Lattner invented the general idea. I'd suggest that dismissing the careful selection, refinement, integration and popularization of design elements as "PR" underestimates how great products are made. It's like saying Tesla's doing nothing new or the Macintosh was invented by Xerox.


ARC is a whole debate, but the one thing it is not is simple, and I would argue it's more error-prone than a traditional GC. I've used it for most of my career and I have seen what sloppy/unaware coding can do to it.

Lattner is a well-known expert on compilers. Having used Swift since its inception, I would call into question the reliability of the Swift LLVM compiler. In its current state (3.0.2) it's absolutely terrible and does not back up the sentiment: "But fast-moving Silicon Valley needs a fundamental shift in quality standards when it comes to safety-critical software, and if you look closely, Lattner has been leading this charge at the language and compiler level".


I'm not claiming Swift 3.0.2 is the most reliable language ever. All systems require time to mature and it is very young historically speaking. I am saying Lattner's design principles should realize better results for safety-critical software in the long run. Compiler bugs can be fixed; programmer error is tied to design. It might be Swift, it might be something more pared down, but he's a good guy to have in charge of an autonomous driving software platform.


Well I'm sure he's a very experienced manager for technical projects and also brings a large amount of experience with compilers to the table.

His online presence always has that Apple Arrogance™ to it. That's coming from someone who was born and raised an Apple fan.


I'm not sure I would say Swift is the kind of robust that is needed for safety critical software, but it is a nice step forward for application code. It mostly forces you to deal with things safely while still allowing Objective-C dynamic behaviors when you need to get around the restrictions (often to interact with other Obj-C code).

So, yes, I can see why one would place Lattner at the forefront of making more reliable compilers and pushing shifts in quality standards in fast-moving Silicon Valley. It is an awesome achievement to create a new language that improves both readability and safety, and even more awesome to get it mainstreamed so quickly.

There are a few people who I would like to trade places with. Lattner is one of them, Musk is another. They both fulfill different parts of my long-held dreams. So I consider them both to be quite awesome. It's cool that they'll be working together too, I guess.


Sure Lattner and Musk are interesting people, but I find the level of hero worship in the tech industry to be sickening.

Having used compilers for a few new languages (Rust, Go, Dart, Kotlin, Swift), Swift is the only one I've had any issues with, and it also seems to be the only one to have adopted the "move fast and break things" philosophy of Silicon Valley. I dunno, I just don't see the argument.


Lattner is best known for starting the whole LLVM project. Swift is just a small side project in comparison, it just got adopted by Apple for some reason.

LLVM is one of the most influential pieces of software of the past decade. Hero worship isn't good but credit where credit is due.


Are you sure no one else was involved in starting LLVM? PhD supervisor, fellow students, etc. Just saying...


Oh yea LLVM and Clang are incredible pieces of software! He is truly a master.


Clang was mostly Steve Naroff.


You're confusing language development with autonomous vehicle development. Think of the long term goal. It's desirable to move fast and fix things with language development in the near term, to achieve a more perfect design and accelerate its maturation at the temporary cost of more volatility. After this process achieves a high level of maturity, said design principles may offer a safer, more reliable programming system that would be better suited to safety-critical applications.

Additionally I'm sure we can all agree there is no substitute for maturation through time and usage in the field. Which frankly is an argument for more popular languages over obscure ones. None of the ones you mentioned are ready for safety-critical system development (including Swift 3), but which one is most likely to achieve widespread adoption and field testing in the long run?


No, I'm not confusing them. I'm responding directly to the comment that Chris Lattner represents a more measured approach to software development than is traditional in the tech industry.

I don't think Swift stands to gain widespread traction outside of Apple-oriented app development. Aside from a lack of stability, Apple is too well known for boxing its competitors out. I've used and loved their products my entire life and I know how annoying it is to go against Apple's grain.


>I don't think Swift stands to gain widespread traction outside of Apple-oriented app development.

It already is though, there are several Linux web frameworks etc. It's open source and community run so I'm not sure how they're planning to box out competitors from it.


There are some web frameworks that are in development. That does not mean Swift has gained any traction. Also, having toyed around with one, the experience was not great.

When writing a server, I would take Go over Swift any day. It outperforms it, uses less memory, it's simpler, oh, and it uses a "traditional" GC.


>uses less memory

That is very much _not_ the case according to the testing I have done recently.

Swift uses a lot less memory than Go unless the program uses only trivial amounts of memory in the first place. Using interfaces in Go data structures makes the difference even more pronounced.

On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.

That said, Swift has a few very weak spots when it comes to memory. Most notably the String type, which is terrible on all counts, but that is a whole different story.


> On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.

Only if said language doesn't allow for stack allocation or static globals.

Quite a few languages do allow it.


Doesn't that effectively amount to manual memory management? What particular languages are you referring to?


> Doesn't that effectively amount to manual memory management?

Not really, example in Active Oberon:

    TYPE
      point = RECORD x, y : INTEGER; END;

    VAR
      staticPoint : point; (* On the stack or global *)
      gcPoint     : POINTER TO point; (* GC pointer *)
      noGCPoint   : POINTER(UNTRACED) TO point; (* pointer not traced by the GC *)

> What particular languages are you referring to?

Mesa/Cedar, Oberon, Oberon-2, Active Oberon, Component Pascal, Modula-2+, Modula-3, D, Oberon-07, Eiffel, BETA.

There are probably a few other ones.


>noGCPoint : POINTER(UNTRACED) TO point;

That's fine for one point. How about N points where N varies at runtime?

If I allocate memory dynamically outside the GC's remit, I'm going to have to release that memory somehow.


Depends on the specifics of the language.

In Active Oberon's case, those pointers are still safe. They can only point to valid memory regions; think of them as weak pointers that can also point to data on the stack or in global memory.

This is in safe code.

If the package imports SYSTEM, it becomes an unsafe package, and then, just like e.g. Rust's unsafe, the rules are bent a bit and usage of SYSTEM.NEW() and SYSTEM.DISPOSE() is allowed.

Just like any safe systems programming language, it is up to the programmer to ensure this pointer doesn't escape the unsafe package.


I still don't get how you can say that this memory is not under the GC's control but it's "not really" manual memory management either. Is it reference counting then? How does that memory get released?


It doesn't get released, unless you are doing manual memory management inside an unsafe package.

In a safe package it can only point to existing data, there isn't anything to release.

If the pointee is something that lives on the heap, it is similar to weak references. Points to GC data, but doesn't count as yet another GC root.

If the pointee is on the stack or in global memory (the data segment in C), then there is also nothing to release. Global memory only goes away when the program dies; the stack gets released on return. Memory that was allocated by the compiler due to VAR declarations is static.

Usually the idea is that you use untraced pointers to navigate statically allocated data structures; they are not to be exposed across modules.


It would be great if you have any benchmarks to share. From what's available on the internet, Swift does seem to use more memory than Go.


What sources have you found on the internet?

My own code is unfortunately a bit messy and entangled with unrelated stuff. If I find the time I'm going to clean it up.


ARC is a tradeoff between manual and automatic memory management. Requiring a little bit more care from the programmer is intentional, not a disadvantage as you picture it; it is the price for not having, you know, a GC. A GC is less error-prone not for free but at the price of eating CPU and memory, which in the world of mobile devices equals less battery life, so it is quite desirable for iPhone and MacBook software not to have one.


ARC is automatic memory management.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


Yup but people love to argue over the small shit. Chris Lattner even refers to ARC as a form of GC.


Different approaches to memory management differ in extent of how much of programmer's job they automate.

Garbage collectors are fully automatic and rarely if ever require the programmer to mind anything; automatic RC does almost everything but requires the programmer to analyze and annotate some things as 'weak'; manual RC requires a lot more programmer effort while still technically being "automatic"; and manual memory management means the programmer does everything.

Automatic/manual is a scale, not a boolean yes/no, and the point is that ARC lies on it a bit closer to manual than garbage collectors.


The thing is, ARC is a garbage collection algorithm.

There isn't anything like ARC vs GC, that is layman knowledge and just wrong from CS point of view.


ARC is not an algorithm, it is a language-level feature that generates retain/release calls automatically so that the programmer does not have to. Unlike GC systems, where the resulting program does run a collection algorithm (and wastes CPU on that), with ARC the generated program is no different than if retain/release were written manually, and it runs no extra code.
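
Roughly, in Swift terms (a hypothetical sketch, class name made up; this just illustrates where the compiler-inserted retain/release traffic goes, not Apple's actual codegen):

    class Sensor {
        let name: String
        init(name: String) { self.name = name }
        deinit { print("\(name) deallocated") } // runs the instant the count hits zero
    }

    func demo() {
        let a = Sensor(name: "radar")  // count = 1
        var b: Sensor? = a             // compiler inserts a retain here: count = 2
        b = nil                        // compiler inserts a release here: count = 1
    }                                  // a leaves scope: release, count = 0, deinit
                                       // runs immediately -- no collector, no pause

The deinit fires at a point you can see in the source, which is the whole appeal.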


That is an implementation detail of how a reference counting algorithm can be implemented.


If you're going to say that, then hand-coded memory allocation is just an implementation detail for a garbage collection algorithm.

True in some sense, but mostly useless. Come on.


The right comparison for ARC is with manual memory management -- not GC.


Depends on what your point is. Both ARC and GC are approaches to limit the complexity and difficulty of memory management. As such, I think it's very reasonable to compare them, because they're different approaches to the same underlying problem.

FWIW, as someone who was a Java programmer for over a decade before learning Objective C right after ARC came on the scene, I greatly prefer ARC over garbage collection. I find the things you have to remember to think about with both ARC and GC (e.g. circular references and unintentionally strongly reachable references) to be about the same cognitive load, but the deterministic, predictable behavior of ARC means you won't have to try to debug random GC hangs that only happen in prod under heavy load and the subsequent fiddling with a million GC options to get performance to be acceptable.
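
To make the "things you have to remember" concrete (a made-up Swift example, not from any real codebase): the classic gotcha is a reference cycle, which leaks under ARC unless one side is marked weak, whereas a tracing GC would eventually collect it anyway.

    class Owner {
        var gadget: Gadget?
        deinit { print("Owner deallocated") }
    }

    class Gadget {
        // Without `weak` here, Owner and Gadget would retain each other and
        // neither deinit would ever run -- the classic ARC retain cycle.
        weak var owner: Owner?
        deinit { print("Gadget deallocated") }
    }

    func demo() {
        let o = Owner()
        let g = Gadget()
        o.gadget = g
        g.owner = o
    }   // both deinits fire here, deterministically, because the back-reference is weak

The mental tax is roughly the same as remembering to null out long-lived references under a GC, but the failure mode is deterministic and shows up in code review rather than in a heap dump.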


ARC is a GC implementation algorithm, you probably mean tracing GC algorithm.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


I think it is time for you to buy more books.


I have a very good collection of CS books and papers about programming languages and compiler design...


I'm actually excited about server-side Swift for exactly this reason. It's early days though.


Not really. They are both valid comparisons, since ARC is much easier to work with than manual management and can offer more predictable performance than GC. That said, it's also slower than manual management and can be trickier to work with than GC.


ARC is a GC implementation algorithm, you probably mean tracing GC algorithm.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


Just to point out that the Swift LLVM compiler is written in C++. So it being unreliable doesn't necessarily say much about the reliability or goals of Swift as a language.


It looks like you haven't used C++ in a while.

C++14 is a whole other world, and can be written with most (if not all) the safety guarantees you would expect from Swift or Rust.

I had to use it for a project and was very surprised about this too...


You're lucky if you get to use C++14 in the real world. My last C++ job was maintaining 2 million lines of legacy MFC code.


Even there one can write quite safe C++ code with the help of CArray, CString, CComPtr, ...

I used MFC like that in the late 90's/early 2000.

The problem is the teammates who write Win32/C-like code instead of MFC/C++ code.


I wasn't trying to suggest that C++ was the cause of the compiler issues, just that Swift wasn't.


Can you explain why you call into question the reliability of the Swift LLVM compiler?


It would be apparent if you were a regular user of the compiler. Random crashes while compiling well-formed code and performance problems with larger files/projects are quite common.

Still, I would say it's mostly usable now. It used to be a lot worse.


You find it better than before? I find that Swift 3.+ is drastically worse than Swift 2.


Do you use Swift at all?

-edit- I meant it as a serious question. But the person who responded to me sums up the issues.


This is downvoted presumably for lack of information, but it's pretty much true.

The Swift compiler segfaults very frequently. I do find this amusing in that it's the compiler for a theoretically largely-memory-safe language (yes the compiler is written in C++, it's still funny). The syntax highlighter in Xcode, which is driven by the same stuff, also crashes, which breaks autocompletion and even indentation. Using Xcode, you just have to get used to it. It frequently reports the wrong error message - just something that isn't even close to related. Sometimes object files just become 0 bytes and you either need to clean (and experience the Swift compiler's blazing performance again) or go and modify that file so that the incremental compiler will pick it up.

I've found most of these to be triggered by using a lot of closures and possibly type inference. Shaking out the incorrect errors or segfaults is... not fun.
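
For what it's worth, here's a made-up but representative example of the pattern I mean: long operator chains of untyped literals inside inferred expressions, which this era of the type checker handles badly ("expression was too complex", minutes-long builds, or occasionally a crash):

    // Every literal could resolve to Int, Double, Float, ..., so the solver
    // explores a combinatorial number of overload combinations.
    let score = 1 + 2.5 * 3 - 4 / 5 + 6 * 7.25 - 8 + 9 / 10.0 + 11 * 12

    // Explicit types collapse the search space and the expression builds quickly:
    let fasterScore: Double = 1.0 + 2.5 * 3.0 - 4.0 / 5.0 + 6.0 * 7.25 - 8.0

Closures layered on top of that kind of inference are where I hit most of the segfaults.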


The most annoying one is that the incremental compiler is broken under Xcode 8, leading to full recompiles every time a line of code is modified.

https://forums.developer.apple.com/thread/62737?start=0&tsta...


Oh? I meant it as a srs question. I wasn't sure what parts I should include. But thanks for essentially saying most of it.

I should mention I also find the community to be sorta toxic. They are so focused on Swift being the one language to rule them all and they use terms like "Swifty".


All language communities do that.


I get your point; programming language communities all have a certain level of fanboyism. However, I find Swift's community to be particularly abhorrent. I stopped participating when people started to ask if certain code is "Swifty" and people would judge the merit of something on whether it's "Swifty". It also got tiring how militant they were toward other programming languages, especially Java.



Yes I can agree with these.

I wonder if we'll have refactoring in Xcode for C++ now Lattner has gone. I wonder why they never added it.


Swift, XCode, and Apple software in general don't scream reliability to me. No doubt Lattner is a smart guy, but I'd be more comfortable with a smart guy from NASA or somewhere similar with experience developing survival-critical software systems.


This is my feeling too. Yes, I'm a scientist, so I'm from that side of the pond, but does it make sense to hire a developer type for engineering where lives could be at stake?

Not that dev'ing should rise to that level, but it isn't the typical level most developers work at. Also, if he is more of a "software correctness and reliability" guy as abalone says he is, then yes, that is the right direction.


Pretty much none of Apple's current software is written in Swift though, from what I can tell.


Actually a fair bit is in Sierra. Apple decided apparently not to invest resources in targeting 32-bit platform support, so they have had to hold off shipping software relying on it until the platforms drop 32-bit support. Sierra did that last year, and I'd put $5 on iOS 11 doing that as well.


Correct, I should have said "the platforms dropping 32-bit processor support".

The ability to run 32-bit applications means that preexisting libraries cannot incorporate Swift code yet.

The "application may slow down your phone" warnings that users are getting with 32-bit apps this year is a pretty strong indicator that Apple is going to remove support for running 32-bit apps completely for iOS 11 or 12. They previously had a deadline for apps to have 64-bit version submitted, but backed off the ultimatum for now.


> They previously had a deadline for apps to have 64-bit version submitted, but backed off the ultimatum for now.

Apps have had to support 64-bit since June 2015 (February 2015 for new apps), and Apple hasn't backed off that deadline. But there are still 32-bit apps on the store that haven't received a 64-bit update, and I don't think Apple has ever stated what it is going to do about them (other than showing the warning).


Swift supports 32-bit iOS, just not 32-bit Mac.

Apple can't write Swift libraries (at least not ones that are publicly exposed) until the ABI stabilizes, so that will not happen until at least Swift 4. Apple can write Swift apps as long as they don't need to support 32-bit Mac, which they haven't needed to do for years.


Sierra did not drop support for 32-bit apps. It doesn't support 32-bit hardware, but that change was made many releases ago (Mac OS X Lion IIRC).


Sierra did not drop support for 32-bit applications. Pretty sure Microsoft Office just released a 64-bit version of their product a month or two ago. Good luck dropping 32-bit on the Mac and godspeed.


Including the Swift compiler itself - though at least that might prevent the segfaults.


If I remember right, it's just the Calculator app. Lol....


The dock and launchd were rewritten in Swift for Sierra.

There is a WWDC session talking about it.


Well, that doesn't bode well ;) Sierra's dock is sooo buggy. I have to restart my machine numerous times a day because the dock stops working, literally doesn't work, lol. I'm sure there's a way to restart just the dock app, but eh, I've been quite busy. Sierra has been one of the buggiest versions of macOS I can remember using.

I would be surprised if they actually did, though. I know someone did some static analysis of the apps for Mac and iOS and found Swift was barely used at all.


The dock is a separate process, as is Finder, you can just kill them.

I have to stomp on coreaudiod a few times a month because my USB DAC stops responding.


NASA has a culture of safety and redundancy ad nauseam because people are indoctrinated into it, not because their hires have an innate proclivity for it.

It is not fair to judge Lattner's ability or commitment to safety based upon Xcode and Swift. One of these predates him, the other is the result of his decisions (no doubt) but also countless decisions of others, including those above his pay grade at Apple.

Xcode is basically the evolution of something designed for NeXT. Swift is a solution to various problems in application development. I'm not aware if he (et al.) had real-time processing or safety-critical devices in mind with 3.0.2.


> NASA has a culture of safety and redundancy ad nauseam because people are indoctrinated

I'm not claiming Lattner is missing some innate ability -- it just appears they're putting someone in charge who has never been exposed to this mindset. Maybe they have a culture or other leaders already in place who can foster this within the team.

> It is not fair to judge Lattner's ability or commitment to safety based upon Xcode and Swift.

To be fair, we cannot judge Lattner's ability or commitment to safety at all, because he has no publicly-known experience with safety-critical systems.

Perhaps he has relevant experience that's not public. And if it turns out he has no relevant experience, I'm not saying he can't learn. It's just strange for Tesla to put someone in charge who will be learning on the job.


Probably not, but the WWDC session about real-time audio used Swift in its presentation.

http://devstreaming.apple.com/videos/wwdc/2016/507n0zrhzxdzm...


"Real-time" audio doesn't involve human critical systems.


Of course not, but part of the sentence said

> I'm not aware if he (et al.) had real-time processing


I know the first thing I thought about was does this mean Tesla's going to start writing its software in Swift. If so they are fucked :P. (BTW I write non-critical consumer software in Swift as my day job)


Swift has its problems but keeps getting better. I only get to use it for my at-home projects, since at work the portion of the codebase that is for iOS is Objective-C with no plans to switch any time soon. Old Obj-C too, and started by people used to programming in VB on MS platforms. So count your blessings.


I find Swift to be getting worse each year. While sure, I'm blessed that I don't have to write applications in assembly, I have grown jaded towards Swift. If it weren't for the fact that I am an iOS developer, I would happily not use the language. I have been slowly positioning myself away from doing iOS development. I feel like a massive corporation like Apple can provide better tools to write apps for their walled garden than what they are providing me.


What would you recommend instead?

C#? C++?

C++ has become incredibly great recently after languishing in the C++2003 period for too long.


SPARK ADA of course.


Well considering that most of the autopilot software is driven by black box deep neural networks, I don't think hiring someone with a strong coding background is going to make that much of a difference when it comes to safety.


Sure it is. NNs can behave non-deterministically (i.e. they produce different output for the same input because, AFAIK, some operations on GPUs are nondeterministic), and it makes sense to invest in software systems that clearly take a closer look at software correctness.

One idea to get more correct code is clean code layout (I heard good things about the LLVM code base), simple-to-read code and compilers that exploit the theoretical knowledge we gained over years of compiler and type theory research. I think Chris Lattner has expertise in all of them (or at least knows about their importance, contributions and drawbacks), and if you want to build a full-blown self-driving car it is important to have no nondeterminism in your car, and advanced languages and compilers help to guarantee specific statements about your code.

So it absolutely makes sense to invest in your compiler research team for safety, as our security, reliability and correctness expectations will rise (which is good) in particular for self-driving cars (they are not just DNNs).


It ultimately depends on how much of the car's action is driven by DNNs; a misclassification in the CNN can easily lead to a car crashing on a highway and causing a pileup. DNNs are so complex, you can't really write unit tests like you typically do for deterministic code.

That's why some of the big banks flat out refuse to implement any form of deep learning for risk analytics. They're much more reliant on simpler ML models like random forests and logistic regression that are easier to analyse and diagnose by model governance teams.


Yeah, and it will happen. I am all for self-driving cars simply because they increase the quality of life for our civilization (more independence, cheaper ways of transportation, fewer fatalities), but there will be accidents and then you have to analyze them (which is good).

In the end you want to know what did go wrong (the public will demand it and they are right) and it might be misclassification.

But that is not enough. You want to know: why did it classify situation X wrong? So the answer is that the inference network (which was created via the training network) computed its weights because the input was Y. Now you might throw your hands in the air and say "oh, it's complicated, the network is nondeterministic, blame NVidia", but you can also go further and build your networks deterministically (which is possible and AFAIK not a performance penalty). Compiler research helps in at least guaranteeing that certain parts of code are deterministic, which makes it easier to debug and maybe avoid complex NN misclassification scenarios, but the way to do it doesn't have much to do with NNs themselves but more with language design and (real-time, in particular deterministic) OS research.

So the statement "oh, that's so complex, we do not know why we misclassified" is no excuse; we can do better.

For starters, we have to publish NN papers with implementations that describe how to make a particular NN out of given training data (provide that too). We already publish the code and network structure (see Caffe, etc.), but often with pre-trained models that have been built on a cluster with many forms of training data going through the network structure, etc.

Now at the moment you read a paper, head to the published code (often the case, again a desirable property of the ML community) and try to reproduce some examples by training on the data.

However, it is hard to say in the end if your network is really as good as the published one, since a simple

  $ diff my-net-binary-blob.dat tesla-net-binary-blob.dat

might fail, if the stack (training etc.) to build the network is nondeterministic. However if you have a good (i.e. deterministic) stack you might be able to reproduce NNs bit-for-bit, which makes it simpler to answer the question "why did we misclassify".
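
As a toy illustration of what determinism buys you (a hypothetical Swift sketch, names made up; real training pipelines obviously have many more sources of nondeterminism such as GPU reduction order, data shuffling and thread scheduling): if every random source is explicitly seeded, two runs produce bit-identical results that you can actually diff.

    // SplitMix64: a tiny PRNG with an explicit seed, so "random" weight
    // initialization is reproducible run to run.
    struct SeededGenerator {
        private var state: UInt64
        init(seed: UInt64) { state = seed }
        mutating func next() -> UInt64 {
            state &+= 0x9E3779B97F4A7C15
            var z = state
            z = (z ^ (z >> 30)) &* 0xBF58476D1CE4E5B9
            z = (z ^ (z >> 27)) &* 0x94D049BB133111EB
            return z ^ (z >> 31)
        }
        // Map the 64-bit output onto a Double in [0, 1).
        mutating func nextUnitDouble() -> Double {
            return Double(next() >> 11) / Double(UInt64(1) << 53)
        }
    }

    var rng = SeededGenerator(seed: 42)
    let initialWeights = (0..<8).map { _ in rng.nextUnitDouble() * 0.2 - 0.1 }
    // Re-running with seed 42 reproduces exactly these weights, bit for bit,
    // which is what gives the binary-blob diff above a chance of succeeding.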


> However if you have a good (i.e. deterministic) stack you might be able to reproduce NNs bit-for-bit, which makes it simpler to answer the question "why did we misclassify".

It does make it simpler, but surely not usually simple enough to answer the question "why did we misclassify". It's like saying we will finally understand consciousness once we simulate the quantum mechanics of a certain cubic metre of space with perfect accuracy - which need not be true even if that cubic metre happens to contain a functioning brain.


So would you say that the reliability and correctness of the non-NN part of the system, and the code that implements the NN itself, is not worth bothering with because the system includes a black-box NN in its design? What a bizarre conclusion to draw!


I like LLVM. But in the world of programming languages/compilers, someone like Xavier Leroy (a real, proved correct C compiler) is much more a name that I would associate with a drive for "software correctness and reliability".


If you want to talk about safety and reliability in automotive, then we talk about Autosar[1].

One of the concepts is to prevent the use of any dynamic behavior, which means you don't need any garbage collection at all. Because garbage collection is not predictable. It can be good and bad, but even a good one is not predictable.

Embedded design and development is completely different to classical IT. For IT that sounds archaic. But it always proved right at the end.

[1] https://en.wikipedia.org/wiki/AUTOSAR


True, AUTOSAR is the current standard for embedded automotive systems. Regarding safety I have heard different things about it; some parties would only use it for up to ASIL A, others up to C, others also up to D. My personal opinion is that AUTOSAR (and even its basics like C as a programming language and the OSEK OS) is not the best solution we should come up with for all the safety-critical tasks in the autonomous driving world. Someone like Lattner would clearly have the potential to improve the current state of the art there a lot, but it's uncertain whether Tesla would share any advances with the rest of the industry.

And regarding memory allocation in reliable automotive systems: Yes, the best practice would be not to allocate at all to get to some deterministic behavior. However I've seen lots of projects where "don't allocate" is implemented as "don't allocate with malloc", and you find dozens of custom memory allocators and pools throughout the code. Some of those designs are probably less reliable and safe than using a garbage collected language would be.


I often wonder whether Erlang would be a good language for embedded automotive systems. It seems to have the right traits: no GC pauses, built for reliability, etc.


I think it would fit great semantically. Message-based communication is exactly what a lot of automotive systems are about (CAN, FlexRay, ...). The downside of the actor model from an implementation point of view is that you need potentially unbounded memory for the incoming message queue of each actor. For each message that is sent to an actor you need to allocate a message, populate it and place it in that actor's queue. If the actor can't process fast enough, the queue will fill up.

It might work out with a very careful system design, but determining statically how many queued messages and how much memory are needed seems like a very hard task.


That's not very surprising honestly. ML/AI algorithms alone are useless if they are written incorrectly or are not executed properly by the underlying OS/firmware.


"predictable performance (no garbage collection pauses)"

No garbage collection means more fragmentation though which can be an issue at some point perf-wise, there's no free lunch.


Software correctness, eh?

So build the autopilot and other systems in Haskell?


Apple has a car project. Autopilot must be one of its core features. But Apple would not create a VP-level position for Autopilot or any one feature for any of their products (even if they called it something else for secrecy/generalization reasons).

This move increases his compensation and clout. Post-Tesla, he'll only have VP or founder titles elsewhere, never anything lower-level (unless he gets his old job back). The change must be welcome, too.

That Apple could not retain him speaks volumes of the company they've become. They're a conglomerate at the intersection of tech and fashion. Groundbreaking engineering is not always given its proper due (or compensation) because there's only so many seats at the table. They've become rigidly corporate and not particularly inspiring.

Good luck to him, and good job taking a chance. Working for someone with a vision other than "thin" must be a welcome change.


> Working for someone with a vision other than "thin" must be a welcome change.

This is so spot on. I would kill for a thicker phone with a less vibrant display that would last 2x as long on a single charge. They're pushing stuff that the consumer doesn't want (thinner/less battery volume, no headphone jack) in order to make money. It reminds me a lot of the TV industry pushing 3D TV's. I don't know a single person who has ever watched, let alone regularly, any 3D content on their 3D TV. It's a technology that few wanted but was pushed to drive sales.


>They're pushing stuff that the consumer doesn't want (thinner/less battery volume, no headphone jack) in order to make money.

But how do you know that people don't want this? All the people who told me they were switching to Android because of the headphone jack ended up with iPhone 7s anyways and love them. And now it seems that Samsung's next phone is going to ditch the jack too. Not to mention the numbers aren't really in favor of suggesting this change is hugely unpopular.


Because the portable battery pack market size was estimated to be $15B in 2014[0], and is projected to grow to $17B by 2020[0]. That's saying nothing of the massive anecdotal evidence.

[0] https://globenewswire.com/news-release/2016/04/04/825448/0/e...


I would argue that the battery pack market has little to do with whether or not folks think their phones are too thin. For instance, in day-to-day use I don't recall ever plugging a device into a battery pack. But I own two of them. Because once in a while I'm in the middle of, say, the Yukon or Alaska and haven't seen a power outlet in days, but perhaps would like a movie in my tent or to write a blog post. (I could charge from outlets on the motorcycle, but I've got enough crap plugged in the way it is.) So most of the time those packs sit on a shelf and are no reflection on whether or not I feel the iPhone battery is adequate for normal use.

Not everyone hangs in the same circles, but lots of (for example) motorcyclists buy these for general "plug in at camp" use. Run a small light, charge the GoPro, and some do use them to charge a phone. Hell, some of them use them as jump start batteries which is about as far away from anything related to cell phones as I can imagine in the portable power market.

In summary, quoting battery pack market size merely reflects the desire for folks to have portable electrical power, for whatever they feel they might need it for.


Power banks are convenient and I'd rather have a power bank and a thin phone than a really thick phone and no power bank. Thin when you want it, bulky when you don't.

The 3.5mm headphone market is also massive but people don't want 3.5mm headphones, they just want headphones. You're confusing demand for intrinsic virtue.


>That Apple could not retain him speaks volumes of the company they've become.

Sorry if this sounds naive (and it's not intended as veiled criticism), but could it not be the case just that he is more valuable to Tesla so they are able to justify offering him more compensation?


I thought Apple's car project was dead? Now autopilot is the one feature they're building?

http://www.motortrend.com/news/apple-car-dead-software-shift...


Both are correct: Apple's car project is dead, Apple is on autopilot.


> That Apple could not retain him speaks volumes of the company

Or it could just be that there are hundreds of engineers and tech leads working on Swift/Xcode/LLVM, and giving enormous compensation to one because (s)he is a famous engineer and wanted to leave does not really make sense. Unlike marketing or sales, writing core software does not seem to be star-driven, or at least I hope so.


> That Apple could not retain him speaks volumes of the company they've become. They're a conglomerate at the intersection of tech and fashion.

Well said. Apple would not have made the strides it did in its developer platform had it not been for Lattner. It does seem that the company has different priorities than it did when he was hired.


Elon Musk is the new Steve Jobs.

He is an inspirational leader and talented people want to work with him. It appears it will be harder and harder to attract and keep talent at Apple without a leader like Steve Jobs.


It's also the stages of companies. Apple is in very much a "maintain the working business model" phase, especially with the current management. I'm not in a position to say they are wrong for doing so given they've continued to grow and now make as much money as ExxonMobil but with far better profit margins.

Tesla, meanwhile, is still in the upward rocket phase, still figuring out their product/market fit. Much more exciting for the type of talent looking to make a big impact on the world.

Steve Jobs brought that life back into Apple when he rejoined and maintained that environment through various product launches over many years.

I'm not sure I'd want them to pretend they are still the same company; rather, they should work with the talent they have. Just like how basketball teams rebuild after losing their star players, you rebuild around new talent instead of clinging to past glory.


Exactly, Apple is to microcomputing what Tesla is to green power and vehicles. We'll see in 10 years after Musk is fired from Tesla by John Sculley.


> We'll see in 10 years after Musk is fired from Tesla by John Sculley.

I know you are being facetious, but similar to Jobs, Musk got ousted as CEO from his first and second companies: Zip2 [1] and PayPal [2].

[1] Vance, Ashley (2015). Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future, p72.

[2] https://en.wikipedia.org/wiki/Elon_Musk#X.com_and_PayPal


Heh, I forgot that. But we all know these were really insignificant compared to his later ventures.


Perhaps you're reading too much into things?

Chris Lattner has been at Apple for over 11 years. He shepherded LLVM, Clang, lldb, and Swift. All while climbing the ranks to Director. A huge number of engineers would simply be ready to think about and do something different after that much time.

You might say that Elon answers "Why Tesla?" but in no way can you claim an answer to "Why leave Apple?".

(I have no inside knowledge, I've just been in the industry long enough to have gone through this myself).


Apple is famous for having some of the most tenured employees in the valley. 11 years is good, but not particularly long there. Google will be the next Apple in this regard, I predict.


... in no way can you claim an answer to "Why leave Apple?".

Seems obvious, doesn't it? Nothing interesting going on.

Under Jobs, terms like "courage" and "innovation" used to mean something, like kicking sand in the face of the entire mobile phone industry and competing against their own bestselling product. Tim Cook's idea of "courage" involves selecting a headphone jack in a CAD tool and hitting the Delete key. And "innovation" means late nights at the office pushing the limits of dongle engineering.


My guess is that Apple was a vehicle to further his compiler and programming language plans. Now that he has done that, it becomes more mundane maintenance and incremental changes. So it is time to move on to a new challenge.


Apple still has the best CPU on mobile. But that does not count for those who cannot see past a headphone jack.


I see Musk as more of a Bushnell/Woz hybrid. He knows his shit technically, which is why good people want to work with him. Jobs knew how to delegate knowing his technical shit to others though.


Steve Jobs' contributions came from his leadership and marketing chops, but he also had pretty good technical knowledge for a CEO. I mean, he got his start designing circuit boards at Atari.


Jobs was a true polymath. He knew enough about hardware design, software engineering, typography, music, pretty much any relevant subject that he could work well with experts in those fields and synthesise their efforts together into whole products.


Well, he got his start passing off Woz's circuit board designs as his own at Atari. Even he conceded that.

I do think his technical knowledge was probably higher than most give him credit for, however.


A lot of his best insights required technical knowledge - adopting Ethernet, SCSI, Mach kernels, BSD userland, object oriented programming, Postscript, TrueType fonts, WebObjects (and Java server, while ditching it on the client), html5 (over Flash)


Were they his personally formed insights or did they come from talking to the right people?


Even then it would be 'known unknowns' to him, as compared to 'unknown unknowns'. It is a big deal, considering we just saw Marissa Mayer, who was at the center of the creation of one web-scale system, prove out of her depth at a slightly different web-scale system.


Jobs wrote the Atari game brick breaker. So like Gates, Zuckerberg, and Musk he does have a technical background.


Not really. He was contracted to do it, but subbed all the actual engineering work to Wozniak then essentially lied to Wozniak about how much that work was worth.[1]

[1] https://en.wikipedia.org/wiki/Breakout_(video_game)#History_...


My mistake. I was under the impression it was another game, asteroids.


As the story goes, Wozniak was actually the primary (or maybe even sole?) implementer on that project:

https://en.wikipedia.org/wiki/Breakout_(video_game)#History_...


Pretty sure Woz wrote Breakout, with er ... "assistance" from Jobs.


> He is an inspirational leader and talented people want to work with him.

This is a convenient narrative when Tesla makes a high-profile hire, but they've also lost a lot of talented engineers and I don't remember the conventional wisdom being that it was because of Elon's shortcomings.


Who left and why?


This is why I think Apple missed its chance to buy Tesla and make Elon Musk its CEO at the same time.

I can't even imagine what Musk would do with Apple's $200 billion in cash. I think he would've been much more daring with that money than even Steve Jobs would've been.

But I think Apple missed its shot, and the merger of Tesla and SolarCity probably sealed that for good. Now Musk is probably already seeing a 10-20x larger combined Tesla/SolarCity company in his head, 10 years from now, and a potential merger with a bigger SpaceX as well.

So from his point of view, it probably won't be worth it anymore. He would probably have to take over a declining iPhone market and deal with that at the same time as dealing with explosive growth at Tesla and an imminent launch of SpaceX's big rocket to Mars.

On the other hand, there would be hundreds of billions of dollars he could get access to, so I wouldn't say it's impossible anymore either. However, at least to me, this would only be interesting from the "let's give Musk unlimited money and see what he can do with it" point of view. Otherwise, I would rather see Tesla/SpaceX be on their own than join an Apple/Tesla/SpaceX megacorp.


I'd be curious what that level of money would do to the culture at SpaceX. I work here as a tech and I get the impression from top to bottom that a big part of the overall culture centers around doing things as inexpensively as possible, largely out of necessity. People here genuinely get excited when we find ways to shave off even small amounts of waste.

I'm not knowledgable enough about the psychological/sociological aspects of this but I wonder if it's possible to maintain that kind of culture when you have $200 billion sitting in the bank. It might lead to a kind of resource curse that some countries suffer from.

I wonder if there have been studies done on this at the corporate level.


That's a really good point: what would Musk do with $200 billion?

It's questionable if he would be the profit-maximizing CEO choice for Apple though.

Perhaps he wouldn't even want the job (if we ignore the cash), because the smartphone and computer industry is maturing, i.e. tougher competition, decreasing rate of innovation, maintenance mode ahead.


But unlike Steve Jobs, he's an engineer.

My favourite bosses were engineers first.


Inspirational? People keep saying this but I can't see it. He has inspirational "ideas", but as a person? At least not behind a screen or when he is doing a demo.

And for the iPhone vs. electric car comparison: oh well, maybe I am old school, but if BMW or other carmakers made an electric car (assuming we have to choose an e-car) I would choose them over Tesla any day. The interior quality of a Tesla just doesn't compare well. The iPhone was truly a revolutionary product, in that many tried smartphones before but they had ALL FAILED. This is speaking as someone who has used pretty much every smartphone prior to the iPhone's introduction.

If anything, I'd say the difference is Elon Musk's lack of taste.


How does Tesla compensation compare to Apple?


I think Elon Musk is what Steve Jobs wanted to be. Steve Jobs thought he was changing the world with the iPhone, and while it was new it wasn't world changing. What Elon Musk does is world changing.


The iPhone wasn't world-changing? I mean, sure, you could make the argument that if it wasn't Apple with the iPhone it would have been someone else, but it's undeniably daft to think that the iPhone didn't lead to a complete change in not only consumer electronics (mobile phones), but also so many services around it.

It's well documented that Android was going to be Blackberry-ish, then they pivoted to be iPhone-ish after the original iPhone announcement. Think the App Store and things like Uber, which exist primarily as mobile apps.


The iPhone broke the carrier's backs. Before the iPhone the carrier was the customer, not you. They dictated features. They had final approval over the awful Java applets on phones. They also set the app prices and took a huge cut.

Very rarely is a new technology useful in abstract; the inventor of the steering wheel made a contribution but without the rest of the car it is meaningless.

Touch screens existed, sure. There were one or two largish screen phones, sure. Smartphones existed (all using keyboards and styluses). OS X and Safari existed. But no one had put it all together into a single device, nor had anyone created a sensible UI design language to take advantage of things like multitouch.

You're basically saying Dropbox is garbage because rsync/SMB/NFS/FTP existed. Or Uber/Lyft are garbage because Taxis existed. Yes there are some superficial similarities but it turns out the details make a massive world of difference and it is intellectually dishonest to be so dismissive.


Who are you replying to here?


The app economy was the revolution.


Interested to see your sources on this. I was under the impression Android started first but Apple beat them to the announcement


I had read they had a "Dream" android phone prototype they were working on that was blackberry like [1]. There was also rumored to be a second one that was more iPhone like slated for down the line, but when the Apple demo happened Google immediately scrapped their blackberry like phone and the second one became the first.

Interesting that Google recognized the importance and adapted quickly while Ballmer laughed it off: https://www.youtube.com/watch?v=eywi0h_Y5_U

[1]: http://www.theverge.com/2012/4/25/2974676/this-was-the-origi...


If you walk through the new London Google office by King's Cross, you can see the original Android phone right by their restaurant. It looked just like a BlackBerry / Nokia E62.



Probably not a popular opinion, but I think Musk is overrated. The guy has a huge ego, just like Jobs had, and while he's putting it into a nice package, he's not the guy who came up with any of it: electric cars, spacecraft, etc.

Yes, he might change the world if/once he gets to Mars, but most of his stuff is a marketing tech demo.

Bell Labs changed the world. Tesla did not.


Have you read Vance's biography of Musk? He's definitely a flawed individual in many ways, but I don't think he's fundamentally an egotist. Certainly not to the sociopathic degree many business leaders are.

And I think your categorisation of his work as a tech demo is unfair, and uninformed. SpaceX have built rockets that are delivering satellites into orbit and supplies to the international space station, and have achieved reusability. They have played a significant part in building a private sector space industry. These are not tech demos, they are real things. Can you boast any such achievement in your own life?

Likewise, Tesla have built and sold electric cars that people want. They've created a global re-charging network to supply them. And they've been the first company to deploy significant automation into the automobile market on a large scale. These are not tech demos, any more than the gigafactory rising out of the Nevada desert is.


> Can you boast any such achievement in your own life?

That was an unnecessary remark; the commenter never claimed that he/she has achieved more than Musk.


The commenter claims Musk hasn't achieved anything at all, other than give some tech demos.

I'm perfectly entitled to ask what they've achieved themselves that puts them in a position to so casually demean what, by most people's standards, are quite considerable feats.


What OP has or has not achieved is irrelevant in my opinion. I think it's better form to just refute the criticism and leave OP out of it. But it's up to you of course.


In the meanwhile you've taken the entire conversation off course to serve as the politeness police.


I'm not sure that OP's accomplishments are relevant in this case. You don't have to be above someone to point out mistakes. That is not to say I agree with OP, only that I disagree with your statement.


You don't have to be a chef to know the food tastes bad.


For the egotist part, have a look at some of his quotes regarding nationalism, lobbying, competition, the rockets that currently get most of our cargo onto the ISS, simplistic comments on AI, the way he treats lower-level engineers at SpaceX and Tesla, etc. I am on mobile, but it's easy to find more info for yourself.


>but most of his stuff is a marketing tech demo.

Disclaimer: I work at SpaceX, albeit as a technician so I'm very far down the corporate ladder, but I feel that may be unduly harsh.

There can certainly be legitimate criticisms around Musk personally, and SpaceX/Tesla in regards to whether they are overhyped relative to competitors or whether they will succeed on delivering what they promise. With that being said, when a company delivers 70k+ cars in a year, even if this is just a tiny percentage of the overall new car market, or a company puts satellites into orbit, I think we've moved beyond "marketing tech demo" status.


I'll offer what might be an uncommon perspective on 70k cars. I used to have a 2nd gen Toyota MR2. Great car. Vibrant community, guys developing and selling alternator brackets to shave off a few pounds of weight, there were meet ups, etc. I eventually sold mine, miss it, and still see a few around. In the four years the 2nd gen was available in the U.S. (91-95), Toyota sold a grand total 20k. 70k may not be a lot compared to the overall market, but it's nothing to sneeze at.


Of course he doesn't personally come up with, say, rocket engine innovations. But what he does is orders of magnitude more valuable: motivating people, making good high-level decisions (because he's smart and knowledgeable in multiple areas such as engineering, design and marketing), getting the right people to work for him and being very hard-working.

Being a good CEO is just way more valuable than being a good individual contributor, because it multiplies the output and growth of the whole company.

And ego doesn't really matter that much in the big picture.


His "story" seemed to hint at a more engineering capable than Jobs. Jobs could do a bit of hacking it seems. Musk could do a bit of physics which is a tad harder IMO.


If you think the iPhone didn't change the world, you need to take a second look.


Jobs is complicated because he had technical knowledge and design taste.

Most engineers wouldn't have thought about having proportionally spaced fonts back in the '80s, when personal computers only had 80 x 24 fixed-width-character green phosphor displays.

When Jobs dropped out of regular college and dropped back in to take the classes he was truly interested in, he took calligraphy; years later that led to the Mac being the first personal computer (the Lisa had it too, but that was the $10,000 predecessor to the Mac) to have a proportionally spaced, bitmapped display.

That's not something Woz (or someone like him) would have prioritized for a brand new computing platform.


Jobs also has over 300 patents to his name. His name is first for example on the multitouch patent, since he created many of the gestures and ui/ux concepts we use everyday: https://www.google.com/amp/s/www.technologyreview.com/s/5328...


He hasn't changed much yet.


Funny you mention that. Apple will crash if they don't get a serious personality cult back in there. Even that probably won't save them. Plus Musk is not the same brilliant, simple communicator that Jobs was.


Apple has done more than okay since Tim Cook took over as CEO: https://qz.com/765360/the-first-five-years-of-tim-cooks-reig...


So did Microsoft when Ballmer took over from Gates. Until suddenly they didn't any more.

https://steveblank.com/2016/10/24/why-tim-cook-is-steve-ball...

> If you think the job of a CEO is to increase sales, then Ballmer did a spectacular job. He tripled Microsoft’s sales to $78 billion and profits more than doubled from $9 billion to $22 billion. The launch of the Xbox and Kinect, and the acquisitions of Skype and Yammer happened on his shift. If the Microsoft board was managing for quarter to quarter or even year to year revenue growth, Ballmer was as good as it gets as a CEO. But if the purpose of the company is long-term survival, then one could make a much better argument that he was a failure as a CEO as he optimized short-term gains by squandering long-term opportunities.


Long-term opportunities like Azure? Bear in mind that 'Nadella' launches like Office for iOS came out so soon after he took over that they must have started well back in Ballmer's reign.

Yes, he blew it on Mobile and that's a huge deal. The hugest. He had his blind spots, but he was very far from incompetent.


Why is it funny that he mentions that?


Second that. I get the different opinions, but why is it funny (the downvoting implies that it was funny).

I can 100% assure you that I only nitpick because I want to understand the logical reasoning for the down-voting, and I'm scared that I am too high to understand the meaning of the phrase "It's funny you mention that". I am not a native speaker, but I know all the words, have heard it in English, and there exists a translation into my language with the same interpretation in both, so long story short ... why is it funny?



So there are 2 meanings, and I read it as "funny peculiar". But why is it "funny peculiar"? I think it is right to say Elon Musk is the new Steve Jobs (as in "most popular tech/computer-stuff person for the public"), so working for Tesla has some kind of "coolness" factor, and they have good marketing and might beat Google's self-driving endeavors simply via time-to-market (similar to the iPhone).

But why is that funny (haha) or funny (strange)? If they can build a self-driving car for the masses Elon Musk will be the uber-tech guy for a whole generation, and Tesla seen as one of the good guys with cool tech.


I think you are over-thinking and misreading this. One person says something to the effect of "I think Musk is the new Jobs" another replies "I think Apple will die without a cultlike leader". The thing at the beginning of the reply is just a mostly content-free throwaway phrase not intended for the Talmudic analysis you're giving it.


You are right, I probably read too much into it. It's because I am high, and when I am high I get extremely interested in languages (both natural and computer languages), and as a non-native speaker but fluent reader I often wonder about phrases for too long.


Funny that I've been thinking about the same issues going on with AAPL right now. That was all.


Wow, that was unexpected. Great hire by Tesla. Interesting move from compiler development to driving AI, though I guess for the VP position, his experience in managing those teams is much more important.


> Wow, that was unexpected. Great poach by Tesla.

Tangential, but I really hate the term "poach" when referring to recruiting employees.

We shouldn't think of hiring people as "poaching", because employees are not property. You can't "poach" an employee because there's no ownership, and employees should be free to make their own decisions regarding their employment opportunities.


I hope you similarly dislike the term 'head-hunt' on the grounds that no-one's head is severed. 'Poach' is a useful metaphor that suggests that one company actively tried to lure an employee from another. It is a metaphor, it doesn't imply that the employee is a chattel. I think one can get a bit too precious about these things.


Language shapes perception shapes language, so it's absolutely worthwhile to be vigilant and critical about the idioms we endorse by adopting, especially for engineered phrases (e.g., "right to work").


I absolutely agree about the power of language to shape thought. But consider, whenever you've heard of company A poaching employee Y from company B, has it really subtly made you think of Y as a helpless object?

Whenever I see it, I think of the two companies plying Y with enticements, with Y manipulating them until (s)he finally gets rich rewards with the new company.


It doesn't make me think of Y as a helpless object. But it does make me think of A as doing something wrong. I strongly object to the term because it implies that companies shouldn't compete for employees, which pushes our salaries down.


I think that's fair. After all, poach has literal meanings that apply to acts that are clearly wrong or illegal. In fact, if you look in the dictionary, you see words like trespass, steal, etc.

You do hear poach used colloquially or even somewhat jokingly when it's not intended to imply anything nefarious but the word does imply something underhanded.


> whenever you've heard of company A poaching employee Y from company B, has it really subtly made you think of Y as a helpless object?

Yes. If that wasn't what someone was trying to convey, why did they pick the word "poach" in the first place? The whole point of using that word is to evoke a metaphor in which the hired employee is a mere game animal ensnared by the company.

> I think of the two companies plying Y with enticements, with Y manipulating them until (s)he finally gets rich rewards with the new company.

"Hire" is a good verb for that.


"Poaching" doesn't refer to targeted hunting, though; it refers to hunting where one has no right to hunt. Of course, if this were just the etymology and nobody used it that way in the context of hiring, that wouldn't matter — but people do sometimes use it derogatorily, to suggest that one company has wronged the other. So I can see why somebody might prefer a less proprietary word.


When large, powerful companies conspire to keep wages down by illegally entering "anti-poaching" agreements that deny employees the freedom to make career choices (and the leverage to capture a larger share of the value they provide), I think it becomes warranted to question terminology that makes this sound acceptable.


It is a metaphor, but it does imply that the employee is a chattel - perhaps tongue-in-cheek, but nonetheless.

It's always worth thinking about the language we use to describe things, because that language does have an effect on our modes of thought.


OK, I'll make another attempt at conveying my point, perhaps I wasn't clear enough.

I believe it's a mistake to treat words as though they were nothing more than abstract parse tokens, devoid of any cultural, historical or etymological baggage and conveying only the precise meaning that we intend to convey. Language is more complex than that, and the baggage that comes along with words does sometimes imply things that may be entirely unintended by the speaker, and often slip easily below the level of conscious awareness. Entire fields of propaganda (and advertising!) are based on this.

I am unsure why even the idea of thinking about the words we use is apparently so controversial - is critical self-examination really so scary?


> I hope you similarly dislike the term 'head-hunt' on the grounds that no-one's head is severed.

this comparison really seems to miss the mark.



the comparison not being apt isn't due to some qualitative assessment of original meaning/usage; thus, linking to a headhunting wiki page seems like a reductive non sequitur.


Yeah, they both sound kind of annoying.


I'm not sure why OP was down-voted; you may disagree, but his post was credible and merits a response.


I downvoted most of the subthread as it was completely off-topic. I didn't come here to read this garbage.


Yea, I mean that's a good point, it is kinda a weird term. I don't mind changing it I guess.


I agree, it's a distasteful term. People choose jobs according to what's best for them and "poaching" implies the employer's interests should matter more than that.


Looking at the definition of poach as it relates to this context, the subtext seems to be the case where an employee leaves without warning or discussion of why they no longer want to remain at their place of employment. Like you cannot save an elephant after it has been poached, but maybe you could have if you had known there was a problem. The poach is the realization that you didn't see it coming.

Of course, the employee is free to just up and walk away at any time, but if you feel it is common courtesy to discuss the relationship before parting, and it wasn't discussed, then poach does not seem completely inappropriate.

Having said that, in this case, we don't know many details about the parting. He and Apple may have had a good discussion about the future and came to a realization that the relationship wasn't going to work anymore, for whatever reason, and parted amicably.

From another perspective, perhaps we should not even think of employer/employee agreements as relationships in this day and age?


i.e., it's a business management term.


as Freud said, sometimes a poach is just a poach


what a whiny, irrelevant comment


No.


Yes. It is language policing nonsense. Quit trying to tell people how to talk because of imagined slights.


Chris Lattner : Tesla :: Bjarne Stroustrup : Morgan Stanley [1]

[1] http://www.morganstanley.com/profiles/bjarne-stroustrup-mana...


I'm surprised Stroustrup is working at Morgan Stanley.


Finance is one area where C++ still rules as full stack language.

In the typical enterprise, whose domain isn't selling software products, C++ tends to be mostly used as an infrastructure language for .NET/Java/JS native libraries or for interacting with their runtime APIs.


I work in the games industry and C++ is used for almost all system code (not tools) but this still feels like Shigeru Miyamoto taking a gig at a toy shop or Spielberg working at Universal Studios amusement park.


You know, Google seems to attract a lot of influential technologists who don't necessarily have a role in their organization. I guess it makes sense that other companies do this as well.


There's got to be more to this story. You don't spend years developing a language (Swift) as the culmination of previous work (LLVM) and then abandon it for a job in a relatively new and different discipline. It doesn't make much sense to me.


My guess is that the Tesla Autopilot Software team is working on language/compiler abstractions to reduce their dependency on semiconductor hardware manufacturers, particularly NVIDIA [1]. NVIDIA hardware has been in Tesla cars for many years, NVIDIA has a dominating position in both embedded and data center hardware for self-driving AI, and they are not afraid to partner with Tesla's competitors [2]. This whole ecosystem is based on the CUDA toolchain, which is built atop LLVM [3]. And Chris Lattner is the original developer of LLVM.

[1] https://www.nvidia.com/object/tesla-and-nvidia.html

[2] https://www.nvidia.com/object/audi-and-nvidia.html

[3] http://docs.nvidia.com/cuda/nvvm-ir-spec/#axzz4VMqXIKfo


Considering the recent hire of Tesla's new Vice President of Autopilot Hardware Engineering team was Jim Keller[0], the lead architect for AMD's K8 (Athlon 64) and a very influential force behind both Apple's A* chips and AMD's Zen (Ryzen), I think you're right that they're looking to rely less on NVIDIA.

My guess would be that Tesla's ambitions are set on controlling the whole stack. Something along the lines of a custom software toolchain for tailor-built silicon.

[0] https://en.wikipedia.org/wiki/Jim_Keller_(engineer)


Or perhaps they want to go down the FPGA/ASIC road. There is some LLVM-based FPGA tooling around today [1], but perhaps they want to uplevel the core programming language to a new one? Chris has done this already, going from Obj-C to Swift.

[1] http://llvm.org/devmtg/2014-10/Slides/Baker-CustomHardwareSt...


I think you'll find that it's extremely unusual for modern software developers to remain at the one company doing the one job for a long time.

Given it's largely a field for intellectuals, sticking at the one thing for a long time can become repetitive and boring, losing the challenge and the appeal.


He has a track record of being able to scale engineering organizations in open source and corporate environments.


Exactly. If you can lead and develop a complex, technically challenging open source project that competes head-to-head with, say, one of the top 10 most venerated open source projects, achieve parity with it (some would say surpass it), without too much drama, _at Apple_, then, well, it's probably less about the raw technical talents and more about technical taste, managerial talent, and character.


Mm, I disagree. It's amazing and admirable how hard Chris has pushed on LLVM and such, but if he sees an opportunity to do something he thinks is really important, why wouldn't he take it?


There is, it's called seven figures worth of reasons.


I wonder what his salary is.


Salary will be modest, but the bulk of his comp will be stock and bonuses for sure.


10 mil/yr?


It's not a positive indicator of the health of Apple's engineering and business/engineering management that they couldn't keep someone in an extremely key technology position.

I'm reminded of their loss of both Avie Tevanian and Bertrand Serlet.


Well, mystery solved.

Now, the new mystery: What does this mean for Apple's car project?

Yes, I realize he probably didn't have much (or anything) to do with it (whatever it really even is). There are going to be some batshit crazy theories though, and I can't wait to see how this affects Apple's stock price.


Could be that rumors are true and they stopped the car project. Could also be that Apple's extreme compartmentalization came into play and Chris wasn't privy to that team and therefore couldn't really express interest in it. Could also be that they already have a full staff and didn't really have a spot for him or feel he was a good fit.

I don't think we can derive much about their project from this hire tbh.


From Chris's website: "My overall objective is to improve large software systems through the development of novel techniques as well as the application of known engineering principles. Of particular interest to me are programming languages, compilers, debuggers, operating systems, graphics, and other system software."

So his new work will have a lot to do with his interests ;)


Doesn't sound like it.

All of those ("programming languages, compilers, debuggers, operating systems, graphics, and other system software") imply interest in tools for software development, where here they'd mostly be developing software (self driving engines etc), period.


Tesla likely has specialized software demands, and could benefit from specialized tools. In this sense, Lattner's interests coincide with Tesla's.


Car is clearly a bet-the-company level product for Apple. Tesla looking more and more like a bet-the-company acquisition.


How is Apple Car clearly a bet-the-company level product for Apple? It could just be Apple's normal secrecy, but as far as I know that project isn't generally well-known enough to be mentioned in the same pantheon as Google or Tesla.


Spending $10 billion on the car project would not be "betting the company" level of investment for Apple. That's "let's acquire a Motorola" level of investment.

Plus, I think the latest rumors say Apple has given up on making a car anyway, and it's working on a Waymo competitor of sorts.


How many billions does it cost to experiment with Car? How many billions does apple have sitting in cash?


Why discuss the imaginary car project that has stealth-launched, stealth-stopped, stealth-started (who can even keep track?) instead of the company delivering cars now?

This has gotten so much coverage on the back of ... nothing really.


> Well, mystery solved.

That was a short mystery! :)


The best kind!


So I kinda want to be Chris Lattner when I grow up. :) LLVM, Swift, and autopiloting cars is a hell of a run.


I've met Chris and he is one of the nicest and humblest guys you can imagine. I'm not betting on him lasting at Tesla for more than 12 months. I think there will be a cultural conflict.


Could you explain why you think there will be a cultural conflict? Do Tesla engineers typically act in ways counter to being nice and humble?


It's more like the person above than the people below.


Chris did work under the leadership of Steve Jobs for 5 years.


He worked under Jobs as a direct report, the way a VP role would be?!

I had the impression he wasn't that high on the totem pole before Jobs died...


Those two personality types can work fine actually, I think. Two egoistic-ish people working together is much more likely to be problematic.


Was there a cultural conflict for the myriad other ex-Apple employees at Tesla?


Where they "one of the nicest and humblest guys you [could] imagine"?

Because that was also part of the parent's framing for the conflict.


This is very exciting for the future of Autopilot. I thought it was amazing already. I can't wait to see what they have in store.


Being a Rust-lang fan, I would have loved to see him join the rust-lang group (which I am not a part of). I thought there might be a chance of this after I saw the post of him leaving Apple.


Top two stories right now are about people leaving Apple... I don't think that bodes well for Apple.


I always liked that Chris Lattner publicly acknowledged Light Table as the inspiration for Swift Playgrounds: http://www.nondot.org/sabre/ Now, of course, their new project is Eve.


Obviously, also related from the home page right now: https://news.ycombinator.com/item?id=13366542


(wish) Maybe he's being hired to make enhancements to Swift/LLVM for numerically intensive computation? It would be awesome to see a pure-Swift version of something like TensorFlow and all the compiler machinery to make it fast!


What languages are used by Tesla for developing autopilot software?


Way to go Chris! It has to be pretty awesome when the CEO tweets and blogs that you're joining the team. No pressure!


Interesting twist : Apple buys Tesla and elevates Elon to the missing role of Steve Jobs. Wouldn't it be nice...


It's quite interesting that there is no public comment from any other member of the Swift Core Team.


Yesterday reading "Update on the Swift Project Lead" I was going to comment "He will go to Tesla".

Pure luck, but my reasoning was that it is difficult for a compiler (etc.) engineer like him to find a better job than the one he was doing at Apple: developer tools, compiler infrastructure, a new programming language, a coding playground and so on.


Argh who's in charge of Xcode and Swift at Apple now????


Recording the conversation and filmed


Wow! My faith in autopilot just increased.


I doubt that this is a right decision.


Care to elaborate?


Congrats from us CS @ UIUC peeps!


[flagged]


Lattner had nothing to do with any of the things you listed.


The reading comprehension on HN is pretty bad lately. OP didn't say Lattner had anything to do with those examples.

The point is that Apple's been dropping the ball a lot lately. Lattner leaving is either another example of that (why didn't they make him a better offer to keep him?) or a consequence of it (what smart person wants to work for a company that's constantly screwing up?).


>The point is that Apple's been dropping the ball a lot lately.

They might, but the examples for that are bad themselves:

>Macbook with less ports

Which has always been something Apple pushed for: dropping deprecated ports early (to complaints) and adopting new ones. Few doubt USB-C is the future, even if they complain about the dongles.

>copy&paste iPhone with no originality

The iPhone, like the iPod before it, has always had incremental updates. What originality exactly should it have had in its 10th year? Magic pixie dust spray? Can you point to some competitor doing anything original?

Besides, while everyone always goes on about how "Apple is all about style and no substance", nobody pays attention to the large internal changes inside the iPhone year over year, with new processors, boards, camera setups and other internals designed by Apple. Processors that, in all tests, leave the top-end Android phones behind in single/multi-core performance.

>imac/macpro with old CPUs :-(

Intel announced Kaby Lake CPUs suitable for the iMac just last week (Jan 03).

Mac Pro, yes, but it's probably a dying niche product.


My counterpoint is the simple fact that we're even discussing this. Apple products used to "just work," and now they're a mess.

I've never considered myself a diehard Apple fan, but over the years I've spent thousands and thousands of dollars on Apple products. From a 1 GB iPod shuffle all the way up to a Retina MBP, and quite a few things in between. I don't need to argue about whether Apple is going downhill because I'm seeing it firsthand while using their products, or, more often lately, choosing not to use their products.


>My counterpoint is the simple fact that we're even discussing this. Apple products used to "just work," and now they're a mess.

My counterpoint is that I've been following Apple news ever since I was enthused with the idea of a NeXT-based OS X back in 2001, and know that "the simple fact that we're even discussing this" doesn't mean much.

People, pundits and the media have been "discussing" that Apple "lost the plot", "is doomed", "can't compete anymore", etc. all the time, from the introduction of the iPod until now.

Maybe this time it really is. But NOT because people are discussing this: "people discussing this" has been a constant, not a differentiating factor for these times.


That was quick! Congrats.


Did not see that coming...



Yeah, I would work for Tesla in a heartbeat (or SpaceX, or Solar City or any of the Mega Factories). Well it's back to Access programming for me ... :(


Good for him.

This is awful though. People can't really think spending time on autopilot for rich people's sports cars is more important than the LLVM compiler infrastructure.

Give me a break!

Even if Tesla invents cool new batteries and changes the way we think about power, all of that stuff still has to run on software, which depends on the compiler infrastructure...


Developing a system that can handle a complex, variable, real-world environment like driving surely has applications in other domains. LLVM and Swift are great contributions, and make development easier. That said, I don't see them changing software development as significantly as the possibilities opened up by systems like Tesla's autopilot.


I guess CarPlay or "Project Titan" just wasn't as interesting?


I hate it when big companies buy other, smaller entities, and hope they don't interfere and change them too much.

But seriously, exciting news for Tesla.


This is just a normal hire, no different than thousands of others that happen every day aside from being a fairly high position in the company and being someone well known in the tech community.


What? What small entity are you referring to?


Some apple vendor, probably


Chris Lattner, personally


are you a bot?



