James Gosling leaves Oracle (nighthacks.com)
146 points by past on April 10, 2010 | 63 comments



Where would somebody like Gosling go to find a home? In other words, what Sun-like industry Research and Development centers (not just research shops or application software vendors) remain?

Microsoft the software company has a competing technology to sell you. Microsoft Research does not ship code. IBM and HP are professional services, not technology R&D companies. RedHat's JBoss division had been (very positively) driving the direction of Java EE, but they're much more of a consulting/software vendor (focused on enterprise software, not on programming languages) than an R&D shop.

Google would be the most likely candidate: they use Java extensively, make their own JVM and contribute to Apache Harmony. They do highly advanced R&D work (you could say they're the modern day Bell Labs), but most fruits of it are internal (with some work ending up as research papers and a very tiny fraction going out as open source).

Seems like something is missing: a first-class R&D shop that's a home to top technologists (who aren't interested in wealth through entrepreneurship, but would rather work on many different, challenging projects -- something focused start-ups can't provide) and which ships software and hardware to the world. I'd love to see Google step up to that plate, but is that realistic?


you could say they're the modern day Bell Labs

I think this comparison is unduly favorable to Google. http://en.wikipedia.org/wiki/Bell_Labs#Discoveries_and_devel...


Certainly agreed. I could say "a modern day runner-up to Bell Labs". Nonetheless they employ actual Bell Labs people (Rob Pike, Kernighan, etc...).


And Pike and Ken Thompson have gifted us with Go (http://en.wikipedia.org/wiki/Go_%28programming_language%29).


And Google doesn't use Go in their production systems.


Well ... UNIX back in the old days was something of a moonlight or stealth project. Then they got money for an original PDP-11 (the model later called the 11/20) for a specific task, serious text processing as I recall, and that led to the money for their PDP-11/45. The 45 was significant because its split I and D spaces allowed for 64KB of code plus 56KB of data and 8KB of stack (the latter split due to the 45's MMU), which meant much larger programs could be written (as I recall the 11/20's architecture reserved the last 8KB for devices).

That's why nroff (descended from a Multics program, as I recall) and troff were such a big thing back then: troff could output to a professional phototypesetter. And they had upstream piped software that would DTRT with e.g. typesetting math. And this was practical, since all parts of AT&T had a lot of manuals and papers to produce.

At some later point, AT&T/Bell Labs realized they had something seriously useful here, and the research people got funding for OS work while Western Electric forked UNIX for AT&T production work. (You can find this all in the appropriate histories, I know it from them and from starting with V6 in the summer of 1978 and writing my final project in nroff for a XEROX Daisywheel printer.)

Anyway, my point here, going back to the original point, was that Bell Labs per se was much more of a pure research lab, and it was the job of Western Electric to turn some of their output into stuff they'd use in the system.

And Bell Labs was a tacit part of the monopoly agreement that gave AT&T all the telephony business in the US that they wanted; it was "a price of doing business". That's why they couldn't sell research UNIX (Vx) for real money, why things like the transistor weren't locked down with evil patent terms, etc. etc.

In other words, not a situation we expect to see repeated after the '80s breakup.

Right now Go is in an alpha stage (significant parts are missing and known to be needed, code you write will get broken) ... we should perhaps judge it more by guessing whether Google will eventually use it if and when it's of production quality. In the meanwhile, we can look at them as a fairly unique company that really understands that technology is a competitive advantage and that having some of that be open source is good for them.

See Joel S. on how companies desire to commoditize their supporting technologies. Google's open source browser and mobile phone and netbook OS projects are examples of them doing very directed projects that support their real business. Indirectly they do things like hire Mr. Python.


IBM does do research, and plenty of it. Quantum Computing anyone? http://www.research.ibm.com/quantuminfo/teleportation/


Does the result of this research end up as products?

Microsoft Research is excellent as a research organization, but it seems that its goal isn't to develop new products based on this research but rather to make sure that the scientists aren't developing products based on their research elsewhere.

Note: there's also lots of research work that needs to be done that isn't going to be applied immediately. That's certainly fine -- but there's been research that has gone on at Sun that was turned into industry-changing products (from NFS to Java).


Yes. Off the top of my head, IBM invented hard disks, relational databases and RISC. They definitely aren't as innovative now as they were thirty years ago, but at least they have more to show than Oracle does.


From what I've read, IBM was a top-notch R&D company up until the late 90s/early 2000s (with all the innovations you mentioned, as well as literally writing the book on software development -- they were in no way a pure research shop). Since then they've transformed into a professional services company, however (although they've always been known for having great sales/professional services).


Microsoft funds a lot of fairly pure research with some of it slipping out into the real world, e.g. Simon Peyton-Jones http://research.microsoft.com/en-us/people/simonpj/ of Glasgow Haskell Compiler fame (and perhaps the most interesting interview in Coders at Work, although that has a lot to do with my familiarity with the work of others in the Lisp and PARC worlds).

One of the biggest complaints about Microsoft Research (or whatever it's called) was how little gets out of the lab into the "real world", with things like F# being recent exceptions, and their funding of the GHC is another example of how they indirectly ship code. Although this mostly legit complaint is precisely your complaint about "not just research shops".

I'm pretty sure IBM is still doing some tech R&D, how much in the coding area I'm not sure, let alone for how long.

Hmmm, has such a beast, other than Sun for a brief time, ever existed? Obviously Bell Labs, and the UCB Unix project (and some other DARPA projects that had useful code as a required deliverable, e.g. their general VLSI infrastructure push: http://en.wikipedia.org/wiki/VLSI_Project). MIT's exokernel project found itself reified in the real world as Xen, through yet another university lab.

Plus today there are other business models, e.g. look at Clojure (admittedly a one-man project for the first few years); the cost of doing an R&D project that ships real world code is fantastically lower than it used to be when e.g. XEROX PARC was doing most(?) of its work on bit slice (i.e. not too fast) 16 bit Altos with 1-4 banks of 64KB RAM.

I came to the conclusion in the early '80s that some of the differences in the granularity of traditional Lisp and Smalltalk objects (the latter are larger) are in part due to the machines these people did their work on. Lisp has always gone for a big flat address space; Smalltalk had bank switching to contend with, and I think that encouraged larger objects (larger than a cons cell, atom, numeric immediate, etc.).

The architecture of X has a lot more to do with GAO "most preferred customer" rules/law than anything else.


I'll make sure to re-read that interview (Simon Peyton Jones), thanks!

There's certainly something to be said about it being possible -- more so now than ever -- for hackers to work alone, without affiliation with a large R&D organization or academia. Rich Hickey's work has been amazing and will be influential (the persistent immutable data structures are beginning to spread beyond Clojure). It's also interesting that Hickey was able to "bootstrap" himself from consulting rather than full-time work, regaining full rights to his work (not being subject to the all-too-common agreements which require one to give the fruit of all of their ideas to their employer).

Academia is still doing well (Scala from EPFL, some fairly interesting distributed systems work going on at Berkeley) and there are also university-based spin-offs e.g., Stonebraker start-ups (even though I disagree with some of his approaches, it's still great to see companies built to solve difficult and interesting problems). Historically these have been big on the West Coast: Google, Inktomi, Ousterhout's start-ups (Scriptics, Electric Cloud). I haven't seen any emerge recently, but perhaps I haven't been looking in the right places (or have been too cynical about what I saw).


You're welcome.

And to take this further, the recent explosion of new and successful languages shows something right is going on. Around the turn of the century I was about to give up on this one field in pure CS I like; we seemed to be in a Dark Age where almost everything I could use in practice was of 1960s origin (maybe repackaged and improved a bit ... and, well, it helped (hurt) that I don't like Perl, despite learning it on my own for my own DP, and never took a liking to tcl).

Since then Python really broke out, need I mention Ruby, Clojure has given Lisp a new hope, Haskell is simply fascinating, the ML world is moving out of academia (e.g. F#), and those are just some of the big names (I don't know much about Scala, especially its trajectory).

In terms of getting new stuff into people's hands, this is probably the best period since the '70s for language geeks (I'm not counting C++ and Java, which are firmly based in the '60s/very early '70s).

I also "blame" the dot-com bust, which required people to work smarter (e.g. use Ruby on Rails), not harder (e.g. J2EE).


Re: Scala

This is off topic, but I'd suggest playing with Scala. In addition to ML family influences (immutability, pattern matching, tail recursion, static typing with type inference, optional lazy evaluation, etc...) it's also very true to the Smalltalk/Ruby vision: everything is an object, there are mixins (one can mix in multiple traits, which can contain implementation code, unlike Java's interfaces), and it's really easy to write your own control structures (despite the lack of macros).
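A tiny sketch of that blend (all names here are hypothetical, for illustration): a trait carrying implementation code, ML-style pattern matching on case classes, and a home-grown control structure built from a by-name parameter.

```scala
// A trait carrying implementation code -- unlike a (pre-Java-8) Java interface.
trait Greeter {
  def name: String
  def greet: String = "hello, " + name  // concrete method lives in the trait
}

// Case classes are immutable and pattern-matchable for free.
case class User(name: String) extends Greeter

// ML-style pattern matching, with type tests and guards.
def describe(x: Any): String = x match {
  case User(n)         => "user " + n
  case i: Int if i > 0 => "positive int"
  case _               => "something else"
}

// A user-defined "control structure": the by-name body is
// evaluated only when the condition is false.
def unless(cond: Boolean)(body: => Unit): Unit =
  if (!cond) body

println(User("ada").greet)       // hello, ada
println(describe(User("ada")))   // user ada
unless(1 > 2) { println("runs, because 1 > 2 is false") }
```

The `unless` example is the point about control structures: because `body` is passed by name, callers can write `unless(cond) { ... }` as if it were built into the language.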

Afaik, one of the implementers of Clojure's collections was also Odersky's student at EPFL, so there's talk now of seeing them back in Scala (it can also be done directly, e.g., http://github.com/codahale/yoink).

OCaml had shown to me that statically typed languages can still be very expressive and succinct. Scala shows that OO + static typing combination has been unfairly tainted by C++ and Java.

The only "issue" with Scala is that it does aim to be entirely backwards and forwards compatible with Java. There are countless obvious benefits (e.g., being able to use tools such as Jetty's websocket support, freely mixing actors together with j.u.concurrent, using mixins to enrich/"de-boilerplate" existing libraries/APIs), but it does mean that some features aren't as robust as they are elsewhere in the ML family (pattern matching is better in OCaml).
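As a sketch of that interop upside (trait and method names are hypothetical): a Scala trait with implementation can be mixed onto a plain Java collection class at instantiation time, enriching the raw Java API without wrapper boilerplate.

```scala
import java.util.ArrayList

// A trait that may only be mixed into something implementing java.util.List,
// adding a safer accessor on top of the raw Java API.
trait Peek[A] { this: java.util.List[A] =>
  def headOption: Option[A] =
    if (isEmpty) None else Some(get(0))
}

// Mix the trait in at instantiation time -- no named subclass needed.
val xs = new ArrayList[String] with Peek[String]
println(xs.headOption)  // None
xs.add("tail")
println(xs.headOption)  // Some(tail)
```

The self-type annotation (`this: java.util.List[A] =>`) is what lets the trait call `isEmpty` and `get` directly on the underlying Java object.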

It would be interesting to know if F# had to make the same "Faustian Bargain" (perhaps the CLR allows for more flexibility?); I'll have to play with it under Mono.

That being said, there is one 70s/80s concept that I'd like to see more widely used: dynamic typing, with optional static type hints -- to allow for judicious optimization. Common Lisp does this beautifully (at least with SBCL and CMUCL you can actually examine assembly code and see it match what you'd write by hand). Some Smalltalk VMs had been able to do this as well.

Clojure does do type hints, at least for Java interop. Interestingly, HotSpot VM is based on Strongtalk (one such Smalltalk VM). With Clojure re-awakening Strongtalk's ideas while running on what was once its VM, we may be coming full circle :-)


Hopefully off topic but interesting is OK on HN:

I've developed a strong "allergy" to "traditional" OO; perhaps/probably unfairly, since it's based on a lot of C++ work I did in the '90s, and C++ is as anti-dynamic (not just in typing :-) as you can get in the OO world (insert Alan Kay's comment here (I think it was about C++)) ... but it's also based on my recoiling in distaste from seriously learning Java for some Clojure related stuff -- too much accidental complexity. My tastes in languages run a lot more to the Scheme end of things (T may remain my favorite Lisp dialect and language of all time), so my current language interests are more like:

Play with Haskell since there's so much ferment there.

Learn a pre-OCaml ML so that I can grok all the ML based FP literature from the period before Haskell's ascent.

Help liberate Clojure from Java: as part of the Clojure in Clojure effort, work on what I call Turtle Clojure ("It's turtles all the way down"). While a Lisp traditionalist, I'm willing to "Get rid of cons!" (http://news.ycombinator.com/item?id=814632), but I'd like to see what can be done without Java in the middle. I can't imagine that Java's tuned GC is optional for something so seriously functional as Clojure, so that's what I'm looking at the most. Maybe an Appel-Ellis-Li GC can be efficient under Xen (it uses virtual memory for read barriers, but traditional OSes don't optimize that path (10,000 cycles circa 1990)).

The thing I find most interesting and challenging in the area of programming is the single address space multi-core SMP problem, which Clojure is most focused on and which is obviously very relevant at a time when I can buy an x86-64 4 core 8 MB shared L3 cache server class chip from Intel for as little as $200.

I have done the mixin thing with Flavors (proto-CLOS) and C++, and, yeah, the pure Java people don't know what they're missing.


Where would somebody like Gosling go to find a home?

I think it depends on the roadmap of Java.


No surprise here. It's too bad the IBM/Sun talks broke down, I think it would have been a much better marriage.


No surprise? Java is pretty big stuff in the corporate world that Oracle operates in, and had to be part of why they bought Sun. You wonder just what they're doing to piss off people like this so much.


Don't read too much into any one departure. Gosling has been working on Oak/Java for how long?

By the end of the year the pattern will be clear; me, I'm watching to see if they keep Fortress and Guy Steele.


That would mean the death of SPARC. How could this be good?

Well... It's better than Microsoft, I guess. The thought of a "Windows 7 Server for SPARC Enterprise Edition Plus" is frightening.


I'm guessing you're very young. NT used to be available on x86, MIPS, AXP and PowerPC. It was developed not on x86 but on the i860, a RISC processor from Intel. There were plans for a SPARC version too, but endianness issues in the HAL meant it was never performant.

The only reason Windows only runs on x86/x64 now is that customers weren't interested in it on other platforms. Microsoft really tried to make it cross platform.


> Microsoft really tried to make it cross platform.

They just forgot to port Office. IIRC, the then-current release of Visual Studio would run on x86 but compile to the supported RISC platforms. That turned NT on RISC into a server-only platform. No Office and no development tools before the dawn of the web application meant certain desktop doom.


There are alternate histories regarding that: like how non x86 versions of NT were subpar and had a dearth of available software, and likely only existed to freeze the market for open-systems unix vendors, all of which had those CPU architectures as a strategic advantage.


I played with NT on Alpha. It was a blazing-fast workstation that had almost no software except Softimage.

Lack of Office killed desktop Windows on RISC.

On PPC, I remember SQL Server had some weird network bugs too.


Highly recommend the book "Showstopper!" about Dave Cutler and the making of Windows NT.


Glad HN is not Slashdot. Please elaborate on why selling more SPARC servers would not have been a good thing for the many Sun employees affected by the merger.


A huge market for high-end SPARC machines is enterprises, who use them to run Oracle RDBMS and SAP/other similar business software. IBM competes directly in this market with what used to be the AS/400 and System/390 (not sure I know what they're called now; afaik iSeries and zSeries).

This isn't really about technology, it's about convincing these enterprises that it's better for them to scale vertically on a single machine than to scale horizontally on commodity hardware. From a technical point of view this has long been proven false, but for certain companies it may be true from a business point of view: if technology is a cost center (i.e., it's called "IT", not "R&D") and you don't have the talent needed to operate a cluster of commodity hardware (e.g., there are either no operations engineers who can program in your geographic area, or they simply won't want to work for you), then leasing/buying big iron (especially if it comes pre-configured and with a support contract) starts to sound attractive.

Even if SPARC is superior hardware, the customer simply wouldn't care. It would be more profitable for IBM to cut SPARC off and continue only with POWER; the professional services involved are the lucrative part for the vendor.

People who do care about technology, have in-house talent and do HPC (scientific computing, machine learning/data mining, Internet companies with scalability problems) are best served by x86_64 (in almost all cases, excluding some types of computation), by ia64, and by IBM's own POWER-based 1U/2U servers (what used to be the RS/6000 series).


The OP hypothesized merger between IBM and Sun would have been better than Oracle/Sun. If IBM had bought Sun, they would probably have killed the SPARC because IBM has their own RISC CPU, the Power series.

Brian Aker says "I'm sure everything else Sun owned looked nice and scrumptious, but Oracle bought Sun for the hardware." http://radar.oreilly.com/2010/04/a-mysql-update-from-brian-a... which means the SPARC survives.

[edit: proper quote]


IBM maintains their mainframe business. While nobody's writing new applications for zSeries, much like SPARC, there are some very profitable, untouched apps that run on Solaris 8/SPARC, and they'll be around until the company gets the time and budget to port them.


Apart from seeing that beautiful, elegant hardware running a second-rate port of a second-rate OS, and the patents Sun owns ending up being auctioned to patent trolls so they could have a field day destroying all competition for Windows without implicating Microsoft, nothing.

Apart from that, it would be a good thing.


Guys... Even if you disagree (and you don't want to debate the issue and prefer to down-vote the comment) with Windows being a second-rate OS and with the claim that a SPARC port would not be a priority for Microsoft, you have to agree that having all those shiny patents covering all kinds of stuff auctioned off to trolls would impact everybody.

Not me, at least directly, because I live in Brazil and we have no such software patent nonsense, but tech business would be quite impossible where they are valid.


I doubt people are down-voting because of your views of the relative merits of this or that OS, but rather because the expression of those views is long on slogans and short on insight.


I tend to downvote posts if the author complains about being downvoted.

Also, not only were you "long on slogans and short on insight" but your argument doesn't fly. (IBM usually keeps the patents of the companies that it buys.)


I was speculating on what would happen if Microsoft bought Sun and decided to port Windows to SPARC.

IBM would want to keep the patents for itself, but Microsoft would probably wrap them in a do-not-hit-us term and sell them to trolls. This way they could undermine the competition while staying completely out of it.


> Microsoft would probably wrap them in a do-not-hit-us term and sell them to trolls.

Oh really? How about some supporting evidence?

The best evidence would include examples. After all, Microsoft buys companies all the time. If this is something that Microsoft will "probably" do, surely they've done it several times before.


Oh... You mean IBM-branded SPARC servers?

That would overlap with their POWER lineup. It's unlikely they would keep SPARC because it would eat away their POWER market share.

Don't know if it would, in the end, be bad for IBM, but the managers at the POWER side would pocket smaller bonuses. They would never let that happen.


Talks didn't break down; there were big anti-trust issues.


As I recall, didn't the Sun board say the price was too low?


Nope, there were serious legal issues.

Edit: Just to be clear, these two firms would essentially own the mainframe market (sure, IBM pretty much does already, but that doesn't mean you let them acquire more market share).


This doesn't bode well for (ex-)Sun products if Oracle is imposing significant cultural changes on former Sun employees.


Can anyone involved in or close to the Java/OpenJDK comment on what this might mean for Java/JDK/JVM?


Gosling himself strongly objected ("No fucking way") to being called "the father of Java".


Unless they log into all machines running java and break them, why does it particularly matter? What's the worry here?


I am just wondering how involved Gosling is in the JDK (in terms of roadmap, new features and such) nowadays ... or is it primarily driven by the community?


I don't think he has a close BDFL-style involvement with the Java SDK or standard library. From what I can observe, Java is, and has been, committee-driven since about 2000.

And Gosling is too expensive to write code. His time is much better used in conferences, motivating developers.


I find it funny how you can be too expensive to code. I wonder when was the last time billionaires like Bill Gates, Steve Jobs or Sergey Brin coded for a large project just for the love of coding.


Has Steve Jobs ever coded? I know Gates and Brin did at some point. A quick search revealed a quote from Wozniak: "Steve jobs never programmed in his life." http://arstechnica.com/apple/news/2006/10/5672.ars


Wow, I had no idea. Thanks for the info. According to Wikipedia, he worked at Atari creating circuit boards for games.


Legend says he sub-hired Woz for a fraction of what he was being paid.

At that time, games were "programmed" with hardware too, so it would not qualify as a programming job.


He used to do all the coding demos at NeXT: http://www.youtube.com/watch?v=j02b8Fuz73A (his little database app starts at 23:10)


I don't remember having seen his name for a long time in any technical discussion or decision involving Java, besides closures, where he co-authored the BGGA proposal (although it appears that Gafter and Bracha put more work into that one). Even then he apparently did not have enough say on the matter to make his blessed proposal official. His main role seems to be evangelist for the platform and perhaps researcher in some of the more involved areas, like real-time systems and numerical processing, AFAICT from his blog posts.


Will Gosling join IBM?


It seems logical he will end up at Google working on Android.


Google seems the best fit with people like Vint Cerf there.


That's a thought; he'd fit in very well at T.J. Watson.


Another elf leaves Middle Earth.


In case that didn't make sense to you, it's a reference to: http://steveblank.com/2009/12/21/the-elves-leave-middle-eart...


I think it's more likely to be a reference to Lord of the Rings. There's no need for the Steve Blank sub-reference for that statement to make sense.


He left Rivendell, sure, but maybe he's just moving to Lothlórien.


To Mr. Gosling: thank you for your work on Java!

I hope you take some much-deserved time off, then maybe end up at one of the newer brain trusts like Google.


Well said, mkramlich!


Will Gosling work at Apple ? (just kidding)


On the topic of Java, does anyone know what's happening with the Java app store? No doubt it will fail to reach critical mass with the public, but it's a shame. Had Java had a more presentable front end and had Sun been more commercial, they could have started the trend for apps many years back.



