The Deno Company (deno.com)
883 points by elisee on March 29, 2021 | 431 comments



Nice to see! How do they plan to monetize? Either I've gone blind or I missed it somehow. The article does say how they DON'T plan to monetize:

    "Rest assured that Deno will remain MIT licensed. For Deno to grow and be maximally useful, it must remain permissively free. We don’t believe the “open core” business model is right for a programming platform like Deno."
There are some hints though:

    "If you watch our conference talks, you will find we've been hinting at commercial applications of this infrastructure for years. We are bullish about the technology stack we've built and intend to pursue those commercial applications ourselves. Our business will build on the open source project, not attempt to monetize it directly."
Does anyone have some insight into those? I haven't watched any Deno talks (maybe one, actually?), so it feels a bit strange to make people watch technical talks to find hints about the monetization strategy.

PS, if I was a rich investor I'd throw money at this project even as a donation, so no complaints at all, but I'm very curious about the monetization plan.


Looks like https://deno.com/deploy will be a managed service - the implication seems to be that the default option will be to use their CDN to run code with an option to DIY if you prefer.


How does this business model survive Amazon AWS making a blog post, "Here's a template to run your Deno code on Lambda!"? They'll never beat AWS on costs in the long term. They can burn VC cash to stay afloat and try, I guess.


Lambda has a ton of caveats and limitations... but it seems their platform is full of limits as well:

https://deno.com/deploy/docs/pricing-and-limits

AWS sucks hairy balls at providing things that are simple for developers to use, so that could be their competitive advantage, but I'm just guessing here.


And Deno will also have a lot of limitations.


There are many, many companies competing with AWS on products that, strictly speaking, AWS also has.

I mean, does Zoom have no right to exist because you can use Chime?

There is always room for better UX, better support, a different approach, etc.


I always find the idea that AWS will just eat the competition a little silly as well. I've used AWS managed offerings that were far inferior to the alternatives.


So why were you using the AWS managed offering then instead of the alternative?


Usually because the company already pays an AWS bill. No one will care if you add another Lambda function, but to use an alternative you’ll have to get past some gatekeepers.


So there you go. Lots of friction for competitors.


And yet we've never seen anyone beat Amazon. Look at how much the PaaS area has churned over the last 10 years. Giants that stuck around like Docker are completely deflated and near worthless compared to their initial values and expectations. The smart ones like Heroku got out when the getting was good.


> And yet we've never seen anyone beat Amazon.

That largely depends on your definitions of "beat" and "win". There are plenty of software infrastructure firms out there that Amazon has yet to smash into the ground.

Nature seems to think (and I agree) simply existing is winning.


Please give examples. I would like to learn more about it (even if subjective)


DigitalOcean is a competitor of comparable size (even if it's like a pebble to a boulder): https://finance.yahoo.com/news/blavatnik-backed-digitalocean...


I use DigitalOcean and definitely vouch for them. They're way cheaper than AWS, too, with a nice, slick UI.


Lambda uses containers, whereas Cloudflare Workers use V8 isolates. V8 isolates are much, much faster and more secure for serverless functions.

Deno seems to be targeting Cloudflare as a competitor for their service... but it's probable that AWS will release a Cloudflare Workers competitor themselves if Deno continues with the MIT license.


> Lambda uses containers, whereas Cloudflare Workers use V8 isolates. V8 isolates are much, much faster and more secure for serverless functions.

You're right that V8 isolates are blazing fast, but Lambda runs functions in a microVM spawned by Firecracker [0], which is likely to be more, not less, secure than isolates [1].

[0] https://github.com/firecracker-microvm/firecracker/

[1] https://fly.io/blog/sandboxing-and-workload-isolation/


> Firecracker [0], likely to be more, not less, secure than Isolates [1]

This is debatable. It's true that V8 is a much larger attack surface than Firecracker, therefore likely to have more security bugs than Firecracker itself. However, Firecracker runs attacker-provided native code directly on hardware, which means that hardware itself becomes an attack surface, one that is quite wide, not fully documented, and very hard to patch if problems arise. It's much easier to work around hardware bugs when you're working from JS / Wasm and can control the code generation.

Ultimately I don't think you can really say one or the other model is more or less secure.

(Disclosure: I'm the tech lead for Cloudflare Workers so I am obviously biased here.)


Thanks Kenton.

> Firecracker runs attacker-provided native code directly on hardware, which means that hardware itself becomes an attack surface, one that is quite wide, not fully documented, and very hard to patch if problems arise. It's much easier to work around hardware bugs...

I see your point. I mean, Google wouldn't be putting as much effort as they are into gVisor if KVM were the best possible answer.


To be fair, gVisor also runs native code directly on hardware. Any modern VM-based system is still depending on the CPU to enforce boundaries. A big CPU bug could ruin that at any time. (Spectre has been pretty bad, but not quite a showstopper...)


They can always get acquired by Amazon first.


It wouldn't surprise me if that's the eventual end goal. Amazon is looking for good Rust talent too from what I've read.


That’s the goal of almost all startups. Your chances of being acquired by a FAANG company are millions of times greater than of going public. It is almost always the exit strategy.


They compete much more with Cloudflare Workers than with Lambda. So much so that Deno Deploy is Workers-API compatible.


Workers is now a much more capable platform. It supports eventually consistent KV storage, caching of large files, longer runtimes (30m+), WebSockets, Wasm executables, and distributed mailboxes à la Erlang.

Lambda integrates with existing AWS services, whilst Cloudflare invents newer services to go along with the serverless-first paradigm. Different strategies, but they do compete with each other.


I'll be very surprised if "quick and easy hosting" is still a viable business model, considering how crowded the space is by now.


Eh, Vercel pulled it off rather well with Next.js. Given that Guillermo Rauch is an investor in Deno, I wouldn't be surprised if they partnered in some way.


Quick, easy, and cheap hosting will always be a viable model, for varying definitions of cheap.


This is probably a stupid question, but is AGPL/commercial dual-licensing a viable option for something like this? Instead of relying on goodwill donations & maybe assigning 1-2 devs from major corporations, just explicitly charge them money if they refuse to ship source code to end-users.


Many companies would just never touch an AGPL package, and while there are other languages out there with commercial licenses (e.g. Delphi), those are not nearly as mainstream as the open & freely available ones.

Deno is competing against Node.js, which is MIT-licensed. Deno is arguably better, but it would have to be _so much better_ to get people to even give it a second look if it was commercial.


That's why you dual license, which is significantly more approachable. "This is AGPL/GPL unless you pay $200/month per developer seat" is a common licensing scheme for frameworks, people aren't scared by it.


In some places, the people who hold the purse strings are so far removed from the actual developers that it's a lot easier to just go with something that's "free".

I think this is one of the reasons why cloud computing monoliths like AWS are so successful. It's way easier to set up a $50/month VM on AWS than to get permission to spend $5/month on a VM elsewhere.


Enterprise sales requires paying salespeople for their network of potential customers and time to reach them. The sales cycle is long and risky for companies that can't absorb the cost of the sales salary and time to get revenue.

Unless you have a ridiculously good relationship with a handful of organizations or a perfect system for fitting into existing infrastructures, it's a big mistake to care about selling something to those organizations.

AWS is popular because it's free for companies with a high potential for getting big (like YC companies) and they can absorb the costs of the sales cycle.


> This is AGPL/GPL unless you pay $200/month per developer seat" is a common licensing scheme for frameworks, people aren't scared by it

Really? I can think of Qt - but that's such a huge body of work, with some very specific use cases in which it doesn't really have competition (aside from a DOM-based shell for UI maybe, but that has its own problems and is often a hassle to get working).

In this case Node is established; Deno is going to have an uphill battle to gain that mindshare even on equal footing, and adding a licensing restriction would probably kill it before it got off the ground.


I promise: No trolling with this question.

Can anyone name a few that come to mind?

I only know about one (Qt: GPLv2/3 or LGPLv3), but I don't use a lot of commercial software. Interestingly, Qt looks to be about 233 USD per month per seat!

Ref: https://www.qt.io/pricing


Unfortunately Deno gave up on their most unique differentiator, the TS [runtime].


We most certainly did not. TypeScript is and continues to be a primary concern for us, and we plan on continuing to support it as a first-class citizen of the ecosystem.


Is Deno still going for TS runtime?

> Deno is a runtime for JavaScript and TypeScript that is based on the V8 JavaScript engine and the Rust programming language.


Insofar as we take `.ts` files seamlessly, yes -- though to be clear, one does not simply "run" TypeScript. There are no runtimes for TypeScript directly (there's AssemblyScript, which _looks_ like TypeScript but isn't exactly TypeScript).

We've simply incorporated the type-checking and transpiling steps into the deno cli, making it super simple to get going, no config needed.
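
For instance, running a TypeScript file needs nothing beyond the CLI itself (a trivial illustrative example, not taken from the Deno docs):

    // greet.ts -- run with: deno run greet.ts
    // Deno type-checks and transpiles this on the fly; no tsconfig, no build step.
    const greet = (name: string): string => `Hello, ${name}!`;

    console.log(greet("Deno"));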


This is actually quite confusing, because

> A secure runtime for JavaScript and TypeScript.

shouldn't require caveats, yet the caveat here is that there is no runtime for TS.

There is reason to believe that the extra information given by TS typings could be carried into runtime wins, and so there is absolutely pent-up demand for an actual TS runtime.


Exactly. And I guess the package still needs to be compiled to JS before deploying to production, to avoid shipping a full TS compiler on a production server, which would be a crazy thing to do.


I don't think that would be that crazy, especially if it's in the stdlib.


It seems like a huge amount of code and potential vulns to bring into production for no good reason.


Why do you need a full TS compiler? You can just strip the type annotations, à la Babel.


No we didn't?! Where did you get this info from? Deno is made to work with TypeScript out of the box. The Deno Standard Library is written in TypeScript: https://deno.land/std. Most userland modules are TypeScript too.


If it helps, I think this is the main source of misinformation:

https://startfunction.com/deno-will-stop-using-typescript/

It featured on HN some time back:

https://news.ycombinator.com/item?id=23592483


I do find the messaging to be confusing, because what would you think I meant if I talked about the runtime of Elixir or Scala?

This is the marketing line on the front page of the Deno project:

> A secure runtime for JavaScript and TypeScript.

They are using the same term "runtime" to describe JS as well as TS. All replies here by Deno team members use the language "Typescript support" but nobody here is saying TypeScript runtime.


I can understand where the confusion comes from, but I think it's a matter of context. I would never have thought that they would want to build a runtime that runs TS directly. The effort would just be insane.

Hence the implied step that TS needs to first be transpiled to JS before getting executed in the runtime was always in the back of my head and I'm guessing this is the way most people think as well.

And from a black-box perspective, whether TS first gets transpiled to JS or run directly doesn't make a difference. In that sense one could argue that it is a "TS runtime". This is a petty semantics-fight I don't want to get into though and I've probably already said too much :).

If you are interested in something that gets close to a TS runtime though, have a look at AssemblyScript [0]. It compiles to WASM and tries to keep most of the syntax of TS. Very interesting project imo.

[0]: https://www.assemblyscript.org/


Wasn't a TypeScript runtime the original Deno selling point? Didn't the project have to pivot away from that?


No. Nothing has changed about Typescript support since 1.0. It is just as well supported as JS. We never removed support...


So Deno is still committing to TS runtime?


I feel like people are talking past each other here.

To me "TypeScript runtime" means running TypeScript in a mode that runs faster than JavaScript by actually using the type info to generate better machine code. It does not mean translating to JavaScript at runtime and running in V8.

For example, at one time there was an experimental version of Chrome that had a Dart runtime. A quick Google finds this HN thread from when they decided NOT to have a Dart runtime:

https://news.ycombinator.com/item?id=9264531

So can we get clear? Does Deno plan to execute TypeScript natively (not via V8, or via a heavily modified V8), or is the plan to continue compiling TypeScript to JavaScript internally and actually just run JavaScript?


That's what I understood as well, and now I'm confused at the messaging.


Y E S


Heck yeah


I don't know where I got so misinformed. I thought Deno had stopped going for a TS runtime.


Unfortunately, the words "Deno", "TypeScript", and "removal" made for good headline material, and what was effectively a design document about performance optimization [1] was misinterpreted by many as heralding the removal of TypeScript from Deno, despite the document being updated with an explanatory warning that it was a very deep technical document about a specific part of the architecture, and that Deno remains completely committed to supporting TypeScript forever.

1: https://docs.google.com/document/d/1_WvwHl7BXUPmoiSeD8G83JmS...


What does a TS runtime mean anyway? Isn't checking types at run time rather than at compile time a pure loss?


No, because TS hints should also be able to translate to runtime savings.


They had to remove it from the Deno build process so Deno internals didn't need a compiler. However, TypeScript support remains untouched and receives constant updates.


I don’t remember “TS runtime” as you’re meaning it ever being pitched. I don’t think Ryan would consider that in scope for Deno unless MS wanted to collaborate on it.


I would say no. I think they should go public domain, as that's the future.

One of the funniest/best business models out there is SQLite's (https://sqlite.org/copyright.html). They give it away to the public domain, but some lawyers wrongly claim that that is not enough, so they will "sell" you a warranty asserting it is all public domain.


Those lawyers are correct.

Not all jurisdictions of the world legally recognize the concept of 'public domain'.


There does exist the CC0 license [0], which attempts to alleviate this, but I haven't heard whether it's been tested in the courts.

[0]: https://creativecommons.org/publicdomain/zero/1.0/


Patent/copyright insurance? Sounds good, and not difficult to get a check written for, considering how much is spent on the service of open-source scanning.


I guess the "commercial applications" should be https://deno.com/deploy


Glad to see I'm not the only one having a hard time figuring out just how they intend to make money. My best guess was that they will offer a tailored Deno to companies that want to embed it (my comment about it is swimming somewhere here).

Other than that, there is the following vague statement at the end of the post:

> The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains.


Ah, I read that statement at the end as meaning it'll enable front-end devs to also write back-end code, like Node.js but better (the APIs follow the browser more than Node.js's do), but I didn't think it'd be related to the monetization?


Sounds to me that they will build some sort of hosted service or maybe PaaS based on the Deno runtime (like AppEngine or AWS lambda), but yeah it's pretty vague.


Looks like CDN with Serverless edge functions.

https://deno.com/deploy


Heroku doesn't (yet) officially support Deno. That's the main reason I don't use Deno for my personal site. So I'd say there's a market there.


> Sounds to me that they will build some sort of hosted service or maybe PaaS based on the Deno runtime (like AppEngine or AWS lambda)

or like Joyent


How did Joyent manage to turn Node.js into money though? I mean, weren't they mostly selling Unix services?


Looks like a crowded space.

VCs will issue a course correction very soon.

Looks like a repeat of Docker.


In other words, capitalise on the decreasing quality of software engineers by offering services to them at a high price because they don't have the ability to make any of these services themselves.


This is a rather naive take I think..

There's a semi-joke that goes "all companies are software companies now" but there is a lot of truth to that..

It comes down to where a company wants to put its effort.

Do you want to own/operate/maintain all your infrastructure and services (and all the challenges that come with retaining scarce talent that is capable of that work), requiring you to substantially invest in an area that is not core to your business?

Or do you want to spend that money (and let's be real, less money too) to pay a trusted third party provider who will do a better job than you would anyways, because that's their area of expertise?

Then you can focus your (likely limited) resources on delivering value to your business instead.


Any of the big 10 tech companies wouldn't mind owning this if it becomes the standard way of developing back end JS. It gives them influence and the ability to lock out competitors.


I know this isn't the SV-approach to things but maybe they intend to offer consulting services for those looking for some expertise with the ecosystem, bill hours, and retire in their 50s as millionaires instead of billionaires.

This might be surprising, but not everyone is looking to be a unicorn.


The article says they "raised 4.9 million dollars of seed capital" so I expect their investors will want a decent return over the next few years.


Happy to see Deno get some financial backing!

I've been building my new multiplayer games website [1] with Deno over the last 4 months and apart from some minor growing pains, it's been a joy to use.

The lack of unnecessary package management and the TypeScript-by-default approach make Web dev much nicer. We're also using TypeScript on the client side, relying on VSCode for error reporting. We use sucrase to strip the types just as we're serving the script files, so that there is no extra build time; it feels like TypeScript is Web-native, and we can share typed code with the server.

[1] Not yet launched but we ran a preview past weekend with hundreds of players over WebSockets: https://twitter.com/MasterOfTheGrid/status/13757583007179735... - https://sparks.land


https://sparks.land doesn't load properly on mobile (iOS, Firefox)


Ah thanks for the heads up. It requires WebGL 2, which isn't yet in iOS's Web engine I believe? And IIRC all browsers on iOS have to use that engine. It does work on Android.


No WebGL 2, but a lot of WebGL 2 extensions are supported. The biggest omission for me is not being able to render to float textures. (Although a lot of Android devices can't do this either.)


Peeking at sparks.land I see that you're serving .ts files. I assume that's what you mean by using sucrase: you're transpiling "live" instead of building/deploying bundles offline?

I notice your script files are all pretty small, have you run into any upper limits on performance or scalability so far with this approach?


Correct! In production we've got Cloudflare in the middle, so we're only using sucrase on-the-fly for each .ts file during development. So far it's unnoticeable in terms of loading times.

> I notice your script files are all pretty small, have you run into any upper limits on performance or scalability so far with this approach?

Not that I can tell. But if we need to, we can always do a minified bundle in production later on. So far it's just nice to not have to even think about it!


Wait, so you're running Sucrase in a Cloudflare Worker?

It compiles, and then caches the output I assume?

That's a really cool use case I hadn't thought of..


Not quite, I'm running Sucrase on my Deno HTTP server: if the extension is ".ts", I put the file through sucrase before serving it as text/javascript. In development, it happens every single time I reload (and it's fast enough that I don't even notice). In production, Cloudflare requests the .ts file from my server once (triggering sucrase), and then caches it.
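
For the curious, the serving logic is roughly something like this (a simplified sketch, not the actual production code; it assumes the std library's async-iterator HTTP server of that era and that sucrase can be imported from a CDN such as esm.sh):

    import { serve } from "https://deno.land/std@0.91.0/http/server.ts";
    import { transform } from "https://esm.sh/sucrase@3";

    const server = serve({ port: 8080 });
    for await (const req of server) {
      const path = `.${req.url}`;
      if (path.endsWith(".ts")) {
        // Strip the type annotations and serve the result as plain JavaScript.
        const source = await Deno.readTextFile(path);
        const { code } = transform(source, { transforms: ["typescript"] });
        req.respond({
          headers: new Headers({ "content-type": "text/javascript" }),
          body: code,
        });
      } else {
        req.respond({ body: await Deno.readFile(path) });
      }
    }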


Is the VSCode support good? I tried using Deno with WebStorm a few months ago and it wasn't a great experience.


It's getting there! They finished a rewrite of the extension recently and it's quite nice.

If you're on Windows like me, sadly there's still a nasty bug with path mismatches between the LSP server and the VSCode extension (https://github.com/denoland/deno/issues/9744) which requires reloading the window to fix spurious errors, but I'm sure it'll be fixed soon enough.


The JetBrains extension hasn't been updated much since release and doesn't interface with the official LSP. The experience is poor and outdated.

The VSCode extension is maintained by the official team and will provide the best experience. There are unofficial plugins for Sublime and Vim; they use the LSP too and provide a comparable experience.


Hey, the Discord invite link is not active anymore.


Which one? The home page one seems to work for me. Otherwise try: https://sparks.land/discord


Are you running multiple cores/threads of Deno? If so, how are you holding/communicating state server-side?


There's a central Deno server program called the switchboard, which serves static content, runs a small REST API for account management / login, and starts a WebSocket server for game servers to register themselves.

Each game server (a stand-alone Deno program that might or might not run on its own machine) connects to the switchboard over websocket and authenticates itself with an API key (since people will be able to make their own game servers).

When a player wants to join a server, they POST a request to the switchboard, which gives them back a token that they can send to the game server after establishing a WebSocket connection to it. The game server checks the token with the switchboard and gets back public user account info if it's valid.

Each game server's logic is currently single-threaded. I guess we might end up offloading some work to WebWorkers later on.

A server can publish some state info through the switchboard that will be broadcast to other servers from the same user. This is used to show player counts in game rooms from a lobby, things like that.

I run the whole thing on a couple of cheap Scaleway servers, with Cloudflare in front (no AWS nor containers or anything of the sort). My previous platform, built with Node.js (https://jklm.fun), is able to sustain at least 2000 concurrent players like that, though admittedly those are board-like games which are not very demanding, unlike the games for Sparks.land which will be more fully-fledged... so we'll see how that holds up!
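
A sketch of the join handshake described above (all endpoint names and URLs here are made up for illustration; the real API is certainly shaped differently):

    // Client side: ask the switchboard for a join token, then present it
    // to the game server over a fresh WebSocket connection.
    const res = await fetch("https://switchboard.example.com/api/join", {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ room: "grid-1" }),
    });
    const { token, gameServerUrl } = await res.json();

    const ws = new WebSocket(gameServerUrl);
    ws.onopen = () => ws.send(JSON.stringify({ type: "auth", token }));

    // Game server side: verify the token with the switchboard before
    // admitting the player; a valid token yields public account info.
    async function verifyToken(token: string): Promise<unknown | null> {
      const check = await fetch("https://switchboard.example.com/api/verify", {
        method: "POST",
        headers: {
          "content-type": "application/json",
          authorization: `Bearer ${Deno.env.get("SWITCHBOARD_API_KEY")}`,
        },
        body: JSON.stringify({ token }),
      });
      return check.ok ? await check.json() : null;
    }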


Thanks! I run something similar that can only really do 300 players before it starts to lag badly, but TBH it needs to be rewritten, as it's all single-threaded and one process controls the lobby, every game, all chats, etc. Don't do what I did -_-


Haha, I feel your pain. I did the same initially; it took a few rewrites over the years to get to something I'm happy with that runs well.


For completeness, you should check out Elixir and Phoenix (channels and presence) for the server. Easy websockets, isolated player processes, non-blocking VM, plus deep introspection for debugging issues. https://youtu.be/JvBT4XBdoUE. We see more and more indie games being built with Phoenix LiveView.


What I find most exciting here is:

> Our infrastructure makes it possible to... create custom runtimes for different applications [like] Cloudflare Worker-style Serverless Functions

Fascinated to see what happens here. The serverless / edge compute paradigm fits Javascript hand-in-glove philosophically, but until now it's always felt quite clunky to me. When I've tried it out, I've always been left thinking "but this would just be so much easier with a server".

Reading this has made it click for me why that is. A new paradigm needs a new set of tools native to that paradigm.

The entire server-side JS ecosystem is currently structured around Node, a fundamentally stateful-server paradigm. You can try to abstract over it, but only so far. It's not the serverless paradigm that's clunky, per se, it's that the tools right now were built for another way of doing things.

As a concrete example - Deno has an import system based on URLs, rather than on-disk node_modules. I thought that was a cool feature for convenience, less overhead, more open and sharable packages, etc. But now I realise the full intent of it. It's much more than all that, it's a fundamental shift in the paradigm: no implied dependency on stateful disk in the runtime itself.
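
Concretely, a dependency is just a versioned, fetchable URL that the runtime downloads and caches itself (the version here is only an example):

    // No package.json, no npm install, no node_modules on disk.
    import { serve } from "https://deno.land/std@0.91.0/http/server.ts";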


There are lots of things going on in this space. It seems every other day I discover another Cloudflare Workers-like implementation (granted, most of them are for testing/development). I'm cataloging them here for anyone who's interested: https://workers.js.org


I'm not against URL-based imports, but I don't quite understand: why is "no implied dependency on stateful disk in the runtime itself" a big deal?


> I'm not against URL-based imports, but I don't quite understand: why is "no implied dependency on stateful disk in the runtime itself" a big deal?

I think that lets you deploy to things like edge devices that don't have a hard drive, and even more ephemeral environments.


Yep, that will be something to follow. It will be really interesting to see what kind of market emerges here by allowing new custom runtimes to compete, with specializations for different niches and environments.


Their comparison to Cloudflare workers doesn't seem right. The benefit of cloudflare workers is that they run on the edges of a CDN. Putting Deno on one server wouldn't achieve that.


Then you put Deno on the edge: https://deno.com/deploy


Where is the edge in Deno terms though?

It's the same issue with Netlify, Vercel etc. where their edge is very different to say Akamai, Cloudflare or Fastly's edge.


> Extending web programming beyond the browser is not a novel idea. Indeed, we have done that with moderate success in our “Node.js” project. But over a decade later, we find server-side JavaScript hopelessly fragmented, deeply tied to bad infrastructure, and irrevocably ruled by committees without the incentive to innovate. As the browser platform moves forward at a rapid pace, server-side JavaScript has stagnated.

Wow, lots of bold statements there. And another one for the usual "JavaScript is fragmented, let's create another tool to fix 'em all."


Ryan Dahl (author of the announcement post) is the creator of Node.js, so I think he's got a right to say these things! Also see "10 things I regret about Node" [0]

[0] https://www.youtube.com/watch?v=M3BM9TB-8yA


Ryan Dahl left the leadership of the Node.js project pretty early in its development. A lot of people can be considered "the creator" of Node.js, to be fair.


While I'm not going to argue the history of if/when he left the project, my understanding is that it's fairly agreed upon that he is the creator of Node.js. If you google "node js creator", he's an embedded answer (not a search result).

The first line on Wikipedia in the nodejs history section is "Node.js was written initially by Ryan Dahl in 2009".

You can make whatever point you want, but maybe try to do it without rewriting history, or changing the agreed-upon definition of "creator".


> If you google "node js creator", he's an embedded answer (not a search result).

Embedded answers are worse than useless.

https://www.google.com/search?hl=en&q=who%20invented%20hands


Finding an exemplar of a bad response does not make the entire category "worse than useless".


The result is relevant and the information is (presumably) correct: "Introducing hand disinfection standards"


>Ryan Dahl left the leadership of the Node.js pretty early in its development. A lot of people can be considered "the creator" of Node.js to be fair.

Don't think so. I have been using Node.js since 2010 and following the ecosystem. Node.js is the child of Ryan Dahl. Other than Ryan, perhaps Isaac Schlueter is the most influential person; he developed NPM as a separate project that was later merged into Node. Ryan could have done it himself or embedded it in Node.js, but guessing from Deno he is not keen on a centralised package repository, so he could have let Isaac build NPM as a separate project.

From what I can observe, the overall API is still pretty much the same as in 2010. What's changed is V8 getting updated to newer versions (and thereby supporting new ES features) and other performance optimizations. But no one can deny or take away the creator credit from Ryan.


No. Ryan Dahl was the initial individual creator of Node.


Yeah, no kidding.. shots fired...

As someone who uses Node but doesn't closely follow the steering/proposals side of things, I can't say I had this impression of that process..

Is Node really that bad compared to how JS/ES is innovated on in the browser?


That's very subjective

As an outsider who likes to lurk, I have the impression both ECMA and Node committees are stuck in the "we're nice and therefore right" field.

They made technological choices that broke the platform (e.g. require vs import). It took ages of pain to innovate on things that matter (e.g. promises, async).

I wish Deno the best, but I'll just try to stay away from JS from now on (same as I've been avoiding Python after the 2to3 migration).


Yeah, the dream of being able to share at the very least data types between client and server side is pretty elusive. Most projects you could probably build twice before you could get the client and server side sharing much code. Also it's just shocking to me how averse the whole Node.js community is to shell commands like

https://guides.rubyonrails.org/v4.2/command_line.html#custom... https://docs.djangoproject.com/en/3.1/howto/custom-managemen...

I guess, though, the people in the Ruby / Django community are actually building projects while the Node community is padding their resumes with new npm packages.

I hope Deno or Dart can make the JS world better, but I suspect they will both end up just making it more fragmented. One thing I really like about Flutter / Dart is that they do try to rate / rank / categorize support of their packages aside from stars https://pub.dev/packages?sort=popularity / downloads, which are easy to manipulate. IMO Deno should do this as well, or let someone willing take over as the primary package manager https://deno.land/x/lodash@4.17.19


The NodeJS community isn't averse to shell commands as you described; far from it I'd say. But a main differentiating factor between the Node and Python community is that Node folks like to compose things from lower level functionality a bit more, and are much more into functional paradigms and patterns. It helps that in Node, everything has essentially consolidated to using Express and Connect-style middleware, so interop between frameworks and libraries is high, and it's easy to go from a lower level basic framework to one that has batteries included (to put it in Python analogies: going from NodeJS "Flask" to NodeJS "Django" is easy because in Node "Django" runs on top of "Flask" and has support for the same middleware pattern, so all "Flask" middleware works fine if you switch to "Django"). The higher level batteries included frameworks almost always ship with their own CLI and tooling that closely matches what you'd find in something like Django; take a look at Nest.js for an example of such a framework.
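
For readers outside the Node world, the shared contract being described is the Connect-style (req, res, next) middleware signature; any function with that shape can be mounted on any compatible framework. A minimal sketch with Express:

    import express, { Request, Response, NextFunction } from "express";

    const app = express();

    // Connect-style middleware: this same function could be mounted on
    // Connect, Express, or an Express-compatible framework unchanged.
    app.use((req: Request, _res: Response, next: NextFunction) => {
      console.log(`${req.method} ${req.url}`);
      next();
    });

    app.get("/", (_req: Request, res: Response) => {
      res.send("hello");
    });

    app.listen(3000);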


> "we're nice and therefore right"

Golden phrase to describe a lot of folks. If I were to translate, it's behaving pleasantly and politely in order to justify poor decisions.


Interesting... thanks for sharing some insight.. :)


Node is only just now getting around to adding promise-based APIs to the core. Any time you interact with the Node core APIs you have to drop back to using callbacks - that's at least one area where Node is behind the times.


util.promisify() seems to work for much of the Node stuff I use a lot, like pipeline(), readFile(), etc. Perhaps require('xyz/promises') is a newer development.
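
For example, both of these patterns work (a sketch; assumes an ES module context for the top-level await):

    import { promisify } from "util";
    import { pipeline } from "stream";
    import { promises as fs } from "fs";
    import { createReadStream, createWriteStream } from "fs";
    import { createGzip } from "zlib";

    // Wrap a callback-style core API by hand...
    const pipelineAsync = promisify(pipeline);
    await pipelineAsync(
      createReadStream("input.txt"),
      createGzip(),
      createWriteStream("input.txt.gz"),
    );

    // ...or use the promise-based namespace where core already ships one.
    const text = await fs.readFile("input.txt", "utf8");
    console.log(text.length);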


Do you really want to keep util.promisifying everything? I know I don't.

Streams (and event emitters, and the ecosystem bits that rely on them) still haven't fully caught up with promises either, e.g. async iterators and generators are still quite awkward


Late reply. I've sort of experimented with async iterators, using pipeline() to pipe them together, and it seems to work out for the most part; I'm not sure what is awkward about them. I mean, consume the source with an async iterator passed as an arg and you can throw at any time to kill it with an error. In fact I find the new async generator transforms easier and more ergonomic. In terms of event emitters, I can just use `await once(emitter, 'event');` now. Not sure how .on would be expected to work outside of it being some kind of async iterator.

I mean, I don't mind using util.promisify or just importing the "to be" xyz/promises paths. The code exists clearly, either behind a symbol or otherwise.
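
A sketch of both patterns, for anyone who hasn't tried them (assumes Node 15+ for stream/promises):

    import { pipeline } from "stream/promises";
    import { once, EventEmitter } from "events";
    import { createReadStream, createWriteStream } from "fs";

    // An async generator works directly as a pipeline transform stage;
    // throwing inside it tears the whole pipeline down with an error.
    async function* upperCase(source: AsyncIterable<Buffer>) {
      for await (const chunk of source) {
        yield chunk.toString().toUpperCase();
      }
    }

    await pipeline(
      createReadStream("in.txt"),
      upperCase,
      createWriteStream("out.txt"),
    );

    // One-shot events are just an await now.
    const emitter = new EventEmitter();
    setTimeout(() => emitter.emit("ready", 42), 100);
    const [value] = await once(emitter, "ready");
    console.log(value); // 42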



Why util.promisify `readFile` when `fs` exports promises?


Of course. It’s just relatively new to users; only since 2019. To boot, one of the current LTS versions (the oldest) does not include it.


Or use the ubiquitous NPM to install a promise-oriented package... I don’t see this as a big problem honestly.


I'm not sure what you think could qualify Node as not keeping up with the times, if not implementing the modern standard for async interfaces isn't it.


I agree with you that it would be nice if more of core supported promises, but I think there are probably plenty of examples of them keeping up with the times or charging ahead as well. Top-level await and optional chaining are a couple features that come to mind.


True, the language-evaluation side is keeping up. But isn't that just a case of integrating newer versions of V8? NodeJS doesn't implement its own JS engine.


I hear you, but I'll also give them some credit—you can't create anything worthwhile without a clear vision which basically requires an opinionated sense of purpose.


Right behind the giant understatement of calling Node a “moderate success.”


A moderate understatement, if you will.


It feels like the old io.js vs Node.js split. I hope this time too the two ecosystems get merged, but this time it feels much harder for that to happen.


Wasn't that more of an ideological than technical split?

Deno is more of an ecosystem reboot than a fork, I would think.


Yeah, with "it feels" this time I was literally trying to express the feeling of "oh no, another big divergence in JS", like it felt at the time (even if it was indeed an "ideological split" as you say; I clearly remember at the time a lot of blog posts about how it meant that in the long run we COULD have different ways of doing things / calling APIs / building frameworks / etc...)


Fair enough.. I agree that we'll have to see where this goes, and hopefully a technical effort doesn't get overshadowed by an ideological one..

Because you know this blog post and announcement will draw some strong reactions from entrenched interests.


You cannot always cleanly differentiate these two. You could argue that "Node vs io.js" was an "ideological" split because they wanted different governance, but one big reason they wanted different governance is to make different technical choices (for example, track upstream V8 more closely), so you could also claim this was a "technical" split too.


[flagged]


It doesn't really apply here.

When you create something in a field that already has other entrants, that's a completely different thing from trying to unify all the other entrants, and ending up just creating another also-ran.

There's nothing wrong with identifying an area that you think you can contribute something innovative to and building something opinionated for it.


Literally everyone has seen this. There's no chuckle.


Deno makes sense in a variety of situations. The build pipelines of Typescript are excessively complicated and Deno hides that complication away (less dev effort).

Furthermore, Node has its own maintenance/risk issues in production systems (think permissions), and Deno reduces those with custom-built runtimes.

I cannot see it replacing Node though. Node has created a vast ecosystem that includes modules (npmjs), client-side bundlers (e.g. webpack), server-side frameworks (e.g. express), etc. But because Deno solves some of the issues for those who run sensitive code in production (e.g. Lambda functions), it'll most likely become another VM option on the public cloud providers' list.

All in all, the JavaScript interpreter is becoming something like the JVM. Everyone wants to use it but without writing vanilla JavaScript.


This probably either sounds nuts or over-opinionated, but I don't think the NPM ecosystem is as valuable as people think it is... stuff is deprecated continuously anyway - when you find a way to move away from 10k dependencies (because your shortsighted previous self decided to depend on a single package without looking closer), it's a damn relief. I hate how needlessly complicated the NPM ecosystem is; I would actually like it if no one built bridges to Deno, so it could be a clean start with lessons learned.

Most of the packages on NPM are complete garbage - most, not all - (and I say this as a full-time JS dev who actually likes JS). Liking the technology does not have to mean liking what people do with it, and JS is the worst, both in terms of the crap on NPM and the web.


Your thoughts and experience resonate with me.

Although I would look for similar examples elsewhere to understand where it is all headed. And one of those could be the rise and adoption of the JVM. There are a bunch of old dependencies, multiple repositories (compared to just one popular npmjs), a couple of build tools (ant, maven, gradle) compared to npm, yarn, then multiple languages (invented for somewhat similar reasons as Typescript). One thing is sure that the dependencies stay somewhere for-almost-ever, and all else kind of evolves around it.

It seems that the main problem at the moment is the developer (me, you, everyone) who instead of writing 20 lines of code uses a dependency that has 10 other transitive ones. You cannot fix that, but you can challenge those situations at work and reduce the tech debt for good. Maybe there should be a build tool that throws errors depending on the number of dependencies. Maybe there should be a frontend framework which only needs 10 dependencies for the helloworld instead of 1500; otherwise we kind of think that if a boilerplate template has 1500 deps then surely adding another 100 will not do much harm.


> I don't think the NPM ecosystem is as valuable as people think it is

Its value is less about the existing dependency tree of libraries and more about how many apps directly depend on libraries that are npm-hosted.

There are many 200+ kloc apps out there making millions/yr each that deeply depend on libraries and frameworks hosted in npm.


I mostly agree with this, after using Node for the last several years. IMHO the primary valuable component is making it free to implement the commonest kinds of concurrency correctly. I.e. most programmers don't even know they are doing it, other than "Oh I have to use `await` here". Even in something as simple as Go, you still have to intentionally set it up. And in Java, I run into very slow, very low-volume apps all the time because programmers don't realize they added a blocking call somewhere.


> I don't think the NPM ecosystem is as valuable as people think it is

I generally think the NPM ecosystem is pretty cool, but I have noticed that even simple stuff like creating a RESTful service doesn't seem to have stabilised in the ways I would have expected given Node's target audience, and you end up writing a lot of the boilerplate yourself. Hopefully this will result in hundreds of replies telling me how I should be doing it, but even the fact that it's not trivial to find that out is a sign of a fragmented ecosystem.


> Most of the packages on NPM are complete garbage

And don't do anything. I'm amazed by how often I read the source for a package I'm interested in, only to find it's about 10 or 20 lines of code.

The convenience of adding a package means that if you're not sure how to do something, you can easily just add a package to do it rather than figuring out how to do it yourself. A lot of the time, you can just read the source and adapt it rather than installing the package.


So if you need that package in your 10 projects, do you copy those 20 lines of code in your 10 projects? Yikes. Then what happens when it requires a change? Do you make the change in all your projects? Double yikes. When it becomes 30 lines, do you still copy/paste those 30 lines in all your projects? Do you also copy/paste the tests related to those 30 lines in all your projects? ...

Having reusable code wrapped into packages is basic software engineering.


Or you create a library which contains a collection of those 10-20-line modules and share it among your projects.


Yes I agree it's better to have one library with 20 helper functions than 20 libraries with 1 function each. Think Lodash.

Although some people will then complain about the size of the library...


Yes. Every Lodash function is available in its own npm package too. I use it like this; I'd rather have 4 lodash packages than the whole big lodash library in one package with almost everything left unused.
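
Concretely, the difference looks like this (the per-method packages here are the lodash v4 era ones):

    // Whole library: one import, with most of it left unused.
    import _ from "lodash";

    // Per-method packages: one function per npm package.
    import debounce from "lodash.debounce";

    const onResize = debounce(() => console.log("resized"), 250);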


These will go away in v5 (if a v5 is ever released), and they're not recommended if you're using a bundler, as they'll end up wasting space rather than saving it, since they include duplicate copies of code that the individual functions could otherwise share.

https://lodash.com/per-method-packages


Thanks for the information. I think it will be about time to completely remove lodash from my code bases.


JS libraries only did that to reduce bundle size being sent to the browser. It’s done out of necessity, not because it makes any sense.

Exporting out one method at a time is pretty absurd.


Lodash has a lot of features that are not needed anymore or even dangerous sometimes. If you don't need them, why would you install them?


Because it’s simply not a big deal on the server.


I'm not afraid to say I agree with you; I'm not sure why you are being downvoted... This is one of the attributes that is part of my argument against NPM: packages which are at the extreme end of too small, delivering almost no value at the cost of a lot of risk.

This is also an opinionated subject, and as with most things in life there is a balance somewhere in between the two extremes. However the NPM packages you refer to (and yes, I have seen the <10-line offenders you refer to) are clearly at the stupid end: extremely useless shared code amounting to the most trivial Stack Overflow JS examples.

In response to the sibling comment on duplication: all I can say is that it cuts both ways; deduplication to the extreme also has downsides. When someone changes that code you don't control, it now affects a large number of dependents that the author may not care about. But even ignoring those issues and assuming you package-lock everything and never look back, extreme deduplication in general can result in highly illegible code due to a lack of holistic view: whether it's achieved with thousands of tiny external packages or thousands of tiny 3-line functions in the same file - many have covered this topic in detail before, deduplication for deduplication's sake never ends well, and abstractions always have a cost.


I think the JS ecosystem has such a huge amount of raw developer-attention that even if Deno only gets 10% of the market, that's probably enough to bootstrap all those foundational packages/frameworks.

Also: if you're porting from Node, the only things that really have to change are the system/IO API calls. The import style is a bit different but that could pretty much be automated. It's still just TypeScript at the end of the day; your core logic will be the same.


> All in all, the JavaScript interpreter is becoming something like the JVM. Everyone wants to use it but without writing vanilla JavaScript.

Except not, because lots of people prefer to write JavaScript, and those other JVM languages are usually less verbose rather than more verbose.


But why did people create CoffeeScript, TypeScript, Kotlin (which compiles to JS), and many more? Why are we compiling ES7/ES8/ES9 down to a compatible version of basic JavaScript then? Why do we use Wasm?

I think this shows the lack of willingness to write vanilla, all-compatible JavaScript. Also it seems people use new features without understanding how those get compiled down to a lower version.

Sorry, when I said vanilla I did not mean ES9.


I think raw (framework-less), modern (no J2EE, Spring) Java is most likely a much better language than raw, modern Javascript.

Plus, aren't Deno libs compatible with Node libs?


> much better language

Maybe, but who cares when your code does not execute instantly on Lambda and requires you to use GraalVM to convert bytecode to a binary? Besides, you have a lot of people who would say Kotlin is better.

JavaScript, like Python, wins here. Well, JavaScript kind of wins because it is interpreted, but devs do not use it directly but rather with a bundler/compiler.

> Plus, aren't Deno libs compatible with Node libs

Not really. If your dependency relies on, say, the `https` module, then it will not work on Deno, as Deno does not have it. And in the case of Deno modules, they are not on `npm` to begin with, and even if you import them, they are written in TypeScript.


Better language or not, Java is the reigning king in enterprise companies. We built our platform on top of Node.js assuming it would gain popularity, but Node.js is nowhere near production use in large enterprises. Somehow the enterprise adoption has been minuscule. JVM performance is top notch. Many features like distributed transactions, messaging support, and official libraries for enterprise applications like SAP are severely lacking in Node.js, hindering its adoption in enterprises.


A lot of people seem to think the difference between Deno and Node is trivial, but having actually used Deno, I think they're wrong. Here's why:

- Typescript as a first class citizen

- An actual `window` global with familiar browser APIs

- Sandboxing w/ permissions

- URL-based imports (no need for NPM)

- Bundling into self-contained binaries

- Things like top-level await, which Node.js still treats as experimental (quick example after this list)

- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks)

To me, those aren't just minor details. This has the potential to create a new epoch in server-side JavaScript.
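
To make the sandboxing and top-level await points concrete, a tiny illustrative script (the permission flag is a real Deno CLI flag):

    // fetch_status.ts -- top-level await, no async wrapper needed.
    const res = await fetch("https://example.com");
    console.log(res.status);

    // $ deno run fetch_status.ts              fails: net access not granted
    // $ deno run --allow-net fetch_status.ts  succeeds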


My personal view is that syntactic debates over imports (including whether it's an import/require/url/package.json/whatever) are basically meaningless, surface-level debate. How "nice" it is to write imports is pretty worthless to me.

The structural and semantic differences between imports are a much more important discussion. Import syntax is the front end to the language's metaprogramming semantics. It's metaprogramming in the sense that your code and other code is the data, but what you're really programming is a task runner with specific instructions about how to find and satisfy the requirements to build and/or run the software.

Integrating package resolution into the language itself is a really important distinction for contemporary designs - passing that buck to external tools but simultaneously coupling them to the runtime is a mistake that I think we should learn from. Deno is a good step in that direction.


> URL-based imports (no need for NPM)

What happens when there is the next Codehaus-like shutdown, and so much source code just won't work? Or when a bad actor takes control of a domain that commonly hosted packages (perhaps through completely legitimate means, such as registration expiration), can get a completely legitimate SSL certificate for it, and responds to requests for packages with malicious code? I think the abstraction, and to some degree the centralization, of package management is generally a good thing


URL-based imports aren't less secure. They just make an existing attack vector more obvious. Is NPM really keeping you safe? What happens if a package maintainer is compromised and the attacker adds malicious code?

The fact that URL-based imports make you uncomfortable is good. Let that discomfort guide you to adopt some extra security measures to protect yourself from third-party dependency exploits
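
One such measure ships with Deno itself: integrity-checking lock files (the flags below are from the Deno manual; re-exporting everything from a deps.ts is just the conventional pattern):

    // deps.ts -- pin every remote dependency in one place.
    export { serve } from "https://deno.land/std@0.91.0/http/server.ts";

    // Record hashes of the whole remote module graph:
    //   $ deno cache --lock=lock.json --lock-write deps.ts
    // Then fail hard if any remote file ever changes out from under you:
    //   $ deno run --lock=lock.json --cached-only main.ts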


I would argue that NPM and other centralized package managers have the ability to add security: if npm made 2FA a requirement (or publicized the status of a package maintainer having 2FA enabled like GitHub does, which is perhaps a security concern itself), there would be some assurance that a maintainer is not compromised.

If we are using URL-based imports, the scope of security assurances are much broader: an SSL certificate doesn't say anything about whether the current owner of a domain was the owner at the time of a dependency being safe. There is no one authority who we can expect to take swift (or even slow) action to remove malicious packages from being available on the myriad hosting protocols that are accessible through an environment like Deno


Every time you fetch a package from NPM you are accessing that code from a URL (at npmjs.com) and then caching that locally. Deno is just eliminating the middleman. If you still trust npmjs.com for source code delivery you could continue to do that.

What it isn't eliminating is the ability to define "local" caches. Just because you are importing everything from URLs doesn't mean that they can't all be URLs that you directly control. You don't have to use CDN-like URLs in Production, you can entirely copy and paste all the modules you need onto a server you control and URL scheme that you maintain.

There will still possibly be good uses of caching automation in Deno (once people get bored with xcopy/robocopy/cp -r/rsync scripts), but it will likely seem a return to more bower-like tools rather than necessarily full blown packager npm-like tools. (ETA: Or proxy services like GitHub Artifacts that pull an upstream source for you and rehost it on controlled URLs with some additional security checks.)


I thought this was exposed by npm in package metadata?

At the least, it could be made a requirement, for packages where multiple people have publishing access, that everyone uses 2FA.


You can pick where you import from. Use github.com instead of nohackershere.ru


don't get tricked into a slightly wrong URL


A note for many: Yarn v2 provides '0-config' fully cacheable dependencies (zips in LFS). This makes it possible to fully vet dependencies and enforce change approval and analysis in CI/CD.


I didn't know about that regarding yarn 2. Is there a good link to read about this? I did a search for yarn 2 and didn't find that immediately.


https://yarnpkg.com/

They don't do a great job of advertising the changes; take a look under features/zero-installs.

The common practice is to still use Yarn v1 as your global yarn; just do "yarn set version berry" inside your project to use v2. 0-config can be a breaking change - though I haven't had problems in a long time.


I must chime in here. Yarn 2 is a fantastic experience!


Exactly. Not to mention installs being insanely fast.


> Is NPM really keeping you safe?

I wish there was something like Docker Hub's automated builds in the Node world because the way NPM works right now, what comes from NPM is an unknown. The only thing you know is if you download a specific version once, you'll always get that same version again, unless it's invalidated. Otherwise, whatever the package author wants to upload and include, that's what you get and you can't know that what you're seeing in some Git commit is what's running in your application. I wish that was the state of the art.


THIS! I cannot believe that there is still no auto hash validator thing between git and npm. Feels like npm should force a commit containing a hash for the current version on every publish or something. How can we make this happen?


It would have to be some kind of integrated CI build system thing that builds the product in a container. Seems like they have no incentive to offer that given that they totally own JS packages.


Answered in https://deno.land/manual@v1.8.2/linking_to_external_code#but...

Also: Nobody prevents you from using a package manager anyway. Just because you can use URLs in imports doesn't mean you have to. But it is very convenient that Deno downloads exactly the code that is imported. A package manager will always just throw everything at you. Some packages in Node.js try to fix this by splitting the project into submodules like @babel/core, @babel/env, .... But that made installing dependencies worse. Just letting Deno figure out what is required is way more elegant IMO.


Not sure I understand, are you implying Deno does automatic tree-shaking on package imports? If not, how does "deno download exactly the code that is imported" and not just a whole package?

Also, from your link:

"In Node this is done by checking node_modules into source control. In Deno this is done by pointing $DENO_DIR to some project-local directory at runtime, and similarly checking that into source control:"

I disagree with this. In my opinion, this is done by using a pull-through cache that caches every package developers request and so inherently has a cache of the packages that will go to production.

Is it possible to do this in deno today? I don't really get that sense.


> Not sure I understand, are you implying Deno does automatic tree-shaking on package imports? If not, how does "deno download exactly the code that is imported" and not just a whole package?

npm install copies in to your node_modules an entire zip/tarball package. Deno uses ES2015+ module syntax and it's only going to grab the JS files that are imported (or imported by imports). So it "naturally" tree shakes at the file level, and doesn't have a concept of a "package" in the same way. It's not directly going to tree shake inside of single file boundaries in the way that a major bundler/packager might (though the V8 JIT should still sort of indirectly compensate).

So yeah, if the package is published as just a single (M)JS file it will still get entirely downloaded by Deno, but if the modules are split across multiple files, Deno will only download the ones that are directly or indirectly imported (and will have no idea of the remainder of the files in the "package").
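Concretely, with a real module from deno_std: importing one entry point pulls in only that file and its transitive imports, never the rest of the repository:

    // Deno fetches path/mod.ts plus the files it imports -- nothing else
    // from the std repository is ever requested.
    import { join } from "https://deno.land/std@0.91.0/path/mod.ts";

    console.log(join("a", "b", "c")); // "a/b/c" (or "a\\b\\c" on Windows)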

> I disagree with this. In my opinion, this is done by using a pull-through cache that caches every package developers request and so inherently has a cache of the packages that will go to production.

> Is it possible to do this in deno today? I don't really get that sense.

Yes, because URLs are just URLs, you could always have a proxy service running at a URL that knows how to request the packages from upstream. https://jsproxy.mydomain.example.com/lodash/v1.1/module.js could be a simple caching proxy that knows how to get lodash from upstream if it doesn't have the requested version cached (or sends a 404 error if it isn't allowed to cache a specific version, or whatever).
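To sketch what such a proxy could look like (a minimal, assumption-laden example: it uses the std 0.91 HTTP server API of the time, an in-memory cache, and deno.land as a stand-in upstream):

    // proxy.ts: a minimal pull-through cache for URL imports.
    // Run with: deno run --allow-net proxy.ts
    import { serve } from "https://deno.land/std@0.91.0/http/server.ts";

    const UPSTREAM = "https://deno.land"; // placeholder upstream host
    const cache = new Map<string, string>();

    for await (const req of serve({ port: 8000 })) {
      let body = cache.get(req.url);
      if (body === undefined) {
        const res = await fetch(UPSTREAM + req.url);
        if (!res.ok) {
          req.respond({ status: res.status, body: "not cached, not upstream" });
          continue;
        }
        body = await res.text();
        cache.set(req.url, body); // naive: caches forever, no eviction or auth
      }
      req.respond({ status: 200, body });
    }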


Thanks for the package / module / import explanation.

Re: URL proxying, this all feels ad hoc and overly decentralized. I agree with your assessment that "rolling it yourself" looks simple enough at first glance, but after having so much success with package managers and their associated tooling, I can't shake the doubt that a new approach will just reskin the same problems. I see they've made some reasonably smart decisions along the way, though, so let's hope they walked down this logic tree and are happy with the likely paths


> Re: URL proxying, this all feels ad-hoc and overly decentralized

Well, I started from the "full paranoia" mode where people would want it to be ad hoc and as decentralized as possible. It's very easy to imagine that there would still be trusted/trustable third parties creating central proxies for stuff like this. Just as unpkg today provides one way to pull individual JS files from npm, you could imagine unpkg as an option for Deno. Similarly, there are lots of artifact repositories with upstream pulling options already in the wild, such as GitHub Artifacts, Azure Artifacts, or jFrog Artifactory, to name three of many. It's not hard to imagine those artifact repositories also supporting Deno-style URLs (in a similar manner to what unpkg does with npm) as Deno becomes more popular.

Ultimately it is likely to be a huge spectrum from people that want a JS proxy they entirely control/run/manage themselves, to those that want one they trust from a 3rd party, to those that are fine with whatever CDNs their libraries suggest. That's already a spectrum that exists in the npm world: people have private npm servers, people use npm artifact libraries like GitHub Artifacts, people use unpkg directly to pull in JS files from npm, people still go to Bootstrap or JQuery in 2021 and just copy and paste whatever CDN is mentioned in the "Getting Started" section of the appropriate docs. That spectrum is still likely to exist for Deno libraries, and while it might make it harder as a user/developer to choose which part of that spectrum is best for your own projects, Deno having little to no "out of the box" opinion on which part of the spectrum your project falls into (as opposed to Node defaulting to npm these days and the two ever more seemingly inseparable) isn't necessarily a bad thing for the ecosystem's health as a whole.


>"deno download exactly the code that is imported" and not just a whole package?

In a fairly simple and elegant manner. You specify a URL to the specific file FOO you want to import, and FOO gets downloaded. Then deno looks at FOO to see what files FOO depends on, downloads those, and so on.

That's very different from how NPM works where you have to download the entire package, including parts you may never use, along with every dependency the package depends on, even if you never end up using those dependencies either.

NPM's approach, which frankly worked well given the circumstances, has the downside of not giving end-users much benefit from writing code in a modular fashion, so there's no advantage to breaking a package up into multiple files versus distributing a single large minified file.

Deno's approach promotes breaking up packages into multiple files so that an end-user only downloads the files and dependencies that are actually needed.
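Incidentally, you can watch this resolution happen without running anything: `deno info <url>` prints the full tree of files that would be fetched for a given entry point, e.g.

    $ deno info https://deno.land/std@0.91.0/http/server.ts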


Thanks for the explanation. I would be a bit scared to have a recursive solver install my dependencies, but it's ultimately not that different from packages I suppose.

I will be interested to see how the tooling & community resulting from this decision look. Hopefully good.


deno has a lock file like other package managers. If you use one, deno will refuse to run if a fetched module no longer matches the lock file.

Deno caches everything offline and only reloads if you use --reload flag.

I would recommend going through the manual to learn more about deno workflow.

https://deno.land/manual
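For reference, the lock-file workflow from the manual looks like this (`deps.ts`/`main.ts` stand in for your own entry points):

    # Record hashes of every remote dependency
    $ deno cache --lock=lock.json --lock-write deps.ts

    # Verify fetched modules against those hashes on later runs
    $ deno run --lock=lock.json --cached-only main.ts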


"Deno can store and check subresource integrity for modules using a small JSON file." - https://deno.land/manual/linking_to_external_code/integrity_...


Unfortunately this won't help those who add dependencies based on shared snippets, without the context of a project, or after the innocent mistake (or choice?) of not checking the lock file into version control. But yes, good point: existing projects that have checked in the integrity file will be safe against future malicious modifications of the resource


> Unfortunately this won't help those who add dependencies based on shared snippets, without the context of a project

Could you expand on this? Any examples would be appreciated.


Code examples abound on Stack Overflow, GitHub Gist, blog posts, etc. These may contain direct URL dependencies.

Example guiding users to include a Maven dependency: https://www.baeldung.com/guava-mapmaker#map-maker

There is some degree of assurance that this dependency won't last long in the Maven central repo, or any other user configured repository, if it contained malicious code. Obviously it is not foolproof and incidents happen, but without a centralized authority for package management, there is much less assurance that a package is not malicious


I find your concern valid.

Deno makes it easy to import from temporary places. However, this wouldn't be solved by having a centralized registry or a package manager like npm.

Npm can install from git repos and registries other than npmjs.

Having a package on npm on its own doesn't make it any less malicious.

A solution would be to enforce a set of registries you can import from, and fail if an import falls outside it.


If you can magically copy and paste in code that also includes a dependency, then you might have just screwed yourself if you didn't read the code of said dep (or even if you did, maybe you missed something). If it just looks like a comment, then maybe your team missed it in review. It's harder to reason about deps that live in deep modules.


But if you just paste and execute code you found on the internet willy-nilly, you're in trouble no matter what the platform.


There will likely be some kind of npm at some point.


Yes, I think recreating an npm-like central place for your project dependencies seems almost required if you want to avoid depending on a different version of a lib in each file of your project.

I cannot understand the benefit of this scheme, honestly. I would have preferred they fix something npm is lacking, like adding the ability to sign packages


If the goal is to get rid of NPM, why would you add another one in the future?


The goal is not to eliminate npm, but to decouple it from the runtime itself


Those still are all trivial in the "too little too late" sense.

URL-based imports are already something you can do in package.json if you really thought that was great. Top-level await really is trivial. The permissions system does very little for you, since the permissions are process-level while what I'm scared of is transitive dep attacks. TypeScript in Node is good, and Deno has to catch up in tooling.

If Deno was where it is today but like 8 years ago, it would have made a splash. Now it just looks like someone made a few creature comforts to Node.


Thanks, I agree 100%. Perhaps I'm getting unnecessarily cocky in my old age, but this really does feel like other cases I've seen in the past where people (including the original founder) want "a clean slate", because of course there are warts and lessons learned from the original implementation, but the cost of workarounds for those warts is actually lower than the cost of switching to an entirely new competing technology for most people. Going back to the original list:

- Typescript as a first class citizen: Agreed, it's not hard to get TS set up in Node, and it's a "one and done" cost.

- An actual `window` global with familiar browser APIs: I've never needed or wanted this, though I could see how the server-side rendering crowd could benefit, but I still have to believe there are by necessity enough differences from the browser's window that implementing SSR is still non-trivial.

- Sandboxing w/ permissions: I honestly think this is going to be useless. As you stated, if you're really running untrusted code, better to rely on process permissions. This actually reminds me of the overly complicated security/permissions architecture in Java's SecurityManager, which I rarely if ever actually saw being used.

- URL-based imports (no need for NPM): NPM certainly has its warts, but I think a lot of Deno supporters woefully underestimate how most server-side JS developers love having a single primary global package repo.

- Bundling into self-contained binaries: Again, this is nice, but also gets a "Meh, I don't really care" from me.

- Things like top-level-await which Node.js still treats as experimental: Once you learn how to do an IIFE who really cares?

- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks): Again, a minor nice-to-have, especially with util.promisify.


Top-level await isn't really trivial, implementation-wise or in its effects. But as a feature it seems obvious until you step into all the edge cases it can create.


Care to share some examples of said edge cases?

I write a lot of short JS files that aren't part of a big cohesive backend, and little things like top-level await make the code somewhat easier to read. Yes, you can just do an async IIFE, but that's just another thing for the eye to parse and feels quite vestigial.

EDIT: And since I can't respond to your other comment, I'll ask this here; what do you consider great and important about Deno that doesn't fall under the bullet-points I listed? I'm simply curious.
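To make the comparison concrete (with `loadConfig` as a stand-in for any async call):

    // A stand-in for any async call.
    async function loadConfig(): Promise<{ port: number }> {
      return { port: 8080 };
    }

    // Without top-level await: wrap the entry point in an async IIFE.
    (async () => {
      const config = await loadConfig();
      console.log(config.port);
    })().catch(console.error);

    // With top-level await (ES modules), the wrapper disappears:
    //   const config = await loadConfig();
    //   console.log(config.port);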


I think deno is really interesting in how it is parsed and handled, and I have a lot of hope they can continue to do cool things with it. I would like to see them keep taking strong opinions on things that improve the language, and hope to see it evolve beyond JavaScript. But thus far it has been: stealing code from node.js and taking strong opinions on things that make the overall DX worse. On its current path, Deno's strength will be as a scripting language.

Edit: just realized I gave no example. An example would be a dep that has gone missing with no backup for it. Deno sort of handles this, but all we've actually added is the lack of a centralized way to describe dependencies. So if you dynamically import a package via its URL, you also need to write some code to ensure it even exists. How is that better?


Can't we have archives in case any packages fail to resolve? Archive.org might even work in some circumstances. (not saying that we should use that, but that it would be trivial to have backups in a repository that we don't always rely on for all dependencies)


It's cute for scripts. Otherwise there's the trivial top-level wrapper runProgram().catch(console.error) entry point. It's just not a detail that does much to sell Deno.

Though I admit it's not that fun for me to come in here to poopoo Deno.


I am not trying to do that either - I think the work they're doing is great and important, but the reasons listed above ain't it.


> An actual `window` global with familiar browser APIs

This doesn't seem good?


It's not. Actually, I would argue any sort of global state like this should go away.


I agree with the spirit of what you're saying, but there are billions of lines of JavaScript code that can't "just go away". Being able to use e.g. 'window.crypto' without worrying about whether I'm in Node, Deno, some other runtime, or a browser is great news for reusing huge amounts of code.


I'm in favour of compatibility and against a global named 'window' in a non-browser JS implementation, and I don't understand why the former necessitates the latter. Since window is the global object, why not just refer to crypto? Then you don't care what the global object is named.


It's relevant here that ECMAScript specifies `globalThis` as the global object (the one `this` points to outside of functions in non-strict mode). All JS environments have it, and simply globalThis === window || globalThis === global
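A small illustration of how `globalThis` sidesteps the naming question; property access stays safe even where a particular alias doesn't exist:

    // Runs unchanged in browsers, Node.js (>= 12), and Deno.
    const hasCrypto = typeof globalThis.crypto !== "undefined";
    console.log(hasCrypto);
    // In a browser (and in Deno): globalThis === window
    // In Node.js:                 globalThis === global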


Well, okay, but it's pretty much here to stay and doesn't have that great of a drawback that it's worth breaking any cross-compatible code that currently exists.

If there is going to be a global object, then it might as well be the same one used in the browser, which everyone who writes JS is familiar with.


- AFAIK it still compiles into normal JS, and the TS team doesn't work on deno (hope I am wrong), so it's not really a first-class citizen in the way most would think

- this doesn't make sense; it's bad API design, and shoehorning in a completely different paradigm is not a good idea. It should have moved somewhere less familiar and more oriented to the environment (I know that sounds weird, but to give a quick answer: there are better solutions, and this ain't a browser)

- not bad, but seems odd to me as a language feature in some ways; there are plenty of ways to achieve this (macOS will even do this to your node scripts itself, so why do we need more of it?)

- this is subject to the same problems NPM could have, but I guess it is easier? Now you have to lexically parse all files to determine the imports, and you also have to remember where you keep your 3rd-party packages without a centralized way to know (edit: this seems to harm the previous point)

- bundling isn't that interesting in this context, as it just bundles the whole runtime with the package, which is terrible for package size and doesn't net a lot of benefit, since the two are now tied together - if there is a security flaw you must fix the entire binary and hope it still compiles. (edit: technically swift did this for a long time too... but they were also reasonably sure they could ship their runtime with the OS itself once the ABI was stable; I am not sure if Deno has a path for this or if we just get fat binaries forever)

- top-level await is a weird one to me; there are valid reasons it's not in node currently. But yeah - no one likes having to write janky init code to work around this

final edit: I have a lot of opinions on this and would love to learn more about why deno is great. From what I can tell, it's just a faster flavor of JS, which, IMO, is great. But the points drawn from GP are just bizarre to me.


- I've had the typescript compiler throw WEIRD errors at me depending on configuration; a zero-configuration, no-brainer environment is a godsend

- it makes a ton of sense if you don't want to maintain two different versions of the same library. With WASM there is zero need for Node native modules anymore, so there is no need to have platform-specific idiosyncrasies that go beyond Web APIs. Is the fact that it's called "window" a favourite of mine? Certainly not, but when you try to get a library running that you really need, that was originally written for the browser or vice versa, you don't care what the thing is called, as long as it's backwards compatible.

- defense in depth, multiple sandboxes are better than one

- this has to do with the window point; it's a lot easier to make code cross-platform compatible if you only have to reference publicly accessible HTTP endpoints

- maybe that's not interesting for you, but I've had to deliver command line tools as binaries, and I'm super duper happy that I could do so in JS. The security I gain from controlling the runtime version is also much better than what I'd get from leaving that as the user's responsibility, besides the fact that not knowing exactly which runtime your code is going to end up on is also a big source of security vulnerabilities



- TS also lets anyone do whatever they want, so that's not a great thing to support; I have always said TS would be better if people couldn't configure it without a PR to TS itself

- we have the chance to define a new and better API, we should do it - you are already adopting something different

- I agree, but it's not a selling point; these things are probably in containers already, and my OS should be doing most of the work

- window: just no. Call it global if you must. Perpetuating ideas like this isn't good for anyone but the lazy.

- I think the cmd aspect of shipping a whole binary is cool for sure, but let's not conflate this with a "compiled language"; it's not. You are shipping a runtime and a scripting language together.


You go out and fix all the library code that uses window then.

The web has been paving the cow paths for good reason.

There are hills worth dying on, what the common namespace is called isn't one of them.


Relatively speaking, the window vs. global thing should be a non-issue. I believe that every JS runtime except for IE supports globalThis, such that `globalThis.window === window` in the browser and `globalThis.global === global` in node.

Deno's difference from node is the choice to implement web API instead of nodejs stdlib API

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


While I also think URL-based imports are a weird idea, that might just stem from the fact that I haven't used them much; they might be wonderful, who knows.

But what I’d like to question is why the idea of parsing everything is considered bad. Semver itself, while miles ahead of what came before, is still just lies we tell the compiler.

You can never actually rely on the fact that a minor fix will not be a breaking change, and the like.

Rich Hickey had a great talk on the subject of imports and breakage, and the conclusion, which I agree with, was to actually load and check every function in all the libraries you use, so that an incompatible change doesn't leave you none the wiser.

I’m glad people are trying to experiment here, as package management is far from a solved problem and issues with it cause sweat, tears and blood to most devs during their careers.


I've used imports in this manner before, and the issue with having a non-centralized place where packages are defined has two sides:

1. People will import packages willy-nilly without thinking about what they are doing, and it becomes harder to understand WHAT they imported (the why becomes clearer, IMO). I am aware that is very much JS culture today, but I also believe that to be harmful.

2. Having to parse all files to find deps takes time. Obviously not a ton of time, but it takes time, and it simply doesn't scale appropriately.

Working in finance - I think personally that it is really important to make changing the dependency chain something everyone should be VERY aware of.


Re: point 2

Yes, downloading each URL and walking the tree is going to take time. But it can be cached. Not sure how long it can be cached for.

The important thing about url only imports is that it’s a consistent layer to build new tools on top of.


Have you used it?


I have. And I am not saying it doesn't make sense, but the GP's reasons are probably some of the worst aspects of what it can do.

edit: but at the same time, now I realize why people are attracted to it; they are desirable features whether or not they actually make sense.


I was just looking for git tooling and found a few node-based projects with npm-install instructions. I thought deno was going to eat these things up for teams. We don't want C++ devs to need a node toolchain for one tool.


Highly disagree on the impact of those points.

> - Typescript as a first class citizen

This doesn't seem to add much besides saving you an install of `tsc` or `ts-node`, while also taking away your choice of TypeScript compiler version.

- An actual `window` global with familiar browser APIs

Node.js has a `global` global object and the only API I would understand having in common with the `window` object is the `fetch()` API.

- Sandboxing w/ permissions

Sandboxing is so basic that any large project will have to enable all permissions.

- URL-based imports (no need for NPM)

I would consider this a disadvantage.

- Bundling into self-contained binaries

Again, I would say that this is rarely useful in a world where a lot of operations use container technology.

- Things like top-level-await which Node.js still treats as experimental.

This is trivially solved by anonymous self-executing functions

- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks)

I think that this is the strongest advantage, however I would argue that this is not a reason to start a completely new backend platform. Also, I think that it might be a disadvantage in some high performance scenarios because Promises are much, much slower than callbacks currently.


> This doesn't seem to add much besides having to install `tsc` or `ts-node` and also not having the choice of the TypeScript compiler version you use.

Depends on your point of view. With TypeScript being built in, you don't have to think about using tsc or whatever version of TypeScript you have. It's just what version of Deno you use. If someone doesn't like that, then they still can have the option of using TypeScript by itself.

> Node.js has a `global` global object and the only API I would understand having in common with the `window` object is the `fetch()` API.

It also supports `addEventListener`, which is commonly used by browser code on the window object.

Just the existence of something defined as `window` makes more sense than a `global` which never existed in browsers in the first place.

> Sandboxing is so basic that any large project will have to enable all permissions.

That's pretty dismissive. Why should an app that doesn't interact with the file system be allowed to write or even read from it? I don't know how this feature can be considered a drawback. Don't like it? Don't use it. I don't see how it detracts objectively from Deno.

> I would consider this a disadvantage.

Then you can still use NPM. Others of us get the option to just import packages from a URL instead of publishing it to a central repository.

> Again, I would say that this is rarely useful in a world where a lot of operations use container technology.

Why? Building Docker images requires extra software, Linux images, time spent running apt-get or apk, time spent downloading and installing your runtime of choice, and so forth. Having Deno build a binary can give you a bit of a shortcut in that you have one tool for running and bundling code, and you don't need to deal with as many OS-level nuances to do so. Docker and k8s are there for anyone who needs something beyond that.

> This is trivially solved by anonymous self-executing functions

That's your opinion. Just promise me you won't go on to say that JS is a bad language, because people keep saying that while opposing any reduction of complexity they consider "trivial". If using an IIFE for the mere purpose of accessing syntax makes more sense to you than making `await` syntax available at the top level, then I really don't know what to tell you. What exactly is the argument for not implementing this feature, besides "all you have to do is type some extra characters", to loosely paraphrase you?

> I think that this is the strongest advantage, however I would argue that this is not a reason to start a completely new backend platform. Also, I think that it might be a disadvantage in some high performance scenarios because Promises are much, much slower than callbacks currently.

I honestly have to wonder if you are joking. This is exactly why people invent new backends, new libraries, and new languages.

My only response to your point about Promises is that perhaps one shouldn't be using JavaScript if Promises are that much of a bottleneck. What you're saying is totally valid, though.


On the sandbox part, you can use workers to offload risky code. There is cost to doing it but it can be minimal depending on what you are using it for.

An example I shared few days ago here: https://news.ycombinator.com/item?id=26560464

https://deno.land/manual@v1.8.2/runtime/workers#specifying-w...
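For anyone curious, the shape of that API (per the manual; the `deno` worker options were still behind `--unstable` around 1.8, so treat this as a sketch):

    // main.ts: run untrusted work in a worker with no net/disk access,
    // regardless of what permissions the parent process holds.
    const worker = new Worker(new URL("./risky.ts", import.meta.url).href, {
      type: "module",
      deno: {
        permissions: { net: false, read: false, write: false },
      },
    });
    worker.postMessage({ input: "untrusted data" });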


> That's pretty dismissive. Why should an app that doesn't interact with the file system be allowed to write or even read from it? I don't know how this feature can be considered a drawback. Don't like it? Don't use it. I don't see how it detracts objectively from Deno.

Because any app of medium size and above will require access to all permissions. Also, Deno had some obvious security vulnerabilities around symbolic links, for example, which really detracts from the supposed security goal.

> Why? Building Docker images requires extra software, Linux images, time spent running apt-get or apk, time spent downloading and installing your runtime of choice, and so forth. Having Deno build a binary can give you a bit of a shortcut in that you have one tool for running and bundling code, and you don't need to deal with as many OS-level nuances to do so. Docker and k8s are there for anyone who needs something beyond that.

But you are going to need some kind of operating system image anyway, due to the other tools that have to live with your app: log shipping, load balancers, DNS caches, firewalls, daemons, etc. In the end you will need to describe all of this somewhere, so why not also describe your app's dependencies at the same time?

> If using IIFE for the mere purpose of accessing syntax makes more sense to you than making `await` syntax available, then I really don't know what to tell you.

If using an IIFE is so heavy that a new backend platform needs to be built, I don't know what to tell you. In the apps I see, there is exactly one top-level IIFE needed in the whole application.

> This is exactly why people invent new backends, new libraries, and new languages.

New libraries yes, new languages no. util.promisify() already makes 90% of the cases work painlessly, and some promise wrappers for the existing core libraries exist on top of that. Since core is slowly moving to promises anyway, I fail to see how this advantage will remain one in the future.
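For readers who haven't seen it, `util.promisify` is the standard Node.js helper being referenced; wrapping a callback-style core API is a one-liner:

    import { promisify } from "util";
    import * as fs from "fs";

    // fs.readFile(path, encoding, callback) becomes promise-returning.
    const readFile = promisify(fs.readFile);
    readFile("config.json", "utf8").then(console.log, console.error);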

> My only response to your point about Promises is that perhaps one shouldn't be using JavaScript if Promises are that much of a bottleneck.

Yup, that's absolutely true. I would say that there is always an advantage in having leeway in a programming language between the convenient option and the fast option so that when something becomes a bottleneck you have easier options than porting it to another language. But of course this might not be the most common case.


> due to other tools that will need to live with your app like log shipping, load balancers, DNS caches, firewalls, daemons, etc.

some apps are just CLI tools.

Top-level await helps with the rigidity of the ES module system[1]; I believe it can also be used with dynamic imports, giving ES modules and CommonJS similar expressivity

[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


First class in all but its REPL.


The issue with Deno, personally, is that it feels like it doesn’t deviate enough from NodeJS to even make it worth taking the time to learn/migrate projects over to it. From what I recall, the only really new and nice features are: a) sandboxed by default b) no need for a node_modules folder since you can directly import from a URL

And is that really worth dumping loads of money into developing further? I just find it hard to believe people are going to bother with Deno any time soon - we’ve gone too far down the NodeJS road.


The standard library will be a killer feature. Even for people that know the existing ecosystem (as I do): as more libraries fall out of fashion or become abandoned, and people have to pick up (several) new libraries from scratch for basic tasks, they'll realize (as I did) it's not actually any different than just using Deno. I don't know if that will be enough, but that's one of the angles I'm watching closely now. I'm partly biased because I like playing with streams, and those can be a nightmare in Node.


Hard pass. There are gobs of features and design changes that make it worth the switch immediately if your use context allows for it. I've now done a handful of apps and libraries in it, and given the option, I'd never look back to node. I don't say that lightly. Node was the best. These new designs more than marginally overcome the shortcomings of node. The motivation for migration is strong, and is best proven through experimenting. If you are TS-savvy, do yourself a favor and try it out.


> I just find it hard to believe people are going to bother with Deno any time soon - we’ve gone too far down the NodeJS road.

It depends on the migration effort. Take TypeScript, for example: it's so similar to JS that migrating a codebase is not that painful. If the standard library and package handling prove to be highly useful, we'll see two possible scenarios that aren't mutually exclusive:

1. People migrating to Deno

2. Newer nodejs version follow what Deno has

In the end it's good for us


I think the biggest missing piece right now is the big libraries: Webpack, Babel, Next.js, ESLint (maybe this will not be needed with `deno lint`), NestJS. Stuff like that. I think it's hard right now to spin up a project that you would usually have used these frameworks for without writing a lot of infrastructure code yourself.


I feel like sandboxing could be huge, but WASM might be cutting the legs out from that feature.

Sandboxing doesn't sound so unique or innovative when WASM is coming along and doing the same thing, with a much wider audience and thus a better shot at massive traction.


They compete on different use cases: most JS/TS apps cannot (reasonably) be compiled to wasm, and the wasm sandbox doesn't limit what the host environment exposes to it.

As a glaring strawman: if you expose eval to wasm, the sandbox will not help you.

Deno's sandbox will allow you to create a dedicated worker with no network/disk access to handle sensitive information, or to force your application to use a specific worker as a proxy (by making it the only thing with network access)


I think sandboxed by default can make a lot of sense for the large companies that can potentially suffer heavy losses if an unauthorized attacker got into the file system.

The other feature is TypeScript as a first class citizen which is pretty great for devs.


If I understood correctly this is how they intend to make money:

> Not every use-case of server-side JavaScript needs to access the file system; our infrastructure makes it possible to compile out unnecessary bindings. This allows us to create custom runtimes for different applications: Electron-style GUIs, Cloudflare Worker-style Serverless Functions, embedded scripting for databases, etc.

So it's basically more of a Red Hat approach to making money from open source? They intend to build tailored services on top of Deno for companies that request them?


I'm not sure that they meant those specific runtimes would be premium products.

My read was that anyone could more easily configure a runtime to expose (or not expose) the underlying bindings that it requires, vs. just having them all in there by default.

I think "us" in that statement is the Deno community, not Deno the company.

But maybe I'm wrong.


Clicking around the website and reading more comments leads me to believe that this is the product they intend to monetize with:

https://deno.com/deploy

I still find it strange that there is no mention of it in the post...


That makes sense...

So if you're up for it, you can still roll your own deployment infrastructure and tailor your builds as needed etc.. and support all of that internally.

Or you can pay Deno to handle it for you and you can focus on building your apps.


That's a description of a command line flag or feature of the runtime, not a business model.


I recall Ryan regretting Node's full-permission approach. Not every script should be able to access the fs, for security's sake.


Deno's permissions are per-process though, it's a big jump for sure, but also still leaves the door wide open for abuse by dependencies of any serious project.


It does allow for some coarse grained improvements for certain services, though.

You might write an smtp daemon that only delivers email on to another service via LMTP - thus without ability to write to (or read from, depending on configuration) the file system, for example.

Yes, you can accomplish this via containers too - but it's nice to get as much out of process isolation as possible, not to mention it is likely easier to test and debug the closer to the surface of the runtime the limitations are implemented.


How is that different from Node.js which also runs in a single process? Or does Deno create per-request processes (or v8 "isolates" etc) like CGI does?


I think the point was that it's not different from Node.js - and thus not much of a benefit.

If it was more along the lines of "I want to use this array helper library, but it shouldn't have any permissions" then it would be a lot more useful, but right now if your Deno app needs any file or network access, then all of your dependencies get access too.


Yes, this was my point. It's only trivially better in practice since you'll have to open all the doors nearly all the time.

We have so many dependencies that really only need to work in-memory and have zero IO needs. That part is not solved in Deno at all.


Javascript embedded scripting for databases .../shivers


Why?

I think the point being made here is that it will be easier to create purpose-built runtimes that only provide needed bindings, making them more secure and possibly also more performant?

JS/TS as languages are here to stay, so let's give them the best possible ecosystem.


Stuff like CouchDB has been around for a decade now; it's fine. All of NPM still runs on it...


I know this sounds crazy on the surface level, but I really wish I could do data engineering and machine learning with TypeScript instead of Python. TypeScript's type system is so good, it makes refactoring large projects so easy. Python's typing module leaves a lot to be desired, and on top of that PyCharm doesn't properly support everything. Perhaps I should switch to VSCode, but I do like IntelliJ, and it works really well for TypeScript.


There are a number of newer projects in this area. Arquero from Heer's Group (https://observablehq.com/collection/@uwdata/arquero), TensorFlowJs (https://github.com/tensorflow/tfjs), and (biased) CoreTable from OurWorldInData (https://github.com/owid/owid-grapher/tree/next/coreTable).


Deno has built in support for web gpu. You might wanna check that out. It's very early stage.


I have never used Deno, but I just wanted to say that I really love the branding and graphics on the website, especially https://deno.com/deploy


I love the Deno dinosaur. Side note: I wish it were "deno" (uncapitalized). Just a stylistic annoyance for me.


Agreed. I also would love it to be "deno" (I always thought it was).


I've always been very skeptical of the value-add of Deno over Node, and this only increased my skepticism. Good luck making money, I guess.


I've only dipped my toes in the water with Deno, but it solves a few pain points I've felt with Node. Direct deps via url vs npm as middleman, a standard library (thank goodness!), and single binary distribution. Types are great too, but these other things would be enough for me.


> a standard library (thank goodness!)

Then what are all these modules, if not a standard library? https://nodejs.org/api


libuv bindings ?


Honestly, a standard library has been needed for a decade, but... I would have preferred a coordinated effort between browsers and node/deno. At this point, even if just the browsers and deno end up with two different stdlibs, we risk _a_lot_ of useless fragmentation and headaches.


_If_ the browsers do get a std lib akin to what we've built for deno, I would be comfortable archiving the deno_std repo and pointing people to use the browser one. We're not yet 100% compatible with browsers in the stdlib, but being compatible as much as possible is one of our goals, specifically to address the browser/server-side divide and generally make it simpler and more enjoyable to program in JavaScript


The built-in TypeScript support is amazing, but the selling feature for me was the ability to generate static binaries. Shipping is much easier when your build process can produce a single output!


You've been able to do that with Nexe for Node.js apps since basically forever.


Yes, but because it's not natively supported there are common issues like https://github.com/nexe/nexe/issues/639.


A NodeJS Docker container is a few lines. I agree Deno's single binary is a nice feature but it's not very interesting for everyone.


Docker is very nice for deploying to docker-dedicated servers, but for legacy apps or personal workstations, binaries are more flexible


A docker container works for services - but CLI tools are more easily shipped as binaries.

It certainly isn't that interesting for everyone, but for a subset of users it's great!


> a standard library (thank goodness!)

What's a "standard" for you? Whatever definition, I think it implies there are > 1 implementations of it.

Node.js is based on CommonJS APIs, an initiative led and implemented by the earlier server-side JavaScript (SSJS) runtimes of the time, such as v8cgi/TeaJS and Helma, until Node.js came to dominate SSJS around 2011. Remember, SSJS is as old as the hills, starting in 1996/7 with Netscape's LiveWire, and has arguably seen even longer use than Java as a mainstream server-side language.

Also not a "standard" in the above sense is TypeScript.

As to "standard library", the core packages on npmjs plus package.json are the standard library of Node.js on top of core Node.js APIs, and were developed in that spirit.


Using the word 'standard' was probably a mistake in my parent comment, I just parroted the terminology from the docs.

Really what I mean by 'standard' in this context is a batteries-included experience for most day-to-day programming problems.

I can appreciate the minimalist view where every project selects a core set of libraries for the given challenge. On the other hand, a core set of libraries that are good enough for most things just reduces decision fatigue, and makes it easier to just code.


Excited to see a commercial company centered around the TypeScript ecosystem (both server and client) and betting on WebAssembly. Any successful open source project must provide commercial value to become sustainable; funnily enough, that assures its longevity long-term.

Keep up the great work Ryan, Bert & team. Exciting times!


I met Bert in the StrongLoop days after the IBM acquisition. Good guy and good luck to him.

We were consultants scaling node to production for a major international bank, circa 2016.

Love the security improvements in deno, will have to give it a look.


Deno seems well poised to replace Nodejs for isomorphic Web programming.

The leading app framework for that is Next.js, and I hope the Rauch Capital investment signals that Vercel will be supporting Deno.

Anyone know?


It seems there are many teething problems with Deno. I just tried stuff from their blog (https://deno.com/blog/v1.8):

    $ deno run --unstable --allow-write=output.png https://raw.githubusercontent.com/crowlKats/webgpu-examples/f3b979f57fd471b11a28c5b0c91d0447221ba77b/hello-triangle/mod.ts
    Download https://crux.land/2arQ9t
    Download https://crux.land/api/get/2arQ9t
    error: Import 'https://crux.land/api/get/2arQ9t' failed: 404 Not Found
        at https://raw.githubusercontent.com/crowlKats/webgpu-examples/f3b979f57fd471b11a28c5b0c91d0447221ba77b/deps.ts:4:0
(A dependency got removed?)

Another one:

    $ deno run https://deno.land/v1.8/permission_api.ts
    error: An unsupported media type was attempted to be imported as a module.
      Specifier: https://deno.land/v1.8/permission_api.ts
      MediaType: Unknown
(The site is a 404 returning status code 200... just... why?)


Thanks for reporting, we recently moved our blog to deno.com/blog and haven't updated the post -- we'll fix that as soon as possible


From the last paragraph:

> But JavaScript and TypeScript scripts calling into WebAssembly code will be increasingly common.

Why is WebAssembly a key concept here? How does Deno use it?


We use it for the hash module in our standard library, for example: https://deno.land/std@0.91.0/hash. The wasm version is orders of magnitude faster than a pure JS implementation. Another example is sqlite, running in WASM: https://deno.land/x/sqlite@v2.4.0. Fully sandboxed sqlite :-)


It will be the ultimate irony if, 20 years down the line, Deno is still around, but solely as a de facto standard / cross-platform WebAssembly execution engine.


Why would that be ironic?

It's essentially the route Java is taking with Graal/Truffle - allowing a heterogeneous mix of languages to be compiled and optimized.

Why not allow typescript to use Fortran numeric libraries via wasm?


JavaScript in Deno doesn't itself run on top of WebAssembly, so this isn't really quite like Graal. If you use Deno to run wasm, all the JS bits are basically ignored. It's a bit like using a web renderer solely for <canvas>.


You can write in languages beyond JavaScript or TypeScript and generate WebAssembly. This means that if someone wrote a nice library or utility in C#, you can compile that C# down to WebAssembly and use that library or utility from JavaScript or TypeScript (or anything else that can access WebAssembly).

This is analogous perhaps to C# and F# within .NET currently. C# and F# have the same BCL (base common layer) so you can use C# code in F# and vice versa. WebAssembly is like that and much more.


WebAssembly would be analogous to CLI (Common Language Infrastructure) in .NET land, which includes CIL (Common Intermediate Language - bytecode) and CLR (Common Language Runtime - VM).

.NET BCL is the Base Class Library. Outside of a few classes in it that are fundamental to the runtime (e.g. Object, String, Array), it's not actually required for cross-language interop on .NET platforms. E.g. if you have two different C compilers both outputting CIL, they can interop just fine without any objects in the picture. WebAssembly interop is really more like that, and doesn't have the high-level object model layer that .NET also has (and which is used in practice).


Trying to signup for Deno Deploy, it asks for the 'Act on your behalf' github permission to make an account.

Clicking on 'Learn more about Deno Deploy' leads to https://github.com/apps/deno-deploy , which does not tell me more.

What does 'act on your behalf' mean for Deno Deploy?


I believe it's a very poorly written description in the newer GitHub App permission system. My understanding is that it describes something akin to "can act on your behalf, but only within the scope of the other permissions being requested", but overall it's very unclear wording.

See https://news.ycombinator.com/item?id=26485844


Deno's response here seems to confirm that https://github.com/denoland/deploy_examples/issues/15


Woah. Clicking on this link crashes Firefox. Version 86 on Windows. I don't remember seeing FF crash in several years now.

Anyone else running into this?


> [...] Firefox. Version 86

A few days ago, a new version (87) was released. In certain situations, when a silent upgrade happens while the browser is in use, it displays an "Oops, something went wrong" notification with a button to refresh. If you close and reopen Firefox, the problem will vanish. It is less of a crash and more likely a problem with freeing/monkeypatching resources.

I have run into the same problem on Linux, but I have a quite complicated Firefox configuration with at least a few extra profiles (about:profiles).


No problems with 87.0 on Windows for me.


I understand that the authors have a very strong pedigree in their field, but given that a lot of the motivation stems from regretted node design decisions, is the rust etc. expertise on the project deep enough not to make equivalent mistakes that will be rued in another few years?

Genuine question (I assume it is deep enough, but presumably so was the C++ expertise last time) - it just strikes me that once something becomes as successful as node, and given that nothing is ever perfect, it might be useful to clarify why the technical insight might be better this time around - at least regarding the idioms of the underlying tech; the added experience on the architectural and implementation side is a given.


> Of the myriad ways to program computers, scripting languages are the most effortless and practical variety. Of these, the web browser scripting language (JavaScript) is the fastest, most popular, and the only one with an industrial standardization process.

I haven't fully investigated in a few years, but isn't it still true that LuaJIT is faster than V8 JavaScript? The last I saw, it was outperforming V8 by a lot. The practical use of LuaJIT is still very niche though. The lack of a comprehensive standard library, and being forever stuck on Lua 5.1, makes it even less generally appealing. I still love it for programming Nginx though.


Probably the benchmark you saw was just iterating some math calculation. In real programs, V8 JS is orders of magnitude faster than LuaJIT.

https://benchmarksgame-team.pages.debian.net/benchmarksgame/...


>In real programs V8 js is orders of magnitude faster than LuaJit

Fact Check: False

LuaJIT is faster than all JavaScript JIT implementations in existence.


FYI, as it says, that benchmark measures Lua 5.4.2, not LuaJIT


AFAIK it hasn't been faster for a while, especially not in the GC area.


I feel like the mass adoption tipping point of Deno will be when Create React App moves over to it or has a Deno specific branch.


Why would that be a benchmark?


You want me to google Create React App for you, or are you good to do it yourself?


>>> Our investors are Dan Scholnick from Four Rivers Ventures, Guillermo from Rauch Capital, Lee Jacobs from Long Journey Ventures, the Mozilla Corporation, Shasta Ventures, and our long-time collaborator Ben Noordhuis.

Why is the Mozilla Corporation an investor in a Chrome-based technology startup?


Deno positions itself partially as node with web standards; Mozilla likes it when things are webby


Well, Deno is using Rust extensively and V8 is the only "Chrome" part.


Didn't Mozilla fire a substantial number of their Rust developers?

And all Servo developers according to: https://paulrouget.com/bye_mozilla.html


No, they laid off a substantial amount of their Rust team, but the number of overall developers employed by Mozilla was quite small.

Mozilla continues to be interested in Rust even though they did that, they’re a founding sponsor of the Rust Foundation, for example, and are continuing to use Rust even though they do not employ people to work on the language itself.


From a distance I think Deno has a lot of promising features going for it. Expressing modules as URLs seems like a small difference but I believe it has big ramifications. With clever use of DNS resolving and caching I wonder how fast new instances will be able to spin up. I'm guessing it'll be fast!

My only gripe with the Deno company is that taking investor funding is a double-edged sword. Yes, they'll get to hire very skilled developers. However naturally the investors want a tidy exit, and I wonder if that would be to be bought out by Amazon, Microsoft or Google.

Edit: Just realized that there's a key difference in that Deno does not have something like NPM to be bought and sold because dependencies are URLs and thus decentralized. Also, Deno itself is open-source.


I'm excited about Deno, but I'm finding that the docs still need to be improved. For example, I'm trying to build a tcp server. I'm not able to get information on how back-pressure is handled.

I can see that Deno.listen returns an object which implements the reader and writer interfaces, but it isn't clear to me how to look for events, such as a disconnect or new data being available.

I wish there were examples showing how to correctly parse frames or implement protocols.

I'm sure these things will be expanded over time, partly by programmers in the community, but from the outside, things are still a bit rough.
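For what it's worth, here's the pattern I've pieced together from the runtime API types: `conn.read()` resolving to null appears to be the disconnect signal, and awaiting `conn.write()` is where back-pressure shows up. Treat it as a sketch rather than gospel, given the docs situation:

    // echo.ts: a minimal TCP echo server.
    // Run with: deno run --allow-net echo.ts
    const listener = Deno.listen({ port: 8080 });
    for await (const conn of listener) {
      (async () => {
        const buf = new Uint8Array(1024);
        while (true) {
          const n = await conn.read(buf); // null => peer disconnected
          if (n === null) break;
          // A real server would handle partial writes and framing here.
          await conn.write(buf.subarray(0, n));
        }
        conn.close();
      })().catch(() => {
        try { conn.close(); } catch (_) { /* already closed */ }
      });
    }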


This is probably unpopular here, but I wish people would just stop using Javascript on the server.

Well, I wish they would stop using it period, but at least in the browser it makes some sense.

Edit: to be clear, I have no beef with Typescript, Dart, Clojurescript, and the many other languages that compile into JS. It's JS itself I have issue with. I feel like it gives too much flexibility to young programmers to screw things up. There don't seem to be enough safeguards or training wheels. On large projects it's my nightmare.


If you have to use JavaScript, you can configure eslint with some plug-ins to get pretty good safeguards or training wheels. You will still miss strong typing, but eslint will warn about or prevent most of the awful or stupid code.
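For instance, a starting point along these lines (rule choice is purely illustrative; ESLint's JSON config format permits comments):

    // .eslintrc.json
    {
      "extends": ["eslint:recommended"],
      "parserOptions": { "ecmaVersion": 2020, "sourceType": "module" },
      "rules": {
        "eqeqeq": "error",               // ban loose equality
        "no-implicit-coercion": "error", // ban !!x / +x tricks
        "no-var": "error"                // require let/const
      }
    }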


> be clear, I have no beef with Typescript

One of Deno's main selling points is that it runs typescript out of the box?


> The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains.

I know this might be hard to see, but Rust is actually in the same domain. It is also, among other things, enabling product/frontend/web engineers to build backend/native/browser-less applications.

I'd bet Rust will be more successful here, especially given its amazing ability to change itself and innovate.


> It is also, among other things, enabling product/frontend/web engineers to build backend/native/browser-less applications.

Care to explain? I can imagine web programmers being productive in deno in a few minutes vs however long it takes to learn an entire new language, not to mention a language that requires memory management.


See the Rust Foundation announcement: https://foundation.rust-lang.org/posts/2021-02-08-hello-worl...

> "A language empowering everyone, but especially folks who thought that systems programming wasn’t for them.”


> Many developers, we think, prefer web-first abstraction layers.

Many developers, I think, don’t look past web-first abstraction layers.

I can’t tell you how many times I’ve seen CLI tools which are huge chunks of node wrapping a thin bash command. They are multiple files, orders of magnitude larger than they need to be, and require external dependencies because these developers are fixated on their proverbial hammer.


If you want a hot-deployed, distributed Java VM HTTP app server + JSON database, you can use my open-source alternative: http://host.binarytask.com

It uses comet-stream instead of WebSockets.

But it's fully "joint" parallel on all cores.


> To provide a modern, productive programming system that adheres to browser APIs.

In my opinion, the best part of node is (or was) that it didn't adhere to the browser APIs. That brought in some fresh air, and gave us buffers, native bindings, streams etc.


The web does have streams and buffers. WebAssembly seems to fill in the gaps for native code, but deno supports rust plugins, which removes any sandbox guarantees, so it's a trade-off.


This is cool and exciting to see. I've mostly watched Deno from the sidelines, only playing with code a little bit, but it's clear Ryan and co. are serious about applying the lessons learned from Node. Best wishes to him and the team.


This is a great news!

Ryan and the team should be capturing some of the value produced by their creation. And because Deno is a developer tool, it captures only a minor part of the whole value while enabling much bigger value creation!


>But over a decade later, we find server-side JavaScript hopelessly fragmented, deeply tied to bad infrastructure, and irrevocably ruled by committees without the incentive to innovate.

Trying to sell something based on FUD is always a bad sign.


The article left me wondering what the plan is for them here - why is it becoming 'The Deno Company'... other than that they have a bunch of investors. Is it just to get a team in to properly manage the project as a whole?


It's absolutely the right move to get investors. Deno has substantial advantages over NodeJS.

However, I personally would prefer Go or .NET Core for my backend any day. We need to wait and see where it's going ...

Good luck and success anyway!


This is super cool to see! Deno is really refreshing coming from node fulltime.


Congrats, I guess, but I'm not a fan of the VC route; we all know how this ends.


They figured out a way to fund work that they care deeply about.

With respect, and without impugning your right to make this criticism, I find your criticism shallow.

I don't know what your particular circumstances are, but I see your view expounded a lot by developers who are getting their salaries from companies that can afford to pay them because they took VC money in the first place.

We are certainly not entitled to Deno for free (although the MIT license is in their best interests for now). I am glad they found a way to sustain its development for the near future.


> They figured out a way to fund work that they care deeply about.

No, they figured out a way to defer that problem.

VC money isn't a gift. It's a loan.


It is most definitely not a loan. It is time/runway, purchased with equity and eventual loss of control.

Edit: There is more to it, of course. It is also entry into a somewhat exclusive network of future investors + customers if you choose your VCs wisely.


Which route on an independent MIT licensed project would you suggest?


It is a matter of time until they release a proprietary version which is better than the MIT version.

Chrome vs chromium etc


The blog post specifically says that that will not happen


Yes, but Facebook said it would not monetize WhatsApp, and said something similar about Instagram when they bought it.

We all know how that turned out.

History repeats itself; the above two are the recent incidents that come to mind.


And you will never need a FB account for Oculus


Most people will do things when in difficulties that they swear they would never do if asked while comfortable.


Most people are pathological liars.


No, they're not; a large number are - maybe 13%: https://www.moneycontrol.com/news/science/13-people-are-path... . What it really is, is that most people just aren't that self-aware, or have a better opinion of themselves than is perhaps warranted.

So let me make clear here: if I ever start a company based on open source principles, and it looks like I am going to get squeezed out of business by Amazon if I keep sticking to those principles, I will probably abandon those principles (unless I'm already a billionaire and have something really cool I want to work on anyway), because damned if I would let Amazon beat me. The chance of my abandoning those principles will correlate closely with how bad my finances would be after losing out to Amazon.

On the other hand if I build up a big enough company and start making money off of using open source technologies as a service for other technologists I hope I would support the open source projects from which I was benefiting.


> So let me be clear here: if I ever start a company based on open source principles and it looks like I am going to get squeezed out of business by Amazon if I keep sticking to those principles, I will probably abandon those principles (unless I'm already a billionaire and have something really cool I want to work on anyway), because damned if I would let Amazon beat me. The chance of my abandoning those principles will correlate closely with how bad my finances would be after losing out to Amazon.

If you want to keep this option open, then make no such promises, or (perhaps better) formulate the promises in a way that is aligned with the options you do want to keep open for the company.


I don't really see myself as ever wanting to start a company under open source principles, as I believe the whole Elastic/Amazon situation is highly instructive in this regard; at any rate, none of the projects on my big-projects list relies on being open source for gaining traction.


Let's make a gentleman's bet. If Deno is still completely free in 5 years, I'll give you $500. You up for that?


A gentleman’s bet with $500 is just a bet.


Isn't a "gentleman's bet" one with no bookie / office / dealer / etc?



Damn. Looks like I have lost a few of these.


> If Deno is still completely free in 5 years, I'll give you $500. You up for that?

Heck, sign me up. Who wouldn't be up for that? What's the downside supposed to be?


It wasn’t clear enough that if it isn’t, you’ll be expected to give me $500?


Maybe if you had included that in the terms of the bet.


You'll lose your money. Deno will be free. DenoPro won't be though.


Part of what VC investment means is that this is not his decision to make anymore.


Sell some of the Deno artwork [0] as NFTs?

[0] https://deno.land/artwork


Prediction: Microsoft will be buying Deno. Screencap this.


It's cheaper and gentler on the ecosystem to just add TypeScript support to Node and call it a day.


Deno is licensed as MIT. Awesome! But how will they prevent being freeloaded on?

My sense is that GPL3 gets a ton of criticism on HN, but isn't it the perfect defense against freeloaders?

* license the code for proprietary use in your stack

* use GPL3 if you have a non-commercial use, and are willing to accept the requirement to open source your own code

I don't understand why this option isn't used more by open source projects that want to be able to fund themselves.

Can anyone explain? (Even better if there are examples / case studies)


> I don't understand why this option isn't used more by open source projects that want to be able to fund themselves.

One possible reason is that such dual licensing requires copyright assignment from external contributors.


The best IMO is to go public domain. SQLite is a great example (https://sqlite.org/copyright.html).


SQLite's case is super interesting.

It is said that they can get away with having it as public domain because a key part of their business is SQLite's reliability, which is asserted by their large and closed-source test codebase.


> which is asserted by their large and closed-source test codebase.

Oh I didn't know (or forgot about) this! Super interesting. I love that strategy.

That's competition at its best. Don't assert any control over someone else, but do keep a few secrets to yourself so people know to come to you for the best stuff.


I hope that Deno can maximize TS performance. Right now they compile TS internally, but especially with this new financially backed focus, I hope they can modify V8 or fork it to create a TS engine that takes advantage of type information for optimization. That's really what's needed; in a "no implicit any" application, performance would be near native.

Also, it's nice that they're using Tokio for the HTTP server instead of a JS implementation (from what I understand). I want to see Deno near the top of the TechEmpower benchmarks.
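To illustrate the first point (a purely hypothetical sketch - neither Deno nor V8 actually does this today): under "noImplicitAny", every operand below is statically known to be a number, so a type-aware engine could in principle emit unboxed arithmetic instead of speculating on types at runtime.

    // sum.ts - fully annotated TypeScript under "noImplicitAny".
    // A hypothetical type-aware engine would need no runtime type
    // checks or deoptimization guards inside this loop.
    export function sum(values: Float64Array): number {
      let total = 0;
      for (let i = 0; i < values.length; i++) {
        total += values[i]; // statically number + number
      }
      return total;
    }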


"We are bullish about the technology stack we've built" - I would sure hope so!


I'm very pleased that JavaScript/Node is evolving. But thanks, too late; I'll skip it as soon as possible. It has too many unpleasant surprises. I'll move up the ladder to something that compiles to JavaScript, or whatever else, but no more plain JavaScript, please.


You can write TypeScript natively with Deno.
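For example (greet.ts is a made-up file name), Deno type-checks and runs TypeScript directly - no tsconfig, bundler, or separate build step:

    // greet.ts - plain TypeScript, executed as-is by Deno
    function greet(name: string): string {
      return `Hello, ${name}!`;
    }

    console.log(greet("Deno"));

    // Run with: deno run greet.ts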


> Of these, the web browser scripting language (JavaScript) is the fastest, most popular, and the only one with an industrial standardization process

Haha, this made me laugh hard; I stopped reading there.


> Many are more familiar with the Chrome DevTools console than they are with a Unix command-line prompt. More familiar with WebSockets than BSD sockets, MDN than man pages. Bash and Zsh scripts calling into native code will never go away. But JavaScript and TypeScript scripts calling into WebAssembly code will be increasingly common. Many developers, we think, prefer web-first abstraction layers.

Every time I read something like this I realize how much in the minority I am. I am not a web developer. I have never written JavaScript before in my life. I hate working with “web-first abstractions”. I feel like it is just massive bloat on top of a true native application. But given the popularity of things like Electron, React Native, Node, and Deno, I don’t speak for the majority.

And the thing is, I don’t know if I just learned web dev if I would love this new approach to software that is eating the world and I would “get it”. Or if it just exists because JavaScript developers don’t want to learn something new.


This is a common misconception. Javascript, and the web as a whole, have one of the lowest barriers to entry. You don't need to learn how to use a shell, install a programming language, or anything. You can literally open the developer tools in a browser you already have installed and try playing around with programming. When it comes time to ship, your users already have the runtime ready to go. This low barrier to entry means that many people who are new to software development end up choosing it. Some of these go on to become serious software developers, some learn enough to make a small career in a niche without picking up the skills of a generalist developer, and some keep it as a hobby. There are quite a few people that fall in those last two camps, but there's nothing inherently wrong with that.

I'm a fairly seasoned developer with experience shipping things written in Java, Scala, Ruby, Python, Perl, and JavaScript/TypeScript to large and high traffic systems and services. The tooling and developer experience of working with TypeScript is still the most pleasant I've interacted with. On the UI side, it isn't even close.


The web is eating the world because it's the easiest way to ship an application. It has nothing to do with whether developers like to do it or not.


And because web app publishers have full control. If I have a web application, I can update it right now and it will be updated for ALL my users at the same time. No store policies bullshit, no need to somehow notify users that a new version has to be downloaded and installed, no fragmentation of your user base because half of your customers stay at an old version due to whatever reasons out of your reach.


But you don’t get full control, and that’s why people go native. Because platforms rightly distrust web apps. This will likely be true for a long time as security becomes a bigger deal everyday.


Most "modern" platforms distrust native apps too. (Android, iOS and increasingly macOS, Windows and even Linux)

This is one of the main reasons that I often prefer web apps (my preference obviously depends on the use case). I would much rather run a random company's messaging/video chat/whatever app inside my browser with strong sandboxing, because browsers have truly accepted that applications should be considered malicious by default.


This. I cannot stress enough how important this is for modern app development. Just like Deno doesn't trust any script that is passed to it, no user should trust an app just because it's installed directly on the system (and I can't believe I lived with that mindset as well). Developers should grow cautious of any tool and library they install, and users should inspect more often what kind of access they're granting to anything they browse on the web, because at least there they can block it.
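As a concrete illustration of that deny-by-default model (fetch_data.ts is a made-up file name; --allow-net is Deno's actual permission flag):

    // fetch_data.ts - needs network access, which Deno denies by
    // default, so running it with no flags fails with a permission error.
    const res = await fetch("https://example.com");
    console.log(res.status);

    // Grant only what's needed, scoped to a single host:
    //   deno run --allow-net=example.com fetch_data.ts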


But on those more modern platforms, web apps still have a big disparity vs. native in terms of privilege, and it's not obvious why the gap should shrink; after all, native apps have passed the platform's internal review process and are thus considered more trustworthy.


You're talking about control of the hardware/filesystem/etc, I'm talking about control of the application itself.


You're talking about the relationship between publisher and platform, and how the web wins when it comes to managing and updating apps.

But those wins must also be balanced with the loss of control due to those same platforms distrusting your app. Now your app, for all the wins it's going to achieve on maintenance and updating, is also going to take hits from its inability to do things that other people take for granted on native apps.

They are part of the same story of balancing pros and cons.


Of course. Everybody here is aware of the tradeoff of using web technologies versus native ones. The fact that I emphasize what I see as the strongest benefits doesn't mean I'm not aware or ignore the tradeoff.


As someone who works with a lot of large businesses: they vastly prefer SaaS delivered over the web these days. Not only does it mean there's no infrastructure to manage, there are also no deployment/upgrade headaches if you need to roll a client out to a few thousand corporate users.

Web apps are also a hell of a lot easier to secure these days (as long as you trust / validate the platform’s security, which should be in the contract anyway). Which is also why a lot of “native” apps are pretty much web apps with some wrappers around platform-native hardware integration.


The OP didn't sound like they were referencing client/server applications, but rather writing native applications using web abstractions.


A Venn diagram of the top coders in the world and coders passionate about the democratization of technology:

                 Deno is in the overlap    
                                         
                     |                   
                     |                   
                     |     /---          
          /--- --    |   /-    \---      
      /---       \-- | /- Best      |    
     | Coders       \|-  coders      \   
     /  passionate /-|\--     in     |   
    |  about      -      \-  world    \  
     \ democratization -              |  
      | of           -\               -  
      \ technology -/  -\           -/   
       |---\     -/      -\       -/     
            ----           -\   -/       
                             --          
There is just enough overlap to bring any/all of the best ideas from the non-web world to web tech.


> Or if it just exists because JavaScript developers don’t want to learn something new.

It's probably a big reason. But if you think about it from the other angle... you don't want to learn Javascript, which would be new to you :-)


Speaking for myself, I have already learned JavaScript - I just don't want to have to use it.


With WebAssembly, now you don’t have to. You do need a JS shim to load the WebAssembly, but after that, pick your favorite source language that can target wasm.
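The shim can stay tiny. A minimal sketch (app.wasm and its exported add function are made up; WebAssembly.instantiateStreaming is the standard browser API):

    // The only JS/TS needed: fetch the module, instantiate it, then
    // call into code compiled from any wasm-targeting language.
    const { instance } = await WebAssembly.instantiateStreaming(
      fetch("app.wasm"),
    );
    const add = instance.exports.add as (a: number, b: number) => number;
    console.log(add(2, 3)); // 5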


This is fascinating. As someone who programs primarily in Python, I have been struggling with adapting to a JS-heavy environment in the past several years.

I have begun using Node + React for frontend use cases, but find that my build pipelines become incredibly cluttered and esoteric rather quickly.

I'm going to explore wasm solutions, thanks :)


I've long been of the opinion that you must learn JS today regardless of your preferred language - thanks to browser vendors and the web, it's more or less the closest thing we have to a "zero-dependency" runtime on all platforms. Learning a WASM-based workaround to avoid learning JavaScript is not helping you or the products you might want to make. There is also a huge ecosystem of existing JavaScript code out there to adopt or extend in the client. Expecting or wishing the market to bend to your preferred language or style is often a great way to become a very unhappy developer.


As someone who programs primarily in Python, I have been struggling with adapting...

Would you mind providing some more details here? After Python packaging drove me batty for the last time, I wouldn't have described the switch to JS/CS/TS as "struggling"?

With respect to "build pipelines", you don't have to use grunt, gulp, etc. It's totally fine to have regular bash commands in npm scripts.
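For instance (a minimal hypothetical package.json; the script names and commands are made up), plain shell commands work fine as npm scripts:

    {
      "name": "example-app",
      "scripts": {
        "build": "tsc && cp -r static dist/",
        "clean": "rm -rf dist"
      }
    }

Then npm run build or npm run clean does the job - no task runner required.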


Hm... I think the biggest issues I experience have more to do with frontend design than with composing JS itself.

I love using it as a "one-off" scripting language which usually involves interaction with some sort of an existing codebase.

I'm also fairly confident in doing any sort of backend-centric work, in which existing components most likely exist as well (or at the very least something POC-esque to iterate on or scaffold from).

With that said I believe most of these difficulties stem from most of my experience being backend-centric roles, so when it comes time to implement a frontend from scratch, I become unsure of where to start. The sheer amount of technologies available and vast amount of flexibility is also very intimidating.

However, I've been slowly but surely immersing myself in the stack(s), and have found that, much like my experience with CSS, the best way to learn is to simply do it - making mistakes and learning from them as I go.

I do appreciate what you said about build pipelines and feel much better about my current projects :-)

Quick edit: I think my lack of experience in functional programming is another factor here; however, TypeScript's object-oriented style is very attractive.


WebAssembly holds a lot of promise for sure, but the interface with the outside world still has to go through JS for now for anything non-trivial. I really do hope that this will change soon.


For web programming, WebAssembly is still not enough.

It solves the problem of how to run your non-web based code in a browser, but until it can interoperate with the DOM, JS will continue to be mandatory.


I think you mean efficiently interoperate with the DOM. There are plenty of abstractions (at least that I’ve worked with in Rust) that allow for dynamic management of the DOM.

For example, see the wasm-bindgen and the generated web-APIs in web-sys: https://docs.rs/web-sys/0.3.49/web_sys/

A nice framework that takes advantage of this is yew: https://yew.rs/docs/en/


What I learned about myself working as a NodeJS developer:

I’m okay writing JavaScript as long as I don’t have to spend all day every day writing it.


This is an interesting thing to think about.

I was primarily a PHP backend dev for 15 years who only used jQuery to toggle classes on click actions, but decided to try something new and switched to Node in 2015.

Node + ECMAScript 2015 just BLEW my mind and was so fun that not only am I now primarily a Node dev, but I've also branched out extensively into the frontend.

I've never enjoyed building websites or web apps as much as I have in the last 5 years.


Touché. I will admit I am resistant to learning JS. Part of this is that it seems like no one actually "likes" to program in it, they just have to because it is so ubiquitous. But I guess that just goes back to the Bjarne Stroustrup quote: "There are two kinds of programming languages: The ones people complain about and the ones nobody uses."


I like to program in JS. I've written non-trivial amounts of code in Java, PHP, Clojure and OCaml. I've written approaching-trivial amounts of code in Go and Rust.

JS has warts, undoubtedly so. But I try to approach it like I would Clojure, and that makes the experience much better for me. I think JS has the bones of a Lisp if you're willing to look for them. Sure, it doesn't have macros or decent conditional expressions or immutability, but using something like Ramda gets you some of the way there. And there are proposals for pattern matching and immutable data structures, though they are fairly far off.

I think JS is at its worst when people try to write it as a poor version of Java or C#. I have a bit of a love-hate relationship with TypeScript for this reason. Some aspects of its type system are great, like being able to make a union out of anything; I wish OCaml had that. But I think denying JavaScript's inherent dynamic nature is a mistake.

Overall I've learned to ignore its worst parts and just focus on what made it good from the start, being a weird cousin of Scheme that runs in the browser.
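As a sketch of that Lisp-ish, expression-oriented style (assuming the real ramda package; the data shape is made up):

    import * as R from "ramda";

    type User = { name: string; active: boolean };

    // Compose small pure functions instead of mutating in a loop.
    const activeNames = R.pipe(
      R.filter((u: User) => u.active),
      R.map((u: User) => u.name),
    );

    console.log(activeNames([
      { name: "ada", active: true },
      { name: "bob", active: false },
    ])); // ["ada"]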


> Part of this is that it seems like no one actually "likes" to program in it

JavaScript consistently ranks as one of the most loved languages in Stack Overflow surveys...

I find plenty of counter-examples of this point on HN too - engineers who have worked across many different stacks and rank JS/TS as one of the best dev experiences they've had (there is one such comment in this comment tree)

Anecdotally, I feel as if pre-ES6 a lot of the complaints you read online were from users of the language. Nowadays, it's as if most of the complaints are from people who don't use it.

I think there is something to be said against refusing to learn anything but JS, as knowing different languages can be a boon for an engineer, but I believe JS is an excellent tool to have in one's belt nowadays.


I love it, but I use Typescript, which has a few niceties (e.g., stronger typing) beyond modern JS.


For me, Typescript changed everything. I don't enjoy programming in plain old JS but I love Typescript.


I learned JS when it first showed up. Still write backend servers in C++ and JS libraries to interact with those servers from a browser.


I recently replaced an unreadable bash script with a small, maintainable JS program. It does the same thing as the bash script and accepts the same parameters, but it doesn't require a black belt in regexps or a trawl through hundreds of Linux/Unix man pages to understand what it does.

I think you'd be surprised how many modern applications are written in JavaScript or Python.

One of the more prominent ones is VSCode.
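A minimal sketch of the kind of replacement meant here (the file name and the "ERROR" convention are made up; runnable with Deno):

    // count_errors.ts - replaces a grep/wc pipeline: print how many
    // lines in each given log file contain the word "ERROR".
    for (const file of Deno.args) {
      const text = Deno.readTextFileSync(file);
      const errors = text.split("\n").filter((l) => l.includes("ERROR")).length;
      console.log(`${file}: ${errors}`);
    }

    // Usage: deno run --allow-read count_errors.ts app.log worker.log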


There's that famous Knuth exchange with another programmer where Knuth develops a super-complex data structure and algorithm to solve a problem, and the other programmer responds "that's cool, but I can do that in a single command line by piping three Unix commands".

There is a case to be made for programmers being at least familiar with Unix utilities and shell scripting so that they can unlock the superpowers described in that anecdote.

However, I largely agree with your sentiment - I pretty much do all of my scripting in JavaScript and Python and would not be particularly happy to have to deal with a large bash script.


It is time to post this again: The Birth & Death of JavaScript (2014)

https://www.destroyallsoftware.com/talks/the-birth-and-death...


Electron and React Native aren't popular because they're in JS. They're popular because you can write the application once and use it cross-platform. Mac/Windows or Android/iOS.


Electron is popular because, when your application is in JavaScript, writing and improving plugins for it is a breeze.

That was what propelled Atom and VSCode.

What you said is valid, of course, when you look at Slack and such, but you omitted the most important cross-platform target.

Your app can run on Windows/Linux/Mac... and the Web.


If you consider the web as a platform then you'll see immediately how writing an app in JS rather than in C++ can be advantageous.


Maybe I got a little too worked up in my other comment. Maybe you didn't intend the tone that I read into your comment.

But in a serious answer to your last sentence ” And the thing is, I don’t know if I just learned web dev if I would love this new approach to software that is eating the world and I would “get it”. Or if it just exists because JavaScript developers don’t want to learn something new.”

My opinion is the answer is a resounding “No, you will not just get it”.

This is a complicated question of course, but to distill my thoughts down:

1) You do not need to view web development as a threat to your skill set. You didn’t say specifically what stack you work in, but I have to believe that you will be able to continue making a living in it.

2) Tools like Deno are specifically designed and intended for web developers to be able to easily leverage their skill set in other areas, like systems.

So if you're not a web developer, and you don't have a particular interest in learning web languages/APIs, then don't worry about it. Just because it's trendy right now doesn't make it a better approach, technically speaking. It's quite possibly worse than the approaches you know.

So what I'm saying is: this tool isn't meant for you, and that's OK. Just because it makes the HN front page and web development is huge right now doesn't diminish the tech you know to be good.

Last sentence of the blog post: ”The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains.”


A lot of modern software has to ship a browser version and this simple fact can be all it takes to inform downstream choices.

Let's go with WebSockets as an example. You decide you need them in the browser. This will likely mean some other component will have to support them too. By now you need a pretty good reason to reimplement it over something else. If you need a handshake, you may exploit the fact that WebSockets open with a standard HTTP request; your handshake over another socket may look pretty different. DDoS protection services, proxies, etc. may support WebSockets but not arbitrary TCP, or they may support them differently. A WebSocket "message" doesn't exist in plain TCP; will you make your TCP protocol work the same, or will you handle frames differently?

Point is - your choice may be between just WebSockets or WebSockets AND something else.
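A sketch of the browser side (the URL and payloads are made up), which is why the rest of the stack often ends up speaking WebSocket too:

    // Browsers provide message-framed WebSockets out of the box; raw
    // TCP sockets aren't available to web pages at all.
    const ws = new WebSocket("wss://example.com/feed");

    ws.onopen = () => ws.send(JSON.stringify({ subscribe: "prices" }));
    ws.onmessage = (event: MessageEvent) => {
      // Each event is one complete message/frame - no manual
      // length-prefixing or buffering as with raw TCP.
      console.log(JSON.parse(event.data));
    };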


TypeScript/JavaScript have significant reasons to exist in their own right.


There's a big difference between "interpreted" and "JIT compiled". Both of these end up JIT-compiled in most cases.


> Or if it just exists because JavaScript developers don’t want to learn something new.

Looks to me like they don't want to write stuff twice, and they would have had to write for the web anyway.

The web stack is not good by any measure. The native GUI toolkits are suffering from abandonment, so the web-based toolkits are among the best available (but not at the top). But if you don't have any reason to expect to write web code, your life will be better if you ignore the stack.


> The native GUI toolkits are suffering from abandonment

SwiftUI is new.

Flutter is new.

Both arrived recently and have huge investment and usage behind them.


They answered this at the end of the article: “The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains.”

It takes time to learn an ecosystem, and when people know one ecosystem and not another, most people would rather do high skill, high value work in the ecosystem they know than start over as a noob in a new one.

The only thing unique about the web / JavaScript land is its ubiquity and size due to its place as the language of the browser. So anyone who can make an abstraction that lets JavaScript developers do stuff outside the browser without learning much new has a virtually guaranteed large audience of developers who are likely to think “I would love to be able to build X, I just don’t have time to learn a new ecosystem.” Those folks are then thrilled to dive in and use the new abstraction. And there are a lot of those people. And that’s why we are all stuck using Electron apps for so much stuff. :)

But that doesn’t have to be a bad thing. Electron can evolve to be less of a resource hog, and better alternatives are being tested all the time. The same is true for other non-browser applications of JavaScript.

I don't know if this vision is reality, but I think it may be that we're in the early days of a transition toward the browser stack being the new "GUI." Which is to say, back in the 80s there was a lot of debate around GUIs and whether they were a good idea etc., and while most people liked them to some degree, they also lamented the idea of losing the command line. But in the end, GUIs didn't shrink the number of CLI tools in the world; rather, they increased the size of the domain that computers are used for by making them more accessible to more people. I think that so far the web vs. native debate seems to be following a similar trajectory.


> It just exists because JavaScript developers don’t want to learn something new.

Are you sure? I started building a desktop application recently. As much as I abhor Electron and refuse to touch anything built on top of it, there basically wasn't any other choice, due to the absolutely massive amount of libraries written for the web. I threw a couple of them in and it saved me at least 90% of the work compared to what I would have had to do if I'd chosen Qt (or anything else native). Would I rather spend a week and have something to show my client, or roll up my sleeves and re-implement everything myself (which would take about half a year before I'd have anything working at all)?

(For the record, I didn't downvote you. I consider it a shitty practice to downvote somebody for their opinion, without even replying.)


PWA installs are another way to get onto a system. It's less resource-intensive, as you leverage the browser already installed on the system.

It doesn't work if you need extensive access to local hardware. But if you're mainly on the network anyway to operate, it's very frictionless and relatively lean.

With my last app, the network asset bundle is about 2 MB; it uses only ~11 MB on disk on my Mac and ~25 MB of RAM when running.
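The install mechanics are light too. A minimal sketch of the app side (the /sw.js path is made up; registering a service worker is, alongside a web app manifest, the main prerequisite for installability):

    // Register a service worker so the browser can offer to install
    // the app and serve it offline; the browser supplies the rest.
    if ("serviceWorker" in navigator) {
      navigator.serviceWorker
        .register("/sw.js")
        .then(() => console.log("ready for offline use"))
        .catch((err) => console.error("registration failed:", err));
    }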


Yeah, and I meant that with a wink, then wrote a lengthy justification for that thinking, lol. Oh well, I rephrased to take out the jest.

I don't disagree with you about why people use Electron. There's a positive feedback loop around ecosystem sizes: the bigger the platform, the more devs use it, the more tools they make, the more libraries are made, the more it's the easiest way to do X, the more applications are written for it, and the bigger the platform gets.

Anything you can offer to a large group of developers that lets them extend their existing high skill level to new domains where they would otherwise be novices is very likely to be adopted.


[flagged]


I think you took the wrong tone from the parent comment; it sounds like this person was genuinely interested in sparking a real conversation about how people feel about the environment they spend time in, and whether that could have shifted dramatically based on what they decided to, or had to, learn.


I feel like if this person was interested in “sparking a conversation,” they wouldn’t open said conversation with “I hate how bloated web abstractions are, but I guess they’re popular for some reason,” with the exact same digs at electron that have been rehashed in every HN comment section.

I agree, to be clear — I use many of said abstractions, and they’re often quite bloated and everything else this person is complaining about. This just doesn’t strike me as someone who’s asking for insight; it’s just complaining, and it’s fine to point that out.


Others said they didn't get the same tone as I did, so maybe I'm off there. I'm willing to reconsider.

I'll just give my opinion on the final sentence, which I guess is the conversation we should be having.

"And the thing is, I don’t know if I just learned web dev if I would love this new approach to software that is eating the world and I would “get it”. Or if it just exists because JavaScript developers don’t want to learn something new."

I think the answer to this is a resounding "No, if you're not a web developer you will not 'just get it'".

I don't think one approach or the other is 'better', but it just depends on what your background is and how you got into programming, and that's fine.

These tools are meant for web developers. We are talking about the next iteration of NodeJS here. The whole point is that it merges web programming with systems programming. So if you're not a web developer, there really is no good reason to jump over unless you are just curious.

From the blog post: "The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains."


> Respectfully, your comment is way off topic, and nothing to do with discussing The Deno Company,

My take from reading the blog post was that the Deno Company's entire thesis is that the developer community will increasingly be moving away from these older abstractions and towards web-based abstractions, and Deno is positioning itself to fuel that migration. GP's comment addresses this directly; it seems rather on-topic to my reading.


Fair enough. But I don't see it as the community moving away from older approaches; the old ways will always be there. I see this as an on-ramp for web developers to get into systems programming.

The reality is there has been a huge wave of new developers in the last decade who entered programming through the web. I guess the OP sees this as a 'threat' to his way of doing things somehow, but I don't think it's a threat at all. It's normal to feel that way, though.

These tools are meant for web developers, and if you are not one, that's okay. There is probably no good reason to make the jump unless you are just curious or bored.

Last sentence of the blog post: "The Deno company hopes to enable the millions of web programmers out there to maximally leverage their craft in other domains."


When I read the parent comment I didn't infer any of the spite you seem to have felt. It's an interesting question, rephrased: "Is this the best way forward? Or is it only better because people don't need to learn a new language? What are the advantages and disadvantages? Will my current approach to development go out of fashion?"


Fair enough, maybe I misinterpreted. But things like "Or if it just exists because JavaScript developers don’t want to learn something new" are very divisive - as if there is something different about JavaScript developers that makes them lazy, while C programmers are infinitely curious and learn everything new they can.

Edit:

I don't believe one way or the other is 'better', technically speaking. Usually the 'best' way forward is the one you have the most skill in - so yes, the one where you don't need to learn a new language. Tools like Deno are designed for web developers. Just because they are popular and you hear about them a lot doesn't mean someone else's way of doing things is threatened.


The New Way prints money much more reliably. It gets more people involved in the process. It forces deeper coordination across communities in and out of tech, strengthening connections.

The old way gets _your_ job done _effortlessly_. idk, I guess that’s not ideal in the meat world.


[flagged]


> From there I stopped reading.

Why? Is it https://news.ycombinator.com/item?id=6845286 ?

Because the next paragraph goes:

Rest assured that Deno will remain MIT licensed. For Deno to grow and be maximally useful, it must remain permissively free. We don’t believe the “open core” business model is right for a programming platform like Deno. We do not want to find ourselves in the unfortunate position where we have to decide if certain features are for paid customers only. If you watch our conference talks, you will find we've been hinting at commercial applications of this infrastructure for years. We are bullish about the technology stack we've built and intend to pursue those commercial applications ourselves. Our business will build on the open source project, not attempt to monetize it directly.


"it must remain permissively free."

... until the VCs change their mind or AWS uses our code to make more money than we do.


> ... until the VCs change their mind or AWS uses our code to make more money than we do.

That is the correct answer.

Amazon (AWS) can lift Deno, offer it as a service on AWS for devs, and effortlessly compete, outscale, and outdo them a hundred times over, superseding the Deno Company. Thanks to the '...permissively free' MIT license (being AGPL would make no difference either), AWS can do just that.

Just like they did to MongoDB, Redis, and Elasticsearch, all victims of Amazon's ruthless tactics around open source.


I just don't understand this. This project uses distributed version control, which implies two things:

1. It is impossible to delete every copy of this project. Even if the authors wanted to restrict access by taking the repo private, they can't; it will always be out there.

2. Every commit is licensed under MIT. So even if there is a licensing change, you still have access to every existing commit under MIT. You've lost nothing except rights to future work.

Even if they change their mind, you can still use their existing code to run your software. If it's popular enough to be forked by the community, you can use the fork if you prefer.

You only need to be concerned if you're heavily invested in their platform, it's not popular enough for community support, and you need new features added. In this unlikely scenario, yes, you won't have any option but to buy into their new business model.


"If it's popular enough to be forked by the community"

Yes.

And if it isn't, you need to switch.

If you are tied to proprietary tools or APIs, then you have a major migration in front of you. If you aren't, you might be able to use Node.

If you already have a tight schedule, or no development capacity for the migration, you will need to buy a license, because you probably don't want to run your code on a platform that no longer gets security updates and for which you no longer get support.

When this happened to me in the past, I got some major bills for licenses, because most of them are server-based (or core-based) and also count development, testing, staging, and CI servers (>$100k/y).

"You only need to be concerned if you're heavily invested on their platform and it's not popular enough for community support and you need new features added. In this unlikely scenario, [...]"

s/unlikely/likely/g

I have no data, but I would assume this is the default ending, because most projects are not forked (Mongo, ...) into a successful community around the fork, and most companies I work with are heavily invested in a platform and do not adhere to the standard APIs and tools (e.g. I'd say 10% of my customers use AWS in a standard way; 90% are heavily invested in the platform; same for GCP). If you have other numbers and/or studies, I would be interested.


If it is run by an established foundation like Apache with established procedures for how to manage large complex projects, then you can be sure that the project will remain open-source and usable for your purposes. Otherwise, I'm not so sure.


I'm incredibly slow, and only just realized that Deno is an anagram of Node.


> Of these, the web browser scripting language (JavaScript) is the fastest, most popular, and the only one with an industrial standardization process.

Most popular, I can agree. But fastest, and the only one with an industrial standardization process? Have they met Erlang?

edit: you have to be kidding me, downvoted to oblivion for an honest observation. Sorry I hurt javascript's feelings.


Erlang is compiled to bytecode, right? So is it considered a scripting language?


I don't understand the connection being made between having a bytecode compiler and being a scripting language.


Java is definitely not a scripting language


Is JavaScript? Python? Ruby? All major implementations of all three are bytecode compilers/VMs.


It "feels" like a scripting language and is run in a runtime shell environment. But I would concede to your point, as well.


Sounds like total bullshit. Deno hasn't made any dent in Node.js and never will, because nobody is rewriting all of their stuff for the new API. It's just that all the scummy founders and rockstars of Silicon Valley have found that offering higher-level services is a great way to scam people out of their money, because it's the only way the new "average" programmer can make anything at all. The only problem is that, long term, the people using these services will need to find actual programmers.



