A real barrier to a decentralised web is the difficulty of installing software on a server. I know that sounds really mundane and inconsequential in the broader debate about a decentralised web, but consider the following...
Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app. In a desktop app, it's usually one click to start the install and then, if necessary, you're guided through a few screens to complete the install. Want to uninstall? The OS (operating system) will provide a feature to manage that.
Now consider how complicated installing on a server is in contrast. Upload your files to a folder or directory, set permissions, set configuration not just for your server but also for the language the program is written in - the list goes on. No wonder SaaS (Software as a Service) is thriving like never before. Who, other than technical folks, could possibly have the time, interest or inclination to set up a self-hosted solution when the barrier is so high? Perhaps some in the tech field would like to keep it that way? Would SaaS be less attractive if installing a self-hosted solution was simple, easy, quick and secure?
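To make that concrete, here is a rough sketch of what a manual install of a typical PHP app can involve (the paths, package and host names are illustrative, not from any particular app):

    scp -r myapp/ user@myhost:/var/www/myapp           # upload your files
    ssh user@myhost
    sudo chown -R www-data:www-data /var/www/myapp     # hand ownership to the web server
    sudo apt-get install php-mysql                     # install a language-level dependency
    sudo nano /etc/apache2/sites-available/myapp.conf  # hand-write a vhost configuration
    sudo a2ensite myapp && sudo systemctl reload apache2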
Surely an essential part of a decentralised web is that companies, organisations and individuals choose to run their own software using open protocols and data formats. But until the ease, security and simplicity of installation improves for web software, it simply won't happen on a large scale.
Living billboard arrives to announce, drumroll please:
sandstorm.io
The auto-updating server that you can self-host or use as a hosted service, letting you install and uninstall lots of applications, both free and proprietary, just by clicking a button.
(No, not paid to tell this unless you count the stickers I got. And no, not perfect but still a refreshing change.)
Sandstorm is awesome. Of course, it would help if ISPs weren't so keen to make sure nobody actually uses the internet to make computers talk to each other, instead of just having Amazon's, FB's, and Google's computers talk at you.
Has anyone ever actually seen an ISP network that filters local traffic to other ISP endpoints? I'm not even sure why they would do that. DDoS almost always targets big things in the cloud, so it would not help there.
Here (SoCal) traceroute to a host in the same local neighborhood is three hops.
Its only purpose is to force businesses up to business class. You might see it enforced if you ran a public, for-profit site from a consumer connection.
In any case the encrypted p2p protocols run by dapps (decentralized apps) are opaque, and these don't really qualify as servers because architecturally they are not client-server at all.
This. I am an ISP owner, and we block popular service ports by default. 99% of customers don't care; their computer gets hacked or picks up a virus/malware, the malware starts running services and spamming the world, our IPs get blacklisted (which is a huge headache), and then Cloudflare's broken shit IP checker automatically starts blocking IPs and customers start blaming us for it. Occasionally, 1 or 2 customers will ask to have the email or web server port opened; we usually ask them to sign up for a business contract, which clearly states that you can run these kinds of services and that you are fully responsible for any negative consequences.
This is not about money; it's just good housekeeping. When I was in NY, I believe Time Warner and Verizon (or was it AT&T?) also used to do the same thing.
Thanks for responding with the ISP's point of view. My frustration came from just wanting to run a small Ghost blog on an ODROID with, at most, hundreds of visitors per month, and hearing that this required a business contract. Of course, whether I want to do this for business or pleasure doesn't really matter if you're the one dealing with malware and being blacklisted.
At one time ISPs gave you a bit of space on a shared host for free as a courtesy; it would be nice if these days you could get the equivalent of a t2.micro. I know a few people (teenagers I'm related to, mostly) who would like to mess around with programming and building web sites, but for whom even a tiny cost is a barrier. As it is, I throw a few bucks at NearlyFreeSpeech for them and it's good for a year, but I imagine plenty don't have that option.
I think there is a miscommunication between consumers and internet providers. If consumers were aware of how things are run inside an ISP, they would be more understanding and reasonable when facing a problem like not being able to host services themselves. For my part, I try to write about my struggles in running our ISP business <1>.
Another thing to consider: we couldn't easily allow our users to run services even if we wanted to, because most home users' IPs are not static, and often one IP is shared by many users. IPv4 addresses are running out; giving a user the ability to run services also means giving them a unique IP, which they have to pay for, and because of the scarcity of unique IPv4 addresses only business users are allowed to have them - and business accounts cost money. We haven't moved to IPv6 yet; it will require some investment and overhauling parts of our network. Depending on your local law, you may actually have to have a business registered under your name (meaning you have to show business license papers) to get business connectivity.
I know what you mean about wanting to run your own blog or services. I have done it myself, and I learned much of what I know today from tinkering with and optimizing Apache and WordPress on the fly while my blog was on the homepage of Digg. It was such a rush, and I have come a long way since those days, thanks to being able to host my blog on a Time Warner DSL line. This was in the early-to-mid 2000s. But it was a different time, and things have changed a lot since then.
One of the reasons ISPs back then gave you a free email address and shared hosting was that they wanted to tie you to their services. Specifically, if you used their email for a long time, it would be difficult for you to leave it behind. But the landscape has changed since then: consumers are smarter, webmail is much better, and free blog hosts are a dime a dozen.
My advice to people who want to learn by hosting services on their own computer: even though that can be a thrilling experience, hosting remotely and configuring/securing a server from scratch can be an even better learning experience. And VPS hosting services like DigitalOcean can be very affordable.
There is a popular misconception that ISPs are a profitable business that tries to shaft consumers at every opportunity. Maybe for big players like Comcast that's true, but for the rest of us small and medium-sized ISPs it's far from the truth. The complexity of running reliable services can be staggering, the equipment and licensing can be an astronomical investment (at least for us it's a large amount), and running the last mile and its upkeep with 24/7 support is very expensive. Bandwidth is not expensive, but equipment is. When we saturate our 10 Gbps port, moving to a 40 Gbps or even 100 Gbps network (switches and routers) can be a mind-numbing expense. On top of that, we are taxed and pay high license fees for all kinds of services. We really don't make a lot of money, and you constantly have to spend on your infrastructure as you grow (fighting to win customers by lowering your prices, and hence your revenue and profit). You have to keep growing to stay relevant and reach the point where you can connect more users at minimal cost without having to invest in expanding your network. Reaching that point can take anywhere from 5 to 10 years, if not more.
Sorry for the rant, I just needed to get it out of my system. :)
I can attest to that; my previous ISP explicitly stated that running a web or other file server was not allowed. I'm not exactly sure whether the TOS was in violation of my consumer rights, but the fact is that they could cut me off if I were to set up a server... even for personal use (it didn't say anything about running a server business).
> DDoS almost always targets big things in the cloud
That isn't true. Those are the ones you'll hear about in the media, but there are plenty of DDoS attacks on smaller sites, both for extortion and to force them offline.
It's worth highlighting that many internet connections are indeed asymmetrical. My Comcast service is 180/20 Mbps, down/up respectively. Even if you get Comcast's new gigabit service, you get 940 down and 40 up. And many home ISPs prohibit you from serving a website at home, according to their terms of service.
It's pretty common to have outbound traffic to port 25 blocked as an anti-spambot measure. Sometimes you can call them and get the block removed; other times it's non-negotiable.
Most cloud servers do this too. Spam has basically destroyed SMTP as an open federated protocol anyway. Network reachability is the least of the issues you'll face trying to run your own mail.
Lesson: any distributed or federated protocol that is not robust against abuse is doomed.
I blacklist outbound 25 except from our mail server. That's just basic common sense given some of the spyware out there. Users who need to send mail will use encrypted 465 or 587 anyway.
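For a sense of what that rule looks like, a minimal iptables sketch (192.0.2.25 is a placeholder address from the documentation range, standing in for the mail server):

    # Drop outbound SMTP from everyone except our mail server
    iptables -A FORWARD -p tcp --dport 25 ! -s 192.0.2.25 -j DROP
    # 465 and 587 (authenticated submission) are left open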
P2P filesharing is part of the "decentralised web", and people seem to have no trouble installing the software for those.
> Now consider how complicated installing on a server is in contrast. Upload your files to a folder or directory, set permissions, set configuration not just for your server but also for the language the program is written in - the list goes on.
I think a lot of the difficulty is artificial, created by software that is far more complex than it needs to be, covering far more use cases than most users actually have. In the "enterprise" space, a lot of this complexity probably also drives auxiliary revenue in the form of training, consultancy, etc. In other words, it could be a deliberate barrier to entry. Building big, complex, immensely flexible, yet difficult-to-configure systems with plenty of dependencies just seems to be the norm.
> Perhaps some in the tech field would like to keep it that way? Would SaaS be less attractive if installing a self-hosted solution was simple, easy, quick and secure?
Indeed, the whole category of "enterprise software" often fits this business model.
But I, and many others, have written HTTP and FTP servers which do not require any installation at all --- they're just a single binary (often very tiny compared to most other software), sometimes with an optional configuration file. If you're doing something like hosting static pages, this fits the use case perfectly well.
Finally, a huge part of making the web truly decentralised is to abandon the notion of dedicated servers/clients altogether --- and thus also the notion that you need a dedicated "server" computer to host anything, or for that matter a dedicated Internet connection. Of course some machines will have more resources to serve with, and a typical residential connection may be more limited, but the key idea, as exemplified by P2P, is that any machine can serve.
>P2P filesharing is part of the "decentralised web", and people seem to have no trouble installing the software for those.
We can't really call today's P2P filesharing decentralized. It depends on trackers, and it's the trackers that help users (often out of commercial interest) overcome all the trouble with the software.
Some torrents depend on trackers, yes. Others use magnet links, which aren't dependent on trackers at all. The data typically served by a tracker is instead stored in the DHT.
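For illustration, a magnet link carries the content's infohash itself, so peers can be found purely through the DHT (the hash below is a made-up placeholder):

    magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567&dn=example.iso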
Every peer-to-peer system requires bootstrapping; often this is achieved with some central or federated nodes/trackers. Once you have peers, you can forget about the trackers and use the DHT.
Have you looked into Sandstorm? Their whole goal is to make installing server apps as easy as on mobile platforms, while securing them from the outside and from each other, and allowing the user to switch providers, bringing their apps and data along.
Here's something from the first part of the install which is not something a normal person would ever do:
> Configure the EC2 security groups (firewall) to let you reach Sandstorm
> By default, Amazon EC2 protects your virtual machine from being reached over the Internet.
> In the above tutorial, we allow the virtual machine to be reached on port 80. By default, Sandstorm uses port 6080, so look through the above tutorial and add another security groups rule allowing port 6080.
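For anyone following along, that extra rule can be added from the AWS CLI along these lines (the security group ID is a placeholder you'd replace with your own):

    # allow inbound TCP 6080 from anywhere
    aws ec2 authorize-security-group-ingress \
        --group-id sg-12345678 --protocol tcp --port 6080 --cidr 0.0.0.0/0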
I've just installed Sandstorm to test it: created a droplet on DigitalOcean, logged in ("ssh root@...", and they had my SSH key), ran the curl|bash, and it worked. It was way beyond awesome for an open-source Linux app. So the EC2 problem exists only because someone advised using Amazon for a hobby installation.
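(From memory, the whole install step is the one-liner from their docs:

    curl https://install.sandstorm.io | bash

and the script walks you through the rest.)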
I love the concept, and I'm looking forward to the day a Facebook alternative can be built on it. And if it had existed when Google Reader left the internet, everybody would be on Sandstorm today. I've just experienced the feeling of browsing my RSS feeds of porn without feeling watched and without being afraid of hitting a "share to Facebook" button: it's a great experience.
It's more like suggesting that they should drive their own cars, pump their own gas, or change their own tires in an emergency. Nobody is suggesting that people should be able to program a webserver.
> A real barrier to a decentralised web is the difficulty of installing software on a server.
Also, economies of scale.
If people used a decentralized service called "UnFaceBook", the total cost of servers, administration, etc. would dwarf the cost of Facebook running their data centres. From a business perspective, it's just not feasible.
Hmm... perhaps everyone running their own systems is, in fact, doable. Most people have smartphones which are much more powerful than servers from 6 years ago. Why not just use that?
Have the content at the edge, and controlled at the edge. Scalability can come from lots of caching at the core.
Not a startup I'd want to do, but it's technically feasible.
> Most people have smartphones which are much more powerful than servers from 6 years ago.
This isn't even remotely true.
The most powerful server CPU from 6 years ago is the Xeon X7560, which has a 40% higher PassMark score than even the best enthusiast consumer-level CPU on the market today (an i7-6700K), never mind the most expensive smartphone on the market.
I think that qualifies as "remotely true" actually. They didn't say "more powerful than the most powerful servers" they said "more powerful than servers".
People on the internet have an odd tendency to interpret statements in the broadest possible way.
I did misread that. I think the point still stands, though. The cheapest server CPUs from 2010 are 60% as good as the best 2016 CPUs. That might be close to the latest chip in the iPhone 7 or Google's Pixel, but most people don't have the latest and greatest. A normal phone won't be close to the latest and greatest i7, and I don't think it will be close to a 2010-era server either.
Economics and scale can be a strange beast. The sum of Amazon EC2 + Google Cloud + Digital Ocean + Rackspace +... is about 10,000,000 servers, which makes... 1 server per 700 inhabitants on Earth. Have you ever looked at it this way?
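Working that out:

    7,000,000,000 people / 10,000,000 servers ≈ 700 people per server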
And that's only the public cloud, not including Facebook, Google's internal servers, Apple's infrastructure, ISPs, and the servers hosted by all companies. So to provide all IT services to the citizens of modern economies, we're certainly close to 1 server per 100 inhabitants. Sometimes I wonder what we're doing with so many servers on Earth: I don't spend 24 hours a day sending requests to public servers, and even if I did, the server I'd be pinging could handle a few thousand users at the same time. So where does all this processing power go?
And there's even more computing available if you include everyone's home and work PC, phone and router, but those are not always-on.
> Hmm... perhaps everyone running their own systems is, in fact, doable.
Crunching the numbers, we're already above one system per person ;) So we might as well go full-decentralized, if we could conceive a theoretical model around it.
Security, redundancy and isolation. Often you have an extra server not because you need the processing power, but to separate things for security reasons, to provide failover, and to avoid noisy neighbors.
Certain things, like filtering out spam or handling video, require quite a lot of computation per user per day. The same probably applies to just transferring data quickly enough, with a lot of spare capacity to handle spikes.
I like the idea of smartphones as servers feeding content to a CDN... Combined with IPFS [1] that should work well.
I was going to make a different comment though. You suggest that, because it would have higher infrastructure costs, a decentralized network is "not feasible" from a business perspective.
I'm wondering a) whether infrastructure costs are currently a limiting factor in the growth of social networks, and b) if a decentralized social network needs to be a business at all.
But more to point a: what if it cost 10x more, but the current cost were $0.10 per user per year? Do you think a service with a cost of $1 per user per year would be too expensive to operate?
I was just thinking about exactly this. If the phone is offline, we need CDNs. But then again, phones have to watch out for their precious battery life.
With decentralization, you also have a huge protocol problem. Facebook can instantly upgrade millions of people to a new version; upgrading a decentralized network could be a major pain.
> A real barrier to a decentralised web is the difficulty of installing software on a server … Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app.
If you have a desktop, you can install any server software you want. And if you have a desktop, you can leave it running as long as you want. You don't need to purchase a host somewhere; you can just use the computer you own.
Heck, you own a computer which is on 24/7 already: your cell phone. And software written well could run on that phone to serve whatever you want, at a minimal cost in CPU & hence battery.
Software written well can work that way on Android, but not on iOS, which force-suspends your process and sockets all the time. The only exceptions are granted unilaterally by Apple, and the chance of them allowing server-like behavior on iPhones is infinitesimally small.
That's why it's important to hold on to people owning general-purpose computers, open standards and community governance: because once you move away to centrally-controlled appliances, there won't be a platform from which to bootstrap the next, better system.
> A real barrier to a decentralised web is the difficulty of installing software on a server.
LOL.
Decentralised would mean the user installs software on his/her desktop/phone and it works the way you think a server works. That's all there is to it.
If you make "installing on a server" - somebody else's computer - easier, it's not going to be decentralised. You'll have sharing of people's "apps" on a single server, and eventually the Cloud is going to be reinvented. Look at how easy it already is to run apps and install shit on other people's computers.
Running "a server" is not any more difficult than running any other app, on Android or on Linux, the difference is pacman -S kwrite or pacman -S lighttpd, or picking "primitive ftpd" app and running it.
I think the real problem is that people think servers are some kind of magical computers, different from any other general purpose computers people are already using as clients.
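For instance, on Arch with systemd, getting a basic web server going is roughly this (assuming lighttpd's default document root of /srv/http):

    sudo pacman -S lighttpd
    sudo systemctl start lighttpd
    echo 'hello' | sudo tee /srv/http/index.html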
> Running "a server" is not any more difficult than running any other app on Android or Linux; the difference is pacman -S kwrite vs pacman -S lighttpd, or picking the "primitive ftpd" app and running it.
Which directory does the ftpd app store its data in, and on which partition? How do I get alerted if I'm running low on space, and what do I do if I am? Is the data backed up, and what is my disaster-recovery process? What port is it running on, and how do I connect to it? BTW, insta-fail, because we should be using secure FTP. Is it using encrypted communications, and how do I install a certificate and share public keys? What about configuring access through the firewall? Etc., etc.
It is as easy as installing a desktop app. Sometimes it's even easier (for a desktop app, you can't just type "apt-get install Word<Enter>" and have it be there).
The problem is that you don't just want a CMS; you want it to look a particular way, do particular things, keep other people from accessing it without your permission, etc.
> Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app. In a desktop app, it's usually one click to start the install and then, if necessary, you're guided through a few screens to complete the install. Want to uninstall? The OS (operating system) will provide a feature to manage that.
It's this way already. Virtually every shared hosting provider (HostGator, Bluehost, etc.) provides a cPanel admin panel with the Softaculous software installer. All kinds of apps (blogs, CMSs, project management apps, etc.) are a single-click install and removal.
Or an organisation people could freely join without being forced to learn system administration concepts and practices. Imagine WhatsApp, but owned and sustained by (a subset of) its users on a non-commercial basis.
I dunno, there are an awful lot of server apps that I can install simply by running apt-get install. Achieving nirvana might be a simple matter of packaging.
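For instance (dokuwiki is just one packaged web app I happen to know is in the Debian repositories):

    sudo apt-get install dokuwiki   # pulls in its PHP dependencies automatically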