And what about privacy? An entire OS that records everything you do online - it even records what you print on your desktop printer.
Microsoft rightly gets skewered for tracking user behaviour in Windows 10. A few years ago, Canonical was heavily criticised for sending anonymous data from Ubuntu to Amazon. Meanwhile, Google captures more user data than possibly any other tech company - we're talking gargantuan levels of data - and yet on matters of privacy, the tech community gives Google a free pass. Why such double standards?
This is a quote by Niklaus Wirth, a Swiss computer scientist who has created many programming languages, most famously Pascal. It's a long quote, but I like it very much:
"Many people tend to look at programming styles and languages like religions: if you belong to one, you cannot belong to others. But this analogy is another fallacy. It is maintained for commercial reasons only. Object-oriented programming (OOP) solidly rests on the principles and concepts of traditional procedural programming (PP). OOP has not added a single novel concept, but it emphasizes two concepts much more strongly that was done with procedural programming.
The first such concept is that of the procedure bound to a composite variable called object. (The binding of the procedure is the justification for it being called a method). The means for this binding is the procedure variable (or record field), available in languages since the mid 1970s. The second concept is that of constructing a new data type (called subclass) by extending a given type (the superclass).
It is worthwhile to note that along with the OOP paradigm came an entirely new terminology with the purpose of mystifying the roots of OOP. Thus, whereas you used to be able to activate a procedure by calling it, one now sends a message to the method. A new type is no longer built by extending a given type, but by defining a subclass which inherits its superclass. An interesting phenomenon is that many people learned for the first time about the important notions of data type, of encapsulation, and (perhaps) of information hiding when introduced to OOP. This alone would have made the introduction to OOP worthwhile, even if one didn't actually make use of its essence later on.
Nevertheless, I consider OOP as an aspect of programming in the large; that is, as an aspect that logically follows programming in the small and requires sound knowledge of procedural programming. Static modularization is the first step towards OOP. It is much easier to understand and master than full OOP, it's sufficient in most cases for writing good software, and is sadly neglected in most common languages (with the exception of Ada).
In a way, OOP falls short of its promises. Our ultimate goal is extensible programming (EP). By this, we mean the construction of hierarchies of modules, each module adding new functionality to the system. EP implies that the addition of a module is possible without any change in the existing modules. They need not even be recompiled. New modules not only add new procedures, but - more importantly - also new (extended) data types. We have demonstrated the practicality and economy of this approach with the design of the Oberon System."
I agree that many non-fiction titles could be shorter (although probably not the length of a blog post).
I confess I am an impatient reader, and many non-fiction titles I read feel padded out with unnecessary verbiage. I do wonder if the tools we use to write have some bearing on this. When using a computer to write, the ease of editing means it's much easier to just write and write and write.
Imagine if you were forced to write your manuscript by hand. I'm guessing your non-fiction manuscript would end up shorter, sharper and more to the point.
From what I've heard, it's more of a market pressure.
Imagine you wrote a typical 250-page non-fiction book, padded out in the usual way. Just before publishing, you get fed up and edit it down to just 50 pages that convey everything without any fluff. Your publisher will tell you you're nuts and insist on publishing the 250-page version. Why?
Imagine you are shopping for a book on a topic. You find 6 such books in the store. 5 are the typical 200-300 page pieces. One is 50 pages. They all cost the same. You don't have the time or patience to investigate their relative quality. Do you A: assume the 50-page book is a low-effort blog series repackaged in print, or B: bet that the 50-page version is a high-effort, careful edit? Most people choose A, and I'd bet they are almost always correct in current practice.
Tldr: Short books look cheap. Customers assume they are ripoffs and don't buy them.
Minor nit: writing ≠ editing. One could argue that the ease of editing could result in more concise pieces as it's easier to refactor what's already been written. Editing does take effort, though.
The Labour Party's abstention on this bill is unforgivable. We need to hold the Conservative government to account for their deceit and utter incompetence. We need a strong opposition party now more than ever. But Labour have completely failed in that regard.
Yes, I realise Labour is up against our mostly Labour-hating right-wing national press who seem as culpable as our politicians in perpetuating lies and untruths to push their own agenda. (Is there any country in Europe that has a national press as vile, racist, and nasty as our national press in the UK?)
Labour should be pummelling the Conservative party for their endless lies and dishonesty. They should have ripped the Conservatives to shreds over this bill (as they should have for Brexit too).
Politics in the UK feels more oppressive, depressing and dysfunctional now than at any point in recent memory. And it's probably only going to get worse.
A real barrier to a decentralised web is the difficulty of installing software on a server. I know that sounds really mundane and inconsequential in the broader debate about a decentralised web, but consider the following...
Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app. In a desktop app, it's usually one click to start the install and then, if necessary, you're guided through a few screens to complete the install. Want to uninstall? The OS (operating system) will provide a feature to manage that.
Now consider how complicated installing on a server is in contrast. Upload your files to a folder or directory, enable permissions, set configurations not just for your server but also for the language the program is written in - the list goes on. No wonder SaaS (Software as a Service) is thriving like never before. Who, other than technical folks, could possibly have the time, interest or inclination to set up a self-hosted solution when the barrier is so high? Perhaps some in the tech field would like to keep it that way? Would SaaS be less attractive if installing a self-hosted solution were simple, easy, quick and secure?
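To make that concrete, a fairly typical manual install of a web app looks something like the sketch below (the app name, paths and service names are made up purely for illustration):

    # copy the release to the server and unpack it
    scp exampleapp-1.0.tar.gz user@myhost:/var/www/
    ssh user@myhost
    cd /var/www && tar xzf exampleapp-1.0.tar.gz
    # fix ownership so the web server can write where it needs to
    chown -R www-data:www-data /var/www/exampleapp
    # configure the app, the web server and the language runtime
    cp /var/www/exampleapp/config.sample.php /var/www/exampleapp/config.php
    nano /var/www/exampleapp/config.php           # database credentials, site URL, ...
    nano /etc/nginx/sites-available/exampleapp    # virtual host
    nano /etc/php/7.0/fpm/php.ini                 # upload limits, extensions, ...
    systemctl reload nginx php7.0-fpm

Compare that with clicking "Install" on the desktop.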
Surely an essential part of a decentralised web is that companies, organisations and individuals choose to run their own software using open protocols and data formats. But until the ease, security and simplicity of installation improves for web software, it simply won't happen on a large scale.
Living billboard arrives to announce, drumroll please:
sandstorm.io
The auto-updating server that you can self-host or use as a hosted service, allowing you to install and uninstall lots of applications, both free and proprietary, just by clicking a button.
(No, I'm not paid to say this, unless you count the stickers I got. And no, it's not perfect, but it's still a refreshing change.)
Sandstorm is awesome. Of course, it would help if ISPs weren't so keen to make sure nobody actually uses the internet to make computers talk to each other, instead of just having Amazon's, FB's and Google's computers talk at you.
Has anyone ever actually seen an ISP network that filters local traffic to other ISP endpoints? I'm not even sure why they would do that. DDOS almost always targets big things in the cloud so it would not help there.
Here (SoCal) traceroute to a host in the same local neighborhood is three hops.
Its only purpose is to force businesses up to business class. You might see it enforced if you ran a public, for-profit site from a consumer connection.
In any case the encrypted p2p protocols run by dapps (decentralized apps) are opaque, and these don't really qualify as servers because architecturally they are not client-server at all.
This. I am an ISP owner, and we block the popular service ports by default. 99% of customers don't care; their computers get hacked or infected with malware, that malware starts running services and spamming the world, our IPs get blacklisted (which is a huge headache), and then Cloudflare's broken shit IP checker automatically starts blocking those IPs and customers blame us for it. Occasionally, one or two customers ask to have the email or web server ports opened; we usually ask them to sign up for a business contract, which clearly states that they can run these kinds of services and are fully responsible for any negative consequences.
This is not about money. This is just good housekeeping. When I was in NY, I believe Time Warner and Verizon (or was it AT&T?) also used to do the same thing.
Thanks for responding with the ISP's point of view. My frustration came from just wanting to run a small Ghost blog on an odroid with, at most, hundreds of visitors per month, and hearing that this required a business contract. Of course, whether I want to do this for business or pleasure doesn't really matter if you're dealing with malware and being blacklisted.
At one time ISPs gave you a bit of space on a shared host for free as a courtesy; it would be nice if these days you could get the equivalent of a t2.micro. I know a few people (mostly teenagers I'm related to) who would like to mess around with programming and building web sites, but for whom even a tiny cost is a barrier. As it is, I throw a few bucks at NearlyFreeSpeech for them and it's good for a year, but I imagine plenty of people don't have that option.
I think there is a miscommunication between consumers and internet providers. If consumers were aware of how things are run at an ISP, they would be more understanding and reasonable when facing a problem like not being able to host services themselves. For my part, I try to write about my struggles in running our ISP business <1>.
Another thing to consider is why we can't easily allow our users to run services even if we wanted to: most home users' IPs are not static, and often one IP is shared by many users. IPv4 is running out, so giving a user the ability to run a service also means giving them a unique IP, which they have to pay for; because of the scarcity of unique IPv4 addresses, only business users are allowed to have them, and business accounts cost money. We haven't moved to IPv6 yet; it will require some investment and overhauling parts of our network. Depending on your local law, you may actually have to have a business registered under your name - meaning you have to show business licence papers - to get business connectivity.
I know what you mean about running your own blog or services. I have done it myself, and I learned what I know today mostly by tinkering with and optimising Apache and WordPress on the fly when my blog hit the homepage of Digg. It was such a rush, and I have come a long way since those days, thanks to being able to host my blog on a Time Warner DSL line. This was in the early or mid-2000s. But it was a different time, and things have changed a lot since then.
One of the reasons ISPs back then gave you a free email address and shared hosting was that they wanted to tie you to their services. Specifically, if you used their email for a long time, it would be difficult to leave it behind. But the landscape has changed since then: consumers are smarter, webmail is much better, and free blog hosts are a dime a dozen.
My advice to people who want to learn about hosting services by running them on their own computer: even though that can be a thrilling experience, hosting remotely and configuring/securing a server from scratch can be an even better learning experience. And VPS providers like DigitalOcean can be very affordable.
There is a popular misconception that ISPs are a profitable business that tries to shaft consumers at every opportunity. Maybe for the big players like Comcast that is true, but for the rest of us small and medium-sized ISPs it's far from the truth. The complexity of running reliable services can be staggering, the equipment and licensing can be an astronomical investment (at least for us it's a large amount), and running the last mile and maintaining it with 24/7 support is very expensive. Bandwidth is not expensive, but equipment is. When we saturate our 10gbps port, moving to a 40gbps or even 100gbps network (switches and routers) can be a mind-numbing expense. On top of that, we are taxed and pay high licence fees for all kinds of services. We really don't make a lot of money, and you constantly have to spend on your infrastructure as you grow (fighting to win customers by lowering your prices, and hence your revenue and profit). You have to keep growing to stay relevant and reach the point where you can connect more users at minimal cost without having to invest in expanding your network. Reaching that point can take anywhere from 5 to 10 years, if not more.
Sorry for the rant, I just needed to get it out of my system. :)
I can attest to that: my previous ISP explicitly stated that running a web or other file server was not allowed. I'm not exactly sure whether that TOS was in violation of my consumer rights, but the fact is that they could cut me off if I set up a server... even for personal use (it didn't say anything about running a server business).
> DDOS almost always targets big things in the cloud
That isn't true. Those are the ones you'll hear about in the media, but there are plenty of DDOS attacks on smaller sites, both for extortion and to force them offline.
One point worth highlighting: many home internet services are indeed asymmetrical. My Comcast service is 180/20 Mbps, down/up respectively. Even if you get Comcast's new gigabit service, you get 940 down and 40 up. And many home ISPs prohibit you from serving a website at home, according to their terms of service.
It's pretty common to have outbound traffic to port 25 blocked as an anti-spambot measure. Sometimes you can call them and get them to remove the block, other times it's non-negotiable.
Most cloud servers do this too. Spam has basically destroyed SMTP as an open federated protocol anyway. Network reachability is the least of the issues you'll face trying to run your own mail.
Lesson: any distributed or federated protocol that is not robust against abuse is doomed.
I blacklist 25 out except from our mail server. That's just basic common sense given some of the spyware. Users who need to send mail will use encrypted 465 or 587 anyway.
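Concretely, on a Linux router that boils down to a couple of rules, roughly like the sketch below (the mail server address is a placeholder):

    # let our own mail server out on port 25, drop everyone else's outbound SMTP
    iptables -A FORWARD -p tcp --dport 25 -s 192.0.2.10 -j ACCEPT
    iptables -A FORWARD -p tcp --dport 25 -j DROP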
P2P filesharing is part of the "decentralised web", and people seem to have no trouble installing the software for those.
Now consider how complicated installing on a server is in contrast. Upload your files to a folder or directory, enable permissions, set configurations not just for your server but also for the language the program is written in - the list goes on.
I think a lot of the difficulty is artificial, created by software that is far more complex than it needs to be, to cover far more use cases than most users actually need. In the "enterprise" space, a lot of this complexity probably also drives auxiliary revenue in the form of training, consultancy, etc. In other words, it could be a deliberate barrier to entry. Building big, complex, immensely flexible, yet difficult-to-configure systems with plenty of dependencies just seems to be the norm.
Perhaps some in the tech field would like to keep it that way? Would SaaS be less attractive if installing a self-hosted solution were simple, easy, quick and secure?
Indeed, the whole category of "enterprise software" often fits this business model.
But I, and many others, have written HTTP and FTP servers which do not require any installation at all --- they're just a single (often very tiny compared to most other software) binary, sometimes with an optional configuration file. If you're doing something like hosting static pages, this fits the use-case perfectly well.
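As a rough illustration with an off-the-shelf tiny binary (not one of my own), serving a directory of static pages can be as little as:

    # serve ./public on port 8080, staying in the foreground; busybox httpd is one tiny option
    busybox httpd -f -p 8080 -h ./public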
Finally, a huge part of making the web truly decentralised is to abandon the notion of dedicated servers/clients altogether --- and thus also the notion that you must need a dedicated or "server" computer to host anything, or for that matter a dedicated Internet connection. Of course some machines will have more resources to serve, and a typical residential connection may be more limited, but the key idea as exemplified by P2P is that any machine can serve.
>P2P filesharing is part of the "decentralised web", and people seem to have no trouble installing the software for those.
We can't really call P2P filesharing today decentralized. It depends on trackers. And it's trackers who help users (often out of commercial interest) overcome all the troubles with software.
Some torrents depend on trackers, yes. Others use magnet links, which aren't dependent on trackers at all. The data typically served by a tracker is instead stored in the DHT:
Every peer-to-peer system requires bootstrapping; often this is achieved with some central or federated nodes/trackers. Once you have peers, you can forget about the trackers and use the DHT.
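For what it's worth, a trackerless magnet link is basically just the content's info-hash plus an optional display name, something like this (the hash here is made up):

    magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567&dn=example-files

Peers for that hash are then looked up in the DHT rather than asked of a tracker.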
Have you looked into Sandstorm? Their whole goal is to make installing server apps as easy as on mobile platforms, while securing them from the outside and from each other, and allowing users to switch providers and bring their apps and data along.
Here's something from the first part of the install which is not something a normal person would ever do:
Configure the EC2 security groups (firewall) to let you reach Sandstorm
By default, Amazon EC2 protects your virtual machine from being reached over the Internet.
In the above tutorial, we allow the virtual machine to be reached on port 80. By default, Sandstorm uses port 6080, so look through the above tutorial and add another security groups rule allowing port 6080.
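For reference, something like the following AWS CLI call adds that rule (the security group ID is a placeholder; double-check the port against the tutorial):

    aws ec2 authorize-security-group-ingress \
        --group-id sg-0123456789abcdef0 \
        --protocol tcp --port 6080 --cidr 0.0.0.0/0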
I've just installed Sandstorm to test it: created a droplet on DigitalOcean, logged in ("ssh root@...", and they already had my SSH key), ran the curl|bash, and it worked. It was way beyond awesome for an open-source Linux app. So the EC2 problem exists only because someone advised using Amazon for a hobby installation.
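For reference, the one-liner from the Sandstorm docs is, if I remember correctly:

    curl https://install.sandstorm.io | bash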
I love the concept, and I'm looking forward to the day a Facebook alternative could be built on it. And if it had existed when Google Reader ditched the internet, everybody would be on Sandstorm today. I've just experienced the feeling of browsing my RSS feeds of porn without feeling watched and without being afraid of hitting a "share to Facebook" button: it's a great experience.
It's more like suggesting that they should drive their own cars, pump their own gas, or change their own tires in an emergency. Nobody is suggesting that people should be able to program a webserver.
> A real barrier to a decentralised web is the difficulty of installing software on a server.
Also, economies of scale.
If people used a decentralized service called "UnFaceBook", the total cost of servers, administration, etc. would dwarf the cost of Facebook running their data centres. From a business perspective, it's just not feasible.
Hmm... perhaps everyone running their own systems is, in fact, doable. Most people have smart phones which are much more powerful than servers from 6 years ago. Why not just use that?
Have the content at the edge, and controlled at the edge. Scalability can come from lots of caching at the core.
Not a startup I'd want to do, but it's technically feasible.
> Most people have smart phones which are much more powerful than servers from 6 years ago.
This isn't even remotely true.
The most powerful server CPU from 6 years ago is the Xeon X7560, which has a 40% higher PassMark score than even the best enthusiast consumer-level CPU on the market today (an i7-6700K), never mind the most expensive smartphone on the market.
I think that qualifies as "remotely true" actually. They didn't say "more powerful than the most powerful servers" they said "more powerful than servers".
People on the internet have an odd tendency to interpret statements in the broadest possible way.
I did misread that. I think the point still stands, though. The cheapest server CPUs from 2010 are 60% as good as the best 2016 CPUs. That might be close to the latest chip in the iPhone 7 or Google's Pixel, but most people don't have the latest and greatest. A normal phone won't be close to the latest and greatest i7, and I don't think it will be close to a 2010-era server either.
Economics and scale can be a strange beast. The sum of Amazon EC2 + Google Cloud + Digital Ocean + Rackspace + ... is about 10,000,000 servers, which works out to roughly 1 server per 700 inhabitants on Earth (10 million servers for around 7 billion people). Have you ever looked at it this way?
And that's only the public cloud, not including Facebook, Google's internal servers, Apple's infrastructure, ISPs, and the servers hosted by all other companies. So to provide all IT services to the citizens of modern economies, we're certainly close to 1 server per 100 inhabitants. Sometimes I wonder what we're doing with so many servers on Earth: I don't spend 24 hours a day sending requests to public servers, and even if I did, the server I'd be pinging could handle a few thousand users at the same time. So where does all this processing power go?
And there's even more computing available if you include everyone's home and work PC, phone and router, but those are not always-on.
> Hmmm... perhaps everyone running their own system is, in fact, doable.
Crunching the numbers, we're already above one system per person ;) So we might as well go full-decentralized, if we could conceive a theoretical model around it.
Security, redundancy and isolation. Often you have an extra server not because you need the processing power, but to separate things for security reasons, to provide failover, and to avoid noisy neighbors.
Certain things, like filtering out spam or handling video, require rather a lot of computation per user per day. The same probably applies to just transferring data quickly enough, with a lot of spare capacity to handle spikes.
I like the idea of smartphones as servers feeding content to a CDN... Combined with IPFS [1] that should work well.
I was going to make a different comment though. You suggest that, because it would have higher infrastructure costs, a decentralized network is "not feasible" from a business perspective.
I'm wondering a) whether infrastructure costs are currently a limiting factor in the growth of social networks, and b) if a decentralized social network needs to be a business at all.
But more to point a: what if it cost 10x more, but the current cost were $0.10 per user per year? Do you think a service costing $1 per user per year would be too expensive to operate?
I was just thinking about exactly this. If the phone is offline, we need CDNs. But then again, phones have to watch out for their precious battery life.
With a decentralized system, you also have a huge protocol issue. Facebook can upgrade millions of people to a new version instantly; a decentralized upgrade could be a major pain.
> A real barrier to a decentralised web is the difficulty of installing software on a server … Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app.
If you have a desktop, you can install any server software you want. And if you have a desktop, you can leave it running as long as you want. You don't need to purchase a host somewhere; you can just use the computer you own.
Heck, you own a computer which is on 24/7 already: your cell phone. And software written well could run on that phone to serve whatever you want, at a minimal cost in CPU & hence battery.
Software written well can work that way on Android, but not on iOS, which force-suspends your process and sockets all the time. The only exceptions are granted unilaterally by Apple, and the chance of them allowing server-like behavior on iPhones is infinitesimally small.
That's why it's important to hold on to people owning general-purpose computers, open standards and community governance: because once you move away to centrally-controlled appliances, there won't be a platform from which to bootstrap the next, better system.
> A real barrier to a decentralised web is the difficulty of installing software on a server.
LOL.
Decentralised would mean the user installs software on his/her desktop/phone and it works the way you think a server works. That's all there is to it.
If you make "installing on a server" - somebody else computer easier, its not gonna be detentralised. Youll have shareing of peoples "apps" on a single server, and eventually Cloud is going to be invented. Look then how easy it is to run apps and install shit on other peoples computers.
Running "a server" is not any more difficult than running any other app, on Android or on Linux, the difference is pacman -S kwrite or pacman -S lighttpd, or picking "primitive ftpd" app and running it.
I think the real problem is that people think servers are some kind of magical computers, different from any other general purpose computers people are already using as clients.
> Running "a server" is not any more difficult than running any other app, on Android or on Linux, the difference is pacman -S kwrite or pacman -S lighttpd, or picking "primitive ftpd" app and running it.
Which directory does the ftpd app store its data in, and on which partition? How do I get alerted if I'm running low on space, and what do I do if I am? Is the data backed up, and what is my disaster recovery process? What port is it running on and how do I connect to it? BTW, insta-fail, because we should be using secure FTP. Is it using encrypted communications, and how do I install a certificate and share public keys? What about configuring access through the firewall? Etc, etc.
It is as easy as installing a desktop app. Sometimes even easier (on the desktop you can't just type "apt-get install Word<Enter>" and have it just be there).
The problem is that you don't just want a CMS: you want it to look a particular way, you want to do special things with it, you don't want other people to access it without your permission, and so on.
> Imagine if installing a server-side chat app, message board, project management app, or CMS were as easy as installing a desktop app. In a desktop app, it's usually one click to start the install and then, if necessary, you're guided through a few screens to complete the install. Want to uninstall? The OS (operating system) will provide a feature to manage that.
It's this way already. Virtually every shared hosting provider (HostGator, Bluehost, etc.) provides a cPanel admin panel with the Softaculous software installer. All kinds of apps (blogs, CMSs, project management apps, etc.) are a single-click install and removal.
Or an organisation people could freely join without being forced to learn system administration concepts and practices. Imagine WhatsApp, but owned and sustained by (a subset of) its users on a non-commercial basis.
I dunno, there are an awful lot of server apps that I can install by simply running apt-get install. Achieving nirvana might be a simple matter of packaging.
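For instance (package names vary by distribution, but something along these lines from the standard Debian/Ubuntu repos, if I recall correctly):

    # e.g. a web server and a wiki, pulled in with their dependencies in one command
    sudo apt-get install nginx dokuwiki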
Microsoft has a long history of providing accessibility options in their products (long before Apple or Google started taking accessibility seriously).
Microsoft have never been very good with visual and interaction design in their desktop operating system. However, Windows Phone is the exception. It's visually attractive and has nice, well thought-out interactions. I think it's superior to Android and iOS in many ways. It's obvious that both Apple and Google took some cues from Windows Phone in the updates to their own mobile operating systems. (Undoubtedly, all these companies look at each other's products when designing new features.)
The UK's NHS website is an excellent, reputable source of health information. It may not be the most attractive looking site, but it has a goldmine of info. There's information on ailments and conditions, treatments and general health advice.
Importantly, the information is written and vetted by qualified medical professionals.
No real surprise. E-ink readers are still only available with black and white displays. Their screens are mostly shaped to fit a small paperback.
Physical books, in contrast, come in an infinite variety of shapes and sizes. Yes, tablets come in larger sizes and with high resolution colour screens, but they still aren't as comfortable to read as paper or even e-ink displays.
E-ink readers are a luxury, unfortunately. Where phones and tablets can be justified in most homes as being multipurpose, ereaders are fundamentally single-purpose. This is why their share of the market grows so slowly - it's limited to hardcore readers and the wealthy. We need e-ink to go beyond dedicated devices.
I guess so, but when the basic Kindle costs about the same as a full tank of gas, or just a little more than a newly released AAA video game, it's hard to say that it is really restricted to the wealthy.
The biggest appeal of the Kindle is that you can read it outside in the glare of the sun, and the battery lasts for months. I would agree that, for most people, the smartphone they already have in their pocket is good enough for reading ebooks.
> it's hard to say that it is really restricted to the wealthy.
Good point - eReaders aren't really all that expensive (I got my first one for 60€). What might be more important when considering the distribution of eReaders is the correlation wealthy <-> educated <-> reads a lot.
> for most people, the smartphone they have in their pocket already is good enough for reading ebooks
I can't imagine reading a full book on the tiny screen of a smartphone. Do people really do that? (As in, for serious reading.) And if so, how much?
I can't imagine reading anything technical or even most non-fiction on a phone, but for plain old entertainment-grade fiction it's doable. I read most of the Aubrey-Maturin series and The Expanse series on my phone last year.
When discretionary spending is limited, multipurpose devices are easier to justify. This is why the number of readers on tablets and phones is growing at much faster pace, despite a poorer reading experience: people buy those devices for multiple purposes, including reading books.
Amazon should try giving away paperwhites for free.
"Amazon should try giving away paperwhites for free."
It would make sense in a lot of cases. Personally, I buy many times more books than I did before I got a Kindle but, I guess, they have calculated that this won't be the case for the majority?
The problem is that it's hard for them to compel people to keep buying content - unlike the way mobile operators give phones away in exchange for a binding, fixed-term contract?
"...the US isn't particularly unique, just more public about it."
I agree. I strongly dislike the US proposals discussed here, but the fact is travelling abroad means you've already given up much of your privacy.
Those of us in Europe and the US have a somewhat contradictory attitude toward privacy - opposing certain rules or regulations while happily ignoring others. Take the UK, where the British government has intrusive surveillance laws, and a self-serving and apathetic media discourages discussion of or opposition to such laws. On the other hand, there was strong opposition to mandatory ID cards when the idea was proposed a few years ago - though that possibly had more to do with the cost of the scheme and the lack of trust in the government's competence to carry out such a proposal.
So, yes we do have a somewhat inconsistent attitude to privacy. Even in Continental Europe (if the country is part of the Schengen area), biometric passports are common. Applying for a passport often means providing your fingerprints - something a lot of people might be uncomfortable doing.