Centralized Wins. Decentralized Loses (highscalability.com)
275 points by ghosthamlet on Aug 25, 2018 | 150 comments



The author claims winners and losers but doesn't talk about the criteria for winning and losing. Decentralized networks aren't after the same things as centralized ones. They're not after engagement metrics, ARPU, or stock valuation.

If the criterion is number of engaged human users, then for sure the decentralized networks are far behind. But these things can change quickly after changing slowly.

If decentralized networks cause big centralized ones to change their behavior in meaningful ways, for fear of losing users to the decentralized ones, then I would argue that's a win for decentralized networks. Small win, but still a win.


> If the criterion is number of engaged human users, then for sure the decentralized networks are far behind.

7B humans are organized in decentralized social networks (linked by the small-world phenomenon). 99% active daily.


Actually, they're on Facebook.


At best, 2B of them are.


Being on Facebook is not the same thing as being active on it.


There's only 3 billion internet users.


>There's only 3 billion internet users.

I am actually assuming you meant "Social Media Users" and not Internet users. There are now more than 4B internet users.


I was wondering when this was going to happen. A repeat of when people thought www = Internet.


I'm sorry you're about to have a big death event. I just saw an article posted on here about it. I feel bad for the fishies.


>If the criterion is number of engaged human users, then for sure the decentralized networks are far behind. But these things can change quickly after changing slowly.

Well, if and when they change, we can write another article about how "decentralized is winning".

For now, they don't change, not even slowly. If anything, they change in the opposite direction (centralized networks getting more users).


they are gauging the consumer experience

which is, of course, a mistake

just like you, I don't know what the criteria for this comparison actually would be


> They're not after engagement metrics, ARPU, or stock valuation.

You’re also not defining a winning criterion. What are they after? And who are the big winners?


> What are they after?

Safety, freedom, control, privacy.

> And who are the big winners?

Nobody cares who the "big winners" are. That's the point.


Well, I care.

I'd rather we had decentralized networks for social media, blogging, sharing, content distribution like video, etc.

But we don't.

I can start one of these up (or join one) where it's just me and a handful of other like-minded people in favor of decentralization, but it would defeat the purpose.

I don't merely want a decentralized service, I want a STRONG decentralized service, with tons of nodes and tons of millions of users.

Without that, the "safety, freedom, control, privacy" in today's decentralized services are irrelevant. Being able to do nothing much and having it be safe, totally controllable, and private, is nothing to write home about.


I think we're getting there. I'm hoping one of the federation protocols becomes widely accepted & implemented, so that users from any social platform are able to communicate and cross-post to any other social platform.

https://en.wikipedia.org/wiki/Distributed_social_network


This reminds me way too much of the IM situation in the late 90s/early 2000s. We somehow forgot about that problem because everybody having Facebook kinda solved it for a while.

It feels we are back to square one, but now with massive privacy issues added to the whole problem.


you don't know millions of people. you honestly can't. today's decentralized services are pretty decent, i'm in the process of transitioning my family (and extended family) to a handful of services that i host. we've already successfully replaced facebook messenger with matrix and i've been asked about other things that i could host for the family.

that's my core social network right there and i can host them on activitypub services that all talk to each other. they can even connect to other groups online if they like.

you're looking for this giant thing that doesn't even matter. your social network isn't that large.


>I don't merely want a decentralized service, I want a STRONG decentralized service, with tons of nodes and tons of millions of users.

This is where I have hope for matrix.org


If you define winning as safety, freedom, control, privacy, then I'd define the "big winners" as those who have made an impact by those metrics.

If nobody has been impactful in those areas, there's a problem. That's the point.


Citation needed for like everything in your first and last paragraph.


What nonsense. The claim is that the article does not define certain criteria and then goes on to list a bunch of stuff that is reasonably obvious.

Citations are required at the end of scientific papers; they are not required for stating uncontroversial or obvious things in comments on a forum.


Watching the Internet develop since the 1980s, this leaves a lot of stuff out.

Yes, decentralized protocols like NNTP are not around like they used to be. The reason is that Usenet was effectively crushed by the US government. Many net luminaries, scientists, mathematicians, journalists, artists etc. were users of Usenet. In the 1990s and on, the Baby Bell monopolies used their monopoly powers to crush ISP competition. Then in 2008 Andrew Cuomo strong-armed the remaining monopoly telcos in New York to shut down Usenet, saying "This literally threatens our children, and there can be no higher priority than keeping our children safe".

Obviously you can't say you want to crush decentralization in order to take power away from the individual and place it in the hands of corporate monopolies and the government, so you have to hear from the RIAA/MPAA about how artists are being ripped off (although many artists say the RIAA and MPAA are the ripoffs), or about terrorism, or the "think of the children" stuff from this case.

> "it's controlable, it's governable, ..., it's walled gardenable,..., it's brandable, it's ownable, ..., it's auditable, it's copyright checkable, ..., it's safe for China searchable,..., it's monitorable"

Well this is correct. Now we can be more controlled, more governed, more owned, more copyright checked, more monitored - these are all the benefits for them as they remove our decentralized communications.

Since the tendency toward recentralization seems to be centered in the US, a broader metric is the ratio of US GDP to world GDP, i.e. centralization of the world economy. This ratio has been falling since World War II - each year the US economy becomes less and less significant. So while network control is being centralized in the US, economic control in the world is becoming decentralized.


The point about Cuomo is definitely worth remembering, but USENET suffered its own decline. The vast majority of folks who got online from the mid-nineties on never learned how to use it or engaged with it in any fashion. Once the center of gravity shifted to web-based forums, and then social media, the older users mostly abandoned it.

The lack of comprehensive solutions for spam in the mid-90s should also not be discounted. Spam is an easier problem for centralized systems to fix.


I'd argue that it's now even easier to combat spam using decentralization technologies. If you gate all actions prone to spam with a signed transaction to a smart contract platform, where that execution costs the spammer real value for each action, the spammer no longer has an unequal advantage. This means as an app developer I can mostly ignore combatting spam; that's now an infrastructure problem that the transaction miners need to deal with and handle. It's an empowering separation of concerns! Web 3.0 is really shaping up to be a revolutionary development in humanity's technological progress and modeling of the world around us.
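To make the idea concrete, here is a minimal sketch (my own illustration, not the commenter's actual design or any real platform's API) of what the application-side check could look like, assuming some settlement layer hands back signed fee receipts; Receipt, MinFee and AcceptAction are invented names:

  // Hypothetical sketch: accept an action only if it carries a receipt,
  // signed by an assumed settlement layer, proving a small fee was paid.
  package main

  import (
      "crypto/ed25519"
      "encoding/binary"
      "fmt"
  )

  // Receipt is an invented proof that a fee was paid for a given action.
  type Receipt struct {
      ActionHash []byte // hash of the gated action (post, message, ...)
      FeePaid    uint64 // smallest currency unit paid by the sender
      Signature  []byte // settlement layer's signature over hash+fee
  }

  const MinFee = 10 // arbitrary spam-deterring threshold

  func receiptPayload(r Receipt) []byte {
      buf := make([]byte, 8)
      binary.BigEndian.PutUint64(buf, r.FeePaid)
      return append(append([]byte{}, r.ActionHash...), buf...)
  }

  // AcceptAction is all the app checks instead of running its own spam filter.
  func AcceptAction(settlementKey ed25519.PublicKey, r Receipt) bool {
      if r.FeePaid < MinFee {
          return false // too cheap: spamming at scale stays expensive
      }
      return ed25519.Verify(settlementKey, receiptPayload(r), r.Signature)
  }

  func main() {
      pub, priv, _ := ed25519.GenerateKey(nil)
      r := Receipt{ActionHash: []byte("hash-of-post"), FeePaid: 25}
      r.Signature = ed25519.Sign(priv, receiptPayload(r))
      fmt.Println("accepted:", AcceptAction(pub, r)) // accepted: true
  }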


It will take a lot more time, but I think that decentralization will win in the end.

Society could not operate without social contracts and those social contracts are founded on certain universally agreed upon principles of fairness. As the fairness of our economic system comes into question, more people feel justified in exploiting loopholes within social contracts; this causes social contracts to lose their integrity in the minds of people; which further exacerbates the problem/behavior. Eventually, there comes a point when most people are cheating the system; at that point, the system can no longer function and it collapses.

Centralization creates an unfair economic environment because it limits the spread of economic opportunities. Most people already see centralization as a problem; as a source of injustice and inequality.

Just as the process of acquiring private property (as a result of wealth accumulation) becomes increasingly unfair, social contracts founded on principles of ownership of private property are themselves becoming weaker.


there are situations where decentralization must win. You cannot, for example, have a centralized network between the Earth and Mars.


I'd argue economics is a system that requires decentralization as well. Now, with cryptocurrencies, the burden required to get a seat at the table went from basically an impossible game of politics and expression of violence/dominance to a purely technical one. If I want to become a member of bitcoin I just need to start computing and now I'm a member with voting power.

At an individual level this doesn't mean much, but at the geopolitical level this is a huge improvement. The future I see before us is our governments dropping the administrative burden of managing, minting, and distributing their own currencies and becoming contributing members of the crypto computing platforms. If I were an up-and-coming developing nation I'd be pouring money into computer engineering to become a member of these global currency movements.


The article is extremely light on substance. This issue can be more readily addressed with a simple list:

Centralization pros:

  * security
  * ease of management
  * lower overhead in task delegation
  * maintenance simplicity
  * versioning
  * no distribution
Centralization cons:

  * fixed resources
  * fixed configuration
  * fixed environment
Looking at these lists, centralization at first appears superior. But the lists are an illusion: they account for concerns in the present, but they do not account for growth or future concerns. Decentralized systems can scale in ways centralized systems cannot.

The fixed nature of a centralized system is also a huge limitation. Projects like SETI@Home and Folding@Home illustrate that thousands of lower powered distributed nodes are better for computation than a single centrally managed super computer. While the computation power of super computers grows rapidly over time so too does the computation power of your average home computer.
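To illustrate why those workloads spread so well, here is a toy coordinator in Go (my own sketch, not how SETI@Home or Folding@Home are actually implemented; the endpoints and payloads are invented): nodes only ever pull an independent work unit and push a result, so they never need to talk to each other.

  // Toy volunteer-computing coordinator: hand out independent work units
  // over HTTP and collect results. No shared state between worker nodes.
  package main

  import (
      "fmt"
      "net/http"
      "strconv"
      "sync"
      "sync/atomic"
  )

  var (
      next    int64 // next work-unit id to hand out
      mu      sync.Mutex
      results = map[int64]string{} // work-unit id -> reported result
  )

  func main() {
      // A volunteer node claims a unit it can crunch offline on its own.
      http.HandleFunc("/work", func(w http.ResponseWriter, r *http.Request) {
          fmt.Fprintf(w, "%d", atomic.AddInt64(&next, 1))
      })
      // ...and later reports back with POST /result?id=N&value=V.
      http.HandleFunc("/result", func(w http.ResponseWriter, r *http.Request) {
          id, _ := strconv.ParseInt(r.URL.Query().Get("id"), 10, 64)
          mu.Lock()
          results[id] = r.URL.Query().Get("value")
          mu.Unlock()
          w.WriteHeader(http.StatusNoContent)
      })
      http.ListenAndServe(":8080", nil)
  }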

There is also an economic qualifier for distribution too. Computational resources cost money in both hardware and electricity. When computational tasks are distributed so too are the expenses of computation.

There is also a security/availability benefit to distributed systems. Centralized systems are a single point of failure. Distributed systems, at least in the public space, are diverse in their hardware and software, which gives the system immunity to many forms of attack. A centralized system can be isolated from the world by a DDoS attack, but it is substantially harder to attack a distributed system in this way.


Bigger problem with centralized services is stifling monoculture. There will never be 10 different frontends to Facebook the way there can be 10 different email clients. Or 10 different newsfeed sorting algorithms.

It’s the same problem that Amazon has - one size does not fit all in UX, and one decision maker, even if very smart and well-meaning, will sooner or later get himself into a corner and be stuck there.


I suppose I did not list this because it is a social factor rather than a technical factor and I simply wasn't thinking in those terms.

This is a major point though. I spent time at Travelocity, Orbitz, and Expedia. All three companies were (not any more) direct competitors based in far away geographies. Regardless of that I was utterly astonished by how similar the work cultures and technology were. Everything was being run from archaic centralized giant monolithic Java applications using nearly identical technology stacks and it was always an incredibly slow and painful experience to update, extend, or maintain the beast. The business solution was always the same: hire more developers until market conditions present financial barriers.


But one could say the same for a monoculture of protocols in decentralized systems: the email protocol has been stable for a long time and has some serious issues due to its decentralized nature.

For email it is surely worth it, but for many other applications it might be different.


If you attach a signed monetary transaction cost, using a decentralized smart contract transaction system, to sending email to a federated host, I hypothesize that spam would start dwindling. Email, IMO, largely grew into a centralized system due to the difficulties in combating spam: on the receiving end, filtering and blacklisting are a major system administration task, and self-hosting requires high-quality SRE skills to keep servers from being compromised and used to flood the federated network with spam.

Make the act of sending a message cost the sender money and I think spam goes away and self hosting becomes much simpler.


> Make the act of sending a message cost the sender money and I think spam goes away and self hosting becomes much simpler.

...and the service itself becomes less useful to everybody who doesn't want to self-host.

If you choose to off-road, you can't really complain about it being bumpy.


>monoculture of protocols in decentralized systems

Can you compare SMTP, NNTP, ActivityPub, XMPP, ATOM, Scuttlebutt (SSB), to make your point clearer?


That’s true - federated systems, unlike monoliths, suck at protocol and thus capability evolution.

But think about what it means for a minute! If one figured out protocol evolution or migration, that might be all it takes to give federated systems prominence.


> Centralization pros:

> * security

No, that is bullshit. It's the same kind of security that a "strong leader"/dictator provides. Really, it's the same kind of power structure. It provides security against certain kinds of adversaries. But at the same time it eliminates all security against a malevolent "strong leader". That looks like improved security as long as you have a benevolent and competent person/entity in control of the centralized structure, but the huge risk is built into the structure, and that is very much a security problem.

Facebook concentrating control over social interaction is a security problem. It's not a security problem for Facebook, but it is for society, and that is what counts.


You missed one of the most important points about centralization: performance and legal aspects/governance. Could you sue IPFS if the protocol removed your files? You can against companies like Google.

Not saying that centralization is always the best option, just adding more food for thought.


Obviously, you know one cannot sue a protocol or TheFappening torrent would not be available. That seems to be a strength: censorship resistance. But, it does have its downsides.


Are you saying TheFappening was a good thing? If the best example of torrents being used to avoid censorship is the massive leak of salacious, private, non-newsworthy photos, then maybe your argument isn't all that great.


> The fixed nature of a centralized system is also a huge limitation. Projects like SETI@Home and Folding@Home illustrate that thousands of lower powered distributed nodes are better for computation than a single centrally managed super computer. While the computation power of super computers grows rapidly over time so too does the computation power of your average home computer.

There are very few problems that are embarrassingly parallel like this, which is why the examples you gave are almost 20 years old. Most of the time you're bottlenecked on sharing state.

> There is also an economic qualifier for distribution too. Computational resources cost money in both hardware and electricity. When computational tasks are distributed so too are the expenses of computation.

Compute is already pretty cheap and getting cheaper.

> There is also a security/availability benefit to distributed systems. Centralized systems are a single point of failure. Distributed systems, at least in the public space, are diverse in their hardware and software, which gives the system immunity to many forms of attack. A centralized system can be isolated from the world by a DDoS attack, but it is substantially harder to attack a distributed system in this way.

I would say the diversity of hardware and software creates more surface area for attacks, not less.

I would also argue that distributed systems are way harder to design for privacy and security. For example, it's basically impossible to revoke access to something after it's granted in a distributed system unless you want users to run their own nodes 24/7 (which introduces availability problems).


I'd argue that the biggest cons for centralization are:

* lack of ownership

* lack of privacy

* lack of resilience[1]

* lack of self-determination

In my opinion, these are the factors that will make decentralization win out in some niches for the average user.

[1] You might laugh at this given the 99.99etc.% uptime most cloud services advertise—and yet I still can't use my Google Docs when I'm passing through a tunnel.


> Projects like SETI@Home and Folding@Home illustrate that thousands of lower powered distributed nodes are better for computation than a single centrally managed super computer.

Only for parallelizable problems. It won't work if each node needs to communicate much with the rest of the nodes.


That depends on the application in question. I don't see this as an issue behind BitTorrent. The biggest limitation in this regard is versioning. A change of API or format major version could break backwards/forwards compatibility and disrupt the decentralized system, since you don't control the version present at each node.


> I don't see this as an issue behind BitTorrent

The main bottleneck in data processing is often the interconnect, not the compute.


That depends on the amount of data to process and the computational effort involved. Provided a sufficiently challenging task and a fixed amount of computational resources, you can easily predict the time needed to reach the goal. If the duration is exceedingly large or expensive, it might make more sense to distribute the workload. Even though the interconnect adds delay, that delay will not likely erase the benefits provided by distribution if the decision criteria are costs and expenses over time. These are all things that can be graphed and mapped against financial estimates.
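A back-of-the-envelope model of that decision might look like the sketch below (all numbers invented for illustration): wall-clock time falls almost linearly with node count, while total cost rises only with the per-node overhead term, and you pick whichever side of that curve your budget and deadline favor.

  // Illustrative cost/time estimate, not a real capacity-planning tool.
  package main

  import "fmt"

  // estimate returns wall-clock hours and dollar cost for a job of `work`
  // node-hours spread over n nodes, with a fixed per-node coordination
  // overhead in hours and a price per node-hour.
  func estimate(work float64, n int, overheadHrs, dollarsPerNodeHr float64) (hours, cost float64) {
      hours = work/float64(n) + overheadHrs
      cost = hours * float64(n) * dollarsPerNodeHr
      return
  }

  func main() {
      for _, n := range []int{1, 10, 100} {
          h, c := estimate(10000, n, 2, 0.05) // 10k node-hours of work
          fmt.Printf("n=%3d  %8.1f hours  $%7.2f\n", n, h, c)
      }
  }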


Centralization cons: fixed resources, fixed configuration, fixed environment

I'm not willing to assign any of those cons to centralized services in the Cloud Era. All of those are just one click away in AWS or any other such service.


This is far easier than swapping hardware, rebuilding the framework, or moving to a different programming language. The ease of adjustment is certainly present with the AWS dashboard, but the fixed qualities still apply just the same. If you want a change of resources or environment YOU must make that change with the AWS tools. In a decentralized system the diversity and flexibility are inherently present.


I wish it weren’t true but for right now I agree with the author, including what it might take to tip the balance: “1. Complete deterioration of trust such that avoiding the centralization of power becomes a necessity. 4. The decentralization community manages to create clearly superior applications as convenient and reliable as centralized providers.“

I was eager to use GNU Social, and after some use for a year, my host shut down. I just opened another Mastodon account but haven’t started using it yet. Also, the value of centralized services like Twitter is the really interesting people I follow. Social media is best when used as an advertising and directory service for content we put on our own blogs and web sites, but even that seems to be less common.

I really enjoyed the Decentralized Web Conference June 2016, but it also made me slightly sad that fighting back against centralized web, political, corporate, etc. centralization is an uphill fight.

Sometimes network power laws suck.


If a complete deterioration of trust is a prerequisite for decentralisation winning, then I think its time to shine may come soon enough. That's what's already behind all the social problems we're facing right now, like increasing polarisation and distrust of the media and the rise in fake news/resurgence of pseudoscience. Add to this people's current disdain for Facebook and Twitter, and the ingredients may be there for decentralised services in that sense.

Whether the second part (the clearly superior applications as convenient and reliable as centralised providers) comes true is probably the main deciding factor about now.


> the value of centralized services like Twitter is the really interesting people I follow

Same as that shitty club with overpriced beers... We go where the interesting people are and the bad ones are kept under control.


So anywhere but Twitter then.



What about centralized but libre servers and clients, and run by the EFF and FSF? If they one day start behaving badly, we all migrate away to a new benevolent dictator. Or maybe EFF and FSF somehow legally bind themselves to not be able to behave badly, if that's possible. Maybe the domain name is owned by an impartial committee of some kind.


> What about centralized but libre servers and clients, and run by the EFF and FSF

I can't say much about the project right now. But something will land middle to late next year that has this in mind.


Interesting. But you can't say or won't say? Won't, fine by me. Can't, I'm already skeptical of anyone involved.


I can't because I'm really passionate about this and I'll end up going on a rant. :) But let me give you a glimpse of the scope involved.

For a system to be completely free from censorship, it needs to have a moat around it:

- Own Domain registrar

- Independent Colo provider / Own infrastructure

- Own Payment processor

- Advertising networks / Own Ad Network for individuals

- Content creators, that are free from brigading.

- Company put into a trust, with a board of influential influencers abiding by an internet bill of rights.

Currently no independent platform exists with the above and allows anyone irrespective of their voice or opinion to be heard and be supported by donation, merchandising sales or advertising revenue.

You could say there are green shoots, in the form of d-tube, bitchute, steemit, but they are still very early in the process and do have a high surface area of attack.

There's a lot wrong with the current system and I aim to fix that. >)

Oh and I'm not at the beginning of the journey either. 3/4 of the 6 items I know how to implement. It's really all about making sure that it definitely is doable and generates enough revenue in year 1 to ensure it survives to years 3-5.


>For a system to be completely free from censorship, it needs to have a [moat (aka Own Ad Network)] around it:

It looks like "free from censorship" and having an "ad network" are contradictory properties.

If one invents an uncensored website e.g. "www.uncensorablecontent.com" -- that allows any content (including uncensorable child porn and uncensorable beheadings), advertisers like Pepsi and Toyota will avoid it. The site's freedom of speech ideals become radioactive to advertiser funding.

Unless you're deliberately foregoing mainstream advertisers by intentional design. That's fine but that also means you're limited to niche advertisers which also means your content creators will have a smaller revenue pie to receive ad payments from.

Those economics create a feedback loop because the content creators making mainstream videos (e.g. harmless pop music videos or gardening tutorials) don't need the freedom properties of UncensorableContent.com and would just use a centralized site like Youtube because the mainstream advertisers are there and that's where the money is. Therefore, through the self-selection of certain content creators being restricted to outlets like UncensorableContent.com, it evolves to become a haven for unsavory content even if the architectural intention was to be a place for everybody's "good" content.


I want the platform to be like cable television: 10,000s of channels. If someone doesn't like a particular channel because they don't believe in the content, simply block it and move on.

If someone comes across a user in the comments whose views they don't agree with, simply block them too and move on.

What I would like to do, is put the agency back into the hands of the user. I don't want them to be coddled. I believe everyone is better by listening to viewpoints from those even if we don't agree with it. There's always something to be learned.

When it comes to advertisers, well, they are advertising on the channel. They are not necessarily supporting the platform. My wish would be that, once launched, going forward users understand that when Toyota sponsors a channel, they are sponsoring that user and advertising to that audience. They aren't sponsoring another user elsewhere who is banging on the table for Trump to build that wall 10ft higher.

That said. In the beginning, I will be forgoing mainstream advertisers. It is my belief that they will eventually see the light. That the user has full agency and they can craft the platform to cater to their needs and what they want to watch.

Finally, terrorism, child porn, revenge porn, nudity and all that nonsense will not be welcome on the platform.

Look, the central tenet is for everyone to be welcome. Craft an audience around whatever they feel passionate about and be able to make a living, with no fear of being de-platformed because there is a minority that wishes to subdue another minority. I believe such censorship is a slippery slope. That's all.


>Finally, terrorism, child porn, revenge porn, nudity and all that nonsense will not be welcome on the platform.

I support them not being welcome.

How do you plan on keeping them out if this is open source and supposed to be censorship free?


Let me preface by first saying that when this gets off the ground, there is going to be a huge learning curve. I am humble enough to say I don't know what I don't know.

So, I will have to meet the challenges and handle them as best as I can.

That said, from what I understand, there are companies which have developed ML tools, with government assistance from the UK, that can detect whether a video is terrorism-related.

So if you run with that thread, perhaps similar tools will be available to the other content.

I'm sure as time goes on, other complex systems will evolve to meet said challenges.


Can you say more about the stack you are building? Are we talking open source? What language are you writing this in?


Sure.

It's completely open source.

1) VueJS/Bootstrap for the UI.

2) GoLang API for the UI and the backend MQ/Job workers.

3) VideoJS for the player.

4) Not sure about the DB Yet, either Postgres or Memsql.

5) SideKiq for the Job Queue.

6) Nats.io for the MQ.

7) Centrifugal/centrifuge for the real-time messaging layer.

8) Probably React Native/Vue Native for the mobile application. Whichever one works out best on mobile.
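For what it's worth, the Go-plus-NATS part of a stack like that (items 2 and 6) can be surprisingly small. A rough sketch, not this project's actual code - the route and subject name are made up - using the nats.go client:

  // Minimal API handler that enqueues a job over NATS instead of doing
  // the heavy work in the request path.
  package main

  import (
      "log"
      "net/http"

      "github.com/nats-io/nats.go"
  )

  func main() {
      nc, err := nats.Connect(nats.DefaultURL) // nats://127.0.0.1:4222
      if err != nil {
          log.Fatal(err)
      }
      defer nc.Close()

      http.HandleFunc("/upload", func(w http.ResponseWriter, r *http.Request) {
          // In a real system this would be a validated upload ID, not raw input.
          id := r.URL.Query().Get("id")
          if err := nc.Publish("video.transcode", []byte(id)); err != nil {
              http.Error(w, "queueing failed", http.StatusInternalServerError)
              return
          }
          w.WriteHeader(http.StatusAccepted)
      })
      log.Fatal(http.ListenAndServe(":8080", nil))
  }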


That falls under won't, which is understandable. Can't implies someone is legally preventing you from talking about it.


That sounds good! Hopefully I will see the HN announcement when it is released.


Don't forget "shouldn't"


I didn’t see an easy way to move my quitter.no posts to my new Mastodon host.


"That wasn't always the case. In the early days of the internet the internet was protocol driven, decentralized, and often distributed—FTP (1971), Telnet (<1973), FINGER (1971/1977), TCP/IP (1974), UUCP (late 1970s) NNTP (1986), DNS (1983), SMTP (1982), IRC(1988), HTTP(1990), Tor (mid-1990s), Napster(1999), and XMPP(1999)."

Yeah, no one uses TCP/IP, DNS or SMTP nowadays. Good thing that we have gotten rid of those antiquated protocols and instead can use facebook and whatsapp! </sarcasm>


Why the sarcasm? It's true that (almost) no one uses them. Or, rather, no one much cares about them, and when they do use them, it's proxied and controlled by big corporate players.

SMTP is email -- and people (outside of business email) increasingly use social networks for their communication. And even when they use email, a huge percentage of it is through Gmail. Heck, emails between Gmail users don't even need to (and maybe already don't) use SMTP/POP/IMAP at all, and that's a heck of a big percentage of today's email use.

DNS is irrelevant when people just use 5-10 services: Google, YouTube, Gmail, Facebook, Twitter, Netflix, and so on (and even more so when they use them through mobile apps, which could drop DNS and TCP/IP tomorrow and connect through proprietary protocols and nothing much will change in the user experience).

Ditto for TCP/IP.

The fact that these protocols lurk behind what we do at the higher level is irrelevant, when the abstractions we've built totally hide them, in ways that they didn't in the 80s and 90s.

Here's some food for thought:

https://qz.com/333313/milliions-of-facebook-users-have-no-id...


> DNS is irrelevant when people just use 5-10 services: Google

Google is kind of useless without the rest of the web, though.


>"DNS is irrelevant when people just use 5-10 services: Google, YouTube, Gmail, Facebook, Twitter, Netflix, and so on (and even more so when they use them through mobile apps, which could drop DNS and TCP/IP tomorrow and connect through proprietary protocols and nothing much will change in the user experience)."

Except that those proprietary protocols would still need to go over the internet, where every router uses IP. So no, they could not drop TCP/IP tomorrow. And most (all?) of those 5-10 services you mention make extensive use of IP anycasting of DNS servers, which has allowed them to scale to what they are, so DNS is very much relevant to the success of those services.


>Except that those proprietary protocols would still need to go over the internet where every router uses IP.

Special pipes and contracts could take care of that too.

But in any case, it alone is small comfort when it comes to decentralization, since TCP/IP still allows total consolidation of control of anything that really matters to 5-10 companies (plus the telcos).


Please explain what these "special pipes" are connected to? Oh right, they would have to connect to "special routers" that didn't run IP at all. Special routers that these companies would have to design and produce themselves. And convince carriers to put in their networks.

Do you think these telcos, who have an almost adversarial relationship with these "over the top" services, are going to build out a parallel network and commit capex to accommodate these companies' new proprietary non-IP protocol networking devices?

"Special contracts"? What does that even mean?

The dustbin of history is filled with proprietary network protocols - DECnet, IBM's SNA, Xerox XNS, Banyan Vines, Novell NetWare, AppleTalk. There's a good reason the world converged on TCP/IP.

Seriously your assertion is completely absurd.


There are examples for this, eg carrier IPTV.

3GPP IMS had the grand vision of a telco-controlled IP Multimedia Subsystem (IMS), an internet modeled after the phone network. It runs over separate lines or at least separate layer 2 sessions. An application would use a signaling protocol (SIP etc.) to request access to "multimedia" services like gaming, the telco would reserve bandwidth and guarantee that the service could be accessed with assured quality (and billed, of course).

While IMS uses internet tech, it has a completely different philosophy.

I'm not aware of any IMS network that went beyond doing telephony, though.


Are you listing these as failures then? IPTV would be little more than a niche presence globally no?


Those systems are not complete failures, but the underlying philosophy has certainly failed. People don't prefer the ISP's QoS-guaranteed IPTV product over Netflix or Youtube, which was the IPTV systems vendors' promise.


>Do you think these telcos, who have an almost adversarial relationship with these "over the top" services, are going to build out a parallel network and commit capex to accommodate these companies' new proprietary non-IP protocol networking devices?

Telcos can be (and sometimes are) part of nice profitable conglomerates with those "over the top" services. Conversely, other times those "over the top" services try to be their own telcos (e.g. AOL and TW/Cable on one hand, and Google with its Fiber or FB with its "balloon internet").

It's not unfathomable that one or the other will succeed at some point. Here's some food for thought:

https://www.wired.co.uk/article/google-project-loon-balloon-...

>"Special contracts"? What does that even mean?

Carriers have had special deals with companies like Facebook, (and of course with their own owned content delivery subsidiaries), since forever...

>There's a good reason the world converged on TCP/IP.

Yeah, and there's an even better reason why the first that can create a proprietary pipe and lock its users in even more will get off of it pronto. You don't want to be tied to commodity resources and technology if you can avoid it.

We already have companies like Google who control both half the mobile OS space and half of the web's traffic (search, mail, video) and who venture into pushing their own protocols (AMP, SPDY, the "Google Wave Federation Protocol") and connectivity.

In fact, if you had paid attention, you'd have seen that Google already has not just plans, but its own implementation, to replace TCP: https://en.wikipedia.org/wiki/QUIC

Probably they didn't get the memo about how "completely absurd" my assertion is...


>"In fact, if you had paid attention, you'd seen that Google already has not just plans, but it's own implementation, to replace TCP: https://en.wikipedia.org/wiki/QUIC"

What you have just demonstrated is that you don't understand networking or networking terminology. "TCP/IP" is used to refer to the entire suite of protocols.[1] Part of that suite is UDP. The U in QUIC is for UDP. QUIC uses UDP. And both UDP and QUIC are IETF standards.[2]

There is nothing proprietary about an open standard, is there? And what protocols use QUIC? HTTP and TLS, also IETF standards. Oh, and QUIC falls back to TCP too. And the decision to put QUIC on top of UDP rather than on top of IP was due to problems with middleboxes (see SCTP). It had nothing to do with trying to create a "proprietary pipe."

>"Carriers have had special deals with companies like Facebook, (and of course with their own owned content delivery subsidiaries), since forever.."

Yeah, that special deal is called "zero rating" and it's where FB gives a ton of cash to the incumbent PTT in developing countries. Any carrier is happy to do this for you if you pay them enough cash. This has nothing to do with a carrier operating special networking gear for FB. FB builds and operates their own CDN and previously used Akamai, which is not owned by a carrier.

[1] https://tools.ietf.org/html/rfc1180

[2] https://datatracker.ietf.org/wg/quic/about/


And chances are if you use SMTP outside of the main clients/mail servers, your emails might get marked as spam automatically before it reaches the intended recipient.


A corporation could replace TCP with some proprietary protocol (though I don't see why some would even think about it), but it can't replace IP: IP is the protocol that routers on the internet use to know how to route packets. You can't replace it without implementing the new protocol on all the routers that your packets have to go through, and those routers aren't owned by a single company, so TCP/IP isn't going anywhere soon and probably will never go away.


Stop saying people don't care. (you can imagine me putting clap emojis between those like a cringy twitter poster). A majority of people supported net neutrality but the powers that be voted against it. The reality is of course people care if they know or have time to learn about systems, they just have no agency or power to effect change.


People cared about “Net Neutrality” because the name has a “free speech” ring to it, even though it doesn’t really have anything to do with that. How could anyone be against the internet being neutral?

I think “decentralised” isn’t going to drum up any interest unless you can couch it in the terms of a movement that people already understand and can be certain that it’s something the “good guys” would definitely want.


Honestly, I think that decentralized networks are able to leverage advantages of centralization, like trust, while remaining decentralized enough to still have the advantages from that front too.

GitHub and git are actually a good example of this, since you can use GitHub and know the platform is reliable, trustworthy, etc. but you can also use other hosts too. This degree of decentralization is at least better than fully centralized services like Twitter where you have zero interoperability at all. You can also see this in Mastodon too; the vast majority of users are on a handful of instances like mastodon.social anyway.


Centralized vs. decentralized is a pendulum that has been swinging since time immemorial. Heck, one could argue that cooperative single-celled vs. multicellular organisms boils down to a similar principle. Given that this pendulum has been swinging back and forth since long before humans, and certainly throughout human history, I believe it is safe to say that neither is going to "win in the end". They each must obviously contain the seeds of their own downfall. Which is interesting and worthy of study, but I don't see this article particularly advancing the understanding of that.


What's often overlooked is that this same back-and-forth history predates computing. The greatest strengths of centralization typically turn out to be its weaknesses.

The strength of well-designed decentralized systems is resiliency. But it's like multithreaded code; it's hard to write. Centralized systems trade off brittleness and kludges for a certain amount of predictability and the ability to actually hire programmers. As a result yes, you can run on multiple AWS sites, but you have to know too much to make that convenient.

And other "benefits" (as the author points out those are two-edged) ("controlable, walled gardenable,defensible, brandable auditable, copyright checkable, GDPRable...") are also by design chokepoints. Its those chokepoints that make it brittle and that is where the weaknesses lie.

Pre-computing: in the 1950s and 1960s there was a public debate as to whether the US government should be decentralized as a way of withstanding nuclear attack. Though we got the highway system, instead the US became more centralized, not less. And as a result it has contributed to more alienation of the populace from their representatives. I long thought California should spread its government around more for more public buy-in and support (and for that matter knowledgeable opposition). But of course everybody who cares about government wants to be "close to the seat of power", which we see particularly in the federal government.


It's a very old phenomenon indeed. Plato's Republic can be read as a meditation on the centralisation of power, the pros and cons of centralisation vs decentralisation, and how and why the balance shifts over time.


I think there are 2 issues at play here.

The first one, the article touched on obliquely:

> Back in those days of high adventure hosts were far more than mere pets, they were golden temples where crusaders came to worship speaking prayers of code.
>
> Today, servers aren't even cattle, servers are insects connected over fast networks. Centralization is not only possible now, it's economical, it's practical, ...

"Back in those days" users were administrators of one sort or another because that was the barrier to entry. But administrating systems sucks, even if it's just a PC. The rise of server-less is just another testament to this: being able to outsource the boring, tricky bits to experts is great as long as the principle-agent problem doesn't rear its head too often.

The second issue is that distributed (I suppose some people might call them federated) identity systems don't work very well. Look at the vast gulf between the spam problem on Facebook and the spam problem with email.

If you've signed up with a centralized identity system, it's pretty convenient to use a bunch of services synced up with that identity: all of the social media stuff is probably the tip of the iceberg.


This is a huge barrier to today's decentralized systems.

I wanted to check out Mastodon and looked into setting up a node. It's a nightmare of rube goldberg machine dependency hell. It's not that I can't, but that I don't have time to mess with it.

If decentralized system designers want these things to take off they must get better at writing clean code with a good UX. Start with a good UX for admins: single install, dependent on not much more than maybe a database or language runtime, easy upgrades, and good docs about how to get started. Then as the product matures work up to UX for end users. You will never get to the latter if you don't plan ahead a little and avoid design and runtime choices that drag in heaps of complexity or limit portability.

Decentralized protocols are maturing, but UX is still awful. I think UX is half the problem.

The other half of the problem is defending against attacks. In today's world where computer networks and even simple services on them like discussion platforms can be targets for professional criminals and even national intelligence agencies, anything that gets popular will get seriously attacked. Centralized systems are far more straightforward to defend: block IPs, kick off users, update software in one place, etc.

People like to hold up cryptocurrency as one system that has withstood constant attack, but that is only true if you limit scope to the protocol. As a holistic system cryptocurrency has been and continues to be successfully attacked with social engineering. Scammers are siphoning billions out of the cryptocurrency economy. On centralized economic systems like banks and exchanges you can ban, freeze funds, delist, and regulate.


> I wanted to check out Mastodon and looked into setting up a node. It's a nightmare of rube goldberg machine dependency hell. It's not that I can't, but that I don't have time to mess with it.

The sad thing is that we've had a solution to this for decades. Installing Mastodon should be "apt install mastodon" and you're done. Maybe add a PPA first if it's new enough the distributions haven't included it yet.

But people keep using Docker for this. The problem is that most users, even many sysadmins, have never used Docker before. Then they look into what it is and discover that it's a heavy, complicated thing with security issues and the user walks away.

I mostly blame the UX the package managers have for package maintainers for this. It needs to be as easy to create a package for the major distributions as a Docker container. Otherwise people take what looks like the path of least resistance at first and then don't revisit the decision soon enough.


I agree quite a bit with your last paragraph. Linux package managers are awful, with the two most popular (deb and rpm) being hellscapes of haphazard poorly documented cruft.

Linux distributions are also harder to deal with than the App Store, an "accomplishment" given how bad that is. Getting a package into them is terrible. They are informal cliques, and that worked back in the 90s but now doesn't scale.

Package management in general is an area ripe for a complete overhaul. Unfortunately that's hard, and it's hard not to fall into the "second system effect" trap and create something even more unwieldy than what you are replacing. Look at systemd.


Indeed, I tried to learn how to make a Debian package. A proper one that followed the guidelines (even though I wasn't trying to get it into a repo), not just one that's shoddily converted from another format.

The tools are just a bunch of hacks designed for the particular environment and habits of the few people who make Debian packages. They don't even try to document how an outsider would learn to do it.
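For reference, the skeleton the docs bury is actually small; a hypothetical package (name, maintainer and dependencies invented, and not a real Mastodon package) needs roughly a debian/control like:

  Source: examplepkg
  Section: web
  Priority: optional
  Maintainer: Jane Doe <jane@example.org>
  Build-Depends: debhelper (>= 11)
  Standards-Version: 4.1.4

  Package: examplepkg
  Architecture: all
  Depends: ${misc:Depends}
  Description: one-line summary of the package
   Longer description, with each continuation line indented by one space.

plus a debian/changelog, a debian/compat file, and a three-line debian/rules that just delegates to dh. The hard part is discovering that this is all you need.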

Not sure why you added systemd to this, though, except to appeal to the old guard of sysadmins who don't like systemd and show up on HN a lot? It seems a completely unrelated conversation.


Which is what things like snaps, AppImage, and Flatpak are trying to do.


Honestly just a better package manager could work. Apk in Alpine is lean, pretty easy to build packages for, and works.

Alpine is the best distro today, at least for servers.


> I wanted to check out Mastodon and looked into setting up a node. It's a nightmare of rube goldberg machine dependency hell. It's not that I can't, but that I don't have time to mess with it.

I've come to the conclusion lately that the thing holding decentralization back the most is that running any kind of server-side code is challenging to set up and a regular pain in the neck to maintain.

This prevents even tech-savvy people from setting up their own nodes of whatever cool tech happens to come out (not just social networks, but RSS readers, bookmarking apps, you name it: there is an OSS thing out there that fits your needs but is a PIA to install and run.)

And in turn this limits the number of nodes that will ever run on a decentralized service, and nodes will constantly go offline as admins get fed up with running them. We need the ability to install and run server software as easily (almost) as we can install something on our phones.

Maybe a layer on top of Ubuntu Server, with a baked in web UI so that any Joe can get a DO droplet and start serving.


Much like many projects on github have a "deploy to heroku" perhaps Linode or DO could make deploying a Mastodon instance as simple. There would need to be a good admin panel UI, as you said (manage federation links, users, blacklists/whitelists). It could even be free for the 1st month. If there is demand, my guess is these services will emerge.


Go has the right idea: big static binary. This will remain the best answer until dependency and package management systems advance beyond the 90s.
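For instance, cross-compiling a fully static Linux binary (assuming no cgo dependencies) is a single command, and the result can be copied to any box and run:

  CGO_ENABLED=0 GOOS=linux go build -ldflags="-s -w" -o myapp .

No runtime, no system packages, no dependency resolution on the target machine.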


You'd appreciate sandstorm.io


I was a backer when it came out but it has not become what I had hoped it would be.

It seems the need to alter the applications to work as a sandstorm app means most apps were a few versions behind and often contained bugs. I tried doing some of the simplest things and got major fails, so I let it go.

Plus it didn't handle the web-centric use case well (only static content [0]), which makes its usefulness very limited as a server-side platform.

https://docs.sandstorm.io/en/latest/developing/web-publishin...


> I wanted to check out Mastodon and looked into setting up a node. It's a nightmare of rube goldberg machine dependency hell. It's not that I can't, but that I don't have time to mess with it.

The thing is, a lot of that complication is caused by things like user registration, permissions systems, etc. which (a) aren't particularly relevant when running a node for oneself and (b) are probably the most vulnerable attack surfaces.


And if you just want a one-person instance, there are managed solutions like masto.host.


The author includes an incorrect diagram swapping around the definitions of distributed vs decentralised, then ignores that one of the design criteria of the internet was to survive against nuclear attack.

Google's 40 centralised global datacenters wouldn't stand up very well to that sort of thing. Centralisation only works when nobody stronger is around to compete with.


You don't think Google could survive losing a single data center? Cloud users might lose some data, sure, and it'd likely cause issues, but I bet search wouldn't go down.


You think the world only has 1 nuke?

Also it looks like 40 was an overestimate by me, the real number is 15: https://www.google.com/about/datacenters/inside/locations/in...


What I think will happen is that we will see a different type of decentralization become popular. Common platforms are needed or convenient. Right now these platforms are generally controlled by large companies with something approaching monopolies in various markets. For example, Amazon provides services and a destination that help people sell things. The problem is Amazon is a company that also sells things and is trying to maximize profits. So vendors are competing with the company running their vending platform.

Uber provides a platform for people to make money with their cars and to get around. It has the large network that drivers and riders need. But they are also so dominant that there is very little competition and so they can price things unfairly.

My belief is that A) we do need common platforms but B) the monopoly platforms are unfair and C) decentralized technologies can provide common platforms that aren't controlled by monopolies.

So what I think would make sense would be for drivers or even self-driving car companies to use decentralized protocols to make one large network and platform. That platform is open source and decentralized and used by many competing companies that want to provide transportation services. This allows companies access to a large pool of drivers and riders and a core technology implementation, but does not relinquish control to one company, so we can preserve competition and fairness.


We will eventually get to a point where everybody is publishing content over arcane but decentralized protocols that may or may not be supported by their ISPs, with content not easily discoverable otherwise. It will take the emergence of aggregators and archivers to find and package access to this content in a convenient form while also filtering out the spam.

When that happens, it will be 1996 again, and we'll be sharing stuff over NNTP and reading it through entities like dejanews.

It was a solid plan. A lot of old Usenet content is still around, which is more than can be said about content shared on privately-owned forums or Facebook.


I would argue that abuse protection has been one of the main drivers toward centralized systems.

Many goals of fully decentralized, morality-neutral platforms are in conflict with the expectations of people to be protected from harm while using the internet. Centralized platforms get close to providing such protection (even if flawed).

There is a ton of research into decentralized methods for protecting against abuse (a huge research topic in the 2000s), yet not much has really worked (Bitcoin solves SOME of it, but it still suffers from many other abuse problems).

Small-scale federated solutions (where all participants know each other) are potentially better on that front, but I don't think they will be profoundly different from solutions where control is centralized. There will need to be coordination in response to problems; it's hard to see how different parts of the system might implement profoundly different policies about content or anonymity without splitting the group.

So, my conclusion is that solutions with centralized control are dominating the market not because a few evil corporations are conspiring to steal control away from the free people, but because they DO provide a much safer environment for regular people to do regular things, despite all the downsides.


Email is a working example of federated network, and it’s profoundly different from Facebook. And not any more abusive than Twitter.


Twitter is pretty abusive. And email used to be even worse.

It didn't get decent until people started flocking toward centralized solutions like GMail, which (coincidentally or not) did a pretty decent job of reining in spam and viruses.


The industry got good at that; Google is not a magical spam filter, and in fact in some ways it is not even really that great at all.

There are many, many ways to achieve the same or better results for spam protection than by using Google.


Such as? I switched to gmail mostly for the spam filtering.


It's federated but certainly not completely decentralized. Spam filters in a way have become "a protection (even if flawed)". With email, you can still send an email to anyone but if you send it from a personal server versus gmail, it's much less likely to be read.


* citation needed

That's not been my experience at all. I've been running my own email server for years and I've yet to hear about an email being blocked. I know many other people who run their own email server just fine.


Indeed, I've been successfully running my email server for the past five years and mail has no problem reaching gmail users.

I think people have just become defeatist.
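For anyone weighing it up: most of the deliverability battle is publishing SPF, DKIM and DMARC records and having matching forward/reverse DNS. Illustrative records for a made-up domain (selector and key are placeholders, and the DKIM public key is truncated):

  example.org.                  TXT  "v=spf1 mx -all"
  mail._domainkey.example.org.  TXT  "v=DKIM1; k=rsa; p=MIIBIjANBg..."
  _dmarc.example.org.           TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.org"

Get those right and a small personal server is treated much the same as any other well-behaved sender.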


I would argue Email is heavily abused, much more than Twitter. It's just not very noticeable if you have a good spam filter.


I would argue that 'abuse' is the most significant characteristic of an open system. What is one persons abuse is absolute freedom to another.

The proficient design subsystems to isolate themselves from the 'abuse' and attract customers of their design. These become walled gardens. Eventually the closed system becomes oppressive.

As many others note, it is a sort of dialectic that can be traced in historical political movements and other human endeavors.


I don't think "regular people" are concerned about being harmed on social media. The story around abuse comes from very high profile people that receive the social media equivalent of what we used to call hate mail.

When a regular person signs up to Twitter they don't even get any followers let alone harassment. Abuse only happens when you reach a suitably large audience to enable it. Most people will never have that kind of attention.

I think centralization's domination in tech is due mostly to technical and business reasons. Centralized search is fast, centralized content is fast and high quality, discovery is easier...


Not "abuse" as in "targeted bullying". Abuse" as in "untargeted spam".

Just look at what happened to IRC. And Usenet. And Email. Heck, even the Fediverse. The major providers colluded in the form of de-facto network-wide blacklists as a way to counter spam, ultimately centralizing control of the network. People stick to the major providers that participate in the blacklist because, while a few people will occasionally be unjustly kicked out, most of the time it actually works pretty well and the major providers aren't flooded with garbage.


I took the "safer" in the OP to mean harassment but totally agree with you on your point. It goes to what I was saying about centralization offering better technical advantages.


Even if you don't personally see the benefit of a platform having a way to stop hate speech and harassment (we can all agree on the value of stopping spam at least), you still have the problem that a platform that allows hate speech and harassment will self-select for horrible users who aren't allowed on the major platforms. It'll be a wasteland of garbage, like 8chan, Gab or Voat.

Notice that you're currently expressing yourself on the actively-moderated HN, not on one of those.


[flagged]


> Are you a straight white male?

Does it matter? Please feel free to respond to my comment and not my identity.


> Does it matter?

Yes, because it's only straight white cis-men who say things like "Abuse only happens when you reach a suitably large audience to enable it."


I normally wouldn't feed the trolls, but no I'm not all of those things you list so I guess I prove your point incorrect.


I agree with you in this case, but you shouldn't refer to your parent as a troll when they are a long-term participant on HN. I believe there is good faith in this argument and there is no good reason to be rude.


It's pretty rude to bring someone's race, gender and sexual orientation into an argument. That's not good faith.


It is worth pointing out that the "winning" centralized systems are largely built using decentralized systems.

A lesson to startups is to be very careful building your product by assembling pieces of commercial centralized systems. Make sure that if one part goes up in smoke (Facebook deprecates an API, the App Store bans you, Google Cloud API pricing quadruples, etc.) you can easily swap it out for something else.
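
One low-tech way to keep that option open is to hide each third-party service behind a thin interface you own, so a provider swap touches one file instead of the whole codebase. A minimal Python sketch (the names and the local-disk fallback are made up for illustration):

    from abc import ABC, abstractmethod
    from pathlib import Path

    class BlobStore(ABC):
        """The only storage interface the rest of the app is allowed to see."""
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class LocalDiskStore(BlobStore):
        """Fallback that keeps working even if a cloud provider changes its terms."""
        def __init__(self, root: str = "./blobs"):
            self.root = Path(root)
            self.root.mkdir(parents=True, exist_ok=True)
        def put(self, key: str, data: bytes) -> None:
            (self.root / key).write_bytes(data)
        def get(self, key: str) -> bytes:
            return (self.root / key).read_bytes()

    # An S3- or GCS-backed class implementing the same interface would slot in here;
    # application code depends only on BlobStore, never on a vendor SDK.
    store: BlobStore = LocalDiskStore()
    store.put("hello.txt", b"swap me out later")
    print(store.get("hello.txt"))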

Another thing to consider is that centralized systems die. When support ends but users are still there, software systems may decentralize. Take a look at "dead" MMOs that are still live because someone wrote a server emulator, or out-of-date DOS games that can be run in high resolution on a modern GPU because someone bothered to release the source code.


Peer-to-peer sharing in the 90s was the best example of decentralization that really worked, grew really quickly and had a massive user base.

Napster was probably the most successful consumer decentralization software ever with something like 25M users in 2001.

Sci-Hub, torrent video sites, etc. are very popular, and those are all decentralized via torrents.

So it's not that decentralization can't work, it's that legal structures won't allow it to work, because decentralization doesn't play nice with locking down IP.


This is the real answer. Decentralized is actually better for most things on the internet, but it is less profitable and harder to control for those on top.

I personally believe time will show decentralized systems winning out on the internet eventually, but it will take a lot of centralized services collapsing under their own weight before we start to realize this. Luckily, this has already started happening.


The examples you listed gained many users because they were (and are) sharing content that they do not own. Imagine peer-to-peer sharing of personal photos and videos by the users who own the copyright.

The difference in outcome may be due to weaker economic incentives. It's easy to share photos on Instagram, and it's much more difficult to, say, download 20 academic papers without emptying one's bank account. There is a cost to the user of sharing photos with Instagram; that cost just hasn't been realized by the average user.

https://github.com/ssbc/patchwork/


I disagree. I think it's because they let users access content more easily and more cheaply than the standard way.

Music sharing is basically dead because of Spotify, YouTube Red, etc.: they solved the simplicity problem, and the savings weren't worth putting up with the hassles of music torrents, which made the benefits of decentralization moot.

There isn't any reason you couldn't make decentralized photo sharing as easy as Instagram, but Instagram isn't photo sharing, it's social networking, and decentralized social networking hasn't shown that it's easier or better than centralized.


Hasn't centralized social networking proven to be a complete disaster? The user has become the product.


>locking down ip

That's the golden ticket right there. A meshnet is entirely possible RIGHT NOW, but it would entirely destroy the idea of copyright and IP.


I think the main reason decentralization lost is not technology but trust and network effects. We deposit hard-earned money in a bigger bank because it has less chance of failure, even if internally it might not be safe and it may need our help to bail it out. The more people deposit money, the bigger the bank gets.

Just as in the real world we end up propping up monopolies and large centralized companies, we have done the same with the internet and internet companies.


>reason decentralization lost

Nothing has lost; the story is still evolving. Most people have only ever been on one social network: FB. People are now getting comfortable with Twitter, and Mastodon (or something like it) will eventually make more sense to the mainstream, non-technical audience.


You are right that some apps will become popular and become part of the centralized backbone. People using Mastodon or any other decentralized app built on a Kademlia-style DHT and similar technology will still be using Android, iOS, macOS or Windows as their base platform. Only a very small subset of people will use Linux. So in the end, it will still depend on centralized infrastructure. It's not just about one or two apps but about the overall internet architecture and its control by, and dependence on, the bigger companies. As Apple, Google, Microsoft and Amazon lay their cables across oceans and acquire ISPs and telecoms, they will control everything from the packet-switched network to the application delivery used by a normal person.


Also, convenience. A large number of branches and ATMs makes a bank more accessible. Decentralized systems will have to be similarly convenient to get initial mainstream adoption.


I may be feeling a touch pedantic, and perhaps I just live in a different world being on the ISP side, but the Internet is not centralised at all. Large quantities of traffic end up going to certain central locations, but the network itself is decidedly decentralised. Just look at LINX, for example, and the other major IXs: nobody there is telling everyone else where to send their packets; they just go where needed. In fact LINX is a members' organisation run by its members, with each one, big or small, having one vote.

IXs, BGP, RIRs like RIPE, and the proliferation around the globe of access ISPs, all prove how decentralised the Internet is.

But if the point the article is trying to make is "all this content is being hosted in fewer and fewer locations and that's bad" then I completely agree. It damages choice and hugely increases the blast radius of service outages.

Now if Amazon and Facebook started offering end-user connectivity, that's the time to really worry. *Cough* Google.


I don't think it would be too hard to sell decentralization, even in general terms. You can buy a tomato in a supermarket, and most people do. Yet a large number of people buy them from farmers at markets, and plenty grow their own. Everybody prefers a local, homegrown tomato over a supermarket one, always, even if the supermarket one is cheaper, more controlled and easier to get, and everyone understands the difference even while knowing nothing about how to grow a tomato.

Such differences would also be easy to grasp for the apps and data we use daily. Customization, something inherent in every individual, would also be a powerful driving force. And why we shouldn't let FB/Google etc. at our data requires less and less explanation.


Decentralized is good for experimenting, for testing all the possibilities in parallel. Once we have a working model - and one with network effects - the play is about squeezing out the last bits of efficiency, and that is better done centralized.


I suggest looking at the Holochain project.

https://developer.holochain.org

It’s operational today (prototype stage), and solves many of the problems that have prevented the roll-out of distributed systems in the past. Useful right now, and moving fast toward alpha stage, with high scalability.

This is going to have shocking effects in many fields where decentralized solutions have previously been difficult to scale.


There's a relevant gap in his 'history of the internet' analysis where the centralized web was basically all that existed for users (i.e. AOL walled gardens).

As with many things in society, these internet trends seem to follow cycles of discovery/innovation and improvement/cost-efficiency.

And just as the previous centralized web crumbled with the appearance of novel services, so will the current centralized networks. Trust in innovation.


Maybe this blog can survive on a single server answering direct calls from modems, but last time I checked, decentralization of content delivery was the rule, not the exception.

Centralized content with decentralized systems won. And they're doing so well it all appears centralized.


An inconvenient truth in the golden age of crypto mania. Even crypto itself cannot put up a good fight against centralization, as companies like Bitmain and Coinbase dominate the landscape.

Decentralized protocols, with centralized services on top of those protocols, are the future.


Email is decentralized, works great.


Unfortunately we've lost most of the great parts, because they were easy to attack and hard to defend...

It's not really practical to run your own mail server anymore. This is from someone who's been running a triply redundant MTA for over a decade. It works great, but I couldn't describe to someone how to do it without telling them in the next breath not to bother.

Also, sending email was trivial and totally decentralized (any script could do it manually in a few lines of code), but that no longer works; now an authenticated connection back to a centralized industrial-scale sender is required. Easy, just not distributed anymore.
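
To make that concrete: the "few lines of code" still exist, they just have to go through an authenticated submission port on a large relay rather than straight to the recipient's MX. A minimal Python sketch (host, account and password are placeholders):

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "me@example.com"
    msg["To"] = "friend@example.org"
    msg["Subject"] = "Still easy, just not decentralized"
    msg.set_content("Sent via an authenticated relay, not port 25 to your MX.")

    # Port 587 (submission) with STARTTLS and credentials; unauthenticated
    # delivery straight from a random residential IP is what no longer works.
    with smtplib.SMTP("smtp.example.com", 587) as s:
        s.starttls()
        s.login("me@example.com", "app-password-here")
        s.send_message(msg)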

The next gen, however, is going to be totally decentralized, and end-to-end authenticated and encrypted. Very exciting times:

https://developer.holochain.org


It's the wrong year to pick winners and losers. There's a huge amount of stress in the system right now. The coming quakes will drastically reshape the landscape.

Let's check in around 2030 and see if centralized is still winning.


In the energy world (power grids), there are very practical advantages to decentralized generation. It actually makes a good example, with pros and cons similar to the kind of (de)centralization this post is talking about.


It's just the beginning for decentralization. These articles are so easy to write.


This article presents absolutely nothing in the way of an actual argument about how to write computer programs. That's enough said about it.


Why don’t we create network systems that model real social networks?

Can’t we combine centralization and decentralization?

Must it be all one or the other?


Isn't the author aware that Kardashian is into Bitcoin and publicly announced it?


7. Killer app



