I use the React-Relay-GraphQL combination. I think it's cool, but I don't like that the ecosystem around these tools encourages cargo-cult adoption.
The pattern could be called "social proof sophistry". It first involves social proof:
- the reputation of a known figure or organization
- marketing techniques
- blog posts from converts
and then, here is the clincher, needless complexity is added:
- fragmented subsystems
- sophistry / obfuscation in documentation
- increasing boilerplate by not establishing patterns
What these factors do is increase the time and energy required just to get to square one. People are prepared to devote those resources because of the social proof.
What happens when people devote time and energy? They become committed. By the time people get to a simple prototype, they have already sunk enough effort into it that they would feel foolish for not continuing.
They invest further resources and speak positively about the framework in order to feel consistent. They might even feel like writing a half-informed blog post just to feel ahead of the crowd, which further increases the social proof + obfuscation.
The big breakthrough here is obviously GraphQL. Not React or Relay, since there are many competing implementations that do similar things, but GraphQL. Even though there are graph query languages out there (Gremlin, etc.), they were not suited to querying JSON over the wire. GraphQL is ideal for it. For my next project, I hope to do a Clojure implementation of most of GraphQL, because I think it can be married to Om.Next in a very powerful way.
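To make the "JSON over the wire" point concrete, here is a minimal sketch (the `user`/`friends` schema is made up for illustration): a GraphQL query is shaped exactly like the JSON it returns, which is something a Gremlin-style traversal was never designed for.

```javascript
// A GraphQL query describes the exact JSON it wants back.
const query = `
  query {
    user(id: "4") {
      name
      friends(first: 2) {
        name
      }
    }
  }
`;

// The server responds with JSON that mirrors the query's shape:
// {
//   "data": {
//     "user": {
//       "name": "Ada",
//       "friends": [{ "name": "Grace" }, { "name": "Alan" }]
//     }
//   }
// }
```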
Actually, I am not being fair. There are some interesting ideas that came out of the whole collection of technologies: React, Relay and GraphQL. Among those interesting ideas:
1.) Components should be mostly immutable (favor props over state)
2.) one-way data flow (see the first sketch after this list)
3.) Relay containers: the component and the query that gets data for that component should be married together, so a developer can easily see what data a component needs, and the query never goes out of date
4.) combine all queries at a higher level -- the highest-level container combines the queries from all the lower-level containers, so you end up with one query. Relay also diffs the data it already has against the data it needs, so it only fetches what is missing. This has many advantages over the RESTful approach (see the second sketch after this list).
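Here is a minimal sketch of ideas 1 and 2 (the `Todo`/`TodoList` names are mine, just for illustration): each component is a pure function of its props, and data only flows downward, from parent to child.

```javascript
// Idea 1: no internal state -- the component renders purely from props.
const Todo = (props) => <li>{props.text}</li>;

// Idea 2: one-way data flow -- the parent passes data down; children
// never reach back up and mutate it.
const TodoList = (props) => (
  <ul>
    {props.todos.map((todo) => (
      <Todo key={todo.id} text={todo.text} />
    ))}
  </ul>
);
```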
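And a sketch of ideas 3 and 4 in the Relay "classic" API (again, the component and field names are hypothetical): the fragment lives next to the component it feeds, and the parent splices the child's fragment into its own, so the whole tree resolves to one query.

```javascript
class TodoItem extends React.Component {
  render() {
    return <li>{this.props.todo.text}</li>;
  }
}

// Idea 3: the query fragment is colocated with the component, so the
// data requirements are visible right next to the render code.
TodoItem = Relay.createContainer(TodoItem, {
  fragments: {
    todo: () => Relay.QL`
      fragment on Todo {
        text
      }
    `,
  },
});

class TodoList extends React.Component {
  render() {
    return (
      <ul>
        {this.props.list.todos.edges.map(({ node }) => (
          <TodoItem key={node.id} todo={node} />
        ))}
      </ul>
    );
  }
}

// Idea 4: the parent composes the child's fragment into its own, so the
// whole tree becomes a single query, which Relay then diffs against the
// data already in its store.
TodoList = Relay.createContainer(TodoList, {
  fragments: {
    list: () => Relay.QL`
      fragment on TodoList {
        todos(first: 10) {
          edges {
            node {
              id
              ${TodoItem.getFragment('todo')}
            }
          }
        }
      }
    `,
  },
});
```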
My criticisms of the system:
1.) this is a ridiculous amount of boilerplate. I can hardly believe how much I have to write by hand. I do get that some of this is only because of the immaturity of the system. A year from now there will probably be command-line tools that give us some of the automation we would expect from something like Ruby on Rails. Given a schema, I should not have to write all of my Types and Connections and Models by hand -- much of that can be inferred (see the sketch below).
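To show the kind of inference I mean, here is a sketch using graphql-js. The `models` description is hypothetical -- my own stand-in for "a schema" -- but given something like it, the object types can be generated instead of hand-written:

```javascript
import {
  GraphQLObjectType,
  GraphQLID,
  GraphQLString,
  GraphQLInt,
} from 'graphql';

// Hypothetical declarative description of the models, one line each.
const models = {
  Todo: { id: 'id', text: 'string', priority: 'int' },
  User: { id: 'id', name: 'string' },
};

const scalars = { id: GraphQLID, string: GraphQLString, int: GraphQLInt };

// Derive a GraphQLObjectType for each model instead of writing it by hand.
const types = {};
Object.keys(models).forEach((name) => {
  types[name] = new GraphQLObjectType({
    name,
    fields: () => {
      const fields = {};
      Object.keys(models[name]).forEach((fieldName) => {
        fields[fieldName] = { type: scalars[models[name][fieldName]] };
      });
      return fields;
    },
  });
});
```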
2.) Mutations -- in the middle of all of this beautiful functional programming, they dredged up a classic object-oriented pattern and made it as painful as possible. The amount of boilerplate needed for mutations is really amazing. What should be 1 line of code takes 40.
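For a sense of what I mean, here is a sketch of a mutation in the Relay "classic" API (the `RenameTodoMutation` and `renameTodo` names are illustrative). Conceptually this is one line -- rename todo X to Y -- but Relay wants a class with a method for every concern:

```javascript
class RenameTodoMutation extends Relay.Mutation {
  // Which mutation field to call on the server.
  getMutation() {
    return Relay.QL`mutation { renameTodo }`;
  }
  // The inputs, taken from the props this mutation was constructed with.
  getVariables() {
    return { id: this.props.todo.id, text: this.props.text };
  }
  // Everything in the graph that *might* change, so Relay can diff it
  // against its store and fetch only what it is missing.
  getFatQuery() {
    return Relay.QL`
      fragment on RenameTodoPayload {
        todo {
          text
        }
      }
    `;
  }
  // How to fold the server payload back into the local store.
  getConfigs() {
    return [
      {
        type: 'FIELDS_CHANGE',
        fieldIDs: { todo: this.props.todo.id },
      },
    ];
  }
}
```

And that is before `getOptimisticResponse` or the static `fragments` a mutation may also declare.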
With all that said, I am 100% excited about GraphQL. It does seem to me very much a win over RESTful APIs, for all the reasons mentioned here.
The facilities for configuring a mutation in Relay are non-trivial and the documentation is partial and sometimes wrong (I should submit a documentation PR instead of writing this sentence).
If the author is reading this, I'm curious what parts of the documentation you're talking about here.
`REQUIRED_CHILDREN` used to not be in the docs -- it only appeared in the code. This is what I had in mind when I said partial.
I don't remember which, but I think the docs for one of `RANGE_DELETE` and `NODE_DELETE` said it could take a list of ids when only one of them actually could -- it was something like that, at least ... maybe path vs. key? I don't remember exactly how it was documented. This is what I had in mind when I said wrong.
I didn't submit PRs, btw -- I suck at open source :).