
I see this and then I think of all the success of strongly opinionated things like Rails. Having one way to do things forces you to focus on what's most important, make choices, and get things done. It keeps you away from premature optimization and analysis paralysis. Yes, you could split things into 100 different pure pluggable microservices, but should that really be where you put your resources at the beginning? To me, engineering (and for that matter everything you do) isn't about doing things the perfect way; it's about doing them the best way you can and moving on to the next problem.

Some things work well together in a functional pipeline: Linux commands that do one thing, for instance. EMRs are famously malleable too, for better or worse.

But could you imagine being IT managing 1000 PCs all with slightly different versions of word processors or something like that?




Here's a counter argument:

In college (17 years ago) a professor once told me that databases largely outlive the apps that use them.

Imagine a world [1] where IT manages data centrally, but users are free to interact with it however they see fit.

Would this kind of a world be better?

(provided of course you have the right permission model & audit trail for modification)

--

[1] Founder of https://mintdata.com, so I'm biased -- malleable systems are near and dear to our heart

[2] Professor Franklin at UC Berkeley, if you're reading this, I am very sorry for not being the most attentive student :)


In this sort of world, how do you prevent users from making bad queries and hogging all the database resources or even crashing the database? How do you prevent users from wiping out data? (I get that you could back it up and restore it, but depending on the company, this could cause serious downtime and loss of revenue).


I think of this in two parts:

1) Computing is cheap, human time is not.

2) Competition in established industries is fierce (finance, real estate, education, construction, etc)

-- To the extent possible, I'd:

a) time out bad queries

b) add more hardware where feasible
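To sketch (a) concretely -- using Python's sqlite3 as a stand-in for whatever database is actually in play (the helper name, the step interval, and the deliberately slow query are all invented for illustration):

```python
import sqlite3
import time

def run_with_timeout(conn, sql, seconds):
    """Run `sql` on `conn`, aborting it if it exceeds `seconds`.

    SQLite calls the progress handler every N virtual-machine steps;
    a truthy return value interrupts the statement, which surfaces
    as sqlite3.OperationalError ("interrupted").
    """
    deadline = time.monotonic() + seconds
    conn.set_progress_handler(lambda: time.monotonic() > deadline, 10_000)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.set_progress_handler(None, 0)  # disable the handler again

conn = sqlite3.connect(":memory:")
print(run_with_timeout(conn, "SELECT 1 + 1", 1.0))  # fast query succeeds: [(2,)]

# A deliberately expensive query: counting a three-way cross join.
slow = """
    WITH RECURSIVE n(x) AS (SELECT 1 UNION ALL SELECT x + 1 FROM n LIMIT 2000)
    SELECT count(*) FROM n a, n b, n c
"""
try:
    run_with_timeout(conn, slow, 0.5)
except sqlite3.OperationalError:
    print("query timed out")
```

Server databases do this more directly (e.g. a per-session statement timeout), but the shape is the same: the runaway query dies, the user gets an error, and everyone else's queries keep flowing.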

-- For the "wiping out data" part, I think:

1) Mark things as "removed", delete when {GDPR,CCPA,etc} requires it

2) The UIs should be designed with human processes in mind -- that is, multiple people involved (asynchronously) in deciding what to do with a piece of data (flag it, mark it removed, send it for approval, etc.) -- what a lot of companies are calling "workflows" and "workflow management platforms" these days.

3) Backup/restore should be used only as a last resort (DR and the like)

4) Google Sheets-style versioning of the UI is paramount, and integration with CI/CD systems is crucial to having the cake (allowing people to create the tools they want) while also eating it (making sure IT doesn't drive the above-mentioned dune buggy off a cliff in protest).
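Point 1 above is just the classic soft-delete pattern. A minimal sketch, again with sqlite3 for brevity (table and column names invented; a real system would also record who removed what, for the audit trail):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        id INTEGER PRIMARY KEY,
        payload TEXT NOT NULL,
        removed_at TEXT  -- NULL means the record is live
    )
""")
conn.executemany("INSERT INTO records (payload) VALUES (?)",
                 [("alpha",), ("beta",)])

def soft_delete(conn, record_id):
    # "Delete" = mark as removed; the row stays recoverable and auditable.
    conn.execute("UPDATE records SET removed_at = ? WHERE id = ?",
                 (datetime.now(timezone.utc).isoformat(), record_id))

def hard_delete_removed(conn):
    # Actual deletion happens only when a retention policy
    # ({GDPR,CCPA,etc}) requires it, e.g. from a scheduled job.
    conn.execute("DELETE FROM records WHERE removed_at IS NOT NULL")

soft_delete(conn, 1)
live = conn.execute(
    "SELECT payload FROM records WHERE removed_at IS NULL").fetchall()
print(live)  # only the still-live rows: [('beta',)]
```

A user "wiping out data" then amounts to a bulk soft delete: annoying, but reversible with an UPDATE rather than a backup restore.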


That's kind of how we design our web applications.

We store all the data in the database, and the applications query the data, process it, and return results. We actually have multiple applications in various languages accessing the data (PHP, Java, Perl...).

It's genetic data, so a lot of tables are shared across our applications (gene symbol lookup, for example).
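The shared-table pattern looks something like this -- a Python sketch with invented table and column names; each client language would run the same query through its own driver:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for the shared database
conn.execute("""
    CREATE TABLE gene_symbols (
        symbol TEXT PRIMARY KEY,  -- e.g. the approved symbol
        alias  TEXT               -- an older or alternate name
    )
""")
conn.executemany("INSERT INTO gene_symbols VALUES (?, ?)",
                 [("TP53", "p53"), ("BRCA1", None)])

def lookup(conn, name):
    # Any client (PHP, Java, Perl, ...) issues the same SQL against
    # the same table; only the database driver differs per language.
    row = conn.execute(
        "SELECT symbol FROM gene_symbols WHERE symbol = ? OR alias = ?",
        (name, name)).fetchone()
    return row[0] if row else None

print(lookup(conn, "p53"))  # resolves the alias: TP53
```

The database, not any one application, owns the schema -- which is exactly why it outlives the apps.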

When I started in this field I worked with Lotus Notes. It's actually a shared NoSQL database on top of which users can create applications that query it as they see fit. It was clunky, but it could be effective for building custom team tools.



