Interesting, but no iOS client or WebDAV support. That would prevent me from using it.
Most developer/engineer types have a private server, and that's a great place to install Nextcloud. Nextcloud has cross-platform clients, supports strong authenticators (U2F) and open protocols like WebDAV. I even use it to sync org-mode files across many devices with great success.
The lack of (fully working[1]) iOS client is a giant glaring hole in the SyncThing ecosystem. Which is a shame because it works really well across my various machines and servers.
[1] fsync() seems to have been abandoned, but it still partially works on my phone - it syncs only over local networks, and you have to add folders manually by ID.
I like applications to be efficient as well, but come on, surely this memory consumption is broadly pretty reasonable? A few hundred MB?
Also, what is your memory there for, if not to be used? Does Dropbox release memory it no longer needs, or allow it to be paged out or its process killed, when other applications put pressure on the system? If so, what's the problem with Dropbox using the memory until then?
I used to run Windows 3.11 on 4MB RAM (it struggled to run Word 6.0 without insane paging, I was poor) and Windows 95 on 64MB of RAM (4 x 16MB on a 486 DX2 66).
How is this memory usage by Dropbox reasonable? It seems to be another example of insane bloated memory usage which is not only the trend but the fashion these days.
Remember, allocation is the enemy of speed. More CPU cycles, more allocations, more power consumed, shorter battery life on mobile devices (laptops), more charging of devices, more fossil fuels burned.
It wouldn't be reasonable on a 4 MB system, but this isn't a 4 MB system, so how's that relevant to anything?
Why do you have 16 GB of RAM or whatever in your laptop if not for applications to use?
> allocation is the enemy of speed
But the complaint wasn't the volume of allocation - it was the size of the working set.
Does reducing that working set perhaps consume more power than allowing it to sit at the current level? For example an in-memory cache of something could save power.
I think it's very naive and simplistic to just complain about memory consumption in isolation. The memory is there as a tool to be used.
> Why do you have 16 GB of RAM or whatever in your laptop if not for applications to use?
This line of thinking is _why_ I have 16 GB of RAM to use. I have an "old" machine -- a 4GB Mac Mini from barely a few years ago. It's now almost unusable with some apps. It's like a kind of software inflation. What used to run fine is now an impossibility I'd briefly recall in a dream.
I have 16GB of RAM on my work computer so my compiler can process hundreds of MB worth of source files into a fully functional program in less than 5 seconds. I have it on my home computer so I can run a fully simulated 3D world inside a videogame at 60FPS. I don't have it so some coder somewhere doesn't have to add lazy-loading to their file-syncing app.
Yes. Dropbox built its reputation on being a reliable, set-it-and-forget-it background service that kept your files synchronized. Applications that occupy user attention and serve the explicit purpose of the user may make a variety of dubious cases for reckless memory consumption, but it's a lot harder to justify for something that's supposed to be invisible.
This way of thinking works as long as you have very few bloated apps.
It stops working once it affects literally everything. 1Password? A few hundred MB (unless it occasionally goes crazy and balloons to a few GB). Browser? A GB or so. Some Electron apps? Another GB. It really adds up, but no single app will take responsibility, because "what else are you going to use the memory for?"
I'm sure memory is the wrong place to put the trade-off every time, because we can already see that failing. Few people have the 16 GB systems mentioned above. It's the reason Ripcord (https://cancel.fm/ripcord/) can exist and charge money even though the Slack client is available for free.
I expect time-to-market is the trade-off in this case, but I'm not even sure about that. (For example, Ripcord is written by one person - how many web developers does Slack have?)
It’s the wrong place to put the trade off because I highly doubt I’m receiving more value from Dropbox using 500MB as compared to 100MB. Unlike what VCs are telling Dropbox, I only want something to sync files from one computer to another and it most certainly is not the center of my workflow.
In my opinion, it reeks of lazy engineering brought about by PMs who want to turn Dropbox into something it isn’t.
The discussion is about Dropbox, a background service that used to have much more reasonable memory use and which has competitors using far less memory. A trade-off isn't necessary.
I don’t really understand your question. I do spend, invest, and gift the money I make, and I wouldn’t bother making it if I didn’t need and want to do those things.
And you’ve already spent the money on your RAM. Why do you want it to sit idle when it could be used to, for example, decrease sync time in Dropbox?
To reduce memory consumption of Dropbox costs money. Either via development time at Dropbox (so an increase in your subscription fee), via power drawn at your socket, or perhaps somewhere else.
Why are people here so utterly convinced that reducing memory consumption is the right place to spend money to get the best value? What do they know that I don’t?
I don't know how you do it, but when sending data from a file, I read chunks of the file and send each chunk as I read it, not the entire thing. These chunks can be very small, particularly as the MTU over a network is something like 1500 bytes (unless using jumbo frames). So to sync 10 files in tandem (10 threads), you would need 10 * 1500 bytes for the read buffers (plus overhead for storing the file pointers). Even you can see that is a tiny amount of RAM.
Or are you living in a world where your internet upload speed is somehow faster than reading from disk or memory, and your network interface has to wait on disk reads?
Are they reading the synced files entirely into memory or something stupid like that?
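The chunked approach described above can be sketched in a few lines. This is a hypothetical illustration (not Dropbox's or Syncthing's actual code), and the `send` callback and 1500-byte chunk size are assumptions for the example; real implementations would use a larger buffer (e.g. 64 KB) to amortise syscall overhead.

```python
# Sketch: stream a file in small fixed-size chunks instead of
# loading the whole thing into memory at once.
CHUNK_SIZE = 1500  # bytes per read; roughly one MTU-sized payload

def stream_file(path, send):
    """Read `path` in CHUNK_SIZE pieces and pass each piece to `send`.

    Peak buffer memory stays at CHUNK_SIZE regardless of file size,
    so ten concurrent transfers need only ~10 * CHUNK_SIZE bytes of
    read buffers. Returns the total number of bytes sent.
    """
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:  # EOF
                break
            send(chunk)
            total += len(chunk)
    return total
```

Contrast this with the naive `send(open(path, "rb").read())`, which holds the entire file in memory at once - exactly the pattern that makes memory use scale with file size instead of staying constant.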
I don't disagree, but I also recall an almost certainly apocryphal story of a web application developer who spent many weeks honing their application and trimming waste wherever it could be found, proudly announcing they'd finally got it running comfortably within the 2 GB memory limit - only to be asked "why not spend £100 and add more memory?".
The moral, I guess: is it really cost-effective to optimise relentlessly - and for whom?