Shutting down public FTP services (debian.org)
157 points by JoshTriplett on April 25, 2017 | 126 comments



The only reason I ever remember preferring FTP over HTTP in the 1990s was because FTP usually meant that I could resume large downloads if my connection dropped mid-download. That was a big deal when it took several hours to download something. That benefit largely disappeared for me as broadband got faster and the connections became more reliable.


Also disappeared when HTTP/1.1 added range requests around 1999.
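For what it's worth, a sketch of what that looks like on the wire (the offsets are invented): a client that already has the first 100000 bytes just asks for the rest, and a compliant server answers 206:

    GET /debian/README HTTP/1.1
    Host: ftp.debian.org
    Range: bytes=100000-

    HTTP/1.1 206 Partial Content
    Content-Range: bytes 100000-123456/123457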


HTTP/1.1 was published in 1997:

https://tools.ietf.org/html/rfc2068

But it probably took a few years to be universally adopted.


heh, the shitty proxies I saw in uni still had issues with 1.1 as recently as 5 years ago :(


FTP has been obsolete since 1999 thanks to ssh/sftp.

It should have died along with RSH, but people insisted on keeping it alive.


SSH/SFTP is mostly for authenticated access. The submission is explicitly talking about "public FTP servers" (i.e. anonymous access).


How about the disaster that is FTPS, or the even more confusing FTPES? What were they thinking?


FXP was pretty cool too and an FTP server can be used almost as local storage with the right apps.


Echo the good riddance comment.

Psst... Someone tell cPanel Inc and all the millions of horrid cPanel shared hosting providers which still promote this nonsense as the default way to manage files. It is somewhat sad when you meet a fresh young developer for whom deployment = FileZilla.


filezilla supports SFTP just fine (nothing to do with FTP)


It also comes with bundled malware, so be careful.


Not entirely true. Sourceforge bundles things with their installers.

https://news.ycombinator.com/item?id=8849950

But you could always get a normal installer even then, and I think (don't quote me on that, I never really used SF) Sourceforge stopped doing that after their acquisition?


Sourceforge stopped it, filezilla did not. And Filezilla explicitly defended the sourceforge practice when it was still in place. They're a very shady project.


It's not just if you dl from sourceforge:

https://filezilla-project.org/download.php

>This installer may include bundled offers.


But the default is ftp. If you don't explicitly choose sftp and an attacker blocks the ssh port and spoofs an ftp server, FileZilla will happily send the password for the attacker to intercept.


cPanel supports SFTP too. FTP is just one option and can be disabled.


How do you like to manage your files and your deployments?


git, mercurial, fossil. In short, version management. Usually with a good helping of CI so bad things don't go live.


Public FTP servers were where I downloaded most of the software for my computers, back in the 90s. There's nothing really like it anymore - you can't have anonymous sftp.

But perhaps we don't care anymore. The web is gradually consuming all that came before it.


> But perhaps we don't care anymore. The web is gradually consuming all that came before it.

It is partly the web/HTTP eating everything but also that FTP is legitimately a bad protocol and is less tolerant of horrid shit going on in layers below it (like NAT) than HTTP is.


I think my favorite "feature" of the FTP protocol has to be ASCII mangling, wherein the FTP server tries to mess around with line endings and text encoding mid-transfer. It's so bad that vsftpd, one of the better FTP servers for Linux systems, pretends to support it but silently refuses to perform the translation.

http://sdocs.readthedocs.io/en/master/sdocs/ftpserver/vsftpd...


I wrote a custom FTP server once (it was database-backed instead of filesystem-backed - e.g. you could do searches by creating a directory in the Search directory) and I added in insulting error messages if a client tried to exercise one of the more antiquated features of the spec (e.g. EBCDIC mode)


>There's nothing really like it anymore - you can't have anonymous sftp.

Strictly speaking there's nothing stopping someone from writing an anonymous sftp server that lets anyone log in as a 'guest' user or similar - it's just that nobody has (as far as I'm aware).
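A minimal sketch of what that could look like with stock OpenSSH, assuming a dedicated "ftpguest" account with an empty password and a root-owned /srv/pub to chroot into (the account name and path are made up):

    # /etc/ssh/sshd_config -- read-only "anonymous" SFTP, no shell
    Match User ftpguest
        PasswordAuthentication yes
        PermitEmptyPasswords yes       # the guest account has no password
        ForceCommand internal-sftp -R  # -R = read-only SFTP session
        ChrootDirectory /srv/pub       # must be root-owned and not writable
        AllowTcpForwarding no
        X11Forwarding no

Whether that counts as anonymous SFTP or just a well-locked-down guest account is a matter of taste.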


"Unauthenticated SSH" is basically what the git:// protocol is. I wonder if you could use git-daemon(1) to serve things other than git repos? Or you could just convert whatever you want to serve into a git repo, I guess.


You could, but since git isn't designed for handling large binary files the performance will be poor. That's why there are large file support plugins like (the aptly named) Git LFS[0] and git-annex[1].

[0] https://git-lfs.github.com

[1] https://git-annex.branchable.com/


> "Unauthenticated SSH" is basically what the git:// protocol is.

git:// is usually unencrypted (people have run it over TLS, but not commonly).
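To the grandparent's question about git-daemon(1): exporting a directory of repos is a one-liner (the path here is made up), but it only speaks the git pack protocol, so you'd still have to wrap whatever you want to serve in a repo:

    # serve every repo under /srv/git read-only on git:// (port 9418)
    git daemon --export-all --base-path=/srv/git /srv/git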



Please use IPFS instead.


IPFS requires a stateful thick client with a bunch of index data, no? Would it be efficient to, say, build a Debian installer CD that goes out and downloads packages from an IPFS mirror? Because that's the kind of use-case anonymous FTP is for.


Many many years ago I was on the team that managed the compute cluster for the CMS detector at the LHC (Fermilab Tier-1).

When we would perform a rolling reinstall of the entire worker cluster (~5500 1U pizza box servers), we would use a custom installer that would utilize Bittorrent to retrieve the necessary RPMs (Scientific Linux) instead of HTTP; the more workers reinstalling at once, the faster each worker would reinstall (I hand wave away the complexities of job management for this discussion).

I'm not super familiar with IPFS (I've only played with it a bit to see if I could use it to backup the Internet Archive in a distributed manner), but I'm fairly confident based on my limited trials that yes, you could build a Debian installer CD to fetch the required packages from an IPFS mirror. No need to even have the file index locally. You simply need a known source of the file index to retrieve, and the ability to retrieve it securely.
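And the fetch side really is just one command against a content hash (the hash below is a placeholder, not a real Debian mirror):

    # retrieve a whole directory tree by its content address
    ipfs get QmSomePlaceholderHashOfAPoolDirectory -o pool/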


And a day or two to wait for the installer to finish fetching packages...


Such pessimism! Bittorrent started with only a few nodes too. It is now a majority of internet traffic.


Isn't IPFS not anonymous either? I thought you needed I2P for that.


Anonymous within this context refers to unauthenticated clients, not communication privacy.


With this caveat: https://blog.filippo.io/ssh-whoami-filippo-io/

(Granted, you can stop this client-side, but it's worth noting that your SSH client will generally identify you to a server on connect.)


That ssh server seems to be down.


Some public roguelike servers use SSH with a single account and posted private key in place of telnet.


You have to be really careful though, because the default is to give users shell access. If you think you can limit that by forcing users to run some command, you'll run into trouble because the user can specify environment variables.

The user is also allowed to set up tunneling by default, which would let anonymous users use your network address.
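For anyone setting one of these up, most of that can be turned off per-account in sshd_config; a sketch (the wrapper path is hypothetical):

    Match User guest
        ForceCommand /usr/local/bin/launch-game  # a restricted wrapper, not a shell
        PermitTunnel no                # no tun/tap tunnelling
        AllowTcpForwarding no          # can't use the box as a relay
        AllowAgentForwarding no
        X11Forwarding no
        PermitUserRC no                # don't run ~/.ssh/rc
        # and don't AcceptEnv anything for this account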


> There's nothing really like it anymore - you can't have anonymous sftp

Nonsense. http is exactly like anonymous ftp and it does a much better job of it. Pretty much every anonymous ftp site started also serving their files via http decades ago -- which is why ftp is no longer needed.

Case in point: Debian makes all these files available over http. This isn't going away.

ftp://ftp.debian.org/debian/

http://ftp.debian.org/debian/


It really isn't as convenient if you have to download lots of files at one time though. FTP has mget. That's probably why FTP lives on for scientific data (NCBI, ENSEMBL, etc). Yes, you could use some tool like wget or curl to spider through a bunch of http links, but that's more work.


> FTP has mget

Not quite, ftp CLIENTS have mget. The ftp protocol has absolutely no awareness of mget. In fact, ftp is terrible at downloading more than one file at a time because it has no concept of pipelining and keepalive, both things that http supports.

With a nice multi protocol client like lftp, http directory indexes work just like an ftp server:

  $ lftp http://http.debian.net/debian/
  cd: received redirection to `http://cdn-fastly.deb.debian.org/debian/'
  cd ok, cwd=/debian
  lftp cdn-fastly.deb.debian.org:/debian> ls
  drwxr-xr-x  --  /
  -rw-r--r--         1.0K  2017-01-14 10:44  README
  -rw-r--r--         1.3K  2010-06-26 09:52  README.CD-manufacture
  -rw-r--r--         2.5K  2017-01-14 10:44  README.html
  -rw-r--r--          291  2017-03-04 20:08  README.mirrors.html
  -rw-r--r--           86  2017-03-04 20:08  README.mirrors.txt
  ..[snip]..
  lftp cdn-fastly.deb.debian.org:/debian> mget README*
  5315 bytes transferred
  Total 5 files transferred
  lftp cdn-fastly.deb.debian.org:/debian>


For anyone else wondering how to recursively download with this:

    lftp cdn-fastly.deb.debian.org:/debian> mirror doc
    Total: 2 directories, 43 files, 0 symlinks                             
    New: 43 files, 0 symlinks
    1031755 bytes transferred in 1 second (678.7 KiB/s)
    lftp cdn-fastly.deb.debian.org:/debian> 

Warning, -R means reverse (upload!), not recursive. ;)


Wow, I had no idea lftp had that feature. That's super cool.


lftp has a ton of features. background jobs, tab completion, caching of directory contents, multiple connections, parallel fetching of a SINGLE file using multiple connections.

Yes, it looks like '/usr/bin/ftp' from 1970, but it's far far far more advanced than that.


It would make more sense to offer rsync (unauthenticated), to ensure integrity of what's transferred.

  rsync -r rsync://... ./
will retrieve everything in a directory.


I suppose you can sometimes do `mget *.csv` on FTP (if the client supports it?), but with wget you can do:

    wget -r -A '*.csv' https://example.org/

More work, in the sense that it's more command line options to remember, I agree, but otherwise it's easier to integrate in scripts and much more flexible than mget.

(I don't miss FTP for the sysadmin side of maintaining those servers.)


Download managers such as Down Them All! [1] for Firefox are more convenient (and useful) than mget in an ftp client.

[1]: http://www.downthemall.net/


> there's nothing really like it anymore

a public-facing httpd that uses the default apache2 directory index can, of course, be configured to allow anonymous access, with a log level that is neither more nor less detailed than an anonymous ftpd circa 1999.


> Public FTP servers were where I downloaded most of the software for my computers, back in the 90s. There's nothing really like it anymore

Modulo UI details, the common download-only public side of public FTP servers is a pretty similar experience to a pretty barebones file download web site. Anonymous file download web sites are, to put it mildly, not rare.


... until you want to offer a directory structure with multiple files


In the "transition" from FTP to HTTP, the level of abstraction in popular use has shifted out of the protocol and into resources (mime-types) [1], rel types [2], server logic [3][4], and client logic [5].

In the past, I've said that this extensible nature of HTTP+HTML is what made them so successful [6], but once specialized protocols began to falter, tunneling other semantics over HTTP became not just a nicety, but also a necessity (for a diverse set of reasons, like being blocked at a middlebox, being accessible from the browser where most people spend their time, etc).

[1] http://www.iana.org/assignments/media-types/
[2] https://www.iana.org/assignments/link-relations/
[3] https://wiki.apache.org/httpd/DirectoryListings
[4] http://nginx.org/en/docs/http/ngx_http_autoindex_module.html
[5] http://stackoverflow.com/a/28380690
[6] https://news.ycombinator.com/item?id=12440783


So, configure your HTTP daemon to serve directory indexes, and point it at the root of whatever tree you want to serve.


Apache works better for this than FTP. I use it all the time: just configure it to serve indexes. Apache lets you configure the index to include CSS, fancy icons, custom sorting, and other stuff. All over HTTPS.

What's not to like?
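Concretely, it's a few lines of mod_autoindex config (a sketch; the path is made up):

    <Directory "/srv/pub">
        Options +Indexes
        IndexOptions FancyIndexing HTMLTable NameWidth=*
        Require all granted
    </Directory>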


Because then I need to use a browser plugin to download all the files in a directory.


True. Or, you know, wget.


> The web is gradually consuming all that came before it.

It's about cost, too. HTTP can be cached very efficiently, but FTP not at all. If I were the operator in charge and I had the choice between next-to-free caching by nearly anything, be it a Squid proxy, apt-cache or nexus, or no caching and having to maintain expensive servers, I'd choose HTTP.
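Pointing apt at a local cache, for example, is a single line of client configuration (the hostname is made up; 3142 happens to be the apt-cacher-ng default port):

    # /etc/apt/apt.conf.d/02proxy
    Acquire::http::Proxy "http://apt-cache.example.lan:3142";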


Exactly. You can offload HTTP to anyone, an HTTP proxy is absolutely trivial to set up, and an HTTP cache is even easier.

FTP is not nearly as trivial, plus it's a stupid, broken protocol that deserves to die. The whole thing is a giant bag of hurt.


> FTP is not nearly as trivial, plus it's a stupid, broken protocol that deserves to die.

I agree with you, but FTP has one very valid use case left: easy file sharing, especially for shared hosting. FTP clients are native to every popular OS, from Android to Windows (only exceptions I know are Win Mobile and iOS), and there's a lot of ecosystem built around FTP.

There are SCP and SFTP, but they don't really have any kind of widespread usage in the non-professional world.


> [Has] one very valid use case left: easy file sharing, especially for shared hosting.

Nope. Nope. Nope. Not easy. Not secure. Not user friendly. Not anything good. Have an iPhone and need to FTP something? Don't have installation rights on your Windows workstation and need to FTP something? Unpleasant if not confusing as all hell.

Dropbox or a Dropbox-like program is significantly easier to get people on board with.

Any "ecosystem" built around FTP is rotten to the core. Blow it up and get rid of it as soon as you can.

Some vendors insist on using FTP because reasons, but those reasons are always laziness. I can't be the only one that would prefer they use ssh/scp/rsync with actual keys so I can be certain the entity uploading a file is actually them and not some random dude who sniffed the plain-text password off the wire.


Huh?

Windows explorer has FTP built into the shell. As does every other desktop OS.

If you care about file integrity, you want to verify the signatures on the binary out of band anyway.

Dropbox is blocked in most commercial networks and offers no assurance of anything.


Yes, I'm sure walking your accounting department through how to use the command-line FTP tool in Windows is going to work out fantastically well.

Versus drag and drop file to this page. There you go. It's uploading.

That's why I said Dropbox or a Dropbox-like service, of which there are hundreds. Microsoft SharePoint is but one example.


Not that shell. The UI shell. Explorer. You can browse FTP sites just like network file systems, including drag and drop.


Go to Start->Run.

Type ftp://ftp-site/path/to/file. Drag and drop to your hearts content.


When I worked with FTP (with accounting departments, ironically), the advice was not to do that because the client often corrupted files.


The protocol you want is SMB.

Windows has first-class support (obviously); but Samba gives Linux and BSD support that, in modern Desktop Environments, is exactly as good. Mobile devices don't tend to have OS-level support for it, but there are very good libraries to enable individual apps to speak the protocols (look at VLC's mobile apps.)

Even Apple has given up on their own file-sharing protocol (AFP) in favor of macOS machines just speaking SMB to one-another.

Yes, it's not workable over the public Internet. Neither is FTP, any more. If you've got a server somewhere far away, and want all your devices to put files on it, you're presumably versed with configuring servers, so go ahead and set up a WebDAV server on that box. Everything speaks that.


> The protocol you want is SMB.

Uh, hell no. Never ever would I expose an SMB server to the Internet. SMB is really picky when the link has packet loss or latency issues, plus there are the countless SMB-based security issues.

> Even Apple has given up on their own file-sharing protocol (AFP) in favor of macOS machines just speaking SMB to one-another.

TimeMachine still depends on AFP.


> TimeMachine still depends on AFP.

No, it doesn't:

https://developer.apple.com/library/content/releasenotes/Net...


Is there a way to tune SMB to work better over low bandwidth / high latency links? The last time I tried it through a VPN it was working at less than 10kb/s


AFAIK no, other than increasing frame size, CIFS is more or less an inverse latency gauge.

I wonder if GP is pulling our legs about WebDAV. Yuck.


> Mobile devices don't tend to have OS-level support for it...

Yeah, nope. Dead out of the gate. Thanks for playing.


Mobile devices don't tend to have OS-level support for anything, though.

The GP comment:

> FTP clients are native to every popular OS, from Android to Windows (only exceptions I know are Win Mobile and iOS)

To rephrase: only 1/3 of mobile OSes support FTP.


Anything web-based works fine on the phone. FTP and SMB do not.

Not working on iOS is like saying "Oh, this road doesn't work with German made cars. That's not a big deal, is it?"


But we're talking about picking a thing to replace FTP for the use-cases people were already using FTP for. It doesn't matter if it doesn't do something FTP already doesn't do, because presumably you were already not relying on that thing getting done.


FTP is used to exchange files, a task that HTTP/HTTPS and/or email and/or IM and/or XMPP and/or Skype and/or Slack and/or a hundred other services can do just as well if not better.


More like German made cars have been artificially limited so they can't use certain roads.


...But it does work on iOS. It's just not built in. For example, Transmit for iOS supports FTP, and includes a document provider extension so you can directly access files on FTP servers from any app that uses the standard document picker.

https://panic.com/transmit-ios/

Searching the App Store I also see some apps for SMB, but I don't know whether they have document provider extensions.


You can also send faxes from iOS. What's your point?


The post I replied to implies that iOS is (somehow) "artificially limited" to be unable to access FTP - or at least I interpreted it that way.

FWIW, I'm not convinced that "web-based" is a better alternative for read/write file access, assuming you mean file manager webapps. No OS can integrate those into the native file picker, so you can't avoid the inefficiency of manually uploading files after changing them. WebDAV works pretty well though, if that counts...


It's just needlessly exclusionary. One of the greatest things about "the web" is it's pretty accessible by anyone with a browser that's at least semi-mostly-standards-compliant.


No one uses Windows mobile, that comment was hardly fair. Pretty sure Android is 80%+ share.


Even mobile Windows/iOS/Android have FTP (and variants) support via third-party software, just not natively.


Why is the protocol broken? Is there something that doesn't work? Perhaps you mean to say it's a complicated protocol, ill-fit for modern times?


Have you looked at the spec? If you do, then you'll understand.

Imagine a file transfer protocol that defines the command to list files in a folder, but does not specify the format of the response other than that it should be human-readable.

https://www.ietf.org/rfc/rfc959.txt LIST and NLST commands for example. No way to get a standard list of files with sizes and modification dates. yay!
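To make that concrete, here are two perfectly legal LIST replies for the same file - one Unix ls style, one DOS/IIS style - that a client has to guess its way through (size and date invented):

    -rw-r--r--   1 ftp      ftp          1024 Jan 14 10:44 README
    01-14-17  10:44AM                 1024 README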

Oh, and the data connection that is made from the server to the client. That works wonders with firewalls of today.

It was an ok spec when it was invented, but today it's very painful to operate.



> It's about cost, too. HTTP can be cached very efficiently, but FTP not at all.

It's ironic that you mention cost and caching, because a lot of services used for software distribution of one kind or another (e.g. Github releases) are following the "HTTPS everywhere" mantra, and HTTPS can't be cached anywhere other than at the client.


> and HTTPS can't be cached anywhere other than at the client.

No. Nexus, for example, can certainly cache apt, as can Squid if you provision it with a certificate that's trusted by the client.

Also, Cloudflare supports HTTPS caching if you supply them with the certificate, and if you pay them enough and host some special server that handles the initial crypto handshake you don't even have to hand over your cert/privkey to them (e.g. required by law for banks, healthcare stuff etc)


To clarify: what I meant is that HTTPS can't be cached by third parties. If I want to run a local cache of anything served over HTTP, it's as easy as spinning up a Squid instance. With resources served over HTTPS I can't do that.


Github probably trusts their CDN with their TLS key.

For Debian, packages are secured by PGP in combination with checksums so it's not relevant for them. The debian repos are often HTTP only.
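Roughly how that chain works, as a sketch (assuming you're sitting in a mirrored dists/<suite> directory and already have the archive key in your keyring):

    gpg --verify Release.gpg Release          # archive signature over Release
    grep -A3 '^SHA256:' Release | head        # Release lists hashes of the package indexes...
    sha256sum main/binary-amd64/Packages.xz   # ...which you can then check yourself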


What we need is FTP over HTTP.


Well, there is WebDAV. At least Windows and OS X support it (Windows from Explorer, OS X from Finder), no idea about mainstream Linux/Android/iOS support though. Also, no idea if WebDAV can deal with Unix or Windows permissions, but I did not have that problem when I set up a WebDAV server a year ago.

IIRC WebDAV uses GET for retrieval, so the read parts can be cached by an intermediate proxy and the write part be relayed to the server.


As someone who once tried to write a WebDAV server, I cannot in good conscience recommend it. It's a bizarre extension of HTTP that should not exist.


Out of curiosity: why did you try to write your own WebDAV server? Apache ships a pretty much works-OOTB implementation - the only thing I never managed to get working was to assign uploaded files the UID/GID of the user who authenticated via HTTP auth to an LDAP server.


More specifically a CalDAV server which is a bizarre extension of WebDAV that shouldn't exist. We wanted one to connect to our internal identity server. That project was abandoned.


Related: "I Hope WebDAV Dies"[1]

1. https://news.ycombinator.com/item?id=10213657


Linux has great webdav support with the fuse filesystem, davfs. I used it with my box.net account and it worked fine.

http://savannah.nongnu.org/projects/davfs2
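Mounting looks like any other filesystem (the URL is made up):

    sudo mount -t davfs https://dav.example.org/files /mnt/dav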


What about Gopher? Will nobody think of Gopher?


Actually I thought about Gopher (I even have my own client - http://runtimeterror.com/tools/gopher/ - although it only does text) since it basically behaves as FTP++ with abstract names (sadly most modern gopherholes treat it as hypertext lite by abusing the information nodes).

Gopher generally avoids most of FTP's pitfalls and it is dead easy to implement.


As a side note, there was a nice podcast about Gopher a few days ago at Techstuff [1]

[1] http://shows.howstuffworks.com/techstuff/what-was-gopher.htm


At least there's still Jigdo.


lftp (https://lftp.yar.ru/) works over http (and more) and works just the way you would expect it to, i.e.:

lftp http://ftp.debian.org/debian/


There's Bittorrent. It's anonymous and not web based.


and it does checksums for you.

edit: thinking about it, I'm not sure I agree with the anonymous part, considering the swarm can be monitored. The access log is essentially publicly distributed.


Neither is FTP really, the user's IP is still logged somewhere, you just use a common user (anonymous) with everyone else. The modern name of such a feature would probably be something like 'No registration required'. It goes to show how much the meaning of the word 'anonymous' changed over the last 30 years.


Not quite; the "currently accessing" list is public. While it is of course possible to make an access log from this with continuous monitoring, it's not possible to arbitrarily query historical data.


I wrote a simple SFTP server a while back that advertises no auth methods; plain ssh tools log right in without prompting for a password.

code: https://pypi.python.org/pypi/noauthsftp

handy for moving files around a network


FTP is an old protocol; it was good for its time, but HTTP is just better now.

Even though you can't have anonymous SFTP, you can have anonymous FTPS.


kernel.org is shutting down their FTP servers, too:

https://www.kernel.org/shutting-down-ftp-services.html


Almost the exact same wording was used in the deb announcement. And it's all true.

I'm a nostalgic kind of guy, but FTP is terrible through firewalls, and nowadays bittorrent or http work just as well.


What has changed in HTTP over the past 10, 20 years? Was there ever a clear benefit of using FTP instead of HTTP?

Just wondering why suddenly everyone is in agreement FTP should go away ..


FTP was older and had more widespread client support in the 90s, so you were more likely to have at least a basic client preinstalled. That's obviously moot by now since just about everything ships with a web browser and/or something like curl.

FTP has a standard way to list directories. Using HTTP that way would require you to either have a well-known index file or parse HTML looking for links which don't return text/html responses.

The downside is that FTP still has issues with firewalls – I had to troubleshoot that earlier this month, actually – and is another service to maintain if you are already running an HTTP server.

In the case of either single-file downloads or something like a Linux package manager, the URLs are well known so directory listings are irrelevant. HTTP has a number of good options for CDNs and caching, so if you care about performance or reliability that's a turn-key service.

In the cases where directory listings were more valuable, people usually wanted a richer UI than just a file listing, too, and there are tons of options for that in the web world.


FTP has a standard way to request a directory listing, but the format of that listing is not specified and is impossible to parse reliably.


In the old days, we had decent clients for http but they were all manually operated GUIs, while the decent clients for FTP were all CLI and/or extremely automatable. Also, maybe 30 years ago, it was unusual to have a machine with a GUI at all.

Also, there was a day before dependency-resolving, automatically downloading, GPG-verifying distribution clients. I was there... Say you needed to roll back (or security upgrade) some software: you'd quite possibly go to an FTP site that's trusted, download some tar.gz or tar.z or shar archive that was trusted, make, make install, plus or minus some configuration of course. You couldn't do that via http in 1990 while telnet'd into a server; the http infrastructure and commands and servers hadn't been invented yet.

It's also important to point out that at least some of us were fooling around with unix and FTP (and FTP to non-unix OSes; I vaguely remember some TOPS20 server in the late 80s / early 90s... simtel20?) before http ever existed. So naturally there were a lot of tools and processes and experience getting things done with FTP when HTTP arrived.

Decent FTP client CLIs were extremely advanced. Not exactly "wget someurl" and hope for the best, LOL. Automatic login with differing accounts per site, text mode graphs and stats as downloads commence, tab completion, local help command functionality, multi-connection support... They mirrored, and often led, the developments in modem/BBS download functionality (think Zmodem on Telix, not kermit).

It's sort of like those "how could you have thought using telnet was a good idea compared to ssh?" questions. Well, I had 15 years of unix experience before OpenSSH was deployable, so we had a lot of telnet experience...


It's hardly sudden! The last time I encountered anyone using FTP by choice was around the turn of the millennium. Since then, it's always been supported reluctantly and under protest, because some integration partner critically important to a profit center couldn't or wouldn't support anything less terrible.


All the reasons listed in the article. It's also insecure. It's Hell with firewalls because of the way it negotiates what ports to use, instead of using a standard port for data and control.


Universal adoption and significantly less protocol cruft.


Thanks, Russ!

https://research.swtch.com/glob

> but if you have an anonymous FTP server accepting glob patterns, there are two more fundamental questions to ask: Do you really need to run an anonymous FTP server anymore?



Good riddance to FTP. I've wasted enough time troubleshooting it in my career.


In the old days, when we had Windows and wanted to install Netscape or download Linux, we used anonymous ftp servers. Now Windows has a web browser installed that we can use to download anything we want.

Anonymous ftp servers are outdated now, like Gopher was when web browsers replaced it.


Gopher had a lot going for it over the early web. But its death knell was University of Minnesota wanting to charge a licensing fee for it.


ftpmail. Now that is outdated. I vaguely remember around 1990 you'd send an email to some peculiar address and in more than an hour but less than a day, the service would execute the commands you gave it and return ... perhaps uuencoded files via email? Some encoding? Text just appeared as text.

I think I got new phrack issues that way. Or some other zine. I also obtained some software this way. Some of it was even FOSS/legal.

It was a way of working around various quota or security limitations. You'd get your file just as well as FTP, merely slowly. There were of course limitations on the ftp-mail service, you couldn't fetch an entire OS distro this way.

Unbelievably, you can still do this today

http://www.nws.noaa.gov/tg/ftpmail_using.php

The docs on that page do bring back some memories. Obviously a real ftpmail service didn't have an "open" line like that; it was more like "open anonymous@simtel20.something.something.something.mil".
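From hazy memory, a full request body was something along these lines (the host and file are invented, and the exact command set varied by gateway):

    open anonymous@ftp.example.mil
    binary
    uuencode
    cd /pub/zines
    get phrack-40.tar.Z
    quit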

That's what we did for a good time during the first Bush administration with our 2400 baud modems, LOL.


There's really nothing stopping FTP from working on CDNs and proxies. For public distribution BitTorrent is far superior, though. I still think SFTP/FTP/FTPS is the best way to upload files; are there any better free alternatives?


There are a fair number of businesses in old industries that "went digital" 10-20 years ago and still have the same tech running, including FTP. I work with a company that requires uploading files via FTP into a co-mingled directory (everyone uses the same password). The company is a major player in my industry, too.

I would be curious if some government services were still running FTP, judging by the number of Y2K-era government websites I run into.


But not offering https:// services, huh?


Yesterday I received a link to download a tutorial from China: http://pan.baidu.com/s/1sl993lV. I performed the download using Firefox, but the connection is so bad that there were many failures. I tried curl, which also failed. A real nightmare. None of these problems would exist if it were a simple ftp link: FileZilla is perfectly optimized to work around shitty connections.

And today I learn that ftp is falling into oblivion. What a sad time.


Curl is perfectly capable of resuming a download over HTTP unless the server is doing something stupid. Blame Baidu if you like, but HTTP is not the problem here and there is no benefit to FTP.
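e.g. (the URL is just a stand-in):

    curl -L -C - -O https://example.org/big-tutorial.zip

-C - tells curl to work out the resume offset from the partial file already on disk, and -L follows redirects; together they cover most flaky-connection cases.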



