You might end up having some trademark issues. For example, I could see Amazon not being too happy with the example link http://amazon.twi.bz/a.
The other thing I see is that, for a URL shortener, the above is 22 characters. What if you were linking to a page here? For example, a link to this page is http://ycombinator.twi.bz/b. That's 27 characters.
If you are going to use a URL shortener (not that I'm suggesting that it is a good idea), why use one that is so verbose (relatively speaking)?
So, I'm not sure that there is a sweet spot here. You have people on one side that will never _ever_ use a URL shortener. On the other side, you have people that want the smallest link possible. There isn't much space in the middle (IMHO).
Not every use of URL shorteners is for Twitter. I personally don't use Twitter, but when I send long links to my mother via e-mail, she has trouble unless I use a URL-shrinking service. I'd much prefer one that shows the domain outside of extremely space-constricted environments.
If your mom has trouble opening large links, she's probably not the type to be comforted by a subdomain. If the link is from her son she's going to click it.
The flip side for Amazon is that since it's clear that link is going to Amazon you are more likely to click on it which means more business for Amazon.
A similar service is DecentURL. It generally has slightly longer URLs, but lets you put your own descriptive string in the link, in addition to adding the target's domain as the link's subdomain.
Yes, the first thing I thought about twi.bz was, "I thought I already solved this with DecentURL" ... but I didn't quite. Looks like twi.bz is still about shortening URLs, whereas DecentURL is about making URLs decent.
Occasionally DecentURL will shorten a URL, but mainly it's intended to turn crazy random-numbers-and-letters addresses into nicely-named ones, while including the domain to make it somewhat "safe" (like twi.bz does).
Amazon and eBay have systems going back to 1995 to support; they also expose prettified URLs that redirect to Google. They know they have a (slight) problem there, but they do the best they can with what they've got.
As for Facebook, I've always thought they were pretty human-readable.
Yes. When I'm dealing with a new language/framework, I usually make a URL shortener. It's simple enough to not take long, yet represents a more useful learning experience than simply 'Hello World'.
Also, when I shorten a URL, it'd be handy to have the new URL as a link in addition to a text box, because then I can right-click it and 'Copy Link Location', which is slightly less effort than highlighting and copying. (I know, but I'm uber lazy.)
A problem with using the original domain as a subdomain might be that for people not familiar with your service, these URLs look more like a phishing scam than a shortened URL, quite the opposite of the intended effect ("transparency"): Would you like to go to "http://paypal.twi.bz/c"?
This is a cool idea. The host name in the URL goes a long way.
But there's still some trickery that could be used with redirection, e.g. "I'm Feeling Lucky" Google URLs and the like.
Also, most malicious URLs will include the payload in the request parameters. You can still do a lot of damage with a GET request. (Ask Googlebot if you don't believe me: http://blogs.securiteam.com/index.php/archives/746)
What I'm trying to say is, there's no substitute for seeing a full URL. Especially for security conscious users like the HN crowd. For the average user though this might be enough information.
How come people aren't looking at this from the other angle: a Twitter client could preemptively look up URLs from URL-shortening services and annotate the tweets with the full URLs?
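A rough sketch of what that lookup could look like (purely hypothetical, and assuming the shortener answers a plain HEAD request with an ordinary redirect):

    import urllib.request

    def expand(short_url):
        # Let urllib follow the redirect chain; geturl() then reports
        # the final destination the short link points at.
        req = urllib.request.Request(short_url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.geturl()

    # e.g. a client could show expand("http://amazon.twi.bz/a") next to the tweet

The client could cache the expansions so each short URL only gets resolved once.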
For that matter, how about a twitter client with a memory? One that keeps dossiers on the people you follow?
How about a twitter client that isn't written in AIR?
It seems to me that URL transparency is an issue for the clients to solve. Why can't Twitter (or whatever you're using) decode the short URL and supply a JavaScript pop-up or something to tell you where it's going?
I like that you include non-.com suffixes, but it does add length to the URL. However, it seems to me that some people may think you're trying to be malicious by using the domain.suffix as some spammers do. I'd suggest you do a replace on the . with a - or some other character.
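Something along these lines (illustrative only; this is my guess at the transformation, not how twi.bz actually builds its subdomains):

    def subdomain_for(host):
        # Collapse the original host into a single subdomain label,
        # e.g. "www.amazon.co.uk" -> "amazon-co-uk".
        host = host.lower()
        if host.startswith("www."):
            host = host[len("www."):]
        return host.replace(".", "-")

    print(subdomain_for("www.amazon.co.uk"))  # amazon-co-uk

That way the shortened link reads as a single host label under twi.bz rather than something that looks like a spoofed domain.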
"Your original web address had 26 characters, and the converted version has 30 characters."
I can see how this will always happen with the document root of any domain, so maybe you can refuse to shorten those, or make it a kind of in-your-face suggestion that it's silly to shorten a relatively short URL?
If the original is shorter, just return the original. You can make a note as to why you gave back the original, but I think we can safely assume the user wants the shortest he can get, so give it to him.
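In rough Python, the rule I have in mind (make_short_url is just a stand-in for whatever the service really generates):

    def make_short_url(original_url):
        # Stand-in: the real service would store a mapping and mint a path.
        return "http://example.twi.bz/a"

    def shorten(original_url):
        short_url = make_short_url(original_url)
        # If "shortening" didn't actually make it shorter, hand back the original.
        return short_url if len(short_url) < len(original_url) else original_url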
Ehrm... It does now, in any case. I didn't see it before. *blushes* (Seriously, how could I miss the only line on a four-line page that completely invalidates the point I was trying to make?)
Anyway, that probably was a good demonstration why it should be more in-my-face.
And it also demonstrates why web browsers need to send an im-an-idiot header :(
You mean the mapping between twi.bz URLs and the real URLs? Yes, I absolutely plan to release the data. I'm just working on what the best way is, and I wanted to get some feedback.
It's worth noting that I hacked this together in a two-hour session very, very late last night. This is its public debut.
I want to be able to release the mapping and also the trending information.
Please release a separate file without trend information, as not everyone will need it. For the data format, how about a simple CSV file with 3 fields, "domain-part,path,url"? This would translate to entries like these:
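(Purely illustrative; the target URLs below are made up just to show the shape:)

    amazon,a,http://www.amazon.com/gp/product/example
    ycombinator,b,http://news.ycombinator.com/some-item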
Great to hear that you also plan to release the trending information. For the URL->twi.bz mapping, I would opt for a simple text file with one mapping per line.
Imagine you search for a solution to a very annoying bug in your favorite program. Now imagine you find someone posting on a mailing list with a link to the solution/a patch/something that solves your problem. Imaginary problem solved.
Now imagine the same, except that the link is a tinyurl link and tinyurl went out of business 2 days ago. You will never be able to solve your problem.
If there were some kind of tinyurlarchive.org, you could solve your problem again.
Idea: if you return 'www.' instead of 'http://' at the beginning, you save 3 chars, and I think most Twitter clients and the Twitter web site will still create a link.
I appreciate the hackery and speed with which you implemented this idea born out of HN. That being said, I really, really don't care about another URL shortener.