I know there's a plethora of URL shorteners these days, and most platform providers (e.g., Twitter) are introducing their own so they can harvest the analytics themselves. For whatever reason, though, I keep trusting Google with all of my most personal information. I don't know if that's a good or bad thing, but I have a feeling this will become my standard URL shortener, just as Google Apps has become my standard for email, invoicing customers, document management, and scheduling. Not to mention Google Reader knowing all the RSS feeds I visit, Google's search knowing what links I follow daily, and probably thirty other products I'm using that store my information on their servers.
Yeah, same here. It seems like my entire life is run through Google these days. I just assume they have so much data there's no way they could possibly care about what I'm doing.
We don't need them. Companies that create the shorteners need them to collect information about who clicks on your links. That, and there's a service out there that limits updates to 140 characters.
We do need them. While twitter pretty much spawned the whole ecosystem, there still is a very important and functional use for URL shorteners.
The biggest, outside of Twitter, has to be emailing enormous links to someone. With so many email clients out there, sending a long link is a crapshoot. Personally and professionally, bit.ly has really helped out in those cases.
With that in mind, I wonder what impact this announcement has on bit.ly and the other players. Does Google provide analytics on their shortened links? How do the different shorteners compare in value now that Google has an offering?
Yes. I guess communicating URLs orally or on paper (where people have to type them in to use them, or you may even have to write them out by hand) is the only real use case for URL shorteners.
Even before twitter, url shorteners were useful in many places. For example, a Google Maps link would be a good candidate for url shortening. When used in emails/usenet/IM windows, long urls often get mangled.
I had the same question. Until last week when I sent out a link to a google spreadsheet, and my awesome </sarcasm> Navy Marine Corps Intranet Windows XP/Outlook 2007/Exchange system broke the link across two lines (but only after it was sent), and then every single recipient (about 100 physicians) said they couldn't see the document. I now use tinyurl.
Yeah, sending emails with absurdly-long links is just about the only use case I've found for URL shorteners. I find it particularly useful for links to Google Maps locations or direction sets.
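A quick way to see why these links break: many plain-text mail agents hard-wrap message bodies at around 72 columns, and any URL longer than that gets split across lines. A rough illustration (the URL itself is made up):

```python
import textwrap

# A made-up long URL of the kind Google Docs/Maps tend to produce.
long_url = ("http://example.com/spreadsheet/view?"
            "key=0AhKQDUfcQ6nZdFJfM2x1&hl=en&authkey=CPvF0OsB")

# Simulate a mail client that hard-wraps plain text at 72 columns.
wrapped = textwrap.wrap(long_url, width=72)

# The URL no longer fits on one line, so recipients get a broken link.
print(wrapped)
```

Anything shortened below 72 characters sidesteps the problem entirely, which is the whole pitch.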
I prefer text email in theory. It's cleaner, simpler. Text for messages, that's how the web was meant to work, right?
I also cringe when I see purple text in Comic Sans, as well as other unmentionables.
Yet, when I want to emphasize something, I hate using asterisks. It's not semantically correct, and it looks ugly. And how do you differentiate between underline and italics?
Using asterisks (*) for italics and underscores (_) for underlines is an age-old convention that predates the web and HTML email by years.
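That convention is mechanical enough that many mail clients render it automatically when displaying text/plain messages. A rough sketch of the idea (not any particular client's implementation):

```python
import re

def render_emphasis(text):
    # *word* becomes italics, _word_ becomes underline, per the
    # long-standing plain-text convention.
    text = re.sub(r"\*([^*]+)\*", r"<i>\1</i>", text)
    text = re.sub(r"_([^_]+)_", r"<u>\1</u>", text)
    return text

print(render_emphasis("this is *important* and _underlined_"))
```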
I much prefer plaintext only mails. Not only are they readable on any device, I also don't have to be afraid of getting my machine infected by who-knows-what.
I don't know about need, but I often use them for emails. I prefer a plain text email program (so no HTML mail to hide the link), and it's been my experience that many receiving email clients break overly long URLs. So I use Metamark[1], which I have a reasonable trust won't disappear in a week or a month.
I understand that you might not like short URLs in the wild (so to speak), but do you really not follow short links when sent to you from friends or relatives? (And, no, I don't mean short URLs from your aunt that sends you all the "funny You Tube" videos.)
Why is that? I'm assuming it's because when you see a link you check the URL to make sure you trust the location it's about to send you to. But surely this gets unscalable pretty quickly. Does it not?
The URL check is not against a whitelist. My head parses the URL using poorly defined heuristics to decide if it looks safe or not. I rarely get it wrong.
I recently hopped on the custom shortener bandwagon for our Twitter accounts, mostly for branding purposes.
macrumo.rs
toucharca.de
Even if people don't click through, they see the names. And I'm not too worried about the long-term stability of the links, since they are only sent out through Twitter, which tends to be a rather transient medium anyway.
Not sure about we, but Google needs a shortener, because it's a great source of web activity information. It complements the data they collect through urchin.js and the ad network, and lets them track and profile you better.
1. Twitter. (Obviously)
2. Print magazines. (There was no possible way to link to pages with long URLs in print; now it's bit.ly/fgyr.)
3. Billboards. (E.g., find us on Twitter, Facebook, etc. with bit.ly/grwt.)
In magazines and on billboards I see links ${company_name}.com/${campaign_name}, which I think makes more sense — it's more memorable than random shortlink characters.
If the entire web were RESTful, we probably wouldn't; however, the web is full of broken applications that embed lengthy GET request parameters into URLs. I'm not saying that GET params are bad, just that it reaches a level of absurdity very quickly.
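To see how quickly GET parameters inflate a URL, here's a sketch of a typical map link where every bit of view state rides in the query string (the host and parameter names below are all made up for illustration):

```python
from urllib.parse import urlencode

# Each piece of view state becomes another query parameter.
params = {
    "q": "1600 Amphitheatre Pkwy, Mountain View, CA",
    "ll": "37.4219,-122.0841",
    "spn": "0.01,0.01",
    "z": "15",
    "t": "h",
}
url = "http://maps.example.com/maps?" + urlencode(params)

# Well over 100 characters before you've even tried to share it.
print(len(url), url)
```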
A: Google needed a url shortener for its own products where we knew the shortener wouldn’t go away. We also wanted a shortener that we knew would do things the right way (e.g. 301/permanent redirects), and that would be fast, stable, and secure.
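The 301 detail matters: a permanent redirect tells browsers and crawlers that the short URL is merely an alias, so the mapping can be cached and search engines credit the long URL. A toy sketch of what such a redirect endpoint returns, using an in-memory table (the code/URL pair is just the example that appears elsewhere in this thread):

```python
# Toy mapping from short code to long URL.
SHORT_MAP = {"Jvhu": "http://www.magicbeef.com/"}

def redirect_response(code):
    """Return (status line, headers) for a shortener redirect."""
    long_url = SHORT_MAP.get(code)
    if long_url is None:
        return "404 Not Found", {}
    # A 302 ("Found") would also redirect, but it signals a temporary
    # alias and prevents clients from caching the mapping.
    return "301 Moved Permanently", {"Location": long_url}

status, headers = redirect_response("Jvhu")
print(status, headers["Location"])
```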
I think this is some indication of a continued push toward a social network/layer of some kind. Maybe just for Buzz but I think this is something more.
After pasting it into the Google service, it produced a different short URL: http://goo.gl/YH1c
I first assumed that every destination would produce a unique short URL, but that's not the case.
This way Google knows the full stats of http://www.magicbeef.com/. But you can only know the real stats if you know ALL the short URLs pointing to it. :)
EDIT: On my short URL stats page there's a notice:
"119 total clicks on all goo.gl short URLs pointing to this long URL"
EDIT: Notice the browser stats of the link (http://goo.gl/info/Jvhu): almost no IE, and less than half Windows.
Probably because of the analytics -- if you create a short URL and post it somewhere, you want to know how many people (etc.) clicked your link, not how many people clicked any link that goes to the same place through the shortener.
The chart looks to be a holdover from google.com/finance; I like the interactivity of it. I'm sure they will replace it with an HTML5/canvas solution as soon as they feel the adoption is worth it (for those alternatives).
Amusingly, I clicked the first link which took me to goo.gl, and closed the window to see every one of your links turn 'visited', giving me instant confirmation that I did indeed visit each (in spite of there being nothing in the back-history)
Ugh. Google entering this pretty much wipes out the space - companies like bit.ly are going to need to work on seriously differentiating themselves from the pack, and even then I don't think they're long for the world.
I can honestly say I hope to never be in a market - or working on a product - where google suddenly decides to enter the game. Even if their offering sucks, or is broken, it sucks all the air out of the room because it's OMG Google.
I don't think anyone really believed that something you can hack together in twenty minutes, where your business plan is completely based on "for as long as Twitter doesn't do one", really counts as a "space". Anyone who invested in it deserved to get burned.
bit.ly were working on partnerships with publishers as their means of getting some monetization, but that's all I've really seen.
I'd prefer the Google one to the others, because at least I have some guarantee that the Google one won't go away.
To be fair, Wave is still functional and Google announced intentions of releasing "Wave in a Box" (1). And they have explicitly said they'll provide a method to export existing data. Either way, their decommissioning of Wave is a lot more responsible than, say, what Xmarks is doing. I trust Google a lot more than a fly-by-night startup to gracefully shutdown their services.
where your business plan is completely based on "for as long as Twitter doesn't do one"
From what I could tell from the last emails sent from twitter, they are doing one. I wrote a very short blog-post on that since I was surprised nobody else seemed to notice:
What historical evidence is there for this? In every case I can think of in which a Google product supplanted an established competitor, it's been pretty clear that it did so through merit. As a converse example, I don't see Google Finance establishing any obvious position of dominance. Google has had no lack of failed products.
I rarely use shorteners, so excuse my ignorance here, but what does Google offer here that bit.ly doesn't? Why would I switch to Google after using bit.ly for a long time?
One nice thing is that goo.gl is pretty fast. See the link at http://techcrunch.com/2010/03/17/url-shorteners-speed/ for a graph from a few months ago, for example. We're also shooting for great uptime and we have a bunch of malware checks when shortening urls.
Have you checked out Bit.ly recently? They have differentiated themselves quite nicely -- they offer URL shortening for custom domains. I don't know if that can keep them afloat, but it's certainly a cool idea.
Right now it looks like the companies are neck and neck, each offering a 6-character code for URLs. bit.ly could invest in R&D and try to get that number down a bit.
I wonder what the web will be like in 10 years when most of these URL shorteners don't exist anymore and we're just left with a web full of links that go nowhere. I'm sure the Google one will survive, seems like we shouldn't have to use these URL shorteners at all though.
I agree with you, but how many of those unshortened links do you think will exist 10 years from now? I think the web is like a brain in which memory is active, not static. It's just a bit too young for that to be obvious.
At least the unshortened links have a chance of being scraped by things like archive.org; I wonder if it scrapes anything meaningful from the popular URL shorteners.
There was some initiative to create an archive of shortened URLs in case a provider were to close down: http://www.archive.org/details/301works. But it looks like it did not gain much traction.
Just out of interest: do the top-level DNS servers look at the entire address, or is Greenland's main internet supplier going to melt under the load of redirecting everything to Google?
It's obvious that we need shorteners, the original purpose being email, but Twitter bucked that trend.
So now, what we don't need is more URL shorteners. What we do need is more sites implementing their own shortening like YouTube and Flickr (e.g., http://youtu.be/3jDfSqtG2E4 and http://flic.kr/p/MGuRJ). It's all about the rev=canonical. This way, a shortened URL will stay around as long as that site does, and the internet doesn't break.
People seem to want click stats, though, although I don't see why you can't get these out of Google Analytics etc.
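The rev=canonical idea is simple on the consuming side: a page advertises its own short URL in a <link> element, and tools pick that up instead of minting a third-party short link. A rough sketch of the discovery step (the page markup here is fabricated):

```python
from html.parser import HTMLParser

# Fabricated page that advertises its own short URL via rev="canonical".
PAGE = ('<html><head>'
        '<link rev="canonical" href="http://youtu.be/3jDfSqtG2E4">'
        '</head><body>video page</body></html>')

class RevCanonicalFinder(HTMLParser):
    """Collects the href of the first <link rev="canonical"> element."""
    def __init__(self):
        super().__init__()
        self.short_url = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rev") == "canonical" and self.short_url is None:
            self.short_url = a.get("href")

finder = RevCanonicalFinder()
finder.feed(PAGE)
print(finder.short_url)  # http://youtu.be/3jDfSqtG2E4
```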
[edit - Nevermind, solved below in the comments.]
Does anyone know how these work long-term? This one shortens a URL to 4 characters, so for a potentially major URL shortener, there is a relatively low number of possible outputs.
Once they hit their limit, do they start erasing the oldest ones? The most inactive ones?
For shorteners that give consistent shortened URLs (i.e. every time you shorten the same input URL, you get the same shortened output URL), how do they deal with hitting their maximum?
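One common design that gives consistency and a graceful answer to the "maximum" question is a counter plus base-62 encoding: the same URL always returns its stored code, and codes simply grow one character longer once the shorter space is exhausted, so there is no hard limit to hit. A toy sketch (not necessarily how goo.gl actually works):

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def encode(n):
    """Base-62 encode a sequence number into a short code."""
    code = ""
    while True:
        n, r = divmod(n, 62)
        code = ALPHABET[r] + code
        if n == 0:
            return code

class Shortener:
    """Counter-based toy: the same long URL always maps to the same code."""
    def __init__(self):
        self.by_url = {}
        self.counter = 0

    def shorten(self, url):
        if url not in self.by_url:
            self.by_url[url] = encode(self.counter)
            self.counter += 1
        return self.by_url[url]

s = Shortener()
print(s.shorten("http://www.magicbeef.com/"))  # a
print(s.shorten("http://www.magicbeef.com/"))  # a (same code again)
```

With this scheme, codes stay at 4 characters or fewer until roughly the 14.8-millionth URL (62^4), so nothing ever needs to be erased.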
> Once they hit their limit, do they start erasing the oldest ones? The most inactive ones?
Let's do the math. If we assume that a service only uses a-z, A-Z and 0-9 as characters for the shortened URL, we have a set of 62 characters.
/x -> 62
/xx -> 62^2 = 3,844
/xxx -> 62^3 = 238,328
/xxxx -> 62^4 = 14,776,336
/xxxxx -> 62^5 = 916,132,832
/xxxxxx -> 62^6 = 56,800,235,584
In the worst case scenario, the URL shortener will have to use a 6-character identifier at some point, giving them a cumulative coverage of almost 58 billion URLs.
The 4 characters they are currently using will only be enough for a while. They'll switch to 5 soon enough.
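The table above is easy to check, and the "almost 58 billion" figure is the cumulative total across lengths 1 through 6:

```python
# Number of distinct codes per length with a 62-character alphabet
# (a-z, A-Z, 0-9), matching the table above.
for length in range(1, 7):
    print(f"/{'x' * length} -> {62 ** length:,}")

# Cumulative capacity through 6 characters.
total = sum(62 ** n for n in range(1, 7))
print(f"total -> {total:,}")  # 57,731,386,986 -- almost 58 billion
```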
I tried about 10 random ones and 3 were real links. So are they already approximately 30% full of the 4-character URLs? I guess you'd better not shorten a URL to a page you don't want to be public, because then the address will be easy to guess.
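If that 3-in-10 sample were representative (a big if, given the tiny sample size), the implied scale would be:

```python
# Back-of-envelope estimate from the 3-of-10 probe above.
space = 62 ** 4              # 14,776,336 possible 4-character codes
estimate = int(space * 0.3)  # ~30% observed hit rate
print(f"~{estimate:,} URLs already shortened")
```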
It seems like a bad idea to differentiate between upper and lowercase case to me. I imagine most people reading a URL aloud would not specify the case.
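The trade-off is real, though: folding case shrinks the alphabet from 62 to 36 characters, which cuts the 6-character code space by a factor of about 26:

```python
# Case-sensitive alphabet (a-z, A-Z, 0-9) vs. case-folded (a-z, 0-9).
print(f"62^6 = {62 ** 6:,}")
print(f"36^6 = {36 ** 6:,}")
print(f"ratio ~ {62 ** 6 / 36 ** 6:.1f}x")
```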
In all the excitement, we forgot to ask about the API. I am curious what they will provide, and with what limits.
One big problem I had with bit.ly on Twitter is there was no way to get all shortened URLs pointing to a "real" URL. This was a problem because I wanted to search for references to the "real" URL.
The lack of that feature of course empowered dedicated services like backtype (I suppose), who have special deals with Twitter so that they can get all references.