
> The existing Freebase dumps are far too useful for a would-be Google competitor

from the article:

> The last Freebase data dump will remain available

and even if it weren't, I'm sure archive.org will grab a copy (if they haven't already).




Part of the value was that the dumps were updated weekly.

Say you're using it to get a list of named entities (proper nouns), with the goal of clustering news stories about a given entity. If you look at Facebook's Trending News, each headline begins with a proper noun followed by a blurb; I'm not sure whether they use Freebase, but it could be a useful input.

The value of Freebase will decline over time as the content becomes out of date.


The Wikidata dumps are also updated weekly; see [1].

Wikidata RDF exports are generated from those dumps every two months or so and are available at [2]. I imagine that frequency will pick up. You can also generate your own RDF exports using the Wikidata Toolkit [3, 4].

[1] http://dumps.wikimedia.org/other/wikidata/

[2] http://tools.wmflabs.org/wikidata-exports/rdf/

[3] https://www.mediawiki.org/wiki/Wikidata_Toolkit

[4] https://github.com/Wikidata/Wikidata-Toolkit
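For the named-entity use case upthread, the Wikidata JSON dumps at [1] are easy to work with directly: they're a single top-level JSON array with one entity per line, so you can stream them without loading the whole file. Here's a minimal sketch of pulling English labels out of that format (the two-entity `sample` below is a hypothetical stand-in for real dump lines):

```python
import json

def iter_entities(lines):
    """Yield entity dicts from a Wikidata JSON dump: a '[' line, then
    one JSON entity per line (trailing comma), then a ']' line."""
    for line in lines:
        line = line.strip().rstrip(",")
        if not line or line in ("[", "]"):
            continue  # skip the array brackets and blank lines
        yield json.loads(line)

def english_label(entity):
    """Return the entity's English label, or None if it has none."""
    return entity.get("labels", {}).get("en", {}).get("value")

# Hypothetical two-entity sample in the dump's line-oriented layout.
sample = [
    "[",
    '{"id": "Q42", "type": "item", "labels": '
    '{"en": {"language": "en", "value": "Douglas Adams"}}},',
    '{"id": "Q64", "type": "item", "labels": '
    '{"en": {"language": "en", "value": "Berlin"}}}',
    "]",
]

labels = {e["id"]: english_label(e) for e in iter_entities(sample)}
```

On a real dump you'd wrap the file in `gzip.open(..., "rt")` (or bz2) and feed it to `iter_entities` the same way; the one-entity-per-line layout is what makes streaming cheap.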



