DBpedia provides Linked Data URIs for 2.9 million things.
DBpedia extracts structured information from Wikipedia and makes this information available on the Web as Linked Data. DBpedia allows you to ask sophisticated queries against Wikipedia, and to link other data sets on the Web to Wikipedia data.
The DBpedia Data Set http://wiki.dbpedia.org/Datasets
DBpedia architecture - http://wiki.dbpedia.org/Architecture
Report DBpedia-related issues, e.g. bad sameAs links, at: http://bugs.semanticweb.org
DBpedia Lookup (http://lookup.dbpedia.org) provides a service to find the most likely DBpedia URIs for a given keyword.
The underlying algorithm ranks DBpedia resources by their relevance in Wikipedia and includes synonyms in the index.
Try the terms "Shakespeare", "EU", or "Cambridge" and see for yourself whether the results you'd expect show up at the top. The result ranking differs from - and is intended to be more useful than - a simple full-text search or a SPARQL query with an embedded regular expression for matching labels.
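For comparison, the naive label-matching approach that this ranking improves on might look like the following query against the public DBpedia SPARQL endpoint (a sketch: rdfs:label and the regex filter are standard SPARQL/RDFS, not part of the Lookup service itself):

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

SELECT ?resource ?label WHERE {
  ?resource rdfs:label ?label .
  FILTER regex(?label, "Cambridge", "i")
}
LIMIT 10
```

A regex filter like this has to scan every label and returns matches in no meaningful order, which is exactly the behaviour Lookup's relevance ranking is meant to improve on.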
There is a web service available at http://lookup.dbpedia.org/api/search.asmx . You can use the KeywordSearch method to search for full terms, and the PrefixSearch method for an autocompletion-style interface such as the one you see at http://lookup.dbpedia.org/autocomplete.aspx .
The web service returns a list of resource URIs with English abstracts, DBpedia classes, and categories.
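Building a KeywordSearch request is straightforward. A minimal sketch in Python - the parameter names QueryString and MaxHits are assumptions based on the historical Lookup API, so verify them against the service's documentation:

```python
from urllib.parse import urlencode

def keyword_search_url(term, max_hits=5):
    """Build a Lookup KeywordSearch URL for the given keyword.

    QueryString and MaxHits are assumed parameter names; check the
    service documentation before relying on them.
    """
    base = "http://lookup.dbpedia.org/api/search.asmx/KeywordSearch"
    return base + "?" + urlencode({"QueryString": term, "MaxHits": max_hits})

print(keyword_search_url("Shakespeare"))
```

Fetching that URL should return the XML result list of resource URIs, abstracts, classes, and categories described above.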
All DBpedia pages include RDFa. Within the DBpedia Linked Data space, you therefore have HTML+RDFa as a structured-metadata representation alternative to N3, Turtle, RDF/XML, and RDF/JSON (*new*).
An example producing N3 from a DBpedia HTML page using the pure-Perl RDFa parser:
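Independently of the RDFa route, the N3 serialization can also be requested directly via HTTP content negotiation. A minimal Python sketch - the Accept mechanism is standard HTTP, but the media type text/n3 is an assumption about DBpedia's content-negotiation setup:

```python
import urllib.request

def n3_request(resource):
    """Build a request for the N3 serialization of a DBpedia resource.

    The media type "text/n3" is an assumption; the server may prefer
    text/rdf+n3. DBpedia is expected to answer with a 303 redirect to
    the N3 document for the resource.
    """
    url = "http://dbpedia.org/resource/" + resource
    return urllib.request.Request(url, headers={"Accept": "text/n3"})

req = n3_request("Berlin")
```

Passing req to urllib.request.urlopen would then follow the redirect and download the N3 data.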
DBpedia 3.4 describes more than 2.9 million things, including 282,000 persons, 339,000 places, 88,000 music albums, 44,000 films, 15,000 video games, 119,000 organizations, 130,000 species and 4400 diseases.
The DBpedia data set now features labels and abstracts for these things in 91 different languages; 807,000 links to images; 3,840,000 links to external web pages; 4,878,100 external links into other RDF datasets; 415,000 Wikipedia categories; and 75,000 YAGO categories.
The data set consists of 479 million pieces of information (RDF triples) out of which 190 million were extracted from the English edition of Wikipedia and 289 million were extracted from other language editions.
Future plans include deploying the DBpedia live extraction, which updates the DBpedia knowledge base immediately when a Wikipedia article changes.