Wednesday, December 06, 2006

Greatest Task of Web 2.x: Meta-Validation

+--------------------------------------------------------------------+
| Greatest Task of Web 2.x: Meta-Validation                          |
| from the vetting-the-metadata dept.                                |
| posted by kdawson on Sunday December 03, @21:33 (Editorial)        |
| http://slashdot.org/article.pl?sid=06/12/03/2134235                |
+--------------------------------------------------------------------+

[0]CexpTretical writes "This Technology Review [1]article about Web
2.x problems fails to mention the 800-pound gorilla in the room when
it comes to fulfilling the dreams of the Semantic Web: assumptions
about the validity of metadata and tagging schemes. We can add all the
metadata and tags we want to web resources, but that does not mean the
'data about the data' honestly or accurately describe the resource, or
are 'about the data' at all. This is why Google places little weight
on the metadata already contained in HTML document headers when
ranking search results: it cannot be trusted, and validating it would
take more effort than searching and indexing the underlying data from
scratch. Ensuring or verifying the validity of metadata would be a
task equal to that of initially creating it, and one that would have
to be repeated on an ongoing basis. Hence all the talk about 'trusted
networks,' which in turn require trusting the gatekeepers of those
networks. Talk about 'semantics.'" Slashdot's moderation and
meta-moderation offer one example of getting useful metadata in a
non-trusted environment; both points are sketched below.
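
A minimal sketch of the submitter's cost argument, in Python: to check
whether a page's self-declared keywords honestly describe it, you have
to strip and tokenize the full body text anyway, which is the same
work a search engine does when it indexes the page with no metadata at
all. The HTML snippet and the 'honesty' test below are invented for
illustration only.

    import re
    from collections import Counter

    page = """<html><head>
    <meta name="keywords" content="cheap flights, hotels, casino">
    </head><body>
    Recipes for sourdough bread: starter, hydration, baking times.
    </body></html>"""

    # Extract the self-declared metadata from the document header.
    declared = re.search(r'<meta name="keywords" content="([^"]*)"', page)
    keywords = [k.strip() for k in declared.group(1).split(",")] if declared else []

    # Validating it means stripping tags and tokenizing the entire body:
    # exactly the work needed to index the page with no metadata at all.
    body = re.sub(r"<[^>]+>", " ", page)
    tokens = Counter(re.findall(r"[a-z]+", body.lower()))

    # A keyword is 'honest' only if its words actually occur in the body.
    for kw in keywords:
        honest = all(tokens[w] > 0 for w in kw.lower().split())
        print(f"{kw!r}: {'supported' if honest else 'not supported'} by the text")

Running this flags all three declared keywords as unsupported, and the
validator had to do a full tokenization pass to find that out.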
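
And a minimal sketch of the meta-moderation idea, also in Python,
assuming a made-up weighting rule in which a moderator's influence
scales with the fraction of their past moderations that
meta-moderators judged 'fair'. The data and names are hypothetical;
this is not how Slashdot's actual slashcode computes scores.

    from collections import defaultdict

    # Moderations: (moderator, comment_id, rating), rating is +1 or -1.
    moderations = [
        ("alice", 101, +1),
        ("bob",   101, -1),
        ("alice", 102, +1),
        ("bob",   103, -1),
    ]

    # Meta-moderations: (moderation index, verdict) from random users.
    meta_moderations = [
        (0, "fair"), (0, "fair"),
        (1, "unfair"), (1, "unfair"), (1, "fair"),
        (3, "fair"),
    ]

    # Tally fairness verdicts per moderator.
    fair, total = defaultdict(int), defaultdict(int)
    for idx, verdict in meta_moderations:
        moderator = moderations[idx][0]
        total[moderator] += 1
        fair[moderator] += verdict == "fair"

    # A moderator's weight is their fraction of 'fair' verdicts, with a
    # neutral 0.5 default for moderators nobody has meta-moderated yet.
    def weight(moderator):
        return fair[moderator] / total[moderator] if total[moderator] else 0.5

    # Score comments by trust-weighted moderations, not raw votes.
    scores = defaultdict(float)
    for moderator, comment_id, rating in moderations:
        scores[comment_id] += weight(moderator) * rating

    for comment_id, score in sorted(scores.items()):
        print(comment_id, round(score, 2))

No individual moderator has to be trusted up front; the crowd of
meta-moderators supplies the trust estimate statistically.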

Discuss this story at:
http://slashdot.org/comments.pl?sid=06/12/03/2134235

Links:
0. http://www.conceptexplore.com/
1. http://www.technologyreview.com/read_article.aspx?id=17845&ch=infotech
