
Semantic Web

Starting from a small project, the Web (popularly known as the World Wide Web) now consists of billions of web pages hosted on millions of servers located across the globe. While the Web has become a basic source of information covering a wide range of interests (almost every area of interest where current research in science, arts, and engineering is going on), it has also created the challenge of proper information retrieval from an unstructured and semi-structured web of data.

Keeping these things in mind, Tim Berners-Lee (the creator of the WWW) coined the term "Semantic Web" and defined it as:

"a web of data that can be processed directly and indirectly by machines"

According to the World Wide Web Consortium (W3C):

"The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries."

In a nutshell, the concept of the Semantic Web can be described as:

"The main purpose of the Semantic Web is driving the evolution of the current Web by enabling users to find, share, and combine information more easily. "

Of course, there are many research challenges, like:


  • Vastness: The World Wide Web contains many billions of pages.  Any automated reasoning system will have to deal with truly huge inputs.
  • Vagueness: These are imprecise concepts like "young" or "tall". This arises from the vagueness of user queries, of concepts represented by content providers, of matching query terms to provider terms, and of trying to combine different knowledge bases with overlapping but subtly different concepts. Fuzzy logic is the most common technique for dealing with vagueness (see the short sketch below this list).
  • Uncertainty: These are precise concepts with uncertain values. For example, a patient might present a set of symptoms which correspond to a number of different distinct diagnoses each with a different probability. Probabilistic reasoning techniques are generally employed to address uncertainty.
  • Inconsistency: These are logical contradictions which will inevitably arise during the development of large ontologies, and when ontologies from separate sources are combined. Deductive reasoning fails catastrophically when faced with inconsistency, because "anything follows from a contradiction". Defeasible reasoning and paraconsistent reasoning are two techniques which can be employed to deal with inconsistency.

  • Source: http://en.wikipedia.org/wiki/Semantic_Web
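To make the vagueness point a little more concrete, here is a minimal sketch of the fuzzy-logic idea: instead of a yes/no answer, a concept like "tall" is assigned a membership degree between 0 and 1. The 160 cm and 190 cm thresholds below are purely illustrative assumptions, not part of any standard.

```python
def tall_membership(height_cm: float) -> float:
    """Degree (0..1) to which a given height counts as 'tall'.

    Piecewise-linear fuzzy membership function: below 160 cm is not
    'tall' at all, above 190 cm is fully 'tall', and anything in
    between is tall only to a degree. Thresholds are illustrative.
    """
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / (190 - 160)


if __name__ == "__main__":
    for h in (155, 170, 182, 195):
        print(f"{h} cm -> 'tall' to degree {tall_membership(h):.2f}")
```

A fuzzy reasoner would then combine such degrees rather than forcing every query term into a crisp true/false match.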

Current research on the Semantic Web can also be found in the Journal of Web Semantics: Science, Services and Agents on the World Wide Web. The journal has an impact factor of 3.4, an impressive one. The journal's web site is http://www.semanticwebjournal.org/
