Web Technologies

To put it bluntly, formalism as a paradigm is a dead end. It proves effective, however, when it is mobilized locally, for specific purposes and in controlled contexts.

It is a dead end because, for formalism to succeed absolutely and in every situation, semantics would have to be formal in nature; there could be no exceptions to the formalization of meaning.

Yet there is a contradiction between the totality of meaning and the mechanics of syntax. The meaning of a text depends on its context; the meaning of a paragraph depends on the text into which it is integrated, and the meaning of a word on the paragraph that contains it.

Abbreviations have been coined for the technologies related to the semantic web. Meaning proceeds from the global to the local, from overall understanding to analysis.

Formalism, however, works the other way round: the meaning of a logical formula is built up from the meanings of its parts, going from the local to the global.

We know how to compute its truth value. But for a sentence, understanding all of its words is not enough to grasp its meaning.
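
As a hedged illustration of this local-to-global construction (the names and the formula below are invented for the example, not taken from the original text), here is a minimal sketch that computes the truth value of a propositional formula by composing the values of its parts:

```typescript
// Minimal sketch: the truth value of a formula is computed compositionally,
// from the parts (local) up to the whole (global). All names are illustrative.

type Formula =
  | { kind: "var"; name: string }
  | { kind: "not"; inner: Formula }
  | { kind: "and"; left: Formula; right: Formula }
  | { kind: "or"; left: Formula; right: Formula };

// The meaning (truth value) of the whole is a function of the meanings of its parts.
function evaluate(f: Formula, valuation: Record<string, boolean>): boolean {
  switch (f.kind) {
    case "var": return valuation[f.name] ?? false;
    case "not": return !evaluate(f.inner, valuation);
    case "and": return evaluate(f.left, valuation) && evaluate(f.right, valuation);
    case "or":  return evaluate(f.left, valuation) || evaluate(f.right, valuation);
  }
}

// (p AND NOT q) under the valuation { p: true, q: false } evaluates to true.
const formula: Formula = {
  kind: "and",
  left: { kind: "var", name: "p" },
  right: { kind: "not", inner: { kind: "var", name: "q" } },
};
console.log(evaluate(formula, { p: true, q: false })); // true
```

No comparable bottom-up procedure yields the meaning of a natural-language sentence, whose interpretation also depends on its context.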

In other words, the logical functioning of meaning is opposed to its functioning in a linguistic or semiotic context; it is therefore impossible to reduce one to the other.

But formalism can be a very effective tool for characterizing an object and the processing that can be applied to it.

For example, the notion of typing makes it possible to describe behavior specific to a category of objects, and formalization specifies the nature of those objects and the extent to which they can be combined.
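
As a hedged, minimal illustration of how typing constrains combinations (the types and values below are invented for the example):

```typescript
// Minimal sketch: declared types characterize a category of objects and
// specify which combinations are admissible. All names are illustrative.

interface Resource {
  uri: string;            // every resource is identified by a URI
}

interface Person extends Resource {
  name: string;
}

interface Article extends Resource {
  title: string;
  author: Person;         // the formalism states the allowed combination:
                          // an Article's author must be a Person, not a string
}

// Accepted: the combination respects the declared types.
const alice: Person = { uri: "http://example.org/alice", name: "Alice" };
const note: Article = { uri: "http://example.org/note1", title: "Notes", author: alice };

// Rejected at compile time: the formalism excludes this combination.
// const bad: Article = { uri: "http://example.org/bad", title: "Bad", author: "Alice" };
```

The type checker plays exactly the local, controlled role described here: a formalism used as a tool to denote nature and behavior.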

This is why the terminology has evolved, shifting almost unnoticed from the semantic web to the web of data and even to the web of things. In other words, formalism ceases to be a paradigm and becomes a tool for denoting nature and behavior.

From this perspective, it is appropriate, depending on the field, to characterize the objects under consideration and to define formalisms suited to handling and managing them.

This leads to an explosion of formats and formalisms that bewilders novices and system designers alike, because each field has its own characteristics, incompatible with those of the others.

The heterogeneity of data, in addition to its distribution, therefore remains a major issue for the future of the semantic web. Rather than a formalization that homogenizes the data, the focus is on representing differences and on the computations specific to them.

Processing of Content

The semantic web arises from the need to add information to content in order to improve how that content is processed. From this point of view, it fits in perfectly with the other perspectives of web 1.0 and web 2.0.
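
As a hedged sketch of what "adding information to content" can look like in practice (the page and the metadata values are invented for the example; JSON-LD with the schema.org vocabulary is one common way of doing it):

```typescript
// Illustrative sketch: the same content, plus machine-readable metadata that
// describes it. All values are invented for the example.

const pageHtml = `
  <article>
    <h1>Web Technologies</h1>
    <p>A short overview of the standards behind the web of data.</p>
  </article>
`;

// JSON-LD annotations using the schema.org vocabulary: extra information
// added to the content so that programs, and not only readers, can process it.
const metadata = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Web Technologies",
  author: { "@type": "Person", name: "Example Author" },
  about: "Semantic web standards",
};

// The metadata is typically embedded alongside the content itself.
const annotatedPage = `${pageHtml}
<script type="application/ld+json">${JSON.stringify(metadata)}</script>`;

console.log(annotatedPage);
```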

However, the web should not consist solely of applications that exchange data without users. It becomes a huge database provided that it allows users to intervene as interpreting subjects and complements them with localized yet refined semantic tools.

Web Technologies

A vast project in continuous evolution since the beginnings of the web, the web of data rests on a set of standards whose functions it is important to know, at least in outline.

One of the most important metaphors accompanying the birth of the web is that of the universal library.

The web was then perceived as a gigantic documentary system, in which pages are documents that can be navigated by following links and marked with bookmarks, as in a book.

But it quickly lost its purely documentary status. Its first mutation came from search engines, which dynamically generate a page for each request in order to display its results, rather than necessarily serving a stored document.
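
A minimal sketch of that shift, assuming a Node.js runtime (the server and its output are invented for the example): the response is not a stored document but a page computed at the moment of the request.

```typescript
// Minimal sketch (assumes a Node.js runtime): nothing here serves a stored
// file; the "document" is generated on the fly for each request.
import { createServer } from "node:http";

createServer((req, res) => {
  const body = `<html><body>
    <h1>Results for ${req.url ?? "/"}</h1>
    <p>Generated at ${new Date().toISOString()}</p>
  </body></html>`;
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(body);
}).listen(3000);
```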

The document becomes computation. This is the starting point of various mutations of this space. One of them is the web of data, which no longer relies on documentary foundations but on the architecture of the web to interconnect databases.
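
One hedged illustration of data interconnected through web architecture rather than through documents (the URI below is hypothetical, the server is assumed to support content negotiation, and the code assumes a runtime with the Fetch API, such as a browser or Node 18+): a thing is identified by an HTTP URI, and a client asks for a machine-readable representation of it.

```typescript
// Hedged sketch: in the web of data, a resource is identified by an HTTP URI;
// dereferencing it with an appropriate Accept header returns data about it
// rather than a page meant for human readers. The URI below is hypothetical.
const resource = "http://example.org/id/alice";

async function fetchAsData(uri: string): Promise<string> {
  const response = await fetch(uri, {
    // Content negotiation: ask for an RDF serialization (Turtle) instead of HTML.
    headers: { Accept: "text/turtle" },
  });
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.text();
}

fetchAsData(resource).then(console.log).catch(console.error);
```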

When we talk about web architecture, we are not talking about the web as the object we browse every day, but about the standards that define the technological infrastructure from which it emerges.

To approach the architecture of web standards, you must first know a little about its architects.

In 1994, CERN (the European Organization for Nuclear Research), where the web was born, transferred its WebCore project to Inria.

The Massachusetts Institute of Technology (MIT), together with Inria and Keio University in Japan, then initiated the creation of the World Wide Web Consortium, or W3C, which provides a framework and neutral processes for the standardization of web architecture.

To read the documentation on this architecture properly, you should first know that the W3C uses different terms to describe its documents.

Dr. Yaşam Ayavefe
