Web Architecture


Web architecture can be defined as the conceptual structure of the World Wide Web. Types of web architecture include the client-server model and the three-tier model.

Web Architecture definition

Web architecture is the conceptual structure of the World Wide Web. The WWW, which runs on top of the internet, is a constantly changing medium that enables communication between different users and the technical interaction (interoperability) between different systems and subsystems. Its basis is a set of components and data formats, which are usually arranged in tiers and build on each other. Together they form the infrastructure of the web, which rests on three core components: data transmission protocols (TCP/IP, HTTP, HTTPS), representation formats (HTML, CSS, XML), and addressing standards (URI, URL). The term web architecture should be distinguished from the terms website architecture and information architecture.

Origin of web architecture

The world wide web is a concept that was realized in the 1990s so that people and machines could communicate with each other within a certain space. It is used to exchange, distribute, and share information in a network. At that time, the web consisted predominantly of static websites based on HTML, in other words, hypertexts that can be retrieved by a browser. Dynamic websites and distributed web services were added later.

Types of web architectures

The internet is a medium that is constantly changing and being expanded by numerous developers, programmers, and consortia such as the W3C. Nevertheless, the architectures in use can be distinguished schematically.

Client-server model

Initially, the web consisted of a two-tier architecture: clients and servers. Clients and servers share the tasks and services that the system is supposed to perform. For example, the client requests a service from the server; the server answers the request by providing the service. Retrieving a website via its URL, which directs the request to a server so that the site loads in the client’s browser, is an example of this two-tier model, also known as the client-server model.
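The request-answer cycle described above can be sketched with plain TCP sockets. This is a minimal, purely illustrative toy (not a real HTTP server): the client asks for a resource, and the server provides it.

```python
# Minimal client-server sketch: the client requests a service,
# the server answers by providing it. Illustrative only.
import socket
import threading

def serve_once(sock):
    conn, _ = sock.accept()
    with conn:
        request = conn.recv(1024).decode()
        # The "service": answer a resource request with a tiny HTML document.
        if request.strip() == "GET /index.html":
            conn.sendall(b"<html><body>Hello</body></html>")
        else:
            conn.sendall(b"404 Not Found")

server = socket.socket()            # TCP socket
server.bind(("127.0.0.1", 0))       # bind to any free local port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve_once, args=(server,))
t.start()

client = socket.socket()
client.connect(("127.0.0.1", port))
client.sendall(b"GET /index.html")  # the client's request
response = client.recv(1024).decode()
client.close()
t.join()
server.close()
print(response)                     # the server's answer
```

Real web servers layer HTTP semantics, persistent connections, and TLS on top of exactly this kind of socket exchange.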

The internet protocol suite, which now comprises around 500 different network protocols, usually serves as the basis for the WWW; it is commonly described by the TCP/IP reference model. Three prerequisites must be met in the web architecture for distributed application systems to communicate with one another:

  • Representation formats with a fixed standard: The most frequently used formats are HTML and CSS; or XML when machines communicate with one another.
  • Protocols for data transfer: HTTP (Hypertext Transfer Protocol) or HTTPS (Hypertext Transfer Protocol Secure) is used on the web. Other applications, such as mail servers, use SMTP (Simple Mail Transfer Protocol) or POP (Post Office Protocol). Which protocols are used depends on the application.
  • The standard for addressing: This refers to the URL (Uniform Resource Locator) which is an instance of the more general concept of URI.
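The addressing standard ties the other two prerequisites together: a URL names the transfer protocol, the server address, and the resource. A short sketch using Python's standard library shows how a URL decomposes into these parts (the URL itself is a made-up example):

```python
from urllib.parse import urlparse

# A URL is a URI that also tells the client how and where to
# fetch a resource: protocol, host address, and resource path.
url = "https://www.example.com/docs/index.html?lang=en"
parts = urlparse(url)

print(parts.scheme)   # transfer protocol, e.g. 'https'
print(parts.netloc)   # address of the server, e.g. 'www.example.com'
print(parts.path)     # resource on that server, e.g. '/docs/index.html'
print(parts.query)    # optional parameters, e.g. 'lang=en'
```

The browser resolves the host to an IP address, opens a connection using the named protocol, and requests the path, which is the client-server cycle described earlier.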

Finally, web architecture mirrors the operational structure of application systems: data storage, data transmission, and presentation. On the web, this typically means database servers that manage the data and resources, communicating with clients over a transfer protocol through which the data can be retrieved and viewed in a browser. The presentation is usually done with HTML and CSS.

Three-tier model

Three-tier models insert application logic between the client and the server, which handles data processing and allows a certain degree of interaction. For example, an application server can process data while a database server is dedicated solely to data storage. In this way, content can be dynamically loaded and saved. The scripting language JavaScript is often responsible for the behavior on the client side.

Generally, a distinction is made between server-side and client-side data processing. Dynamic websites are characterized by the fact that content can change on the client side without a new request to the server being required: scripts running in the browser modify the page, so no additional data transfer is needed for such actions. On the server side, modified content is stored via the application server on the database server. Optionally, this can be a virtual server that emulates a physical one.

There are different programming languages, data formats, and frameworks for implementing three-tier models. A selection:

  • Hypertext Preprocessor (PHP)
  • Common Gateway Interface (CGI)
  • JavaServer Pages (JSP)
  • Active Server Pages (ASP) and its successor ASP.NET
  • Asynchronous JavaScript and XML (AJAX)
  • Microsoft Silverlight (discontinued)
  • JavaScript Object Notation (JSON)
  • Java applets, JavaScript and VBScript (client-side technologies)

Service-oriented architectures (SOA)

Today the web is used for the networking of globally distributed IT structures. Each IT system can, in turn, consist of subsystems whose individual components are linked to one another via a fixed structure or architecture; think of an intranet or internal enterprise software. Modern IT and web applications are much more complex than the client-server model. Distributed web services, which are set up as service-oriented architectures (SOA), offer many functions and modular functional units that can be supplemented. With SOAs, business processes can be automated: the systems involved communicate with one another, partly without human intervention, and perform certain tasks. Examples include online banking, e-commerce, e-learning, online marketplaces, and business intelligence applications. These architectures are not only much more complex but can also be modularly extended. They are known as N-tier architectures and have so far been used primarily in the business sector.

There are generally two approaches:

  • Web Services Description Language (WSDL) and Simple Object Access Protocol (SOAP): WSDL is an XML-based metalanguage for describing network services; it defines the interface of a web service so that clients know which operations they can invoke and how. SOAP is also based on XML and allows web services to be controlled via procedure calls in the style of RPC (remote procedure call). SOAP, WSDL, and XML Schema are often used together.
  • Representational State Transfer (REST): REST is an alternative approach to communication between machines in distributed systems. It is based on a client-server architecture, but is characterized above all by its uniform interface, which makes REST easy to apply to different resources or objects. With the Hypermedia as the Engine of Application State (HATEOAS) concept, it is also possible to discover and change interfaces at runtime, instead of having to define them up front as with WSDL.
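REST's uniform interface means the same small set of methods (GET, POST, PUT, DELETE) applies to every resource. The dispatcher below is a tiny in-process sketch of that idea, not a real web framework; resource names and paths are invented for illustration.

```python
# REST's uniform interface in miniature: one dispatcher, four methods,
# any resource. In-process sketch, purely illustrative.

resources = {}   # resource store, keyed by id
next_id = 0

def dispatch(method, path, body=None):
    global next_id
    if method == "POST" and path == "/items":      # create
        next_id += 1
        resources[str(next_id)] = body or {}
        return 201, str(next_id)
    item_id = path.rsplit("/", 1)[-1]
    if method == "GET":                            # read
        if item_id in resources:
            return 200, resources[item_id]
        return 404, None
    if method == "PUT" and item_id in resources:   # update
        resources[item_id] = body or {}
        return 200, item_id
    if method == "DELETE":                         # delete
        return 204, resources.pop(item_id, None)
    return 405, None                               # method not allowed

status, new_id = dispatch("POST", "/items", {"name": "example"})
print(status, new_id)
print(dispatch("GET", "/items/" + new_id))
```

The point of the sketch is that clients never learn service-specific operations, unlike with a WSDL-described interface: knowing the resource's URL and the standard methods is enough.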

The Internet of Things and the Semantic Web can be considered current research areas in this context. If web architecture were represented as an evolutionary timeline, IoT and the Semantic Web would stand at its top. The architectures used there are correspondingly complex.

Relevance to online marketing

The effects of different architectures are extremely diverse. From a user perspective, websites and web services are changing at a pace that even developers can hardly keep track of, with hundreds of protocols, programming and scripting languages, frameworks, and interfaces in play. For users, however, an extended range of functions is an advantage, as long as the system works. Websites become interactive, data is exchanged faster, and services interact with one another seamlessly. Depending on the model chosen, certain KPIs of a web project, such as performance or page speed, can increase enormously. User experience and joy of use can also improve noticeably.

For ambitious web applications, developers now also need profound knowledge of IT infrastructure, programming languages, APIs, security, and data protection. From a developer’s perspective, web architectures are becoming more and more complex, and many different approaches exist side by side. This is nothing new on the internet: technologies come and go, and only the best applications prevail because they solve a particular problem and are accepted by users. The client-server model is already a classic, even if it still underpins billions of websites. Its successors are likely already established in the form of service-oriented architectures.