
Assignment 1 Cloud Computing - Trần Thành Đạt - BH00073


DOCUMENT INFORMATION

Basic information

Title: Assignment 1 Cloud Computing Trần Thành Đạt BH00073
Supervisor: Do Quoc Binh
School: University of London
Major: Computing
Type: Assignment
Year: 2023
City: London
Pages: 40
Size: 1.34 MB

Contents

  • I. Demonstrate an understanding of the fundamentals of Cloud Computing and its architectures
    • 1. Client – Server
      • 1.1. Client
      • 1.2. Server
      • 1.3. Relationship between Client and Server
    • 2. Peer-To-Peer (P2P)
    • 3. High Performance Computing
      • 3.1. Definition
      • 3.2. Example
    • 4. Deployment Models
      • 4.1. Public Deployment
      • 4.2. Private Deployment
      • 4.3. Community Cloud
      • 4.4. Hybrid Cloud
    • 5. Cloud Service Models
      • 5.1. Infrastructure as a Service (IaaS)
      • 5.2. Platform as a Service (PaaS)
      • 5.3. Software as a Service (SaaS)
      • 5.4. Comparing Service Models
    • 6. Characteristic of Cloud
    • 7. Virtualization and Multicore
      • 7.1. Virtualization
      • 7.2. Multicore
  • II. Evaluate the deployment models, service models and technological drivers of Cloud Computing and validate
    • 1. Overview Scenario
    • 2. ATN Solution
    • 3. Deployment Model
    • 4. Service Model
    • 5. Programming Languages
    • 6. Deployment Model
    • 7. Cloud Platform
    • 8. Cloud Architecture
  • III. References

Content

Demonstrate an understanding of the fundamentals of Cloud Computing and its architectures

Client – Server

A client-server network model consists of two main components: the client and the server. The server acts as the central hub where resources are stored, program services are hosted, and client requests are processed efficiently. The client, typically a computer or device, plays a crucial role by sending requests to the server to access these resources and services (Thịnh Hạnh, 2010). This structure enables effective resource sharing and communication within a network.

The client or workstation acts as the communication hub, managing interactions with users, servers, and external environments. It receives user requests, constructs query strings, and sends them to the server for processing. Once the server responds, the client organizes and presents the results to the user, ensuring seamless information exchange and user experience.

When a client sends a request to the server, the server processes the query and communicates with the outside environment as needed. Once the server completes processing the data and parses the request's character strings, it sends the results back to the client. The client can then utilize these results to serve the user effectively. This seamless request-response cycle is essential for efficient web server operations.
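As a sketch of this request-response cycle, the following minimal Python client and server exchange a query string over a TCP socket. The "RESULT:" reply format and the query text are invented for illustration and stand in for a real application protocol:

```python
import socket
import threading

def run_server(host="127.0.0.1", port=0):
    """Start a one-shot server that parses each request string and replies."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port 0 = let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def handle():
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode()          # parse the query string
            conn.sendall(f"RESULT:{request.upper()}".encode())  # send result back
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port

def client_request(port, query):
    """Client builds a query string, sends it, and returns the server's result."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(query.encode())
        return sock.recv(1024).decode()

port = run_server()
reply = client_request(port, "list toys")
```

The client constructs the query, the server processes it and answers, and the client presents the result: the same cycle described above, compressed into one process.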

Relationship between Client and Server

The client-server model defines the relationship between two computers, where the client requests services from the server. This model's key feature is the client's reliance on the server to provide and manage information. For example, websites are hosted on web servers, highlighting the essential role of servers in delivering online content. Connecting a client computer to a server involves establishing a communication link to enable efficient data exchange and service access:

• Log on to the computer that will be connected to the server

• Open an Internet browser, such as Chrome

• The Connect your computer to the server page appears

• In the file download security warning message, click Run

Peer-To-Peer (P2P)

A peer-to-peer (P2P) network is a decentralized platform that enables direct communication and transactions between individuals without third-party involvement. P2P platforms facilitate seamless buyer-seller interactions, incorporating services such as search, screening, rating, payment processing, and escrow to ensure secure and efficient exchanges (Indeed Editorial Team, 2021).

The P2P model provides many essential features (Indeed Editorial Team, 2021):

In a peer-to-peer (P2P) network, each computer actively contributes resources such as files, printers, storage, bandwidth, and processing power while also utilizing these shared resources. This decentralized structure allows multiple computers to exchange and access various resources efficiently, fostering a collaborative environment that enhances network performance and resource utilization.

A P2P network is easy to configure and manage, offering straightforward access control through sharing permissions on each computer. By setting appropriate sharing permissions and assigning passwords to specific resources, users can effectively restrict access and enhance security within the network.

Some P2P networks are built by overlaying a virtual network onto a physical infrastructure. This setup enables computers to communicate with each other through the virtual overlay, even as data is physically transferred across underlying physical links. This architecture enhances connectivity and flexibility within the peer-to-peer network environment.
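The overlay idea can be sketched in a few lines of Python. The peer names, shared files, and flood-style search below are invented for illustration and stand in for a real P2P protocol; the adjacency dict plays the role of the virtual overlay:

```python
# Each peer both shares its own files and forwards queries to neighbours
# over a virtual overlay (a plain adjacency dict standing in for the network).
peers = {
    "A": {"files": {"song.mp3"}, "neighbours": ["B"]},
    "B": {"files": {"game.iso"}, "neighbours": ["A", "C"]},
    "C": {"files": {"movie.mkv"}, "neighbours": ["B"]},
}

def search(start, filename, visited=None):
    """Flood a query through the overlay until some peer can serve the file."""
    if visited is None:
        visited = set()
    if start in visited:
        return None                       # already asked this peer
    visited.add(start)
    if filename in peers[start]["files"]:
        return start                      # this peer shares the requested file
    for n in peers[start]["neighbours"]:  # forward the query to neighbours
        hit = search(n, filename, visited)
        if hit:
            return hit
    return None
```

Every node acts as both client and server: it answers queries from its own files and relays queries it cannot satisfy, which is exactly the resource-sharing behaviour described above.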

The table below shows the advantages and disadvantages of a P2P network:

Advantages: Easy file sharing
Disadvantages: Lack of decentralization

Table 1 - Advantages and Disadvantages of P2P

Online gaming platforms like StarCraft, World of Warcraft, and other major publishers are highly popular within P2P networks. P2P technology is widely adopted in online gaming due to its simple and efficient data-sharing capabilities between players. This ease of configuration enhances multiplayer experiences, allowing seamless interaction. For example, two game controllers connected to a single gaming station enable different players to control separate characters simultaneously, illustrating the practical application of P2P in online gaming (Codrut, 2019).

In the modern world, P2P is still used for online gaming platforms and other services. Another example of a P2P network is Bluetooth, which is commonly used to connect mobile phones to other electronic devices.

High Performance Computing

High-performance computing (HPC) is essential for solving complex problems that require extensive computations, utilizing supercomputers and computer clusters. As technologies like IoT, AI, and 3-D imaging generate vast amounts of data, HPC's ability to process this data in real time has become increasingly critical. There are three main HPC architectures employed to address these demanding computational tasks, highlighting its growing importance across various industries.

Parallel computing is an architecture that divides large problems into smaller, independent tasks, allowing multiple processors to execute them simultaneously. By sharing memory and facilitating communication between processors, this approach effectively combines results to solve complex problems efficiently. The primary goal of parallel computing is to enhance computational power, enabling faster data processing and problem resolution (Heavy.ai, 2022).

A parallel computing infrastructure is typically housed within a data center, where multiple processors are installed in server racks to efficiently handle large-scale computing tasks. The server distributes computing requests into smaller parts, enabling them to be executed simultaneously across servers. This approach significantly enhances processing speed and overall system performance, making it essential for high-performance computing environments.
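This split-execute-combine pattern can be sketched in Python. Threads stand in for the rack of processors, and summing chunks of numbers stands in for a real large-scale workload; an actual HPC job would use processes or MPI rather than a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Worker: handle one independent piece of the larger problem."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the big problem into smaller, independent tasks...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...execute them simultaneously, then combine the partial results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

total = parallel_sum(list(range(1, 101)))
```

The divide step, the simultaneous execution, and the final combination mirror the three phases described in the paragraph above.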

Cluster computing involves a network of interconnected computers functioning as a single entity to enhance processing speed and data integrity, providing efficient solutions to complex problems. This system creates the illusion of a unified virtual machine through coordinated operation, a concept known as system transparency. Based on distributed systems principles, cluster computing primarily uses LAN connections to link computers, ensuring seamless performance and system transparency (WatElectronics, 2020).

Cluster computing requires that all connected machines are identical in hardware and interconnected through dedicated network links to ensure optimal performance. Additionally, these systems must share a common home directory, facilitating seamless data access and coordination across the cluster. These key characteristics are essential for the effective operation of cluster computing environments.

Distributed computing involves multiple computer systems collaborating to solve a single problem by dividing it into smaller parts, with each computer handling a specific portion. This approach enhances processing efficiency and scalability, as the connected computers communicate over the internet to work seamlessly together. When implemented correctly, distributed computing enables these systems to function as a unified unit, significantly improving computational performance (Techopedia, 2017).

Distributed computing aims to enhance performance by connecting users and IT resources in a cost-effective, transparent, and reliable manner. It offers fault tolerance and continuous resource accessibility even in the event of component failures. Compared to centralized systems, distributed computing provides significant benefits such as improved scalability, resilience, and efficient resource utilization.

• Scalability: The system can be easily expanded by adding more machines as needed without affecting the original setup

• Redundancy: Multiple machines can perform the same functions, so operations are not disrupted if one device fails. This approach allows for deploying numerous smaller devices, which makes implementing redundancy cost-effective without requiring excessive investment.
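A minimal sketch of that redundancy idea, with hypothetical replica functions standing in for real machines: when one replica fails, the request is simply retried on the next one, so the service as a whole stays up:

```python
def with_failover(replicas, request):
    """Try each redundant machine in turn; one failure does not stop service."""
    for serve in replicas:
        try:
            return serve(request)
        except ConnectionError:
            continue                      # this device failed, try the next one
    raise RuntimeError("all replicas down")

def broken(request):
    """Stand-in for a machine that has gone offline."""
    raise ConnectionError("node offline")

def healthy(request):
    """Stand-in for a working machine."""
    return f"served:{request}"

result = with_failover([broken, healthy], "order-42")
```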

Supercomputers are among the most recognized types of high-performance computing (HPC) solutions, comprising thousands of compute nodes that collaborate to execute complex tasks. This collaboration relies on parallel processing, allowing multiple tasks to be handled simultaneously, significantly enhancing computational speed. Essentially, a supercomputer functions like a network of thousands of interconnected PCs pooling their resources to accelerate processing efficiency (Netapp, 2022).

Supercomputers play a crucial role across various industries, powering advanced applications such as artificial intelligence and data processing. Notable companies like Meta Platforms utilize AI supercomputers to enhance social media and user experiences, while IBM owns two of the world's most powerful supercomputers, primarily used for media, entertainment, and high-performance computing tasks. These cutting-edge systems demonstrate the vital importance of supercomputers in driving innovation and efficiency in today's technology landscape.

The Google search engine, Earthquake Simulation, Petroleum Reservoir Simulation, and Weather Forecasting System are all examples of popular cluster computing applications (Murugesan, 2022).

Figure 5 - Google search engine (Financial Times, 2021)

Distributed computing plays a vital role in everyday life, exemplified by Content Delivery Networks (CDNs) that strategically store data across geographically dispersed locations to enhance end-user experience through faster content delivery. Additionally, platforms like Cloudflare's Ridge Edge Platform are advanced examples of distributed computing systems, showcasing how data processing and services are distributed across multiple nodes for improved efficiency and reliability.

Figure 6 - Content Delivery Network (Miko Tech, 2023)

Content Delivery Networks (CDNs) play a crucial role in enabling the fast and efficient transfer of assets necessary for loading web content. They currently handle the majority of global web traffic, supporting major websites like Facebook, Netflix, and Amazon to deliver seamless online experiences.
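How a CDN routes a request to a nearby edge location can be sketched with a hypothetical latency table; the regions, edge names, and millisecond figures below are all invented for illustration:

```python
# Hypothetical latency (ms) from each user region to each edge location.
latency_ms = {
    "eu-user": {"frankfurt": 12, "virginia": 95, "singapore": 180},
    "us-user": {"frankfurt": 90, "virginia": 8, "singapore": 210},
}

def nearest_edge(user_region):
    """A CDN serves content from the edge node closest to the user."""
    edges = latency_ms[user_region]
    return min(edges, key=edges.get)   # pick the lowest-latency edge
```

Storing copies of the same content at every edge and answering from the nearest one is what produces the faster delivery described above.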

Deployment Models

The most common form of cloud computing deployment is the public cloud, where third-party providers own and operate cloud resources such as servers and storage across the Internet. These providers manage all hardware, software, and infrastructure, ensuring scalable and reliable cloud services. An example of a public cloud is Microsoft Azure, which offers businesses flexible and cost-effective cloud solutions (Muhammad, 2020).

The advantages and disadvantages of Public cloud include:

Disadvantages:

• Lack of cost control
• Lack of security
• Minimal technical control

Table 2 - Advantages and Disadvantages of Public cloud

Amazon Elastic Compute Cloud (Amazon EC2) is a web-based service that enables businesses to run applications in the AWS cloud. It allows developers to create virtual machines (VMs) that offer scalable computational capacity for IT projects and cloud workloads. Hosted across AWS data centers worldwide, Amazon EC2 supports flexible and reliable cloud computing solutions for organizations.

Figure 8 - Amazon Elastic Compute public cloud (Bojankomazec, 2022)

Amazon EC2 is managed via APIs, allowing developers to quickly scale their infrastructure by launching or terminating multiple server instances simultaneously. This service provides users with full control over their instances, enabling them to manage their cloud environment with ease, as if the servers were located in their own office.

Private clouds consist of cloud computing resources exclusively used by a single organization, ensuring enhanced security and control. They can be physically installed on-premises within a company's data center or hosted by a third-party provider, offering flexibility in deployment. In a private cloud, all services and infrastructure are hosted on a private network with dedicated hardware and software, providing tailored solutions to meet specific business needs (Muhammad, 2020).

The advantages and disadvantages of Private cloud are:

Disadvantages:

• Expensive
• Mobile difficulty
• Scalability dependence

Table 3 - Advantages and Disadvantages of Private cloud

Amazon Simple Storage Service (Amazon S3) is a secure, highly accessible, and redundant cloud storage solution that enables users to store data safely in a private cloud environment. Widely adopted across various industries and company sizes, Amazon S3 offers reliable data storage for diverse business needs.

Figure 10 - Amazon Simple Storage Service (Aviatrix, n.d.)

Amazon S3 offers easy-to-use administrative tools for efficient data management and precise access control. Data stored in Amazon S3 is organized into buckets or folders, allowing users to efficiently categorize and manage their files. These storage buckets are scalable, capable of holding unlimited data, with no restrictions on the number of objects uploaded. Each individual object within a bucket can store up to 5 TB of data, making Amazon S3 an ideal solution for organizations with large-scale storage needs.

A community cloud is a multi-tenant platform that combines private and public cloud features, enabling multiple organizations to collaborate seamlessly on shared projects and applications. Its primary goal is to provide a centralized cloud infrastructure that facilitates collaboration among large groups of users on community-owned initiatives. Essentially, a community cloud is a distributed infrastructure that integrates various cloud services to cater to the specific needs of different business units (TUCAKOV, 2020).

There are advantages and drawbacks of Community cloud:

• Remote access may be limited

Table 4 - Advantages and Disadvantages of Community cloud

Government agencies require constant communication and seamless data exchange across multiple departments to deliver efficient services. However, due to strict privacy, legal, and security concerns, many governments limit their use of public clouds, making community clouds an ideal solution. Community clouds enable secure collaboration among government entities while addressing their unique compliance needs (Mohanakrishnan, 2021).

Consequently, IBM's federal SoftLayer cloud collaborates with local partners to create, install, and manage industry-specific community clouds.

Figure 12 - IBM SoftLayer community cloud (Mohanakrishnan, 2021)

The IBM Community Cloud is built on essential principles to give government agencies the flexibility, agility, and consistency they need to move to the cloud faster and more securely.

A hybrid cloud seamlessly combines private and public cloud environments, allowing data and applications to transfer between them efficiently. This flexible setup enables enterprises to deploy private clouds for sensitive IT workloads while leveraging public cloud resources to manage traffic spikes and scale operations as needed. Hybrid clouds offer enhanced agility and cost-effectiveness, making them ideal for balancing security and scalability in modern business infrastructure.

There are advantages and drawbacks of Hybrid cloud:

Advantages:

• Control
• Flexibility
• Cost-effectiveness
• Ease

Table 5 - Advantages and Disadvantages of Hybrid cloud

Hybrid cloud solutions integrate the strengths of both public and private clouds, offering flexible and scalable infrastructure. Amazon Web Services (AWS) stands out as one of the largest and most prominent hybrid cloud providers globally, renowned for its extensive public cloud offerings. In addition to its public cloud services, AWS also provides comprehensive platforms for hybrid cloud deployments, enabling organizations to optimize their IT environments with seamless integration, enhanced security, and operational agility.

Figure 14 - Amazon Web Services (AWS, 2022)

AWS Cloud connectivity enables direct connection from your network to any AWS Region or Local Zone, eliminating reliance on the public internet for improved reliability and low latency. Additionally, it provides recursive DNS services across both AWS and on-premises networks, ensuring seamless and secure DNS resolution.

AWS Storage Gateway is a powerful hybrid storage solution that enables on-premises applications to seamlessly access AWS cloud storage. With features like File Gateway, Tape Gateway, and Volume Gateway, it effectively supports hybrid cloud workloads, data backup and restoration, and disaster recovery strategies. This makes AWS storage solutions ideal for ensuring scalable, reliable, and secure data management across on-premises and cloud environments.

Cloud Service Models

Infrastructure as a Service (IaaS) is a cloud computing solution that enables businesses to rent servers for processing and storage without the need for physical hardware. It allows users to run any operating system or application on cloud-based servers while eliminating maintenance and operational costs. IaaS dynamically scales resources based on demand, ensuring flexibility and cost-efficiency. Additionally, it provides guaranteed Service Level Agreements (SLAs) for uptime and performance, simplifying data center management by removing the need for manual provisioning of physical servers (Comptia, 2022).

Infrastructure as a Service (IaaS) transforms traditional on-premises data centers by delivering computing, storage, networking, and supporting software like operating systems and databases as cloud-based services. While there are some drawbacks to the IaaS model, its numerous advantages, including cost-efficiency, scalability, and flexibility, make it a valuable solution for modern businesses. A detailed comparison of IaaS benefits and disadvantages highlights its potential to streamline IT operations and reduce infrastructure management overhead.

Advantages:

• Pay for What You Use: The fees are calculated based on consumption indicators
• Reduce Capital Expenditures: IaaS is typically a monthly operating expense
• Dynamically Scale: The model increases the capacity quickly during peak time and reduces capacity as needed
• Increase Security: Security technologies and expertise are substantially invested in by IaaS providers
• Reduce Downtime: IaaS allows for immediate outage recovery
• Boost Speed: Once IaaS computers are provisioned, developers can start working on projects
• Self-Service Provisioning: Access through a simple internet connection

Disadvantages:

• Unexpected Costs: Monthly expenses can pile up quickly, and peak usage may be higher than anticipated
• Process Changes: IaaS may necessitate changes to processes and workflows
• Complex Integration: Interaction with existing systems poses a number of challenges
• Limited Customization: Public cloud users may have limited control and customization options
• Managing Availability: Even the most well-known service providers have downtime
• Security Risks: New vulnerabilities may emerge as a result of the lack of direct control
• Security Risks: Businesses must be responsible for anything they put on their servers

Table 6 - Advantages and disadvantages of IaaS
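The pay-for-what-you-use idea from the table can be sketched as a small metering calculation. The instance types and hourly rates below are hypothetical, not real provider prices:

```python
# Hypothetical hourly rates; real IaaS pricing varies by provider and region.
hourly_rate = {"small": 0.02, "large": 0.16}

def monthly_bill(usage):
    """Fees follow consumption: instance type x hours actually run."""
    return round(sum(hourly_rate[kind] * hours for kind, hours in usage), 2)

# One small VM running all month plus a large VM for a 100-hour burst.
bill = monthly_bill([("small", 720), ("large", 100)])
```

This also illustrates the "Unexpected Costs" row: the burst instance alone costs more than the always-on one, so peak usage dominates the bill.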

Platform as a Service (PaaS) is a cloud-based environment that provides developers with comprehensive resources to build and deploy a wide range of applications, from simple cloud apps to complex business systems. It includes infrastructure such as servers, storage, and networking, along with middleware, development tools, and database management systems. PaaS supports the entire application lifecycle, including building, testing, deploying, administering, and updating web applications, making it a versatile solution for modern cloud development.

There are some advantages and disadvantages of the PaaS model:

Advantages:

• Cost-effective: There's no need to buy gear or pay for expenses when being offline
• Time savings: The core stack does not need to be set up or maintained
• Speed to Market: Increase the speed with which apps are developed
• Boost Security: PaaS providers make significant investments in security technologies and expertise
• Dynamically Scale: Increase capacity quickly at peak times and reduce capacity as needed
• Flexibility: Employees can log in and work on applications from anywhere they have an internet connection
• Custom Solutions: Developers have access to operational tools that allow them to construct custom software

Disadvantages:

• Dependence on Vendor: Extremely reliant on the capabilities of the vendor
• Risk of Lock-In: Customers may become locked into a language, interface, or software that they no longer require
• Compatibility: If PaaS is utilized alongside traditional development platforms, compatibility issues may arise
• Security Risks: While PaaS providers secure the infrastructure and platform, businesses are responsible for the security of the apps they create

Table 7 - Advantages and disadvantages of PaaS

Software as a Service (SaaS) enables users to access cloud-based applications via the Internet, including common tools like email, calendaring, and office software. It offers a comprehensive software solution that businesses can subscribe to on a pay-as-you-go basis, with the service provider managing all infrastructure, middleware, application software, and data. SaaS providers ensure the availability and security of applications and data through service agreements, allowing businesses to quickly deploy software with minimal initial investment. This model simplifies software management and enhances scalability for organizations of all sizes.

The table below shows the advantages and disadvantages of the SaaS model:

Advantages:

• Accessibility: SaaS provides the ability to run from any device using an internet browser 24/7
• Operational Management: Installation, equipment updates, and traditional licensing management are not part of the operational management
• Cost Effective: No upfront hardware expenditures and flexible payment methods such as pay-as-you-go models
• Scalability: A solution can be easily scaled to meet changing needs
• Increase Security: SaaS companies put a lot of money into security technology and expertise
• Data Storage: Data is saved in the cloud on a regular basis

Disadvantages:

• Loss of Control: Because the seller is in charge of everything, users are reliant on their abilities
• Limited Customization: The majority of SaaS solutions allow little in the way of vendor customization
• Slower Speed: SaaS solutions have a higher latency than client or server programs
• Security Risks: While the SaaS provider secures the application, sensitive data should be protected with extreme caution

Table 8 - Advantages and disadvantages of SaaS

Characteristics
• IaaS: Infrastructure as a Service (IaaS) refers to pay-as-you-go storage, networking, and virtualization services.
• PaaS: Platform as a Service (PaaS) refers to hardware and software that may be accessed via the internet.
• SaaS: Software as a Service (SaaS) is software that is accessible via the internet and is provided by a third party.

Delivery
• IaaS, PaaS, and SaaS are all delivered over the internet.

Uses
• IaaS is used by network architects.
• PaaS is used by developers.
• SaaS is used by end users.

Access
• IaaS provides users with access to resources such as virtual computers and virtual storage.
• PaaS provides application deployment and development tools with access to the runtime environment.
• SaaS gives access to the end user.

Model
• IaaS is a service model that uses the internet to deliver virtualized computing resources.
• PaaS is a type of cloud computing that provides tools for application development.
• SaaS is a cloud computing service model in which the host makes software available to clients.

Required technical expertise
• IaaS requires prior understanding of the subject in order to comprehend the fundamental setup.
• SaaS has no technical requirements because the provider will handle everything.

Popularity
• IaaS is popular among developers and researchers.
• PaaS is popular among programmers who specialize in the creation of apps and scripts.
• SaaS is popular among consumers and companies.

Cloud services
• IaaS: Amazon Web Services, Sun, vCloud Express.
• PaaS: Facebook, and the Google search engine.
• SaaS: MS Office Web, Facebook, and Google Apps.

Table 9 - Difference between IaaS, PaaS, and SaaS

Characteristic of Cloud

Cloud computing has five important characteristics; each is so essential that a service is not considered cloud computing if any of them is missing (Ankit, 2019).

On-demand self-service: Human administrators are not required for cloud computing services; users can deploy, monitor, and manage computing resources as needed

Broad network access: Computing services are typically delivered over regular networks and a variety of devices

Figure 18 - Characteristics of Cloud computing (AnkitMahali, 2023)

Rapid elasticity is a key feature of cloud computing that allows IT resources to scale quickly and efficiently according to demand. This means computing services can be scaled out or in as needed, ensuring users receive resources promptly when they request them. Once the user's needs are satisfied, the services are automatically scaled down, optimizing resource utilization and reducing costs.
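A toy autoscaling rule makes the scale-out/scale-in behaviour concrete; the 70% target utilization and the scale-in threshold are illustrative values, not any provider's actual policy:

```python
def scale(instances, load_per_instance, target=0.7, min_instances=1):
    """Scale out when demand rises, scale back in when it falls."""
    if load_per_instance > target:
        return instances + 1              # add capacity during peak demand
    if load_per_instance < target / 2 and instances > min_instances:
        return instances - 1              # release capacity once demand drops
    return instances                      # within the comfortable band

fleet = scale(2, 0.9)   # overloaded: grow the fleet
```

Run repeatedly against live load metrics, a rule like this produces exactly the prompt scale-out and automatic scale-down described above.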

Resource pooling involves sharing IT resources like networks, servers, storage, applications, and services in a flexible and ad hoc manner across multiple applications and users. This approach allows a single physical resource to efficiently serve multiple customers, optimizing resource utilization and reducing costs. It is a fundamental concept in cloud computing that enhances scalability and resource efficiency.

Measured service is a crucial principle that involves tracking resource usage for each application and tenant, providing both users and resource providers with detailed insights into consumption. This transparent monitoring supports accurate billing and efficient resource management, ensuring optimal allocation and accountability across cloud services.
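Measured service can be sketched as a per-tenant usage meter; the tenant names and GB-hour figures below are invented for illustration:

```python
from collections import defaultdict

class Meter:
    """Track resource consumption per tenant so usage can be billed."""
    def __init__(self):
        self.usage = defaultdict(float)

    def record(self, tenant, gb_hours):
        """Add one measured slice of consumption for a tenant."""
        self.usage[tenant] += gb_hours

    def report(self):
        """Transparent view of consumption for billing and capacity planning."""
        return dict(self.usage)

m = Meter()
m.record("atn-store-1", 10.5)
m.record("atn-store-2", 4.0)
m.record("atn-store-1", 2.0)
```

The same per-tenant figures serve both sides: the provider bills from them, and the user sees exactly what was consumed.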

Virtualization and Multicore

Cloud virtualization in computing involves creating virtual platforms for server operating systems and storage devices, enabling multiple machines and users to share a single physical resource. This technology enhances efficiency by allowing simultaneous access and utilization of resources, thereby improving scalability and cost-effectiveness. Additionally, cloud virtualization simplifies workload management, making traditional computing more flexible and efficient (DataFlair, 2022).

There are four types of virtualization in cloud computing:

• Operating System Virtualization: The virtual machine software in Cloud Computing operating system virtualization is installed in the host's operating system rather than directly on the physical system

• Hardware Virtualization: Hardware virtualization plays a crucial role in cloud computing by enabling flexible and efficient use of server resources. It allows multiple virtual machines to run on a single physical server, optimizing hardware utilization. This process involves installing virtual machine software directly onto hardware systems, abstracting and partitioning physical resources for various applications. As a result, hardware virtualization enhances scalability and reduces costs in cloud infrastructure management.

• Server Virtualization: Server virtualization in cloud computing involves installing specialized software directly on a physical server, enabling it to be partitioned into multiple virtual servers. This technology allows a single physical server to be divided based on demand, effectively balancing the load and optimizing resource utilization.

• Storage Virtualization: Storage virtualization in cloud computing involves aggregating physical storage from multiple network devices to create a unified, virtual storage resource. This process simplifies storage management, enhances scalability, and improves resource utilization by presenting a seamless storage environment. Implementing storage virtualization is essential for optimizing cloud infrastructure performance and ensuring flexible, cost-effective data storage solutions.
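Storage virtualization's aggregation step can be sketched as pooling disk capacities into one logical volume; the disk sizes are illustrative:

```python
class StoragePool:
    """Aggregate several physical disks into one unified logical volume."""
    def __init__(self, disks_gb):
        self.disks = list(disks_gb)       # capacities of the physical devices

    @property
    def capacity(self):
        # Users see a single virtual resource, not the individual disks.
        return sum(self.disks)

# Three physical disks presented as one 2 TB logical volume.
pool = StoragePool([500, 500, 1000])
```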

The advantages and disadvantages of Virtualization in Cloud computing are shown on the table:

Advantages:

• Security
• Economical
• Eliminates the risk of system failure
• Flexible transfer of data

Disadvantages:

• High cost of implementation
• Availability and scalability issues
• Requires several links in a chain that must work together cohesively
• Time consuming

Table 10 - Advantages and Disadvantages of Virtualization

Multicore processing (MCP) involves using multicore processors to enhance computer performance by executing multiple tasks simultaneously. A multicore processor is an integrated circuit with multiple cores, enabling software applications to run on several processing units concurrently. The primary objective of multicore technology is to boost system efficiency and capacity by supporting parallel computing. By integrating two or more CPUs into a single chip, multicore processors significantly improve overall system speed and performance while reducing heat generation and power consumption, making them essential for modern computing systems.

Multicore processors, which consist of two or more cores working together as a single system, are the most common application of this technology. They are widely used across various devices, including mobile phones, desktop computers, workstations, and servers, to enhance performance and efficiency.

Evaluate the deployment models, service models and technological drivers of Cloud Computing and validate

Overview Scenario

ATN is a leading Vietnamese toy retailer specializing in serving teenagers across multiple provinces, with annual revenues exceeding $500,000. Each store maintains its own local database to record daily transactions, but the company currently faces challenges in consolidating sales data, as store owners must send monthly reports to the board director, which is time-consuming and inefficient. Additionally, the management team lacks real-time inventory updates, hindering their ability to make quick, informed business decisions. Implementing a centralized data management system could streamline operations, improve data accuracy, and provide instant access to stock levels across all locations.

ATN Solution

ATN is a large company with high average revenue and an extensive data system, but its decentralized database approach makes data management time-consuming and delays real-time updates. Implementing cloud computing services is essential for ATN to streamline data integration, improve operational efficiency, and ensure timely access to up-to-date transaction data. Embracing cloud technology enables ATN to optimize data management, support scalable growth, and enhance decision-making processes.

Cloud computing enhances accessibility by allowing data to be accessed from any location, making it ideal for multi-location businesses like ATN. This enables managers to oversee branch store databases remotely, facilitating efficient management across multiple provinces. Additionally, the cloud increases data access flexibility, allowing companies to easily scale their resources up or down based on current needs.

Implementing cloud computing is easier than many companies assume, despite initial hesitations. The cloud offers up-to-date technologies that facilitate the seamless transfer and management of large data volumes. Moreover, cloud systems remove the barriers associated with traditional internal databases and enable faster data collection from external sources, enhancing overall data efficiency and agility.

Maintaining a large data warehouse is resource-intensive, straining ATN's budget for equipment, cooling systems, and security measures. By transferring data storage to a cloud service provider, ATN benefits from a more cost-effective solution that reduces the need for on-premises infrastructure. Cloud storage provides scalable, secure, and efficient data management, helping ATN optimize resources and enhance overall operational efficiency.

Deployment Model

There are four cloud deployment models: private, public, hybrid, and community clouds, each offering significant benefits in efficiency and revenue growth. For ATN, the public cloud is recommended due to its cost-effectiveness, scalability, and ease of access. The public cloud's support for flexible resource allocation makes it an ideal choice for ATN's operational needs. Additionally, public cloud services enable quick deployment and reliable performance, which can lead to increased income for the company.

Public cloud offers a highly flexible pricing model, enabling organizations to pay only for the resources they use by the hour, which helps keep IT expenses under control. Its ease of deployment allows businesses to set up infrastructure in just a few hours, as services can be quickly purchased online and remotely configured through the cloud provider's platform.

The cloud provider is responsible for maintaining the cloud's hardware, software, and networks, relieving ATN from concerns about infrastructure updates, security, and upgrades. Public cloud hosting offers flexible, pay-as-you-grow plans, eliminating the need for long-term commitments or significant upfront investments. This setup simplifies the engagement process, making cloud adoption straightforward and hassle-free.

Data is automatically replicated across public cloud data centers, eliminating the need for organizations to manage backups and reducing associated costs. This setup offers ATN high flexibility while minimizing wasted redundancy. Furthermore, the interconnected cloud system ensures seamless performance: if one server fails, workloads are automatically transferred to another, maintaining smooth and reliable operation for business-critical applications.

Service Model

For the ATN scenario, the PaaS service model of cloud computing is recommended. Several reasons support this recommendation:

Utilizing cloud provider-hosted capabilities simplifies application development by offering pre-built software features, such as database support and IoT tools, which save time, resources, and project budget. Cloud services also enhance security and stability by leveraging proprietary capabilities not directly exposed to users, resulting in implementations that are more efficient and adaptable than custom-built solutions.

Skills are highly transferable across the industry because cloud provider tools are standardized for all customers. Companies that rely on cloud provider features for IoT can more easily find professionals experienced with these tools than companies that develop their own IoT logic. PaaS solutions streamline support by standardizing common functions, making applications easier to manage. Additionally, PaaS platforms simplify deployment, redeployment, and scaling, reducing operational complexity and minimizing errors.

Programming Languages

Node.js is a powerful JavaScript server environment that runs code outside of a browser, making it ideal for real-time applications like streaming platforms, online games, time trackers, and social media apps. Its scalable, fast performance benefits businesses requiring quick data processing and seamless user experiences. Because it is built on JavaScript, the world's most widely used programming language, Node.js is accessible to a broad range of developers and organizations. This versatility and efficiency make Node.js an excellent choice for modern web development projects.

Node.js provides a variety of benefits, which is why it is recommended for ATN. Here are some of the most important:

• Node.js is easy to learn

• Node.js is particularly helpful in making the time-to-market cycle shorter

• Node.js is built for scalability: it can handle a huge number of concurrent connections

• Node.js enables teams to quickly develop a Minimum Viable Product

Database

MongoDB is a highly popular database solution known for its scalability and flexibility, making it an excellent choice for businesses of all sizes (Esayas, 2019). Its advantages include ease of use, high performance, and flexible data modeling, which cater to diverse business needs. For ATN, MongoDB is particularly recommended because it can handle large volumes of data efficiently and support real-time applications, ensuring reliable and scalable database management.

• Open Source: An open-source database provides peace of mind and a wide range of development options

• Flexible Use Cases: The MongoDB database was designed for online transaction processing, which is appropriate when ATN wants to develop an online trading system

• Data Structure with No Schemas: Data can use the fields and structures that make the most sense for each application

• Sharding: MongoDB's sharding feature allows ATN to distribute data across multiple machines, improving the performance of very large databases.

Cloud Platform

Render is a popular Platform as a Service (PaaS) that offers developers an easy way to build, deploy, and scale applications, similar to Heroku. While Heroku supports many programming languages, Render focuses on key languages such as Python, Node.js, Ruby, Go, and Rust, providing tailored support and optimized performance. Key benefits of Render include simplified deployment, scalable infrastructure, and support for multiple modern programming languages, making it an efficient alternative for developers seeking a reliable PaaS solution.

Render provides a free tier, similar to Heroku, enabling users to start without any upfront costs. This makes it an ideal choice for development, testing, and prototyping, letting developers explore and build without financial commitment.

Render offers a user-friendly, intuitive interface that simplifies setting up and managing applications. Developers can deploy their code and configure application settings with minimal effort, ensuring a seamless and efficient experience.

• Serverless Architecture: Render leverages a serverless architecture, which means developers don't have to worry about managing servers or infrastructure; they can focus solely on writing code and building their applications.

• Auto Scaling: Applications automatically adjust their resources based on demand, ensuring optimal performance during traffic fluctuations. By dynamically modifying CPU and memory allocation, auto scaling handles traffic spikes without manual intervention, maintaining seamless performance under varying workloads.

• Security: Render provides a protected environment for application development, including automated SSL certificates, encrypted data in transit, and secure access control, ensuring that applications and data are well safeguarded.

• Managed Databases: Render integrates with managed databases such as PostgreSQL, MySQL, and Redis, simplifying database setup and management. This lets developers concentrate on building their applications without worrying about backend infrastructure, enhancing development efficiency and accelerating deployment.

• Git-Based Deployments: Render integrates directly with Git repositories, so updating and deploying a new application version takes a single git push, ensuring rapid and efficient software updates.
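As an illustration of how such a Git-driven deployment might be declared, Render can read a `render.yaml` Blueprint from the repository root. The service name and commands below are hypothetical for an ATN Node.js app, and the exact field names should be checked against Render's current Blueprint documentation:

```yaml
# Hypothetical render.yaml for an ATN Node.js service (illustrative only;
# verify field names against Render's Blueprint documentation).
services:
  - type: web
    name: atn-sales-api          # hypothetical service name
    env: node
    plan: free                   # free tier, as mentioned above
    buildCommand: npm install
    startCommand: node server.js
```

With a file like this committed, each push to the linked repository would trigger a fresh build and deploy of the service.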

Overall, Render offers a user-friendly experience, automatic scaling, and robust security features, making it a viable alternative to Heroku for developers looking for a PaaS solution.

Cloud Architecture

Cloud architecture describes how technology components work together to create a pooled, shared resource environment over a network, enabling efficient cloud computing (source). Key components of cloud architecture include the front-end platform, the back-end platform, cloud-based delivery models, and the networking infrastructure, all working together to deliver seamless services. Auto scaling, specifically dynamic scaling, allows cloud resources to automatically adjust capacity in response to changing demand, ensuring optimal performance. The auto scaling process is triggered when specific metrics reach predefined target values, providing a responsive and scalable cloud environment (Estell, 2022).

Figure 20 - Cloud service consumers are sending requests to a cloud service (1) The automated scaling listener monitors the cloud service to determine if predefined capacity thresholds are being exceeded (2) (Estell, 2022)

The figure above shows an example of a Cloud architecture, the image shows that:

(1) Cloud service consumers are sending requests to a cloud service

(2) The automated scaling listener monitors the cloud service to determine if predefined capacity thresholds are being exceeded

During normal operation, requests are delivered to the cloud service as long as the workload stays within the capacity threshold. When the workload is below or equal to the threshold, requests proceed as usual. For instance, if the capacity threshold is 3 and the workload rises to 4, the threshold is exceeded and the current implementation can no longer serve the request, which triggers the scaling mechanism.

As the number of service requests from cloud consumers steadily increases, workloads can exceed the predefined capacity threshold. To address this, a predefined scaling strategy is employed, and the automated scaling listener determines the appropriate course of action. When the workload exceeds the threshold, the system follows a three-step process to ensure optimal resource management and maintain service quality:

• Step 1: If the cloud service implementation is deemed eligible for further scaling, start the scaling process immediately

• Step 2: Produce multiple copies of the cloud service by signaling the resource replication system, so that the requests of a large number of consumers can be met

• Step 3: The automated scaling listener continues to process consumer requests during the increased workload. If a request exceeds what even the scaled cloud service implementation can handle, no additional scaling is triggered; this keeps performance stable while exposing the limits of the configured scaling policy.

Posted: 02/08/2023, 22:34