Hi everyone. Welcome to the seventh chapter in our Tencent Cloud Developer Associate course, Serverless Architecture. At the end of this chapter, you'll be able to understand the concept of serverless architecture and the concepts and features of Serverless Cloud Function, SCF. In this chapter, we'll cover two sections: an overview of serverless architecture and an overview of Serverless Cloud Function, SCF. Let's get started with Section 1, overview of serverless architecture. In this video, we'll cover the background of serverless architecture, an overview of serverless architecture, and the use cases and development trends of serverless architecture. In 2006, the release of Amazon Web Services set off a wave of infrastructure change away from physical machine hosting. Cloud computing enabled the decoupling of computing resources from physical hardware. The development of virtualization technology made Cloud hosting the standard infrastructure of enterprises. At the same time, Infrastructure as a Service, IaaS, began to be widely adopted. With the popularity of container technology, Platform as a Service, PaaS, also emerged, with features such as operating system installation, monitoring, and service discovery. Although PaaS has been widely used, there was still room for optimization to completely separate the business application from the underlying infrastructure. Serverless architecture began to be practiced as a result. Since the burden of development and operations can be reduced by using Serverless Cloud Function, SCF, developers can focus on the development of applications without worrying about server operations. With Function as a Service, FaaS, you can focus more on business innovation and improving productivity. Infrastructure architecture always evolves with the development of software architecture.
In the era of monolithic architecture, the application was relatively simple, since overall deployment of the application, iterative updates of the business, and the resource utilization efficiency of the physical server were sufficient to support the deployment of the business. However, as business complexity rapidly increased and the functional modules grew complex and large, the monolithic architecture became a bottleneck for development and deployment efficiency. As a result, the microservice architecture, in which separate modules can be developed and deployed in parallel, has become more and more popular. Virtualization technology then bridged the gap to physical resources and reduced the burden of managing the infrastructure. Container services further abstracted the operating system and provided applications with their dependent services, operating environment, and underlying computing resources. This improved the overall efficiency of application development, deployment, and ops. However, some problems still existed after applications were transformed into microservices. First, can the resource granularity be more refined than the container? Second, can the application granularity be more refined than services? Third, can we avoid considering the runtime? Fourth, can we avoid paying attention to the management of computing resources? In addition, is there an architecture that is able to delegate the management of all resources, help eliminate the need for infrastructure operations, enable users to focus on high-value businesses, and further increase the productivity of software applications and operations? To meet such needs, the idea of serverless architecture was born. Serverless is an architectural concept whose core idea is to abstract the infrastructure into various services and provide these services for users to call in the form of APIs.
Serverless architecture completely separates the applications from the infrastructure, eliminates the need for infrastructure ops, and therefore enables developers to focus on the development of the application logic without having to worry about server operations. Serverless architecture invokes computing resources only when an event is triggered and implements on-demand scaling and pay-as-you-go billing. The term serverless can be traced back to the article "Why The Future Of Software and Apps Is Serverless," published by Ken Fromm in 2012. At that time, the Cloud computing world still revolved around the concept of the server, and Cloud users still needed to consider the quantity and price of servers and when to expand capacity. Ken proposed that serverless does not mean that there are no physical servers, but that Cloud application developers no longer need to consider servers, and Cloud users can use Cloud resources without worrying about physical capacity or upper-limit management issues. As mentioned, this architecture eliminates the need for traditional massive online server components, reduces operating costs, shortens the delivery cycle of business systems, and enables users to focus on the development of business logic with higher value. Since a large number of services are maintained by vendors, this also binds the serverless architecture to those vendors. Most serverless providers provide services in the form of a Function as a Service, FaaS, platform: computing resources that can be directly used and are operated according to the established business application logic. Serverless architecture refers either to applications that rely heavily on third-party services, known as Backend as a Service, BaaS, or to applications whose custom code is executed and managed in short-lived containers; the latter is known as Function as a Service, FaaS.
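To make the FaaS model concrete, here is a minimal sketch of a cloud function in Python. It follows the general handler shape used by SCF's Python runtime, where the platform calls an entry function with an event payload and a context object; the event fields shown here (an API Gateway-style query string) are illustrative assumptions, not an exact contract.

```python
import json

def main_handler(event, context):
    # The platform provisions the container, invokes this function once
    # per trigger, and bills only for the execution time consumed.
    # "queryString" mimics an API Gateway request; field names are assumed.
    name = event.get("queryString", {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

Note that there is no server setup, port binding, or process management in the code: the function only expresses business logic, which is the separation of concerns described above.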
The serverless architecture also includes services such as databases, authentication, API gateways, orchestration, and other services specific to a particular domain, such as live streaming services and AI services. The serverless architecture can achieve approximately 100% utilization. PaaS applications either run at a specific scale or scale very slowly, and they incur some overhead costs due to scaling operations, such as having unused and idle instances waiting to receive requests. By contrast, there are no costs incurred when a serverless service is not in use. A serverless service can scale to serve millions of users almost instantaneously, and its cost depends on its usage. The use cases of serverless architecture can be divided into the following categories. First, it is very common to build application backend services using serverless architecture technology, since the serverless architecture allows developers to build the backend services based on the Cloud platform and focus more on optimizing the mobile applications. Second, serverless architecture is suitable for Internet of Things, IoT, backend services. In IoT scenarios, the amount of data transmitted by devices is small. Data transmission is often carried out at fixed time intervals, and there are obvious peaks and troughs during data transmission. The peak period of data transmission triggers the centralized processing of backend function services, which can be released quickly after processing to improve the utilization efficiency of resources. The third use case is AI inference and prediction. The call demand of AI inference and prediction changes with the fluctuation of business needs, which is different from the fixed computing cycles and runtime of AI training. At the same time, AI inference typically uses GPU acceleration, and provisioning for significant spikes would lead to resource waste. Using serverless architecture technology can effectively solve these problems.
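The pay-as-you-go point can be illustrated with a small calculation. FaaS platforms commonly meter resource usage in GB-seconds (memory in GB multiplied by execution time in seconds) plus a per-invocation fee; the unit prices below are hypothetical placeholders, not actual SCF pricing.

```python
def monthly_faas_cost(invocations, avg_duration_ms, memory_mb,
                      price_per_gb_s=0.0000167,      # hypothetical unit price
                      price_per_million_req=0.2):    # hypothetical unit price
    # Resource usage metered as GB-seconds:
    # memory (GB) x execution time (s) x number of invocations.
    gb_seconds = (memory_mb / 1024) * (avg_duration_ms / 1000) * invocations
    request_cost = invocations / 1_000_000 * price_per_million_req
    return gb_seconds * price_per_gb_s + request_cost
```

With zero invocations the cost is exactly zero, which is the key contrast with PaaS instances that sit idle waiting for requests.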
When service requests rise, the execution instances of the Cloud Function automatically expand to meet the increase in business needs. In contrast, the Cloud Function automatically shrinks or stops to save resources in case of a decrease in business needs. In addition, tasks that run only briefly each day, such as asynchronous parallel computing, input and output processing, and network access, are very suitable for the serverless architecture, because these tasks can elastically consume the resources they need when they run and avoid incurring costs when they are not being used. The last use case is event-based content processing applications, such as real-time file processing and customized event triggering. For example, some applications might need to crop pictures to different sizes or add different labels. When a picture or video stream is uploaded through the object storage, it triggers a Cloud Function that automatically processes it on demand according to the configured rules. In terms of customized event triggering, let's take the scenario of sending an e-mail to verify a registered e-mail address as an example: the customized event can trigger a subsequent registration process without configuring additional application servers to process subsequent requests. In terms of development trends, despite the fact that serverless architecture technology has been around for a relatively short time, it has managed to attract widespread attention. With further application of this technology, serverless architectures will adapt to more and more scenarios, and the ecosystem will continue to evolve. The development trends of serverless technology can be viewed from the following aspects. First, the serverless architecture will have more specific function specifications that help improve resource utilization efficiency.
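The file-processing use case above can be sketched as an event-triggered function. The event shape below mimics an object-storage upload notification in the style of a COS trigger; the exact field names are assumptions for illustration, not the precise SCF event contract.

```python
def extract_uploaded_keys(event):
    # Pull object keys out of a COS-style upload notification
    # (field names "Records" / "cos" / "cosObject" / "key" are assumed).
    keys = []
    for record in event.get("Records", []):
        key = record.get("cos", {}).get("cosObject", {}).get("key", "")
        if key.lower().endswith((".jpg", ".png")):
            keys.append(key)
    return keys

def main_handler(event, context):
    for key in extract_uploaded_keys(event):
        # In a real function this would download the object, crop or
        # resize it, and write the result back to another bucket/prefix.
        print(f"would generate thumbnails for {key}")
    return "done"
```

The function only runs when an upload event fires, processes the batch of objects in that event, and then its instance can be released, matching the on-demand pattern described above.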
Second, the toolchain ecosystem of serverless architecture technology will continue to expand. Third, Kubernetes will become an important infrastructure that allows users to run serverless applications in a Cloud environment without worrying about vendor lock-in. With the further improvement of Kubernetes deployments around Cloud-native architectures, the future of serverless architecture frameworks based on Kubernetes is bright. As a review of what you've learned in this section, let's go over the following reflection question. What are the components of serverless architecture? You can pause the video for a few minutes and think about how you would answer this question.