Text/ITValue reporter Wu Ningchuan

From entering the public eye in 2008 to the recent $100 million U.S. federal government cloud computing contract contested by Amazon and Microsoft, cloud computing has been through a full seven years. In those seven years, information technologies built on the first generation of cloud computing — mobile computing, social networks, and big data — have set off a new movement in global business, creating personalized business models that put consumers and users at the center. The Docker wave that has swept the cloud computing field since last year is a big step toward bringing enterprises even closer to consumers and users. This article analyzes in depth what Docker is and how it will transform cloud computing, thereby incubating next-generation business models that can industrialize the production of personalized products and services.

What is Docker?

The answer: Docker is the next generation of cloud computing. The word "Docker" literally means "dock worker" — so what does this dock worker carry? A standardized "container", and inside that container is an application. Dock workers can receive standardized containers at any "dock" in the world that provides standardized connections, then quickly install, run, and manage the applications inside; and it is the various cloud service providers that supply those standardized docks. In this way, Docker brings application development and distribution in the cloud into the era of industrial production — and that is the essence of Docker. In a Docker environment, developers build programs to a fixed packaging standard and load the resulting standardized programs into standardized "containers".
Cloud service providers around the world offer standardized "docks" that can easily receive these standardized containers and the applications inside them, assemble those standardized applications into their own personalized solutions in a plug-and-play manner, and deliver them to end users. The standardized program architecture that corresponds to the Docker "container" is the now-familiar microservices. In the Docker era, the IaaS and PaaS layers of first-generation cloud computing merge into one, forming Containers-as-a-Service (CaaS) — the next-generation cloud computing architecture. CaaS gives enterprises the ability to produce general-purpose software industrially and then rapidly assemble that software into personalized solutions tailored to the individual needs of consumers and users; that is the next-generation business model.

Pioneers of the container era

According to second-quarter data released in July 2015 by the US market research firm Synergy Research, the global cloud service market is now firmly dominated by four vendors: AWS (Amazon), Microsoft, IBM, and Google together hold 54% of the global cloud service market, and their cloud businesses are growing at an average of 84% a year, against only 33% for the other cloud vendors in the market. Among them, AWS generated revenue of $1.82 billion in the second quarter of this year, up 81% year-on-year, and Microsoft has already invested $15 billion in its global data centers. Clearly, the four leading first-generation cloud vendors — and especially the three major public clouds run by Google, Amazon, and Microsoft — have left other cloud service providers far behind and firmly established the market pattern of first-generation public cloud.
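The "standardized container" idea described above can be made concrete with a Dockerfile, the packaging recipe Docker defines. The sketch below is purely illustrative — the base image, file names, and port are assumptions for the example, not details from this article:

```dockerfile
# A minimal, illustrative packaging recipe (all names are hypothetical).
# Any host with a Docker engine -- any "dock" -- can build and run it.
FROM python:3-slim            # standardized base layer pulled from a public registry
WORKDIR /app
COPY app.py .                 # the application loaded into the "container"
EXPOSE 8000                   # the standardized "connection" the service offers
CMD ["python", "app.py"]      # how the "dock worker" starts the application
```

Built once with `docker build -t myservice .`, the resulting image is the standardized container: it can be shipped to and started unchanged on any cloud provider whose "dock" speaks the Docker API.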
In 2014 and early 2015, Chinese engineers on the core technical teams of the first-generation mainstream public cloud providers began leaving their companies and returning to China to found startups in the container/Docker field. They include Lingque Cloud, founded by veterans of Microsoft's Windows Azure core technology team; Digital Cloud, from Google's advertising core technology department; DaoCloud, from the core technology teams of EMC and VMware; Hourspeed Cloud, from the IBM Bluemix and Alibaba Cloud Shield core technology teams; and Hyper Cloud, from the cloud computing team of the China Mobile Research Institute. What these startups have in common is that their founders all come from the core technical teams of first-generation public clouds: on one hand, they saw that containers/Docker are the mainstream trend of next-generation cloud computing; on the other, they left because the first generation of public cloud had basically matured. "Why is cloud computing talent concentrated in Seattle? Because Amazon and Microsoft are both in Seattle. Technical capability in cloud computing comes mainly from operations experience, and only three companies have experience operating and maintaining more than a million servers: Google, Amazon, and Microsoft," said Zuo Yue, founder of Lingque Cloud and former head of the container project on Microsoft Windows Azure's US core technical team. Chen Kai, co-founder and CTO of Lingque Cloud, also comes from that team, where he was in charge of Fabric Controller, Windows Azure's global scheduling system.
Wang Pu, founder of Digital Cloud, comes from the core technical team of Google's US advertising business. He told reporters that Google has the largest server fleet of any public service provider in the world: founded in 1998, Google operates and maintains tens of millions of servers worldwide; AWS and Microsoft Windows Azure follow, each operating millions of servers; IBM SoftLayer operates hundreds of thousands; and finally, regional public cloud providers in various countries operate anywhere from hundreds to thousands of servers. The operations experience and technical level of a public cloud provider can be judged directly from the scale of the servers it runs.

Evolving into the age of containers

What is a container? To answer that, we must mention a person and a company. The person is Ray Ozzie, Microsoft's famous second and last Chief Software Architect. In October 2005, shortly after joining Microsoft, Ozzie released a memo called "The Internet Services Disruption", whose main purpose was to push Microsoft's overall transformation toward Internet services. In it, Ozzie proposed the famous "seamless user experience" enabled by service-oriented software architecture, spelling it out as seamless communication, seamless productivity, seamless entertainment, a seamless marketplace, a seamless operating system, seamless solutions, and seamless IT — the core being the seamless operating system, seamless solutions, and seamless IT. When it comes to putting those three into practice, Google is the pioneer. Chen Hao — a well-known blogger, former Amazon China R&D manager, and former Alibaba senior expert — has a vivid metaphor for it: cloud computing "is getting Mercedes-Benz performance out of a Xiali" (a Xiali being an inexpensive Chinese compact car).
Google is the hardcore player that actually got Mercedes-Benz performance out of a Xiali; as an Internet company, it is the originator of much of today's Internet technology. Wang Pu said that Google operates and maintains tens of millions of servers around the world, far beyond the upper limit of many existing technology suppliers, so Google has had to invent many of its own technologies for managing servers at that scale. "For example, Google has developed a top-notch network switch that even Cisco couldn't build." The reason is simple: no one else has a comparable environment to test in. Since its founding in 1998, Google has developed and stockpiled a great deal of "nuclear-grade" technology, but most of it is kept inside Google and can only be glimpsed through the academic papers Google publishes. To avoid the expensive virtual machine-based virtualization products on the market, and to release its own software and services faster and more cheaply, Google developed a new container-based virtualization technology from the start, using it to standardize the underlying operating system environment that all Google services run on. At the CNUTCon Global Container Technology Conference in August 2015, Dawn Chen, a software engineer on Google Cloud Platform with more than eight years at Google, said that when she joined eight and a half years earlier, Google was just starting to develop container technology and the container team had only three people, including her. Today, all of Google's services run in containers — Gmail, Maps, the GFS file system, MapReduce, and more — and Google launches more than 2 billion containers per week. With containers, Google has effectively realized the vision of a "seamless operating system".
Anyone with even a little knowledge of cloud computing knows how important virtual machines are to the IaaS layer; changing how virtualization is done changes the structure of the IaaS layer itself, which is why containers are the model for the next generation of cloud computing. Containers are essentially an operating system technology — operating system-level virtualization. Application software built on containers can be developed in one place and run anywhere, regardless of the operating system or IaaS cloud environment underneath. That is precisely the idea of the "seamless operating system", which in turn corresponds to "seamless solutions" and "seamless IT".

Docker unifies the world's containers

Docker amounts to a standardized container — the latest result of some 30 years of container technology development. As an operating system-level virtualization technology, containers date back to 1982, when the chroot mechanism introduced in Unix became what is generally recognized as the origin of operating system-level virtualization, the earliest form of container technology. From then on, operating system virtualization was intertwined with the development of the Linux kernel and the Linux operating system. And because operating system virtualization mainly targets cheap x86 servers, the evolution of server chips from Intel and AMD also shaped the development of container technology. In 1991, Linus Torvalds, then a student at the University of Helsinki in Finland, wrote a Linux kernel for Intel 386 machines, and on top of that kernel different vendors built commercial Linux operating systems. In January 1995, Red Hat was founded and launched the Red Hat Linux "distribution".
The Linux kernel kept evolving, and it was not until 2007 that reasonably mature container technology entered the kernel — a development that also benefited from the 64-bit server chips Intel and AMD launched around 2005: the significant increases in CPU capability and memory capacity made it practical to virtualize multiple isolated spaces within one operating system. In 2008 the LXC (Linux Containers) open source project was established and container technology began to be widely used in industry; the same year, Microsoft launched the first generation of its Windows Azure public cloud. In 2010, a startup called dotCloud was founded in the United States. dotCloud began as a PaaS platform built on LXC, with the idea of providing a development cloud platform that spanned the underlying IaaS clouds and supported multiple development languages. In early 2011, dotCloud raised $10 million in Series A funding. dotCloud originally ran on AWS EC2, but as more public cloud providers entered the market, its vision became hard to realize with one company's proprietary technology. So dotCloud's founders simplified and standardized their LXC-based container technology, named it Docker, and open-sourced it; Docker and its open source community caught on quickly, and Docker later helped launch the Open Container Initiative (OCI). On October 29, 2013, dotCloud renamed itself Docker. Vendor after vendor then announced support for Docker, and as of August 2015 the public registry maintained by Docker had published more than 180,000 applications.
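The distribution model behind that public registry is what makes the "dock" metaphor work in practice. As an illustration only — these commands require a host with a Docker engine installed, and the image name is just an example of the 180,000-plus published applications:

```shell
# Illustrative sketch: fetch a standardized "container" from the public
# registry and plug it into the local "dock". Requires a running Docker engine.
docker pull nginx                 # receive a pre-built image from the registry
docker run -d -p 8080:80 nginx    # start the application it contains
docker ps                         # list the containers running on this host
```

The same three commands behave identically whether the "dock" is a laptop, an on-premises server, or any provider's cloud — which is exactly the plug-and-play assembly the article describes.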
It can be said that x86 PC servers were the winners among heterogeneous hardware architectures and Linux and Windows the winners among heterogeneous operating systems; Docker now provides a unified virtual operating environment for cloud data centers built on x86 servers and Linux/Windows, and the era of heterogeneous architectures is drawing to a close.

The era of containers has arrived

Beyond the startups, the large vendors have been quick to follow. In October last year, Microsoft announced plans to bring container technology to Windows Server, unveiling a partnership with Docker to guarantee a unified, open experience across both Linux and Windows Server. Although Linux containers and Windows containers are mutually incompatible because they sit on different operating systems, the container management layer is unified. More recently, Microsoft further announced that Windows Server 2016 will ship both Windows Server containers and Hyper-V containers, each supporting the Docker API and the Docker client. VMware, another company hit hard by Docker, could not wait to announce its own support at VMworld 2014. VMware's attitude toward containers is positive: despite the competition between containers and hypervisor-based virtual machines, VMware continues to expand its collaboration with the container ecosystem. At VMworld 2015 at the end of August, VMware introduced a series of new Docker-enabled technologies and proposed a new architecture for fully supporting containers in the future, allowing Docker applications to run inside virtual machines. AWS, for its part, launched the EC2 Container Service (ECS) in November last year, which frees users from installing, operating, and scaling their own cluster management infrastructure: Docker applications can be started and stopped with simple API calls.
Huawei has always been an active sponsor of open source projects, foundations, organizations, and summits. In 2015, Huawei joined both OCI and the Cloud Native Computing Foundation (CNCF) as a founding member — the only Chinese company on either list. Liang Chenye, a senior R&D engineer at Huawei's Open Source Competence Center, said at the 2015 CNUTCon Global Container Technology Conference that Huawei actively participates in the Open Container Test project (OCT) and works with OCI to promote the implementation and adoption of open container standards. Since 2015, Huawei has ranked among the top three contributors to the Docker community, and the earliest Docker community maintainer in China came from Huawei. Beyond domestic cloud providers such as Huawei and Alibaba, Chinese Internet companies including Tencent, Baidu, 360, JD.com, and Sohu have been putting container technology into full practice since 2011. According to Liu Haifeng, chief architect of the JD Cloud platform, speaking at the same conference, JD.com began introducing Docker in October 2014 and made it a strategic project in February 2015; during the "618" shopping festival in 2015 it had more than 11,000 container instances deployed in production, serving more than 1,000 applications, and since August 2015 its new data center has adopted container technology across the board. JD.com now runs more than 20,000 Docker instances, a number expected to double by the end of the year, by which time most of JD.com's applications will be released through Docker. JD.com's longer-term Docker vision is to manage all of its machines through Docker, fully decouple applications from physical resources, automate system maintenance end to end, and let R&D staff focus on developing new applications.
For all that, Google is in fact the biggest contributor to containers. Docker itself is written in Go, the open source programming language Google released in 2009. Launching more than 2 billion containers every week has driven Google to invent many key container technologies, including a cluster-wide container management system whose first version was called Borg, followed by a successor called Omega; this management system is what lets Google use containers across its enormous cluster resources. Later, drawing on Google's published academic papers, the industry built systems modeled on Borg, such as Mesos, which is now used by Airbnb, Twitter, Apple's Siri, and others.

Personalized business: the enterprises of the future are software companies

Douglas M. Baker, Jr., chairman and CEO of the Fortune 500 company Ecolab, said in PwC's 2015 Global CEO Survey: "No company can take for granted that today's business will guarantee future success. People assume large companies are safer and more stable, but the past 50 years have proven the opposite. If an enterprise cannot keep changing rapidly, its risk can only increase, not decrease." The ability to change has clearly become the core competitiveness of future business. As the influence of the Internet industry grows, its incursion into traditional industries is becoming more intense and more profound — especially with the strong push of China's "Internet Plus" national strategy — and the enterprises of the future will be ever more Internet-based. In such a fully Internet-based business environment, future enterprises will all possess software capabilities to some degree: beyond structuring part or all of their business on the Internet, they will rely on software to deliver personalized services to consumers and users.
If the enterprises of the future are software enterprises, the importance of Docker is easy to imagine. In mid-2015, Adrian Cockcroft, a technology fellow at the veteran Silicon Valley venture firm Battery Ventures, released a 2015 cloud white paper reviewing industry milestones and future prospects. Cockcroft was previously cloud architect at the US streaming service Netflix, a founding member of eBay Research Labs, and a Distinguished Engineer at Sun Microsystems, where he was chief architect of the high-performance technical computing group. He believes Docker will gradually grow into a standardized production tool — one more reflection of how widely Docker has been accepted. The challenge for Docker, Cockcroft says, is to manage its ecosystem carefully while rapidly adding the features needed to support production deployments; so far, Docker has prevented the ecosystem from splitting. Today even the chip maker Intel has crossed over into the Docker ecosystem: in May it launched Clear Linux, a container-centric OS project currently in an experimental phase, and Intel says the system will eventually be ready for production environments. Plainly, on the road to the business of the future there will be more and more such boundary-crossing. According to Thomson Reuters, by November 2014 there had already been 10,330 mergers and acquisitions in the United States that year, worth a total of $1.9 trillion. These deals are increasingly about expanding business alliances — using mergers or acquisitions of companies with different businesses or capabilities to create value that no single company could achieve alone. According to PwC's 2015 Global CEO Survey, about 44% of CEOs in the United States planned to launch a new strategic alliance within 12 months of the survey.
In the past, business alliances were mainly about connecting with suppliers or users; in the future, more and more CEOs will choose to ally with competitors, with startups, or with companies in entirely different fields. Of course, many Docker experts — including Sun Hongliang, a member of DaoCloud's core team — have said on various occasions that Docker is still in the early stages of development, with challenges such as weak networking, security concerns, and the difficulty of running traditional workloads. It is precisely these immaturities that have given rise to a crop of Docker startups in the United States and China, each bringing its own strengths and boldly positioning itself for the next generation of cloud computing and the next generation of business. The cloud computing team of Zhejiang University's Software Engineering Lab (SEL), one of the earliest groups in China to work on container and Docker research, was established in 2011 to build, analyze, and study open source cloud computing technology. In its new book "Docker: Containers and Container Cloud", the SEL team writes: "Docker, riding on container technology, has quickly become a treasure in the hands of major cloud computing vendors and developers at home and abroad. Amid all the heat, a new revolution has quietly arrived."