GCP Next 2016: Don’t delay, the cloud awaits


San Francisco, California—The cloud is no longer a dispensable luxury reserved for the biggest of companies. Today, as the service has become steadily more affordable, the advantages it proffers for any enterprise are too large to ignore. Multinational tech giant Google sees the big picture of the public cloud as “a network, not just a collection of data centers.” Upon global infrastructure developed over the course of a decade, Google has built products that “billions of users can depend on,” such as Gmail, Search, and Maps. You could say that cloud ubiquity is the true end goal.

Google wields its expertise proudly at the forefront of the cloud frontier; in fact, it invited a whole delegation of media practitioners to San Francisco to witness firsthand its “GCP (Google Cloud Platform) Next 2016” global user conference. The annual conference serves as a signpost for the state of cloud technology and adoption. Attendees get to learn how startups and global enterprises are using GCP, work hands-on with cloud technology, hear directly from the engineers building the next generation of cloud services, and, of course, network with others in the community.

In a release, Google declared that, “in the last year, cloud has gone from being the untrusted option to being seen as a more secure option for many companies. We know that compliance, support, and integration with existing IT investments is critical for businesses trying to use public cloud services to accelerate into new markets.”

It’s all about enabling businesses to adopt the cloud into their operations safely and confidently, and to leverage its competitive advantages. Google is of the opinion that enterprises can no longer put off getting into the cloud. Appropriately, vendors in the space are doing the prep work to support enterprise workloads.


With the advent of so-called big data, organizations are hard-pressed to come up with commensurate big insights from it. Google reports that 77 percent of enterprises believe their big data and analytics deployments fail to meet expectations, due in part to “siloed teams, high-maintenance gear, and the need for better tools.” The cloud addresses this by making said tools and data more accessible to so-called “citizen data scientists.”

The cloud also helps unlock the full potential of computers and data centers—harnessing machine learning’s benefits for more enterprises. Significantly, strides in cloud augur well for the Internet of Things. “Scale of ingestion and stream-based data processing will become part of every IT strategy,” Google maintains on its official blog. By 2019, the enterprise market is expected to account for 9.1 billion of 23.3 billion connected devices.

Cognizant of a connected future, Google announced a significant infrastructure investment with platform additions in the western United States (Oregon) and Asia (Tokyo). “As always, each region has multiple availability zones so that we can offer high-availability computing in each locale,” the company said in a release.

The company isn’t done yet. These two sites will be augmented by “more than 10 additional GCP regions” through 2017. Last year, Google commenced operations at a GCP region in South Carolina to accommodate the company’s services.

The opening of new regions is in keeping with Google’s vision of letting clients deploy nearer to their own customers—promising lower latency and greater responsiveness. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance,” says the company.

Those who are still put off or intimidated by the idea of the cloud should give the Google Cloud Launcher a look. Thoroughly updated, its marketplace now boasts solutions “no more than a few clicks away.” Clients can “find the right configuration and specs for a solution more easily. Every solution now features a customized set of pre-configuration options as well as ready-to-use defaults.”

Google Cloud Platform Director of Product Management Greg DeMichillie said at a press session held at the company’s Spear Street office that the company “has gone so far” with the cloud, even if it might still feel novel to some. “There’s a lot of excitement about cloud right now as a brand-new thing,” he averred, and conceded that “in many ways, this is true… it’s a step in the evolutionary process that’s been going on for a decade, and will continue to go on.”

DeMichillie recounted that the first form of the cloud, if you will, was merely a co-location facility. “You would go rent 20 feet by 20 feet of cage space, put your own equipment in it, then the provider would give you power ports and plugs—but you managed everything.” It was merely about outsourcing infrastructure to somebody else.

The Google executive continued that the second wave of the cloud featured virtualized data centers. “You can put an API (application program interface) in front of these, and instead of managing physical servers, you manage virtual servers.”

But while it drove another wave of innovation, DeMichillie stressed: “If you’re using a virtualization product, you’re still managing and patching servers. You’re still specifying how much CPU, memory, disk… you’re still configuring networks… the work you do hasn’t really changed.” Even if it cut hardware provisioning to “minutes instead of months,” virtualization, said the executive, doesn’t “fundamentally change” the way an IT organization works.


“We think that the direction that the cloud is going to is to one where you don’t manage individual disks and storage and CPUs. Instead, you have an elastic cloud—a sea of compute capacity that you just work on at the level of applications and services.”

An elastic, on-demand cloud is exactly what Google trumpets through its expanding product portfolio, such as BigQuery. “With most data warehouse products, your first step is to configure machine sizes. With BigQuery, you just upload a petabyte of data and run a query, and everything else is managed for you,” explained DeMichillie. “You see it in products like App Engine where you just write code and App Engine takes care of scaling based on demand. Those are little glimpses of what cloud will look like. If we look at how we developed systems at Google, as (in) other web-scale companies like Facebook and Yahoo, all of us realize that the secret to really being efficient in the use of the cloud is to move to a model where you’re operating at a higher level.”
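For readers curious what that “just run a query” experience looks like in practice, here is a minimal sketch using Google’s publicly documented BigQuery Python client. The SQL and the public dataset referenced below are illustrative choices for this article, not examples shown at the conference.

# Minimal sketch: submitting a query to BigQuery and reading the results.
# Assumes the google-cloud-bigquery library is installed and credentials
# are configured in the environment; the dataset and SQL are illustrative.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials automatically

# BigQuery manages storage, scaling, and execution; the caller supplies only SQL.
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

for row in client.query(query).result():  # blocks until the query job finishes
    print(row.name, row.total)

Note that nowhere in the snippet does the caller size a cluster or pick machine types, which is the point DeMichillie was making about working “at the level of applications and services.”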

Google believes the world is on the cusp of the third wave of cloud evolution. “We’re on the verge of that being made real for companies,” DeMichillie continued. This actual, global, elastic cloud will allow developers to invest their talent and energy in great apps without needing to worry much about backroom concerns.

The pitch to trust Google and its services is pretty convincing. “You don’t get experience in operating clouds any other way than through actually operating clouds,” said DeMichillie. “We’ve been at this for 15 years. We have made all the mistakes that can be made. We have the bruises to show for that, and, along with that, it’s those experiences that allow us to know how to operate systems at scale—to know on your behalf the best way to manage petabytes of data.”
