David Linthicum. (Via Cloud Technology Partners)

Amazon Web Services’ recent re:Invent conference in Las Vegas highlighted the excitement and momentum that public-cloud computing is generating these days, perhaps epitomized by shipping line Matson’s announcement that it is abandoning its own data centers and moving its entire IT operation to AWS.

But such wholesale moves to the public cloud, where organizations rent all their computing, storage and networking resources rather than buying and maintaining their own hardware, remain unusual, David Linthicum, a podcasting pundit who is senior VP of Boston consultancy Cloud Technology Partners, said in an interview. The great majority of organizations using the public cloud, about 85 percent by his estimate, are also using data centers, whether their own or those of a managed-service provider (MSP) or a private hosting company.

Such straddling will remain the norm for a long time to come, Linthicum predicted. There’s simply too much computing and data that can’t cost-effectively be moved to the cloud. That means there will remain a strong need for tools and expertise that help the data center and the public cloud communicate. And there will always be a need for privately controlled infrastructure, for legal and privacy reasons.

“You can’t have systems that exist on an island, so people are using baling wire and duct tape to bind together” data in the cloud and in data centers, he said.

Executives “standing up and saying ‘We’re gonna move every one of our workloads to the cloud’ — I see that done all the time, and making that occur without killing your business is tricky,” Linthicum said. “I’d say most organizations are going to hit their heads at about 70 percent. With that much workload and data moved to the cloud, they’re going to have to have traditional systems that MSPs or private hosting companies take over” if they want to close their own data centers.

Why can’t everything be moved to the cloud? Sometimes because the apps are too archaic, like a COBOL inventory application tied to a DB2 database, he said. “There might be no analog for that in the public cloud, so you’d have to rewrite the software, and if that’s going to cost $5 million, it’s economically unfeasible.”

It’s not as though AWS and rival Microsoft Azure are ignoring this issue. AWS offers linkage products including Virtual Private Cloud (VPC), Virtual Private Gateway and Hardware Virtual Private Network. Dan Zelem, CTO of Johnson & Johnson, said at re:Invent that his company has 120 applications running on VPC and plans to triple that number over the next year. Microsoft, for its part, offers an early, limited version of Azure Stack, intended to let organizations run the same services in the cloud and in their own data centers, along with its Operations Management Suite.

But still, “in essence, it’s ‘Bring your own solutions,’” Linthicum said. “You have to figure out how to make your traditional systems work with the stuff that’s in the public cloud.”
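For a sense of what that stitching work looks like in practice, here is a minimal sketch, not drawn from Linthicum’s remarks, of how a team might link a data center to an AWS VPC through a Virtual Private Gateway and a hardware VPN connection, using AWS’s boto3 Python library. The IP address, ASN, VPC ID and CIDR block are all hypothetical placeholders.

```python
# Minimal sketch (illustrative only): tie a data center to an AWS VPC
# with a Virtual Private Gateway plus a hardware VPN connection.
# Assumes boto3 credentials are configured; all IDs/addresses are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# The customer gateway represents the data center's on-premises VPN device.
customer_gw = ec2.create_customer_gateway(
    Type="ipsec.1",
    PublicIp="203.0.113.10",  # hypothetical public IP of the on-prem device
    BgpAsn=65000,             # hypothetical private ASN
)["CustomerGateway"]

# The virtual private gateway is the AWS-side anchor, attached to the VPC.
vpn_gw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]
ec2.attach_vpn_gateway(
    VpnGatewayId=vpn_gw["VpnGatewayId"],
    VpcId="vpc-0123456789abcdef0",  # hypothetical VPC ID
)

# The VPN connection joins the two gateways over an IPsec tunnel.
vpn_conn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=customer_gw["CustomerGatewayId"],
    VpnGatewayId=vpn_gw["VpnGatewayId"],
    Options={"StaticRoutesOnly": True},
)["VpnConnection"]

# With static routing, tell AWS which on-premises network sits behind the tunnel.
ec2.create_vpn_connection_route(
    VpnConnectionId=vpn_conn["VpnConnectionId"],
    DestinationCidrBlock="10.0.0.0/16",  # hypothetical data-center network
)
```

Even a setup this simple leaves routing, monitoring and failover to the customer, which is where the “baling wire and duct tape” Linthicum describes comes in.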

Asked which public-cloud provider makes the public-cloud/data-center straddle simpler, Linthicum apologized for falling back on the consultant’s favorite phrase: “It depends.”

AWS “can work and play well with anything, because of all its features and functions and the huge number of companies that are in orbit around it providing services and technology that make it easier to play with traditional systems,” he said. “If you have an eclectic array of stuff, with mainframes and LAMP (Linux, Apache, MySQL, and PHP/Python/Perl) stacks and MySQL systems that have been around for years, then AWS is typically the path of least resistance.”

On the other hand, for a Microsoft shop heavily into .NET and SQL Server, “Azure will be the course of least resistance.” And, of course, there are exceptions to those rules: “A lot of Microsoft shops work and play well with AWS, and AWS can run .NET stuff.”

As to third-place Google Cloud, “for now it’s typically not a one-stop shop like AWS and Azure, but it can be the cheapest way to buy storage or compute, or to combine them,” he said.

During its ten-year life, the cloud-computing marketplace has grown more complex and confusing, Linthicum said.

“We used to have ‘hybrid cloud,’ which usually referred to a combination of public and private cloud,” he said. “Now we’re not seeing private cloud at all” on consulting projects. “It hasn’t gotten the traction everyone expected, and it seems to be getting worse as AWS and Azure pump up their offerings.” The private cloud, exemplified by OpenStack and Eucalyptus, offers cloud-like services on user-owned hardware, and some see it as mainly an effort among hardware-makers to hold onto their revenue streams.

So Linthicum proposed a new term: “pragmatic hybrid cloud.” It meets NIST’s definition of hybrid cloud and acknowledges the straddle but also implies the decline of the private cloud for general purposes.

“Six years ago, it was ‘public, private, hybrid,’ and everyone memorized the definitions,” he said. “Now that we’re actually doing stuff, it’s morphing all over the place. It’s getting very complex out there.”
