The Web at 20!

The most recent edition of EMC’s ON Magazine contained a whole series of articles and musings celebrating the Web at 20 years and imagining what the next 20 years will bring for it. The EMC community of bloggers has taken up this meme and shared a number of very cool stories and ideas. I’ve been tagged by Christine Christopherson, one of our very talented user experience designers, to contribute my story and ideas for the future of the Web. Like my fellow EMC’ers, I’ll be addressing the following three questions:

How has the web changed your life?
How has the web changed business and society?
What will the web look like in 20 years?

How has the web changed my life?
I was a research assistant in the Physics Department at the University of Notre Dame during the summer of 1991, working with Prof. Carol Tanner’s team researching optical atom traps. The lab I was working in was not too far from the computer lab with the recently acquired NeXT workstations. These things were exceedingly cool, as up to that point I’d only been exposed to Apple IIs in my rudimentary programming classes and the rather clunky IBMs my Dad got through work. There was a team at CERN doing very similar work to the ND team, and they were publishing their notes and results to an internal system built on the CERN httpd server, which funnily enough ran very well on NeXT. At that point I’d never heard of Tim Berners-Lee or his grand vision; it was simply regarded as the next wave of Physics documentation management. I remember being a little dismissive of it then, mostly because I was just in love with lab notebooks and couldn’t see how a computer could be better than that.

I forgot about httpd for two years, until I was a software engineering student at the Illinois Institute of Technology and got reacquainted with the very nascent Web. I discovered Yahoo and all the new content that was coming online, and played around with the W3C httpd server more, learning about HTML and UNIX administration in the process. I must be honest: the Web changed my life because it was the gateway drug to Solaris and Irix. I became a UNIX snob, thrilled by the power of the Sun SPARCs and SGI Indys running their server daemons and databases. The Web and the openness of its communication lured me away from the closed systems I had been programming for; after seeing the power of the Web there was no way I was going to sit in a cube and code one function or class for some humongous software package for three years. I became a Web administrator at Chicago Kent College of Law, supporting the Circuit Court and the paperless law school, and from there I went into consulting, first for the Web, then intranets, then data centers, until finally I was running operations for MyPoints.com, one of the top 10 web properties in 1999 and 2000. I learned a lot along the way about connecting people and ideas, developed a much broader vision of the power and purpose of systems, and am very grateful.

How has the web changed business and society?
Let me count the ways; they are legion. The power of the Web to bring people and information together is exactly what drew me to it. It’s changed the way I do just about everything, from shopping and learning about new products to finding information, teaching my daughters, watching movies and TV and interacting with my friends and peers. I think it’s great how many companies are expanding their use of the Web: engaging their customers; learning what they expect of products and services, how they’re being used and how they could be improved; and allowing customers to get together to share even more about themselves around a shared passion or interest facilitated by that company. The Web has even changed product design, with more and more companies crowdsourcing their designs or creating contests via the Web to develop new products. Nike has done a great job of this, allowing everyone to design their own custom shoes and hosting design competitions online. Awesome stuff. Web-enabled customer forums help people get answers, best practices, unvarnished opinions and new contacts all in one place. EMC has done a ton of work in this area and I’m proud of the communities we’ve built for our customers. Gina Minks blogs often about our communities and has done a ton of work setting them up and managing them.

What will the Web look like in 20 years?
Well, there has certainly been a lot written about the future of the Web and technology, but I’ll add my 2 cents. I am very much in the school of Neal Stephenson and the views he presented in Snow Crash and The Diamond Age. The Web will become more immersive and more pervasive, if that’s even possible. I’m not sure if Virtual Reality will really take hold, but there certainly is a lot of potential there. I think the biggest differences we’ll see are around search: the ability to federate searches and to more quickly integrate and analyze the results, a la Wolfram|Alpha on steroids. We’ll also see a lot more location-aware and other context-based information woven into search and content presentation. I’m especially excited by the possibilities of more integration of open-source and crowdsourced design like that at Local Motors and in Makers by Cory Doctorow. I guess I’d sum it up as ubiquitous access, high bandwidth, context-aware natural language search and analytics with data privacy, and even more in the way of integration of social networks, academics and the like. Needless to say, I am excited to be a part of the continued transformation.

At this point I’d like to tag that font of information Christopher Kusek, aka CXI, and Kathrin Winkler, who I hope will talk about the Web and sustainability in 20 years!

Cloud Differentiators

I’ve spent a lot of time talking with my clients and partners lately about what makes a Private Cloud a cloud. There are many schools of thought on this, no end to the opinions really, but I think it comes down to a few differentiators between “just” virtualized infrastructure and a cloud. For me those differentiators are less about technology and more about how you manage and provision things. A virtual infrastructure is still managed and provisioned on a resource or asset basis, where a cloud is managed and provisioned as a service or by policy, a service being some aggregation of resources that delivers something meaningful to your customer. An integrated approach to Governance, Risk Management and Compliance (GRC) is required to accomplish management as a service or by policy. It’s not enough to have a dashboard that shows you the status of your environment; you need a console that reports and allows you to interact.
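
To make the distinction concrete, here’s a minimal sketch, in Python with invented names and thresholds, of the difference between provisioning an asset and provisioning a service governed by policy; it isn’t any particular product’s API, just the shape of the idea:

    from dataclasses import dataclass, field

    @dataclass
    class Policy:
        """Governance attached to the service, not to individual assets."""
        tier: str          # e.g. "gold" implies replication and nightly backup
        max_vms: int       # capacity guardrail enforced at provision time
        encrypt: bool

    @dataclass
    class Service:
        """An aggregation of resources that means something to the customer."""
        name: str
        policy: Policy
        vms: list = field(default_factory=list)

        def provision_vm(self) -> str:
            # Asset thinking asks "which host has room?";
            # service thinking asks "does the policy allow it?"
            if len(self.vms) >= self.policy.max_vms:
                raise RuntimeError(f"{self.name}: policy cap of {self.policy.max_vms} VMs reached")
            vm = f"{self.name}-vm{len(self.vms) + 1:03d}"
            self.vms.append(vm)
            return vm  # tier and encryption settings follow from the policy

    order_entry = Service("order-entry", Policy(tier="gold", max_vms=20, encrypt=True))
    print(order_entry.provision_vm())  # -> order-entry-vm001

The guardrails ride along with the service definition instead of being decided asset by asset, which is the management shift described above.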

The virtual infrastructure is a key enabler of the cloud, but it’s not the cloud. At EMC we’ve developed a product and services portfolio that enables the Private Cloud vision: any device, anywhere, accessing your information and your applications regardless of the infrastructure they happen to live on. Our Virtual Computing Environment coalition extends that enablement by bringing together the components of unified internetworking and compute with the cloud operating system. Private Cloud is more expansive than VCE and its first technology solution, the Vblock. The real differentiator between the virtual infrastructure and the Private Cloud is that any device, anywhere, can access your applications and information, with your governance controlling it, regardless of the underlying infrastructure, be it internal assets or those provided through public clouds.

It’s the integration of GRC into the environment that delivers on the Private Cloud promise: all the agility, flexibility, scalability, multi-tenancy and automation associated with cloud computing, tempered with the security, availability, resiliency and control of the data center. This means that getting to a Private Cloud has to be about a lot more than deploying new technologies; it’s a wholesale transformation of IT and a new way of interfacing with the business and your customers. A lot of what has been promised and demanded by frameworks like ITIL, SOA, MOF, COBIT, etc. can now be delivered through the infrastructure and the toolsets supporting it. It’s possible to implement the Service Catalog and things like automatically approved changes in the resource management infrastructure and begin to provide real self-service IT where appropriate. The appeal of many existing public cloud solutions is the ease with which users can consume them: a credit card, a few clicks, and bam, you have storage, or a server, or a CRM system. An integrated approach to GRC can provide this same user experience, plus the enterprise necessities like Service Levels, Business Continuity, Data Protection and the like. This is the stuff that gets traction with the people I talk with about cloud, and to me it is the real promise of Private Cloud, a promise that is actually deliverable today.
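
As an illustration of that “credit card and a few clicks” experience with governance behind it, here’s a hedged sketch of how automatically approved changes might be gated; the request types and thresholds are made up for the example:

    # Illustrative only: a self-service request is auto-approved when it stays
    # under pre-approved policy thresholds; anything bigger drops into the
    # normal change-review queue. The limits themselves were set through GRC.

    PRE_APPROVED = {"add_storage_gb": 500, "add_vcpu": 4, "add_mem_gb": 32}

    def submit_request(kind: str, amount: int, queue: list) -> str:
        limit = PRE_APPROVED.get(kind)
        if limit is not None and amount <= limit:
            # Feels like public cloud to the user: a few clicks and done.
            return f"auto-approved: {kind} += {amount}"
        queue.append((kind, amount))
        return f"queued for change review: {kind} += {amount}"

    change_queue = []
    print(submit_request("add_storage_gb", 200, change_queue))   # auto-approved
    print(submit_request("add_storage_gb", 2000, change_queue))  # queued for review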

Accelerating the Journey to Private Cloud

I argue, frequently and with just about anyone who will engage, that Cloud Computing is the model and there are several different types of instantiations. This certainly isn’t a new or controversial idea, and not a sea change in and of itself. The same could be said for Web 2.0, SOA, N-Tier, Client-Server and back to the Platonic Ideal. The blogosphere and twitterdom are filled with talk of IaaS, PaaS, SaaS &c. as various forms of Cloud Computing, and those are interesting forms but not necessarily new ideas or modes of computing. EMC has laid out the vision for a Private Cloud; it’s rather well defined, and we have gathered together a number of partners to help us enable our customers in the creation and operation of private clouds. I’m certainly a proponent of Private Cloud, believe in the model and think that it is innovative and a new mode of computing, but I come here not to praise private cloud, but to enable it.

I’ve spent the last few months talking with customers all over the world about Cloud Computing in general and what EMC means by Private Cloud in particular. I’ve been fortunate enough to get a lot of feedback, from the CXO level down to the managers and administrators who will be tasked with running these clouds. A few common themes have emerged in these conversations. Rarely does the question “Why Cloud Computing?” come up; it’s almost as if Cloud is a foregone conclusion, hyped into the mainstream. Instead I am consistently asked by people at every level, “So now what?” EMC and our partners, and the market in general, have done a good job of laying out the groundwork and vision for Cloud Computing and its benefits, along with a hardware and software portfolio to enable it. As with most paradigm shifts, the question becomes how to actually execute against the vision with those products to make it reality.

It seems to me that a lot of IT organizations are positioning themselves for Private Cloud, knowingly or unknowingly.  The virtualization of the data center, not just of servers but real enterprise virtualization, is a key milestone on the path to Private Cloud.  Not only does it provide the framework to build a Private Cloud on, it brings real benefits to the organization in terms of reduced capital and operating expenses, faster provisioning, lower mean time to repair and improved satisfaction for internal and external customers.  These benefits are core to the allure of Private Cloud, and IT is keen to realize them as quickly as possible.

I’ve often seen, and industry analysts seem to report weekly, that virtualization efforts hit a wall when around 20-30% of the workloads in the data center have been virtualized. There are many reasons for this, ranging from the applicability of earlier virtualization solutions to enterprise workloads, to insufficient application-owner and line-of-business buy-in to the transformation, which leads to a lack of approved downtime windows and to applications never being approved for P2V. We’ve helped a number of customers push through this wall and drive toward their goals of 80-90% of workloads being virtualized through the development of enterprise virtualization programs and acceleration services: documenting the activities and processes surrounding the virtualization of servers and applications, training, and comprehensive communication and marketing plans to win the buy-in of stakeholders and application owners.

It’s not just driving enterprise virtualization that will help IT realize the benefits of Private Cloud, however. A lot of outsourcing companies operated for years on the concept of “your mess for less.” For this to be a real transformation it can’t just be the same old problems running on a shiny new architecture. A key component of the journey to Private Cloud has to be the rationalization of the application portfolio. We are constantly adding new applications, features and functionality into the environment, and for every “server hugger” out there I’d argue there’s an “application hugger”; we all have our babies, and we’re certainly not going to let them be torn from our arms.

A systematic review of the existing application portfolio, identifying opportunities for retirement, feature/functionality consolidation, and replatforming and virtualization of workloads on proprietary UNIX systems, provides the roadmap for how many of the promised savings can be realized.  If you want to embrace x86 as the chosen platform, you have to figure out how to get as much of your application portfolio as possible onto it.  Coupling this portfolio rationalization with a comprehensive business case for Private Cloud provides the framework for driving line-of-business and application-team compliance, and for a realistic timeline of how quickly you can actually realize Private Cloud.
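
As a sketch of what the output of such a review might look like, here’s a toy disposition pass over a portfolio; the rules and app records are invented for illustration, and a real review weighs far more inputs (cost, risk, contracts, roadmap):

    # Hypothetical disposition rules for application portfolio rationalization.

    def disposition(app: dict) -> str:
        if app["active_users"] == 0:
            return "retire"
        if app["duplicates_feature_of"]:
            return "consolidate"           # fold features into the surviving app
        if app["platform"] not in ("x86", "virtualized"):
            return "replatform to x86"     # prerequisite for the target cloud
        return "virtualize"

    portfolio = [
        {"name": "legacy-crm", "active_users": 0,    "duplicates_feature_of": None,  "platform": "sparc"},
        {"name": "hr-portal",  "active_users": 900,  "duplicates_feature_of": "erp", "platform": "x86"},
        {"name": "order-hub",  "active_users": 4500, "duplicates_feature_of": None,  "platform": "power"},
    ]
    for app in portfolio:
        print(app["name"], "->", disposition(app))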

So that accounts for the infrastructure and the applications; now for the trifecta: governance! A new model of computing requires a new model of governance and the associated tools and processes. Thousands of virtual machines crammed into a small number of cabinets, dynamically allocating and deallocating resources, is a daunting environment if your key governance tool is Microsoft Excel. The identification of appropriate services to provide, service levels to achieve, and a chargeback model to allocate costs is required, absolutely required, to have any chance of successfully building and operating a Private Cloud. This requires transparency into what you have, what you’re using, where it is, who owns it, what it requires, how it is to be measured and monitored, backed up, replicated, encrypted, allowed to grow or shrink, &c. Sounds scary, I’m sure.
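
That transparency list reads almost like a schema, and a rough sketch captures the idea; the fields here are illustrative, and a real service model or CMDB carries many more:

    from dataclasses import dataclass

    @dataclass
    class ServiceRecord:
        """One row of the transparency described above; illustrative fields."""
        name: str
        owner: str
        location: str
        service_level: str      # e.g. "99.9%, RPO 1h, RTO 4h"
        backed_up: bool
        replicated: bool
        encrypted: bool
        allocated_gb: int
        used_gb: int
        growth_cap_gb: int      # how far policy allows it to grow

        def headroom_gb(self) -> int:
            return self.growth_cap_gb - self.used_gb

    rec = ServiceRecord("order-entry", "jsmith", "DC-East", "99.9%, RPO 1h, RTO 4h",
                        True, True, True, 2048, 1400, 4096)
    print(rec.headroom_gb())  # 2696 GB of growth left before policy says stop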

The service catalog, an integrated management tool framework and automated processes allow you to monitor, maintain, provision and recover the costs of such an environment. Your administrators, engineers and operations teams need to be trained on the technologies, service levels and communications plan, and to have their roles and responsibilities well documented, to empower them in this kind of model. New tools and proactive methods for communicating with your clients have to be developed and integrated to ensure they understand what services you are providing them, how they are being charged for them and what service levels you guarantee. I personally think that self-service plays a key role in the development of a Private Cloud, or most cloud models for that matter, and integration of Change, Release and Capacity Management into a self-service portal can make the difference in your clients’ adoption of this new paradigm.
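
Cost recovery is the piece that keeps the model honest, and the arithmetic can be simple; here’s a hedged sketch of rate-card chargeback with invented tiers and rates:

    # Illustrative chargeback: rate-card pricing per service tier.

    RATE_CARD = {            # monthly $ per unit, by tier; numbers are made up
        "gold":   {"vm": 220.0, "gb": 0.90},
        "silver": {"vm": 140.0, "gb": 0.45},
    }

    def monthly_charge(tier: str, vms: int, storage_gb: int) -> float:
        rates = RATE_CARD[tier]
        return vms * rates["vm"] + storage_gb * rates["gb"]

    # The line of business sees a bill that maps to the service catalog,
    # not an opaque allocation of the whole data center.
    print(monthly_charge("gold", vms=12, storage_gb=2048))    # 4483.2
    print(monthly_charge("silver", vms=12, storage_gb=2048))  # 2601.6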

We’ve packaged these services up under the umbrella of Accelerating the Journey to Private Cloud and have integrated our Technology Implementation Services and several new EMC Proven Solutions into a holistic stack to enable our customers. It’s not a light switch or a silver bullet; it is still a journey. But we’ve worked hard to take the lessons learned from many years of data center consolidations and migrations, process automation, custom reporting and dashboards, innovative solutions and architectures, product training and transformative program management, and integrate them into an effective services and solutions stack that accelerates the journey to Private Cloud and delivers real benefits today.

Five Years

I thought I’d post a few thoughts and get back into the swing of blogging on the occasion of my five-year anniversary with EMC Consulting. The last five years have brought a huge amount of change for me, both personally and professionally. I joined EMC Consulting two weeks after finding out my wife was pregnant with our first child, and the changes have just kept coming from then on. I had spent the previous ten years working for small organizations and then growing with venture-backed start-ups. This transition was in many ways my graduation into “adulthood”: my first time working for a large company supporting well-established products. I was very hesitant at first, but I haven’t regretted the decision once in the last five years.

In many ways the transition wasn’t as difficult as I thought it might be; while EMC is a very large company, the organization I joined was only a couple hundred people back then. We’ve both grown a lot in the last five years: I went from being a project manager to global CTO for my practice, and the organization has gone from a few hundred people to a 2,700-person global organization. It’s been exciting and back-breaking and fun. While the consulting side has been growing, the overall makeup of the company has changed dramatically too, with more than 50% of our revenue now coming from software and services, a real sea change from where we were five years ago. I’ve been lucky enough to watch our product portfolio morph into what I truly believe is the most comprehensive in the industry for information management; we’ve accomplished much more than I would’ve hoped for in 2004.

This has been a real milestone for me. For ten years I hopped between jobs just about every year, seeking new opportunities and challenges, constantly looking for the next organization I could help grow, not really wanting to grow stale in my role. I can honestly say that hasn’t been a worry for me these last five years, with their constant challenges and new opportunities. For a long time I never thought I’d find a place to settle down, but I can easily see myself working for EMC in 2014, and I couldn’t be happier about it. So as I look back on the last five years I am grateful: for the work, the opportunities and mostly the incredible people I have the privilege of working with.

Here’s to another five great years!

Clouds on the horizon

There’s been a lot of discussion lately about clouds and the future of IT across the blogosphere: Chuck is always good for a post or two; IBM spoke up the other day; and there are even reports that “Hey, this is real!” I can’t help but wonder if Cloud Computing is really just the marriage of flexible architecture, ubiquitous networks and IT Service Management. As has been noted on this blog, I am highly infrastructure-biased, but I think it is apparent that fast, readily available networks are changing IT; your phone, laptop, Kindle, &c. are now viable end devices for application and content delivery almost anywhere on the planet. Exciting times indeed!

If you scratch beneath the surface a bit, the magic and mystery of the Cloud becomes a little more apparent: you have a high-performance, omnipresent network; a flexible delivery engine that is highly scalable and efficient; and a management framework that provides the appropriate Service Levels, security, compliance and communications the customer is seeking.  To truly deliver a cloud service you first have to identify and define a service that can be readily doled out to the customers clamoring for it.  I can think of tons of services internal to an enterprise that would qualify for this designation, so I think the concept of a private cloud is a cogent one.  Take, for example, File Sharing, or Email, or Market Data, or Order Processing.
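
Taking File Sharing as the example, a minimal, hypothetical catalog entry might look something like this sketch; the attributes are illustrative, not any framework’s official schema:

    # A hypothetical service-catalog entry for the File Sharing example.

    file_sharing = {
        "service": "File Sharing",
        "description": "Team shares with quotas, snapshots and offsite replication",
        "service_levels": {"availability": "99.9%", "restore_time": "4h"},
        "unit_of_consumption": "GB per month",
        "security": {"access": "role-based", "auth": "directory-integrated"},
        "how_to_request": "self-service portal",
    }

    def can_deliver_on_demand(entry: dict) -> bool:
        # The test from the paragraph above: is it defined well enough
        # to be readily doled out to the customers clamoring for it?
        required = ("service_levels", "unit_of_consumption", "how_to_request")
        return all(k in entry for k in required)

    print(can_deliver_on_demand(file_sharing))  # True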

So why now?  The emergence of good allocation and resource management tools certainly makes the management of the service a lot easier; add adaptive authentication, identity management and role-based access, couple that with the virtualization capabilities and infrastructure components geared to hypervirtualization, and you have the recipe for easy-to-deploy private and public clouds.  The market adoption of frameworks like ITIL and ISO 20000, with their focus on Service Level Management, provides the appropriate mindset for the IT organization looking to become service-oriented.  Now ride all of that on a ubiquitous, converged, highly available fabric and you can provide these services to pretty much any client, via any platform, anywhere.

Suddenly Clouds aren’t so amorphous, but are really the next logical progression of virtualized infrastructure, Service-Oriented Architecture and IT Service Management.

Product Management

A while back I got a call on a Friday night that is familiar to many consultants: “Can you be in City X on Monday morning?” The program manager on the other end of the phone remembered hearing that I had a degree in Product Management and was eager to get me in front of his customer, who was looking to transform his organization into one that managed infrastructure according to a Product Management Lifecycle (PML). Now, I admittedly view the world through PML-tinted glasses, but this concept really piqued my interest. The idea was a pretty simple one: convert his organization to be product-oriented and merge the PML with the IT Infrastructure Library (ITIL) framework and the Software Development Lifecycle (SDLC) that the organization was already spottily using. As a Unified Field Theory devotee, I was hooked!

The customer, like most, was approaching the development, testing and management of their infrastructure through a number of silos: people thinking about the long-term strategy; another group concerned with the implementation of systems; a group that tested the integrated infrastructure; a group responsible for the daily management of the environment; and an organization dedicated to interfacing with the customer to understand their requirements (and, on occasion, their satisfaction). Strategy, architecture, engineering and operations were divided across the organization, with several silos within each knowledge area. No one was incented to work together, no one had a vision of the entire infrastructure as a “system”, and finger-pointing was the order of the day during any outage. Walking around the several floors the IT department was spread across, there was an air of discontent: people bolted for the door at 5 pm at the latest, were largely disengaged and took pride in the walls they had put up around their particular part of the organization. Worst of all, the business, their customer, was unhappy and questioning why they were spending so much on that black box called IT.

Utilization

One of my biggest pet peeves over the years has been utilization, or capacity, reporting. I firmly believe that in order to figure out how to transform an environment into a more efficient one, you first have to know what you’ve got. Over the years I’ve walked into customer after customer, or dealt with admins or peers when I was on the other side of the table, who couldn’t tell me how much storage they had on the floor, or how it was allocated, or what the utilization of their servers was. Part of the problem is that calculating utilization is one of those problems where perspective is reality: a DBA will have a much different idea of storage utilization than a sysadmin or a storage administrator. And depending on how these various stakeholders are incented to manage the environment, you will see a great disparity in the numbers you get back. It may sound like the most “no duh” advice ever given, but defining utilization metrics for each part of the infrastructure is a necessary first step. The second step is publishing those definitions to anyone and everyone and incorporating them into your resource management tools.

Stephen Foskett has a great breakdown of the problem in his post “Storage Utilization Remains at 2001 Levels: Low!“, but I’d like to expand on his breakdown to include database utilization at the bottom of his storage waterfall. I often use the “waterfall” to explain utilization to our customers. In this case knowledge truly is power, and as Chris Evans mentions in his post on “Beating the Credit Crunch”, there is free money to be had in reclaiming storage in your environment.
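
To show the waterfall arithmetic, here’s a rough sketch extended down to the database layer as suggested above; every loss fraction is invented for illustration, since the right numbers depend entirely on your environment:

    # Illustrative storage-utilization "waterfall": each layer only sees the
    # capacity handed down to it, so each stakeholder reports a different number.

    raw_tb = 1000.0
    layers = [
        ("usable after RAID/spares", 0.75),   # fraction surviving from the layer above
        ("allocated to hosts",       0.80),
        ("written by filesystems",   0.60),
        ("used inside databases",    0.50),   # the DBA's view, added to the waterfall
    ]

    capacity = raw_tb
    for name, fraction in layers:
        capacity *= fraction
        print(f"{name:28s} {capacity:7.1f} TB")

    print(f"end-to-end utilization: {capacity / raw_tb:.0%} of raw")

Run top to bottom these toy numbers land at 18% of raw, which is exactly why the DBA, the sysadmin and the storage administrator can all be right and still all disagree.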

It’s not just about knowing which stale snapshots are sitting out in the SAN; knowing how many copies of the data exist is imperative. One customer had a multi-terabyte database that was replicated to a second site, with two full exports on disk and replicated, a BCV at each location and backups to tape at each site. That’s 8 copies of the data on their most expensive disk. Now I’m all for safety, but that’s belt, suspenders and a flying buttress holding up those trousers. A full analysis of utilization needs to take these sorts of outdated or outmoded management practices into account for a full understanding of what is really on the floor.

Old paradigms regarding the amount of overhead at each layer of the utilization cake also need to be updated: reserving 15% – 20% overhead for the environment is a fine concept until that environment grows to multiple petabytes, at which point you’re talking about hundreds of terabytes of storage sucking up your power and cooling (15% of 2 PB is 300 TB). Of course storage virtualization is supposed to solve problems like this, but proper capacity planning and a transparent method of moving data between arrays and/or systems, with realistic service levels in place, can address it just as effectively.

Outsourcing & margins

It seems about 50% of my clients these days have outsourced, are thinking about outsourcing or are insourcing. Some of my customers are themselves outsourcers. An interesting facet of the model these days is the introduction of new services that meet customer needs while providing opportunity for the outsourcer. I’ve had the opportunity to meet with several outsourcers over the past few years and advise them on their service catalogs, usually for storage. A common complaint has been “we’re losing money on this deal,” which always manages to surprise me; if you’re losing money on so many deals you may want to get out of the business, but I digress. Usually it’s not just the outsourcer that is unhappy; the customers generally are too: they think the prices are too high, the service is lousy, and they aren’t really getting what they need. You can rarely go back to the table and raise your price for Tier 1 service, so how do you create a win-win situation?

I’d like to present one solution that has been used to good effect in the past: the introduction of a new, and necessary, tier of storage to the service catalog. I think this applies not just to outsourcers but to anyone who runs their environment in a service provider mode. A lot of the customers I interact with complain that the performance of their Tier 1 storage is suboptimal and that their backups never finish on time, or aren’t validated, and so on. While hardly a novel solution, appropriate archiving is the answer to these sorts of problems, and if you view it from a TCO perspective you can gather a lot of financial evidence for the executives on why it should be implemented.

I encourage my customers to think of their production data in terms of two classes, the highest level of data classification in my opinion: Operational Data and Reference Data. Operational Data is what the enterprise uses on a regular basis to run the business; the key is understanding where the cut-off is for “regular basis.” Reference Data is what is helpful to have around and might be used once in a while, for a quarter close or a year-end analysis, but is ignored on a daily basis. Reference Data takes up valuable Tier 1 storage, backup bandwidth and backup storage, and as a result can lead to blown SLAs. Appropriately archiving this data provides an opportunity to right-size the environment, delay the purchase of additional Tier 1 arrays, streamline the backup flow and improve Service Levels by administering data according to its value to the business. The creation of an Archive Tier (or Tiers) lets the provider deliver a necessary service to the customer while structuring it at an improved margin. Customers will want to archive Reference Data when they can link it to improved Tier 1 and backup performance, driving archive utilization and with it the improved margin, while at the same time improving the margin on the other services thanks to fewer SLA misses and lower administration costs.
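
Here’s a hedged sketch of that classification in practice: split datasets into Operational and Reference by last access and estimate the Tier 1 capacity an Archive Tier could reclaim. The 90-day cutoff and the costs are invented for the example:

    from datetime import date, timedelta

    CUTOFF = date.today() - timedelta(days=90)   # the "regular basis" boundary

    datasets = [
        {"name": "orders-2009",    "gb": 800, "last_access": date.today()},
        {"name": "orders-2005",    "gb": 600, "last_access": date(2008, 12, 31)},
        {"name": "quarterly-hist", "gb": 400, "last_access": date(2009, 1, 15)},
    ]

    # Anything untouched since the cutoff is Reference Data, an archive candidate.
    reference = [d for d in datasets if d["last_access"] < CUTOFF]
    reclaim_gb = sum(d["gb"] for d in reference)

    TIER1_COST, ARCHIVE_COST = 15.0, 3.0   # $/GB, illustrative rates only
    print(f"move to archive: {[d['name'] for d in reference]}")
    print(f"Tier 1 reclaimed: {reclaim_gb} GB, "
          f"saving ${reclaim_gb * (TIER1_COST - ARCHIVE_COST):,.0f} at these rates")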

The changing nature of Information Technology

I’ve been lucky enough to be in our industry for the last 17 or so years and I have seen all sorts of changes, as we all have. If I think back to my days as a research assistant at a university, using the engineering lab SPARCs to create lab reports and pass emails back and forth with other researchers, I’d never have envisioned helping to design and run a system that would send out more than six million customized emails per hour less than ten years later.

In the early 90s IT departments, if you could call them that at most organizations, were necessary evils: a band of misfits who toted various cables and dongles and floppies around to who knew what ends. Today IT is at the heart of several large industries, the difference between successful, profitable businesses and those on the bubble. We’ve seen the industry evolve from sysadmins being a bunch of doctoral and master’s students to kids graduating from high school already knowing how to program in a number of languages and holding a CCNA certification. When I try to imagine what the next 17 years will bring, I’m mystified, to be honest; the change has been rapid and amazing.

There are a lot of challenges facing us as we move forward as a profession. The interconnectedness of today’s market means that everyone wants access to everything, NOW. Cell phones are becoming viable compute platforms, chipmakers are fitting 32 cores on a chip, and we have a pretty ubiquitous, fast fabric tying most of it together. At the same time, there is more regulation now than in pretty much the sum of recorded history up to about five years ago. My colleague Chuck Hollis talks a lot about the need for a CFO of Information, and I think he’s on the right track. But that new position requires tools for reporting and analysis that cut across the many silos that make up IT and the heterogeneous infrastructures supporting them.

No IT framework like ITIL or COBIT or MOF will act as a silver bullet; no off-the-shelf Resource Management system will give you all the insight you need; no new analyst acronym like GRC will encapsulate everything you need to worry about. A change in the way we design, implement and manage our infrastructure is required to ensure that IT continues to be a source of business value and not just a cost center, or worse, the place where Information goes to become confused, lost and irrelevant.
