Bespoke IT vs. Fit for Purpose

I’ve been focused on blogging over at EMC’s InFocus blog for the last year, but I want to get back to Mr. Infrastructure and start blogging more frequently about a wider variety of topics. First up are some thoughts on a topic I’ve been spending a lot of time thinking about these days: Bespoke IT versus Fit for Purpose. As an industry we’ve spent a lot of time on bespoke IT: building new applications, silos, and architectures to meet a specific need, based on the skills and tools we’re already familiar with. We often don’t have the luxury of investigating what the right tools would be and learning them well enough to apply them properly. If I think of this in sartorial terms, we make some outstanding, finely fitted suits, but they might be in last decade’s style or colors. The suit might be of the highest quality and yet not meet the needs of the wearer, or might stand out for all the wrong reasons in a crowd. Just because it’s bespoke doesn’t mean it’s the best way to approach the problem.

This is a long lead-in to what I really want to talk about: the idea of Fit for Purpose. EMC acquired Adaptivity, and I’ve been lucky enough to work with that great team and learn how they think about IT, applications, and infrastructure. There is a lot of talent on that team, and I’ve learned a great deal in conversations and brainstorming with them. Their Chief Scientist is Sheppard Narkier, and he’s started to share many of his ideas, thoughts, and experiences on InFocus; see his post Lessons Learned: The Quality of Design is not Fuzzy. On the surface, “Fit for Purpose” is nearly self-explanatory: the idea of designing IT and business systems based on what they’ll be used for and how they’ll consume infrastructure. But to those not used to thinking in that paradigm, that explanation may be too coarse-grained a definition, so let me explain a bit further.

Continue reading Bespoke IT vs. Fit for Purpose

Go big or go home!

“Go big or go home” may sound trite, but it is applicable to IT transformation. Companies that are successfully adopting cloud technologies are taking a transformation approach, not a technical-project approach. The larger the scale of the program, the more traction they get across the enterprise, business and IT alike. For too long virtualization has been conflated with consolidation, and that has been one of the sticking points in getting business and application owners to buy in to change.

Continue reading Go big or go home!

Strategies for Private Cloud Initiatives

Later this week I’ll be presenting as part of our EMC Live! webcasts on Building Strategies for Private Cloud Initiatives. I’ve been thinking more about what EMC’s Private Cloud vision means and how it is being implemented by our customers. The initial idea of Private Cloud as a destination, part of a linear progression, does a bit of a disservice to the whole concept of cloud computing and to the control and choice offered by these new models. Many companies are already thinking about Private Cloud as an approach to balancing their IT service portfolio across internal and external resources based on criteria like cost and risk. In my opinion, and I think EMC’s strategy and approach on Private Cloud bears this out, Governance, Risk Management and Compliance (GRC) is what makes the Cloud private.

Organizations have had a portfolio approach to IT for quite some time. The components within that portfolio might have started out as Mainframe, Open Systems, and x86 in their own data center, or as App Dev/Test and Pre-Prod in their data centers with Production at a hosting facility, among many, many other permutations. Until recently there have always been significant differences between the IT services in the portfolio, usually with different management interfaces, organizations, reporting, and so on associated with each of them. I posit that an integrated GRC framework with a Unified Service Portal not only binds the portfolio together and provides commonality in how IT’s customers provision, manage, and report on their services, but also provides the framework for efficiency, control, and choice, the hallmarks of EMC’s Private Cloud vision. As the portfolio matures and the GRC framework becomes more integrated, this allows the CIO to deliver against the CEO’s expectation of cost reduction, the CISO/CLO’s expectation of a secure and compliant environment, and his or her own expectation of more automation and transparency. The goal then becomes not a single method of computing achieved via a linear transformation of IT, but a portfolio of services delivered via several methods, balanced for cost and risk, with the ease of consumption and transparency of the public cloud and all the security and compliance associated with the data center.
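
To make the balancing idea concrete, here is a minimal sketch of a cost-and-risk placement rule. This is my own illustration, not an EMC tool or framework: the service names, costs, risk scores, and the risk ceiling are all hypothetical assumptions.

```python
# Hypothetical sketch: balancing an IT service portfolio across internal
# and external resources on cost and risk. All names and numbers below
# are illustrative assumptions, not real data or a real product's API.

from dataclasses import dataclass

@dataclass
class Service:
    name: str
    cost_internal: float  # $/month to run in the data center
    cost_external: float  # $/month to run at a public provider
    risk_score: float     # 0.0 (low) .. 1.0 (high), from the GRC framework

RISK_CEILING_EXTERNAL = 0.5  # GRC policy: riskier services stay internal

def place(service: Service) -> str:
    """Cheapest placement that still satisfies the GRC risk policy."""
    if service.risk_score > RISK_CEILING_EXTERNAL:
        return "internal"
    if service.cost_external < service.cost_internal:
        return "external"
    return "internal"

portfolio = [
    Service("payroll", 12_000, 8_000, 0.9),   # sensitive -> stays internal
    Service("dev-test", 9_000, 4_000, 0.2),   # low risk, cheaper outside
    Service("intranet", 3_000, 3_500, 0.3),   # low risk, cheaper inside
]

for svc in portfolio:
    print(f"{svc.name}: {place(svc)}")
```

Even a toy rule like this shows why GRC is what makes the cloud private: the risk ceiling, not the price comparison, is what keeps sensitive services in the data center.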

I’ve geared my presentation for Thursday toward some tactical approaches to implementing such a strategy, with achievable early successes to build momentum for the adoption of the model. I’d welcome discussion, questions, and other perspectives via the comments, on Twitter, or during the webcast session.

Please feel free to register here and join in the conversation:

EMC Live Webcast:
Create an Architecture and Roadmap for Your Private Cloud

Thursday, May 6, 2010
8:00 am PT / 11:00 am ET / 15:00 GMT

Register Today!

The private cloud vision has captured the attention of enterprise IT leaders and strategists because it promises unprecedented economies of scale and dramatically improved business agility.

EMC Consulting experts can help you find the best path to the private cloud by leveraging virtualization, pooling enterprise resources, and adopting a service-oriented model.

Attend this webcast and learn how to:

  • Identify the key attributes of a private cloud architecture
  • Establish a business case for private cloud
  • Develop a high-level architectural plan for private cloud
  • Transform operations into a service-oriented, self-service model

The Web at 20!

The most recent edition of EMC’s ON Magazine contained a whole series of articles and musings celebrating the Web at 20 years and imagining what the next 20 years will bring for it. The EMC community of bloggers has taken this meme and shared a number of very cool stories and ideas. I’ve been tagged by Christine Christopherson, one of our very talented user experience designers, to contribute my story and ideas for the future of the Web. Like my fellow EMC’ers, I’ll be addressing the following three questions:

How has the web changed your life?
How has the web changed business and society?
What will the web look like in 20 years?

How has the web changed my life?
I was a research assistant in the Physics Department at the University of Notre Dame during the summer of 1991, working with Prof. Carol Tanner’s team researching optical atom traps. The lab I was working in was not too far from the computer lab with its recently acquired NeXT workstations. These things were exceedingly cool, as up to that point I’d only been exposed to Apple IIs in my rudimentary programming classes and the rather clunky IBMs my Dad got through work. There was a team at CERN doing very similar work to the ND team, and they were publishing their notes and results to an internal system using the CERN httpd server, which, funnily enough, ran very well on NeXT. At that point I’d never heard of Tim Berners-Lee or his grand vision; the system was simply regarded as the next wave of physics documentation management. I remember being a little dismissive of it, mostly because I was just in love with lab notebooks and couldn’t see how a computer would be better than that.

I forgot about httpd for two years, until I was a software engineering student at the Illinois Institute of Technology and got reacquainted with the very nascent Web. I discovered Yahoo and all the new content coming online, and played around with the W3C httpd server more, learning about HTML and UNIX administration in the process. I must be honest: the Web changed my life because it was my gateway drug to Solaris and IRIX. I became a UNIX snob, thrilled by the power of the Sun SPARCs and SGI Indys running their server daemons and databases. The Web and the openness of its communication lured me away from the closed systems I had been programming for; after seeing the power of the Web, there was no way I was going to sit in a cube for three years coding one function or class for some humongous software package. I became a Web administrator at Chicago-Kent College of Law, supporting the Circuit Court and the paperless law school, and from there I went into consulting: first the Web, then intranets, then data centers, until finally I was running operations for MyPoints.com, one of the top 10 web properties in 1999 and 2000. I learned a lot along the way about connecting people and ideas, and I’ve developed a much broader vision of the power and purpose of systems. I am very grateful for that.

How has the web changed business and society?
Let me count the ways; they are legion. The power of the Web to bring people and information together is exactly what drew me to it. It’s changed the way I do just about everything: shopping and learning about new products, finding information, teaching my daughters, watching movies and TV, and interacting with my friends and peers. I think it’s great how many companies are expanding their use of the Web: engaging their customers; learning what they expect of products and services, how they’re being used, and how they could be improved; and letting customers get together around a shared passion or interest, facilitated by that company, to share even more about themselves. The Web has even changed product design, with more and more companies crowdsourcing their designs or running contests via the Web to develop new products. Nike has done a great job of this, allowing everyone to design their own custom shoes and hosting design competitions online. Awesome stuff. Web-enabled customer forums help people get answers, best practices, unvarnished opinions, and new contacts all in one place. EMC has done a ton of work in this area, and I’m proud of the communities we’ve built for our customers. Gina Minks blogs often about our communities and has done a ton of work setting them up and managing them.

What will the Web look like in 20 years?
Well, there has certainly been a lot written about the future of the Web and technology, but I’ll add my two cents. I am very much in the school of Neal Stephenson and his views presented in Snow Crash and The Diamond Age. The Web will become more immersive and more pervasive, if that’s even possible. I’m not sure Virtual Reality will really take hold, but there is certainly a lot of potential there. I think the biggest differences we’ll see will be around search: the ability to federate searches and to more quickly integrate and analyze the results, a la Wolfram|Alpha on steroids. We’ll also see a lot more location-aware and other context-based integration in search and content presentation. I’m especially excited by the possibilities of more open-source and crowdsourced design, like that at Local Motors and in Makers by Cory Doctorow. I guess I’d sum it up as ubiquitous access, high bandwidth, context-aware natural-language search and analytics with data privacy, and even more integration of social networks, academics, and the like. Needless to say, I am excited to be a part of the continued transformation.

At this point I’d like to tag that font of information Christopher Kusek, aka CXI, and Kathrin Winkler, who I hope will talk about the Web and sustainability in 20 years!

Product Management

A while back I got a call on a Friday night that is familiar to many consultants: “Can you be in City X on Monday morning?” The program manager on the other end of the phone remembered hearing that I had a degree in Product Management and was eager to get me in front of his customer, who was looking to transform his organization into one that managed infrastructure according to a Product Management Lifecycle (PML). Now, I admittedly view the world through PML-tinted glasses, but this concept really piqued my interest. The idea was a pretty simple one: convert his organization to be product-oriented and merge the PML with the IT Infrastructure Library (ITIL) framework and the Software Development Lifecycle (SDLC) that the organization was already spottily using. As a Unified Field Theory devotee, I was hooked!

The customer, like most, was approaching the development, testing, and management of their infrastructure through a number of silos: one group thinking about the long-term strategy; another concerned with the implementation of systems; a group that tested the integrated infrastructure; a group responsible for the daily management of the environment; and an organization dedicated to interfacing with the customer to understand their requirements (and, on occasion, their satisfaction). Strategy, architecture, engineering, and operations were divided across the organization, with several silos within each knowledge area. No one was incented to work together, no one had a vision of the entire infrastructure as a “system,” and finger-pointing was the order of the day during any outage. Walking around the several floors the IT department was spread across, there was an air of discontent: people bolted for the door at 5 pm at the latest, were largely disengaged, and took pride in the walls they put up around their particular part of the organization. Worst of all, the business, their customer, was unhappy and questioning why it was spending so much on that black box called IT.

Continue reading Product Management

Utilization

One of my biggest pet peeves over the years has been utilization, or capacity, reporting. I firmly believe that in order to figure out how to transform an environment into a more efficient one, you first have to know what you’ve got. Over the years I’ve walked into customer after customer, or dealt with admins or peers when I was on the other side of the table, who couldn’t tell me how much storage they had on the floor, how it was allocated, or what the utilization of their servers was. Part of the problem is that calculating utilization is one of those problems where perspective is reality: a DBA will have a much different idea of storage utilization than a sysadmin or a storage administrator. And depending on how these various stakeholders are incented to manage the environment, you will see great disparity in the numbers you get back. It may sound like the most “no duh” advice ever given, but defining utilization metrics for each part of the infrastructure is a necessary first step. The second step is publishing those definitions to anyone and everyone and incorporating them into your resource management tools.

Stephen Foskett has a great breakdown of the problem in his post “Storage Utilization Remains at 2001 Levels: Low!“, but I’d like to expand on his breakdown to include database utilization at the bottom of his storage waterfall. I often use the “waterfall” to explain utilization to our customers; the sketch below shows the idea. In this case knowledge truly is power, and as Chris Evans mentions in his post on “Beating the Credit Crunch”, there is free money to be had in reclaiming storage in your environment.
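
Here is a minimal sketch of that waterfall. The layer names and efficiency ratios are hypothetical assumptions that will differ in every environment; the point is the multiplication, not the specific numbers.

```python
# Hypothetical sketch of the storage utilization "waterfall": each layer
# exposes only a fraction of the capacity beneath it, so end-to-end
# utilization is the product of the per-layer ratios. The layer names
# and percentages are illustrative assumptions, not measurements.

waterfall = [
    ("raw -> usable (RAID, spares)",        0.75),
    ("usable -> allocated to hosts",        0.70),
    ("allocated -> written by filesystems", 0.60),
    ("filesystem -> live database rows",    0.50),  # the DBA's view, added at the bottom
]

capacity_tb = 1000.0  # raw capacity on the floor
remaining = capacity_tb
for layer, ratio in waterfall:
    remaining *= ratio
    print(f"{layer:40s} {remaining:8.1f} TB")

print(f"End-to-end utilization: {remaining / capacity_tb:.1%}")
# With these numbers only ~15.8% of the raw capacity holds useful data,
# which is why each stakeholder reports a different "utilization" figure.
```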

It’s not just about knowing which stale snapshots are sitting out in the SAN; knowing how many copies of the data exist is imperative. One customer had a multi-terabyte database that was replicated to a second site, with two full exports on disk and replicated, a BCV at each location, and backups to tape at each site. That’s eight copies of the data on their most expensive disk. Now, I’m all for safety, but that’s a belt, suspenders, and a flying buttress holding up those trousers. A full analysis of utilization needs to take these sorts of outdated/outmoded management practices into account for a complete understanding of what is really on the floor.
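
Working through that example makes the multiplier obvious. A quick sketch, with the database size as a hypothetical stand-in for “multi-terabyte”:

```python
# Counting the disk copies from the example above (tape copies excluded):
# primary database, its remote replica, two full exports plus their
# replicas, and a BCV at each site -- all on tier-1 disk.

db_tb = 5.0  # hypothetical size of the multi-terabyte database

disk_copies = (
    1    # primary database
    + 1  # remote replica at the second site
    + 2  # two full exports on disk
    + 2  # the same exports, replicated
    + 2  # one BCV at each location
)

print(f"{disk_copies} copies x {db_tb} TB = {disk_copies * db_tb} TB of tier-1 disk")
# 8 copies x 5.0 TB = 40.0 TB of premium disk for one database
```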

Old paradigms regarding the amount of overhead at each layer of the utilization cake also need to be updated. Reserving 15–20% overhead for the environment is a fine rule of thumb until that environment grows to multiple petabytes; then you’re talking about hundreds of terabytes of storage sucking up your power and cooling. Of course storage virtualization is supposed to solve problems like this, but proper capacity planning and a transparent method of moving data between arrays and/or systems, with realistic service levels in place, can address it just as effectively.
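
The arithmetic behind that claim, as a quick sketch (the environment sizes are arbitrary examples):

```python
# Why a fixed-percentage overhead rule breaks down at scale: the same
# 15-20% reserve that is trivial at 50 TB becomes hundreds of terabytes
# of idle, powered, cooled storage in a multi-petabyte environment.

for env_tb in (50, 500, 2_000, 5_000):  # 2_000 TB = 2 PB
    low, high = env_tb * 0.15, env_tb * 0.20
    print(f"{env_tb:>5} TB environment -> {low:,.0f}-{high:,.0f} TB held as overhead")
```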

The changing nature of Information Technology

I’ve been lucky enough to be in our industry for the last 17 or so years and I have seen all sorts of changes, as we all have. If I think back to my days as a research assistant at a university using the engineering lab Sparcs to create lab reports and pass emails back and forth with other researchers, I’d never have envisioned helping to design and run a system that would send out more than six million customized emails per hour less than ten years later.

In the early 90s, IT departments, if you could even call them that at most organizations, were necessary evils: a band of misfits who toted various cables, dongles, and floppies around to who knew what ends. Today IT is at the heart of several large industries, the difference between successful, profitable businesses and those on the bubble. We’ve seen the industry evolve from sysadmins being a bunch of doctoral and master’s students to kids graduating from high school knowing how to program in a number of languages, CCNA certification in hand. When I try to imagine what the next 17 years will bring, I’m mystified, to be honest; the change has been rapid and amazing.

There are a lot of challenges facing us as we move forward as a profession. The interconnectedness of today’s market means that everyone wants access to everything, NOW. Cell phones are becoming viable compute platforms, chipmakers are fitting 32 cores on a chip, and we have a pretty ubiquitous, fast fabric tying most of it together. At the same time, there is more regulation now than in pretty much the sum of recorded history up to about five years ago. My colleague Chuck Hollis talks a lot about the need for a CFO of Information, and I think he’s on the right track. But that new position requires tools for reporting and analysis that cut across the many silos that make up IT and the heterogeneous infrastructures supporting them.

No IT framework like ITIL, COBIT, or MOF will act as a silver bullet; no off-the-shelf resource management system will give you all the insight you need; no new analyst acronym like GRC will encapsulate everything you need to worry about. A change in the way we design, implement, and manage our infrastructure is required to ensure that IT continues to be a source of business value and not just a cost center, or worse, the place where Information goes to become confused, lost, and irrelevant.
