Welcome to the Farm – The Goat Farm – Episode 1

An excellent way to collaborate and communicate on current enterprise needs while maintaining the original DevOps underpinnings and culture.

The Goat Farm

Welcome to the first episode of The Goat Farm. In this episode Ross and I chat about our backgrounds, our thoughts about DevOps, why the world needs another podcast, and DevOps in the Enterprise.

TL;DL: With The Goat Farm Ross and I hope to expand the content around DevOps in the Enterprise, with the specific focus around real world case studies and successes in the Enterprise.

Posted in Uncategorized

Managing Complex Systems Through Orchestration with Chef

When talking about the management of complex systems, orchestration is always a hot topic. This is because orchestration is often seen as the easiest way to represent and model complex systems, as well as provide a path to delivering complex systems.

Most often, orchestration is represented through a topology model. What is a topology model, you ask? A description of the order of operations across a group of machines. A common example is provisioning a database, a cache layer, multiple application servers, web servers, and load balancer(s). Such a model includes distinct technology components that must interact, are interdependent, and more often than not must be provisioned in a very specific order.

Although orchestration solutions have been around for a long time, very rarely have they fit the bill. This is one of the reasons why Chef has opted to take a different approach, which we see proving more functional in the real world. In other words, we’ve worked hard to create a solution that actually solves the problem consistently in a wide range of complex, real world implementations.

At the core of ensuring a robust orchestration mechanism is following the principles that have made our core product Chef great: scale, idempotency, flexibility, and test-driven automation.

In this post we’re going to take a look at how Chef can make orchestrating complex systems easier, faster, and more reliable.

For those who might be relatively new to Chef, let's do a quick recap of some of the core principles.

Here’s how this diagram breaks down:

  1. An open syntax provides flexibility and consistency in how the product may need to be configured. Whether at the level of a single system or application instance, or in the case of multi-component orchestration, this is critical.
  2. Everything is driven from a single source of truth. Source control has been the tried-and-true de facto standard for modeling, tracking, testing, and allowing visibility and collaboration on state.
  3. Desired state management, along with a consistent mechanism for applying change (idempotency), makes reaching and tracking the execution of state intuitive.
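As a concrete (if minimal) illustration of desired state and idempotency, consider a short Chef recipe. The resources shown are standard Chef DSL, though the example itself is ours, not from the original post: you declare what should be true, and each run converges only what differs from that policy.

```ruby
# Desired state: declare the end state, not the steps to get there.
# Running this recipe repeatedly is safe (idempotent) -- Chef only
# acts when reality diverges from the declared policy.

package 'nginx'                  # ensure the package is installed

service 'nginx' do
  action [:enable, :start]       # ensure it starts now and on boot
end

file '/etc/motd' do
  content 'Managed by Chef'      # ensure this exact content
  mode '0644'
end
```

On a converged node, a second chef-client run reports zero resources updated, which is what makes tracking the execution of state intuitive.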

So how do we achieve orchestration with Chef? An extension to Chef known as chef-metal allows you to use the core Chef principles to manage other aspects of your infrastructure. In this example we will use AWS as the platform; however, we are continually expanding our overarching orchestration concepts across different platforms.

As you'll see, chef-metal allows us to manage core AWS components such as ELBs, SQS queues, compute resources, and more. Chef-metal makes it extremely simple to provision these core AWS components, as well as to ensure that they stay in compliance with the policy you define with Chef.

As always, everything starts from a recipe (policy).

  1. In this case we can define our multi-component service and the order it should be provisioned in. Very intuitive and obvious. We've defined everything within a datacenter and start by creating an SQS queue named mariopipes.
  2. We define our machine bowser along with optional bootstrap and dynamic configuration parameters.
  3. Next we define our load balancer webapp-elb and options such as availability zone, protocol, and ports.
  4. We include the machine bowser defined in our previous step to be added to the load balancer.
  5. Lastly we've included some additional services: an SQS queue named luigipipe and an SNS topic named us_west_topic.
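The steps above could be sketched as a chef-metal recipe along these lines. This is an illustrative reconstruction, not the original demo code: the resource names follow the chef-metal AWS driver of that era, and the specific options (availability zone, ports, run list, attributes) are assumptions for the sake of the example.

```ruby
# Illustrative chef-metal sketch; options are assumed, not from the original demo.
require 'chef_metal_aws'

with_driver 'aws'

# 1) An SQS queue named mariopipes
aws_sqs_queue 'mariopipes'

# 2) A machine named bowser, with optional bootstrap/configuration parameters
machine 'bowser' do
  recipe 'webapp'                                  # assumed run list
  attributes app: { environment: 'production' }    # assumed dynamic config
end

# 3-4) A load balancer webapp-elb fronting the machine defined above
load_balancer 'webapp-elb' do
  machines ['bowser']
  load_balancer_options availability_zones: ['us-west-2a'],
                        listeners: [{ port: 80, protocol: :http,
                                      instance_port: 8080,
                                      instance_protocol: :http }]
end

# 5) Additional services: another queue and an SNS topic
aws_sqs_queue 'luigipipe'
aws_sns_topic 'us_west_topic'
```

Because these are ordinary Chef resources, re-running the recipe converges the whole topology back to policy rather than re-provisioning it from scratch.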

This is just one example of how chef-metal can enable you to orchestrate complex systems while staying true to the principles that have made our core product Chef great: scale, idempotency, flexibility, and test-driven automation.

This blog post was reprinted from here

Posted in Technology

Open Source, Enterprise, Service and Value … It’s Just Simple Math.

I've been involved with technology for what feels like multiple lifetimes. Automation has been a critical part of my whole career, in all my roles as a technologist and business owner, invariably because automation equates to productivity and output, key drivers for successful execution. My first foray into serious automation was back before the dot-com boom. Like many practitioners of the time, I created an automated provisioning solution for my ISP to rapidly deploy and reclaim systems for re-assignment. Amazing that that's still an issue today, but it's a testament to the ever-changing landscape.

Open source was a key component of most of my engineering experience, and I started using open source even when doing so was viewed as questionable by most leadership. I've seen firsthand the transformation of this negative perception, going from reluctance to enthusiasm over the last decade both inside and outside of large companies. Although open source has gained in popularity, there is still a lack of understanding as to how a customer may benefit from a commercial relationship, or why a vendor would base their core offerings on something that is … well, free.

We'll explore that question further, but this discussion is not limited to open source. This change in perception applies to the new delivery methods of software and services today. The reason for the continuing success can be represented with a simple math equation:

value = use / cost

Simply stated: the value we hope to achieve is directly related to its use and inversely related to its cost. No matter how many graphs, charts, and analyses are provided, the value of software has nothing to do with the mere fact that it's delivered to the customer. Pretty obvious, right? Yet this is a mistake that has proliferated across many industries and continues to afflict many enterprise endeavors, especially over the last decade as the complexity of use cases has increased and the cost of the software to fulfill those use cases has increased along with it. It's in part why these new models have continued to grow in popularity.
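To put toy numbers to the equation (the figures below are invented purely for illustration): holding use constant, dropping the cost of entry multiplies the value delivered, and shelfware is the degenerate case where cost rises with no use at all.

```python
# Toy numbers only: illustrating value = use / cost from the article.
def value(use, cost):
    """Value is directly related to use, inversely related to cost."""
    return use / cost

# Traditional model: large up-front license, use capped by what the vendor ships.
traditional = value(use=40, cost=100)   # 0.4

# New model: the same use today, but a minimal entry cost.
open_model = value(use=40, cost=10)     # 4.0

# Lower cost (or expanded use) raises value delivered.
assert open_model > traditional
```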


Traditional Approach – use is constrained due to:

  • Dependence on what the vendor provides in the product.
  • Dependence on what the vendor is able to iterate within product development to meet new demands.
  • Whether the vendor is willing and able to provide further training that can adapt to meet new demands.
  • Whether the vendor is willing and able to provide a mechanism to support shared content and experiences, also known as best practices.
  • Even when the above is provided by the vendor, it is directly tied to more cost, happily passed on to the customer whether used or not.

New Approach – use is liberated by:

  • Source visibility and openness.
  • Rapid iteration through greater collaboration, influence, and transparency.
  • Shared learning paths among practitioners.
  • Shared content and mechanisms for transparency on pragmatic usage and best practices.
  • At the foundation, cost is minimal, entry to market is easy, and ongoing improvements are supported by an ever-growing community.
  • Cost is always related to use, and commitment is flexible under new pricing and delivery models.

At the practitioner level, subscription-based, trial, and open source software proliferates due to the numerous factors that contribute to ease of use. With an already low cost, value is readily achieved and directly proportional to continued and expanded use. Commitment is flexible and based on the consumer's use, which can oftentimes be elevated to a commercial relationship at the customer's discretion. Taking a closer look at open source software, you may ask:

What is the value in a commercial open source offering? Service

Alongside an open source product, a commercial relationship acts as a facilitator for, and encourages increasing use between, vendor (V), customers (C), and partners (P).

This is continually influenced by shared input, patterns of successful execution, and service enhancements that directly impact improved use across the board. Acting as facilitator also provides a degree of accountability that is not present in the "Go It Alone" approach, a typical entry point when using open source software.

Sounds too good to be true; how can a vendor sustain this model?

The vendor capitalizes on the open source offering through:

  • Shared knowledge, experience, and content.
  • These develop larger communities, market visibility, and adoption.
  • These, in turn, identify and correlate more avenues for use.
  • It's from both rapid use and insight into this use that continued service can be provided and then benefitted from.

Why should customers want to contribute anything back?

There is an increase in use, and hence value, the larger the "community" gets. This includes those giving and those receiving. We've seen the pendulum of desired in-house expertise swing the other way over the last decade. Experienced, technical resources are in high demand across all levels of an organization, and attracting this talent is oftentimes done through involvement in open source movements that gain momentum, experience, and practitioners. It's sometimes tough to change one's thinking, but this system is not only real and working in use today, it's also proving more and more successful.

“When a platform is self-service, even the improbable ideas can get tried, because there’s no expert gatekeeper ready to say, ‘That will never work!’ and guess what? Many of those improbable ideas do work, and society is the beneficiary of that diversity.” – Jeff Bezos, 2011 letter to shareholders

It's an absolute no-brainer to make a simple change to vendor relations to receive a reliable cost model that matches the value received (use) and that scales over time … period. In a follow-on article I'll compare and contrast this to the "Keiretsu." (Hey, I needed one buzzword in the article!)

Posted in Business, Technology

Achieving “Awesomness” with Opscode Chef (Part 2)

In part 1 of this series we focused primarily on the value of leverage, and how tools, inside and outside the context of IT, can directly influence the output of the desired function.

In part 2 we'll discuss more specifically why companies are, and why companies should be, executing on Continuous Delivery and DevOps initiatives to gain more leverage in their business.

When people not heavily involved in either the Continuous Delivery or DevOps movement are asked what they hope to achieve with each, the answers are typically:

  • Utilize resources more efficiently for application release.
  • Gain a better understanding of the application release process through cross-domain enablement.
  • Provide accountability to all groups involved in the production application lifecycle.
  • Reduce "time to delivery" of applications to production environments.
  • Improve the "quality" of applications operating in production environments.

Although all of these are true, they point to aspects of value but don’t necessarily do a good job of conveying the big picture, which is what top level business influencers often need to make a key business decision.

So what is the “Big Picture” you ask?

The answer has its basis in economic globalization. Globalization has been rapidly expanding, and its impact is both broad and deep. To succeed in a globalized economy, businesses are increasingly leveraging software. Software provides the leverage to derive and deliver value faster, which, in turn, enables businesses to better compete in a worldwide marketplace. Unfortunately, it's not as easy as simply desiring to leverage software, and although all companies leverage software to a certain degree, this proliferation is both good and bad. On one hand, the rate at which we can provide value to a consumer increases (good), but on the other, a business's ability to do so can be constrained by legacy thinking, processes, and supporting infrastructure (bad).

This is where initiatives such as Continuous Delivery and DevOps come in. Companies that have been able to execute on these initiatives are now able not only to deliver on the tactical bullet points mentioned above, but also to deliver variations of services to the consumer in a consistent and reliable manner. The result is more valuable consumer services that generate greater revenue. Businesses can thus grow demand for their service offerings while continually improving and expanding their revenue opportunities in parallel.

It is with this understanding that initiatives such as Continuous Delivery and DevOps no longer pose a "Should we …" question for the business, but rather a "When will we …" question. All the while, keep in mind that it is in times of economic recovery that market leaders are born.

For a sneak peek at some of what you'll experience at Chef Conf 2013, view our videos:

Video: http://www.youtube.com/embed/EUrHpAJhCbU

Rationale Behind CD and DevOps – video
Product Only Demonstration – video
Full Video –  video

Register now for Chef Conf 2013 in San Francisco, California, April 24-26, to learn more about a model for success and patterns of failure within the business that we've identified along the way. Hear our customers' experiences, and learn more about how Chef can both enable and drive this transition to "Awesomness".

Stathy Touloumis is a Solution Architect for Opscode and only wishes he had stumbled upon Chef when he founded and managed a software consultancy back in 2005.


Reprinted via [Opscode]

Posted in Business, Technology

See an “Alpine Rd” and I just have to attack it!

Man, I love California. At any location, at any time you can find just about the best riding. Both visually appealing and physically challenging.

Ride: http://app.strava.com/activities/42391748

Looking forward to more in the coming months!

Posted in Cycling

Achieving “Awesomness” with Opscode Chef (Part I)

I'd like to start by sharing a statement that was articulated to me a while ago by a vendor of antiquated enterprise technologies:

“It’s not about the tool but the craftsman that wields it.”
– Legacy technology rep

It seemed to hold some semblance of truth until I realized why the statement was made in the first place: to fulfill an underlying agenda of continued incumbency, NOT to enable the organization to strive for awesomeness. This statement does bring up some questions, though:

“Is it really necessary to retool within an organization? If one solution accomplishes a specific goal why would it ever need to be changed?”

This two-part article will provide insight to answer these questions (and those like them), along with how they relate to the way Opscode Chef aims to deliver "awesomeness" for our customers.

Leverage = Tools

Since man first learned to use a rock to smash something, or two sticks to create fire, the primary value of tools has been leverage. It is with leverage that the craftsman (Engineer OR Biz Dude) is able to achieve above and beyond what could be accomplished by the craftsman alone. But let's not limit ourselves, as the use of leverage is quite pervasive:

  • Financial leverage can be provided at varying rates of return, which change over time and hence yield varying degrees of value.
  • A simple tool such as the screwdriver provides leverage based on the construction of its design.

It’s not surprising that humans have continued to evolve to find new ways of increasing leverage. This applies even in the context of business process or more specifically how technology can enable business. Looking back on my career in tech, I have noticed a highly common pattern:

  • The ever-growing need for change, not just within an organization but across the competitive landscape.
  • The need to continually adapt "leverage," in its varying forms, to provide new avenues by which a business can succeed.

Chef = Leverage

So what does this have to do with Chef? In part it has to do with the disruptive movements we are seeing in technology today, including the consumerization of IT, Agile development, DevOps, cloud computing, and more. The other part has to do with the new "leverage" meant to support these movements, such as "Infrastructure as Code"-driven platforms like Opscode Chef. Opscode Chef was designed within the context of cloud, agile methodologies, and enabling the craftsman in today's global marketplace. The execution and ongoing success (even in the midst of occasional failure) of these new-order initiatives is what we'll coin "Awesomness".

“During the downturn, a lot of companies have chosen to downsize, and maybe that was the right thing for them. We chose a different path. Our belief was that if we kept putting great products in front of customers, they would continue to open their wallets and that is what we’ve done. We’ve been turning out more new products than ever before …”
-Steve Jobs

Awesomness = Opscode Chef

In all honesty, achieving "Awesomness" with Opscode Chef will take some work. Oh yes, that dreaded term embedded in most product vendors' sales teams' ANTI-mantra. The key question is this:

“Will the leverage obtained from Opscode Chef and the ensuing execution of these new order initiatives provide enough value to my company?”

In today's climate you had better be thinking more and more about new ways to capitalize on leverage, because the companies with the greatest leverage in a down economy are most often, if not always, the ones that continue to thrive, grow, and capture the market.

So let's chat briefly about Chef and what to expect from the first of two videos we'll be posting. The real value of Chef is in its capability to enable the DESIGN of your systems: how your systems are configured, operate, and integrate on an ongoing basis. There are several ways leverage is provided, but the one that stands out in our video is the dynamic, data-driven, and buffering capabilities. These capabilities are similar to the magicians of old and new: it's not that they were capable of performing magic, it's that they were able to provide buffers around the constraint(s) or trick(s) and effectively remove them. Yes, the ol' thinking-outside-the-box paradigm, a.k.a. disruption.

“Any sufficiently advanced technology is indistinguishable from magic”
-Arthur C. Clarke

It's with this leverage that a platform like Opscode Chef enables IT to deliver reliable, resilient services to the business, and to do so cost-effectively and repeatably, all while enabling your craftsmen to be at their most productive.

Video: http://www.youtube.com/embed/GqBunRbR0I4

Simple Continuous Delivery Tool Chaining with Opscode Chef, Jenkins and GCE

Reprinted via [Opscode]

Posted in Technology

Top 10 sales trends for 2013: An insider's perspective.

I read a great article recently in the Harvard Business Review covering the top 10 sales trends for 2013. I wanted to provide an insider's perspective on these trends, with a focus on enterprise software sales, covering just the most salient points.

1. Sales Force Behavior "Modeling"
Models are verbal descriptions and visual representations of how systems work and processes flow. Models enable repeatable and predictable experiences. More organizations will study their top salespeople in 2013 to understand how they formulate their winning account strategies based upon customer politics, evaluator psychology, and the human nature of the executive decision makers that are unique to winning every account.

It absolutely makes sense to adapt models based on current success, but I'd argue that starting from scratch or providing no guidelines for strategy is insane. "Waiting" for the sales force to figure it out in today's rapid climate will leave you too far behind to catch up.

Models for qualifying and forecasting sales have been baked for years and years. I propose instead a baseline based on past lessons and successes, adapted to current landscape changes. An iterative approach can be leveraged to assess what is working, what's not, and how changes should be made. A great example of a change that will inevitably impact a sales model is the move to SaaS and self-service, trial-based delivery models.

2. Win-Loss Analysis Studies 
All companies and their salespeople are well versed in the logical arguments for selecting their product. However, the decision to make a major purchase is also influenced by internal politics and how the decision-makers receive information, along with individual biases and personal desires. Unfortunately, many companies don't perform any type of win-loss analysis, so they don't understand their customers in these regards. Because of the economy and relentless competition, 2013 will be the year that many companies have to re-discover the lost art of win-loss analysis.

As with all data, be sure not to jump to conclusions. You can't make an empirical analysis from one win or one loss. Oftentimes, an agenda may be the underlying reason a number, figure, or metric is being conveyed (vanity metrics).


5. Sales Process Ineffectiveness
Many companies have realized that their sales didn't increase even after spending a great deal of money and effort implementing a sales process methodology. The reason for this is that the "black hole" of the sales process is what happens during and at the close of sales calls. Today more than ever, it's the personal interactions with prospective customers that determine winners from losers, not the internal processes of the sales organization. In 2013 more companies will be studying and categorizing these customer interactions so they can improve sales force effectiveness.

This is in part due to moving the focal point in large orgs away from quick iteration and execution to a primary focus on visibility/reporting. Failure is due even more to system-gaming syndrome, where the users of the system take the path of least resistance. In this case, the path of least resistance ends up being report/pipeline fluffing vs. getting the job done. Case in point: I witnessed a pipeline grow 300% in the course of weeks, with no increase in staff (actually a decrease through attrition), to accommodate a recent doubling of quota. Take a guess whether the closure rate increased proportionally to the pipeline increase … not even close.


7. No Decision as the Main Competitor
For sales forces involved with large capital-expenditure sales cycles, never before has the mantra "Call High or Die" been so true. Salespeople must reach C-level executive decision makers early in the sales process because the default for organizations today is to maintain the status quo and delay every major purchase.

I have seen, all too often, salespeople falling back on industry or current-technology acronyms and mapping them to products as the path of least resistance. Sales gets paid to influence, not to perform rote memorization.


10. Continued Migration from Field to Phone
One final trend that bears mentioning is the accelerated move from field-based sales to phone-based internet sales. Many companies have quickly transitioned the majority of their field reps to be almost exclusively phone based. Therefore, these reps must now be able to create winning relationships with their voices, as opposed to how they sold in the past with their physical presence. Understanding and mastering the art of persuasion will become even more important for all salespeople in 2013.

I'd argue there is a greater divergence in delivery models: lower-revenue but higher-transaction products and services can now more easily be handled in automated fashion. High-touch accounts without face time will inevitably lose to those who realize relationship building is about human behavior, a facet of our DNA developed over thousands of years that can't be erased no matter how compelling the technology.


Stay tuned for later posts, where I'll convey operational strategies to align with and help execute a revenue strategy based on the above trends.


Posted in Business