Item description for Guerrilla Capacity Planning: A Tactical Approach to Planning for Highly Scalable Applications and Services by Neil J. Gunther...
In these days of shortened fiscal horizons and contracted time-to-market schedules, traditional approaches to capacity planning are often seen by management as tending to inflate their production schedules. Rather than giving up in the face of this kind of relentless pressure to get things done faster, Guerrilla Capacity Planning facilitates rapid forecasting of capacity requirements based on the opportunistic use of whatever performance data and tools are available in such a way that management insight is expanded but their schedules are not.
A key Guerrilla concept is tactical planning whereby short-range planning questions and projects are brought up in team meetings such that management is compelled to know the answer, and therefore buys into capacity planning without recognizing it as such. Once you have your "foot in the door", capacity planning methods can be refined in an iterative cycle of improvement called "The Wheel of Capacity Planning". Another unique Guerrilla tool is Virtual Load Testing, based on Dr. Gunther's "Universal Law of Computational Scaling", which provides a highly cost-effective method for assessing application scalability.
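For readers unfamiliar with the model, Gunther's universal scalability law expresses relative capacity C(n) at load n in terms of two parameters: a contention (serialization) fraction sigma and a coherency-delay coefficient kappa. A minimal sketch; the parameter values below are illustrative, not taken from the book:

```python
def usl(n, sigma, kappa):
    # Gunther's Universal Scalability Law: relative capacity C(n)
    # sigma = contention (serialization) fraction
    # kappa = coherency-delay (crosstalk) coefficient
    return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

# Illustrative parameter values (not from the book):
print(round(usl(8, 0.10, 0.01), 2))  # relative capacity at 8-way load
```

With sigma = kappa = 0 the law reduces to linear scaling, C(n) = n; contention alone bends the curve toward a plateau, and the coherency term eventually makes it retrograde.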
Est. Packaging Dimensions: Length: 1" Width: 6.25" Height: 9.5" Weight: 1.35 lbs.
Release Date Dec 19, 2006
ISBN 3540261389 ISBN13 9783540261384
Availability 0 units.
More About Neil J. Gunther
Neil Gunther, M.Sc., Ph.D. is an internationally recognized computer system performance consultant who founded Performance Dynamics Company (www.perfdynamics.com) in 1994. Originally from Melbourne, Australia, he has resided near Silicon Valley in California for 25 years. In that time he has held teaching positions at California State University, Hayward and San Jose State University, as well as research and management positions at Xerox PARC and Pyramid/Siemens Technology, and also worked on the JPL/NASA Voyager and Galileo missions. His performance and capacity planning classes have been presented at such organizations as America Online (AOL), Boeing, FedEx, Motorola, Stanford University, and Sun Microsystems. In 1996, Dr. Gunther was awarded Best Technical Paper at the Computer Measurement Group conference, and in 1997 he was nominated for the A.A. Michelson Award. He is a member of the ACM, AMS, CMG, IEEE, and SIGMETRICS.
Neil J. Gunther currently resides in Castro Valley, in the state of California. Neil J. Gunther was born in 1950.
Reviews - What do customers think about Guerrilla Capacity Planning: A Tactical Approach to Planning for Highly Scalable Applications and Services?
Great coverage of Capacity Planning and Performance Management Aug 24, 2008
Very readable coverage of Capacity Planning and Performance Management. Doesn't presume any previous knowledge, but doesn't talk down either. Several good chapters talking about queueing theory. A great practical handbook.
Who does this better? Mar 15, 2007
I've read the other reviews and they seem to ignore the "Guerrilla" concept. For me, the key to the book is that in most companies scientific analysis is ignored and decisions are made on perceived knowledge. Excel is a great way to get the performance point across, even with its precision errors. Getting management buy-in is 99% of the process, and GCP makes that argument simple. Read this book and get the word out. Performance is not linear!
Enlightening, however ... Mar 11, 2007
First of all, this book was worth the money I spent on it. I came away from reading this book with a clear understanding of the differences between speed and scale, and with a system for modelling the scalability of systems in general.
However... really all of this value was in the first quarter of the book. I read on and read on looking for further conceptual gems but they weren't to be found.
I guess that books are "meant" to be at least a particular length, but this one could have been much shorter and more concise.
a gem and a keeper Mar 10, 2007
With Wall Street analysts driving the planning horizon, management prefers getting a sense of direction quickly and repeatedly over belated but precise compass readings. It is in this agile and opportunistic spirit that Dr. Gunther introduces Excel, linear regression, and a 2-parameter scalability model into the performance analyst's tool chest.
Excel is ubiquitous and easy to use. Use it. If there is sufficient time, better tools such as R or Mathematica can be used to cross-check Excel's results. Similarly, linear regression is another tool in the agile performance analyst's tool chest.
Two chapters I have not seen presented elsewhere cover the virtualization spectrum and effective demand. In a prior job, having the virtualization-spectrum chapter available to me would have saved me much grief with a workload manager. Effective demand is another useful capacity-planning tool to keep handy.
The best part is Dr. Gunther's 2-parameter universal scalability model. It can be used immediately to frame your load-testing results and project application scalability. This alone is worth the cost of the book and admission to his classes.
Conjecture 4.1 on page 65, that 2 parameters are necessary and sufficient for a scalability model based on rational functions, is an interesting open question. Given that the denominator is a quadratic with c = 1, we should be able to argue that it behaves like a parabola, except that with c = 1 we never run into a singularity/infinity. For more details, please see Dr. Gunther's blog.
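The parabola-like behavior of the quadratic denominator can be made concrete: setting the derivative of the scalability function to zero gives the concurrency at which capacity peaks, n* = sqrt((1 - sigma) / kappa). A minimal sketch with illustrative parameter values (not from the book):

```python
import math

def usl(n, sigma, kappa):
    # Universal Scalability Law: relative capacity at concurrency n
    return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

def peak_concurrency(sigma, kappa):
    # Solving dC/dn = 0 gives  n* = sqrt((1 - sigma) / kappa)
    return math.sqrt((1 - sigma) / kappa)

nstar = peak_concurrency(0.05, 0.002)  # illustrative parameter values
print(round(nstar, 1))                 # capacity peaks near this load
```

Past n*, adding load makes throughput retrograde rather than merely flat, which is exactly the behavior a singularity-free quadratic denominator predicts.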
Useful, but only in conjunction with "Analyzing Computer Systems Performance With Perl::PDQ" Jan 7, 2007
I've only given this three stars because it isn't really a self-contained capacity planning "textbook". In conjunction with "Analyzing Computer Systems Performance: With Perl: PDQ", one can "figure out" how to do capacity planning. But neither of these books is really a "textbook" -- they're more a collection of lectures, previous papers, case studies, and irrelevant diversions away from computer capacity planning into physics.
On the plus side, there are quite a few unique contributions that Dr. Gunther has made in this book, and his two previous books. For example, I have not found either his use of the gamma distribution for computing quantiles of response time distributions or his "universal scalability model" anywhere else. As far as I know, his course, also called "Guerrilla Capacity Planning", is the only place you can learn to do capacity planning outside of a university, and his "Perl::PDQ" package is the only open source analytical modeling tool set available. And his analysis of the capacity effects of hyperthreading in "Guerrilla Capacity Planning" is much better than anything I've seen elsewhere. It's too bad Intel didn't have his expertise available when they developed hyperthreading. :)
Finally, some very specific criticisms of the "Universal Scalability Model". First of all, as Dr. Gunther takes great pains to point out, Microsoft Excel does not do a very good job of calculating it. He even has an appendix with Mathematica code to redo one of the examples, showing how inaccurate the Excel version is. Why, then, does he *use* Microsoft Excel? Why did he not include Perl code that does a better job? Why did he not add a module for the Universal Scalability Model to Perl::PDQ? There are plenty of statistical libraries for Perl available on CPAN; I'm sure he could have found a non-linear least squares routine there.
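To the reviewer's point, a non-linear least-squares fit of the two USL parameters does not strictly require Excel or even a statistics library. Below is a minimal sketch using a crude grid search over hypothetical load-test data; the data were generated from the model itself (sigma = 0.05, kappa = 0.002) purely for illustration, and a production fit would of course use a proper solver such as scipy.optimize.curve_fit or a CPAN least-squares routine:

```python
def usl(n, sigma, kappa):
    # Universal Scalability Law: relative capacity at concurrency n
    return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

# Hypothetical load-test measurements (concurrency, throughput);
# generated from the model with sigma=0.05, kappa=0.002, then rounded.
data = [(1, 100), (4, 341), (8, 547), (12, 662), (16, 717), (20, 738), (24, 738)]
x1 = data[0][1]
cap = [(n, x / x1) for n, x in data]  # normalize so that C(1) = 1

# Crude grid-search least squares over (sigma, kappa)
best = min(
    (sum((usl(n, s, k) - c) ** 2 for n, c in cap), s, k)
    for s in (i / 1000 for i in range(201))     # sigma in [0, 0.200]
    for k in (j / 10000 for j in range(101))    # kappa in [0, 0.0100]
)
_, sigma, kappa = best
print(f"sigma={sigma:.3f} kappa={kappa:.4f}")
```

Since the synthetic data sit on the model curve (up to rounding), the search recovers the generating parameters; with real, noisy measurements the fit's confidence intervals matter as much as the point estimates.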
Second, and much more serious, Dr. Gunther advocates fitting the Universal Scalability Model to test data, and then *extrapolating* the results to project the capacity of a system to values outside of the range of the test data! This is absolutely, positively the wrong thing to do!
If the model were *linear*, such extrapolation could be valid over some limited range. But the model isn't linear, it's highly non-linear. And the parameters of the model are in the *denominator* -- *small* changes in the parameter values cause *large* changes in the projected capacity of a system! That makes extrapolation even more risky.
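The sensitivity claim is easy to check numerically. With hypothetical fitted parameters, nudging kappa by 50% barely moves the projection at a load inside the tested range but moves it substantially at an extrapolated load (all values below are illustrative):

```python
def usl(n, sigma, kappa):
    # Universal Scalability Law: relative capacity at concurrency n
    return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

# Hypothetical parameters: raise kappa from 0.0010 to 0.0015 (+50%)
# and compare the effect at an in-range load (n=16) versus an
# extrapolated load (n=100).
for n in (16, 100):
    lo = usl(n, 0.02, 0.0010)
    hi = usl(n, 0.02, 0.0015)
    print(n, round(100 * (lo - hi) / lo, 1))  # % drop in projected capacity
```

In this sketch the same parameter perturbation costs roughly 7% of projected capacity at n = 16 but close to 28% at n = 100, because the kappa term in the denominator grows quadratically with n.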
In spite of this, I think the Universal Scalability Model is an important contribution to capacity planning practice when used properly -- for an initial diagnosis of the nature of the bottlenecks in a system, or to estimate the capacity of a system *within the range of available test data.* It's also a good way to characterize the potential scalability of a workload from easily obtained data.