Chapter 5, Technical Studies

Section 5.7: Design-to-Cost

Constructing and operating an extremely large telescope like GSMT will be expensive, and it will be necessary to reduce these costs below levels that would be predicted by scaling from existing facilities. This section describes a design-to-cost approach that can ensure that GSMT will be affordable and cost-effective. Key to its success is adherence to a management approach that can address the technical and programmatic challenges presented by GSMT.

5.7.1 BREAKING COST PARADIGMS

Several authors have discussed the relationship between telescope size and cost (see References 1-5), and different values have been proposed for the exponent of the cost curve. For telescopes of basic design similar to the Palomar 5-m, the traditional cost scaling law is:

Cost ~ D^2.7

where D is the diameter of the primary mirror.

At one time, it was assumed that this empirical "law" would make telescopes larger than about 5-m aperture unaffordable (Reference 6). However, telescope builders began to realize that it was possible to take advantage of modern design approaches to effect significant cost reductions, largely by engineering telescopes to be smaller and lighter (see Section 4.1). In the 1970s and early 1980s, several telescopes were built that departed from traditional designs: the Multiple Mirror Telescope (MMT), United Kingdom Infrared Telescope (UKIRT), and the 2.3-m Advanced Technology Telescope (ATT) at Siding Spring Observatory. In each case, design innovations led to costs below the canonical cost curve. The Keck telescopes provide a recent and compelling example of how radical design innovations can "break the cost curve." A comparison with the Mayall 4-m telescope at Kitt Peak National Observatory (KPNO) illustrates the point dramatically.

The Mayall telescope cost $10.65M in 1970 dollars, equivalent to $34M in 1992 dollars. By comparison, Keck I, completed in 1992, cost approximately $100M. The ratio in diameter is approximately 2.5; raised to the 2.7 power, this predicts that a 10-m telescope would cost 12 times as much, or about $400M in 1992 dollars. Therefore, compared to the Mayall, Keck I was about a factor of four below the canonical cost curve. If the Keck design were scaled to a 30-m diameter using the D^2.7 scaling relationship, it would cost approximately $2B in 1992 dollars, and probably more than $3B in year-2012 dollars. The Astronomy and Astrophysics Survey Committee (AASC) decadal survey estimated that the appropriate cost of a 30-m GSMT should be ~ $600M. Meeting this estimate will require producing GSMT with an "innovation factor" similar to that which Keck achieved over the KPNO 4-m; that is, utilizing technologies and management approaches that will break the "Keck cost curve" by a further factor of about four.
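The scaling arithmetic above can be checked directly. A minimal sketch using the dollar figures quoted in this section (all in 1992 dollars):

```python
# Check of the D^2.7 scaling arithmetic quoted in the text.
# Costs in millions of 1992 dollars, as given in this section.
mayall_cost = 34.0   # Mayall 4-m, converted to 1992 dollars
mayall_d = 4.0       # aperture, meters
keck_d = 10.0

scale = (keck_d / mayall_d) ** 2.7            # diameter ratio ^ 2.7
predicted_keck = mayall_cost * scale          # what the "law" predicts
innovation_factor = predicted_keck / 100.0    # vs. actual ~$100M Keck I cost

print(f"(10/4)^2.7 = {scale:.1f}")                     # ~12x
print(f"predicted 10-m cost: ${predicted_keck:.0f}M")  # ~$400M
print(f"Keck innovation factor: {innovation_factor:.1f}")
```

The computed factor of ~4 is the margin by which Keck I undercut the canonical curve, and the margin a 30-m GSMT must achieve again relative to the "Keck cost curve."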

Beyond investments in key technologies, part of the paradigm shift required to meet this challenge is to examine the total life cycle cost of a GSMT as an observatory in far more detail than has traditionally been done for ground-based telescopes, and to adopt a design-to-cost management approach for the entire GSMT endeavor. This approach is described in the following sections, and in greater detail in Reference 9.

5.7.2 LIFE CYCLE COSTS

Construction cost is only the tip of the iceberg when it comes to the total cost of an observatory. The life cycle cost of a facility includes development, construction, operation, and support. The yearly cost of operating an observatory is typically about 10% of the construction cost, when instrument upgrades are included. Over the 20-30-year prime operating phase of an observatory, the operating cost will likely be two to three times the construction cost in constant dollars.
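The rule of thumb above reduces to a line of arithmetic. A sketch using the figures quoted in the text, normalized to construction cost:

```python
# Rough life-cycle arithmetic for the figures quoted in the text.
construction = 1.0           # construction cost, normalized to 1
annual_ops_fraction = 0.10   # yearly operations ~10% of construction cost

for years in (20, 30):       # prime operating phase, as stated in the text
    ops = annual_ops_fraction * years * construction
    total = construction + ops
    print(f"{years} yr: operations = {ops:.1f}x construction, "
          f"life cycle = {total:.1f}x")
```

Over 20-30 years, operations alone come to two to three times the construction cost in constant dollars, which is why life cycle cost, not construction cost, must drive the design.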

It is vitally important to minimize the life cycle cost of GSMT. The way to ensure this is to use design-to-cost methods.

5.7.3 DESIGN-TO-COST APPROACH

The design-to-cost approach has been described in the following manner (Reference 7):

"The essence of design to cost ... is making design converge on cost instead of allowing cost to converge on design. ... In design to cost, cost is elevated to the same level of concern as performance and schedule ... Realistic cost goals are established from early trades with performance and schedule, but not at the expense of the basic function the product or service is to provide, and never at the expense of quality."

Design-to-cost is a key concept used in Department of Defense acquisitions. It has also been applied to astronomical telescope projects, such as the Gemini Project. For GSMT to be affordable to the astronomy community, design-to-cost must be applied from day one.

A broad experience base with large, technically complex projects shows that 70% of the life cycle cost of such programs is determined by decisions made by the end of the conceptual design phase. After that time, it becomes increasingly difficult to accomplish significant savings. Cost reductions imposed later in the program will likely result in a sacrifice in performance and/or quality.

5.7.3.1 Value Engineering

One of the central concepts of design-to-cost is that of value engineering. Value engineering is the discipline of reducing life cycle costs while preserving the essential performance of the system. It requires finding ways to simplify designs and reduce acquisition and maintenance costs, as well as a clear understanding of what truly constitutes value to the customer (i.e., the astronomical community).

5.7.3.2 Cost Effectiveness

Another key concept in design-to-cost is that of cost effectiveness. This can be thought of as maximizing value to the customer per dollar spent. This is an important concept in evaluating alternative system designs and alternative operational models.

For a given mission, there is often a particular range of system size or complexity that optimizes cost effectiveness. If the system is made smaller or simpler, the cost will go down but the value often goes down faster. Similarly, if the system is made larger or more complex, the value goes up but often not as fast as cost. The optimum point depends on the performance goals of the system, and it will change with time as new technology becomes available.
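This trade-off can be made concrete with a toy model; the functional forms and numbers below are invented for illustration only. Suppose scientific value scales with collecting area (~D^2) while cost includes a fixed overhead plus a D^2.7 term; value per dollar then peaks at an intermediate aperture:

```python
# Toy cost-effectiveness curve (all numbers assumed for illustration).
def value(d):
    # scientific value taken ~ collecting area
    return d ** 2

def cost(d):
    # fixed overhead plus traditional D^2.7 size-dependent cost
    return 50.0 + d ** 2.7

# Search candidate apertures for maximum value per unit cost.
best = max(range(1, 31), key=lambda d: value(d) / cost(d))
print(f"most cost-effective aperture in this toy model: {best} m")
```

Below the optimum, fixed costs dominate and value per dollar falls; above it, the steep cost exponent takes over. The real optimum depends on the performance goals, as noted above.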

In order to evaluate the cost effectiveness of a proposed system, it is important to be able to express the value to the customer in terms of criteria that can be defined in precise, quantifiable terms. In effect, what is needed is a merit function that can be evaluated in relation to costs.

The following paragraphs discuss the two factors of cost effectiveness: defining scientific value and estimating cost.

5.7.3.2.1 Scientific Merit Function

How can the scientific value of a proposed telescope be quantified? We believe the first step is to understand the scientific context in which it will operate. In the case of GSMT, this context is defined in part by the synergies that will be possible with the James Webb Space Telescope (JWST), Atacama Large Millimeter Array (ALMA), and later, the Square Kilometer Array (SKA) and Constellation-X, along with the existing suite of 8-10-m telescopes. Based on an evaluation of GSMT's role in this context, its most important scientific goals will be defined and prioritized.

From these scientific goals, science requirements will be derived (an initial set of science requirements is listed in Chapter 3). However, in a design-to-cost approach, these requirements must be iterated along with cost estimates to optimize scientific return in relation to the cost. Instead of considering the requirements as specifications requiring formal approval for deviations, in the conceptual design stage it will be more effective to derive a scientific "merit function" that can be used as a guide for optimization.

A number of factors are likely to be important in constructing the merit function.

For each observing mode/scientific program, the dependence of scientific performance on the key system parameters should be described by a quantitative expression to define a figure of merit.  Proposed conceptual designs can then be evaluated to determine the values they provide in terms of each figure of merit.  The figures of merit can be combined with appropriate weighting factors to determine a quantitative merit function.

Where appropriate, the figures of merit should consider statistical weighting rather than absolute requirements. For example, rather than specifying a single standard that must be met under worst-case operational conditions, the figure of merit should relate to the distribution of predicted performance, perhaps based on average performance, or applying a higher weight to the performance under the best conditions.
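A merit function of this kind might be sketched as follows. The observing modes, weights, and performance numbers are invented placeholders, not proposed GSMT values:

```python
# Hypothetical merit function: a weighted combination of per-mode figures
# of merit, each evaluated over a distribution of observing conditions
# rather than a single worst case. All names and numbers are invented.
weights = {"hi_res_spectroscopy": 0.40,
           "wide_field_imaging":  0.35,
           "ao_imaging":          0.25}

# Statistical weights over (best / median / worst) seeing, emphasizing
# performance under the best conditions, as suggested in the text.
condition_weights = [0.5, 0.35, 0.15]

# Predicted figure of merit for one candidate design in each regime.
performance = {
    "hi_res_spectroscopy": [0.90, 0.80, 0.60],
    "wide_field_imaging":  [0.70, 0.65, 0.50],
    "ao_imaging":          [0.95, 0.60, 0.20],
}

def merit(perf, mode_w, cond_w):
    total = 0.0
    for mode, w in mode_w.items():
        # statistically weighted figure of merit for this observing mode
        fom = sum(p * cw for p, cw in zip(perf[mode], cond_w))
        total += w * fom
    return total

print(f"merit function for this design: "
      f"{merit(performance, weights, condition_weights):.3f}")
```

Alternative conceptual designs would each be scored this way, and the scores compared against the corresponding cost estimates.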

It will take effort to set up this type of merit function, both to reach a consensus on the priority of the science requirements and to derive a weighting scheme that truly reflects the scientific value. However, the better this can be defined in a quantifiable manner, the more likely that GSMT will be affordable and cost-effective, yielding the highest scientific return for the investment.

5.7.3.2.2 Cost Estimating Methods

When evaluating alternative designs, we need to be able to accurately estimate the cost of each and understand the dependence of cost on the key factors in the design (e.g., aperture, field of view, etc.). Several cost estimating methods are in common use (Reference 8):

  1. Expert opinion - Estimate is based on judgment of experts in the field. If there is little historical data on which to base an estimate, this approach can provide a number to serve as a placeholder.  The credibility of this method depends on the independence of the expert; if the estimator is an advocate for or opponent of the program in question, the estimate has less credibility because of the potential for bias.  An added concern is that technical experts are not always experts on cost estimating, and some telescope and instrument projects have cost 2-5 times what the "experts" initially predicted.

  2. Analogy - Estimate is based on cost data from previous programs. This approach can provide a good starting point or a valuable sanity check on a more detailed estimate.  Usually, adjustments are needed to account for differences in project size or complexity, and to account for inflation.  The accuracy of the estimate depends on the similarity of the programs and the expertise of the person doing the adjusting.  If large adjustments are needed because the programs are of significantly different scale or complexity, the validity of the estimate declines.

  3. "Industrial Engineering" - This is a bottom-up approach that starts with a detailed work breakdown structure and estimates the cost of each item using commercial prices, vendor quotes or cost estimates, standard costs derived from time-motion studies, and so on. This approach requires a lot of investigation and must be repeated in detail for each alternative design considered.  Therefore, it is best suited to later phases in the project, after detailed designs have been developed.

  4. Parametric - The parametric method evaluates cost data from various sources and derives parametric relationships that express the dependence of subsystem cost on variables that characterize the size or complexity of the subsystem. In this approach, extensive data are collected from similar previous programs, and cost-estimating relationships (CERs) are derived based on measurable parameters, such as maximum power, total weight of steel, etc. This approach is similar to the "industrial engineering" approach in that it requires a lot of investigative effort, but once the CERs have been derived, they can be used to estimate the cost of many alternative designs with only a little additional effort.

As discussed in section 5.7.4 below, parametric cost estimating is the preferred estimating technique for design-to-cost trade studies during the conceptual phase of a project, and it is well matched to the approach of optimizing cost and performance in the context of a "science merit function."

Considerable effort must be invested to collect the information on which the CERs can be based. Because there are not yet any completed 30-m projects to learn from, cost data from existing radio telescopes and 8- to 10-m optical/IR observatories will be used to provide a database. Additional information can come from cost estimates from vendors and consultants, although in general, estimates are not as good a basis as actual costs. This appears to be an excellent area for collaboration with other ELT (extremely large telescope) groups, not only because they have a similar need to develop cost estimates, but because in many cases the same groups also have cost information from having recently constructed a large observatory.

Once a database of costs from recent telescope projects has been assembled, it will be possible to determine CERs by multiple regression analysis. This will help us understand the key parameters that predict project cost.
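As a sketch of how such a regression might look, the following fits a single-parameter power-law CER, cost = a * D^b, by least squares in log space. The (diameter, cost) pairs are invented for illustration, not actual observatory data; a real analysis would regress on many parameters across many subsystems:

```python
import math

# Illustrative derivation of a cost-estimating relationship (CER):
# fit cost = a * D^b in log space. Data points are invented placeholders.
data = [(4.0, 34.0), (8.0, 90.0), (10.0, 100.0), (6.5, 70.0)]  # (D in m, $M)

xs = [math.log(d) for d, _ in data]
ys = [math.log(c) for _, c in data]
n = len(data)
mx = sum(xs) / n
my = sum(ys) / n

# Ordinary least-squares slope and intercept in log-log coordinates.
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = math.exp(my - b * mx)

print(f"fitted CER: cost = {a:.1f} * D^{b:.2f}")
```

The fitted exponent can then be compared with the traditional 2.7 scaling law; for these invented points it comes out well below 2.7, which is the kind of signature one would hope to see in data from modern, lighter-weight designs.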

5.7.4 NEXT STEPS

To minimize the life cycle cost of GSMT, key decisions must be made during the conceptual phase regarding the telescope configuration, size, operational model, unique capabilities, and so on. The basis for these decisions will come from trade studies that consider the scientific merit and associated cost of several alternative designs and associated operational models. In each case, some iteration of the science requirements will be allowed, while tracking the changes in scientific merit and life cycle costs associated with each alternative. The cost effectiveness of each alternative will be determined, and the best design concept will be chosen. By the end of the conceptual design phase, the design concept, revised science requirements, and cost goals will be established.

To be effective, the trade studies should be conducted as early in the project as possible, and they should be undertaken in the most efficient manner possible. The key to this will be to make advance investments that prepare us to "hit the ground running" once the project is formally started. The following activities, discussed in other parts of the book, are viewed here as necessary advance investments that will make possible an effective design-to-cost process.

5.7.4.1 Define the Science Context and Goals

The steps involved in reaching a community consensus on the science goals are outlined in Next Steps. The GSMT Science Working Group (SWG) will examine the role of GSMT in the era of JWST, ALMA, and other forefront facilities of the next decade. The SWG will develop a consensus on the relative priority of the various scientific missions of GSMT, and an understanding of the most important science requirements. From this start, a smaller group reporting to the SWG will derive a merit function that reflects the community consensus.

5.7.4.2 Develop a Point Design

The point design can be thought of as a tool we have invested in, one that provides an opportunity to identify key technical issues and highlight design factors important to fulfilling the science requirements. The point design provides a strawman that can be used as a basis for developing modeling and cost estimating methods.

The point design cost estimate gives us information that will allow a Pareto analysis (see box) to determine the most important cost drivers in an ELT. It will indicate the areas we should emphasize in cost reduction studies.

The Pareto principle is a term coined by noted quality expert J. M. Juran to describe the observation that in evaluating any problem, there are a "vital few" causes that contribute the majority of the defects, and a "trivial many" that contribute only a few. Juran named this principle after the Italian economist Vilfredo Pareto, who formulated similar ideas about the vital few and trivial many in regard to the distribution of wealth. Pareto analysis is the process of identifying the vital few causes so that the maximum benefit can be obtained by concentrating attention where it will do the most good.
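A Pareto analysis of a cost estimate takes only a few lines; the subsystem names and dollar figures below are hypothetical placeholders, not GSMT estimates:

```python
# Sketch of a Pareto analysis over an invented cost breakdown ($M).
# All subsystem names and figures are hypothetical placeholders.
costs = {
    "primary mirror segments": 180, "enclosure": 120,
    "telescope structure": 95, "adaptive optics": 60,
    "instruments": 55, "site work": 30, "control system": 20,
    "software": 15, "project office": 15, "misc": 10,
}

total = sum(costs.values())
running = 0.0
print("cumulative cost fraction by subsystem:")
for name, c in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
    running += c
    print(f"  {name:24s} {running / total:5.1%}")
# The "vital few" are the top entries that together dominate the total;
# cost-reduction studies should concentrate there.
```

In this invented breakdown, the top three subsystems account for roughly two-thirds of the total, which is exactly the pattern Pareto analysis is designed to surface.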

5.7.4.3 Develop Parametric Cost-Estimating Relationships

In addition to estimating the cost of the point design, we are collecting information that will allow derivation of CERs to use in parametric cost estimating. This information includes construction and operation costs of existing large observatories, cost estimates from vendors, and so on. It is our intention to evaluate this information in collaboration with other astronomy organizations that are interested in estimating the cost of future projects, thereby providing a uniform and objective cost estimating method that can be used to compare alternative designs.

5.7.4.4 Develop Integrated Modeling Methods

To evaluate the merit of a proposed telescope design requires predicting its performance. Complex hierarchical control systems will be needed for a 30-m telescope to deliver diffraction-limited images in the presence of disturbances such as atmospheric seeing and wind-buffeting. Simulation and analysis of the performance of GSMT, in a statistical sense rather than just a worst-case sense, will require modeling: atmospheric seeing; response of the telescope structure to disturbances; effect of the disturbances on optical performance; and the ability of interrelated control systems to correct the disturbances in real time. This type of simulation is called "integrated modeling." Our plans for integrated modeling are described further in Section 6.1.

To develop the integrated modeling capabilities necessary to evaluate alternative telescope design concepts will require an investment of time and money. However, once we have brought our design team up to speed in doing this type of modeling and the necessary computational tools have been purchased and/or developed, creation of additional models will be straightforward and efficient. We view the creation of an integrated modeling capability as a necessary investment that should be made before the start of the formal conceptual design phase, and a key aim of this investment is to serve the goals of multiple community efforts to explore the performance of proposed GSMT concepts.

5.7.4.5 Technology Development

The design concepts envisioned for GSMT require extending technology in several areas, both in terms of technical feasibility (e.g., high-order deformable mirrors) and reducing cost (e.g., cost-effective segment fabrication). The necessary areas of technological development are described in several sections of this book and are summarized in Section 6.1.

Investments must be made over the next year or two to develop the technology that will be needed to make GSMT successful. Unless these technological solutions are sufficiently mature by the time of the conceptual design phase, it will be far more risky to incorporate them into GSMT, with potential long-term effects on the telescope's performance and life cycle cost. Because the funds available to any single ELT project are limited, it will be of great advantage to the community for all projects to work together to ensure their efforts are complementary whenever possible. We describe plans to facilitate such collaborations in Section 6.1.

5.7.4.6 Site Selection

The choice of site drives the technical requirements in terms of environmental conditions (wind, temperature, seismic accelerations) and in terms of the local topography (type of underlying rock, available space on the summit). It also can be a significant cost driver. Therefore, the choice of site is tied to the evaluation of alternative designs, and an understanding of the cost implications of the site choice will be important in determining the cost targets for the rest of the project. In addition, it is crucial in the process of site selection that the evaluation criteria consider the life cycle costs of constructing and operating a GSMT at the specific site being examined.

Consideration of viable sites will take several years. Site selection involves identification of the most promising sites, on-site data collection for an extended period, analysis of the data, a quantitative assessment of the construction and operating overheads of each candidate site, and crucially, a decision making process based on design-to-cost.

It is clear that site selection is another area where an advance investment of effort is required. Consequently, we have been pursuing an aggressive site testing program for more than two years.

5.7.5 REFERENCES

  1. Whitford, A. E. Ground Based Astronomy: a Ten-year Program. Washington, D.C.: National Science Foundation (1964).

  2. Meinel, A. B.; Meinel, M. P. "Some comments on scaling law information relating to very large telescope cost goals". Optical and Infrared Telescopes for the 1990's, pp. 1027-1042. Tucson, AZ: National Optical Astronomy Observatory (1980).

  3. Schmidt-Kaler, T.; Rucks, P. "Telescope costs and cost reduction". Proc. SPIE 2871, 635 (1996).

  4. Sebring, T.A.; Moretto, G.; Bash, F. N.; Ray, F. B.; Ramsey, L. W. "The Extremely Large Telescope (ELT), A Scientific Opportunity: An Engineering Certainty". ESO Conf. Proc. 57, 53 (1999).

  5. Andersen, T.; Christensen, P. H. "Is There an Upper Limit to the Size of Enclosures?". Proc. SPIE 4004, 373 (2000).

  6. Learner, R. "The Legacy of the 200-inch". Sky & Telescope, 71, 349 (April 1986).

  7. Michaels, J. V. and Wood, W. P. Design to Cost. p.1. New York, NY: John Wiley and Sons (1989).

  8. Ibid., p. 290.

  9. Stepp, L.; Daggert, L.; Gillett, P. "Estimating the costs of extremely large telescopes". Proc. SPIE 4840 (2002).

November 2002