Scale down the library proposal
Original post made
by Diana Diamond, Palo Alto Online blogger,
on Feb 20, 2008
As the cost of renovating and redoing three of our libraries in Palo Alto keeps rising, my enthusiasm for the project keeps falling.
Once proposed as a $45 million endeavor for one library, the current estimate to rebuild Mitchell Park Library, add a community center, and renovate the Main and Downtown libraries is now priced at a whopping $81 million.
If residents agree to fund a bond measure, the estimated property-tax increase for the average Palo Alto homeowner is $180 a year. Over the 30-year duration of the measure, a homeowner will spend $5,400 in library property taxes. I could buy a lot of books with that amount.
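For anyone who wants to check that figure, the 30-year total follows directly from the annual estimate. A trivial sketch in Python, using only the dollar amounts quoted above:

```python
# Estimated annual property-tax increase for the average Palo Alto
# homeowner under the proposed bond measure (figure cited in the column).
annual_increase = 180   # dollars per year
term_years = 30         # duration of the bond measure

total_paid = annual_increase * term_years
print(f"Total over the life of the bond: ${total_paid:,}")  # $5,400
```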
Yet I am in favor of renovating our libraries and want the library bond measure to pass. But I worry that residents will not support a project with an $81 million price tag.
Remember that just four years ago Palo Altans turned down Measure D, a $49 million bond that called for 63,000 square feet of new space, rebuilding Mitchell and the Community Center, and renovating the Children's Library.
So how did we get to $81 million and what can we do about the cost, since so many of us love libraries?
Well, building costs have gone up 10 percent a year for the past two years, and the price of steel and cement is spiraling. But that alone can't account for this giant escalation.
Another factor: residents were once again asked what they wanted in these libraries and their ideas were incorporated into the library design work. Palo Alto residents have a plethora of ideas.
Council member Yoriko Kishimoto said the cost also went up because polls indicated there would be more support if the Main and Downtown libraries were included in the bond-measure package. These two renovations account for $31 million of the $81 million. Perhaps, but I wonder if people were told the amount of additional money involved with these inclusions.
What would we get for $81 million? At Mitchell, plans call for a new expanded library, up from 9,500 square feet to 36,000, and a 15,000-square-foot Community Center (up from 10,000). Total project cost: about $56 million.
At the Main Library, the proposal calls for spending $26.5 million on a 3,700-square-foot addition (from 26,300 to 30,000 square feet), most of which consists of a group-study area, a community meeting room that seats 100, more bathrooms and a new vestibule so the meeting room can be accessed when the library is closed.
The $26.5 million also includes new heating, lighting and a facelift.
The Downtown renovation would cost $4.5 million with no additional space. Technical Services would move to Mitchell, so Downtown would have more room for public space and would get back its small community meeting room.
But my calculator adds $56 million, $26.5 million and $4.5 million and comes up with $87 million, creating an unexplained $6 million gap above the $81 million bond measure.
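The gap is easy to reproduce. A quick Python check using only the component costs quoted in this column:

```python
# Component cost estimates quoted above, in millions of dollars.
components = {
    "Mitchell Park library + community center": 56.0,
    "Main Library addition, systems, facelift": 26.5,
    "Downtown branch renovation": 4.5,
}
bond_measure = 81.0  # the revised bond-measure estimate, in millions

total = sum(components.values())
gap = total - bond_measure
print(f"Components total ${total}M; ${gap}M above the ${bond_measure}M measure")
```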
I know design plans are in place, but it was just a couple of weeks ago that we were hit with the revised $81 million cost estimate.
Might I suggest our council gain resident confidence by scrutinizing the proposed facilities to decide whether all the features in the library plans are really needed, and also ask whether we should even renovate the Downtown branch.
For example, the Mitchell Library plans call for a dedicated children's area, an acoustically separated teen area, a technology-training room, group-study rooms and a program room. Other than the training room, is there much here for adults?
If we worry about places for our children to study after school, why not open more after-hours school libraries? Not as much fun as Mitchell, but they would cost a lot less.
Do we need group-study areas at Main, or a teen area? Could the community meeting room be entered without having a new vestibule? Could the money be better spent on, say, books?
I realize that our libraries have not been renovated for 50 years, and I absolutely believe they need redoing. Mitchell and Main are ragged around not only the edges but the insides.
I also understand that what we do now to the libraries will last decades. And I want libraries that we are proud of because they look and feel so good.
Nevertheless, a couple of questions keep nagging at me:
Why is it that San Jose seems to keep its library costs way down? Two years ago San Jose opened a 28,000-square-foot facility that cost $18 million at the same time Palo Alto's cost estimates for a 30,000-square-foot facility were coming in at $45 million. San Jose recently opened another similar-sized library that cost $21 million.
I wonder if our Public Works Department sets any limits when asking for bids for the library project. In some communities, when projects go out for bid, developers are told the city can only afford to spend $XX million, and architects submit designs that are within those financial constraints.
Council members were recently told by staff that any changes to the existing design plans would be expensive and cause delays. Council members want a November ballot measure because of escalating construction costs. So they are in a bind.
I don't agree with the staff analysis that changes would be too costly to consider. The goal is to get the measure passed, and if modifications would save residents money, let's modify so we can get the bonds approved.
Posted by Mike
a resident of College Terrace
on Feb 29, 2008 at 9:53 pm
Should have posted this earlier - lots more where this comes from:
Public Library Benefits Valuation Study
1. The study clarified the usefulness of using recognized CBA methods of contingent valuation as a basis for calculating a dollar estimate for all five cities. The contingent-valuation methodology works well in a large public library setting. The study demonstrated that cost benefit methodology is a tool well adapted to measuring the direct benefits of library services.
2. Recognizable methods of cost benefit analysis used in many other kinds of CBA studies were used to measure the direct benefits of library services to each class of patrons. The project team calculated direct benefits for general users, teachers, and business users. A team of market researchers and economists designed the survey instrument. The survey was branched and tailored by a process of self-selection on the part of the survey respondent: general user (household), teacher, or business. This branching included both categorical and open-ended questions. An average interview took 25 minutes.
3. When subjected to standard statistical tests for reliability, the study proved to be valid and reliable. The tests indicated that the survey produced an accurate measurement of services based on accurate responses by those surveyed.
4. Based upon their answers to similar questions, the study demonstrated that different user groups receive different levels of benefits from library expenditures. The general user was asked consumer-surplus (CS), willingness-to-accept (WTA) and willingness-to-pay (WTP) questions. Teachers were asked about their professional use of the library with consumer-surplus and willingness-to-accept questions. Business users were also asked consumer-surplus and willingness-to-accept questions. Initially, the team attempted to query caregivers (professionals at senior centers, nursing centers or retirement homes) as a separate user group. The number of respondents in the latter category in all libraries was too small to treat the results as a separate sub-sample. The direct-user benefits to the various groups varied considerably from city to city, as outlined in Appendix D.
5. The outcome of the study is defensibly conservative as the research team intended at the outset.
a. They are conservative because the study captures benefits to cardholders only. No benefit estimation was attempted for walk-in or virtual visitors who did not hold cards.
b. Neither were there estimates of benefits to third party beneficiaries.
c. The consumer-surplus estimates are conservative because they are based on very conservative pricing of library services, and direct benefits are calculated only if those surveyed expressed them as substitutes for library services.
d. Finally, capital benefits exclude depreciation. Thus, the estimated returns to capital compare annual benefits with an overstated capital stock, yielding a conservative estimate of the annual return.
6. Annual local taxes spent for library operations yield substantial direct benefits. Each library returns more than $1 of benefits for each $1 of annual taxes. Baltimore County Public Library returns $3-$6 in benefits per tax dollar. Birmingham Public Library returns $1.30-$2.70 in benefits per tax dollar. King County Library System returns $5-$10 in benefits per tax dollar. Phoenix Public Library returns over $10 in benefits per tax dollar. SLPL returns $2.50-$5 in benefits per tax dollar.
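The per-dollar figures above are simple benefit-to-cost ratios: annual direct benefits divided by annual tax support. As an illustration only (the two dollar totals below are hypothetical, not taken from the study):

```python
# Hypothetical example only: neither figure comes from the study.
annual_direct_benefits = 25_000_000   # dollars of direct benefits to cardholders
annual_tax_support = 10_000_000       # dollars of local taxes spent on operations

benefit_per_tax_dollar = annual_direct_benefits / annual_tax_support
print(f"${benefit_per_tax_dollar:.2f} of benefits per tax dollar")  # $2.50
```

A library reporting this hypothetical ratio would fall in the same range the study cites for SLPL ($2.50-$5 per tax dollar).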
7. Each library studied yields a good return on invested capital. Baltimore County Public Library returns a minimum of 72%. Birmingham Public Library returns a minimum of 5%. King County Library System returns a minimum of 94%. Phoenix Public Library returns over 150%. SLPL returns a minimum of 22%.
a. Shortly after completing the IMLS CBA study and before publicizing its results, Phoenix Public Library participated in a city-wide bond referendum that will expand its capital assets by 20% over 5 years. The referendum passed with more than 75% voter support. The overwhelming strength of this majority confirms the public's (and cardholders') perception of the high social rate of return to the public's investment in library assets, consistent with the results of the CBA study.
b. The return on invested capital measurement and that for return on annual taxpayer investment are both summarized in the seminar casebook, Libraries Are Valuable: Prove It! (Appendix A).
8. The methodology detected differences in benefit streams flowing from different levels of investment. The CBA methodology is sufficiently fine-grained to detect differences in levels of benefits that flow from different levels of support for various areas of library activity. St. Louis Public, for example, had higher levels of benefits from children's services than did King County, which invests a lower percentage of its annual taxpayer investment in youth services. Not surprisingly, the difference in cardholder categories in different systems also affects CBA benefit-stream outcomes.
9. Consistency, however, proved to be the theme of the various studies, especially when calculations were made for categories of library benefits. In the case of all five libraries, the calculated benefits fell in the following order: 1) materials for adults, on average 35%; 2) staff interactions, on average 30%; 3) materials for children, on average 20%; and 4) library technology, on average 15%. Of these, the most problematical was technology, because comments that those surveyed made during their exchanges with interviewers often indicated that they were placing technology benefits implicitly into other categories (e.g., electronic newspaper and magazine databases were thought of as adult materials, not technology).
10. CBA has considerable value as a communications tool. Not unexpectedly, the first persons to utilize the CBA findings were the directors of the systems in which the economic analysis was accomplished. They addressed the CBA findings to diverse audiences.
a. The director of Birmingham Public states that his system is using the CBA analysis because it A) "provides a good marketing tool," B) "demonstrates library effectiveness," C) is useful "in planning and decision making," and D) "helps place a dollar value on services."
b. Using the CBA findings, the director of Baltimore County developed a presentation for staff and the public entitled "The Baltimore County Public Library: A Great Investment Any Way You Measure It." The Baltimore County presentation concludes with these phrases: "The Bottom Line Formula: Effectiveness + Cost Effectiveness = A Really Good Deal for the Public." That system also melded the statistical findings of the CBA study with findings from a customer-service survey to estimate the extent to which benefit streams were flowing from service investment dollars.
c. The director of King County used his institution's CBA findings to develop a presentation for library employees entitled "Inside Story: Letting Staff Know How Great They Are." This program oriented staff to the significance of their work in the system and helped stress the continuing importance of in-service training in providing quality service.
d. The director of St. Louis Public Library provided programs to staff and board demonstrating how the system provided higher benefits streams than a dollar-for-dollar payback. More significantly, SLPL has been using its CBA results as an effective fund-raising tool in communicating with the community's conservative private-donor community.
e. The director of Phoenix Public Library and Professor Elliott, the project consultant, made presentations of the study's results to members of the Phoenix City Council and, in a second presentation, to the Library's Advisory Board, staff, state librarian, and mayor's aide. Copies of these presentations can be found in Sections 2 and 4 of Libraries Are Valuable: Prove It! (Appendix A), which is submitted with this report. Copies of the Phoenix presentations are provided in Appendix D.
11. Quality of library databases is critical for successful completion of the survey. The most problematical element in this study was the statistical character of library-user databases. A library that has not taken considerable care in creating or maintaining its user database should not undertake a CBA study of the type described in this report. Databases have to be relatively accurate to guarantee an appropriate rate of completion of telephone surveys. In several of the study sites, missing telephone numbers in cardholder fields lowered the completion rates, and the researchers had to ask the participating library systems to obtain missing data on cardholders before the telephone surveys could be started.
12. Population demographics can affect survey outcomes. Phoenix, known for its seasonal residents and diverse ethnicity, presented this study's most serious challenge in implementing the survey design.
a. In two independent samples of the Phoenix database in surveys taken in two different seasons (April and September/October), approximately 30 percent of the cardholders who were active at some time during the previous 12 months had moved or changed phone numbers.
b. The response rate to both surveys of general users in Phoenix was about 18%. Data for the general user surveys were weighted in proportion to the frequencies of cardholders by library branch to correct for any possible response bias.
c. To obtain a sufficient number of educator responses, a list of Phoenix public school teachers was matched against a sample of Phoenix cardholders.
d. For additional information on the Phoenix study, see Appendix D.
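The branch weighting described in (b) can be sketched in a few lines: each respondent's weight is the branch's share of the cardholder database divided by its share of the survey sample. A minimal Python illustration, with hypothetical branch names and counts (none of these numbers are from the study):

```python
# Hypothetical cardholder counts by branch (population) and survey
# respondents by branch (sample); all figures are invented for illustration.
cardholders = {"Central": 50_000, "Desert Sage": 30_000, "Ocotillo": 20_000}
respondents = {"Central": 200, "Desert Sage": 150, "Ocotillo": 50}

pop_total = sum(cardholders.values())
samp_total = sum(respondents.values())

# A respondent's weight makes the weighted sample's branch mix match
# the branch mix of the full cardholder database.
weights = {
    branch: (cardholders[branch] / pop_total) / (respondents[branch] / samp_total)
    for branch in cardholders
}
for branch, w in weights.items():
    print(branch, round(w, 2))
```

Branches over-represented among respondents get weights below 1, under-represented branches get weights above 1, and the weighted sample size still sums to the original number of respondents.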
13. The study team cautions against comparing the benefit estimates across the five libraries studied. The benefit measures are designed conservatively to provide a defensible lower bound to the annual benefits of each library, not as unbiased estimates of each library's annual benefits. For this reason, comparisons across libraries are fraught with problems. Nevertheless, some observations are apparent. For example, average or median family disposable income is correlated with benefits per household across cities: King County and Baltimore County households reported higher benefits per household than the central cities of Birmingham, Phoenix, and St. Louis.
14. Value of service/user matrix. It is possible to measure the nature and extent of economic benefits received by each class of patron for each type of service used. Classes of patrons can be identified by cardholder type and/or by self-identification. Whatever the means of differentiation, care has to be taken because user types tend to overlap. An example of the calculation of benefits for general users is illustrated in Appendix E (not available in electronic format).
15. Some CBA measures more useful than others. As the CBA literature predicts for the whole range of activities, consumer-surplus and willingness-to-pay benefits estimates of library services were more accurate than willingness-to-accept measurements. The researchers also found that the cost-of-time measure that had been considered at the beginning of the project was less useful than other CBA study methods. This methodology, therefore, was not reported in the study results.
16. CBA measured the benefits from both public and private dollars. Return on taxpayer investment calculations, in addition to tax-dollar benefits, can assess the benefits of private contributions, foundation grants, and grants from different levels of government.
17. The study produced a replicable methodology, but one that is not without high expense. The biggest expense was the cost of surveys, and this expense was based on the amount of detail that the research team was attempting to capture. Based upon the experience in this project, the researchers recognize that they need less detail to produce reliable results. The costs of future CBA studies therefore can be lower than this one.
18. The study reports the distribution of benefits by category of user: general users, business users, and educators.
a. The central city libraries place higher priority on business users in implementing their missions than do the suburban library systems, and this priority is reflected in the distribution of benefits. Business users received 18-22% of all benefits in Birmingham, Phoenix, and St. Louis versus only 12% of all benefits in Baltimore County and 6% in King County.
b. Similar differences in mission emerged for educational use. The central city libraries strive to be stronger partners to urban schools than the suburban systems are to their school systems. St. Louis educators received 14% of all benefits; Phoenix educators, 11%; and Birmingham educators, about 10%. In the suburban systems, Baltimore County educators received about 9%.
c. Thus, in each case, the benefits streams reflected the relative emphasis and financial effort that each system made to support these constituencies.