Nov. 11, 2016 Meeting Agenda + Notes

Participants: Alan Heyvaert (DRI), Marc Pitchford (DRI), Sudeep Chandra (UNR), Steve Tyler (UNR), Geoff Schladow (UCD), Steve Sadro (UCD), John Melack (UCSB), Pat Manley (PSW), Matt Busse (PSW), Ramon Naranjo (USGS), Todd Ferrara (CNRA), Jim Lawrence (NDCNR), Zach Hymanson (CNRA), Jennifer Carr subbing for My-Linh Nguyen (NDEP), Dan Segan (TRPA), Julie Reagan (TRPA), Patrick Wright (CTC), Shane Romsos (SIG), Alison Toy (UCD)

1. Welcome and Recap

Geoff welcomed all attendees and asked for introductions. He then reviewed the proposed agenda. No changes were made to the agenda.

2. Update on Council funding, contracting, and operations

Zach confirmed the existing Council budget is $450,000, which has been allocated according to the approved work plan. TRPA has agreed to administer these funds on behalf of the Council. Additional funding ($150,000) is expected to become available with approval of the State budget (July 2017). The Council is still relying on a single funding source.

The first subcontract between UCD and TRPA provides funds to support Council operations, communications, and website development. The contract is at UCD for review.

Next group of contracts will support substantive work by Council members. Dan sent Council members a model contract that TRPA has used for consultant contract work. It is a general contract that allows for work orders to specify the work along with other details (e.g., funding, timeline, and products).

The model contract has been through two rounds with UNR; TRPA and DRI have essentially come to terms; UCD is still reviewing language used in a previous agreement with TRPA; and TRPA is getting close with USGS. Council members mentioned that intellectual property and how it’s handled will be an important issue. TRPA does not have a stance as the contracting agency. The standard operating procedure is that the universities have unfettered access to data collected as part of the contract work, as does whoever is funding the project. Standard clause: 30 days’ notice to TRPA that data or information is being published or made public.

John ran the model contract language through the UCSB contracts office. Issues with the contract insurance language were flagged: their read is that everything lands on the shoulders of UCSB. Dan stated the proposed language is standard two-party contract language from TRPA. Geoff noted that the UCD sponsored programs office has accepted this language. He suggested having UCSB and UCD sponsored programs office staff talk directly.

ACTION: Dan and TRPA contract staff will continue working with all respective Council entities to come up with a contract that all can use. Dan is hopeful that these contracts can be completed at the beginning of 2017, although the holidays are a factor. Geoff thanked Dan, on behalf of the Council, for his work to get these contracts in place.

Zach summarized the indirect cost rate (ICR) discussion during the Bi-state Executive Committee. Zach has some latitude to negotiate a reduced indirect cost rate, with help/advice of the co-chairs. The current ICR with UCD is 25% for projects entirely funded with state of California funding. This rate will increase to 35% in 2017. UNR has agreed to accept the same ICR as UCD for Science Council work. Zach is in discussions with DRI to negotiate a reduced ICR. He expects to have an answer by the end of November. PSW ICR is 15%. USGS ICR is roughly 81%, and there is no negotiation possible.
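For context, here is a minimal worked sketch of how an ICR scales a project budget; the $100,000 direct-cost figure is hypothetical, and the rates are those stated above:

    # Hypothetical example of how an indirect cost rate (ICR) scales a budget.
    # The $100,000 direct cost is assumed; rates are from the discussion above.
    def total_cost(direct_costs, icr):
        """Return direct costs plus indirect costs at the given rate."""
        return direct_costs * (1 + icr)

    direct = 100_000  # assumed direct costs for one work order
    for org, icr in [("UCD (2016)", 0.25), ("UCD (2017)", 0.35),
                     ("PSW", 0.15), ("USGS", 0.81)]:
        print(f"{org}: ${total_cost(direct, icr):,.0f} total")

Under these assumptions, a $100,000 work order would cost $125,000 total at UCD's current 25% rate, but $181,000 at the USGS rate.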

Zach sent out travel reimbursement forms to all Council members. They seem straightforward, but if there are any questions, contact Zach or TRPA staff. TRPA strongly prefers making electronic payments, which will require you to provide an account number and routing number.

There is still one Council member position available (UC systemwide position). Scott Stevens, a fire ecologist at UC Berkeley, declined the offer to join. Scott suggested contacting Max Moritz (UCB), who is also a fire ecologist; Max is now working at UCSB. Zach asked the Council members for other suggestions of expertise they would want to add to the existing Council. Suggestions have been made for finding someone with a 'forest background,' e.g., landscape ecology, forest management, forest ecology, or forest soil and water. There also was a suggestion for an ecological statistician, someone who could help think about monitoring in the long run, and a suggestion to include someone with expertise in researching the effects of recreation, possibly a social scientist. Todd acknowledged that we do want someone with good technical expertise, but they also must be a good contributor at the meetings.

Matt noted that he can always reach out internally (USFS/PSW) to statisticians. Marc noted that all Council members are able to reach out internally for additional expertise. Matt agreed, and suggested the Council should not focus on covering all disciplines.

ACTION: John will work to identify possible candidates in the UC System outside of UC Davis who can contribute social science/economics/sociology expertise. Zach will work with Alan and Geoff to contact Max Moritz to determine his interest.

3. Schedule for 2017 Council meetings

The MOU establishing the Science Advisory Council says the Council will meet four times each year. Zach initially suggested a quarterly meeting schedule, and that Council members choose a specific day of the month. Once things get going, meetings may run longer as projects pick up. Alan questioned whether meeting every third month allows enough meeting time during the Council's formative stage; he suggested every other month would be a more proactive schedule.

ACTION: Council members decided to meet every other month starting January 2017 through July. The meeting day is the 3rd Thursday of the month, so the specific meeting dates are January 19, March 16, May 18, and July 20, 2017. Meetings will start at 10 AM. The meeting location is room 119 of the Tahoe Center for Environmental Sciences building. Meetings beyond July will be scheduled later in 2017.

Sudeep asked if it's possible to set up video conferencing. Weather might necessitate video conferencing, and travel is costly for some Council members. Other Council members supported this idea.

ACTION: Alison and Zach will work with UCD information technology staff to determine what equipment is needed. Zach will pursue equipment acquisition with TRPA staff.

4. Council review of threshold assessment

Zach provided some general background regarding the Tahoe Regional Planning Agency (TRPA) threshold standards. TRPA currently has more than 170 threshold standards, and the majority of these standards were established in the 1980s. TRPA is required to evaluate the status and trend of environmental conditions in the Tahoe basin relative to the established standards every 4-5 years. These evaluations have occurred since the late 1980s. The last two evaluations (2011 and 2015) have undergone independent technical review. Both reviews identified substantial deficiencies with the threshold standards, and recommended TRPA undertake a wholesale assessment of the standards.

In 2015 the TRPA governing board and staff identified assessment and update of the threshold standards as one of several high priority initiatives for the agency. An initial assessment methodology was prepared by TRPA staff as part of the 2015 threshold evaluation report. This assessment methodology was subjected to independent peer review; however, staff has not addressed those comments. The assessment methodology and the peer review comments have been provided to the Science Council with a request for the Council to critique these materials, and provide comments to staff that focus on recommended revisions to the assessment methodology. TRPA staff’s intent is to critically examine the entire threshold system: threshold standards and the associated monitoring.

According to Dan Segan, there is broad support within TRPA for having the Science Council look at the threshold standards. The hope is that the Council can identify deficiencies in the standards and help determine where to focus energies. There is recognition that there could be a stronger nexus rooted in current science.

Marc noted that revisiting the thresholds has always been a dicey endeavor. Now is the time for a united spirit of collaboration and care, a time when different constituents can offer different viewpoints and come to a common point. It was a superb time in the 1970s when these standards were being put together, but current conditions are different and we want to ensure we're tracking the correct things. The Council can't do everything; it must stay focused to help things get funded.

The last threshold evaluation cost about $1 million. The funding the Council has available will not support a complete update of the threshold system. Developing a sound assessment methodology, and testing that methodology on a subset of the standards, is what the Council is shooting for. The hope is that outcomes of the Council's work with TRPA staff will: (1) yield an assessment methodology that has been put to use; (2) generate assessment results that reveal what is needed; (3) produce products that groups can use to debate and act on; and (4) give the Council and TRPA an idea of what the costs will be for a full update of the threshold system.

Geoff: the goal for the Council's work should be to provide a product that meets the needs of agencies. We run the risk of not fulfilling this goal if we don't have some idea at the outset of how much money the agencies expect to spend on implementation of the threshold system. Ideally that information is available at the start. There's a balance between the kinds of data and information people want as part of the threshold system, and what agencies and elected officials are willing to spend. Do the agencies have an idea or preconceived notion as to costs or funding available?

Sudeep seconded this issue: given the finite resources available now, focus on one thing. What do the agencies want? Get that on the table early on, and work it out from there.

Dan and Shane noted that TRPA has gone through what has been spent on monitoring, and the Science Council can examine that information to get a general idea of what usually gets spent.

Marc suggested that it also is possible to offer something attractive enough that people will want it, and they will work to figure out the funds when they see it. Write a plan that is cognizant of existing funding, but that also explains the need for additional funding. Monitoring happens all the time, and budgets for it are tight. We don't want to feel financially constrained if it means not doing a complete and thorough job.

Alan agreed funding shouldn't constrain the plan, but the Council does need to be aware of how much we're working with, and what the reality is. Multiple agencies are engaged, so funding possibilities are different. Other opportunities could be available.

Marc suggested that recommendations for a revised threshold system could come with a well-reasoned list of needs as well as costs. This would likely result in a prioritized list of activities.

Jim noted there will be greater success in obtaining funding with specific ideas, i.e., of everything we looked at, these are the items that we think need to be done, how we strategically chose these items, and how this meets the needs in Tahoe specifically. From a state department point of view, we need specifics.

Marc: Let the decision makers make the decisions, we are providing them with choices.

a. Assessment process:

Dan mentioned the existing standards were written some time ago, and the science of setting standards has changed. The core of the assessment process is to evaluate the existing standards against a set of established criteria (i.e., the SMAART criteria). Dan envisions a small group going through all of the standards to determine if each standard is specific, is it measurable, is it attributable to the agency, etc. The results of this effort would generate a common assessment database, put context around the needs to improve the thresholds, and provide a common information base to speak from. The relative importance of different threshold standards would be based on stakeholder interests.
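As an illustration only (not TRPA's actual tooling, and with invented standard names and judgments), the assessment matrix could be as simple as one row per standard and one yes/no column per criterion, assuming the SMAART acronym covers specific, measurable, attributable, achievable, relevant, and time-bound:

    # Sketch of a SMAART assessment matrix: one row per threshold standard,
    # one boolean column per criterion. Standard names and scores are invented.
    smaart_criteria = ["specific", "measurable", "attributable",
                       "achievable", "relevant", "time_bound"]

    assessment = {
        "Hypothetical standard A": dict(specific=True, measurable=True,
            attributable=False, achievable=True, relevant=True, time_bound=False),
        "Hypothetical standard B": dict(specific=False, measurable=True,
            attributable=True, achievable=True, relevant=False, time_bound=False),
    }

    # Standards failing any criterion become candidates for revision or removal.
    for standard, scores in assessment.items():
        failed = [c for c in smaart_criteria if not scores[c]]
        if failed:
            print(f"{standard}: revisit ({', '.join(failed)})")

Tracking each criterion as its own yes/no column also accommodates Marc's later point (see section 4e) about retaining a time component in the assessment.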

One option is to also have a place to show what level of effort has been applied to inform a threshold standard over the years, to provide additional perspective. Todd thinks this is important for entities to understand how much money has already gone in; knowing whether a lot of money has been put into something irrelevant is useful.

Alan noted that the Council's initial objective is to help TRPA capture an assessment methodology that is robust.

b. Timeline associated with the assessment process:

Dan said TRPA released the final draft threshold evaluation report at the end of September and opened it for public review. The final report will be officially released at the December 14th governing board meeting. Everything in the report has been reviewed, and all review comments have been addressed, except those on the assessment methodology.

The current plan is to bring a revised assessment methodology to the February 23, 2017 governing board meeting. The assessment would begin soon after governing board approval. It is expected that the assessment methodology would go to the TRPA Advisory Planning Commission (APC) in early February, so the Council's work to review/revise the assessment methodology will need to be completed by the end of January.

Alan suggested the following steps related to the Council's work to review/revise the assessment methodology in collaboration with TRPA staff:

• Compile Council’s comments from today’s discussion in the form of a memo from the Council to Dan.

• The comment memo also should reference, as appropriate, the independent peer review comments.

• Council members would be available to consult with Dan as he works through the comments to revise the assessment methodology write-up.

• Dan will provide the Council with a revised write-up prior to the next Council meeting (Jan 19th).

The Council would discuss the revised chapter during its meeting. The Council would submit a second memo based on its meeting discussions, which would go to the governing board along with the assessment methodology.

c. Discussion of the TRPA assessment methodology:

How TRPA develops the assessment information is what the Council is asked to help determine.

Geoff says that the whole threshold system review needs to be predicated on conceptual models (CM). Unless you have a conceptual model, how can you communicate how the parts of the system interact? How can you identify at which point in the process you want to have a threshold standard? How do you come up with conclusions about how the system operates? How can you tell it provides value?

Comment: Insert a question as part of the assessment: is this standard supported by a results chain or CM? According to Dan, the Science Council could easily say all standards should be supported by some sort of CM.

Sudeep is trying to think of a workflow to get this information. A call for building CM doesn't mean that we need to build them ourselves. This is all process-based information that can be considered as guidance provided by this Council.

Patrick noted that there is a desire to reduce the 170+ standards to a manageable level, and to craft the standards so that they drive agency investments. Currently, agency investments are not driven by threshold standards; the Environmental Improvement Program (EIP) and associated performance levels carry much more weight in driving funding decisions. In pointing out areas where CM would be necessary, we also need a way to use the thresholds as guides to investment, e.g., a ten-year road map that includes estimates of what it would take (financial resources) to realize a change in environmental condition relative to a threshold standard. This is the kind of thing that could affect funding.

Marc said that someone will have to define what people's tasks are. Someone will need to find an expert for each of those standards/thresholds who will be tasked with coming up with a CM. Find where each threshold standard fits; those that don't fit would be candidates for removal. Comment: Completing the SMAART assessment would occur first, and evaluation against CM would occur second.

Sudeep said to set a realistic timeline and realize that just bringing people up to speed will take time; however, bringing people up to speed is very important. He prefers not to have a point person but to have more common meetings.

Geoff thinks the thresholds need to be relevant to every other issue in the basin.

Alan proposed forming a subcommittee of the Council for development of the assessment methodology comment memo. Keep longer-term goals in mind that can be built into the assessment. This needs to be a collective effort, with other people who have more time helping to move this memo forward. Help TRPA come up with a methodology for thresholds that will be relevant to other stakeholders in the basin. We want to create something that serves the current collective purpose, but could be modified over the years. While the Science Council continues to comment, give Dan some prioritization for how to start working on it and where to focus his time initially.

Comment: Zach noted that his review of the draft assessment methodology found it generally described what would be done, but there was little information on how the assessment would be done or who would do what tasks. Who is going to work through the SMAART process and populate the results matrix? Who is going to process this data into usable information to make recommendations for next steps? This needs to be written up so that agencies, stakeholders, and the general public can understand the process. Clearly communicating the overall approach will encourage greater support.

Dan thinks expanding the assessment methodology description would be helpful. How do we do it? Is it one person, is it a subcommittee, is it consensus, or is the work divvied up by expertise or background? When it goes to the board, it can't just be an assessment; it has to be a full proposal. Perhaps we also should expand on how the results will be used. Should it be a public process where everyone can weigh in on the threshold assessment? TRPA has had thoughts about who will do the assessment; if it were an internal assessment, it would not be seen as credible. Dan envisions different people in the room, interested in guidance and working with the subcommittee; whose opinions are needed may not be the same people depending on the standards.

One suggestion is to divide the standards and find the appropriate knowledgeable people to deal with the various groups. Produce diagrams of how things fit together, with background information, and describe the knowledge base that supports the model. Organizing them this way will enable us to see the interconnectedness (Marc). Things can easily relate back to scenic, forest, air quality, etc.

Jennifer thinks sustainable recreation touches everything; this web of connections will be a tremendous resource.

Comment: Geoff – If you do decide to go the route of surveying different parts of the community, then don't lump all the results together; keep the science community responses separate from public responses. A lot could be learned by examining the differences and commonalities. What agencies want may be different from what science and the public want. Keep results separate, so you can compare and contrast.

Comment: Marc – Look at a broad slice of different stakeholders. Use public outreach to additionally populate the matrix with a list of technical expertise and policies. If the three sets of answers show broad consensus, that's good. Alternatively, it also is worth noting if the three sets of answers show widely varied preferences. Sudeep said this is a good reason to integrate a social scientist into the Council.

Differing opinions could result from a survey, e.g., responses to the question 'Is threshold standard X specific?' What is important is that TRPA engage the broader community and utilize the results in the assessment. How this engagement occurs should be a documented part of the process, in order to build confidence and support.

Comment: Dan would like guidance on when that outreach should occur. Is that part of the assessment? Who decides what is relevant? Do we put people in a room and ask them to allocate their 100 coins to choose which standards are relevant?

d. Thoughts on independent peer reviews:

Alan noted that there are some common themes that can be pulled out of the peer reviews. It would be great if we could capture the most important comments for Dan to deal with initially.

Comment: Marc - Reviewer 3 gave the best ideas; we should just go forth with what he proposes.

Comment: Alan - Work on creating preliminary conceptual models, broader community interaction, and integrating some of the suggestions.

Comment: Unknown – What is the purpose of the thresholds to begin with? To protect the parts of the basin that people like. Do you then need to ask people what they care about, in the form of a survey?

Dan – Take note of the relevance to policy makers and the public. Not sure if there is one answer for relevance. Are the standards relevant to policy makers, the public, stakeholders, or the CM? The audience for this question could have a big effect on the answer.

Comment: Unknown – It was suggested that TRPA choose one set of thresholds or standards and try the assessment, rather than doing it all the first time through; issues and ambiguities will come out. Take that approach to start in on the actual assessment. Consider the SMARTER framework and add that other R: Achievable, Realistic, Relevant, Measurable, Specific, etc.

Patrick – Funding priorities include biodiversity and resiliency; how are the thresholds relevant if the effort is supposed to be so interdisciplinary? How do the thresholds evolve in parallel with that? Examine how the standards and the thresholds are moving forward. The assessment can't be too narrow, or it won't be as effective as it can be. Look at the standards as they exist today.

Comment: Sudeep – Need to think about the important funding mechanisms 30 years from now; how do we revisit that? Has TRPA thought of that? One goal… is it forest management or clarity? Biodiversity, carbon management, forest fuels: what are the priorities?

e. Post-lunch recap

Alison will compile the meeting notes and distribute them. All Council members should take some time to dive a little deeper into the peer reviews and notes. Everyone can email any further comments to the group. Sending comments back to Dan by early to mid-December is optimistic. We don't want to spend too much time capturing our vision.

Issue a draft in January, and have the assessment proposal ready to go to the APC and governing board in February.

Hard date for the 1st draft memo soliciting feedback: Friday, December 16. Dan ideally wants bullet points on what is going to inform the memo.

Comment: Unknown – Need to include information on who is doing the assessment, and talk more about how assessment results will be used once the assessment is completed. For substantive changes, any heads-up would be appreciated.

Take some time to compile comments and suggestions, and reference those suggestions in the final memo. Expect the final memo to include a sentence like “We have reviewed this chapter, and it is likely to meet the needs of the…”

Things Dan can start working on immediately: the brainstorm ideas that came out of this meeting.

He doesn't have to deal with the matrix yet, but should think more about describing the process, and how to incorporate conceptual models into it.

There was quite a bit of back-and-forth regarding CM. Should they be part of the initial assessment, or developed after the assessment for use in the larger threshold update? There are pros and cons to either approach. Dan noted that CM were developed as part of the work to define the original threshold standards. No doubt these CM would need to be updated, but they may provide a starting point. Concern was expressed about the timing of work on the CM, as they can take substantial time to develop.

Comment: Unknown – As mentioned earlier, part of the initial assessment could include a look at the CM associated with each standard to see if that model works, is in need of revision, or if there is something better for the assessment. You could add another column to the assessment matrix as to whether or not the current CM is still viable. Is it working? If yes, then things are good. If no, then it is addressed later.

Comment: Unknown – The Council's comments should help prioritize comments from the peer reviews. Make sure that the main comments in these reviews are incorporated in the process represented in this chapter.

Comment: Unknown – The Council should be able to say whether an assessment of this nature is the appropriate first step, yes or no. If no, then start with something else, and say what that something else looks like.

Comment: Dan – would like the Council’s comments to recommend where in the process CM development should occur and how they should be used.

Comment: Unknown – Think in terms of a three-step process: (1) the SMAART assessment, with the matrix as its outcome; (2) develop/revise CM and use them to show the points in the process where the thresholds are focused; (3) use the outcome of the assessment and the conceptual models to investigate a measurement system.

Link outcomes to management policies, e.g., air quality affects invasive species, which affects recreation, etc., and consider how that model will change. Can we see how what's happening here affects the forest?

The memo is a trivial part of the process, but it needs to be right to guide the steps forward.

A lot of this is partitioned based on what the compact says. Is there going to be an RPA on top of the state legislature that is tracking the thresholds and reporting on them?

Jim thinks the tracking method needs to be cleaned up. Make this a viable system and make it applicable to larger systems. Make it consistent, but it won't be the driver for priorities. The immediate thing that needs to be addressed is how we take that and make it something more valuable.

Immediate results from the assessment will give strong indications. Prioritizing based on the assessment could be helpful.

Look at policies and look at goals. Is this redundant with another part of the system? Where do issues live in the system, and are there solutions? Is the fix in the standard itself, or is it something completely different?

ACTION: Geoff proposes drafting a memo to send to Dan at TRPA before January 13th. This can be completed with a little back and forth with Dan. Who does this? Alan, Pat, Marc, and Sudeep volunteered to serve on a subcommittee. Alan will serve as subcommittee lead and point of contact. Zach is also available to help.

ACTION: Dan will send a link to the original threshold documentation, which includes conceptual models. Zach will make sure all Council members receive this link.

Comment: Steve suggests allowing everyone to comment on the draft memo a week or a couple of days before it is due.

Comment: Marc noted that the assessment methodology proposes to exclude evaluation of a time component associated with a threshold standard. He would like time to be retained as part of the assessment so that information is retained. Timing for when a threshold should be met is important. Over what period of time do we want to achieve this threshold? If there's no time frame, do we not have to worry about it? It's a deficiency that should be included, e.g., this is our goal for 20 years, or we risk non-attainment. Should this time element live in the standard, or is it expressed as targets for restoration?

Is there value in writing timing into the standard? It helps with expectations, though not every standard may need timing tied in. In terms of the assessment, whether a time component is present or not could be a simple yes/no question tracked in a separate column in the matrix.

Comment: Zach suggests that the initial assessment should include a clear articulation of the goals and objectives relevant to each threshold area. The threshold standards aren't well connected to the regional plan, and the EIP largely drives where government money goes. One way to unite those is through common goals and objectives. The goal statements probably take the form of desired conditions or vision statements. Are these things that need to be updated? Defining goals and objectives should be an early step in the assessment process.

Sudeep noted that it would be good to review the goal statements, and have them in one place.

Dan and Julie noted that the goals were developed as a community vision, in the context of the regional plan, several years ago. Julie has not seen a lot of changes in the last 7 years.

ACTION: TRPA will pull together key documents in order to provide some context.

ACTION: The subcommittee will look at all materials, with the goal of having a draft memo by Dec 9th. The draft memo will be circulated to all Council members; the Science Council will provide feedback, and the memo will be finalized by Dec 13th.

ACTION: Alison and Zach will have a draft of the notes of Council comments to the subcommittee by Dec 1st.

5. Council member updates on Tahoe Basin science projects

Tahoe West project – Pat: This project is relevant to the Council. It is a collaborative effort among government landowners throughout much of the west side of the basin, from Emerald Bay to Tahoe City, and a landscape planning effort for restoration of the forest ecosystems. There is an executive team, a core team, and a science team that PSW is coordinating. On the science side, it's a collective that has taken shape as opportunities have come along; it helps that regional institutions work together. The project supports a variety of interactive modules, e.g., climate change (Geoff, UCD). DRI, UNR, Portland State, and PSW are contributing to an interactive modeling effort to estimate how forest management treatments affect wildlife, water quality, and air quality. The science-management meetings have shown interest in coming up with a revised rendition of management issues identified a long time ago. What was the original intent here? What does watershed restoration mean? All sciences come into play. There has been lots of discussion about meadow restoration, the hydrology of streams, riparian restoration, and how these affect water quality. Funding depends on the growth aspects of the project: what could be developed as a result of this? SNPLMA funding, building capacity, and how to apply this approach across the basin. There is the potential to connect with nearshore research, leading to more science collaboration at the land-water interface. The initial science effort is well supported, and could get a lot of traction if approached correctly. For more information, contact Pat.

Patrick – Monitoring: how about working to get a coordinated monitoring plan? One reason it hasn't happened is that the TRPA discussion has dominated the conversation. Do we wait for the thresholds and standards to coordinate monitoring? This process for TRPA can really provide a framework for driving the direction of resource management. Monitoring the basin and regional change is an opportunity to really help drive things happening in the basin. Make what's happening now relevant; find the relationship between the thresholds and the EIP with monitoring. Make the outcomes relevant and something that we ourselves would use in the future.

Extreme climate project – Geoff: This project is underway. The goal is to use climate models for the Tahoe area to develop estimates of the extreme effects (droughts and floods) of climate change, rather than the average effects. Estimates of the extreme effects should be much more meaningful to land managers.

SNPLMA funding – Alan: An email about SNPLMA funding was sent out to Council members. In summary, there is about $8 million in SNPLMA funding available for expenditure in the Tahoe basin. These funds are the residual from previously approved projects that were not completed, or did not expend all of the allocated funding. There is a 'secondary list' of projects, which can be considered for funding with the remaining funds. Most of the projects on this list are capital projects originally proposed by the USFS. However, there is a science category on the list, although there are no specific project proposals. They don't want a big science proposal. During the regular SNPLMA rounds, up to 10% of the funding was allocated to science, so the maximum available would be $800,000. SNPLMA doesn't like to fund indeterminate things, e.g., monitoring; something more focused with a discrete product/outcome is preferred. For science, this means an applied research project, or development of new tools such as conceptual models. The timeline for developing a proposal is currently unknown. Alan will attend the next Lake Tahoe Federal Advisory Committee meeting, and bring back a report to the Science Council on the best way to proceed.

Jennifer – There would be value in communicating and tracking the science that is going on in the Tahoe Basin. Who's doing what? How do we share information? A clearinghouse of some sort of what's going on and who's done what? How does the Science Council attack the idea of what's happening and what we are learning? Zach mentioned that science updates ideally could be posted on a website, but currently there is no expectation that the Council will serve as a clearinghouse for Tahoe basin science. The Council is not in a position to oversee or coordinate all of the science going on in the basin.

Geoff adjourned the meeting at 2:15 p.m.