Program for Women and Girls









September 29-30, 1997
National Science Foundation
Arlington, Virginia




Table of Contents


About the Program

Maintaining and Expanding Friendly Collaboratives between Diverse Organizations and Institutions

Reports from the First Break-Out Session

Effective Mentoring: Tools and Practices

The Role of Evaluation: A Systems Endeavor

Reports from the Second Break-Out Session

The Law, Privacy, Ownership, and Mentoring in Cyberspace

Equity Sub-Panel: Department of Education

Program for Women and Girls: Documenting Where We've Been

GPRA Implications and Impact on PWG: Closing Comments

Addendum: Visiting Professorships for Women


Assistant Director
Dr. Luther S. Williams


Division Director, Human Resource Development Division
Dr. Roosevelt Calbert

Program Directors, Program for Women and Girls
Dr. Margrete S. Klein
Dr. Ruta Sevo
Former Program Director
Dr. Dawn Pickard

The Program for Women and Girls can be contacted via:
Telephone number: (703) 306-1637
Fax number: (703) 306-0423
e-mail address:

Program for
Women and Girls


1997 Annual Awardee Meeting

September 29-30, 1997

National Science Foundation

Arlington, Virginia

The content of this booklet was abstracted from the original tapes and transcription of the event. Every attempt has been made to preserve the original context of the speakers.

The views, opinions, and recommendations expressed at the awardee meeting are those of the participants and do not necessarily represent the official views, opinions, or policies of the Foundation.

Cover photographs show students of Mary Kaye Foshee, John V. Bilotta, and the Physics Department and the MRSEC Summer Program for Seventh and Eighth Grade Girls at the University of Maryland.



Doris Ash
Chabot Observatory & Science Center

Krishna Athreya
Iowa State University

Priscilla Auchincloss
University of Rochester

Suzanne S. Austin
Miami-Dade Community College

Katherine Beckingham
Rice University

Judy Brown
Miami Museum of Science

Carolyn Carter
Appalachia Educational Laboratory

Glenn Cassis
United Connecticut for Women in SEM

Susan Cavin
POWER (Positive Opportunities for
Women Engineers' Retention)

Meera Chandrasekhar
University of Missouri

Arlene Chasek
The Center for Family Involvement
in Schools

Liesl Chatman
University of California,
San Francisco (UCSF)

Carmen Cid
United Connecticut for Women in SEM

Pamela Clute
ATHENA Mathematics
and Science Project

Maureen Coelho
United Connecticut for Women in SEM

Christina Corkran
Baylor College of Medicine

Nanda Duhe' Kirkpatrick
Rice University

Eileen Engel
Lawrence Berkeley Laboratory

Mary Ann Evans
Iowa State University

Melanie Flat
Girls Incorporated of Rapid City,
South Dakota

Merle Froschl
Educational Equity Concepts, Inc.

Suzanne Gage Brainard
University of Washington

Larry Genalo
Iowa State University

Larry Gene-Shaio Toy
Chabot College

Jean Girves
WISE Initiative Programs

B. Marshall-Goodell
University of Iowa

Penny L. Hammrich
Temple University

Margaret Hauben
Management Systems Association

Etta Heber
Chabot Observatory & Science Center

Nancy Henkin
Temple University

Yvonne Hogan
Rice University

Debra Humphreys
Association of American Colleges and Universities

Jackie M. Hundt
Wake Forest University

Wendie James
Miami Museum of Science

Janet M. Johnson
Cranbrook Institute of Science

Laura Koch
Girls with Disabilities On Line

Patricia Kusimo
Appalachia Educational Laboratory

Pamela Lawhead
GIST: Girls in Science and Technology

Rebecca Litherlandt
Columbia Public Schools

Maralee Mayberry
PROMISE (Projects for Multicultural and
Interdisciplinary Study and Education)

Judith Mazique
Texas Southern University

Robin A. McCord
Chandler-Gilbert Community College

James McLean
Elizabeth City State University

Caryn McTighe Musil
Association of American Colleges
and Universities

Susan Metzler
Mid-Continent Girl Scout Council

Ioannis N. Miaoulis
Tufts University

Mamie Moy
University of Houston

Lynne Orr
University of Rochester

Adele Pittendrigh
Montana State University-Bozeman

Anne Papakonstantinou
Rice University

Ann Pollina
United Connecticut for Women in SEM

Louise Predovic
Oakland Unified School District

B. Ramakrishna
Arizona State University

Margaret N. Rees
PROMISE (Projects for Multicultural and
Interdisciplinary Study and Education)

Mary Richter

Robert Rosenbaum
United Connecticut for Women in SEM

Mary Anderson-Rowland
Arizona State University

Frederick B. Rudolph
Rice University

Marilynn Sikes
Discovery Place Inc.

Amy Siskind
POWER (Positive Opportunities for Women Engineers' Retention)

Gayle R. Slaughter
Baylor College of Medicine

Mark Sonata

Barbara Sprung
Educational Equity Concepts, Inc.

Susan Staffin Metz
Achieving Success in Academia Conference

Mj Terry
United Connecticut for Women in SEM

Carolyn Thorsen
Georgia Institute of Technology

Dyanne M. Tracy
Cranbrook Institute of Science

Gail Whitney
Oregon Graduate Institute of Science and Technology

Peter Y. Wong
Tufts University

Karen Wynn
WISER Research Program

M. Zardetto-Smith
Creighton University

About the Program

A basic understanding of science and mathematics is essential to maintain a population prepared to meet the need for a technically competent work force. The low number of women and girls being trained to meet this need is of concern to the National Science Foundation (NSF).

NSF initiated the Program for Women and Girls (PWG) in 1993 to enhance the participation of women and girls in the fields of science, engineering, and mathematics. PWG projects may take place in school districts, colleges or universities, state and local governments, museums, and community organizations. Awards can be as high as $300,000 per year for a three-year project, depending on scope.

At the time of the conference, there were two kinds of PWG projects: Information Dissemination Activities and Implementation and Development Projects. Information Dissemination Activities provide a mechanism for interacting and sharing strategies and information related to the participation of women and girls in science, mathematics, engineering, and technology (SMET). They are national in scope, and include conferences, workshops, videotapes, and electronic communication. Implementation and Development Projects encourage the development of innovative activities to increase the recruitment and retention of women and girls. These programs implement existing models or research new ones, and focus on critical barriers or aids to women's participation in SMET.

The PWG is under the auspices of the Division of Human Resource Development, Directorate for Education and Human Resources (EHR). PWG projects are designed to complement other EHR activities, which address educational issues from grade school through graduate school, as well as professional women's issues.

These proceedings discussing PWG activities are not simply about gender; they also touch on ethnicity, accessibility issues, mentoring, and technology. Although the common thread is gender, it is critical for projects to reflect the understanding that all girls and women are not the same. Developmental factors as well as individual differences should be considered by all investigators.

We must determine what steps educators and employers can take to better include women and girls in their schools and organizations. Significant changes are necessary in the way science and mathematics are presented to women and girls both in education and the professions.

The Program for Women and Girls is making great strides in identifying the methodologies for improving the future of women and girls in SMET fields.

Maintaining and Expanding Friendly Collaboratives
Between Diverse Organizations and Institutions

INGEAR: Integrating Gender Equity and Reform

Carolyn Thorsen

The overall goal of our program is to change teacher education programs by integrating gender equity throughout the course work. Therefore, everything we have done is leading toward developing this framework and then integrating it throughout the whole teacher preparation programs that education colleges offer.

Our NSF-supported Georgia Statewide Systemic Initiative involved five universities that are all very different. After that initiative had been underway for over a year, we saw that, although there was a strong focus on diversity, nothing was really focusing on girls.

We found that if you really want to change the situation for girls as they go through their K-12 years, you have to change teacher education programs. The teachers going into the classrooms must integrate the classroom strategies that support and encourage girls all the way through to college level.

Each university in our program focuses on a particular strand.

During this process we learned a lot about collaboration. Initially, we set up quarterly management team meetings the first year after being funded. We found during that first year that we did not have time to talk through the problems in a half-day, quarterly meeting, so we instituted monthly meetings and have met monthly ever since. I think that meeting frequently is absolutely critical. We had seven co-PIs on this project: one from each of our institutions; the PI on the state-wide systemic initiative; and the state president of the American Association of University Women (AAUW). This gave us a large management team.

If you and your collaborators talk things through monthly, issues can be addressed before they become problems and collaborators can keep each other informed about how to address them. It is much easier to reach consensus talking face-to-face than it is by telephone, or e-mail, or long distance.

Collaboration is critical to the project. Each institution had responsibility for developing one strand, helping all the campuses with it, and conducting it on its own campus, which meant we had to work together on all the strands.

I mentioned that we have very diverse institutions involved. Georgia Tech is primarily an engineering school. The University of Georgia is the state university, a land grant university and our flagship university. Georgia Southern, located in the southern part of the state, has a very rural setting. Clark-Atlanta is a historically black university, and Georgia State is an urban university with a primarily commuter population. If we could come up with a curriculum framework for our education units, with these different kinds of settings, it would work.

Another important characteristic is that colleges and universities all have institutional agendas. We all work within the environment of our college or institution. You tend to think: How will my college react to this? Or: My school would never buy into that. It is hard to ever achieve consensus when you are coming from that frame of mind. You can begin to move a project forward when you can set aside your own personal or institutional agenda. We have worked the best when we have stayed focused on our goals. It seems that when we have had things that slowed us down, it has been when some of us have said: "Well, I know I could not do this on my campus." Somehow you have to set that aside, and figure out how you can make it work on your campus. I think our institutions have moved to that point, and that is why things are working well for us.

Another challenge is personnel change: many of the people involved now were not there last year. It may take a new person six months to get up to speed. We still have some challenges in this area, because we have just had more personnel changes. We are working to resolve that issue.

There is nothing easy about a collaboration. Working out any problems that come up requires all players' commitment to the project. Without commitment, it is very easy to throw up all kinds of roadblocks. If there are conflicts, you have to keep talking them through until you find a point of resolution. It is important for the people involved to want to collaborate, not to be on center stage doing their own thing.

Another strength of our project is that we have been equal collaborators with the universities. It is very important to have regular meetings with everyone involved in attendance. If your collaborators are not at the table to talk, you cannot resolve issues. But, if your people are committed to the goal, come to the table, share and move forward based on the project's goals, you have the best chance for a successful collaboration. The effort and difficulties are well worth it.

The Wise Initiative

Jean Girves

Committee on Institutional Cooperation

The Committee on Institutional Cooperation (CIC) was formed in 1958. It includes the Big Ten universities and the University of Chicago. Anything that is non-athletic is open for collaboration among the CIC institutions. Our Women in Science and Engineering (WISE) initiative started just last year. The major point in collaborations is communication. If you can meet together on a regular basis, that's terrific. With the WISE initiative, the representatives from all of our campuses meet twice a year.

The WISE initiative focuses on undergraduate and graduate women. We are looking at travel grants, getting women involved in professional societies, cyber-mentoring, and Web pages. We are just about ready to have our first student leadership conference that will include 12 students from each campus. They are responsible for taking what they learn at this conference back to their own institutions and running a seminar program.

We will have a series of best-practices workshops. Purdue held the first workshop last spring. We focused on the classroom climate and brought together faculty from all the campuses to talk about different programs that they have in place that we could adapt and/or adopt across all the campuses.

One approach that we have taken within the CIC is using peer pressure to get people to do things. If you cite one university's successful program, such as Michigan's Women in Science, other universities will establish similar programs so as not to be left behind. As a result of peer pressure, most of the institutions now have a program in place to support women in science and engineering. Providing information on the number of students enrolled in different science programs and the number of students who graduate constitutes another approach. We will start doing that this spring. When the representatives from the institutions look at that information, by field, they can start asking questions. You need to constantly put pressure on different groups. Peer pressure is, for me, one of the biggest motivators.

We also have an academic leadership program for faculty, targeting women and under-represented minorities. We go through a list and identify the people in the sciences and engineering who have participated since 1988. Some of these people might be speakers for our leadership program at the best-practices workshops. We take other programs and use them to reinforce what we are doing. A key element is having an infrastructure in place. We have our provost and governing board. We set up committees if we need them. We use over 100 groups that already exist. When I talk about women in science to the liberal arts and science deans, suddenly they become interested.

We want to have the faculty involved in the design and implementation of the program as another aspect of the infrastructure. However, we know that faculty are extremely busy, and thus we put together an infrastructure that uses the faculty only as needed. With the faculty, we try to focus on curriculum, content, and research projects rather than the administrative side, in order to utilize people's skills and expertise effectively.

Tied to faculty involvement is the fact that any project we put together has to be win-win-win. What is in it for the faculty? What is in it for the institution? What is in it for the provost? If you can articulate those issues, you can bring them onboard.

For us, it is a constant balancing act. How do you involve people? How do you not take advantage of them? Talk to people about those issues as you are going through the process. But throughout, make sure everybody has a place at the table. Keep your faculty involved in policy making and implementation, not administration.

Lastly, there has to be trust among the participants. Our institutions are all competing with one another on everything. However, we make the argument that if we work together we can do much more than we can accomplish by working individually. We have to keep reminding ourselves and our participants that through the sharing of experiences, in the long term, we all will benefit.

Reports from the
First Break-Out Session

Institutional and Systemic Change in Recruiting
and Retaining Diverse Populations.

The conversation turned to how one could institutionalize these programs. The advice was to get powerful men involved so that these programs could move from the margin to the center: have them on advisory boards, get provosts and senior male scientists involved.

We talked about some of the problems women face on campus. Then we discussed power and change coming from the outside: What could NSF do to promote equity on campuses? We talked about asking in grant proposals for specific disaggregated data about the gender and race breakdown among faculty, administrators, and students on campus. This kind of human resource data could be a warning to the campuses about what they need to look at.

Another suggestion was to invite program officers, especially from the research programs, to come to meetings like this. A lot of discrimination on campus takes place within the research groups and the research laboratories.

Basically, we wanted to ask NSF to demand demonstrable commitments to equity in the whole granting process. Unfortunately, we didn't have time to discuss a very germane question: Are the issues really about gender, or about a larger cultural question of how science has acculturated both female and male practitioners?

Mathematics Curriculum and Professional Development

The issue was primarily one of how to get female students and teachers interested in mathematics. Some of the people in the group worked with younger students. They developed curricula for young mathematics students in the 3rd to 5th grades, for instance. Some had extracurricular programs. Others taught college, in-service, and pre-service teachers. One of the issues we discussed was the integration of various aspects of a mathematics curriculum. For instance, how do you integrate method and content? Is there anything different that one has to do in order to keep female students interested?

There was some discussion regarding hands-on mathematics activities and how to get things into the curriculum in different places. We came to the integration of mathematics with other disciplines, which seemed a natural way to introduce mathematics. We discussed various disciplines, such as art, language, and science. A comment made by a person on the panel was that it was actually more useful and worthwhile to start out with the strength of the teacher in the other discipline. For example, begin with the art teacher, work with his or her strengths, and then have the mathematician make the connections. It was easier to start with the teacher's strength (i.e., art) than to begin with mathematics and try to explain how to put mathematics and art together.

We discussed teacher preparation, a sore topic with a lot of us. We also discussed how to attract teachers to come into the program. It is usually the risk takers who sign up for a program. One wants to reach more of the teachers than those who come back all the time. We want to reach the teachers who appear not that interested to begin with.

Various techniques were discussed for attracting the less-than-interested:

People have had success using one of these strategies or a mix of several. Getting mathematics involved early, and into various aspects of the curriculum, seemed the most obvious way to go about it.

The Big Ten

Finding advocates is an area that deserves serious consideration. There are some wonderful things happening in the State of Washington on all levels. You definitely would want to be in a position where you get support from the top, not only in policy, but also in funding. You definitely need support in fact, not rhetoric. In the State of Washington, a young female dean is directing, requiring, and mandating that they look at all variables in selecting students and certain faculty. It is a foregone conclusion that her mandates will happen. Of course, we are not always that fortunate to have that type of support, but we want to make that happen wherever we can within the structure.

In the areas of advocacy and mentoring, mentoring should be at all levels from high on down, horizontally, and vertically. The graduate students have to support the undergraduates. The undergraduate students have to support the high school students. The high school students have to support the middle school students, and things of that nature. Certainly, if we are going to look for diversity in these programs, we have to exhibit diversity in our staffs, our faculties, in those running the various projects, and certainly, in selection committees. This diversity needs to be highly visible.

In the area of support, the library staff discussed doing videotapes of individual interactions and assisting the faculty members in analyzing them. It was very important to have some kind of follow-up. It is poor methodology to confront an individual who may have a problem and not assist him or her in remedying it.

We discussed the issue of pressure. If, in your environment, students and/or faculty are under-represented, you might try calling in industry. Industry has the funds with which to exert pressure. The threat to withhold certain funding from your institution until a problem is satisfactorily resolved can be very constructive pressure. On the positive side, industry can also provide assistance with seminars and advisers for your programs.

Lastly, you must have unique approaches to problems, but they must work. We learned that an interactive theater was a very easy way for the teacher to expose a graduate assistant's problem. However, when the teacher was subjected to that same theater, there was a problem. So, certainly, it is advocacy, support, well-placed pressure, and just looking at things a bit differently.

Informal Community-Based Programs Group

Each member of this group had an exciting program, and their efforts were really making a difference. They included programs with:

We would like some sort of listserv (e-mail system) so that we could continue to have this kind of communication and interaction easily. I do not think cost would be a factor and we would only need a few hours of someone's time to get the information together. We would like to have it soon while we still have this level of excitement and enthusiasm about the different projects.

Crosscutting Across Interdisciplinary

We introduced a very diverse group of educators who came from across the country, from the elementary school level to the graduate school level, and found that the phrase "crosscutting across interdisciplinary" had very different meanings for everyone. This phrase also transcended different cultures. Briefly stated, crosscutting across interdisciplinary means reaching teachers who teach subjects other than science or mathematics. Generally, teachers in science and mathematics are aware of gender equity issues, whereas teachers in other disciplines are not.

Working with Connecticut United for Women, we have tried to reach the guidance counselors because of their involvement in the decision-making process. They need to be aware of the lack of understanding of gender equity within their own faculty because they are guiding the students. It is very difficult, though, for guidance counselors to see 100 to 300 students personally. Thus a solution to this inequity needs to become a priority at the district level. Talk to the people, the key leaders, who make decisions, because if you do not do that, it does not become incorporated into your project mission.

We use a model that integrates all the disciplines. Doing something like a science fair integrates many disciplines. We do that at Yale University. Through our partnership with Yale and United Connecticut, we work with the science fair because, in that model, we address all the disciplines. You have English for writing. You have science content. You have art in the design of the projects. Finding creative ways of integrating all the disciplines is something that we are trying to do now.

We have a problem with a lack of role models in engineering, the physical sciences, etc. It is important to know that it is okay to use men as mentors, identifying supportive males who will nurture a girl. They are not necessarily role models, but you cannot leave the loop empty. You need to have somebody there for them. It may not be a woman, but you work with what you have, and there are nurturing men available. Getting a woman in engineering or in physical science is extremely difficult. She is very valuable to the field, and other commitments tend to keep her overextended. Getting her to commit to your mentoring program is a challenge.

We wanted to know if anybody had found a way to attract a critical mass of women. It is important to stress the value of service in a faculty. If you are not evaluated on service, then you are not going to think service is important.

We noticed that, when we are training teachers, we have to work with the teacher training institutions and the state certification departments to make them realize that certification is very important.


Four things that have come up over and over as themes:

Effective Mentoring
Tools and Practices

The following section is a panel discussion concerning effective mentoring tools and practices. The panel was composed of Liesl Chatman, Penny Hammrich, Hollis McLean, Susanne Brainard, Jo Saunders, and Gayle Slaughter.

Research-Mentoring for Young Women Scientists and Engineers

Gayle R. Slaughter

Baylor College of Medicine

The population that I will be referring to is college undergraduates who have been part of summer research programs. I would like to focus on a very specific activity we did this summer with funding from the National Science Foundation. It uses a mentor-based approach to improve scores on standardized tests, which are often a barrier to progress for many women throughout their careers. The test in this case was the Graduate Record Exam (GRE), because we were hoping to help young women access master's-level and Ph.D.-oriented education. First, though, I want to briefly mention the type of summer research program in which the students participated.

I know many of you are working with very young people. I hope you can communicate to them the excitement of the wonderful research programs available to college students and, in some cases, high school students: programs where they actually get to go in and really do science. They become a real part of the scientific community. Since 1989 we have had a summer undergraduate program with approximately 110 to 130 students every single summer. In recent years, approximately 90 to 100 of those have been college undergraduates, with the others coming from a high school program that has been a spin-off. These young people come from all kinds of colleges and campuses. Approximately 50 percent of them are women. This year, 54 percent of the participants in the college program and two-thirds of those in the high school program were young women. This is a very comprehensive program where, at the hub, actual research is taking place. These projects for young women range from engineering projects, to computer science and virology projects, to projects that involve studying gene expression. We have young women involved in, and making progress at, all the frontier levels of science today.

About 90 percent of the participants get real scientific results over the course of the summer. This enables them to have something they can show to the scientific community as evidence of their ability; evidence that they belong to, and function in, a laboratory, in a research setting. We also give them exposure to women role models. We have a seminar series every day. We use women scientists and engineers whenever possible, so these girls get a chance to meet women who have very successful careers.

We show participants the range of careers through activities such as a graduate student night. We do not avoid practical issues. We talk about science funding and policy and show them that the National Institutes of Health's budget is increasing. If the National Science Foundation could have the same kind of success we have had, we would see more money for science. We need to work together to increase funding for science in this country.

I want to focus on the mentoring part of this program. We asked the participants to identify the research experience that helped them find a mentor, someone with whom they connected. We found that 68 percent of the young women stated that the person they were assigned to during the program truly became a mentor to them. Another 14 percent found a mentor within the program, not necessarily the person they were assigned to, but someone else that they worked with along the line. The numbers here show that, primarily, the individuals they turned to, and connected with, as mentors were faculty; and remember, these are undergraduates.

For our high school students, we found that graduate students and post-doctoral fellows at a range of career stages were often mentors. However, it was primarily faculty from whom they felt they derived the most benefit as mentors. They did feel these experiences were really valuable; 73 percent said, "very valuable." Most of the other students said, "moderately valuable." We looked at what activities were the most important to them. The most significant were scientific discussions: teaching them how to do science, talking to them about science. The excitement of that was, for these young women, the most important factor. Discussing career opportunities was very important. They also found sharing personal experiences and providing contacts important. But the key thing was writing them letters of recommendation. No matter where you go in life, you need to be recommended by people who understand the system. That was something they felt was extremely important. We found that they wanted the person they dealt with to be knowledgeable. They wanted that individual to be approachable. But they really didn't care if that person was kind. That was a little bit of a surprise to us. They wanted people to tell them the truth, to prod them on and move them ahead. They wanted somebody who knew the system and could relate what the next step up was.

And we asked them: "Does it matter if your mentor is male or female?" We got an interesting result. We found that the women who had had male mentors said: It did not matter; this person was wonderful for me, helped me develop; it was a wonderful experience. On the other hand, three-fourths of the women who had women mentors said it mattered that it was a woman. So we got a bit of difference. We are going to pursue this difference.

The Graduate Record Exam prep workshop was a personalized mentor approach where participants met once a week with a project intern. That took one hour a week per individual. We had a lot of hand-out materials, and used diagnostic tests and practice tests. This approach for the pilot study we did was quite successful. We found 7 of the 10 women in this program increased their analytical score to above the 90th percentile. Any graduate school in the country will look seriously at your graduate application if you are above the 90th percentile in the nation. Overall, our scores increased dramatically with a mean change of 371 points and a median of 420 points. These undergraduates, some considered second-tier students, were working a lot of hours and their grades were not quite as strong as those of other people. We found they were able to change their scores significantly when really focused with a mentorship-based approach to prepare for something as challenging as the GRE.

The Women's Triad Project

Liesl Chatman

University of California-San Francisco

The University of California-San Francisco has a partnership with the San Francisco Unified School District. In that partnership, we have a program called the Triad Project, which partners women graduate students and post-doctoral fellows with middle school science teachers. They look at two things: one, coeducational strategies that will create an equitable environment for girls in their regular science classes; and two, they sponsor a science club for girls after school.

How much attention do we need to pay to the professional and cultural differences and similarities between science and education? I think some of the common things may be fairly familiar to you. Both teachers and scientists have quite a bit of enthusiasm. They spend long hours at their practice. They base their practice on a body of research and knowledge, although educators sometimes don't get the credit for that body of knowledge that scientists do. The public somehow has a blind faith that education and science will solve all the problems of the world, yet mistrusts educators and scientists to do it.

Some of the more interesting ground that we found in setting up these mentoring relationships and learning from each other has had to do with the uncommon ground between them. One of the most prominent things that we have seen in discussions is that the scientists tend to be critical, and the teachers tend to be nurturers. This has shown itself very profoundly in our program evaluation.

When we did the evaluation with the teachers, the teachers thought the program was great. They were becoming aware of gender equity. They were learning a lot of science from the scientists. They were treated professionally and this was in the evaluator's report. The scientists were more critical, observing you could do this and that better; you could improve this and that. We gave that report to the participants to read, and the teachers were crushed! They thought the scientists did not like the program. These were people who had worked together for 30 hours a month over the course of one, two or three school years, and this was the first time they were talking about this.

One of the scientists said, "Scientists are trained that if it is 98 percent effective, pick apart the two percent that is not." The teachers said, "We are trained that if it is 45 percent effective, that is better than the 40 percent it was last year." Even though the teachers provided feedback, the scientists felt they had not received any because the teachers were giving positive feedback, and the scientists did not even register it.

Similarly, it is not appropriate for a scientist to rip a middle school student to shreds on their report by picking apart the two percent that was not effective. So, we find that you need to pay attention to this issue and help each other with those differences in style. I think the tendency is even more profound at the elementary level. Ninety-five percent of elementary school teachers are women who have been socialized not to give or receive critical feedback.

Scientists work with people interested in science. Teachers need to cultivate that interest in their students. Scientists may spend five, six or seven years on a molecule; whereas, a teacher is looking at friction, balls and ramps, butterflies, etc. Teachers are generalists. Scientists ask questions to converge on an answer; whereas, teachers are asking questions, hopefully, to get a lot of students thinking and responding to that question. So they have different questioning styles.

For scientists, if you can identify a problem, that implies a solution or a way to get at the solution. Teachers can identify a lot of problems that do not lead to a solution. You can identify a kid who has just seen a murder the night before in a gang war, but you cannot be expected to come up with a solution in your classroom. So there are some fundamental differences there, and that relates to starting with easier problems. Scientists are used to controlling variables. When it comes to program evaluation and thinking about how kids learn, they say: Control that, and do a pre- and a post-test. The teachers say: You may not know the impact of this program on the kid for 10-20 years. You don't know what will make the difference in the long run, and that can be a very difficult thing to talk about. Program evaluation and assessment of student learning are definitely different between the two cultures.

We found that both cultures speak a different language, but often, the scientific language is more easily identifiable. A scientist can talk about electrophoresis, and most people know that is a specialized use of that word. If a teacher talks about cooperative learning, you know what "cooperate" means and what "learning" means. So you think you know what "cooperative learning" means. But, in education, that may have a very specific meaning. Scientists may be just as intimidated by the term "curriculum matrix" as an elementary school teacher is by "electrophoresis." Building a common language is essential. Scientists may talk about "training" teachers, which can be offensive to the teachers. The scientists do not realize that the comment is offensive. The same thing happens with the term "model project." For scientists, a model is something that you can change. You can have several models. They are not good or bad. However, in education, "model" may mean exemplary. If a scientist talks about a model program, the teacher might ask: "Well, isn't my program a model program?" We have found that paying attention to these differences helps structure relationships so that participants can start understanding each other. When they can understand each other, they can start learning from each other.

Indirect Mentoring

Jo Saunders

Washington Research Institute in Seattle

The kind of mentoring that I have been doing I think of as indirect mentoring. Ultimately, the point is to reach kids, but I do not ever see the kids. I work with a series of mentors, who work with individual universities, who, in turn, work with pre-service students in teacher education. At the bottom are the kids.

I have learned several things that govern what I try to do with the people with whom I work. The really important thing is that I see the process like a relay race. I give the people that I work with whatever I can give them, but I can't reach the goal. They reach the goal. And that orients my thinking. My objective is to enable them to reach their goal, which is the kids.

In terms of my relationships with mentors, there are several things that I try to achieve.

Advocates for Women in Science, Engineering and Mathematics (AWSEM) Project

Hollis McLean

Saturday Academy at the Oregon Graduate Institute

I would like to tell you about a mentoring ladder that we put together. We have middle school and high school science and mathematics clubs that meet after school. College-age women who are studying science and mathematics run the clubs. These women, in turn, take the girls out to leadership teams of professional women at partnering businesses and organizations. The girls spend entire days with professional women doing hands-on, interactive science and mathematics. They really get a sense of what these women do in their careers.

One of the things about Advocates for Women in Science, Engineering and Mathematics (AWSEM) that may be different from other organizations is our partnerships with industry. We are fortunate to live in the "silicon forest." We have a lot of high-tech companies in our backyard. What I have learned is that there are women who are doing science and mathematics in every community, whether it is the traffic engineer or the water control people. These are the women that we can bring in to mentor our girls. Professional corporations and businesses recognize that this concerns their future work force. They know that there is going to be a shortage of engineers and computer scientists, and they want these girls to be computer and science literate.

Professional corporations and businesses can provide two things for us: mentors and money. I think this modular approach works well. These professional women have access to being involved at different levels. They can be involved by being listed in our directory of practitioners. Educators, Girl Scout leaders, and other people can call them, ask them questions, or have them speak at their school about what they do in their career.

We also have leadership teams of professional women at these organizations. We find somebody we know is interested in promoting women in the sciences. We go to the local university, the research university, Hewlett-Packard, Intel, or Women's Veterinary Clinic, and we get them interested in the idea of becoming involved with AWSEM. We put together a leadership team of three to five women who organize what we call site visits. Site visits are the interactive, hands-on sessions where girls spend the day with the leadership teams. The girls may only spend one or two hours with each woman or each group of women so these women are not losing their entire day from work, but they are becoming involved. They also meet with the girls at lunch time so that they can have a more informal time to chat about what makes them real people: "I climb mountains, I love to ride horses, I have a pet, or yes, I loved that movie."

Then we go to the public relations and human resources people at the company. We say: "You have these great women at your company that are doing this work for us. Wouldn't you like a little positive public relations? Wouldn't you like the people in your community to know about the great work that your company is doing?" Their eyes light up. They can do the press release. That is their professional job. Then we go to the CEOs and we say: "Your company has been doing this really great work with us. Wouldn't you like to sponsor a club for next year?"

I have taken parts of the presentation that I give to the human resources and public relations people or CEOs, to give you a sense of the kind of information I am sharing with them.

First, I introduce myself and tell them about AWSEM. I explain that AWSEM is a part of Saturday Academy at the Oregon Graduate Institute of Science and Technology. There are other projects there as well. One of the things that we do is refer the girls involved in our program to one of our other programs such as ACE, which has eight 40-hour weeks of mentoring available to them. So that if they have sampled careers they would be interested in with AWSEM, they can move on to ACE for a long-term mentoring opportunity. Next, I show them our mission statement. I tell them why their support of AWSEM is so important. We have a research article that backs up what we are saying. These are available on our Web page. Then I use peer pressure. I talk about who else is already supporting us. This tactic seems to be pretty successful.

We have worked really hard to be diverse in terms of who we serve. We choose club locations in different socioeconomic and cultural areas throughout the Portland metropolitan area. Last year we had 30 percent minority participation. We use zoos, veterinary clinics, and municipalities. Anybody who uses science and mathematics can have a leadership team of women. The advantage of this approach is that it splits up the work and makes it doable. Professional women do not have a lot of spare time. Breaking it down into bite-size pieces seems to really help.

Next, I explain to the human resources and public relations people what AWSEM can do for their company and their community. These companies appreciate being seen in a positive light in reference to gender equity, and they want to bring women into their companies. Women in Technology International (WITI) has chosen us to be their official young women's program. These companies are mandated to recruit women. There were over 3,500 professional women at the conference we went to last June. The women were all carrying around their resumes, and these companies were vying for their attention. The prospective companies really need to know the benefits of being involved in a program like this.

Lastly, I hit them with a menu of opportunities. I say "These are all the products and things that AWSEM has put together. Wouldn't you like your logo on them?" I think we have to work within a company's culture. I have already had companies sponsor our directory of practitioners. We have two different companies talking about sponsoring the Web page. We have a company that is sponsoring two clubs in its community. We had another company that wants to sponsor lab coats. Anything that gets their name out is of interest.

In summary, the advantages of connecting with private business are the avenues to professional women that are opened to young girls, allowing them to get a vision of women in real-life science careers. We also recognize the positive effect monetary support and continuing sponsorship have on reaching our goals.

The Sisters in Science Program

Penny Hammrich

Temple University

When we talk about what mentoring is, we see our program as being not so much concerned with traditional one-on-one relationships, but more with an intergenerational group mentoring model. When I say "group," I refer to a role modeling effect. We have groups of individuals at different levels within our program, which deals with our target audience, fourth and fifth grade girls. Within our model, we have Temple University students on science career paths, and elementary education students that will be the teachers of the students we are targeting. There is a level of role modeling and mentoring with them. We have been working with the teachers because we have found that elementary teachers tend not to like science very much. We work very much with disadvantaged youth in Philadelphia, because their families and parents do not have much exposure to science or mathematics. Then we have an intergenerational corps of volunteers, who include retired and working women scientists.

We have an in-school program and are working with six elementary schools in Philadelphia. One of our success stories is that, when we first started out with our model project, the role models were Temple University students, and I was their teacher. In addition to my class, I went out into the school and worked with the students in the in-class program. That became very successful. One of the goals for most grant programs is that they become institutionalized. This program has been institutionalized at Temple.

Starting this year, all elementary education students will take a three-credit, in-school program in role modeling and mentoring. These students also learn that they must teach more science and mathematics. We currently have 106 Temple students this semester out in the schools. We have 19 classrooms, with four to seven Temple students in each classroom. They work in small groups with the children. The in-school program is for both boys and girls.

We also have an after-school program where our intergenerational corps of volunteers works to expose the young girls to the possibility that they could be like them. There is an elementary school right on the Temple campus. You often see the elementary school students around campus. I approached a couple of young girls that were walking on campus and asked them what they thought about science. They said, "Science is in a building like the Franklin Institute. Right?" I said: "Oh, no! Something has to be done!"

The population that we work with is 99 percent African-American. We give them a mentor or a role model, be it an elementary pre-service teacher, currently working women in the field, or a retired woman coming back. When they see a woman that looks like them and can share experiences with them, they start to identify with her. They begin to think: I can do that. I know somebody who has done this that looks like me, that is like me. I can be this.

One of the major focuses of our grant is to increase exposure to many different possibilities of a career in science, not just the traditional. We struggled with that. One thing that troubled us the very first year was that we could not identify very many retired women scientists, especially minority women scientists, because we were looking for traditional scientists. So we expanded that concept to look at nursing, technicians, etc. We expanded into any area that used knowledge of science, mathematics, or engineering.

Another pitfall has been transportation for the retired women. An example is our oldest volunteer, an 80-year-old woman who is a former medical doctor. She gets around Philadelphia, believe it or not, on a scooter. We have other volunteers without any mode of transportation. We looked to the women that were currently scientists and working in their fields. We looked at flextime, and other ways of getting them out of work, because we had some really great people who wanted to be role models and mentors. But it came down to an issue of time commitment. We are working with the corporations to give these women the time they need to participate.

A Comprehensive Curriculum for Training Mentors and Mentees in Science and Technology

Suzanne Brainard

University of Washington

Beginning in 1988, we ran a mentoring program at the University of Washington (UW), in which we matched approximately 100 professional engineers, male and female, with female students in engineering. We ran the program very successfully the first three or four years. Our success the first couple of years may have been a halo effect. My primary goal was to increase retention rates, because our retention rates at UW were incredibly low.

After the second year, the mentors began to ask questions: "How do I deal with this cross-gender issue? How do I deal with cross-racial issues?" The mentees did not know how to sustain a relationship, how to maintain the contact and keep it going. We realized that training was absolutely necessary.

We did a comprehensive literature search through the ERIC system and other databases. There was nothing on training mentors and mentees, except some very basic training done in nursing and in some at-risk programs. Most people perceived mentoring as a cure-all for everything, including corporate life. In truth, people did not know how to mentor. It is not an intrinsic skill.

We received grants from both the National Science Foundation and the Fund for the Improvement of Post-Secondary Education (FIPSE) to design, develop, implement, pilot test, and evaluate a comprehensive curriculum for training mentors and mentees in science and engineering.

We brought in outside trainers to pilot test the materials at our university. Carnegie-Mellon; Pacific Lutheran, a small, private liberal arts school near Seattle; and the University of Michigan received our material to implement. There were other sites, and an advisory board that reviewed the materials. The administrator's guide includes an overview of definitions, strategies for delivering the material, and an assessment of individual needs.

It gives guides for delivering and administering the material. Among other things, the core curriculum includes: roles and responsibilities, setting expectations, pitfalls, and hints. It includes some cross-gender, cross-racial mentoring concerns to look for and issues to be aware of.

It includes specific training on diversity, faculty and graduate mentoring training and some interpersonal communications. It also includes a stand-alone module for evaluating the curriculum. When you actually implement it, you will have evaluation forms that can be used for just the relationship between the mentor and mentee, or forms that can be used in terms of the whole effectiveness of administering the curriculum.

The Role of Evaluation
A Systems Endeavor

Jane Butler Kahle

Condit Professor of Science Education
Miami University
Oxford, Ohio

I spent more than the first half of my career very concerned about equity issues. I began working in the inner cities of Chicago and Gary when I was at Purdue University, where I worried particularly about racial and ethnic equity. At one point, I was challenged to begin thinking about gender equity. This was long before gender was created as a substitute for girls, or sex, or whatever. That started me down a very different and interesting road. We first started out by looking at teachers in science who were able to retain and encourage young women to continue in science, mathematics and engineering. This was in 1983. We found some commonalities among these teachers as we looked at them across the country. It was a very good learning experience for me, and I followed that work with about a decade of really looking at all kinds of factors that affected girls and women in nontraditional career patterns.

The NSF began the Statewide Systemic Initiatives (SSI), to be followed later by the Urban Systemic Initiatives (USI) and Rural Systemic Initiatives (RSI) programs and finally the Local Systemic Change (LSC) projects. SSIs are huge projects. They were very good in that they changed the way NSF looked at education and funded educational projects. These were five-year grants rather than three-year grants. There could be only one from a state. So they had to be very encompassing projects, and they were funded at a much higher level.

I agreed to be PI and write the proposal with a colleague from the Ohio State University, Ken Wilson, who just happens to be a Nobel laureate in physics. So I learned a lot about physics, as well as about state bureaucracies. The SSI experience in Ohio was perhaps unique, and I will explain why in a minute. But it certainly allowed me to combine two interests: my interest in equity and my interest in reforming science and mathematics in the K-12 arena.

By combining these two interests, I became intrigued by how policy can be affected. That is, how one really moves from a project to a change in policy at a state or a district or a school. The SSIs came along at a time when we were involved with equity and science, mathematics, and technology education.

The Systemic Initiatives, whether rural, urban, state or local, have looked at equity as an add-on. It was something NSF made you do, and it was very difficult to do, because you were already dealing with a bureaucracy that was a multi-headed hydra and very difficult to deal with.

In Ohio, we conceptualized our SSI on the basis of equity and on what equity research had told us was needed. Therefore, we focused on mathematics and physical science, the two academic areas that tend to rule out girls and many under-represented minorities. We focused on middle school because we wanted to get the kids when they all were still in science and mathematics. We focused on some learning strategies we knew helped encourage under-represented kids: cooperative learning, less competitive classroom atmosphere, authentic assessments, and inquiry. Equity became central to our SSI. It was the cornerstone upon which we built and developed.

In systemic reform, we talk about reforming all parts of the system. You have to address state policy, as well as how you distribute frogs in the biology classroom. It gets pretty complex. You have to think about policies. You have to think about programs. You have to think about teacher certification. You have to think about the quality of the mathematics and science taught, and so on ad infinitum.

We have to change from thinking about equity in a simplistic way to a very complex and changing issue. The groups that are under-represented in our country are constantly changing. What works today is not going to work tomorrow. The kind of assessment we do today is not going to be appropriate for tomorrow. We have to begin to grapple with very large and complex issues with equity, as well as with the reform of schools.

As a researcher, it is very important for each of you with your own projects and activities, to understand the reality of what we are trying to change. We really have to be in the schools if we are working with schools. If we are addressing the work place, we have to be in the work place. If we are working with a university, we really have to understand departmental processes and procedures. We have to understand the reality of what we are trying to change. It is also extremely important that we throw out the old guidelines on assessment, and we look at assessing any kind of program, project, or reform in multiple ways. In Ohio's SSI, we do nothing now that we don't follow up with classroom observations, with qualitative analyses and data. I think we all have to begin to think in multiple ways about how we look at assessment.

It does you and the world of children, of girls, or whomever, little good if you find out all these things, what is working and why, and you do not get that information out to the public. You must go beyond assessment to dissemination, and I do not mean academic papers and reports. I mean the types of things that are easily read and easy to disseminate.

We need to use multiple ways of assessing, and we need to get the word out. Because if you want to change policy, you have to build consensus and a desire for reform. Unless you have some evidence of change at the classroom level across the state, unless you have some evidence of children learning, particularly under-represented groups of kids, you probably are not going very far in systemic reform.

In Ohio, we did a "landscape" study. We were going to paint the Ohio landscape of science and mathematics education. We began painting it in '94 and continued yearly so that '95, '96, and '97 data are now available.

We used a multilevel design. Level A was a random sample of over 100 schools in the State of Ohio. Level B was a Horizon Research study that complemented what we were doing, and Level C was a close look at 15 carefully selected schools. There was overlap in our populations, so we had a good basis for comparison.

We used a matched sample. We matched teachers, who had been through the SSI professional development with teachers who had not, on the basis of the kinds of kids they taught and the kinds of classes they had. So we were matching at the classroom level rather than on teacher characteristics. We were able to collect a great deal of data across these three levels (A, B & C), and the study is still ongoing. For the next two years, we will go into four schools; they will be intense case studies.

Now, how did we choose the 15 schools for the study? We wanted to know how the equity issues were working. So we only went into schools that had at least 30 percent minority population. Ohio has about 16 percent African-American population and 17 percent Appalachian, so we had two minority groups that we wanted to assess. It also meant that most of these 15 schools, which we call Level C schools, were very poor schools as well. They were in our cities of Akron, Toledo, Columbus, Cleveland, Cincinnati, or in our rural Appalachian area in southeastern Ohio, along the Ohio River. It is beautiful, but very poor.

We collected quantitative data, as well as qualitative data. We based our whole project on inquiry in science and problem-solving in mathematics. We cut down the amount of content. We did everything possible to foster active learning in the classrooms. As a result, there was not a commercial test we could give.

From the National Assessment of Educational Progress (NAEP) items, we chose the ones that involved interpretation or analysis or some type of inquiry. The test works as well as I think it can work with the kind of cost constraints in large-scale assessments. I cannot recommend anything more highly than getting your hands on the NAEP public release items. NAEP surveys the whole country in a very careful random sample about every four years in each subject, i.e., science, mathematics, social studies, and reading.

One of the things you have to think about is presenting your findings to your public. Our rule of thumb was that we could not use a graph that took more than four lines to explain, because four lines is about all the public will read.

Now, reporting achievement scores was a touchy issue for us. We were attempting to close the achievement gap between minority and majority children, and we looked at our data in lots of ways before deciding what to share with the public.

What happened next along this road of assessment was that I was asked to think about something that was just flippantly called the equity metric. An equity metric had to assess who has access, who has the appropriate education, who achieves to his or her ability, who has resources or opportunities to learn, and who is in leadership positions. Equity in participation was quickly added to the metric.

My challenge was to come up with a way to measure when systems were moving toward reform in an equitable way or reaching equity. The metric was based on indicators, and the indicators had to be measurable.

If there is evidence of inequity, it can become an indicator of whether your system is moving toward equity. Therefore, I began to look for evidence of inequity. The idea was to help projects, school districts, classrooms, and universities measure their progress toward becoming more equitable.

It also became clear that there were leverage points in the educational system. These are the points at which data are routinely collected. They are 4th, 8th, 10th, and 12th grades for the K-12 system. Therefore, if you are going to make a change, or do something to make your system more equitable, you need to think in terms of those leverage points as well.

The indicators have to be sensitive to diversity among groups. When you are trying to measure your project, you must get something that sorts out whether the girls who have gone through your program are doing better in science than girls who have not gone through it.

The indicators are used to inform action; that is, to push the envelope to the next stage. They distinguish among opportunities, accessibility, and participation. Those are three very different things to measure. Further, the indicators are directed at leverage, or pressure points.

Then I also looked at what would make a system equitable. An equitable system is one in which all children achieve at or beyond a specified level. Thus, regardless of your subgroup, you are expected to participate and achieve equitably.

One of the issues I had to resolve was, "Is this appropriate?" There was the feminist argument, for example, that the science and mathematics standards do not reflect the feminist viewpoint. But we came to agreement that the national standards were the most appropriate, and that all children could achieve them.

The National Science Standards state that all children will learn high-quality science and mathematics. The question is, "Do you really mean all children?" Achievement should not be predicted by membership in a subgroup; rather, all subgroups should be represented proportionately at all achievement levels.

I will give you a description of how the equity metric could work in a real school system. If your project, or school district is trying to become more equitable, what are the key indicators? For example, if you wanted to be sure that the girls in your middle school program, having gone through your activities, are reaching a more equitable state in their science and mathematics education in the formal sector of school, what would you look for?

The first key indicator is enrollment in 8th grade algebra. If a student doesn't enroll in 8th grade algebra, she plays catch-up from then on, so enrollment is a key indicator of equity within a system.

The next key indicator is both the quantity and the quality of the science and mathematics courses. Quality is included, because there is very good evidence from the Third International Mathematics and Science Study (TIMSS) that very different kinds of things go on in classrooms that affect the achievement and future participation of kids. That describes the quality of the course. The quantity is just how many you have.

Quality issues are very important in gender studies, because many of the indicators don't work for girls. Except for physics, girls enroll in equal numbers in every science and mathematics course in high school. That includes calculus, by the way. They do better in those courses than boys do. They achieve higher, so if you are measuring indicators of enrollment and achievement, girls aren't under-represented. But there is new research from qualitative studies that points to very subtle indicators.

For example, Valerie Lee and her group looked at single-sex and coeducational schools across a variety of subjects, and they identified something they call gender-related incidents. They found the highest number of those incidents in chemistry classes in coed schools. As the number of males in the class rose, the number of incidents rose.

The other subtle indicator has been identified very recently; it is named gender lore. Gender lore is commonly accepted, and believed, information or ideas from the media about what is appropriate for boys and girls. The problem is that adolescent boys and girls both believe the gender lore. The researchers found that accepting gender lore was very restrictive to girls' participation and retention in science courses.

Although the equity metric has access, retention, achievement, and overall indicators of equity, some indicators are so subtle that you must look at the instructional quality of each course. Because you want to collect measurable data, you must go in and see if there are gender-related incidents; then they can be quantified.

In conclusion, the four most critical indicators of a system moving toward equity are:

Those are the four indicators that you should assess as you proceed with your projects and your work. However, if you become interested in changing school systems, you may have to look at both academic levels and a range of programmatic and classroom indicators.
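The proportionality idea behind the equity metric lends itself to a simple computation. As an editorial illustration (the data, names, and function below are invented, not drawn from any project discussed here), a subgroup's share at each achievement level can be compared to its share of overall enrollment:

```python
# Illustrative sketch of the proportional-representation check described
# above. All counts here are hypothetical.

enrollment = {"girls": 480, "boys": 520}  # program-wide counts

# Hypothetical counts of students at each achievement level, by subgroup.
achievement = {
    "below_basic": {"girls": 60, "boys": 90},
    "proficient":  {"girls": 300, "boys": 310},
    "advanced":    {"girls": 120, "boys": 120},
}

total = sum(enrollment.values())

def representation_index(level, group):
    """Ratio of a subgroup's share at one achievement level to its share
    of overall enrollment. 1.0 means proportional representation; values
    well below 1.0 flag under-representation at that level."""
    level_counts = achievement[level]
    level_total = sum(level_counts.values())
    share_at_level = level_counts[group] / level_total
    overall_share = enrollment[group] / total
    return share_at_level / overall_share

for level in achievement:
    for group in enrollment:
        print(f"{group:5s} at {level:12s}: {representation_index(level, group):.2f}")
```

An index near 1.0 at every level suggests proportional representation; in the hypothetical data, girls sit below 1.0 at the lowest level and at parity at the advanced level, which is the pattern an equitable system should show for every subgroup.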

Reports from the
Second Break-out Session

Midstream Measurables

It is important to know your goals from the beginning, in terms of what you want to assess with teacher preparation. Obviously, goals change as you progress through a project. It is important that people know what your data goals will be at various points in the project. As necessary, reassess those goals and make them clear.

In both in-service and pre-service teacher education, there is a tremendous latency effect. It takes a long, long time for change and growth to occur. Certainly, it takes a long time to see the effect of that change in the students that these teachers teach. Within the lifetime of most of our projects, we would not be able to get student data on course-taking patterns, achievement, or changes in attitudes and beliefs.

Many of us dealing with pre-service teachers see them for one term, and then they move on. It is very hard to get data over time and we need to begin developing midstream measurables.

Some measurables that are important to determine the progress of projects that are in midstream include:

Depending on the size of the project, it may not be feasible to directly measure some of those things. If you have a project that is impacting teachers over an entire state, you may not be able to observe those teachers teach. Therefore, we need other ways of looking at and measuring change. On a continuum of change, we must be able to measure and calibrate different kinds of changes.

Different kinds of measures of assessment that can be used include:

Following Through

We began our discussion by having people identify problems they were trying to assess, such as how to conduct a climate survey when there are differing versions.

One of our key principles was to ensure that people who are involved in assessment intend to do something with the information that they find. People get discouraged and cynical filling out questionnaires and participating in focus groups if they don't think it is going to make a difference. The action component and commitment of following through is very important.

Follow-through includes discussing how to effectively publicize what you have found. What forms and forums will make the information you have discovered available to the audiences that need to know in order to take action?

Education Equity Concepts

Our workshop was about non-formal programs where we focused on the issue of evaluation. Some programs we discussed were:

We focused on what is realistic to evaluate in such programs, given their time frame. If you have a one-year project, formal or non-formal, what is it reasonable to expect to evaluate in terms of running the program and showing the funders results in the students?

If you are training teachers who work with middle school students, the funder wants to know what effect that one-year program will have on the students' choices of chemistry in high school. Obviously, the answer is you cannot evaluate that. Other questions were: How can you trace an inherently transient population over years, even if you had the money and luxury to do it? What is inherent in the project, and what can you do to meet those goals? We could not answer those either. We concluded that you must look carefully at the goals you expect your project to meet.

Also, look at what criteria you are willing to be judged on. If you are breaking new ground, doing something experimental, the odds that something can go wrong increase exponentially. Both you and the funders need to understand that. Are you willing to fail? You must remember that evaluation isn't always about success. What you learn in doing your project is sometimes more important. Negative evaluations may give you the insight into how to build a less negative program. Use negative indicators to plot a more positive course. Ultimately, a positive program will evolve.

Another question we pondered was: "If your institutional research board makes you get parental consent for human subject research, how do you explain that to the parents?" It becomes even more difficult if the human subject research includes race, ethnicity or disability. How should parents be approached so they are not intimidated, insulted or enraged?

Talk to parents honestly and candidly at opening meetings, in one-on-ones, or with phone calls. Make sure you have people who can "communicate" with the parents to explain how their children will benefit from your program. Conduct parallel programs for families so that they can see and understand more about the program and how it can benefit their children. Provide literature about the success of other programs, explain your program goals, and most of all, believe in your program. If you have confidence that you can reach your goals, and that the children will benefit, the parents might be more willing to sign a human subject consent form.

A true sign of our times is dealing with the issues involved in putting a student's name on a Web page. The consensus was that you absolutely must have parent permission. A rule of thumb was not to use a last name or a phone number. It was suggested that you always have monitoring and have somebody forward the information. You don't want the student to be at risk in any way.

The legal counsel of the National Science Foundation has made that statement. Your institutions, however, may have a different set of criteria. Since the awards are made to institutions, you may have to go through some hoops at the university that NSF, in its federal role, does not see.

Technology and the Internet

Technological programs tend to fall into two camps:

But even within those two camps, there are distinctive and different roles that these technologies are playing. In the case of Web sites, girls are actually creating Web sites and home pages as a publishing medium for dissemination of information, and for doing collaborative research on-line.

The issue that comes up around technology and assessment is access. Rather than focusing on the assessment per se at that point, we discussed some strategies for dealing with the question of access. One suggestion was the creation of a clubhouse for girls. When developing effective programs you must:

When you are dealing with communication-based programs, what are the program aspects that stimulate conversation? How do you start and then sustain conversation on-line?

Indicators of a successful Web site are functional design, accessibility, and interactivity.

Keep the capabilities of the end users in mind: what technology is available for children with disabilities, particularly those with visual disabilities?

With a graphic-intensive medium, are there ways in which you can use the technology and design it for use in different ways? How do we ensure that all end users are being taken into consideration in the design?

An important point is that if we truly mean interactive, it doesn't mean just clicking the mouse. You want to look at how the technology or the Web site is fostering collaboration among the students. Look at what they are doing on-line and, more importantly, what they are doing off-line concerning this experience that you have structured within the environment.

You have to pay attention to context. Again, you are evaluating the design on-line, but you are also evaluating the design off-line. What are the conditions that are set up to get the girls there in the first place? How are you structuring the use of that environment? Is it in pairs? Are you creating activities around this on-line experience so that it is not just a straight dissemination vehicle?

One issue was safety on-line. In addition to access, the Internet is posing real safety issues, particularly for educators. How do you deal with parents who are concerned about what students are getting access to on-line? How the Internet is used is part of a program evaluation. Is the project just dealing with it in a censorship mode, which we have seen happen in schools? Categorically, because of fear of what the students will have access to, a school may say no to e-mail access for students. What strategies can you set up so that safety is taken into consideration without losing the benefits of the technology?

What are you willing to be evaluated on? One of the things that we must accept, as a group of researchers, is that evaluation and assessment are not forms of punishment. When we learn something new that does not work, it is just as valid as the things that do work, because it tells us what does not work. Otherwise, we will continue to make the same mistakes. We all remember the quote about history repeating itself.


We can probably come away with issues in four areas concerning assessment.

The Law, Privacy, Ownership,
and Mentoring in Cyberspace

John C. Chester

National Science Foundation
Assistant General Counsel

One of the things the National Science Foundation is very careful about is the language we use, the commitments we make in terms of policy, and how we interpret what is legal to do. NSF gets its learned understanding from the Office of the General Counsel. This session deals with issues of the media and copyright. Many projects have Web site space and deal with issues of cyberspace. The issue of safety, particularly for children using the Internet has been raised. How do we protect ourselves under the law?

Most of the questions you have concerning ownership you will have to decide yourselves. I can't help you. The law involving the World Wide Web, cybermedia, is evolving as we speak. The whole question of the governance of the Internet is up in the air. There is an interagency group working on it. Our Acting Deputy Director, Dr. Joseph Bordogna, appeared at Congressional hearings held last week. They are holding another hearing today to discuss the domain name system and how it will be structured in the future.

I can say that the Foundation has no intention of changing our basic laws. We leave rights with our awardees. We give all the responsibility for the protection and use of those rights to our awardees. One of the rights is the right to make income from any of your material. Rights reserved to the Foundation itself are very minimal. It is basically just a license. If something is patentable, then there are disclosure requirements, but that is about it. The best thing is to ask me questions and see if I can help or, more likely, whether some of your colleagues could help answer them.

Q: If you have a collaboration with a publisher, and they are going to disseminate your materials nationally, are there certain aspects of the materials that come under NSF regulations that deal with free dissemination?

Mr. Chester: Any copyrighted material produced with NSF support is subject to our copyrighted material clause. I have to drop a footnote here. Unless you are cooperating in some international agreement, all that is required is your agreement that, if anyone claims copyright, the federal government has a license to use the material. And we also have a requirement that you give us an acknowledgment and a disclaimer.

Other than that, what happens to rights is completely up to you. You can keep them yourselves. You can transfer them to the publisher. The publisher can then transfer them to other folks, sublicense, that sort of thing. Our rights only pertain to the material prepared with our support. So that, when you are talking development down the road, second, third edition, we are not involved. At that point, our hope is that it is an ongoing project being supported by customers.

Q: There was some discussion yesterday that these projects do not fall under the Institutional Review Board (IRB) rules and regulations at institutions, and we were told to ask you about that. Can we have a statement from you to take back to our institutions that says we are relieved of those responsibilities?

Mr. Chester: NSF does not interfere in an institution's regulation of its research-educators. If the institution wishes to provide an extra layer of protection for human subjects by subjecting any research to the IRB process, they can do so. Our rules, however, which are the common rules adopted by all federal agencies, do have a broad exemption for educational projects.

Q: Do you have a cite for that?

Mr. Chester: 45 CFR 690.101(b). It says, "Unless otherwise required by the department or agency, research activities in which the only involvement of human subjects will be in one or more of the following categories are exempt from this policy." That is an example from the Protection of Human Subjects policy.

The first one is "...research conducted in established or commonly accepted educational settings, involving normal educational practices, such as research on regular and special education instructional strategies, or research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods."

I suspect most of the things you are doing probably fall under that one. So, as I said, you would not be required by the NSF regulation (which is implemented through a grant policy manual section and a grant general condition clause) to go through the IRB process.

However, your institution gets to make the rules for itself and, if it has decided it wishes to cover all areas, it can do so. We are not going to argue with them about it, but we would make it clear that as far as we are concerned, it is not necessary.

Q: It is a chicken and an egg problem, because we are told that the NSF requires it. So we need some sort of note from you.

Mr. Chester: You can refer them to this, and if there is a problem, you can suggest that they write in. You can write to me, but the name of the attorney in the Office of General Counsel who is actually in charge of this is Anita Eisenstat. She would be glad to confirm that we have not limited the exemption in the common rule.

Q: Suppose that, in a proposal, you state you are going to use a Web site as part of dissemination, to post activities or policy. What if your results come out much better than expected and you don't want to put them on the Web now?

Mr. Chester: Anything you put in your proposal is what you intend to do; it is what you wish us to fund at the moment. If circumstances change and you discover a better way to achieve the shared objectives of the project, obviously, that can change. We would expect, if you said you are going to put it on the Web site, and later you say, no, we wish to do something different, that something different has to be, from the program's perspective, as good as or better than putting it on the Web site, or else the program will be disappointed.

Now, as far as legal liability goes, you don't have any. Unless we would look at that deviation and say that is such a violation of a basic term of the award that we are going to come after you. Probably we wouldn't. It is really up to you. If we didn't trust you, we wouldn't give you our money.

Q: Most of the language, even in the NSF Manual, still refers to publications, which we interpret to be hard copy.

Mr. Chester: No, we have decided that a Web page is a publication. You are required to include acknowledgment and disclaimer when you prepare one as part of a NSF-supported project.

Q: What about materials presented as slides or PowerPoint presentations or other media, if it specifically has the name of your project in it?

Mr. Chester: A license wouldn't really be meaningful if the material isn't available because no one else has copies of it to use. But, yes, we would always like to get an acknowledgment and, if appropriate, a disclaimer, as with publications in scientific and engineering journals. I think the same would be true of a presentation to an equivalent committee of educators or scientists. We are going to be clarifying this in our next revision of NSF's Grant Policy Manual and the Grant General Conditions.

Q: In the case of a project jointly funded by two federal agencies, the National Science Foundation and the Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE), are the rules the same about publication and copyright? If not, who oversees the other one? Is it determined by the amount of money each agency has given?

Mr. Chester: No. Actually, in situations involving intellectual property, it is the agency that cares more. In most cases, and this comes up notably in relation to inventions and patents, there is no agency that cares less about such things than the National Science Foundation. Thus, the other agencies usually get to be the lead agency. This would be something we should clarify when making the award as to what rules apply. Again, if it is going out as an NSF grant, then the legally binding terms are our terms. If it is going out as an ED (U.S. Department of Education) award, it is theirs. If it is joint funding from two sources, you have a little bit of a problem. All I can say is I think the Foundation would be willing to sit with the folks from ED here and try to work out something that would be worthwhile.

All agencies are governed by OMB Circular A-110, which sets out the basic rules for copyright and that sort of thing. So you should not have any real serious disagreements. It should just come down to who is acknowledged.

Q: What is the copyright status of material you post on the Web?

Mr. Chester: Good question. I have no idea. It is a publication. I have seen numerous Web pages with copyright notices. Certainly, under the copyright law, it would seem fixed in a medium that would be copyrightable. I have not seen any cases where somebody has been pursued for copying a Web page. Actually, the only case that I can remember involved the opposite situation. Microsoft set up a link so they didn't copy the Web page. They just provided direct access to it from their site. The folks who were operating that site were angry, because doing that bypassed a lot of the advertisement you normally had to drill through in order to get to that page.

I believe it is copyrighted and something you can protect. However, as I was saying, we leave all the rights out there, in part, because we don't have to worry about protecting them. That is you folks' job. This isn't an issue for NSF, because as an agency of the federal government, anything prepared by our folks that goes on our Web page is in the public domain. There is no copyright under United States law on such material. So it is something with which we have no experience.

Q: When you say that with NSF policy, the right resides with the awardees, does that mean the PI or the university?

Mr. Chester: Yes.

Q: Which?

Mr. Chester: Since our agreement is with the institution, the grantee, we say that the grantee may claim or permit others to claim rights to any copyrightable material. The grantee can allow the author to retain it. They can transfer it to a publisher. They can sell it basically to anybody they want. All they have to do, as I said, is get everyone who has any rights to agree that the federal government has a license. We try to stay out of it. We know some institutions take the position that stuff done under sponsored research is a work for hire, the university is the author; therefore, the university owns it. Others regard the researchers more as independent contractors and allow them to retain the rights. We don't care. All we care about is that the work gets done, and the minimal right we retain is protected.

Q: With respect to the Web, I am wondering if there are distinctions being made between things that reside on the electronic Web site and things that are downloaded. Is there a difference in copyrighting those two entities?

Mr. Chester: I have not heard anybody suggest that there would be any difference between them, since even, as you say, something resides on the Web, but it is captured into the memory and often onto the hard drive on which the browser resides. So, even technically, physically, there is not much difference between the two. Perhaps somebody would draw a distinction based upon that again as part of the hashing out as to what copyright means when you have to protect it. But, thus far, I haven't heard of any distinction drawn.

Q: I am still a little confused about multimedia. Suppose that instead of publishing a paper, you hold a meeting and give a presentation, to a general audience, a site visit, or a group of advisory board members, in PowerPoint with slides and animation. The presentation contains the name of the project and an NSF logo. Is the content of that presentation the sole property of the presenter, or, because it is done in the context of an NSF proposal, is it treated like any other materials produced by the project for dissemination?

Mr. Chester: Good question, in the sense that PowerPoint slides are, in terms of copyright, fixed. So copyright would reside in them. They would be a physical product of the project, presuming they were done as part of the NSF-assisted project. Yes, they would be covered as copyright material.

The presentation, to the extent that it wasn't written down, would not be fixed in any medium; therefore, there wouldn't be any rights to it. Unless somebody was recording it, which would be a fixation, there wouldn't be anything five minutes after it finished. So there is no worry about what rights there are. Normally, anything produced (copyright material is deliberately a wide term) would be subject to the rules in the grant. These are so minimal (just a reservation of a license and the requirement of acknowledgment and disclaimer) that no one has ever had any problems with that.

Q: We want to disseminate a lot of what we produce, and we want people to be able to access it. Where can we find the exact wording to say that we are copyrighting the research that we have done, or a research paper, but we give permission to use it as long as it is not used for profit? Is there some one- or two-line statement that we can print on the bottom of what we are disseminating that allows other people, especially educators, to use and access and copy those materials without each one of them having to come to us and ask permission?

Mr. Chester: There is no set language of which I am aware. This is the sort of thing that perhaps one of the learned societies might wish to develop and have everybody agree to. I would have to direct you to your institution's attorneys, counsel, to see what it wanted to put on, because what you are basically doing there is giving license to the material with certain conditions. Obviously, the wording would affect just how broad that was and whether you could then charge other people or sue other people for infringement when they violated a license.

Q: Could you summarize when what we produce is in the public domain and when what we have produced is marketable?

Mr. Chester: What you produce is in the public domain only if you make a real fervent effort to put it there, since you are no longer required to put a notice of copyright on these things. NSF never requires anything be put in the public domain. Everything is marketable in any case.

Q: Even during the time that the grant is in place?

Mr. Chester: Yes. Certainly. The only consideration is that any money you receive that is not copyright royalties during the term of the grant is project income and has to be ploughed back into the grant. So you have to retain it and use it for the purposes of the grant. With copyright royalties and patent royalties, there is no obligation on the institution, and there is no obligation as far as any income received after the termination of the NSF award.

Q: What would be an example of that kind of income?

Mr. Chester: Non-copyright royalty income. For example, you are selling copies of a book. You are making the book itself or the materials, and you are selling them. Obviously, some of the money coming in is for the copyright on the item, but some of it is also for the paper and the printing and the distribution and the rest of it. So selling a book would be non-copyright income.

Royalties would basically be where you went to the publisher and said, for $5 million, we will give you the right to print these books, and they give you the $5 million. That is royalty income. The money they earn, which is apart from the award, since they are not our awardee, would be the other kind of income.

The rules have been set up very carefully to say that most of the income received by universities is not subject to any restrictions. Since, normally, they would work up a royalty arrangement, where somebody else does the heavy lifting, as far as the printing, the distribution, the rest of it, and the university gets the royalties back. As I said, with royalties, there are no restrictions. The university can choose to use them for the project, but is not required to do so.

I have found this very interesting. If you have more questions in this area, I am here at the Foundation; I would be glad to help you.

Equity Sub-Panel
Department of Education

Sue Klein

The Department of Education

The Gender Equity Expert Panel is part of a system of expert panels that the Department of Education is hoping to create. Another is the Panel in Mathematics and Science Education.

I will discuss the Sub-Panel on Mathematics, Science, and Technology, one of the six sub-panels in the Gender Equity Expert Panel. We have a member of the Gender Equity Expert Panel here with us, Caroline Thorsen, and our formative evaluator of the Gender Equity Expert Panel, Pat Campbell.

I would like to invite all of you, if you aren't already a member, to join the advisory group for the Gender Equity Expert Panel. We are seeking people who have expertise in gender equity, evaluation, dissemination or in product production who want to keep informed of what is going on in the Gender Equity Expert Panel. We need people who would like to be on-call as a reviewer to help others with their submissions. We have established a special e-mail system (listserv) that is our main way of communicating with members. Our listserv is being operated by our support contractor at the Education Development Center, the Women's Educational Equity Act Resource Center.

I encourage you to write me on e-mail. Include your address, phone number, all the information on your sign-up sheet, and your areas of expertise. Lynn Fox, who is the chair of that sub-panel, will know whom to call on specific issues.

The Gender Equity Expert Panel is trying to identify and share promising and exemplary products, programs, and practices to help consumers learn about their relative strengths and weaknesses, so that they can make wise choices. In addition to submitting your own work, we encourage you to urge others to do so. Unlike grant competitions, there is no cap on the numbers that can be approved.

The panel will be focusing its attention on four criteria categories:

Incorporate all four of these important criteria as you develop your projects.

As you document your success, we urge you to report results broken down by gender and, if possible, by gender crossed with race, etc. Specify how your product or program can be made, or already is, accessible to populations with disabilities. Please disaggregate your data wherever possible in sharing the results.
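As an editorial sketch of what such disaggregation might look like in practice (all records, names, and numbers below are hypothetical, not drawn from any awardee's data), outcome records can be grouped by gender, and by gender crossed with race, before rates are reported:

```python
# Hypothetical sketch of disaggregating outcome data by gender, and by
# gender crossed with race, before reporting results.

records = [
    {"gender": "F", "race": "Black", "completed": True},
    {"gender": "F", "race": "White", "completed": True},
    {"gender": "M", "race": "Black", "completed": False},
    {"gender": "M", "race": "White", "completed": True},
    {"gender": "F", "race": "Black", "completed": False},
    {"gender": "F", "race": "White", "completed": True},
]

def completion_rates(records, keys):
    """Completion rate for every subgroup defined by the given keys."""
    groups = {}
    for r in records:
        label = tuple(r[k] for k in keys)
        done, n = groups.get(label, (0, 0))
        groups[label] = (done + int(r["completed"]), n + 1)
    return {label: done / n for label, (done, n) in groups.items()}

print(completion_rates(records, ["gender"]))           # by gender alone
print(completion_rates(records, ["gender", "race"]))   # gender crossed with race
```

Reporting both groupings, rather than a single overall rate, is the point: an aggregate figure can mask a subgroup that the crossed breakdown reveals.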

We see the Gender Equity Expert Panel as breaking new ground. We are using an inclusive, positive, and empowering approach to evaluation that will focus on consumer, developer and funder needs. For example, we are working collaboratively with NSF and with you on the information and submission guidelines in the package of materials you have received from this program.

We encourage your own submissions, and we are hoping that the end result is greater accessibility, not only to NSF, but to all of the Department of Education-related dissemination networks and activities. We hope a contractor will develop consumer reports so that users can better understand the strengths and weaknesses of the various products and programs the Gender Equity Expert Panel approves as promising and exemplary. We also hope that by learning more about the good things that exist, we will be able to identify those that are promising but need additional support for evaluation and dissemination.

While the Gender Equity Expert Panel applies clearly to the Program for Women and Girls model projects, it is also relevant to experimental and dissemination activities. Some of the experimental projects are using, and we hope evaluating, established models. We need continual information on how these projects work with different population groups. This also fits into the notions that Jane Kahle had in her article on reaching equity through systemic reform. The work of the Gender Equity Expert Panel should help improve curriculum and other aspects related to her model by helping users be more informed as they select courses, projects, and programs. We can also help the users by eventually making this consumer information available. We will have information put together systematically by involving experts in a very careful analysis and sharing.

We hope that this will be a continuous, ongoing process that will be owned by the community that cares the most about each expert panel's topic.

In the Program for Women and Girls, we believe in the notion of practicing what we preach. We don't tell you to do anything that we ourselves don't do. And that means cooperating with other federal agencies and organizations that are committed to the participation of all people in mathematics, science, and engineering.

Note: For those of you looking to expand your projects or develop new avenues, the Guide to Programs gives you information about the kinds of things NSF funds. The publication number is NSF 99-4. Find it on the Web at

Program for Women and Girls:
Documenting Where We've Been

Conrad Katzenmeyer

National Science Foundation
Division of Research, Evaluation and Communication

Let me say something about the general approach to evaluation and this thing called the Government Performance and Results Act (GPRA) of 1993. We are from the Research, Evaluation, and Communication (REC) Division, which is responsible for all the program evaluations within the Directorate for Education and Human Resources (EHR). We are looking at the broad questions about the program as a whole. We do not do any project evaluations, per se, but rather look at the impact of the program. Contractors do the work in all of our program evaluations. We have a set of task-order contracts, and each evaluation goes to a particular contractor.

We have been working on a cycle of program evaluations under Congressional mandate since 1992. Congress asked us to do evaluations of each of our programs on a regular cycle and we are now in the process of completing that cycle.

We have about 30 programs in Education and Human Resources, which means, at any given time, we have about 15 evaluations underway. Women and Girls is one of the very last programs we have taken on, in part because it is not as old as some of the other programs, and we had a period of time to get ready for what we would do.

The evaluations vary from perhaps a year in length up to three years or even longer. The most significant change since last year is GPRA, whose influence will be increasing.

The GPRA requires that every government program, not just the education programs, have measurable goals. Further, for each program there will be indicators and measures. It is very much an outcome-oriented approach to accountability and management. Within a few years, it is expected that the outcomes that we collect will be linked to the budget process. In theory, it could be possible for the Office of Management and Budget and for Congress to monitor our programs and all other federal agencies in terms of outcome. Then, dollars could be assigned based on how we are doing. That is a fairly frightening thought.

We will see if it actually goes that far and how well we link up to the dollars. It is difficult, but certainly that is part of the intention, and we are just about to file our first performance plan. We will spell out our intentions for each year. We will then state how we are going to know whether we have met our objectives in terms of the indicators that we specified and the measures we will use for those indicators.

It gets to be a fairly mammoth effort, and it is a major shift in the way that we provide accountability for our programs. The reason for making a big deal about this is that it affects you. It will impact the projects, and we are, of course, nothing but projects. That is what NSF is about. Therefore, I am sure we will need to come to you for that information, and there will be data collection requirements. We are not ready to specify them yet, but they will be specified in the near future.

Our role in the Evaluation Program within Research, Evaluation, and Communication is to be responsible for the data collection and reporting of the indicators for research, evaluation, and communication as a whole. We will be linking to the programs so the monitoring information that they need will be tied to the information EHR needs.

We will actually be responsible for evaluations under GPRA. We will also be responsible for the development of materials and activities that are of help to the projects, PIs, and evaluators who work on our projects. I am sure you are familiar with the User-Friendly Handbook for Project Evaluation, the one with the yellow cover. It has been passed out before. I want to alert you that there is now a companion document, called the User-Friendly Handbook for Mixed Method Evaluations, NSF 97-153.

One of the problems with the original User-Friendly Handbook was that the emphasis was strictly on quantitative approaches to evaluation. We recognized early the limitations of that approach. We wanted to broaden this so our projects would consider the qualitative, as well as the quantitative, and use them together in mixed methods. This new handbook, developed by Westat, is available. To request copies, call Aspen Systems, which distributes publications for NSF, at (301) 947-2722. They will send you a copy.

We are currently developing a directory of evaluators. This is being done for us by the Evaluation Center at Western Michigan University. The purpose is to have a readily available electronic means of locating evaluators. This system is up and running. We don't have a lot of evaluators in it yet, but we are now beginning to increase the number as quickly as we can. I encourage those of you who are evaluators, or who have evaluators working on your project, to register with the directory. You can reach them at Western Michigan, and it is all done on a Web site.

We think it can be extremely helpful. One of the problems I hear about on a regular basis is that our PIs don't know any evaluators. Or they don't know any evaluators near them that would be appropriate to involve in their project. Often, it happens that there are evaluators, perhaps even on the same campus, or in the same institution, and they haven't met them. They don't know who they are. This directory will be a means of finding people nearby with the right kinds of credentials for doing evaluations. I am sure it will be very useful to you.

We are also compiling an electronic library of evaluation instruments, evaluation plans, and evaluation reports. This one is not as far along as we would like, but we hope to move quickly on it in the next year or so. The source for this information will be you. It will be our projects, and we will draw a number of examples and put them together on a Web site. You will be allowed to use the Web site library for your own purposes. So, I hope we will have something that will be of real value to you and allay some of the worry about evaluation questions over the next year or two.

Let me finish by saying something about the NSF culture and the role of evaluation. For many of you who have dealt with NSF, this will come as no surprise. For others who have dealt with other agencies, federal or state, you may find this somewhat surprising. The approach to evaluation at NSF is very much a product of the culture of this organization. What we have done, traditionally, is based on the research model. We think of a research project as a single investigator who carries out a project according to a proposal that has been peer reviewed.

The peer review of that proposal, plus the research culture, provide the controls on the quality of that work. I think, in the research model, that has not worked badly. I think it has worked really quite well for NSF. It has also meant that NSF had minimal oversight of its projects once they had been funded, for several reasons.

Traditionally, researchers do not expect that there will be much oversight from the agency. Because of the nature of NSF, we never have had the money or staff to do a lot of monitoring. Our job has been basically to distribute money in a fair and efficient manner. The other issues about evaluation, therefore, have not been given the same kind of attention in NSF as a whole. Therefore, it is not too surprising that NSF does not have a strong history of evaluation.

This is particularly telling with the education programs. Many people assume that these are big national efforts all headed toward a common goal, that all of the projects fit neatly into that goal and are almost replicas of one another. Those of you who have been involved know better.

Our programs are not that. Our programs tend to be a very diverse and loose conglomerate of projects that are put together, certainly, with a common goal, but with very loose direction. The objectives for our programs, again, tend to be loose. They tend not to be specific in behavioral terms, and they place a lot of emphasis on process, on actually doing the projects, rather than on the accomplishments of the projects.

The reason the programs have been structured that way in the past is that it gives maximum flexibility to the field. You have been able to propose to the programs what you want to do and have hopes that that idea will be acceptable when reviewed by your peers. If the idea survives, it will receive money. I think that is noble. I think that is great. I am also warning you that I think GPRA is going to make that a lot tougher in the coming years. GPRA, by its very nature, has a very different philosophy of what a program is and the way you go about judging the accountability of the programs.

You are going to see demands from the agency for much more structured programs with clearly defined outcomes. Inevitably, it will mean that there will be less flexibility in the nature of the programs, and in the nature of the projects that make up a program. I hope it doesn't go totally in that direction. I happen to believe in the old style for NSF. But I do know that the philosophy of GPRA is what guides us at the current time. It will be a major factor as we go forward from here, at least for a period of time.

So I put you on alert that even though we don't know what it all means, GPRA will undoubtedly mean something, and I do not necessarily think it is good for you or for us.

James S. Dietz

National Science Foundation
Division of Research, Evaluation and Communication

We are at the beginning of a process to design and carry out an evaluation of the Program for Women and Girls.

Basically, our purpose for the evaluation is threefold: to assess the impact and coverage of the program, to examine its impact in terms of programmatic structure, and to address the needs of the community.

We now have a two-year contract with the Urban Institute to do an evaluation, where we will be working with Jane Hannaway and Tony Cluwell. They have also promised to hire some very important people from the field who are both evaluators and knowledgeable about issues of women and girls in science, mathematics, engineering, and technology. Now would be a good time to hear from the field, the community, and the awardees: are we on track, and what suggestions might you have for us?

When we talk about impact and about program coverage, we want to know who received the benefit of the program. What kind of benefit was it? What has the program accomplished? Has the program made a difference in the lives, careers, and education of women and girls? What is the impact in terms of the cost of the program? What kind of indirect impacts occurred that we might not have expected when we originally designed the program?

Then we have questions that involve the impact of the program in terms of programmatic structure. Given the structure that we currently have, what is the relative impact of the program? How can a greater impact be achieved with the current amount of money, or the same impact with less? That really relates back to the cost issue. How are the findings, the outcomes of the projects, being disseminated? Has that been effective?

What kinds of partnerships have been formed? Have those partnerships been effective? What can we learn about partnering in figuring out when partnerships work and when they don't? What kind of products came out of the program and the projects? Were those generally of good quality?

Mentoring is a very special topic. What forms of mentoring do we see out there among the diversity of projects? Which ones seem to work and under what circumstances? Which ones seem not to work? We need to learn a lot more about various aspects of the program and its impact.

And then there are the needs of the community. What makes for a successful intervention, whether at the career level, college level, high school level, or before? What are the factors, components, and situations, the general mix of ingredients, that seem to pay off, and in what settings? How can that be useful to people in the field?

How can project impacts be increased? What effects should this have on the guidelines of the program? What function should the program serve in general? Is there a particular role that seems to be more or less appropriate, given the outcomes of the evaluation?

What makes for a successful project? Are there particular models that have arisen out of the evaluation that should be shared with the community?

What does all this say about interventions in gender equity? Does it add to the knowledge base? What does it have to say about how our programs should be designed? How should we distribute the money? And what does it have to say about how your projects should be designed?

That is what we are after. Those are our basic goals for the evaluation. We will be working closely with the Women and Girls Program staff in determining, throughout the process, just what methodologies we use to carry out the evaluation and fine-tune the general mix. Most of all, we would very much appreciate receiving your feedback.

GPRA Implications
and Impact on PWG

William A. Sibley

National Science Foundation
Acting Director
Division of Research, Evaluation and Communication

The first thing I want to do is congratulate you all, as scientists, engineers, and educators, on the progress you have made with women in this country. At the present time, 34 percent of our 24-year-olds have bachelor's degrees.

That is, by far, the most of any country in the world, and of those bachelor's degrees, 52 percent go to women. If it were not for the women, if Japan educated their women the way we do, Japan would outstrip us in the number of bachelor's degrees.

There are 400,000 practicing Ph.D. scientists and engineers in this country at the present time. About 93,000 of those are women, an increase of almost 10-fold over 10 years. That is really very good.

The idea is to establish performance goals, which must be objective, quantifiable, and measurable; to define performance indicators; and to compare actual results against them. Basically, two of the programs for which I have some responsibility have been in this game already. The key is for you to know what we need in performance indicators and for us, as scientists and engineers and educators, to use those to improve the system. Not to use them to be hauled into accountability, but to use them to improve what we do, and to use them positively.

Program activity contains two elements:

You have already heard a discussion about outcomes and impacts. Inputs are raw materials, the human and physical capital that you need in your laboratory or in your situation, to make things happen. In my case, it may be a spectrometer. It may be a photon detector. It may be graduate students or post-docs. That is the raw material. Outcomes are longer-term results. Impacts are what you want overall. What is the impact of this program? What did it really change? Was it positive? Was it negative? What happened in the end, maybe three years after you finished your work?

One program that could be evaluated in this light is called Centers for Research Excellence in Science and Technology (CREST). Its goal is to produce outstanding research and build an infrastructure on campuses, particularly minority campuses in this case, and to produce Ph.D. minority students and graduates. In our case, we can measure objectively and quantitatively whether these people are making a difference in their system or not. Before it was funded as a CREST center, their total research expenditure, including federal, was very low. When they were funded back in '88, they began to make a climb. We can do that for all 12 of our programs and show that they have climbed faster than the national average.

Johns Hopkins has the largest research expenditures of any institution in the nation. They spend more money, more federal money, than anybody else, almost a billion dollars a year, and they have done the same thing that the national average has: they have climbed by a factor of 2 in 10 years. That is the norm against which we work, a factor of 2 in 10 years. The other thing is that your federal funding should be leveraged by what you are doing. If your work is really good, somebody is going to help you pay for it.

So one of the things we look for is, "Does federal funding go up like the rest of it? And does it stay around 70 percent or more of your total funding?" From your side, you will see that the research expenditure totals have gone up, but the federal funding has stayed down.

Some of these kinds of implications are very important, and some of them will come out of GPRA. You have to realize that people will read into data things of that nature. So there are some measurable things.

The other measurable thing will be how many Ph.D. graduate students these people generated over the last several years. So there are inputs: how much money we put in, how many faculty we are funding, how many graduate students there are, how much equipment they buy. That is the input. These numbers, and how many Ph.D.s they produce, are the output, the direct output.

But are there outcomes? That is more than output. For example, Hampton University had no Ph.D. program when we started funding them. Now they are not only turning out minority Ph.D.s, but one of the Science and Technology Centers, where we place about $5-7 million a year, called and said, "We don't have an accelerator physicist. Can you send us a post-doc who can teach our people about accelerator physics?"

So Hampton University was able to supply a Ph.D. scientist to a major Midwestern university to help them with accelerator physics, because of all the opportunities they have through their CREST program and their Ph.D. program. That is an outcome. You will have many outcomes in your programs, things that have happened very positively that you can't necessarily quantify like an output. But you will have an outcome.

Basically, you will also have impacts. Because of the region where Hampton University is located and because they now have a Ph.D. program, Norfolk State and Old Dominion are going to cooperate with them. They have all been lifted up in a synergistic fashion because this program exists.

You all have things you can talk about in mentoring and other areas in that same way. So you will have input, output, outcomes, and impact. As you hand back your data and your experiences, you need to be alert to point that out.

We went through a Performance Effectiveness Review (PER), a review of the Alliance for Minority Participation (AMP) program, with 10 of the programs funded by AMP. They came in and reviewed their data, their outcomes, and their impact for the Assistant Director of EHR, who asked them questions, and decisions were made from that. We will do it again this year for another 10.

I want to leave you with that and leave that as an example.



Awardee Meeting

September 29-30, 1997

The Visiting Professorships for Women (VPW) Program was inaugurated in 1982 as part of NSF's efforts to develop full use of the nation's human resources for science and technology. The last VPW awards were made in 1996, and it is this last group of awardees who attended the VPW Awardee Meeting held in conjunction with the Women and Girls Awardee Meeting.

The VPW Program provided opportunities for the advancement of outstanding women and encouraged female students to pursue careers in science and engineering by increasing the visibility of successful women. Through VPW, experienced female scientists and engineers were given the opportunity to conduct advanced research at academic institutions of their choice, where they had access to the top scientists in their fields and the most advanced research facilities in the country. The VPW award provided funding for travel to the host institution, basic research expenses, and usual salary for a period of 6 to 18 months.

The following is a list of the VPW awardees who attended this meeting, along with their location information.


Sponsored Projects Office
University of California-Berkeley
336 Sproul Hall
Berkeley, CA 94720

Office for History of Science and Technology
453 Stephens Hall
University of California-Berkeley
Berkeley, CA 94720
PH: 510/642-4581 (OHST)


Massachusetts Institute of Technology (MIT)
77 Massachusetts Avenue
Cambridge, MA 02139
PH: 617/253-6304
FAX: 617/258-8827

Department of Chemical Engineering
Northeastern University
342 Snell Engineering Center
Boston, MA 02115
PH: 617/373-3900
FAX: 617/373-2209


Physics Institute for Nuclear Theory
University of Washington
Physics/Astronomy Building
P.O. Box 351550
Seattle, WA 98195-1550

Department of Physics
The Ohio State University
174 West 18th Avenue
Columbus, OH 43210-1106
PH: 614-292-1843


Grants and Contracts Services
University of Washington
3935 University Way, NE, JM-24
Seattle, WA 98195

Department of Computer Sciences
University of Wisconsin at Madison
1210 West Dayton Street
Madison, WI 53706-1685
PH: 608/262-3158


College of Rural Alaska, Kuskokwim Campus
University of Alaska Fairbanks
P.O. Box 368
Bethel, AK 99559

College of Rural Alaska, Kuskokwim Campus
University of Alaska Fairbanks
1026 West 10th Avenue
Anchorage, AK 99501
PH: 907/258-2439


International Business
Stern School of Business
New York University
44 West 4th Street
New York, NY 10012
PH: 212/998-0432

Department of Economics
University of Colorado
Campus Box 256
Boulder, CO 80309-0256
PH: 303/492-5923
e-mail: feeney@spot.Colorado.EDU or


School of Oceanography
University of Washington
3935 University Way, NE
Seattle, WA 98105-6613
PH: 206/543-9810

Applied Physics Lab, Box 355640
University of Washington
Seattle, WA 98195


Population Research Center
University of Texas-Austin
1800 Main Building
Austin, TX 78712

General Motors Research Labs
Operations Research
MC 480-106-359
Warren, MI 48090-9055
PH: 810/986-1350


Office of Research and Sponsored Programs
Rutgers, The State University of New Jersey
P.O. Box 1179-ASB, Annex II, Busch Campus
Piscataway, NJ 08855-1179

Department of Geology
Western Michigan University
Kalamazoo, MI 49008
PH: 616/387-5513
FAX: same phone number


Foreign Languages and Literatures
Massachusetts Institute of Technology (MIT)
77 Massachusetts Avenue
Cambridge, MA 02139
PH: 617/253-9778

Department of HDFS & Cognitive Studies Program
Cornell University
Ithaca, NY 14853
PH: 607/255-0829


Department of Mathematics
Harvard University
Holyoke Center, 4th Floor
Cambridge, MA 02138

Mathematics Department
Boston University
111 Cummington Street
Boston, MA 02215
PH: 617/353-9554


Department of Geological Sciences
University of Texas-Austin
Austin, TX 78712

(as of January 1, 1998)
Department of Geology
University of California-Davis
Davis, CA 95616


Chemistry Department
University of Rochester
Hutchinson Hall
Rochester, NY 14627-216

Chemistry Department
State University of New York
526 Natural Sciences Complex
Buffalo, NY 14260
PH: 716/645-6800


Salk Institute for Biological Studies
1010 North Torrey Pines Road
La Jolla, CA 92037

Home: (currently)
Department of Zoology
University of Washington
Box 351800
Seattle, WA 98195-1800
PH: 206/543-0487


Sponsored Projects Office
Regents of the University of California
336 Sproul Hall
Berkeley, CA 94720
PH: 510/430-3220

Department of Mathematics
University of Colorado
Boulder, CO 80309


Institute of Theoretical Dynamics
University of California at Davis
2201 Academic Surge Building
Davis, CA 95616

Home: (now at HOME Institution)
Department of Mathematics and Statistics
University of New Mexico
Albuquerque, NM 87131
PH: 505/277-4613

(at HOST till 12/31/98)

Department of Zoology
University of Hawaii
Kewalo Marine Lab
41 Ahui Street
Honolulu, HI 96813

Marine Laboratory
University of Guam
UOG Station
Mangilao, Guam 96923
PH: 671/780-3072


Incyte Pharmaceuticals
Porter Drive
Palo Alto, CA 94304

Computer & Information Science Department
Six MetroTech Center
Brooklyn, NY 11201
PH: 718/260-3502


Department of Physics
State University of New York at Stony Brook
Stony Brook, NY 11794-3800

Department of Physics
Brookhaven National Laboratory
Building 510B
Upton, NY 11973
PH: 516/874-3127

The National Science Foundation (NSF) funds research and education in most fields of science and engineering. Grantees are wholly responsible for conducting their project activities and preparing the results for publication. Thus, the Foundation does not assume responsibility for such findings or their interpretation.

NSF welcomes proposals from all qualified scientists, engineers, and educators. The Foundation strongly encourages women, minorities, and persons with disabilities to compete fully in its programs. In accordance with federal statutes, regulations, and NSF policies, no person on grounds of race, color, age, sex, national origin, or disability shall be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving financial assistance from NSF (unless otherwise specified in the eligibility requirements for a particular program).

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities (investigators and other staff, including student research assistants) to work on NSF-supported projects. See program announcement or contact the program coordinator at (703) 306-1636.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation regarding NSF programs, employment, or general information. TDD may be accessed at (703) 306-0090 or through FIRS at 1-800-877-8339.