Webpage last updated on: October 31, 2013

Industry/University Cooperative Research Centers
Program Evaluation Project

Resources for Evaluators

Overview and General Instructions

1.1 OVERVIEW

The Industry/University Cooperative Research Center (I/UCRC) Program and model grew out of an experimental program (the Experimental R&D Incentives Program) that operated from 1972 to about 1979. Beginning in 1980-81, NSF began systematically building I/UCRCs based on the industrial consortia approach developed and validated through that program. A more detailed description of the history and evolution of the I/UCRC program and its evaluation can be found in the sources listed in the publications list (see Section 7) and provided as supplementary reading (Section 9).

The ongoing evaluation of the individual Centers and the I/UCRC Program began in 1982-83. The evaluation effort has several key features: coordinating mechanisms for the national data collection effort; an on-site evaluator at each Center; standardized data collection protocol and instruments; and longitudinal data collection. Although the specifics of the evaluation have changed over time, these features remain the bedrocks of the evaluation effort. Support for the evaluation effort is provided by the I/UCRC Program, primarily in the form of a line item funding the on-site evaluator in each Center grant. NSF also provides funding for the other coordinating mechanisms and various supplemental studies.

There are three coordinating mechanisms for the evaluation effort. First, NSF I/UCRC staff provide overall guidance for the evaluation effort. Second, all evaluators comprise a coordinating committee for the evaluation effort. This group meets twice each year (typically January and June) to share briefings on the findings of different components of the evaluation effort, exchange information, orient new evaluators, and vote on issues related to the evaluation effort. Finally, a team from North Carolina State University (NCSU) has been contracted to coordinate and support the I/UCRC evaluation assessment activities. The NCSU team will also maintain and update this Evaluator website and analyze structural data collected by NSF from Directors.

The I/UCRC Evaluation project has four goals:

Primary

1) To help NSF and local Centers objectively evaluate their impact by documenting I/UCRC outcomes and accomplishments;

2) To promote continuous improvement by giving actionable, timely, data-based (formally collected and observational) feedback, analysis, and advice to NSF and local Centers;

3) To identify and communicate information about I/UCRC best practices to NSF and local Centers;

Secondary

4) To help promote a better understanding of industry-university-government research cooperation.

The remainder of this page focuses on the role and responsibilities of the on-site evaluator. As a reminder, Centers (and as a result, Evaluators) are funded under different solicitations, many of which are active at any given time. Please be aware of which solicitation(s) apply to you and your Centers. Click here for a table that summarizes the differences between the various active solicitations.

Back to Top

Section Updated October 31, 2013

Tasks

1.2 REQUIRED AND OPTIONAL ACTIVITIES

The evaluator's role may be described in terms of required and optional activities. This section will provide an overview of both; detailed instructions for administering required instruments and examples of optional instruments are provided in subsequent sections of this page. A master copy of each instrument is provided in the relevant section of the webpage. Evaluators should make a copy of each instrument and maintain the master copy for subsequent years.

Required Activities

At a bare minimum, the evaluator is expected to:

1) Attend semi-annual evaluators' meetings (typically in January and June);

2) Attend semi-annual Industrial Advisory Board (IAB) meetings (if unable to attend an IAB meeting, please find a substitute evaluator) and lead and/or assist in the implementation of the LIFE feedback process. (Note: Phase III evaluators may elect to attend only one IAB meeting per year.)

3) Prepare an "Evaluator's Report" with cover sheet when a new Center is established, and provide an annual narrative summary of significant Center developments for submission to NCSU and your Center Director; this must include an attempt to document a Center success case study and/or economic impact assessment (see “Identifying and Documenting IUCRC Center Success Stories and Economic Impacts”);

4) Complete the Semi-Annual Meeting Best Practice Checklist (Word and PDF versions) at each IAB meeting and attach it to the Annual Evaluator's Report for each I/UCRC;

5) Administer "process/outcome" questionnaires to faculty and Industrial Advisory Board members annually;

6) Prepare an annual report based on the process/outcome questionnaire data for your Center and submit to your Center Director;

7) Forward process/outcome questionnaire data to the evaluation team at NCSU;

8) Provide information and feedback to NSF; and

9) Provide information and feedback to your Center Director.

Directors are required to include a copy of the Evaluator's Report (i.e., item 3 above) with the renewal application, which is due each spring. A summary of required evaluation outputs is provided in Table 1. A calendar of annual evaluation activities is provided in Table 2.

Optional Activities

Evaluators may collect data in addition to the data required by the evaluation effort. For instance, in the past, questionnaires were used to collect data from Center administrators, industry technical monitors, departing IAB members and students. In addition, evaluators are free to add questions to the required instruments in order to provide data that are of local interest. A list of some of the instruments which have been used in the past or by individual evaluators can be found in Section 6.

Many evaluators find that they are called upon to perform other functions in their Centers. These functions are negotiated individually and depend on the needs of the Center, its director, and the evaluator's interest in becoming more involved in the Center's operations. Examples of these activities include assisting in planning semi-annual Industrial Advisory Board meetings, developing marketing plans, preparing or contributing to the newsletter, and providing general management consulting.

Note: For Centers funded under Solicitation #09-565, the minimum required duties are: 1) attend and assist at one Industrial Advisory Board (IAB) meeting per year; 2) prepare a brief Evaluator's Report with cover sheet for submission to the I/UCRC Evaluation Program and Center Director; 3) provide information and feedback about the Center to the NSF I/UCRC Program; and 4) provide information from NSF to the Center Director.


1.3 EVALUATOR AS ADVISOR/CONSULTANT

It should be noted that the last required activity, "provide information and feedback to your Center Director," provides an opportunity for the evaluator to become an advisor or consultant to the director. In fact, over time some directors have used their evaluator extensively in this role and value the contribution evaluators make as advisors. However, other directors have chosen not to use their evaluators in this way. While we encourage evaluators to become advisors/consultants to their directors, the evolution of this role depends upon the needs of the Center director, the evaluator's willingness to engage in these activities, and the relationship that develops between the director and the evaluator.

1.4 NCSU EVALUATION TEAM

The NCSU Evaluation Team is responsible for coordination of both the process/outcome questionnaire assessment and the Evaluator's Report. If you have any questions about these elements of the evaluation effort you should contact:

Denis O. Gray, Ph.D.
North Carolina State University
Department of Psychology
Box 7650
Raleigh, NC 27695-7650
Tel: (919) 515-1721
Fax: (919) 515-1716
E-mail: denis_gray@ncsu.edu

For general information about coding problems, deadlines, replacement questionnaires, and data entry please contact:

I/UCRC Project Manager
North Carolina State University
Department of Psychology
Box 7650
Raleigh, NC 27695-7650
Tel: (919) 515-3237
Fax: (919) 515-1716
E-mail: iucrc@ncsu.edu


TABLE 1:
SUMMARY OF REQUIRED EVALUATION OUTPUTS

Instrument/Procedure: Evaluator's Report
When Collected: Ongoing (including attendance at meetings, etc.); written at the end of the planning grant and updated annually*
Data Sources: All Center participants, Center documents
Reports: Full report to Director for inclusion in the I/UCRC renewal application; copy to NCSU; include cover sheet
Other: Must include the Semi-Annual Meeting Best Practice Checklist, the Evaluator Report Cover Sheet, and a Success Story or Impact Assessment

Instrument/Procedure: Process/Outcome Questionnaire
When Collected: Annually, August-November; commence data collection after one year of operation (exclude planning grant)
Data Sources: IAB members, faculty
Reports: Report for Center Director
Other: Submit data to NCSU

Instrument/Procedure: Meeting Summary Report
When Collected: At any IAB meeting at which an NSF representative is not present
Data Sources: Center Director, meeting materials, evaluator observations
Reports: Send to NSF shortly after the meeting
Other: (none)

* Your Center Director is required to submit this report to NSF with the Center renewal application.

Back to Top

Section Updated October 31, 2013

Timeline

TABLE 2:
CALENDAR OF EVALUATOR ACTIVITIES AND DEADLINES

DATE
TASK (Responsibility)
Annually

Submit Evaluator's Report (all evaluators)

  • Deadline: to be included with Center renewal proposal, due 90 days prior to award anniversary. Submit to director for inclusion with renewal proposal.
  • Submit copy to NCSU for I/UCRC archives.

Collect data for completing the Evaluator's Report (all evaluators)

Aug-Nov.

Collect Process/Outcome Questionnaire Data (all evaluators)

  • Submit report to Director summarizing results a.s.a.p.

Attend your Center's IAB meeting

  • Complete the Semi-Annual Meeting Best Practice Checklist (Word/PDF) for inclusion with the Evaluator's Report.
  • If NSF does not attend, submit Meeting Summary Report
November

Submit Supplemental Research Proposals to Research Committee

  • Deadline: Two months prior to next evaluator's meeting
December

Submit Process/Outcome Questionnaire Data (all evaluators)

  • Submit copy of all questionnaires (faculty and industry) by December 15

January
(date announced annually)

Evaluator Meeting (all evaluators)

  • Review of supplemental proposals
  • Other Business
  • Distribution of Center Director Survey Tables (NCSU)

Meeting of Center Directors

March

Attend your Center's IAB meeting

  • Complete the Semi-Annual Meeting Best Practice Checklist (Word/PDF) for inclusion with the Evaluator's Report.
  • If NSF does not attend, submit Meeting Summary Report
April

Submit Supplemental Research Proposals

  • Deadline: Two months prior to next evaluator's meeting (approximately April 1)
June
(date announced annually)

Evaluator Meeting (all evaluators)

  • Review of Supplemental Proposals
  • Update: Process/Outcome Assessment (NCSU)
  • Other Business

Back to Top

Section Updated October 31, 2013

Evaluator Report

2.1 BACKGROUND

The Evaluator's Report provides a narrative description of the circumstances, events, and actions that contribute to the Center's development and growth. The Evaluator's Report is prepared based on a structured outline. In new Centers, the focus of the Evaluator's Report will be on the ideas, plans, and efforts that led to the creation of the Center. In Centers that have been in existence for several years, the focus will be on the changes that occur as the Center evolves and matures, and on Center outcomes and achievements, with a particular emphasis on measurable economic impacts. In addition to the report itself, please also include the Evaluator Report Cover Sheet, which is intended to capture critical information and make it available at a glance, and the Semi-Annual Meeting Best Practice Checklist.

The intent of this report is to provide Center evaluators with a vehicle for recording the unique history of each Center while allowing the data to be aggregated across Centers for purposes of overall program evaluation. The report provides the director with a qualitative evaluation from an involved yet objective observer. Some Centers share this report with new members as a way of briefing them about the Center’s history. The report is also intended to provide NSF with a record of the Center’s progress and accomplishments.

2.2 GENERAL INSTRUCTIONS

All evaluators are expected to complete and submit an Evaluator's Report annually. The structure of the report is defined by the outline provided in Section 2.4. Major topics to be addressed by the report include: overview, environmental/institutional, organizational, research, accomplishments, and analysis. Brief instructions for completing these sections are included in the report outline. A particular point of emphasis for the Evaluator's Report, beginning in fall 2012, is a focused effort to document quantifiable economic impacts for members and other organizations. The report submitted to your director for inclusion in the Center's renewal proposal should typically not exceed five pages plus the cover sheet.

It should be noted that each of the sections requires the evaluator to directly address issues of stability and/or change for that year and note the impact of these changes. Much of the report can be completed based on information obtained from participant observation. However, evaluators may also need to examine Center documents, refer to national databases, and interview relevant Center participants (e.g., the director, business manager, other key Center faculty, beneficiary firms). Section 2.5 provides an outline of the kinds of issues that can be addressed in each section. Section 2.6 provides a sample of a completed report.

Schedule for Submitting Report

NSF requires that a copy of the Evaluator's Report be included with all Center renewal proposals. Directors are required to submit their renewal proposals to NSF at least 90 days prior to the anniversary of their award date (although it is not uncommon for directors to submit them after this deadline). Directors submitting proposals for Phase 2 or 3 funding are required to include their evaluator's report with their proposal and may need your report earlier in these instances. Evaluators must consult with their directors on the due date and should plan to submit a copy of their report to their director at least a month before this deadline in order to allow for review and integration. You may consult the NSF website to identify the award date(s) for your Center and its sites. Evaluators must also forward copies of the report to the NCSU Evaluation Team.

Access to the Evaluator’s Report Database

The Evaluation Team at NCSU will continue to monitor and archive the Evaluator's Reports submitted each year. It is anticipated that individual evaluators will propose studies that would require access to the Evaluator's Report database. Use of the database for this purpose is encouraged. Any evaluator interested in using the database should submit a written request to the NSF IUCRC Program Director for permission to receive any portion of the database.


2.3 HELPFUL HINTS FOR PREPARING AND COMPLETING YOUR EVALUATOR’S REPORT

Below are some "helpful hints" you may want to consider as you embark on preparing your Evaluator’s Report.
Preparation:

1) Gather all relevant files, notes, Center reports, etc.; organize them and become familiar with the information they contain.

2) Keep a copy of the Evaluator's Report handy (blank or the previous year's) and periodically (especially after board meetings, etc.) jot down notes that will form the basis for that year's answers.

3) Prepare a timeline of major events for the past year.

First Draft:

4) Make brief notes on each question outlining what should be included in your answer, noting where you will have to refer to documents or other sources to get adequate detail (including results from the Process/Outcome assessment).

5) Begin by answering questions for which you have the information at your fingertips; move on to other questions later, perhaps after you have had the chance to do some research.

Revisions and Final Draft:

6) Schedule interviews with the Center director and other relevant parties to get information you lack and/or clarify points of uncertainty. Do not waste people's time going over issues that are obvious.

7) Refer to outside documents as necessary (e.g., U.S. Industrial Outlook)

8) Make final revisions incorporating this information.

9) Solicit feedback from Center director on accuracy and completeness.

10) Fill in the Evaluator Report Cover Sheet.


2.4. EVALUATOR’S REPORT

Evaluator’s Report
(Revised September 4, 2012)

General Instructions: Summarize the Center's recent history (the last twelve months or the period since the last history was prepared). Please address each domain. Emphasis should be on significant changes that might influence the Center's ability to achieve its goals and objectives.
Length:  about 5 pages.

1. Evaluator Report Cover Sheet
Fill in the provided form to capture basic information and make it available at a glance.

2. Overview
Provide a general overview of the Center’s status.

3. Goals and Objectives
Please describe the Center's primary technical and organizational goals and objectives.
Instructions: This question can be answered by listing a Center's "official" (written) and/or unofficial (but acknowledged) goals and objectives. Goals and objectives should be updated or elaborated upon (e.g., the basis for inferring informal goals) as needed. You should indicate if your answer reflects a change from previous years.

4. Environmental/Institutional
Please describe any environmental (e.g., decline in industry's competitive position) or institutional/university (e.g., partnerships with other universities, shift in university priorities) changes which have occurred during the last year.
Instructions: Evaluators are encouraged to include industry data from outside sources.

5. Organizational
Please describe any changes in the Center's personnel, structure, policies, financial status, and operations that have occurred during the last year.
Instructions: The section can be answered based on participant observation, reference to program documents, data collected via the Process/Outcome questionnaire, and information reflected in the Semi-annual Meeting Best Practice Checklist.

6. Research Program
Please describe any changes in the Center's research program (e.g., new or modified thrust areas) that have occurred during the last year.
Instructions: The section can be answered based on participant observation, reference to program documents and data collected via the Process/Outcome questionnaire.

7. Center Accomplishments
Please describe any accomplishments or impact the Center has had in the following areas: knowledge/technical advances; technology transfer; educational impacts. In addition, comment on accomplishments that may reflect unique Center objectives (e.g., forging international linkages, etc.).

Instructions: Evaluators are encouraged to obtain the information needed to answer this question from a variety of sources, including: program documents (e.g., Center's annual report), responses or comments provided in the Process/Outcome questionnaire, discussions with CD, IAB members and faculty.

7.1. Center Success Case Study and Economic Impact Assessment

Effective 2008-09, evaluators are required to produce a mini-case (one to two paragraphs) describing a recent Center “success story” (click here for an example). A Center success story might be scientific, technological, technology-transfer, or educational in nature. The purpose of this part of the report is to document some Center activity, output, or outcome. This mini-case might be used by the Center, NSF, or others as evidence that the Center or program is achieving its objectives. While not exhaustive, Center successes might include: a significant scientific breakthrough or award; a firm's development or commercialization activity related to Center research (perhaps listed in the P/O survey); a significant new patent filing; a start-up or spin-out company linked to Center research; or a scientific, technological, or commercial development achieved by a Center alumnus. The evaluator is expected to obtain background information from relevant sources (e.g., faculty, members, licensees) about the success story to prepare the mini-case (e.g., think in terms of who, what, where, how, and when). Eventually, some of these mini-cases will be examined for inclusion in the “technology breakthroughs” report that the NSF I/UCRC program produces. The evaluator is encouraged to select a success story based on input from the Center director and/or IAB.

Effective fall 2012, evaluators are required to attempt to identify and document, via an interview, any Center outcome that might represent a significant economic impact. Please refer to the Economic Impact Assessment section for additional information on how to fulfill this responsibility. Please pay particular attention to whether the beneficiaries you interview desire confidentiality for the information and estimates they provide.

8. Analysis
Based on the information provided above and other relevant information, comment on the health and vitality of the Center. What are the implications of the various environmental, institutional, organizational and research changes? Is the Center making adequate progress in achieving its objectives? Is the research program still vital and current?

Instructions: Comments should reflect the evaluator's view of the major obstacles and/or opportunities that may affect the Center's success during the next one to three years.

9. Timeline
Attach an updated timeline of significant events and milestones that have occurred over the Center's lifetime.

10. Semi-annual Meeting Best Practice Checklist
Provide a summary of the extent to which the Center adheres to IAB meeting best practices.


2.5 EVALUATOR’S REPORT -- OUTLINE OF RELEVANT ISSUES

Instructions: Evaluators can address a variety of issues within the different sections of the report. The bulleted items listed below are a sampling of issues addressed by evaluators in previous reports.

1. Evaluator Report Cover Sheet

Provide a summary of Center status and activity, available at a glance.

2. Overview

Provide a general overview of the Center’s status.

  • Purpose of the report
  • Age of Center
  • Funding snapshot
  • Membership snapshot including names of firms
  • Number of years remaining on NSF award

3. Goals and Objectives

Technical Goals

  • Center's mission statement
  • Technical goals and objectives

Organizational Goals

  • Increase in number of industry sponsors
  • Increase in technology transferred
  • Reduction in production costs
  • Number of patents produced

4. Environmental/Institutional

Environment

  • Name and number of industries Center serves (e.g., energy and chemical)
  • Relevant economic indicators for industry
  • Percentage of companies in the industry that have downsized or closed
  • Percentage change in R&D dollars by firms or the industry
  • # of memberships terminated due to economic downturns and industry reorganizations
  • Percentage change in employment in the industry
  • Level of analysis may be local, regional, and/or international
  • Competition from other Centers

Institutional

  • Changes in university leadership positions
  • Changes in policies and procedures (e.g. overhead, conflict of interest)
  • Willingness and ability of the university, department to support the Center (e.g., facilities, equipment, personnel)
  • Interdepartmental issues
  • Status of linkages with other universities (e.g., multi-Center)

5. Organizational

Personnel

  • Turnover of key faculty and staff (describe transition)
  • # of personnel (faculty) by department supported by Center research
  • # of faculty who have left the Center
  • # of graduate/undergraduates supported by Center
  • Visiting scholars involved in Center research
  • Cohesiveness of key Center staff and faculty
Structure
  • Changes in organizational chart
  • Changes in roles
  • Changes in reporting relations
Policies
  • New or changed policy or procedures
  • Center SOP - are they adequate? Used? Stifle creativity?
Financial stability
  • Status relative to NSF criteria
  • Amount of funding by source
  • Membership status (current, new, left)
  • Major grants awarded
  • In-kind contributions
  • Prospects for new funding or loss of funding
  • Major equipment donations
Operational
  • P/O results on Administration items
  • Recruitment efforts
  • Proposal development and selection efforts
  • Planning efforts
  • Center newsletter activity
  • Formal decision making process (L.I.F.E. form used?)
  • Basic description of semi-annual meetings (e.g., projects funded, voting procedures)
  • Assessment of the effectiveness of meetings (i.e. findings from Semi-Annual Meeting Best Practice Checklist)

6. Research Program

  • P/O data from research items
  • Major thrust areas
  • Number and changes in core, enhancement projects
  • Average dollar value of projects
  • Total # of faculty/research assistants for each project; multidisciplinary teams
  • # of pre-proposals generated
  • Percentage of team-based research projects initiated by the Center
  • L.I.F.E. form ratings
  • TIE projects
  • # of projects voted on, accepted, rejected
  • # of collaborative research projects with industry
  • Typical length of projects
  • Basic or applied perspective

7. Center Accomplishments

Knowledge/technical advances

  • P/O data from faculty and IAB
  • Technical advances
  • Impact of publications
  • # of articles published; prestige of journals
  • # of presentations made
  • # of posters presented
  • # of abstracts submitted
  • Awards for research or teaching
Technology transfer
  • P/O data from follow-on funding and benefits
  • # of invention disclosures
  • # of patents filed
  • # of visits to and from industry and with other research laboratories
  • # of presentations made during industry visits
  • # of web sites (hits, citations from other web sites)
  • # of workshops conducted
  • # of open conferences
  • # of short courses offered
  • Commercialization
  • Dollar benefits
Educational impacts
  • Curriculum changes and improvements
  • # of graduates hired by industry
  • # of undergraduates that work at the Center
  • # of Master's and doctoral degrees completed with Center research
  • IAB offers on-site training for graduate students/undergraduate students
  • Post doctoral instructional components
  • Any kind of pre-college educational program
  • Impact on student training

7.1 Center Success Case Study.

  • A one to two paragraph description of some significant scientific, technological or other output or outcome of Center activities (see description above).
  • Please use the guidelines and supporting materials provided in the Economic Impact Assessment section.
  • Since it may take a number of years for a Center to produce economically significant impacts for its members, evaluators may complete this reporting requirement by indicating that there is nothing significant to report at this time.

8. Analysis

  • Commentary on major trends and developments noted in other sections of the report

9. Timeline

  • Chronology of major events since Center began

10. Appendix A: Semi-annual Meeting Best Practice Checklist


2.6 For a sample of an evaluator's report, please click here. You can also click here for an example of a Center Success Case Study.

Back to Top


Section Updated October 31, 2013

Process/Outcome Assessment

3.1 INTRODUCTION

The Process/Outcome (P/O) instruments are designed to assess Center processes and outcomes based on the perceptions and reports of industrial and university participants. Process items tend to focus on participant characteristics and perceptions of the Center's operations. Outcome items assess immediate, intermediate, and ultimate outcomes of Center activities. We have also recently added items assessing R&D efficiency outcomes. Here you will find a slide deck you may use to explain these questions to IAB members, as well as slides you may use for reporting results from these questions. There are currently three instruments used in this activity: the Industrial P/O Questionnaire, the Faculty Year 1-5 P/O Questionnaire, and the Faculty Year 6-10 P/O Questionnaire. Data may be collected in person, through a mail/email survey, or via a web survey format.

These P/O instruments are designed to provide directors with critical feedback from industry and faculty participants about how the Center is operating, giving directors the opportunity to correct problems and reinforce positive aspects of Center operations. As a consequence, it is critical that results from the questionnaires be shared with your director as quickly as possible. Section 3.5 provides some suggestions for providing concise and clear feedback to your director. Also provided is a sample report (PDF) that incorporates the process/outcome questionnaire feedback. The I/UCRC Evaluation Team has also recently developed a Checklist for Optimizing LIFE and PO Response at IAB Meetings to help evaluators collect their data and enhance response rates as much as possible. We have also developed a Process/Outcome Workbook, which includes codebooks and data entry shells to help streamline the data entry and submission process, as well as a research cost avoidance (RCA) calculator to help quantify Center impacts.


3.2 PROCESS/OUTCOME ASSESSMENT/ADMINISTRATION

3.2.1 Checklist for Optimizing LIFE and PO Response at IAB Meeting

Collecting Process/Outcome (and LIFE) data has become challenging. This document highlights steps the evaluator can take to optimize the compliance and response rate for both the LIFE forms and P/O questionnaire during the IAB meeting. Obviously, the effectiveness of these steps depends on satisfactory attendance by IAB members.

3.2.2 FREQUENCY OF ADMINISTRATION

The instruments are administered yearly during late summer or early fall. Begin administration after your Center has been in operation for one full year (do not count the planning grant period). Thus, the first administration of these instruments would begin in the fall after the first anniversary of the Center's birth (usually the starting date for the operating grant), the second administration the following fall, etc. Evaluators of Centers that have been in the program for at least 5 years may use the shorter Faculty Year 6-10 P/O Questionnaire in place of the full version. If an evaluator believes they cannot collect questionnaires from either industry or faculty for some reason (e.g., response set too small), they should contact the I/UCRC Evaluation Team for advice on alternative data collection options.

3.2.3 RESPONDENTS

In general, potential respondents for the Process/Outcome instruments are all faculty (and research scientists) and all IAB members involved in the Center during the preceding year.

Generation of Respondent List.

Experience has shown that the best way to obtain a respondent list is to contact the Center’s administrative assistant for an up-to-date contact list. However, if there is any doubt about a respondent, INCLUDE the individual in the sample. This includes individuals who participated in Center activities for most of the year but are no longer active at the anniversary date.

Extensive experience has shown that a different data collection strategy works best with the two target populations: industry and faculty.


3.2.4 INDUSTRY SURVEY IMPLEMENTATION

Industry respondents are extremely busy and are typically on-site only during the Center's twice-a-year IAB meetings. The fall IAB meeting is therefore an excellent opportunity to collect data by administering paper questionnaires. However, response rates may suffer if attendance at the meeting is low, and the interpretability of survey findings is greatly influenced by who completes the survey. Therefore, we also support web- and email-based survey administration. Evaluators are encouraged to employ all data collection methods necessary to achieve the maximum response rate.

A. Collecting Surveys at IAB Meeting

Steps in Preparing and Collecting Data.

We recommend you prepare your questionnaires for administration in the following manner:

  1. Get an up-to-date list of IAB members from your Center’s administrative assistant.
  2. Contact your Center director and ask permission to get 15-20 minutes on the next IAB agenda so members can complete their questionnaires on-site. Time blocks right before breaks usually work the best.
  3. Download a copy of the Industry Process/Outcome questionnaire
  4. Complete the top portion of the questionnaire. Personalize your questionnaire by typing in the name of your Center. Facilitate questionnaire returns by typing your name, address, and email and a "Return by" date. Make enough questionnaires for your respondent group.
  5. Coding the ID#. Each questionnaire requires a six-digit identification code. Assign this code in the following manner (an illustrative sketch follows this list):

    a) Columns 1-3: 3-digit Center code. If you are a new Center, please call the NCSU evaluation team to be assigned a Center identification number. For older Centers with a 2-digit Center code, please add a "0" at the beginning, as we have now graduated to triple-digit Center codes.
    b) Columns 4-6: 3-digit Respondent Code. Assign a unique code number to each individual receiving the questionnaire. We suggest the following convention for industry respondents: use columns 4-5 as a firm (or university) code and column 6 for the individual representative, who may change over time.
    c) Be sure to maintain a codebook for your respondents to assist in follow-up.

  6. Attach a cover letter to each questionnaire (personalized if possible) that explains the purpose of the overall evaluation and questionnaire. Sample letters are provided at the end of this section (see Section 3.6.1 - 3.6.6).
  7. During the data collection portion of the IAB meeting, briefly explain the purpose of the assessment, hand out the pre-coded questionnaires, ask members to complete them on the spot, and collect the completed questionnaires. Here is a sample slide deck you may use to help explain some of the newly added R&D efficiency questions, as well as sample slides for reporting R&D efficiency results.
Plan to have a few self-addressed envelopes ready in case respondents ask to complete the questionnaire outside the meeting. Promptly mail, email, or provide a link to the web-based questionnaire to any IAB member who did not attend the meeting. Since respondents do not put their names on the questionnaires, you will need to consult the identification numbers recorded in your respondent codebook to identify non-respondents.
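For evaluators who keep their respondent list electronically, the ID convention in step 5 and the non-respondent follow-up described above can be sketched in a few lines of Python. This is only an illustrative aid, not part of the official I/UCRC tooling; the function name, codebook entries, and example codes are all hypothetical.

```python
def make_respondent_id(center_code: int, firm_code: int, rep_code: int) -> str:
    """Build the six-digit ID from step 5: columns 1-3 = Center code (legacy
    2-digit codes gain a leading zero), columns 4-5 = firm (or university)
    code, column 6 = individual representative within that firm."""
    if not (0 <= center_code <= 999 and 0 <= firm_code <= 99 and 0 <= rep_code <= 9):
        raise ValueError("code out of range for the 3 + 2 + 1 digit convention")
    return f"{center_code:03d}{firm_code:02d}{rep_code:d}"

# Hypothetical codebook: ID -> respondent, kept only to assist in follow-up.
codebook = {
    make_respondent_id(42, 1, 1): "Firm A, representative 1",
    make_respondent_id(42, 1, 2): "Firm A, representative 2",
    make_respondent_id(42, 2, 1): "Firm B, representative 1",
}

# IDs copied from questionnaires returned at the meeting.
returned = {"042011", "042021"}

# Non-respondents to follow up with by mail, email, or a web link.
for missing_id in sorted(set(codebook) - returned):
    print(missing_id, codebook[missing_id])
```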

B. Coding Completed Questionnaires

The Process/Outcome Workbook contains all the information you will need for coding answers and entering them into the data entry shells. There is a codebook tab in the workbook for both faculty and industry surveys which includes the question number/variable label, associated question text, and appropriate response codes. For all forced choice questions, the response code is part of the item; numeric answers are coded as stated. You should be aware of the following miscellaneous coding conventions:

1. Coding missing data or "don't know" responses. Leave the column blank or code the response as "999".
2. Open-ended questions. Please transcribe the text response into the space provided. If the response is illegible or you are not able to transcribe the text for some other reason, please indicate whether or not an answer was given by:

Code "0" if the respondent offered no answer.
Code "1" if the respondent made a comment.
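The sketch below shows one way these conventions might be applied when entering answers into the workbook's data entry shell. It is a minimal illustration, not part of the Process/Outcome Workbook itself; the helper names and example values are hypothetical, and the only conventions taken from the instructions above are the "999" missing-data code and the 0/1 coding for untranscribable open-ended answers.

```python
MISSING = "999"  # leaving the cell blank is equally acceptable

def code_forced_choice(raw):
    """Forced-choice items: the response code is part of the item and numeric
    answers are entered as stated; missing or "don't know" answers become 999."""
    if raw is None or str(raw).strip() == "":
        return MISSING
    return str(raw).strip()

def code_open_ended(text):
    """Open-ended items: transcribe the text when possible; if it cannot be
    transcribed, code 1 if any comment was made and 0 if no answer was given."""
    return 1 if text and text.strip() else 0

print(code_forced_choice(4))      # -> "4"
print(code_forced_choice(None))   # -> "999"
print(code_open_ended(""))        # -> 0
```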


3.2.5 FACULTY SURVEY IMPLEMENTATION

Generally speaking, faculty respondents are much easier to reach than industry respondents. As a consequence, faculty questionnaires can be administered anytime during the fall semester, but generally no later than the IAB meeting. We currently support two methods of data collection: a mail/email questionnaire and a web survey.

Mail/Email Survey

After preparing the questionnaires as described in steps 1 and 3-6 above:

  1. Mail or email questionnaires to your faculty list (See Section 3.6.4).
  2. Send a follow up letter or email after about 10 days reminding non-respondents to complete and return the questionnaire. Repeat this step until you obtain an adequate response rate.

Web Survey

We now support a web version of both the long and short faculty questionnaires. With this data collection method, the local evaluator sends an email directing faculty to a web-based Process/Outcome questionnaire maintained by the NCSU Evaluation Team. When data collection is completed, NCSU will send you a dataset. Here are the specific steps for using this option.

    1. Notify NCSU Evaluation Team that you plan to collect your data by web questionnaire (ncsuiucrc@ncsu.edu).
    2. Get an up-to-date list of faculty and research scientists from your Center’s administrative assistant.
    3. Send an email to these individuals requesting they complete the web-based questionnaire (see Section 3.6.5). Please “cc” us at (ncsuiucrc@ncsu.edu).
    4. NC State will send you updates on your response rate (number of questionnaires submitted) upon request.
    5. Depending on your response rate, you should send several reminders to increase it. Since submitted questionnaires are anonymous, you will need to send reminders to all potential respondents (see Section 3.6.6). A small illustrative response-rate calculation follows this list.
    6. At your request, NC State will send you a documented Excel data set for you to analyze and summarize for your Center.
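Monitoring the response rate mentioned in steps 4 and 5 is simple arithmetic; the sketch below is illustrative only, with hypothetical counts, and the 70% reminder threshold is an assumption rather than an official I/UCRC target.

```python
# Hypothetical counts; substitute your Center's numbers.
faculty_invited = 18
faculty_submitted = 11  # from the NC State update on submitted web questionnaires

response_rate = faculty_submitted / faculty_invited
print(f"Faculty response rate: {response_rate:.0%}")  # e.g., 61%

# Illustrative rule of thumb (an assumption, not an official threshold):
if response_rate < 0.70:
    print("Consider sending another reminder to all potential respondents.")
```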

Submitting Data to NCSU

When you have received all or a good portion of the total responses and have recorded your data, send the completed Process/Outcome Workbook, with data entered for both faculty and industry and coded according to the associated codebooks, to the evaluation team at NCSU. Please retain copies of all questionnaires. If you collect data by web questionnaire, you can skip this step.


3.5 ANALYSIS OF RESULTS

Evaluators have the responsibility for providing the Center with feedback on questionnaire results. At a minimum, evaluators should prepare a report that summarizes both faculty and industry responses (frequency counts and percentages) to each item. Many evaluators are also asked to give a PowerPoint presentation on the results of the survey at the next IAB meeting. After the first administration, the report should emphasize significant changes in outcome/satisfaction items across reporting periods. Evaluators have also received positive feedback when they compare their Center to I/UCRC normative scores (distributed every summer by the NCSU team at the June NSF Evaluators' Meeting). The sample report (PDF) provides such comparisons. Please remember that you should promise confidentiality to your respondents. If you feel your director would benefit from knowing who gave a particular response (e.g., an intention to quit the Center), you must check with the respondent before providing this information.
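As an illustration of the minimum feedback report described above, the sketch below tabulates frequency counts and percentages for a single questionnaire item by respondent group. It assumes the responses have already been entered in a spreadsheet-like layout; the column names and values are hypothetical, and pandas is just one convenient way to produce these summaries.

```python
import pandas as pd

# Hypothetical Process/Outcome data: one row per respondent, one column per item.
responses = pd.DataFrame({
    "group": ["IAB", "IAB", "IAB", "IAB", "Faculty", "Faculty", "Faculty"],
    "overall_satisfaction": [5, 4, 4, 3, 4, 5, 4],  # e.g., a 1-5 rating item
})

counts = (responses.groupby("group")["overall_satisfaction"]
          .value_counts().unstack(fill_value=0))
percentages = counts.div(counts.sum(axis=1), axis=0).mul(100).round(1)

print("Frequency counts by group:\n", counts, sep="")
print("\nPercentages by group:\n", percentages, sep="")
```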

The NCSU Evaluation Team will be responsible for aggregating across Centers and for any comparative analyses of the Process/Outcome data.


3.6 PROCESS/OUTCOME ASSESSMENT REPORT

To see a sample of the Process/Outcome assessment report, please click here(PDF).


3.6.1 SAMPLE CENTER DIRECTOR INDUSTRY LETTER

October 12, 20XX

Dear [IAB Representative]:

We are collaborating with the National Science Foundation in a study of the NSF Industry/University Cooperative Research Centers program. The Center for [MY RESEARCH (CMR)] at [MY University] is one of the Centers funded by NSF. As you know, NSF support has been and still is very crucial to the development and growth of the [MU] program.

Comparative data for NSF-supported research Centers are being obtained. This information will be used to assess and develop NSF's internal policy and administration for establishing new cooperative research Centers. In addition, this data will provide [CMR] with useful feedback on communication patterns within [CMR] and your evaluation of our structure, procedures and achievements. The final report of last year's survey is available from the Center office on request. Since you were involved in [CMR] during the time frame covered in this study (20XX-20XX) we need feedback from you on the operations of our Center.

[Survey Link]

In the linked questionnaire, please be sure to select "[Center NAME AS LISTED IN DROPDOWN MENU]" from the drop-down menu for Center name (please be careful not to scroll so that the appropriate Center remains highlighted). Also, the first survey question refers to the percent of projects in which your organization takes an active interest. When answering this question, please consider the [NUMBER] projects presented at our last IAB meeting.

Dr. [EVALUATOR] from the Department of [RESEARCH at MU] is responsible for conducting the [CMR] study. S/He will ensure the confidentiality of individual responses. Dr. [EVALUATOR] and I would greatly appreciate it if you would please fill out the linked questionnaire at your earliest convenience. If you have any questions, please feel free to contact Dr. [EVALUATOR] at [(XXX) XXX-XXXX] or [email@address.edu].

Sincerely,

Director


3.6.2 SAMPLE EVALUATOR INDUSTRY LETTER

November 21, 20XX

Dear [IAB Representative]:

Thank you for participating in the recent [Center] meeting. On behalf of the National Science Foundation’s Industry/University Cooperative Research Centers (IUCRC) program, and Dr. Center Director and Dr. Site Director(s), I would like to ask you to complete a brief questionnaire on the Center’s operations and outcomes.
The survey linked below is part of National Science Foundation-sponsored research on IUCRCs. Since maintaining and strengthening support for the IUCRC program depends heavily on the views of industry participants, it is critical that we get your feedback.

The purpose of the questionnaire is: to provide information that will be useful to Center administration, to provide information that will contribute to the development and administration of the NSF IUCRC Program, to document the impact of participation in the Center for your organization, and to provide data that will enrich our understanding of industry-university cooperation in general.

[Survey Link]

In the linked questionnaire, please be sure to select "[Center NAME AS LISTED IN DROPDOWN MENU]" from the drop down menu for Center name (Please be careful not to scroll so that the appropriate Center remains highlighted).

The questionnaire should take about 5-10 minutes to complete. Your responses will be held in strict confidence; no individual or firm will be identified by name in any report resulting from this survey. Your responses will not be shared with other Center participants except in aggregated form.

I would appreciate it if you could complete the questionnaire by [DATE].  Thank you in advance for your cooperation. Please do not hesitate to contact me if you have any questions. Your response is greatly appreciated.

Sincerely,
Dr. Evaluator
Center Evaluator
(XXX) XXX-XXXX
[email@address.edu]


3.6.3 SAMPLE FOLLOW-UP INDUSTRY LETTER

November 12, 20xx

Dear [IAB member]:

I have begun coding our [Center] questionnaires and discovered that I have not received a completed questionnaire from you. I hope to proceed shortly with preliminary analysis but need a complete data set before I begin. I understand how busy you must be this time of year, but I hope you will take a few minutes to complete the survey by Monday, November 25, 20XX.

[Survey Link]

In the linked questionnaire, please be sure to select "[Center NAME AS LISTED IN DROPDOWN MENU]" from the drop down menu for Center name (Please be careful not to scroll so that the appropriate Center remains highlighted).

Thank you in advance for your cooperation. Please do not hesitate to contact me if you have any questions. Your response is greatly appreciated.

Thanks for your help.

Sincerely,

Dr. Evaluator
Center Evaluator
(XXX) XXX-XXXX
[email@address.edu]


3.6.4 SAMPLE FACULTY PAPER-BASED EMAIL

Dear Center Faculty Members and Research Scientists,

As you likely recall, each year the National Science Foundation requires that we conduct a survey of Center faculty. This survey is a tool for making interim assessments of the progress of the Center and is part of the overall database for the study of all the cooperative research Centers. It is designed to yield information that may be helpful to the Center and for helping to establish successful Centers at other universities.

Please complete the attached survey and e-mail or fax it back to me [ATTACH SURVEY]. **Note: be sure to "save as" the survey to your hard drive first; saving changes after opening the attachment will not preserve your data unless the file is saved to your hard drive.

Please notify me if you'd rather have me mail or fax this survey to you.

Otherwise, please take a few minutes to fill out the attached survey and email or fax us back the feedback.

PLEASE RETURN BY: [DATE]

Sincerely,
Dr. Evaluator
Center Evaluator
(XXX) XXX-XXXX
[email@address.edu]


3.6.5 SAMPLE FACULTY WEB-BASED EMAIL

Dear Center Faculty Members and Research Scientists,

As you likely recall, each year the National Science Foundation requires that we conduct a survey of Center faculty and research scientists. This survey is a tool for making interim assessments of the progress of the Center and is part of the overall database for the study of all the cooperative research Centers. It is designed to yield information that may be helpful to the Center and for helping to establish successful Centers at other universities. This year you are able to complete this survey via the Internet; thus, you will not be receiving a separate evaluation survey.

The link below will take you to the survey, which is a revised version of the one used in previous years. The link to the survey is:

[Survey Link]

At the beginning of the survey, please make sure to select: [ENTER Center NAME AS LISTED ON WEBSITE] (Please be careful not to scroll so that the appropriate Center remains highlighted).

Please remember that your responses will be held strictly CONFIDENTIAL. No individual will be identified by name in any reports resulting from this questionnaire. Your responses will NOT be shared with other Center participants except in AGGREGATED form.

Please submit your survey no later than: [DATE]

As always, if you have any questions, please do not hesitate to contact either [EVALUATOR’S CONTACT INFORMATION] or the I/UCRC Evaluation Project at NC State University (Phone: 919-515-3237, email at ncsuiucrc@ncsu.edu).

Thank you for your cooperation and we look forward to your participation!

Sincerely,
Dr. Evaluator
Center Evaluator
(XXX) XXX-XXXX
[email@address.edu]


3.6.6 SAMPLE FACULTY WEB-BASED EMAIL REMINDER

Dear [Center Name] Faculty Member or Research Scientist:

A couple of weeks ago, I sent you an email requesting that you complete a web-based questionnaire based on your participation in <Center Name>. If you have already completed the questionnaire, thank you for your help and please ignore this reminder.

If you have not completed the questionnaire, I would appreciate it if you could click on the following link and complete the questionnaire. It should only take about 5-10 minutes of your time. Since we do not ask for any identifying information on the web survey, your answers will be completely confidential.

[Survey Link]

Thanks in advance for your help.

Sincerely,
Dr. Evaluator
Center Evaluator
(XXX) XXX-XXXX
[email@address.edu]

Back to Top


Economic Impact Assessment

4.0 Identifying and Documenting IUCRC Center Success Stories and Economic Impacts: Guidelines, Scheduling and Supporting Materials

In a recent report, the NCSU IUCRC Evaluation Team demonstrated that significant economic impacts could be documented in mature IUCRCs by engaging in a proactive assessment strategy involving face-to-face or telephone interviews and the provision of confidentiality, if requested, to the beneficiary providing the economic estimates. Based on these findings, beginning in fall 2012 all evaluators are expected to attempt to obtain such information as part of their effort to document Center “success stories” within the Evaluator's Report. There is no assumption that every Center will produce such outcomes, particularly early in its history. However, evaluators should indicate in their report that they attempted to document such impacts. The NCSU Evaluation Team is prepared to provide consultation, including some selected site visits, to help evaluators fulfill this new data collection requirement.

Assessment Objective:

To provide early detection and documentation of Center outcomes that may represent significant economic or societal impacts.

The following are assessment guidelines, scheduling guidelines and supporting materials for fulfilling these responsibilities:

Assessment Guidelines:

  • Impact data collection should become a higher priority for evaluators as Centers mature, with greater emphasis placed on this activity during Phases 2 and 3 of NSF funding.
  • Special effort should be taken to document impacts that appear to have the potential to produce “significant” economic impacts.
  • Assessment should emphasize data collection via personal interviews of targeted high impact beneficiaries.
  • The evaluator is not expected to perform a precise economic impact assessment. Rather, s/he is expected to serve as an informed source who can “prospect” for impacts that appear to be potentially significant and that could be subject to more rigorous assessment by NSF.
  • Evaluators should confirm with interviewees whether their organization can be identified in their reports or whether they would like their identity to remain confidential. This may require the evaluator to prepare cases where the identity of the beneficiary is kept confidential and/or to restrict access to their evaluator report.
  • Evaluators should attempt to document forecasted impacts (impacts the respondent predicts will happen in the future) by conducting follow-up interviews with informants in order to validate these estimates.

Scheduling:

Evaluator Reports must be attached to a Center's Annual Report, which is due 90 days prior to the anniversary of the Center's award date. As a consequence, evaluators should typically begin exploring whether a significant Center impact has occurred about 60 days before that deadline, or roughly five months before the Center's award anniversary. Beginning in fall 2012, the NCSU IUCRC Evaluation Project will begin reminding evaluators when they should start this process. Centers that are submitting Phase II or Phase III proposals will be required to submit their annual report at the same time as their Center's proposal; in these instances, the evaluator may need to produce their report earlier.
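The schedule above reduces to simple date arithmetic. The sketch below uses a hypothetical award anniversary (substitute your Center's actual award date from the NSF website) to show when the renewal proposal, and therefore the Evaluator's Report, is due and when impact prospecting should begin.

```python
from datetime import date, timedelta

# Hypothetical anniversary; substitute your Center's actual award date.
award_anniversary = date(2014, 7, 1)

renewal_due = award_anniversary - timedelta(days=90)    # renewal proposal with Evaluator's Report
begin_impact_search = renewal_due - timedelta(days=60)  # roughly five months before the anniversary

print("Renewal proposal due:      ", renewal_due)
print("Begin impact interviews by:", begin_impact_search)
```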

The typical sequence of events for the assessment would include:

  • Contacting and interviewing the director, faculty or other informants and using archival information to identify a member organization or other beneficiary that may have realized a significant benefit from the Center (See EconImpact 1, 2, 3).
  • If such a potential beneficiary organization is identified, asking the director to send an email introducing you to the beneficiary (See EconImpact 4).
  • Sending an email to the beneficiary representative explaining the assessment and scheduling a screening and assessment interview (See EconImpact 5, 6, 7).
  • Preparing an impact assessment summary that can be included in your Evaluator Report (See EconImpact 8). 

Supporting Materials:

The following materials and tools are intended to facilitate the collection of well documented success cases that include credible economic impacts.

EconImpact 1: Guidelines for identifying beneficiary organizations

EconImpact 2: Guide for first contact briefing with Center director

EconImpact 3: Sample guide for interview with Center director

EconImpact 4: Sample email from Center director to beneficiary

EconImpact 5: Sample email from evaluator to beneficiary

EconImpact 6: Guide for pre-screening interview with beneficiary

EconImpact 7: Guide for interview with beneficiary

EconImpact 8: Sample summary report of impacts


4.1 EconImpact 1: Guidelines for identifying beneficiary organizations

IUCRC Impact Assessment
Guidelines for Selecting Firms/Organizations for Impact Assessment Interviews

Goals:

  • Identify firms that you believe have realized the greatest economic impact from IUCRC research and/or technology;
  • Attempt to obtain a more detailed, preferably quantitative estimate of impact

Criteria for selecting firms:

  • Any firm/organization that you suspect used previous or current IUCRC research and/or technology to improve and/or create products, processes or services.
  • Firm/organization could be an IUCRC member, an ex-IUCRC member, a start-up or spin-out based on IUCRC technology, or even a non-member that has acquired a license to use IUCRC technology.
  • You believe the firm/organization's efforts are far enough along that it has already realized some economic benefits by virtue of cost savings, sales, improved performance, etc.

Examples of Types of Impacts
Impact could be technology created within the Center or research that leads to the creation of IP or commercialized products and processes within the firm. Examples include:

  • The discovery of new knowledge (e.g., regarding the properties or characteristics of relevant materials) that led industry in new promising directions
  • A new research method or technique that helped accelerate technological progress
  • A new measurement tool, device, software, or algorithm that works with or impacted an existing commercial process (either whole or in part)
  • An entirely new process that replaced an existing process
  • A new subsystem or component that improved an existing product
  • A whole new system or product that replaced an existing system or product
  • A whole new system or product that opens a new market or industry
  • Other developments that might have commercial value like creation of an accepted industry standard

Sources of Information on Potential Beneficiary
A variety of sources may lead you to identify a firm/organization as a potential beneficiary:

  • Discussions you have had with firms/organizations
  • Center PIs who are familiar with member applications of Center knowledge and technology
  • Examples you have documented in the Center's previous annual reports
  • The Center Evaluator’s report, including member survey data and Center structure data
  • Items you have nominated for the Compendium of Technology Breakthroughs  (produced by Craig Scott)
  • University Technology Transfer Office (for patent licensees)

4.2 EconImpact 2: Guide for first contact briefing with Center director

Background: The NSF is interested in identifying organizations that have benefited from various IUCRCs. I would like your help in identifying organizations that you believe have benefited significantly from the transfer of knowledge and/or technology from the Center. The Center research or technology they have benefitted from may have been performed at any time during the Center’s existence. The purpose of this handout is to give you some guidance on the type of firm that might be a good candidate for my assessment.

Types of Beneficiary firms. Since we're interested in quantifiable, already realized economic impacts, we are primarily investigating knowledge and/or technology that you suspect a firm used to improve or create products, processes or services. The ‘firm’ could be an IUCRC member, an ex-IUCRC member, a start-up or spin-out based on IUCRC technology, or even a non-member that has acquired a license to use IUCRC technology.

Criteria for Selecting Firms
The IUCRC could have economic impact on firms through various channels, including IP created within the Center or research that leads to the creation of IP or commercialized products and processes within the firm. Specific examples include:

  • The discovery of new knowledge (e.g., regarding the properties or characteristics of relevant materials) that points industry in new promising directions
  • A new research method or technique that could help accelerate technological progress
  • A new measurement tool, device, software, or algorithm that works with or impacts an existing commercial process (either whole or in part)
  • An entirely new commercial process that could replace an existing process
  • A new commercial subsystem or component that improves an existing product
  • A whole new commercial system or product that replaces an existing system or product
  • A whole new commercial system or product that opens a new market or industry

What I plan to do. I recognize that you may only have a general idea about how or whether a firm has benefited from your IUCRC work. Consequently, I would like to talk directly with firms that you believe are beneficiaries of IUCRC knowledge and technology. Their participation will be voluntary, and we will not share the information they provide without their permission. Depending on the type of knowledge or technology, I will ask a firm representative about specific impacts, such as follow-on spending, related employment, total sales, cost savings to the firm, cost savings transferred to customers, and similar economic impact questions. We realize that precise information is likely to be proprietary, too difficult to develop, or both, so rough estimates will be perfectly adequate for our purposes.

Reference sources or materials that could help identify possible impact cases include:

  • The director’s own knowledge and experience with the Center and its members
  • Items reported in the Center's past annual reports
  • Previous Center Evaluator’s reports and member survey data
  • Items included in past copies of the Compendium of Technology Breakthroughs  (produced by Craig Scott)
  • Center PIs who are familiar with member applications of Center knowledge and technology
  • University Technology Transfer Office (for patent licensees)

4.3 EconImpact 3: Sample guide for interview with Center director

Note: Ask about each impact separately. Obtain as many impact cases as possible. At this point, look for a general understanding of the knowledge/technology before interviewing the firm contact. Remember to ask for any written documents or weblinks that may relate to successes.

  1. Taking them one at a time, can you identify any firms that you think have realized quantifiable economic impacts from Center research? Who is the beneficiary firm? How is the firm related to the Center—e.g., member, non-member, start-up?
  2. How do you think the firm has benefitted?
  3. Can you give a short, layperson’s description of the knowledge or technology that has had quantifiable, already realized economic impacts? Note: If other than basic knowledge, ask:
    • How does this relate to similar technologies on the market—e.g., incremental improvement, whole new technology, replacement technology, etc.
    • What are the benefits over existing technologies on the market?
  4. Can you explain the technology path or trajectory and how the Center fits in? Did the knowledge or technology originate within the Center?
    • When did the firm adopt the knowledge or technology?
    • At what stage of development did the firm adopt it?
    • In your opinion, would this development have happened without the Center’s research?
  5. What do you know about the impact of the knowledge or technology on the firm—for example, impact on processes, products, services, new hires, total sales, etc.?  At this point we’re only looking for a general sense of the impact before we talk with the firm representative. 
    [Note: ask for  any written documents and/or weblinks that may be available.]
  6. Who can we contact within the firm for a short interview about the realized economic benefits related to the IUCRC knowledge or technology (e.g., profits, unit sales, cost savings to customers, etc.)? [Note: Get name, email, and telephone number.] We need an introduction to this individual. If I send you a generic "heads up" email, can you follow up with your contact?
    • Do you think this firm representative will be forthcoming in sharing impact-related information, even in general, rough estimate terms?        

Note: The 'technology readiness level' (TRL) scale offers excellent definitions of different stages of development and can be found here: http://en.wikipedia.org/wiki/Technology_readiness_level
TRLs specific to manufacturing can be found here: http://www.dodmrl.com/
Other information, including TRLs specific to software, hardware, and biomedical technologies, can be found in the DoD Deskbook: http://www.acqnotes.com/Attachments/Technology%20Readiness%20Assessment%20Deskbook.pdf
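
If a quick in-interview reference is helpful, the sketch below condenses the standard nine-level TRL scale into short labels. This is a paraphrased summary for convenience, not a substitute for the authoritative definitions at the links above.

```python
# Condensed paraphrase of the standard nine-level TRL scale (summary only;
# see the references above for the authoritative definitions).
TRL_LEVELS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "Prototype demonstration in a relevant environment",
    7: "Prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful operations",
}

def trl_label(level: int) -> str:
    """Return the short label for a TRL level, e.g. trl_label(6)."""
    return TRL_LEVELS.get(level, "Unknown TRL level")
```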


4.4 EconImpact 4: Sample email from Center director to beneficiary

Dear [Beneficiary]

I am writing to give you a heads-up and an introduction to [evaluator], who will be contacting you soon concerning your involvement in [Center] and the benefits you may have realized.

[Evaluator] is interested in talking with beneficiaries of [Center] research and would like to talk with you about participating in this effort, potentially through an interview.

By copy of this email, [evaluator] will follow up with you. I encourage you to take part, as this study could potentially influence policy on industry-university collaboration.

Sincerely,

[Center Director]


4.5 EconImpact 5: Sample email from evaluator to beneficiary

Dear [Beneficiary]

I wanted to follow up on [Center Director's] email to you regarding our study of the economic impacts of the National Science Foundation's (NSF) Industry-University Cooperative Research Center (IUCRC) program.

The NSF is especially interested in the measurable economic impacts of IUCRCs, but we realize that many, perhaps most, of these impacts are very long-term and difficult or impossible to measure in strictly economic terms. So, we're asking companies that may have benefited from working with IUCRCs to estimate (roughly) for us the impact that Center "outputs" (ideas and technology) have had on the company and its customers. We realize that precise information is likely to be proprietary, too difficult to develop, or both, so rough estimates are perfectly adequate for our purposes.

I would like to schedule a brief phone interview with you to discuss the nature of our assessment and the kinds of impacts we are looking for, and to see whether you are willing to help us out. If you decide to participate in our assessment, we can conduct the interview right then or schedule it for a separate time. I will not share information specific to you or your company outside the research team without your permission.

Please let me know when would be a good time to talk; otherwise, I will follow up with you by phone.

Regards,

[Evaluator]


4.6 EconImpact 6: Guide for pre-screening interview with beneficiary

General Information: Company Name; Center Name;  Participation Years; Company Contact; Title/Position; E-Mail Address; Date of Interview; Interviewed By

Introduction. Thank you for agreeing to talk with me. I am assessing the economic impact of ideas and technologies emerging from industry-university cooperative research. [Director] at [Center] suggested that your organization may have benefited from Center research and/or possibly taken research into commercialization. The purpose of this phone call is to explain the nature of our study, to confirm that your organization has realized commercial benefits from ideas or technology that could be attributed in whole or in part to the [Center], and to see whether you're willing to share this information with us. Also, if some of the information we ask about is contained in written documents or web-based sources, please direct us to those sources.

Nature of the assessment. The National Science Foundation (NSF) is interested in getting information from organizations that have benefited from ideas and technologies from IUCRCs. My report will highlight documented quantifiable outcomes and impacts. We'd like to report specific successes to the NSF; however, we will not identify you or your company without your explicit permission.

Part I: Screening Question

  • Has your firm/organization benefited technologically or economically from its involvement with [Center]? If so, how?
    • Can you describe it briefly, in layperson's terms? (If necessary, clarify as product, process, service, component, tool, software, method, etc.)
    • Have these ideas and/or technologies reached commercialization (or commercial application)?  If so, how long have they been commercialized?
    • Would you be able and willing to provide estimates of the economic impact (e.g., sales, personnel growth) for your firm/organization?  We realize that precise information is likely to be proprietary, too difficult to develop, or both, so rough estimates are perfectly adequate for our purposes.

   Research note:

  • IF THE ANSWERS PROVIDED BY THE RESPONDENT APPEAR TO CONFIRM THAT THEY HAVE NOT BENEFITTED IN SOME SIGNIFICANT WAY AT LEAST CURRENTLY, THANK THEM FOR THEIR TIME AND ASK IF YOU COULD FOLLOW UP WITH THEM IN THE FUTURE TO ASSESS FUTURE IMPACTS.
  • IF THE ANSWERS PROVIDED BY THE RESPONDENT APPEAR TO CONFIRM THAT THEY HAVE BENEFITTED IN SOME SIGNIFICANT WAY, THEN ASK:
    • Thanks, that is very interesting. I would be interested in including information about this development in our assessment, and I would need some more detailed information. Would you like to go into our more detailed questions about this now, or would you like to set up a separate time to talk?

Research Note: IF THEY WANT TO KEEP GOING, BEGIN WITH PART II OF THIS INTERVIEW

    • Begin full interview

Research Note: IF THEY WANT TO SCHEDULE SEPARATE INTERVIEW ASK THEM TO CONFIRM A DATE AND TIME.


4.7 EconImpact 7: Guide for interview with beneficiary

The National Science Foundation (NSF) is interested in doing a better job of estimating the economic impact of Industry/University Cooperative Research Centers (IUCRCs) on organizations and their customers. NSF has asked the Center evaluators to conduct follow-up interviews with organizations that may have received significant benefits.

The NSF is especially interested in the measurable economic impacts of IUCRCs, but we realize that many, perhaps most, of these impacts are very long-term and difficult or impossible to measure in strictly economic terms. So, we're asking companies that may have benefited from working with IUCRCs to estimate (roughly) for us the impact that Center "outputs" (ideas, technology, and student hires) have had on (a) the company and (b) the industry. We realize that precise information is likely to be proprietary, too difficult to develop, or both, so rough estimates are perfectly adequate for our purposes.

Unless you explicitly tell us otherwise, your name and that of your organization will remain confidential. The NSF may be interested in publicizing case summaries that highlight organizations that have received exceptional benefits from an IUCRC. If we think your organization would make a good case study, we will ask for your explicit permission and share a draft of our written summary for your review and approval before it is disseminated by NSF.

Let’s talk about the [insert idea/technology from director or screening interview].
Research Note: Depending on firm/organization type, go to outline for:

I. Established Firms: Product or product-related technology
II. Established Firms: Process or process-related technology
III. Start-ups: (with no realized sales or revenues to-date)

I. Established Firms: Product or product-related technology

  1. Confirm understanding of the product/technology.
    • Is this a product or a component of a larger product?
    • What are the benefits over existing, similar products on the market?
  2. How does this technology relate to the Center?
    • Triggered by the Center and developed into technology/IP outside the Center
    • IP/technology developed by the Center  
    • Would the technology have been developed without the Center?             
      • If yes, what time lag or delay would there have been without the Center?
    • How long did it take to reach commercialization, from when your organization first became involved with the technology? (in years / months)
  3. Since first commercialization what have been the impacts on your organization? 
    • Sales and net profits (either total, annually, or per unit)
    • Percent of sales / profits attributable to the Center
    • Personnel associated with the product
  4. Can you provide future or prospective estimates on market growth (percent or dollars)
    • Conservative estimate
    • Best case scenario estimate
  5. What are the impacts on downstream customers of the technology?
    • Cost savings over prior technologies
    • Qualitative benefits over prior technologies
    • What percent of savings/benefits are attributable to the Center?
  6. Let me review some of the specifics of what you have told me. I would like to clarify what if anything I can attribute to your firm or if you would like all information to remain confidential.

II. Established Firms: Process or process-related technology

  1. Confirm understanding of the process/technology.
    • Is this a process or a component of a larger process?
    • What are the benefits over existing, similar technologies on the market?
  2. How does this technology relate to the Center?
    • Triggered by the Center and developed into technology/IP outside the Center
    • IP/technology developed by the Center  
    • Would the technology have been developed without the Center?             
      • If yes, what time lag or delay would there have been without the Center?
    • How long did it take to reach commercialization, from when your organization first became involved with the technology? (in years / months)
  3. Since first commercialization what have been the impacts on your organization? 
    • Cost savings (either total, annually, or per unit)
    • Percent of savings attributable to the Center
    • Growth or improvement in sales, net profits
    • Growth in personnel associated with market growth
  4. What are the impacts on downstream customers of the technology?
    • Cost savings over prior technologies
    • Qualitative benefits over prior technologies
    • What percent of savings/benefits are attributable to the Center?
  5. Let me review some of the specifics of what you have told me. I would like to clarify what if anything I can attribute to your firm or if you would like all information to remain confidential.

III. Start-ups (with no realized sales or revenues to-date)

  1. Confirm understanding of the technology and firm.
    • What are the benefits over existing similar technologies on the market?
    • How long since the start-up was established?
  2. How does this technology relate to the Center?
    • Triggered by the Center and developed into technology/IP outside the Center
    • IP/technology developed by the Center  
    • Would the technology have been developed without the Center?             
      • If yes, what time lag or delay would there have been without the Center?
  3. How much did your organization invest to get this technology ready to commercialize?
    • Total spending to date (wages, supplies, facilities, equipment, etc.)
    • Total funding received from all sources: VC, private, government grants
    • Jobs created (person-years) through the life of the firm
    • [If licensed technology] what would alternative technology have cost the organization?
  4. Can you provide future or prospective estimates on market size and growth
    • Conservative estimate (total market and firm-specific)
    • Best case scenario estimate (total market and firm-specific)
  5. What are the impacts on downstream customers of the technology?
    • Cost savings over prior technologies
    • Qualitative benefits over prior technologies
    • What percent of savings/benefits are attributable to the Center?
  6. Let me review some of the specifics of what you have told me. I would like to clarify what if anything I can attribute to your firm or if you would like all information to remain confidential.


4.8 EconImpact 8: Sample summary report of impact

Company: Confidential
Years as member: 5 (2004 – 2008)
Interviewee: Confidential
Interviewer: Drew Rivers
Interview Date: December 2010

Orientation to Center
The Company does not currently hold a membership in the Center; however, our informant indicated a renewed membership could be forthcoming. During the company's five consecutive years as a member, we identified no quantifiable benefits aside from the hiring of a [Center] student near or just after the final year of membership. Otherwise, all realized economic impacts came after the termination of membership.

Table 1: Technology Transfer Events

Event 1: Air compressors
  Description: The company's air compressor technology was operating below desired efficiencies due to unexpected surges and other factors; this technology was most efficient at near-surge levels. The Center's technology helped the company predict surges before they occurred.
  Stage: Testing of a new algorithm has been completed, and the new technology is currently being rolled out across the company.
  Impact: Testing of the new technology indicated savings of $50,000/year per air compressor. The company maintains dozens of air compressors at each of 8 manufacturing plants.

Event 2: Robot joints
  Description: The company was searching for a method to predict impending failure of robot joints in a production process. After testing various factors, a correlation was found between torque and joint health, resulting in the development of a predictive algorithm.
  Stage: The new predictive algorithm is in the testing phase.
  Impact: Based on initial testing results, the technology could save each manufacturing plant 400-500 hours of downtime each year, at $7,500/hour in cost savings. Implementation would take 3 years for the first plant and 5 years for all 8 plants.
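
As an illustrative aside (not a figure from the report itself), the rough arithmetic implied by the robot-joints numbers above can be sketched as follows, using the midpoint of the quoted 400-500 hour range:

```python
# Back-of-the-envelope arithmetic for the robot-joints figures above
# (assumes the midpoint of the 400-500 hour range; illustrative only).
hours_avoided_per_year = 450     # midpoint of 400-500 hours of downtime per plant per year
cost_per_hour = 7_500            # dollars of cost savings per hour of downtime
plants = 8

per_plant_annual = hours_avoided_per_year * cost_per_hour   # 3,375,000 dollars per plant per year
all_plants_annual = per_plant_annual * plants               # 27,000,000 dollars per year across all 8 plants
print(per_plant_annual, all_plants_annual)
```

At full deployment across all 8 plants this works out to roughly $27 million per year, which lines up with the prospective robot-joints estimates shown in Table 3 below.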

Table 2: Other Impact Events

Event 1: Student hire
  Description: BA-level student hired by the company.
  Stage: Hired
  Impact: Estimated at $50k savings in mentoring costs.

Notable Qualitative Impacts
The company sees additional value from involvement in the Center through collaborative project work and general knowledge sharing among members, as well as an increase in the Company's capacity to search for and absorb new technologies.

Quantified Impacts
For prospective benefits, the informant estimated initial deployment of the technologies at a single plant in years 1 through 3, then accelerating across all 8 plants in years 4 and 5. Applying a best-case scenario, we assume the Company will realize full benefits across its operations within the next five years, with greater benefits realized in years 4 and 5. Table 3 below shows these prospective benefits.

Table 3: Retrospective and Prospective Economic Impacts (estimates in actual dollars)

  2004-2008: (none)
  2009: $50,000
  2010: $50,000
  2011 (est.): Air compressors $594,000; Robot joints $1,113,750; Total $1,707,750
  2012 (est.): Air compressors $1,188,000; Robot joints $2,227,500; Total $3,415,500
  2013 (est.): Air compressors $1,800,000; Robot joints $3,375,000; Total $5,175,000
  2014 (est.): Air compressors $7,200,000; Robot joints $13,500,000; Total $20,700,000
  2015 (est.): Air compressors $14,400,000; Robot joints $27,000,000; Total $41,400,000

Total retrospective impacts: $100,000
Total estimated impacts through 2015: $72,498,250
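
For readers who want to trace the arithmetic, the sketch below shows one set of assumptions under which the Table 3 figures and the totals above can be reproduced. The per-plant savings and deployment fractions are inferred for illustration, not quoted from the interview: roughly $1.8 million per plant per year for air compressors and about $3.4 million per plant per year for robot joints, rolled out over about a third, two-thirds, and all of the first plant in 2011-2013, then 4 plants and all 8 plants in 2014 and 2015.

```python
# Illustrative reconstruction of the Table 3 projections (the per-plant savings
# and deployment schedule are inferred assumptions, not figures from the report).
per_plant = {"air compressors": 1_800_000, "robot joints": 3_375_000}
deployment = {2011: 0.33, 2012: 0.66, 2013: 1.0, 2014: 4.0, 2015: 8.0}

prospective = {
    year: {tech: round(saving * share) for tech, saving in per_plant.items()}
    for year, share in deployment.items()
}
yearly_totals = {year: sum(values.values()) for year, values in prospective.items()}

retrospective_total = 2 * 50_000          # realized impacts reported for 2009 and 2010
grand_total = retrospective_total + sum(yearly_totals.values())

print(yearly_totals[2011])    # 1707750  (2011 total in Table 3)
print(retrospective_total)    # 100000   (total retrospective impacts)
print(grand_total)            # 72498250 (total estimated impacts through 2015)
```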

Attribution to the Center
The company views the impact as a collaborative effort, with the Center bringing advanced skills and the company bringing knowledge of its manufacturing system. As such, both parties were necessary to achieving the outcomes. Specifically for the robot joints project, the company initially attempted to perform the work internally but later opted to involve its prognostics suppliers and the Center. The Center proved a key player, contributing advanced skills in collecting and analyzing data, but it was one of several actors in the development of the technology.

Summary
Despite clear linkages between the technology and the Center, the interviewee expressed uncertainty regarding attribution; a host of actors and resources outside the Center, including the Company itself, contributed to getting the technology from the early stages of development into commercial application. Further, we assumed a best-case scenario for the roll-out of these technologies across the Company's operations. A fully implemented evaluation might also include a conservative estimate of the likelihood of complete deployment in the next five years, along with periodic follow-up by the evaluator with the Company representative to gauge progress on implementation and realized benefits from the technology.

Back to Top


Surveys & Instruments

Below you will find survey instruments and interview guides for the Industry and Faculty Process/Outcome assessments, as well as older and optional survey instruments.

Back to Top


Process/Outcome Surveys

Below you will find Word, PDF, and Web versions of the Industry Process/Outcome Survey, the Standard Faculty Process/Outcome Survey for use in Centers in their first five years, and the Short Faculty Process/Outcome Survey for use in Centers six years and older. Please see the Process/Outcome Workbook for detailed codebooks, data entry shells, and a research cost avoidance (RCA) calculator. Please return survey data by entering it into the Process/Outcome Workbook and emailing it to the NCSU Evaluation Team. If you plan to use the web surveys, please send an email to ncsuiucrc@ncsu.edu notifying the NCSU Evaluation Team of your plans and requesting a copy of the data for your Center(s).

Back to Top


  • Industry P/O Questionnaire (Word | PDF | Web)
  • Year 1-5 Faculty P/O Questionnaire (Word | PDF | Web)
  • Year 6-10 Faculty P/O Questionnaire (Word | PDF | Web)

Archived and Optional Instruments

Over the years, the I/UCRC Evaluation Project and individual evaluators have developed a number of assessment devices. Some were used as part of the I/UCRC protocol but were then discontinued; others were developed by individual evaluators for specific supplemental evaluation projects. Below is a list of these instruments. Evaluators may wish to use them to supplement their required assessments or as the basis for developing a new device.

All of these instruments are available for downloading and printing using Adobe Acrobat Reader.

Old Instruments

Standard Industry P/O Questionnaire 2012 (PDF)
Standard Industry P/O Questionnaire 2012 (Word)

Standard Industry P/O Questionnaire 2007-2011 (PDF)
Standard Industry P/O Questionnaire 2007-2011 (Word)

Standard Industry P/O Questionnaire 1999-2006 (PDF)
Standard Industry P/O Questionnaire 1999-2006 (Word)

Short Industry P/O Questionnaire 1999-2006 (PDF)
Short Industry P/O Questionnaire 1999-2006 (Word)

Faculty P/O Questionnaire (PDF)
Faculty P/O Questionnaire (Word)

Optional Instruments (PDF)

Tech Transfer Questionnaire: A brief questionnaire to capture tech transfers to member firms.
Source: Craig Scott

Student Questionnaire: A brief questionnaire for I/UCRC students that focuses primarily on satisfaction items.
Source: NCSU Evaluation Project

Additional Questions: A collection of miscellaneous items collected by Virginia-Shaw Taylor that covers industry expectations, student perceptions, and benefits.
Source: NCSU Evaluation Project

Administrative Staff Questionnaire: This overlaps with the original (and discontinued) P/O instrument; it contains additional items on the source and frequency of information requests.
Source: NCSU Evaluation Project

Critical Events List: The instructions ask for an inventory of critical events.  Events are coded in terms of influence on the Center's development.
Source: NCSU Evaluation Project

Barriers and Facilitators: The instructions ask for a listing of perceived barriers or facilitators to Center development.  These are coded in terms of past, present, or future impact.
Source: NCSU Evaluation Project

Entrance Interview: A brief interview guide designed to assess the kind of benefits a new member expects to obtain.
Source: Howard Levine

Inactive Sponsor Interview (Exit Interview): This survey is an assessment of firms that decide to terminate their Center membership.
Source: NCSU Evaluation Project

Director Proxy Inactive Sponsor Interview: A study by David Meyer & Craig Scott made clear that getting access to the individuals who represented these firms after they leave is very difficult. In most cases the director is well informed about why a firm leaves, so we have developed a survey you can use with the Center Director.
Source: NCSU Evaluation Project

Abbreviated IAB Satisfaction Survey: An abbreviated process outcome questionnaire that was implemented to minimize the burden on respondents in the last year of a Center.
Source: B.J. Meadows


Back to Top


Webpage last updated on: October 31, 2013