
Addressing Cross-Cutting Challenges


Efforts to assess the quality of undergraduate teaching and learning face several types of challenges, including the collection of data, variations in how universities organize student data, and the rules and regulations on data governance, stewardship, sharing, and use. This section highlights several of the identified challenges and provides guidance for and examples of moving conversations about measuring teaching and learning forward. Ultimately, documenting institution-level effects of STEM education reforms requires finding “ways to defuse the potential conflict between locally useful classroom-level information and broader measures of program effects.”


Each campus organizes data collection differently, including where data reside. Aggregating and joining data sets managed by different units within an institution can be challenging for technical, political, and institutional reasons. Particular types of information are also subject to different levels of restriction on sharing and use (e.g., financial aid data versus academic performance data). Although the development of a single model for organizing data is unlikely, partitioning data sets in idiosyncratic ways is counterproductive to effective institutional decision-making and makes cross-institutional comparisons much more difficult. Campuses have recognized this difficulty and would benefit from exploring new ways to link data sets in support of timely decision-making that benefits the institution while still protecting privacy.

  • For example, the Student Data Matching Tool under development by the CREATE for STEM Institute at Michigan State University aggregates data and provides an interface for asking how a “treatment” (such as an undergraduate research experience) affects an outcome variable (such as graduating GPA) when students are matched on certain factors (e.g., Pell eligibility, race/ethnicity, gender); a sketch of this kind of matched comparison appears below.
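
The internals of the Michigan State tool are not public, so the following is only a minimal sketch of the kind of matched comparison such a tool might support. All column names (research_experience, grad_gpa, pell_eligible, race_ethnicity, gender) are hypothetical: students are grouped into cells that agree on the matching factors, and the treated-versus-untreated difference in the outcome is averaged across those cells.

```python
# Minimal sketch of an exact matched comparison; not the MSU tool's actual
# implementation. All column names used here are hypothetical.
import pandas as pd

def matched_effect(df: pd.DataFrame, treatment: str, outcome: str,
                   match_on: list[str]) -> float:
    """Mean difference in `outcome` between treated (treatment == 1) and
    untreated students, within cells that agree on the matching factors."""
    diffs, weights = [], []
    for _, cell in df.groupby(match_on):
        treated = cell.loc[cell[treatment] == 1, outcome]
        control = cell.loc[cell[treatment] == 0, outcome]
        if len(treated) and len(control):  # keep only comparable cells
            diffs.append(treated.mean() - control.mean())
            weights.append(len(cell))
    if not weights:
        raise ValueError("no cells contain both treated and untreated students")
    # Weight each cell's difference by the number of students it contains.
    return sum(d * w for d, w in zip(diffs, weights)) / sum(weights)

# Hypothetical usage:
# effect = matched_effect(students, "research_experience", "grad_gpa",
#                         ["pell_eligible", "race_ethnicity", "gender"])
```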


The development of common data definitions, standards, formats, and methodologies, to the extent possible by the community, would greatly facilitate the sharing, aggregation, and comparison of data. On many campuses there is a reluctance to adopt wholesale the tools and techniques developed elsewhere. Custom-designed assessment tools can generate local buy-in for educational reforms but can make even cross-college or cross-department comparisons within a campus difficult. One goal of Essential Questions & Data Sources is to develop guiding principles that allow for meaningful sharing and comparisons within and across universities. Even agreeing on common file formats for data collected with similar tools could be an important step forward.
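
As a purely illustrative sketch of what such an agreement might look like in practice, suppose several units export course-level records as CSV files: even a small shared schema, checked automatically, makes aggregation possible. The field names and types below are hypothetical, not an AAU or campus standard.

```python
# Hypothetical shared schema for course-level CSV exports; the field names
# and types are illustrative, not an established standard.
import csv

COURSE_RECORD_SCHEMA = {
    "term": str,          # agreed term code, e.g. "2016-FA"
    "course_id": str,     # institution-assigned course identifier
    "section_size": int,  # enrolled students at the census date
    "dfw_rate": float,    # fraction earning a D, F, or W (0.0-1.0)
}

def validate_export(path: str) -> list[str]:
    """Return a list of problems found in a CSV export; empty means it conforms."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = set(COURSE_RECORD_SCHEMA) - set(reader.fieldnames or [])
        if missing:
            return [f"missing columns: {sorted(missing)}"]
        for lineno, row in enumerate(reader, start=2):  # line 1 is the header
            for field, typ in COURSE_RECORD_SCHEMA.items():
                try:
                    typ(row[field])
                except (TypeError, ValueError):
                    problems.append(
                        f"line {lineno}: {field!r} is not a valid {typ.__name__}")
    return problems
```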



Researchers and academic administrators should be clear about the kinds of analysis they seek to perform with institutional and student data, and should distinguish among research, evaluation, and assessment. Although these distinctions are sometimes nuanced, they are often straightforward. Adopting common definitions and formats can help institutions develop consistent guidelines for responding to different types of data requests.

  • One useful framework has been developed by the University of Wisconsin. It distinguishes academic research, institutional research, program evaluation, and student learning assessment based on criteria such as intent, funding source, who performs the work, the type of data used, and how results are published and disseminated. Such a framework can intersect with campus data governance models to help institutions become more systematic in how data are shared with researchers and academic administrators.


Institutional Review Board (IRB) roles and oversight practices differ among campuses. To the degree possible, campuses should find ways to allow IRBs to expedite review of studies that seek to improve educational performance using de-identified, campus-based student data for research, evaluation, and assessment purposes.



Much mythology has sprung up around FERPA, the Family Educational Rights and Privacy Act, which sets guidelines for protecting the privacy of student education records. FERPA is often invoked as a reason to prohibit sharing of information. Limitations on data sharing that result from FERPA should be clarified and made consistent within, and to the extent possible across, institutions. Adhering to FERPA guidelines need not mean over-compliance.

  • In collaboration with the U.S. Census Bureau and the Universities of California, Michigan, and Texas, the Institute for Research on Innovation & Science (IRIS) is conducting a pilot project to effectively link, rigorously analyze, and responsibly share data on student career outcomes and instruction derived from a variety of restricted administrative records. The membership FAQ and MOU address many questions related to data protection and sharing.


Researchers and academic administrators attempting to provide key information to institutional decision-makers are often unable to do so because individual-level data are housed separately across campus units. De-identifying partitioned data sets would make within-institution (including cross-college and cross-department) analyses possible. Campuses might experiment with de-identification tactics that preserve privacy while minimizing jurisdictional limits on the questions that can be asked and answered with campus data.
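
One common de-identification tactic, sketched below under the assumption that a single trusted steward holds a secret key, is to replace student IDs with keyed one-way pseudonyms before data sets leave their home units. Because every unit applies the same key, the de-identified sets can still be joined, but raw IDs cannot be recovered without the key. The names used (STEWARD_KEY, sid) are hypothetical.

```python
# Keyed pseudonymization sketch: each unit replaces the student ID with an
# HMAC-SHA256 digest before sharing. Pseudonyms are consistent across units
# and one-way without the key. Key and field names here are hypothetical.
import hmac
import hashlib

def pseudonymize(student_id: str, key: bytes) -> str:
    """Consistent, one-way pseudonym for a student ID."""
    return hmac.new(key, student_id.encode(), hashlib.sha256).hexdigest()

# Hypothetical usage by each data-holding unit before release:
# registrar_row["sid"] = pseudonymize(raw_id, STEWARD_KEY)
# financial_aid_row["sid"] = pseudonymize(raw_id, STEWARD_KEY)
# Analysts can then join registrar and financial-aid records on "sid"
# without seeing raw IDs; only the key holder could re-identify students.
```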



The measurement of student learning and related outcomes to assess institution-level performance is rapidly evolving, and many metrics and measures are still nascent. Institutions can help lead the charge in developing and sharing information, and sharing across institutions for internal benchmarking purposes helps foster change. From the perspective of AAU, a higher education association that works with federal policymakers, aggregations of data across institutions can be useful for documenting impact, but institutional and personal anonymity must be maintained.
