Authentic Science Experiences With STEM Datasets: Post-Secondary Results and Potential Gender Influences
To cite this article: Andria C. Schwortz & Andrea C. Burrows (2020): Authentic science
experiences with STEM datasets: post-secondary results and potential gender influences,
Research in Science & Technological Education, DOI: 10.1080/02635143.2020.1761783
ABSTRACT
Background: Dataset skills are used in STEM fields from astronomy to zoology. Few fields explicitly teach students the skills to analyze datasets, and yet the increasing push for authentic science implies these skills should be taught.
Purpose: The overarching motivation of this work is to understand authentic science learning of STEM dataset skills within an astronomy context. Specifically, when participants work with a 200-entry Google Sheets dataset of astronomical data, what are they learning, how are they learning it, and who is doing the learning?
Sample: The authors studied a total of 82 post-secondary participants, including a matched set of 54 pre/post-test (34 males, 18 females), 26 video-recorded (22 males, 2 females), and 3 interviewed (2 males, 1 female) participants.
Design and methods: In this mixed-methods study, participants explored a three-phase dataset activity and were given an eight-question multiple-choice pre/post-test covering skills of analyzing datasets and astronomy content, with the cognitive load of questions spanning from recognition of terms through synthesizing multiple ideas. Pre/post-test scores were compared and ANOVA performed for subsamples by gender. Select examples of qualitative data are shown, including written answers to questions, video recordings, and interviews.
Results: This project expands existing literature on authentic science experiences into the domain of dataset education in astronomy. Participants exhibited learning in both recall and synthesis questions. Females exhibited lower levels of learning than males, which could be connected to gender influences. Conversations of both males and females included gendered topics.
Conclusions: Implications of the study include a stronger dataset focus in post-secondary STEM education, and the need for further investigation into how instructors can ameliorate the challenges faced by female post-secondary students.
KEYWORDS: Dataset; post-secondary student learning; STEM education; authentic science experiences; gender
Authentic science
Students need to experience realistic versions of science so they can make informed
decisions about areas of study and careers (Spuck 2014). These experiences not only
introduce students to the tools used by scientists but also can increase motivation in the
sciences (Hellgren and Lindberg 2017). Authentic science is summarized by Burrows et al. (2016) as real-world science experiences in which participants work to/towards a solution; summarize information; use technology; analyze data; use findings for conclusions; develop questions, procedures, and methods; communicate the work; collaborate with others; and make results accessible to others.
In the United States (USA), where this study takes place, teachers of primary (ages
6–12 years) and secondary (ages 13–18 years) students are explicitly expected to utilize
authentic experiences in both math classes (e.g. National Governors Association Center
for Best Practices – Common Core State Standards, CCSS 2010) and science classes (e.g.
Next Generation Science Standards – NGSS, NGSS Lead States 2013). The USA’s NGSS were
based upon other countries’ successes (Achieve 2010a, 2010b). Additionally, much effort has been devoted to examining what other countries are implementing well in computer science, since today’s science requires processing data using computational skills (Hubwieser et al. 2015). The NGSS highlights authentic science experiences, and this same trend is
seen in informal science education as well, through summer camps or nature-based
programs for children (e.g. Burrows et al. 2018). However, studies of authentic science
experiences in primary and secondary school are limited in number, and none were found at the post-secondary level. For example, Slavin et al. (2014) identified no papers published
from 1990–2012 on authentic science experiences in primary or secondary education, and
Cheung et al. (2017) identified only a single such paper. That paper, Harris et al. (2015),
focuses on middle school rather than post-secondary, physical science and Earth science
content rather than astronomy, and does not appear to address dataset learning.
The need for authentic science activities does not stop at the secondary level but
continues through the post-secondary level (age 18+, also called collegiate or university).
Research shows that post-secondary students benefit from domain-specific authentic
science activities in many STEM and social science fields (e.g. environmental science:
Carey and Gougis 2017; physics: Wilcox and Lewandowski 2016). Several papers describe
ways of putting domain-specific authentic science into practice (e.g. astronomy:
Kobulnicky and Dale 2016; psychology: Brothen 1984).
Astronomy education
The authors of this study utilized astronomy as the content area to study the acquisition of
dataset skills for a number of reasons, but mainly because astronomical data are often made public after a proprietary period. This practice gives anyone with an internet connection access to large datasets. Astronomy is a field that serves to capture the interest of the public, with even
non-scientists being able to participate in research through amateur groups and citizen
science projects (e.g. Bailey 2011; Hecker et al. 2018; Raddick et al. 2010; Schwamb et al.
2013). In the USA, post-secondary students are often required to complete courses outside
their major or program of study. The basic math level required for introductory astronomy
makes the course more accessible than physics for these non-majors (Bailey 2011).
Research questions
The purpose of this study was to investigate post-secondary students’ learning to manipulate datasets within astronomy, expanding the existing literature on authentic STEM learning. The following three-part question was developed to investigate their learning.
How does a three-phase, 1.5-hour, dataset-intensive astronomy activity impact participants’ short-term learning scores by:
(1) Skills- and content-focused questions? (‘What are they learning?’)
(2) Recall- and synthesis-level questions? (‘How are they learning?’)
(3) Gender? (‘Who is learning?’)
Theoretical framework
This article uses the framework that learning is social and constructivist in nature (Eymur
and Geban 2017; Kutnick et al. 2017; Vygotsky 1978), and that authentic STEM experiences
lead to improved interest and learning in STEM. These ideals informed the design of the
three-phase learning activity: participants in this study worked in groups, the activity used
actual data from the Sloan Digital Sky Survey (see Electronic Supplemental Materials, ESM,
and Table 1), the activity guided post-secondary students to create their own data
interpretation, and the students were supported through steps of data analysis similar
to those taken by astronomers.
For the quantitative analysis, the authors view the data through the numerical results
of the participants’ work, while the qualitative data provide preliminary clues to how the
participants constructed their knowledge.
Participant sample
A total of 82 individuals participated in this study from 2014 through 2015. Participants
were post-secondary students in introductory astronomy courses for non-majors at
a large research university and a junior college in separate states (USA). Fifty-four parti-
cipants have matched pre/post-tests, nine groups were audio/video recorded during the
activity, and three participants were interviewed 1 week to 3 months after the activity (see
Table 2).
Experimental procedure
The authors worked with subsets of the participants for 1.5 hours on two dates over the
course of 2 weeks, with data collected from different groups of participants taking place
from June 2014 through February 2015. On the first contact date, participants took the
pre-test (see ESM), which took approximately half an hour. On the second contact date,
participants completed a three-phase activity (see ESM) and the post-test (same as the
pre-test), which took an average of an hour.
Quantitative data reported in this study were obtained from eight multiple-choice
questions on both astronomy content and data skills used as a pre/post-test. This instru-
ment was reviewed by four individuals in astronomy and physics (a post-doctoral
researcher, a lab technician, a post-graduate student, and an advanced post-secondary
student), and the questions modified based on their feedback. As cognitive load is known
to be an important aspect of student learning (Tekkumru-Kisa et al. 2019), the pre/post-
test was organized by level of cognitive load to explore students’ levels of understanding
and provide scaffolding to their learning (e.g. Lutsky 1986). The term ‘recall’ is used
throughout this paper for low cognitive load questions involving the tasks of remember-
ing and understanding, and ‘synthesis’ for evaluating and creating (Krathwohl 2010). For
example, if a student showed evidence of having correctly memorized the distance to
a specific astronomical object, this would be considered a high level of recall. On the other
hand, if the student used logic to correctly reason about whether a certain type of
astronomical object was likely to be found near the Earth or farther away, this would be
a high level of synthesis.
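To make the recall/synthesis distinction concrete, the sketch below scores a hypothetical participant's answers against question-level tags and returns separate subscale percentages. The mapping of questions to levels and the graded answers are assumptions for illustration only; the study's actual instrument is described in the ESM.

import numpy as np

# Hypothetical mapping of the eight questions to cognitive levels (illustrative)
QUESTION_LEVELS = {1: "recall", 2: "recall", 3: "recall", 4: "recall",
                   5: "synthesis", 6: "synthesis", 7: "synthesis", 8: "synthesis"}

def subscale_scores(correct_by_question):
    """Return percent correct for recall and synthesis questions separately."""
    totals = {"recall": [0, 0], "synthesis": [0, 0]}  # [number correct, number asked]
    for q, is_correct in correct_by_question.items():
        level = QUESTION_LEVELS[q]
        totals[level][0] += int(is_correct)
        totals[level][1] += 1
    return {level: 100.0 * c / n for level, (c, n) in totals.items()}

# One hypothetical participant's graded answers (True = correct)
answers = {1: True, 2: True, 3: False, 4: True, 5: False, 6: True, 7: False, 8: False}
print(subscale_scores(answers))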
Qualitative data used in this study were obtained from audio/video recordings of
groups of participants, individual written responses on the activity handout, and one-on-
one interviews conducted within 3 months after the completion of the activities. See
Table 3 and the ESM for further details.
Pre/post-test scores were analyzed using matched normalized gains (Hake 1998) and Cohen’s d as a measure of effect size (e.g. Burrows et al. 2016; Cohen 1977; Sullivan and Feinn 2012). It is worth noting that while Cohen’s d was
originally intended to compare two independent groups, the same equation is accepted
to describe the change of one group between pre- and post-tests (e.g. Nissen et al. 2018;
Sullivan and Feinn 2012), so the authors chose to adopt the nomenclature of calling this
formula ‘Cohen’s d’ and ‘effect size’, despite the different context of its usage. Further
effect size explanation is provided by Lakens (2013). With these parameters in mind, the
participants’ performance on questions, by level (recall vs. synthesis), was compared to
their performance by skills or content. Results were compared for male and female
participants.
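As a rough illustration of these two measures, the sketch below computes a normalized gain from class means and a pre/post Cohen's d using a pooled standard deviation. The score arrays and the pooling convention are illustrative assumptions, not the study's data or necessarily its exact formula.

import numpy as np

def normalized_gain(pre, post, max_score=100.0):
    """Hake's average normalized gain <g> = (<post> - <pre>) / (max - <pre>),
    computed from the class means (Hake 1998)."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return (post.mean() - pre.mean()) / (max_score - pre.mean())

def cohens_d_prepost(pre, post):
    """Pre/post effect size: mean change divided by the pooled standard deviation
    of the two score distributions (one common convention; see Nissen et al. 2018
    for alternatives)."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2.0)
    return (post.mean() - pre.mean()) / pooled_sd

# Hypothetical matched scores (percent correct on an eight-question test)
pre_scores = np.array([50.0, 62.5, 37.5, 75.0, 62.5])
post_scores = np.array([75.0, 87.5, 62.5, 75.0, 87.5])
print(normalized_gain(pre_scores, post_scores))
print(cohens_d_prepost(pre_scores, post_scores))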
Results
Through this dataset activity, participants improved their scores on an eight-question multiple-choice test of astronomy content and dataset skills, as shown in Table 4. The mean score rose from 58% on the pre-test to 75% on the post-test (p < 0.01, N = 54), with a normalized gain of 0.285 and an effect size of 0.86, indicating that short-term learning did occur. The following information is organized by research question, with supporting qualitative data presented to triangulate the results.
Participants showed improvement in both content and skills questions, with content gains of 0.326 or an effect size of 0.44, and skills gains of 0.321 or an effect size of 0.99 (see Table 4).
Table 5 is a cross-tabulation, showing how many participants improved their score,
stayed the same, or did worse. Twenty-seven participants (50%) improved in both
categories of skills and content, while 13 participants (24%) did worse on one or both
of skills and content.
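A cross-tabulation of this kind can be produced directly from per-participant subscale scores. The following sketch is a minimal illustration; the column names and values are hypothetical placeholders rather than the study's variables.

import numpy as np
import pandas as pd

# Hypothetical per-participant subscale scores (percent correct); placeholders only
df = pd.DataFrame({
    "content_pre":  [50, 75, 25, 50],
    "content_post": [75, 75, 50, 25],
    "skills_pre":   [50, 50, 75, 50],
    "skills_post":  [100, 75, 75, 75],
})

def direction(pre, post):
    # Label each participant as improved, worse, or unchanged
    return np.select([post > pre, post < pre], ["improved", "worse"], default="same")

df["content_change"] = direction(df["content_pre"], df["content_post"])
df["skills_change"] = direction(df["skills_pre"], df["skills_post"])

# Counts of participants in each combination, analogous in spirit to Table 5
print(pd.crosstab(df["content_change"], df["skills_change"]))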
Participants likewise showed improvement on both recall and synthesis questions, although 15 individuals (28%) did worse on one or both of these categories.
This response shows a synthesis level of understanding, where Diego makes the connec-
tion that coordinates on the sky can be used to determine the distance between quasars,
a measurement relevant to determining whether quasars are caused by high-density
environments (Shen et al. 2011). Diego also understood that a histogram of redshift can
track not only the distance and speed of quasars but also their history due to lookback
time (see Table 1).
Steve showed evidence of beginning to develop a synthesis level of understanding. When
discussing the histogram of redshift, he wrote, ‘Maybe we don’t see so many in the higher
range is because is the redshift is weaker the further away a quasar gets and were just not
seeing quasars that may actually be there? I can’t fathom why there would be less quasars
closer to us.’ Although inexpertly worded, Steve’s response indicates that he was attempting
to use logic to reason through information about where astronomical objects are located,
rather than using rote memorization. Steve reasoned that because a larger value of redshift
corresponds to a farther distance, the radio light could be too faint to detect. However, Steve
did not understand how geometry affects the number of quasars observed at closer distances.
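For readers who want a concrete sense of the visualization the participants were reasoning about, the sketch below builds a redshift histogram from a small mock quasar catalog. The values are randomly generated stand-ins for the roughly 200-entry SDSS-based spreadsheet used in the activity, not the actual data.

import numpy as np
import matplotlib.pyplot as plt

# Mock redshifts standing in for a ~200-entry quasar catalog (illustrative only)
rng = np.random.default_rng(0)
redshift = rng.gamma(shape=4.0, scale=0.4, size=200)

plt.hist(redshift, bins=20)
plt.xlabel("Redshift (z)")
plt.ylabel("Number of quasars")
plt.title("Distribution of quasar redshifts (mock data)")
plt.show()

In a plot like this, the drop-off at high redshift reflects detection limits for faint, distant quasars, while the small counts at very low redshift reflect the smaller volume of nearby space, the geometric point Steve had not yet grasped.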
Learning by gender
ANOVA significance levels of gender on improvement from pre- to post-test are shown in
Table 6. Gender revealed a statistically significant effect on the improvement of overall
pre/post-test scores (p = 0.004), and of the dataset skills questions (p = 0.012).
Male participants had a large overall effect size of 1.12 and a gain of 0.467, while the corresponding values for female participants were 0.62 and 0.084 (see Table 4), indicating a higher level of short-term learning among male participants than among female participants. In fact, a t-test indicated that female participants’ overall scores did not improve to a statistically significant degree (p > 0.05).
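In outline, the gender comparison described here combines paired t-tests within each gender with a one-way ANOVA on pre-to-post improvement across genders. The sketch below shows one way such tests could be run; the score arrays are hypothetical placeholders, not the study's data.

import numpy as np
from scipy import stats

# Hypothetical matched pre/post percentages by self-reported gender (placeholders)
pre = {"male": np.array([50.0, 62.5, 37.5, 75.0]),
       "female": np.array([62.5, 50.0, 75.0])}
post = {"male": np.array([75.0, 87.5, 62.5, 87.5]),
        "female": np.array([62.5, 62.5, 75.0])}

# Paired t-test within each gender: did scores improve from pre- to post-test?
for gender in pre:
    t_stat, p_val = stats.ttest_rel(post[gender], pre[gender])
    print(gender, "t =", round(float(t_stat), 2), "p =", round(float(p_val), 3))

# One-way ANOVA on improvement (post minus pre) across genders
f_stat, p_val = stats.f_oneway(post["male"] - pre["male"],
                               post["female"] - pre["female"])
print("ANOVA on gains: F =", round(float(f_stat), 2), "p =", round(float(p_val), 3))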
Discussion
Researchers realize that authentic STEM experiences are necessary for students to learn
the skills of science, to develop a realistic picture of science as a field they may or may not
wish to enter, and to motivate students in their STEM courses (Hellgren and Lindberg
2017; Spuck 2014). As research science increasingly moves towards the use of large
datasets, educators must move towards teaching dataset skills during classes. Since
astronomy often serves as a gateway to other sciences both for students and for the
public (Bailey 2011), astronomy datasets could be vital for classroom and general use.
This section first discusses the results about the overarching theme of authentic
science experiences, the research questions, and astronomy content vs. dataset skills.
Additional discussion follows regarding the revealed gender issues, the limitations, and
the greater implications of this study.
Research questions
The astronomy dataset activity demonstrates the feasibility of using such an activity as a form of authentic STEM experience. As stated in the literature review, astronomy content can be used as a tool to teach dataset skills to students in an engaging and interesting manner (Bailey 2011). The post-secondary students utilized technology with a real dataset to create questions, procedures, and methods; analyze patterns; and potentially propose solutions. They discussed their findings with their group and explained their reasoning and potential models. Thus, the dataset astronomy activity can be classified as an authentic science experience, albeit a short one.
Participants self-identified as having little prior knowledge in STEM datasets, and less
experience with astronomy content. The data (such as pre-test scores) reflect the lack of
experience, and post-secondary students most certainly enter the classroom with varying
levels of prior knowledge. Both content and skills were gained by the participants, with
many learning both simultaneously, providing a proof of concept that a single authentic
STEM activity can address both aspects of STEM learning at least in the short term. The
qualitative data confirmed that the level of knowledge about dataset skills and astronomy
content was not constant among participants, but showed improvement in understand-
ing of both content and dataset skills.
Participants exhibited learning at both the recall and synthesis levels on their pre/post-
tests, with some individuals learning both simultaneously. Qualitative data showed that
some participants exhibited traits of recall-level understanding, while others demonstrated
a synthesis level. Importantly, and linking back to previously mentioned work, materials
that contain multiple levels of content can provide differentiated opportunities and
scaffolding that benefit students at all levels and abilities (Lutsky 1986; Westwood 2001).
But it is important to note that male students can be negatively impacted as well, as
Luke’s learning was interrupted by continually having to steer his group away from
gendered discussions and back to work.
The gendered self-deprecation of Felicity fits the known pattern of females in STEM
possessing a lower self-assessment of their abilities than do males (Hill, Corbett, and
St. Rose 2010; Microsoft 2017; Nissen and Shemwell 2016). Felicity’s self-assessment of her
abilities as being merely secretarial ‘convey[ed] . . . second-class status about members of
one gender’ (NAS, 2018b), which is another example of gendered harassment. Not only
did Felicity’s self-assessment reflect inner challenges to her learning, but expressing this aloud could have had negative influences on other female students and reinforced gender stereotypes for male students in her group.
Only 17% of the female post-secondary students in this study were exposed to women
role models in the highest position of authority in their astronomy courses. The presence
of visible female mentors in STEM is another factor known to improve female students’
outlook of STEM (Microsoft 2017), and the lack of such role models may have been a factor
in these female participants’ struggles in this study. Investigations into female dataset use
within authentic science are worthwhile future endeavors for researchers.
Limitations
This study highlighted only short-term participant learning and retention in content and skills
acquisition. Due to concerns about test fatigue, the multiple-choice pre/post-test questions could not be further validated with free-response questions. Participants were self-selected from among
post-secondary students taking introductory astronomy at the university and college level.
Although the total number of participants in this study was 82, a reasonably large
sample size for a mixed-methods study, it is small for a dataset study. It is unlikely that
disparity by gender is due exclusively to small-number statistics (N = 13 for females with matched pre/post-tests), as ANOVA confirmed these results to be statistically significant. In addition, the qualitative data do confirm the quantitative results, and both agree with previous findings on gender in STEM learning.
While the qualitative data are rich, the authors cannot know the purpose behind participants’ language choices. The use of such language, whether profanity, innuendo, or gender-based discussion, or the lack of such discussion, may have reflected natural language choices, or may have been influenced by participants knowing that they were being recorded.
Implications
The findings of this study imply the need for further dataset education as authentic STEM
experiences at the post-secondary level as well as further research into this skill set. If STEM
fields require the use and analysis of datasets, then it is essential to educate various primary,
secondary, and post-secondary populations in these skills along with the content (CCSS, 2010;
Microsoft 2017; NGSS Lead States 2013). The need for handling big STEM datasets is ubiqui-
tous, and thus it is important to investigate the process by which individuals transition from
novices to experts in this domain (Bransford, Brown, and Cocking 2000).
Attention to gender issues must be prevalent during the dataset teaching. Females are
disproportionately leaving STEM fields, including astronomy, and they are not being well
served in the classes they take (Hill, Corbett, and St. Rose 2010; Microsoft 2017; Nissen and
Shemwell 2016). To better serve female STEM students of all levels, researchers need to
find effective pedagogies and approaches to STEM education that do not disadvantage
students based upon their gender. Many steps that educators can take to help female
students will also help male students, such as focusing students away from gendered
behaviors and towards the assignment at hand.
Because the gender-based behavior observed among these post-secondary students
was so drastic at times, a deeper analysis of the rich qualitative data from these partici-
pants and additional ones may provide more insight into these interactions. It would also
be intriguing to see if the gender disparity (in both pre/post-test scores, and discussion
topics) were replicated among the educators who serve as role models and examples to
students (Scantlebury and Baker 2013). Regardless of the cause and age level of students,
it is incumbent on instructors to address gender-based behavior as part of classroom
management as one step to level the playing field for female students (NAS, 2018b).
Instructors of any gender can take many approaches to help female students overcome the
barriers they face both inside and outside the classroom. Assigning single-gender groups has long been known to support the learning of female students using computers (e.g. Busch 1996; Lee 1993), but this practice has yet to be widely implemented. When learners work in mixed-gender
groups, instructors can mandate rotation of computer control so all individuals can learn
computer skills. Instructors should encourage professional language in the classroom, taking
special care to stop jokes and insults (Hill, Corbett, and St. Rose 2010). Self-derogatory talk by
females should be discouraged as well, as this too contributes to a hostile environment (NAS,
2018b; Nissen and Shemwell 2016). Role models and mentors should be provided for female
students and students from other underrepresented groups to help these individuals see that
they can overcome the stereotypes against them (Microsoft 2017; NAS, 2018b). Finally, using authentic STEM experiences has been shown to increase the interest and improve the retention of female students (Hill, Corbett, and St. Rose 2010).
For the public to learn either science content or dataset skills, they need exposure to
STEM datasets and training materials. Astronomy dataset education is important for STEM
educators, including pre-service and in-service teachers, and there are available, accurate
datasets ready to use with these populations. It is reasonable to suggest that if primary or
secondary teachers are to adequately teach astronomy or related content, they need to
understand it at a level better than their students. Thus, dataset professional development
for teachers is desirable and could be integrated with other professional developments, or
taught separately to facilitate dataset skill use. This study reinforces the finding in the literature that not only students but also teachers of all levels need access to, and instruction about, classroom activities (e.g. Hampton et al. 2017). Likewise, professional development
opportunities are necessary for pre-service teachers, in-service teachers, and other science
educators (Burrows 2015; Burrows et al. 2016).
As stated earlier, the authors argue that authentic science, which is shown to work with
students, can utilize computer science and datasets to open doors of STEM opportunities,
but these opportunities could be limited if gender issues affect learning or access to those
datasets. Students in many disciplines – especially in STEM – could be exposed to large
datasets, and their content knowledge and skills would improve, as evidenced in the findings of this study. Dataset instruction should be included for students at all levels for
both exposure and future success in STEM field careers. Public scholarship (at any level) to
educate diverse democracies includes access to all STEM field components. To be com-
petitive in STEM and open options to all students, education on dataset analysis must
become a standard part of any STEM curriculum, most likely as an integrated piece of
authentic STEM learning. Lastly, researchers must focus on creating a ‘safe space’ for
everyone to learn about STEM datasets and their uses.
Acknowledgments
The authors would like to thank the participants of the study. Author 1 would like to thank the
members of the University of Wyoming Department of Physics and Astronomy for their support
during this study.
Funding
This work was partially supported by the Wyoming Department of Education under Grant
#WY140202; by the US National Science Foundation Division of Undergraduate Education under
Grant #1339853; and by the Space Telescope Science Institute under Grant #HST-EO-13237.001-A
ORCID
Andria C. Schwortz http://orcid.org/0000-0003-1211-7620
Andrea C. Burrows http://orcid.org/0000-0001-5925-3596
References
Abello, J., P. Pardalos, and M. Resende, Eds. 2002. Handbook of Massive Data Sets. 1st ed. Dordrecht:
Kluwer Academic Publishers.
Achieve. 2010a. Connecting Science Standards with Assessment: A Snapshot of Three Countries’
Approaches - England, Hong Kong, and Canada (Ontario). USA: Achieve. https://www.achieve.
org/publications/connecting-science-standards-assessment
Achieve. 2010b. International Science Benchmarking Report: Taking the Lead in Science Education:
Forging Next-generation Science Standards. USA: Achieve. https://www.nextgenscience.org/inter
national-benchmarking
Anderson, J. O., H. S. Lin, D. F. Treagust, S. P. Ross, and L. D. Yore. 2007. “Using Large-scale
Assessment Datasets for Research in Science and Mathematics Education: Programme for
International Student Assessment (PISA).” International Journal of Science and Mathematics
Education 5 (4): 591–614. doi:10.1007/s10763-007-9090-y.
Bailey, J. M. 2011. Astronomy Education Research: Developmental History of the Field and Summary of
the Literature. Las Vegas, NV: National Research Council Board on Science Education.
Berg, C., and S. Boote. 2017. “Format Effects of Empirically Derived Multiple-choice versus
Free-response Instruments When Assessing Graphing Abilities.” International Journal of Science
and Mathematics Education 15 (1): 19–38. doi:10.1007/s10763-015-9678-6.
Bergstrom, Z., and P. Sadler. 2016. “Evolution and Persistence of Students’ Astronomy Career
Interests: A Gender Study.” Journal of Astronomy & Earth Sciences Education 3 (1): 77–92.
Bransford, J. D., A. L. Brown, and R. R. Cocking. 2000. How People Learn: Mind, Brain, Experience, and
School. Washington, DC: National Academies Press.
Brickhouse, N. W., P. Lowery, and K. Schultz. 2000. “What Kind of Girl Does Science?: The
Construction of School Science Identities.” Journal of Research in Science Teaching 37 (5):
441–458. doi:10.1002/(SICI)1098-2736(200005)37:5<441::AID-TEA4>3.0.CO;2-3.
Brothen, T. 1984. “Three Computer-assisted Laboratory Exercises for Introductory Psychology.”
Teaching of Psychology 11 (2): 105–107. doi:10.1207/s15328023top1102_14.
Brunner, R. J., S. G. Djorgovski, T. A. Prince, and A. S. Szalay. 2002. “Massive Datasets in Astronomy.” In
Handbook of Massive Data Sets, edited by J. Abello, P. M. Pardalos, and M. G. Resende, 931–979.
1st ed. Dordrecht: Kluwer Academic Publishers.
Burrows, A., M. Lockwood, M. Borowczak, E. Janak, and B. Barber. 2018. “Integrated Stem: Focus on
Informal Education and Community Collaboration Through Engineering.” Education Sciences 8
(1): 1–4. doi:10.3390/educsci8010004.
Burrows, A. C. 2015. “Partnerships: A Systemic Study of Two Professional Developments with
University Faculty and K-12 Teachers of Science, Technology, Engineering, and Mathematics.”
Problems of Education in the 21st Century 65: 28–38.
Burrows, A. C., M. A. DiPompeo, A. D. Myers, R. C. Hickox, M. Borowczak, D. A. French, and A. C.
Schwortz. 2016. “Authentic Science Experiences: Pre-Collegiate Science Educators Successes and
Challenges During Professional Development.” Problems of Education in the 21st Century 70: 59–73.
Busch, T. 1996. “Gender, Group Composition, Cooperation, and Self-Efficacy in Computer Studies.”
Journal of Educational Computing Research 15 (2): 125–135. doi:10.2190/KQJL-RTW1-VVUY-
BHLG.
Carey, C. C., and R. D. Gougis. 2017. “Simulation Modeling of Lakes in Undergraduate and Graduate
Classrooms Increases Comprehension of Climate Change Concepts and Experience with
Computational Tools.” Journal of Science Education and Technology 26 (1): 1–11. doi:10.1007/
s10956-016-9644-2.
Cheruvelil, K., and P. Soranno. 2018. “Data-intensive Ecological Research Is Catalyzed by Open
Science and Team Science.” BioScience 68 (10): 813–822. doi:10.1093/biosci/biy097.
Cheung, A., R. E. Slavin, E. Kim, and C. Lake. 2017. “Effective Secondary Science Programs: A
Best-evidence Synthesis.” Journal of Research in Science Teaching 54 (1): 58–81. doi:10.1002/
tea.21338.
Cohen, J. 1977. Statistical Power Analysis for the Behavioral Sciences. New York, NY: Academic Press.
Creswell, J. W. 2013. Qualitative Inquiry & Research Design: Choosing among Five Approaches. Los
Angeles, CA: Sage Publications.
Danaia, L., D. H. McKinnon, and M. Fitzgerald. 2017. “Ideal Pictures and Actual Perspectives of Junior
Secondary School Science: Comparisons Drawn from Australian Students in an Astronomy
Education Programme.” Research in Science & Technological Education. 35 (4): 445–460.
doi:10.1080/02635143.2017.1344959.
Day, J., J. B. Stang, N. G. Holmes, D. Kumar, and D. A. Bonn. 2016. “Gender Gaps and Gendered Action
in a First-year Physics Laboratory.” Physical Review Physics Education Research 12 (2): 1–14.
doi:10.1103/PhysRevPhysEducRes.12.020104.
Eymur, G., and Ö. Geban. 2017. “The Collaboration of Cooperative Learning and Conceptual Change:
Enhancing the Students’ Understanding of Chemical Bonding Concepts.” International Journal of
Science and Mathematics Education 15 (5): 853–871. doi:10.1007/s10763-016-9716-z.
Gültepe, N. 2016. “Reflections on High School Students’ Graphing Skills and Their Conceptual
Understanding of Drawing Chemistry Graphs.” Kuram Ve Uygulamada Egitim Bilimleri 16 (1):
53–81.
Hake, R. 1998. “Interactive-engagement versus Traditional Methods: A Six-thousand-student Survey
of Mechanics Test Data for Introductory Physics Courses.” American Journal of Physics 66 (1):
64–74. doi:10.1119/1.18809.
Hampton, S. E., M. B. Jones, L. A. Wasser, M. P. Schildhauer, S. R. Supp, J. Brun, and R. R. Hernandez.
2017. “Skills and Knowledge for Data-intensive Environmental Research.” BioScience 67 (6):
546–557. doi:10.1093/biosci/bix025.
Harris, C. J., W. R. Penuel, C. M. D'Angelo, A. H. DeBarger, L. P. Gallagher, C. A. Kennedy, B. H. Cheng,
and J. S. Krajcik. 2015. “Impact of Project-Based Curriculum Materials on Student Learning in
Science: Results of a Randomized Controlled Trial.” Journal of Research in Science Teaching 52 (10):
1362–1385.
Hecker, S., M. Haklay, A. Bowser, Z. Makuch, J. Vogel, and A. Bonn. 2018. Citizen Science: Innovation in
Open Science, Society and Policy. London: UCL Press. https://www.wilsoncenter.org/sites/default/
files/book_downloads/citizen-science.pdf
Hellgren, S., and S. Lindberg. 2017. “Motivating Students with Authentic Science Experiences:
Changes in Motivation for School Science.” Research in Science & Technological Education 35 (4):
409–426. doi:10.1080/02635143.2017.1322572.
Hill, C., C. Corbett, and A. St. Rose. 2010. Why so Few?: Women in Science, Technology, Engineering,
and Mathematics. Washington, DC: American Association of University Women.
Hubwieser, P., M. N. Giannakos, M. Berges, T. Brinda, I. Diethelm, J. Magenheim, Y. Pal, J. Jackova, and
E. Jasute. 2015. “A Global Snapshot of Computer Science Education in K-12 Schools.” In
Proceedings of the 2015 ITiCSE on Working Group Reports (ITICSE-WGR ‘15), 65–83. New York, NY:
ACM. doi: 10.1145/2858796.2858799.
Jackson, D. F., B. J. Edwards, and C. F. Berger. 1993. “Teaching the Design and Interpretation of
Graphs through Computer-aided Graphical Data Analysis.” Journal of Research in Science Teaching
30 (5): 483–501. doi:10.1002/tea.3660300507.
Johri, A., and B. M. Olds. 2014. Cambridge Handbook of Engineering Education Research. New York,
NY: Cambridge University Press.
Kobulnicky, H., and D. Dale. 2016. “A Community Mentoring Model for STEM Undergraduate
Research Experiences.” Journal of College Science Teaching 45 (6): 17–23. doi:10.2505/4/
jcst16_045_06_17.
Krathwohl, D. R. 2010. “A Revision of Bloom’s Taxonomy: An Overview.” Theory into Practice 41 (4):
212–218. doi:10.1207/s15430421tip4104_2.
Kutnick, P., D. C. L. Fung, I. A. C. Mok, F. K. S. Leung, J. C. H. Li, B. P.-Y. Lee, and V. K. W. Lai. 2017.
“Implementing Effective Group Work for Mathematical Achievement in Primary School
Classrooms in Hong Kong.” International Journal of Science and Mathematics Education 15 (5):
957–978. doi:10.1007/s10763-016-9729-7.
Lakens, D. 2013. “Calculating and Reporting Effect Sizes to Facilitate Cumulative Science: A Practical
Primer for T-tests and ANOVAs.” Frontiers in Psychology 4 (863): 1–12. doi:10.3389/
fpsyg.2013.00863.
Langen, T. A., T. Mourad, B. W. Grant, W. K. Gram, B. J. Abraham, D. S. Fernandez, M. Carroll et al. 2014.
“Using Public Large Datasets in the Undergraduate Ecology Classroom.” Frontiers in Ecology and
the Environment 12 (6): 362. doi:10.1890/1540-9295-12.6.362.
Lee, M. 1993. “Gender, Group Composition, and Peer Interaction in Computer-Based Cooperative
Learning.” Journal of Educational Computing Research 9 (4): 549–577. doi:10.2190/VMV1-JCVV-
D9GA-GN88.
Lelliott, A., and M. Rollnick. 2010. “Big Ideas: A Review of Astronomy Education Research
1974–2008.” International Journal of Science Education 32 (13): 1771–1799. doi:10.1080/
09500690903214546.
Leskovec, J., A. Rajaraman, and J. D. Ullman. 2011. Mining of Massive Datasets. 1st ed. Cambridge, UK:
Cambridge University Press.
Lutsky, N. 1986. “Undergraduate Research Experience through the Analysis of Data Sets in
Psychology Courses.” Teaching of Psychology 13 (3): 119–122. doi:10.1207/s15328023top1303_4.
Microsoft Corporation. 2017. Why Europe’s Girls aren’t Studying STEM. Microsoft Philanthropies.
National Academies of Sciences, Engineering, and Medicine (NAS). 2018a. Open Source Software
Policy Options for NASA Earth and Space Sciences. Washington, DC: National Academies Press.
doi:10.17226/25217.
National Academies of Sciences, Engineering, and Medicine (NAS). 2018b. Sexual Harassment in
Academia. Washington, DC: National Academies Press. doi:10.17226/24994.
National Governors Association Center for Best Practices, Council of Chief State School Officers
(CCSS). 2010. Common Core State Standards Math. Washington DC: National Governors
Association Center for Best Practices, Council of Chief State School Officers.
National Research Council. 2010. New Worlds, New Horizons in Astronomy and Astrophysics.
Washington, D.C. http://sites.nationalacademies.org/bpa/bpa_049810
NGSS Lead States. 2013. Next Generation Science Standards: For States, by States. Washington, DC:
National Academies Press.
Nieminen, P., A. Savinainen, and J. Viiri. 2013. “Gender Differences in Learning of the Concept of
Force, Representational Consistency, and Scientific Reasoning.” International Journal of Science
and Mathematics Education 11 (5): 1137–1156. doi:10.1007/s10763-012-9363-y.
Nissen, J. M., and J. T. Shemwell. 2016. “Gender, Experience, and Self-efficacy in Introductory
Physics.” Physical Review Physics Education Research 12 (2): 1–16. doi:10.1103/
PhysRevPhysEducRes.12.020105.
Nissen, J. M., R. M. Talbot, A. N. Thompson, and B. Van Dusen. 2018. “Comparison of Normalized Gain
and Cohen’s D for Analyzing Gains on Concept Inventories.” Physical Review Physics Education
Research 14 (1): 010115. doi:10.1103/PhysRevPhysEducRes.14.010115.
Nyhof-Young, J. 2000. “The Political Is Personal: Reflections on Facilitating Action Research in
Gender Issues in Science Education.” Educational Action Research 8 (3): 471–498. doi:10.1080/
09650790000200134.
Nyström, E. 2007. “Exclusion in an Inclusive Action Research Project: Drawing on Student
Perspectives of School Science to Identify Discourses of Exclusion.” Educational Action Research
15 (3): 417–440. doi:10.1080/09650790701549693.
Raddick, M. J., G. Bracey, P. L. Gay, C. J. Lintott, P. Murray, K. Schawinski, A. S. Szalay, and
J. Vandenberg. 2010. “Galaxy Zoo: Exploring the Motivations of Citizen Science Volunteers.”
Astronomy Education Review 9 (1): 1–18. doi:10.3847/AER2009036.
Resnick, I., K. A. Kastens, and T. F. Shipley. 2018. “How Students Reason about Visualizations from
Large Professionally Collected Data Sets: A Study of Students Approaching the Threshold of Data
Proficiency.” Journal of Geoscience Education 66 (1): 55–76. doi:10.1080/10899995.2018.1411724.
Scantlebury, K., and D. Baker. 2013. “Gender Issues in Science Education Research: Remembering
Where the Difference Lies.” In Handbook of Research on Science Education, Vol. 1. Mahwah,
NJ: Lawrence Erlbaum Associates.
Schneider, D. P., P. B. Hall, G. T. Richards, M. A. Strauss, D. E. Vanden Berk, S. F. Anderson, and
W. N. Brandt. 2007. “The Sloan Digital Sky Survey Quasar Catalog. IV. Fifth Data Release.” The
Astronomical Journal 134 (1): 102–117. doi:10.1086/518474.
Schwamb, M. E., J. A. Orosz, J. A. Carter, W. F. Welsh, D. A. Fischer, G. Torres, and A. W. Howard. 2013.
“Planet Hunters: A Transiting Circumbinary Planet in A Quadruple Star System.” The Astrophysical
Journal 768 (2): 127–148. doi:10.1088/0004-637X/768/2/127.
Scott, S., H. Asoko, and J. Leach. 2007. “Student Conceptions and Conceptual Learning in Science.” In
Handbook of Research on Science Education, edited by S. Abell and N. Lederman, 31–56. Vol. 1.
Mahwah, NJ: Lawrence Erlbaum Associates.
Shen, Y., G. T. Richards, M. A. Strauss, P. B. Hall, D. P. Schneider, S. Snedden, D. Bizyaev, et al. 2011.
“A Catalog of Quasar Properties from Sloan Digital Sky Survey Data Release 7.” The Astrophysical
Journal Supplement Series 194 (2): 45. doi:10.1088/0067-0049/194/2/45.
Slavin, R. 2020. “How Evidence-based Reform Will Transform Research and Practice in Education.”
Educational Psychologist 55 (1): 21–31.
Slavin, R. E., C. Lake, P. Hanley, and A. Thurston. 2014. “Experimental Evaluations of Elementary
Science Programs: A Best-evidence Synthesis.” Journal of Research in Science Teaching 51 (7):
870–901. doi:10.1002/tea.21139.
Snyder, T., and S. Dillow. 2011. Digest of Education Statistics, 2010. Washington, DC: National Center
for Education Statistics.
Spuck, T. 2014. “Putting the “Authenticity” into Science Learning.” In Einstein Fellows: Best Practices in
STEM Education, edited by T. Spuck and J. Leigh, 118–157. New York, NY: Peter Lang.
Sullivan, G. M., and R. Feinn. 2012. “Using Effect Size – Or Why the P Value Is Not Enough.” Journal of
Graduate Medical Education 4 (3): 279–282. doi:10.4300/JGME-D-12-00156.1.
Tekkumru-Kisa, M., C. Schunn, M. K. Stein, and B. Reynolds. 2019. “Change in Thinking Demands for
Students across the Phases of a Science Task: An Exploratory Study.” Research in Science Education
49 (3): 859–883. doi:10.1007/s11165-017-9645-z.
Türk, C., and H. Kalkan. 2018. “Teaching Seasons with Hands-on Models: Model Transformation.”
Research in Science & Technological Education 36 (3): 324–352.
Vygotsky, L. S. 1978. Mind in Society: The Development of Higher Mental Process. Cambridge, MA:
Harvard College Press.
Wallace, R. M., J. Kupperman, J. Krajcik, and E. Soloway. 2000. “Science on the Web: Students Online
in a Sixth-grade Classroom.” The Journal of the Learning Sciences 9 (1): 75–104. doi:10.1207/
s15327809jls0901_5.
Weintrop, D., E. Beheshti, M. Horn, K. Orton, K. Jona, L. Trouille, and U. Wilensky. 2015. “Defining
Computational Thinking for Mathematics and Science Classrooms.” Journal of Science Education
and Technology 25 (1): 127–147. doi:10.1007/s10956-015-9581-5.
Westwood, P. 2001. “Differentiation’ as a Strategy for Inclusive Classroom Practice: Some Difficulties
Identified.” Australian Journal of Learning Disabilities 6 (1): 5–11. doi:10.1080/
19404150109546651.
Witten, I. H., E. Frank, and M. A. Hall. 2011. Data Mining: Practical Machine Learning Tools and
Techniques. 3rd ed. Burlington, MA: Morgan Kaufmann.
Wilcox, B. R., and H. J. Lewandowski. 2016. “Open-ended versus Guided Laboratory Activities: Impact
on Students’ Beliefs about Experimental Physics.” Physical Review Physics Education Research 12
(2). doi:10.1103/PhysRevPhysEducRes.12.020132.