Inquiry-based learning and E-learning: how to serve high and low achievers

Abstract

Large-scale implementations of effective inquiry-based learning are rare. A European-wide initiative gave teachers access to innovative e-learning tools (ranging from virtual labs, virtual games and simulations to augmented reality applications) for lesson planning and classroom implementation. We examined 668 such implementations across 453 schools within one school year. Teachers could use a platform with digital resources and tools and were encouraged to adopt five phases of inquiry-based learning: orientation, hypothesizing, planning, analysis, and conclusion. Additionally, an integrated interface for lesson implementation tracked each student’s problem-solving competence during the inquiry lessons, culminating in about 12,000 data sets. Each digital inquiry-based scenario deployed an average of 22 digital resources and required approximately 50 min for completion. These scenarios, using high-quality resources adapted to school conditions, yielded significant learning outcomes for participating students (age: 14.4 years, gender balanced). While the PISA study identified 10% high achievers on average, our framework exceeded this number with 20–29% high achievers, while 37–42% were low achievers (close to the 45% PISA average). Offering teachers tools that help create individual inquiry scenarios and monitor students’ achievement poses no insurmountable obstacles to the classroom implementation of inquiry-based lessons: compared to the PISA study, the share of high achievers increased even when complex problem-solving competence was required.

Introduction

E-learning is regarded as a novel tool to remove barriers in the way of conventional classroom teaching. It offers innovative teaching and learning environments that engage students of different learning levels. Access to networks as well as tools for information or data acquisition are considered to enhance learning processes (Breiner, Harkness, Johnson, & Koehler, 2012). These may also foster individual competences if relevant knowledge is readily available in the respective simulated situation and additional assessment options are provided. Moreover, efficient tools may encourage in-class communication with peers and/or autonomous exploration of databases and archives (Sotiriou & Bogner, 2008). Networking with other schools or with research centers may offer new roles for knowledge management (Sotiriou & Bogner, 2011; Sotiriou et al., 2011). Pilot implementations in schools may turn into large-scale deployments with both teachers and students benefitting from new technologies available in the classroom, which significantly improve teachers’ professional development and students’ learning (Sotiriou, Riviou, Cherouvis, Chelioti, & Bogner, 2016). In the near future, the basic infrastructure of schools may include resource centers operating under a superordinate management structure to encourage structured and open learning activities (Schmid & Bogner, 2015). Information-Computer-Technology (ICT) will be an integral part of school activities to consult with peers or tutors and to search for information on lessons or homework assignments. Furthermore, ICT may play a fundamental role in fostering international interactions between students and teachers. Online discussion forums may provide an open platform for others to participate (Linn, 2000). This may help broaden the horizon of experiences since schools increasingly become virtual spaces. Educational communities will thus become virtual organizations transcending geographical and institutional boundaries (Sotiriou & Bogner, 2005, 2011).

Nowadays, e-learning tools and resources for science education are readily available. They differ in content, intention, interface and learner support to help teachers tailor their lessons to the needs of their students (Wecker, Kohnlet, & Fischer, 2007). Nevertheless, many barriers remain that prevent teachers from adopting these tools in daily practice (Kuhn, 2005; Moran, 2007). For instance, existing online tools (i) usually lack structuring and scaffolding inputs for an inquiry process (Cooper & Ferreira, 2009); (ii) have different interfaces and application methods, impeding classroom implementation; (iii) often focus on specific age groups and therefore rarely meet teachers’ needs; or (iv) do not complement science curricula. (v) Finally, teachers are often not familiar with the application of online tools, particularly debugging procedures, and consequently refrain from regularly implementing such activities in class (Shulman & Valcarcel, 2012). Removal of these barriers may involve (a) developing an authoring environment for educational activities that offers structures for experimentation with online tools; (b) providing teachers and students with a standardized methodology (or a variety of methodologies), based on inquiry- and problem-solving approaches, to organize online resources; (c) making online tools and resources adaptable (Keselman, 2003; Pilkington, 2004; Kelly, 2008); (d) organizing these resources effectively, that is, considering curricular guidelines, for instance the “big ideas of science” (David, 2008; Harlen, 2010; Wilhelm, Sherrod, & Walters, 2008); (e) indicating where and how online tools (and the associated activities) support conventional classroom teaching (Burris, 2012); (f) providing teachers with established support facilities managed by external online tool providers; and (g) strengthening online teacher communities with special support infrastructures and offers for professional development (Sotiriou et al., 2016). The long-term objective comprises content knowledge acquisition, application of new technologies as well as turning novice learners into expert learners and reflective problem solvers (Alberts, 2009; Bereiter, 2002). This may also foster critical thinking, since students who possess the necessary skills and motivation for self-regulated life-long learning reflect on and assess their input and output (Franke & Bogner, 2011; Goldschmidt & Bogner, 2016; Scharfenberg & Bogner, 2011, 2013b). Thus, inspired science education should assist all teachers who plan lessons or develop curricula (goals, methods, materials, and assessments) to help them overcome or reduce barriers in the way of innovative classroom teaching, which individually engages and supports students of all learning levels (Anderson, 2002; Breiner et al., 2012).

Embedding inquiry activities into lesson preparation allows the analysis of processes involved in planning and preparing activities that may foster complex problem-solving abilities (Blumenfeld et al., 1991; Rocard, Csermely, Jorde, Lenzen, & Walberg-Henriksson, 2007). To solve a complex problem successfully, the related problems must be identified, characterized and understood in order to represent the problem, solve it, and reflect on and communicate a potential solution (following the PISA methodology; OECD, 2006, 2014). To summarize, our study had four objectives: First, to show that implementing inquiry-based learning in regular school lessons is feasible (Shamos, 1995; Schaal & Bogner, 2005; Scharfenberg & Bogner, 2011). Second, to demonstrate that the systematic introduction of inquiry processes is independent of class size. Third, to exemplify the transformation of the teacher’s role from content user to content developer and supplier. Fourth, to improve low and medium achievers’ performance levels by fostering their problem-solving abilities.

Methods & procedures

We collected our data in the course of a three-year European research project (Inspiring Science Education; ISE) comprising 668 implementations in 453 schools within one school year. All in all, 12,550 students (aged 14.5 years on average, with a roughly balanced gender ratio) participated in these 668 lessons. The overall approach is built on three pillars: a) effective introduction of inquiry lessons into the school curriculum, b) development of a systematic approach to assess the impact of such implementations on students’ problem-solving competences, and c) reform of teachers’ practice.

Supporting the integration of inquiry-based lessons in the school curriculum

The project platform (Inspiring Science Education Platform, http://portal.opendiscoveryspace.eu/ise) with its open development environment supported teachers in organizing inquiry-based and technology-enhanced learning activities. An “Instructional Design Tool” helped teachers plan these activities. Instead of following step-by-step instructions, a user-centered approach assisted teachers in tailoring their learning activities to the desired learning outcomes and the respective classroom conditions. These activities differed with regard to the students’ age, ranging from strictly structured to more open-ended environments. As suggested by the inquiry-based model, we limited lesson plans to five phases (Bybee, 2002; OECD, 2014; Sotiriou et al., 2016): i) “Orienting & Asking Questions” focused students on answering a question, investigating a controversial question and solving problems. Teachers may support this phase with narratives, videos or animations to encourage students to ask questions, discuss issues and take notes of ideas. ii) “Hypothesis Generation & Design” encouraged students to express hypotheses based on prior experience, on written notes, or on the structure of the question. This learning activity was particularly supported in order to generate the hypotheses required for the next stage. iii) “Planning & Investigation” built on previously generated hypotheses and aimed at planning work processes. This determined the order of activities and intermediate goals, for instance which tools and data to use, how to set a clear timeline, and how activities could be assigned to participants. iv) “Analysis & Interpretation” collected data for subsequent analyses. Teachers supported learners who had difficulties and helped students process data by identifying key issues. Problem solving also entails comparing and examining existing solutions described by experts against students’ solutions. For the investigation of controversial cases, different perspectives on how a situation had been approached were analyzed. Finally, v) “Conclusion & Evaluation” aimed at achieving a consensus about adequate solutions to a problem, producing a common learning artifact, or reconciling different solutions to reach a common decision. Presenting conclusions to a broader audience resulted in the replication and endorsement of results.

Problem-solving competence framework

For the analysis, we refer to PISA achievement levels to validate our categorization in schools. Piloting and field-testing results were analyzed systematically and widely disseminated, ensuring immediate effects and widespread uptake. Problem-solving competence is a central objective in the educational programs of many countries (OECD, 2014). The acquisition of enhanced problem-solving competence is vital for future learning as well as for active participation in society and personal activities. Students should therefore be able to apply their knowledge to new situations, since problem-solving competence, based on basic thinking and cognitive approaches, helps them master various challenges in life (Dewey, 1997; Barrow, 2006; Driver, Squires, Rushworth, & Wood-Robinson, 1994; Lesh & Zawojewski, 2007).

The range of problem-solving assessment tasks included in the PISA 2012 Problem Solving Framework (PSF) distinguishes six levels of problem-solving proficiency, which can be grouped into three main categories (OECD, 2014, pp. 56–60):

  • High Performers (Level 5 and Level 6): students in this category can: a) develop complete, coherent mental models in different situations and b) find answers by means of purposive exploration and methodical execution of multi-step plans.

  • Moderate Performers (Level 3 and Level 4): students in this category can: a) control moderately complex devices, though not always efficiently and b) handle multiple situations or inter-related properties by controlling different variables.

  • Low Performers (Level 1 and Level 2): students in this category can: a) answer if a single, specific constraint must be taken into account and b) partially describe a simple, everyday topic.

In order to assess students’ problem-solving competence (following the PISA 2012 Problem Solving Framework) in connection with inquiry-based learning, it is essential to incorporate appropriate assessment tasks into the various phases of the inquiry cycle (as specified above). Our proposed framework comprises: a) a mapping between problem-solving steps and inquiry cycle phases (specified above) and b) guidelines for developing assessment tasks to assess each of the problem-solving steps at different phases of the inquiry cycle. To develop problem-solving competence, all the steps described should be completed to get from the presented situation to the intended goal. Our definition of problem-solving competence follows the PISA definition and describes it as an “individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately apparent. It includes the willingness to engage with such situations to achieve one’s potential as a constructive and reflective citizen.” (OECD, 2014, p. 123). Our assessment focused on the cognitive methods required to solve real problems. Following PISA, proficiency levels were divided into high, moderate, and low (OECD, 2014). Students proficient at high levels can develop complete, coherent mental models of different situations, and find answers by means of purposive exploration and methodical execution of multi-step plans. Based on PISA results, about 10% of students should be able to answer a question at this difficulty level correctly. Moderately proficient students can control moderately complex devices, though not always efficiently, and handle multiple situations or inter-related properties by controlling the variables. About 45% of students should be able to answer questions at this level. Low-level students can only answer if a single, specific constraint is taken into account and can only partially describe a simple, everyday topic. About 45% of the students should be able to answer questions at this level.
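
To make the framework more concrete, the sketch below (Python, with assumed names) shows how the inquiry phases and PISA proficiency bands described above could be represented. The exact phase-to-step assignment used in the project is not reproduced in the text, so the one shown here is merely a plausible reading, not the project’s actual framework.

```python
# Illustrative only: one plausible mapping of the inquiry-cycle phases to the
# PISA problem-solving steps named in the text (the assignment is an assumption,
# not the project's actual framework), plus the PISA 2012 proficiency bands.

INQUIRY_PHASE_TO_PS_STEPS = {
    "Orienting & Asking Questions":   ["understand_and_characterize"],
    "Hypothesis Generation & Design": ["represent_the_problem"],
    "Planning & Investigation":       ["represent_the_problem", "solve_the_problem"],
    "Analysis & Interpretation":      ["solve_the_problem"],
    "Conclusion & Evaluation":        ["reflect_and_communicate"],
}

# PISA 2012 bands (OECD, 2014): levels 5-6 = high, 3-4 = moderate, 1-2 = low,
# with expected population shares of roughly 10%, 45% and 45%, respectively.
PROFICIENCY_BANDS = {
    "high":     {"levels": (5, 6), "expected_share": 0.10},
    "moderate": {"levels": (3, 4), "expected_share": 0.45},
    "low":      {"levels": (1, 2), "expected_share": 0.45},
}
```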

Data collection framework and ethics protocol

Advances in software development tools and the use of networked computers increasingly enable efficient assessment analyses, including the capability to handle dynamic and interactive problems. Assessment tasks can record and quantify the type, frequency, length and sequence of actions performed by students, capturing information about problem-solving processes while engaging students and arousing their interest. The organization of inquiry activities, for instance in school labs, allows assessing the impact of complex scenarios on complex problem-solving skills (Wilhelm & Wilhelm, 2010). The different steps performed by students (understanding and characterizing the problem, representing and solving the problem, and reflecting on and communicating the solution) are included in the educational design process and thus enable the mapping of changes in skills throughout the problem-solving process (OECD, 2014). The analysis of potential solution paths then indicates strengths and weaknesses on an individual level. This evaluation allows conclusions to be drawn with regard to the competence level attained after a specific science activity.
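
As an illustration of the kind of record such analyses rely on, the following sketch models a single logged student action; the field names and types are assumptions for this example, not the platform’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class StudentAction:
    """One logged student action within an inquiry phase (illustrative schema)."""
    student_id: str      # pseudonymous identifier linking the learner to the teacher
    phase: str           # e.g. "Planning & Investigation"
    action_type: str     # e.g. "open_resource", "submit_hypothesis", "answer_task"
    started_at: float    # timestamp in seconds since the epoch
    duration_s: float    # length of the action in seconds
    sequence_index: int  # position of the action within the student's session
```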

A systematic data collection approach was adopted to monitor the overall implementation of the inquiry lessons in the different schools. Web analytics were used as a data source to determine the time spent on task, the number of resources used, and the number of students engaged per classroom. The teacher provided students with a URL and a passcode, which linked learners to the teacher. The categorization of students into low, medium, and high achievers was adapted from the PISA study. In each phase, students had to solve two tasks addressing the respective partial ability: students who completed the tasks of a phase (for instance, “Orienting & Asking Questions”) successfully were categorized as high achieving; students who were unable to solve both tasks received a low profile value; and if answers were distributed equally across high and moderate levels, the profile value was “moderate” (see the sketch below). This gave us the opportunity to monitor achievement rating ranges in different lab activities, also taking the complexity of the lab-related tasks into account.
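
A minimal sketch of this classification rule, under the assumption that the two tasks of a phase address the high and the moderate proficiency level respectively; the function name and the tie-breaking are illustrative, not the platform’s actual scoring code.

```python
def classify_phase_achievement(solved_high_task: bool, solved_moderate_task: bool) -> str:
    """One reading of the rule above: both tasks solved -> "high", neither
    solved -> "low", otherwise -> "moderate" (illustrative, not the platform's
    actual scoring code)."""
    if solved_high_task and solved_moderate_task:
        return "high"
    if not solved_high_task and not solved_moderate_task:
        return "low"
    return "moderate"
```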

Even though this procedure can lead to an underestimation of students’ real performance, it minimizes the risk of over-interpretation. Finally, percentages per class were calculated for all inquiry phases and all lessons (about 12,000 student data sets from the 668 lesson implementations).
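
For illustration, the per-class percentages could then be derived from such phase-level labels roughly as follows; the data layout and function name are assumptions for this sketch rather than the project’s analysis code.

```python
from collections import Counter

def class_achievement_percentages(phase_labels: list[str]) -> dict[str, float]:
    """Share of "high", "moderate" and "low" labels among the students of one
    class for a given inquiry phase (illustrative aggregation)."""
    counts = Counter(phase_labels)
    total = sum(counts.values())
    return {level: round(100 * counts.get(level, 0) / total, 1)
            for level in ("high", "moderate", "low")}

# Example: one class of five students in the "Analysis & Interpretation" phase
print(class_achievement_percentages(["high", "moderate", "low", "moderate", "high"]))
# -> {'high': 40.0, 'moderate': 40.0, 'low': 20.0}
```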

All information obtained from students in the course of our implementation was strictly confidential and used only by schoolteachers to design their lessons and to assess learning in the respective pilot activities. Revealing this information to third parties was rigorously restricted. In general, the management of all information related to participating students complied with the ethical rules and regulations required in this context. Children and their parents, who were asked for consent, received all relevant information to sufficiently understand the nature, purpose and likely outcome of the current research. Thus, information sheets were presented to students/participants as well as to their parents or legal representatives for participants below 16 and for participants aged 16 to 18 in countries where they are considered minors. Information sheets ensured that participants and their parents or legal representatives understood the conditions of participation prior to the pilots’ initiation. Explicit information about data storage, access and use was also provided. The team stored all relevant data in an internal, password-protected area to which only specific members of the research team had access.

Informing teachers’ practice

During lesson implementation, teachers were able to monitor students’ work and progress and to receive direct feedback. They could also provide individual support or address the entire class if a topic was considered crucial for the lesson’s progress. Instead of the entire class, they could additionally support groups of students facing similar problems. The platform therefore offers scaffolds and supplementary materials/tools (e.g. a hypothesis creation scratchpad, graphical representation tools, error calculation apps, report templates) to introduce students to the inquiry process. This process informs teaching practice, helps teachers re-design their lessons (e.g. to highlight specific issues that could reduce or eliminate students’ misconceptions, or to distribute the time per task more effectively) and delivers focused and effective learning experiences. The assessment method, with reference to the complexity of the tasks and experimental work assigned to students, can also help teachers adapt implementation scenarios that specifically address significant variations in students’ scores.

Results

Resource-based inquiry lessons in the school curriculum

The extended implementation of scenarios and lesson plans indicates the potential for effective integration of the inquiry model into real school environments. Data from the 668 implementations demonstrate that the average instruction time (covering all inquiry-based teaching phases) was about 50 min, using approximately 22 digital resources (videos, images, animations, online labs, augmentations) to enrich lessons. This timeframe is appropriate for an average lesson period of 60 min. Figure 1 presents the average duration of each phase of the inquiry model along with the average number of resources used in each of the five phases for the total number of implementations. “Planning & Investigation” is the most time-consuming phase, as students perform experiments and use online resources. Numerous digital resources were used during both the “Orienting & Asking Questions” phase and the “Conclusion & Evaluation” phase.

Fig. 1

The average time spent (in minutes) and the number of digital resources used per phase in the 668 implementations examined throughout the pilots demonstrate that inquiry-based, digitally enriched learning can be effectively introduced within normal lesson times. Phases: 1 = Orienting & Asking Questions; 2 = Hypothesis Generation & Design; 3 = Planning & Investigation; 4 = Analysis & Interpretation; 5 = Conclusion & Evaluation. The average time per lesson is 50 min, using 22 digital resources per lesson

Implementations are independent of class size

A variety of educational scenarios and lesson plans were implemented in different educational settings. Figure 2 presents the duration of each lesson plotted against class size (number of students). On average, the timeframe allocated to the respective lessons is independent of class size (see the almost horizontal line in Fig. 2). Thus, the entire inquiry experience is available to all students in a classroom, independent of class size and with an average instruction period of 50 min.

Fig. 2

Average duration of the monitored lessons (in minutes) is plotted against the class size (number of students), indicating that lesson duration is independent of class size

Improving problem-solving proficiency levels

Compared to the OECD study (PISA 2015), in which only 10% of all participating students achieved high levels, 45% moderate levels and another 45% low levels (Fig. 3), about 20–29% of the students in our study were high achievers, while the shares of moderate and low achievers remained close to the expected PISA norm (10%, 45%, 45%). Our approach apparently helps more students reach high levels, while low achievers were (unfortunately) not affected.

Fig. 3

The average pattern of high, moderate and low performers per phase across all students and all implementations in the Inspiring Science Education pilots. Phases: 1 = Orienting & Asking Questions; 2 = Hypothesis Generation & Design; 3 = Planning & Investigation; 4 = Analysis & Interpretation. The last lesson phase (Conclusion & Evaluation) is not included, as its tasks did not involve problem-solving competence

To assess the consistency and reliability of the achievement rating, especially across a variety of different learning activities, students’ scores for different implementation scenarios were analyzed. The data demonstrate that students’ achievement rating varies in each phase of the inquiry process and depends on task complexity. Figure 4 presents such a comparison for two activities of different complexity. The first one (on the left) is based on the use of data collected from CERN’s particle-physics detectors to simulate the discovery of the Higgs boson. Thirty-three implementations with 510 students (average duration 1.5 h) demonstrate clear differences in achievement levels: students had considerable difficulties in the areas of orientation and analysis, whereas design and experimentation seemed more accessible to them. The second one (on the right) displays the outcome of the Eratosthenes experiment, which is apparently far easier for students to comprehend, since we observed a more balanced distribution of achievement levels per phase. In total, 97 implementations (average duration 1 h) with 902 students were analyzed.

Fig. 4

The partial-ability achievement ratings for different labs demonstrate the sensitivity of the approach to the complexity of the tasks. The students’ scores per phase are presented for a demanding experimental activity in high-energy physics (left graph; 510 students involved), where student groups showed difficulties in the areas of orientation and analysis but achieved higher scores in the experimental phase, and for the comparably simpler recapitulation of the Eratosthenes experiment to estimate Earth’s circumference (right graph; 902 students involved), where we observed a more balanced distribution of the students’ scores

Discussion

Inquiry-learning approaches in science lessons have repeatedly proven feasible and beneficial for long-term learning outcomes (e.g., Harlen, 2013; Linn et al., 2014; Schmid & Bogner, 2015; Marth & Bogner, 2017). Especially when combined with practitioner experience, inquiry learning has made considerable progress (Schwab, 1960; Shulman, 2004; Rust & Myer, 2006): mixed-method designs in inquiry projects frequently yielded better learning outcomes and produced more motivated and academically successful students compared to control groups (e.g. Falik et al., 2008; Chu, 2009). Inquiry experiences could improve the understanding of science content and scientific practices. To remain within the set timeframe of school lessons, scientific visualization technologies increasingly support inquiry learning in order to enable high-level differentiated learning without being time consuming. Thus, digital tools and resources offer an effective way to decrease time consumption and to increase the adoption of inquiry processes in everyday lessons (Sotiriou, Bybee, & Bogner, 2017). Organizing and deploying digital resources as part of a normal school lesson is demanding for teachers, which is why many refrain from following such an approach (Coiro, Castek, & Quinn, 2016; Langheinrich & Bogner, 2016). Technology-supported and teacher-generated lessons enriched with high-quality educational resources could support such interventions in a variety of classroom settings, meeting the needs of both students and teachers (Scharfenberg & Bogner, 2016). Such interventions could bridge the gap to real-life experiences as well as tap lifelong learning experiences and the inspiration of students – even of those with limited interest in science and math subjects (Gialouri, Uzunoglou, Gargalakos, Sotiriou, & Bogner, 2011; Langheinrich & Bogner, 2016; Mierdel & Bogner, 2019; Trautmann, 2013). Teachers primarily focus on organizing (or helping to organize) learning resources, running collaborative activities among students and making use of currently existing curriculum sets (lessons) (Thousand, Villa, & Nevin, 2006). There is no doubt that all these tasks were demanding for schoolteachers (Hämäläinen & Cattaneo, 2015; TALIS, 2014). It is, however, this kind of competence-oriented, inquiry-based, lifelong learning experience, in particular sharing (knowledge) domain- and (education) grade-specific practices and solutions, that helps facilitate interaction between teachers or students and “knowledge scaffolds” in peer communities (Valanides & Angeli, 2008; Wu, Lee, Chang, & Liang, 2013).

When implementing inquiry, teachers can function as bottlenecks: a lack of experience with new technologies, preparation requirements, and classroom management often makes them feel uncomfortable in this context (Blumenfeld et al., 1991). Showing them how to use these technologies for lesson planning, differentiated learning and independent learning in regular classroom settings will allow them to develop advanced educational experiences. This transformation is of course accelerated if teachers are offered a variety of digital resources that comply with existing curricula (Gordin, Polman, & Pea, 1994). The developed scenarios were in line with school curricula and set timeframes (about 55 min on average), deploying about 22 digital resources per lesson. Our platform successfully provided access to numerous resources which, due to additional services and tools, teachers have managed to integrate into meaningful educational activities and to share with others (Minner, Levy, & Century, 2010). The platform provided a series of exemplary scenarios and lesson plans, which helped teachers introduce inquiry- and resource-based learning in their classes. It thus provided the necessary framework for the introduction of innovative approaches to classroom teaching (Donovan & Bransford, 2005; Meissner & Bogner, 2012). Timing and organizing lessons should encourage teachers to adopt, improve, implement, re-design and re-apply the scenarios (White & Frederiksen, 1998). This process fosters teachers’ professional development and simultaneously improves their instruction (Lieberman, 1992). In our case, close to 700 implementations in different countries and classroom settings indicated that the timeframe allocated to the respective lessons was independent of class size. This is a significant advantage of our approach, as inquiry can be implemented effectively in both large and small classes.

Most important, however, is the impact of our intervention on students’ learning outcomes. Using the platform enabled us to efficiently and effectively assess students’ learning progress, to monitor their reactions to dynamic and interactive problems, to arouse students’ interest and to gain insights into problem-solving processes (Guàrdia, Crisp, & Alsina, 2017; Scharfenberg & Bogner, 2013a). This result could also be described as deeper learning, which denotes long-lasting, sustainable, and successfully acquired cognitive knowledge (e.g. Fremerey & Bogner, 2015; Goldschmidt, Scharfenberg, & Bogner, 2016; Randler & Bogner, 2009; Scharfenberg & Bogner, 2010). Computer-based assessment tasks enabled the recording of data about the type, frequency, length and sequence of actions performed by students in response to items. The organization of inquiry activities in lab work (through the “Authoring Environment”) helped to analyze the impact of complex scenarios on complex problem-solving abilities (see above). All steps that students performed to solve a problem (understanding and characterizing the problem, representing the problem, solving the problem, and reflecting on and communicating the solution) were included in the educational design process; as a result, the system enabled the mapping of changes in these partial abilities throughout the problem-solving process. The measurement procedure permitted the analysis of solution paths as well as of strengths and weaknesses on an individual level. The 12,454 data sets demonstrate a significant increase in high achievers (20–29% compared to the 10% OECD average), while the impact of the intervention on low achievers was less noteworthy (39–42% compared to the 45% OECD average) (OECD, 2014). The substantial increase in proficiency levels on complex problem-solving tasks shows that the usual barriers can be removed in inquiry-based classroom lessons. It is also possible to determine domain-specific characteristics of the curricular content by analyzing whether a student has attained certain competence levels after a specific science activity. Enhancing problem-solving competence is vital for future learning as well as for active participation in society and personal activities. Students should therefore be able to apply their knowledge to new situations, since their problem-solving competence helps them master various challenges in life.

Although it is difficult to agree on the relevant features for designing a new science-learning model, social support is regarded as crucial. Engaged teachers, parents, peers or trainers can, for instance, provide access to expertise and social networks (Sotiriou & Bogner, 2011). A reformed science-teaching pedagogy based on inquiry provides increased opportunities for cooperation between actors in formal and informal contexts (e.g. Hattie, 2009). Furthermore, technology-rich curricula may support deep understanding but will require competent and technology-proficient teachers. Assessment of learning, for instance timely feedback to students, must be meaningful for both students and teachers (Schmid & Bogner, 2015). Learning how to learn as well as critical thinking, higher-order thinking skills, and problem-solving skills are also core issues (Berg, Bergendahl, Lundberg, & Tibell, 2003). Overall, technology provides new opportunities for teaching and learning and enables sustained learning as well as real-life experiences.

Moving from theory to practice, there is one vital question: Why is it so hard to achieve effective science learning in school classrooms? Our current mode of teaching fails to engage students who are fluent in technology but disenchanted with science. But how could we improve our teaching given the constraints of single-lesson schedules, precise timing, internet firewalls, mandated textbooks, consensus-driven standards, highly disparate abilities within a single grade level, no technical support, and no dedicated spaces for science and computer labs?

In fact, this is a difficult task, given all the efforts of the last decade to reform science classrooms across Europe (European Commission, 2015). Previous education reforms have had only a marginal impact on students’ performance, giving little hope for a simple solution (Osborne & Dillon, 2008). Instead, education policy makers must support the laborious task of improving teachers’ competences and the classroom environment while simultaneously providing teachers with due respect and trust in relation to their crucial roles in society. In this context, “good practice” requires a bottom-up approach to set the foundation for learning innovations and to encourage holistic policy-making. By balancing this approach with top-down planning, the challenges for emerging paradigms, for instance access to learning, the creation and sharing of knowledge and the building of competences in learning communities, can be met (Oerke & Bogner, 2010). According to the “Science Education for Responsible Citizenship” report (European Commission, 2015), there are many ways in which science education can provide citizens, enterprises and industries with the relevant skills and competences to develop sustainable and competitive solutions to current and future challenges. These efforts call for effective collaboration between formal, non-formal and informal education providers, enterprises and society as such. This increases interest in science studies and science-based careers and will ultimately foster employability and competitiveness.

Our study indicates the impact of using resources from numerous providers in educational settings, beyond the ones provided by educational authorities. Building on best practices, bottom-up approaches aim at overcoming the constraints of present structures and developing an innovative, shared vision of excellence. Such innovation programs offer great opportunities to conceive future classrooms. If we want a powerful, innovative and self-sustaining culture in schools, we will have to support system-aware practitioners and ensure that this culture is widely adopted instead of remaining in isolated pockets of experimentation. Supporting a design-based approach of collaborative learning and inquiry between professional practitioners will lead to a “pull” reaction rather than the usual “push” attitude. More specifically, the latter should aim at capturing the profiles, needs, contributions and relationships of all school-related actors in order to develop a sustainable, innovative ecosystem that operates within a holistic framework of organizational learning and the promotion of educational innovations.

Availability of data and materials

Provided on request.

Abbreviations

PISA: Programme for International Student Assessment

ICT: Information-Computer-Technology

OECD: Organisation for Economic Cooperation and Development

ISE: Inspiring Science Education

References

  • Alberts, B. (2009). Redefining Science Education. Science, 323, 427. https://doi.org/10.1126/science.1170933.

  • Anderson, R. D. (2002). Reforming Science Teaching: What Research says about Inquiry. Journal of Science Teacher Education, 13(1), 1–12. https://doi.org/10.1023/A:1015171124982.

  • Barrow, L. H. (2006). A Brief History of Inquiry: From Dewey to Standards. Journal of Science Teacher Education, 17, 265–278. https://doi.org/10.1007/s10972-006-9008-5.

  • Bereiter, C. (2002). Education and mind in the knowledge age. Mahwah: Erlbaum Associates.

  • Berg, C. A., Bergendahl, V. B., Lundberg, B. K. S., & Tibell, L. E. (2003). Benefiting from an open-ended experiment? A comparison of attitudes to, and outcomes of, an expository versus an open-inquiry version to the same experiment. International Journal of Science Education., 25(3), 351–372.

  • Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J., Guzdial, M., & Palinscar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26, 369–398.

  • Breiner, J. M., Harkness, S. S., Johnson, C. C., & Koehler, C. M. (2012). What is STEM? A discussion about conceptions of STEM in education and partnerships. School Science and Mathematics, 112(1), 3–11.

  • Burris, J. (2012). It’s the teacher. Science, 335, 146. https://doi.org/10.1126/science.1218159.

  • Bybee, R. (Ed.) (2002). Learning science and the science of learning. Arlington: NSTA Press. https://doi.org/10.2505/9780873552080.

  • Chu, K. W. S. (2009). Inquiry project-based learning with a partnership of three types of teachers and the school librarian. Journal of American Society for Information Science and Technology, 60(8), 1671–1686.

  • Coiro, J., Castek, J., & Quinn, D. J. (2016). Personal Inquiry and Online Research: Connecting Learners in Ways That Matter. The Reading Teacher, 69(5), 483–492. https://doi.org/10.1002/trtr.1450.

  • Cooper, M., & Ferreira, J. M. M. (2009). Remote laboratories extending access to science and engineering curricular. IEEE Transactions on Learning Technologies, 2, 342–353.

  • David, J. (2008). What research says about project-based learning. Educational Leadership, 65, 80–82.

  • Dewey, J. (1997). How we think. Boston: D. C. Heath & Co.

  • Donovan, S., & Bransford, J. (Eds.) (2005). How students learn: Science in the classroom. Washington, DC: National Acad. Press. https://doi.org/10.17226/11102.

  • Driver, R., Squires, A., Rushworth, P., & Wood-Robinson, V. (1994). Making sense of secondary science. Research into children’s ideas. New York: Routledge. https://doi.org/10.1187/cbe.05-02-0068.

  • European Commission (2015). Science Education for Responsible Citizenship, Directorate-General for Research and Innovation Science with and for Society. Brussels: ISBN 978–92–79-43637-6.

  • Falik, O., Eylon, B., & Rosenfeld, S. (2008). Motivating teachers to enact Free-Choice PBL in Science and Technology (PBLSAT): Effects of a professional development model. Journal of Science Teacher Education., 19, 565–591. https://doi.org/10.1007/s10972-008-9113-8.

  • Franke, G., & Bogner, F. X. (2011). Conceptual change in students’ molecular biology education: tilting at windmills? Journal of Educational Research, 104(1), 7–18.

  • Fremerey, C., & Bogner, F. X. (2015). Learning about Drinking Water: How important are the three dimensions of knowledge that can change individual behaviour? Education Sciences, 4(4), 213–228. https://doi.org/10.3390/educsci4040213.

  • Gialouri, E., Uzunoglou, M., Gargalakos, M., Sotiriou, S., & Bogner, F. X. (2011). Teaching Real-Life Science in the Lab of Tomorrow. ASL (Advanced Science Letters), 4, 3317–3323.

  • Goldschmidt, M., & Bogner, F. X. (2016). Learning about genetic engineering in an outreach laboratory: Influence of motivation and gender on students’ cognitive achievement. International Journal of Science Education Part B, 6(2), 166–187. https://doi.org/10.1080/21548455.2015.1031293.

  • Goldschmidt, M., Scharfenberg, F.-J., & Bogner, F. X. (2016). Instructional efficiency of different discussion approaches in an outreach laboratory: Teacher-guided versus student-centered. Journal of Educational Research, 109(1), 27–36. https://doi.org/10.1080/00220671.2014.917601.

  • Gordin, D. N., Polman, J. L., & Pea, R. D. (1994). The Climate Visualizer: Sense-making through scientific visualization. Journal of Science Education and Technology, 3, 203–226.

  • Guàrdia, L., Crisp, G., & Alsina, I. (2017). Trends and Challenges of E-Assessment to Enhance Student Learning in Higher Education. Spain: UOC. https://doi.org/10.4018/978-1-5225-0531-0.ch003.

  • Hämäläinen, R., & Cattaneo, A. (2015). New TEL Environments for Vocational Education – Teacher’s Instructional Perspective. Vocations and Learning, 8, 135–157. https://doi.org/10.1007/s12186-015-9128-1.

  • Harlen, W. (2010). Principles and big ideas of science education. Hatfield: Association for Science Education.

  • Harlen, W. (2013). Assessment & Inquiry-Based Science Education: Issues in Policy and Practice. Italy: Global Network of Science Academies (IAP) Science Education Programme (SEP).

  • Hattie, J. (2009). Visible learning. London: Routledge.

  • Kelly, R., Lesh, A., & Baek, J. Y. (2008). Handbook of design research methods in education: Innovations in science, technology, engineering and mathematics learning and teaching, (pp. 219–245). London: Routledge.

  • Keselman, A. (2003). Supporting inquiry learning by promoting normative understanding of multivariable causality. Journal of Research in Science Teaching, 40, 898–921. https://doi.org/10.1002/tea.10115.

  • Kuhn, D. (2005). Education for thinking. Cambridge: Harvard University Press.

  • Langheinrich, J., & Bogner, F. X. (2016). Computer-related self-concept: The impact on cognitive achievement. Studies in Educational Evaluation, 50, 46–52. https://doi.org/10.1016/j.stueduc.

  • Lesh, R., & Zawojewski, J. S. (2007). Problem solving and modeling. In F. Lester (Ed.), Second handbook of research on mathematics teaching and learning, (pp. 763–804). Charlotte: Inf. Age Publ.

  • Lieberman, A. (1992). The meaning of scholarly activity and the building of community. Educational Researcher, 21(6), 5–12.

  • Linn, M. C. (2000). Designing the knowledge integration environment. International Journal of Science Education, 22(8), 781–796.

  • Linn, M. C., Gerard, L., Ryoo, K., McElhaney, K., Liu, O. L., & Rafferty, A. N. (2014). Computer-Guided Inquiry to Improve Science Learning. Science, 344(6180), 155–156.

  • Marth, M., & Bogner, F. X. (2017). Does the issue of bionics within a student-centred module generate long-term knowledge. Studies in Educational Evaluation, 55, 117–124.

  • Meissner, B., & Bogner, F. X. (2012). Science Teaching based on Cognitive Load Theory: Engaged Students, but Cognitive Deficiencies. Studies in Educational Evaluation, 38, 127–134. https://doi.org/10.1016/j.stueduc.2012.10.002.

  • Mierdel, J., & Bogner, F. X. (2019). Investigations of modellers and model viewers in an out-of-school gene-technology laboratory, Research in Science Education (online published). http://link.springer.com/article/10.1007/s11165-019-09871-3.

  • Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction – what is it and does it matter? Results from a research synthesis years 1984–2002. Journal of Research in Science Teaching, 47(4), 474–496. https://doi.org/10.1002/tea.20347.

  • Moran, M. J. (2007). Collaborative action research and project work: Promising practices for developing collaborative inquiry among early childhood preservice teachers. Teaching and Teacher Education, 23, 418–431.

  • OECD (2006). Think Scenarios, Rethink Education. New York. ISBN: 926402364X.

  • OECD (2014). PISA 2012 Results: Creative Problem Solving (Volume V): Students' Skills in Tackling Real-Life Problems. Paris: OECD Publishing. https://doi.org/10.1787/9789264208070-en.

  • Oerke, B., & Bogner, F. X. (2010). Gender, age and subject matter: Impact on teachers’ ecological values. The Environmentalist, 30(2), 111–122.

  • Osborne, J., & Dillon, J. (2008). Science Education in Europe: Critical Reflections. London: Nuffield Foundation.

  • Pilkington, R. M. (2004). Developing discussion for learning. Journal of Computer Assisted Learning, 20, 161–164. https://doi.org/10.1111/j.1365-2729.2004.00080.x.

  • PISA (2015). Results in Focus. Paris: OECD.

  • Randler, C., & Bogner, F. X. (2009). Efficacy of two different instructional methods involving complex ecological content. International Journal of Science and Mathematics Education, 7(2), 315–337. https://doi.org/10.1007/s10763-007-9117-4.

  • Rocard, M., Csermely, P., Jorde, D., Lenzen, D., & Walberg-Henriksson, H. (2007). Science Education Now: a Renewed Pedagogy for the Future of Europe. Brussels: European Commission.

  • Rust, F., & Myer, E. (2006). The bright side: Teacher research in the context of educational reform and policy-making. Teachers and Teaching: Theory and Practice, 12(1), 69–86.

  • Schaal, S., & Bogner, F. X. (2005). Human visual perception—Learning at workstations. Journal of Biological Education, 40(1), 32–37. https://doi.org/10.1080/00219266.2005.9656006.

  • Scharfenberg, F.-J., & Bogner, F. X. (2010). Instructional Efficiency of Changing Cognitive Load in an Out-of-School Laboratory. International Journal of Science Education, 32(6), 829–844. https://doi.org/10.1080/09500690902948862.

  • Scharfenberg, F.-J., & Bogner, F. X. (2011). A new two-step approach for hands-on teaching of gene technology: Effects on students' activities during experimentation in an outreach gene-technology lab. Research in Science Education, 41(4), 505–523. https://doi.org/10.1007/s11165-010-9177-2.

  • Scharfenberg, F.-J., & Bogner, F. X. (2013a). Instructional efficiency of tutoring in an outreach gene-technology laboratory. Research in Science Education, 43(3), 1267–1288. https://doi.org/10.1007/s11165-012-9309-y.

  • Scharfenberg, F.-J., & Bogner, F. X. (2013b). Teaching gene technology in an outreach lab: Students' assigned cognitive load clusters and the clusters' relationships to learner characteristics, laboratory variables, and cognitive achievement. Research in Science Education, 43(1), 141–161. https://doi.org/10.1007/s11165-011-9251-4.

  • Scharfenberg, F.-J., & Bogner, F. X. (2016). A New Role-Change Approach in Pre-service Teacher Education for Developing Pedagogical Content Knowledge in the Context of a Student Outreach Lab. Research in Science Education, 46(5), 743–766. https://doi.org/10.1007/s11165-015-9478-6.

  • Schmid, S., & Bogner, F. X. (2015). Effects of Students’ Effort Scores in a Structured Inquiry Unit on Long-Term Recall Abilities of Content Knowledge. Education Research International, Article ID 826734. https://doi.org/10.1155/2015/826734.

  • Schwab, J. J. (1960). Enquiry, the science teacher, and the educator. Science Teacher, 36, 6–11.

  • Shamos, M. (1995). The Myth of Scientific Literacy. Chicago: Rutgers Univ. Press.

  • Shulman, A., & Valcarcel, J. (2012). Scientific knowledge suppresses but does not explain earlier intuitions. Cognition, 124, 209–215. https://doi.org/10.1016/j.cognition.2012.04.005.

  • Shulman, L. (2004). The wisdom of practice: Essays on teaching, learning, and learning to teach. San Francisco: Jossey-Bass.

  • Sotiriou, S., & Bogner, F. X. (2005). The Pathway to High Quality Science Teaching. Pallini: EPINOIA. ISBN Number: 960-8339-60-X.

  • Sotiriou, S., & Bogner, F. X. (2008). Visualizing the Invisible: Augmented Reality as an Innovative Science Education Scheme. Advanced Science Letters, 1(1), 114–122.

  • Sotiriou, S., & Bogner, F. X. (2011). Inspiring Science Learning: Designing the Science Classroom of the Future. Advanced Science Letters, 4, 3304–3309.

  • Sotiriou, S., Bogner, F. X., & Neofotistos, G. (2011). Quantitative analysis of the usage of the COSMOS science education portal. Journal of Science and Technology Education, 20, 333–346. https://doi.org/10.1007/s10956-010-9256-1.

  • Sotiriou, S., Bybee, R., & Bogner, F. X. (2017). PATHWAYS – A Case of Large-Scale Implementation of Evidence-Based Practice in Scientific Inquiry-Based Science Education. International Journal of Higher Education, 6(2), 8–17. https://doi.org/10.5430/ijhe.v6n2p8.

  • Sotiriou, S., Riviou, K., Cherouvis, S., Chelioti, E., & Bogner, F. X. (2016). Introducing large-scale innovation in schools. Journal of Technology, Science and Education, 25(4), 541–549. https://doi.org/10.1007/s10956-016-9611-y.

  • TALIS (2014). An International Perspective on Teaching and Learning. Paris: OECD Publishing. https://doi.org/10.1787/9789264196261.

  • Thousand, J. S., Villa, R. A., & Nevin, A. I. (2006). The many faces of collaborative planning and teaching. Theory into Practice, 45, 239–248.

  • Trautmann, N. M. (2013). Citizen Science: 15 Lessons that Bring Biology to Life, 6–12. Washington, D.C: NSTA Press.

  • Valanides, N., & Angeli, C. (2008). Distributed cognition in a sixth-grade classroom: an attempt to overcome alternative conceptions about light and color. Journal of Research on Technology in Education, 40, 309–336.

  • Wecker, C., Kohnlet, C., & Fischer, F. (2007). Computer literacy and inquiry learning: When geeks learn less. Journal of Computer Assisted Learning, 23, 133–144.

  • White, B. Y., & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16, 3–118.

  • Wilhelm, J., Sherrod, S., & Walters, K. (2008). Project-based learning environments: Challenging preservice teachers to act in the moment. The Journal of Educational Research, 101, 220–233.

  • Wilhelm, J. G., & Wilhelm, P. J. (2010). Inquiring minds learn to read, write and think: Reaching all learners through inquiry. Middle School Journal, 5, 39–46.

  • Wu, H. K., Lee, S. W. Y., Chang, H. Y., & Liang, J. C. (2013). Current status, opportunities and challenges of augmented reality in education. Computers & Education, 62, 41–49. https://doi.org/10.1016/j.compedu.2012.10.024.


Acknowledgements

We appreciate the support of the students and teachers, in particular in collecting the data.

Funding

This research was completed within the ISE (Inspiring Science Education) project funded by the European Commission (CIP-ICT-PSP-2012-325123). Any opinions, findings, conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the position of the European Commission. Additional funding originated from Ellinogermaniki Agogi (EA) and the University of Bayreuth (UBT). The funders had no role in the design of the study, in the collection, analyses, or interpretation of data, in the writing of the manuscript, or in the decision to publish the results. This article reflects only the authors’ views. The funders are not liable for any use that might be made of the information contained herein.

Author information

Contributions

All three authors contributed to the study conception and design, to the data collection and analysis, and to the first and subsequent drafts of the manuscript; all authors commented on and approved the manuscript up to the final version.

Corresponding author

Correspondence to Franz X. Bogner.

Ethics declarations

Competing interests

The authors declare that they have no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Cite this article

Sotiriou, S.A., Lazoudis, A. & Bogner, F.X. Inquiry-based learning and E-learning: how to serve high and low achievers. Smart Learn. Environ. 7, 29 (2020). https://doi.org/10.1186/s40561-020-00130-x
