Abstract

As a group of faculty with expertise and research programs in the area of host-pathogen interactions (HPI), we are concentrating on students’ learning of HPI concepts. Accordingly, we developed a concept inventory to measure students’ level of understanding of HPI after the completion of a set of microbiology courses (presently eight courses). Concept inventories have been useful tools for assessing student learning, and our interest was to develop such a tool to measure student learning progression in our microbiology courses. Our teaching goal was to create bridges between our courses that would eliminate excessive overlap in our offerings and support a model in which concepts and ideas introduced in one course become the foundation for concept development in successive courses. We developed our HPI concept inventory in several phases. The final product was an 18-question, multiple-choice concept inventory. In fall 2006 and spring 2007 we administered the 18-question concept inventory in six of our courses. We collected pre- and postcourse surveys from 477 students. We found that students taking pretests in the advanced courses retained the level of understanding gained in the general microbiology prerequisite course. Also, in two of our advanced courses there was significant improvement in scores from pretest to posttest. As we move forward, we will concentrate on exploring the range of HPI concepts addressed in each course and determine and/or create effective methods for meaningful student learning of HPI aspects of microbiology.
This study involved the development of a diagnostic assessment tool or concept inventory to measure the level of understanding about host-pathogen interactions (HPIs) after completing a set of microbiology courses. As a group of faculty at a research university with expertise and research programs in the area of HPIs, we are responsible for teaching the undergraduate courses with HPI content (presently eight courses). In fall 2004, we formed a teaching group to bridge learning between our courses. Our group includes faculty from all ranks (full professors, associate professors, assistant professors, and instructors), along with an assistant professor from the College of Education with expertise in science education, and several graduate students with a strong interest in teaching who have joined us for various projects (http://www.life.umd.edu/hpi/).
Our goal was to create bridges that would eliminate excessive overlap in our offerings and support a model in which concepts and ideas introduced in one course become the foundation for concept development in successive courses. Our first task was to develop a list of 13 HPI concepts fundamental to an understanding of HPI. We used these concepts to guide the learning progression in our sequence of courses, so that students moving from the prerequisite course to more advanced courses would develop a deeper understanding of HPI (5, 18, 19, 24). We chose two “anchor” organisms to be used as exemplars of fundamental HPI concepts in all of our courses. In addition, we worked together to incorporate well-documented teaching strategies into the classroom and designed active-learning activities that address our concepts (15).
To assess how well our courses support the understanding of fundamental principles as defined by our 13 HPI concepts, we developed an HPI concept inventory. This paper describes the multistep collaborative process of building the inventory, assessing its use, and evaluating student performance through administration of the inventory.
The goal of a concept inventory, or conceptual diagnostic test, is to assess student understanding of basic concepts in a discipline (1, 7, 27). Concept inventories have been developed to target discipline-specific knowledge and are designed in a multiple-choice format (1, 7, 8, 11, 20). Selection of the correct response to a multiple-choice question reveals a student’s understanding of a basic concept, whereas selection of an incorrect response, or distractor, suggests that the student holds a commonly held alternate conception (27).
Physics educators have considerably altered the way physics is taught in response to student performance on the Force Concept Inventory (8, 10, 11). Their demonstration of the value of investigating what students know about fundamental concepts has encouraged several groups of biologists and chemists to develop concept inventories (1, 7, 13, 17, 20). Many concept inventories have been placed online (Field-Tested Learning Assessment Guide, http://www.flaguide.org; Bioliteracy Project, http://bioliteracy.net/).
Concept inventories are built with a multiple-choice format to allow administration to large numbers of students as well as efficient and objective scoring (27). In concept inventories, the wording of the question and response choices is based on extensive research that includes a thorough review of questions and answers to capture all possible interpretations of wording and all possible correct and incorrect answers. In its final form, each question in the concept inventory consists of one correct answer and multiple distractors. The distractors are incorrect answers based on commonly held alternate conceptions.
“Alternate conceptions” as described by Fisher (6) are ideas that differ from corresponding scientific explanations. As defined, alternate conceptions are usually held by a significant proportion of students and are highly resistant to instruction (6, 17). Alternate conceptions have previously been referred to as “misconceptions”; however, researchers find that the more positive term “alternate conception” recognizes that these conceptions can be used as anchors (4, 21) from which to move to a scientific conception when targeted instructional strategies are developed. The most important role for concept inventories is to provide instructors with an understanding of student alternate conceptions and ideas that may be actively interfering with learning.
The science education literature offers a large body of research that describes students’ alternate conceptions in different scientific topics at different age levels (17). In some cases, the concept inventory could be built based on an existing database for alternate conceptions; however, in other cases, such as in the HPI area, there are very few references in the current literature. Therefore, in order to build an HPI concept inventory, we began with the detailed process of identifying the alternate conceptions that students hold.
Our approach was similar but not identical to the two-tier method advocated by Treagust (26), Anderson et al. (1), Khodor et al. (13), and Odom and Barrow (20). The two-tier survey consisted of multiple-choice questions, each followed by a free response prompt. The two-tier method is attractive because it separates factual knowledge (tier 1, facts) from reasons for choosing a particular fact (tier 2, mechanisms and beliefs). The final product was an 18-question multiple-choice concept inventory.

MATERIALS AND METHODS

The model system for learning and courses involved. Our initiative involves eight HPI undergraduate courses. General Microbiology (600 students/year) serves as a prerequisite for all seven other upper-level courses: Pathogenic Microbiology (120 students/year), Microbial Pathogenesis (25 students/year), Microbial Genetics (80 students/year), Immunology (100 students/year), Immunology Lab (80 students/year), Epidemiology (100 students/year), and Bioinformatics (30 students/year). Our teaching group met monthly, with an average attendance of 13 members. It was decided that to help students build bridges between content presented in the various courses, we would link discussion of host-pathogen interactions in all courses to two organisms, Escherichia coli and Streptococcus sp. Further, it was decided that each course should include methods that would expose students to and engage them in the scientific research process (15). Alongside these efforts, we developed the HPI concept inventory to evaluate our progress.
Constructing the concept inventory. Mixed methods of qualitative and quantitative approaches were used in developing the HPI concept inventory. The following steps were followed in designing, developing, implementing, and evaluating the HPI concept inventory.
(i) Developing a first version of the survey. To build questions that could be used to evaluate students’ level of thinking and understanding in each course, we considered the work of Bloom (2) and Mayer (16). We discussed the characteristics of questions that reflected rote learning as opposed to meaningful learning, and we learned how to write questions that could reliably assess a deeper level of understanding. Each faculty member submitted two questions that he or she thought a student should be able to answer at the completion of the course. We rated these according to cognitive level (2, 16), and we devised a tool that targeted the HPI concepts. We piloted the tool in three courses. After analysis of the results, we felt that we had learned quite a bit, but our tool was not yet meeting our needs. Our concerns were the following:
a. The approach to the development of the tool was too individualized. The questions were written by distinct faculty members and merged.
b. There seemed to be large gaps in the content assessed.
c. We did not know how this tool could be used to monitor students’ development in meaningful understanding of HPI concepts.
(ii) Defining the content boundaries of the survey. As a group, we considered this question: “What do we want our students to truly understand and remember 5 years after they have completed the set of our courses?” Accordingly, we developed a list of 13 HPI concepts (Table 1). We aimed at concepts that we believe are required for understanding HPI at a level of sophistication appropriate for microbiology majors. Content validity of the concepts was established by our complete HPI group.
TABLE 1. The 13 HPI concepts (the big ideas for our project)

1. The structural characteristics of a microbe are important in the pathogenicity of that microbe.
2. Diverse microbes use common themes to interact with the environment (host).
3. Microbial evolution is subject to forces of natural selection. Important consequences include changes in virulence and antibiotic resistance.
4. Microbes adapt and respond to the environment by altering gene expression.
5. Microbes have various strategies to cause disease.
6. Pathogens and hosts have evolved in a mutual fashion.
7. The cell wall and the cell membrane affect the bacterial response to the environment.
8. There is a distinction between a pathogen and a nonpathogen.
9. The environment will affect the phenotype (pathogenicity) of a bacterium.
10. Microbes adapt and respond to the environment by altering their metabolism.
11. Immune response has evolved to distinguish between self and nonself.
12. Immune response recognizes general properties (common themes versus specific attributes, innate versus adaptive).
13. Immune response memory is specific.
(iii) Developing a two-tier survey. Based on the HPI concept list, we developed a 23-item multiple-choice survey with free-response answers. With the free, open-ended responses, we aimed to uncover students’ alternate conceptions, which would later be used as distractors in the final multiple-choice survey. Therefore, each question had two tiers. The first tier consisted of a question with two to five choices; there could be more than one correct answer. The second tier consisted of a request for explanation (explain your answer or defend your response). Each question covered one or more concepts from the HPI concept list (Table 2). The 23 questions were piloted with a small focus group of two graduate students and two undergraduate students. Results from this focus group were analyzed by the HPI teaching team, and the 23 questions were consolidated into 18 two-tier questions. To establish content validity (25), we provided the draft instrument for inspection to our science content experts and a science pedagogy expert.
TABLE 2. HPI concepts addressed in two-tier survey

| Question number | Question | HPI concept addressed |
|---|---|---|
| 1A | Selection of an antibiotic resistant organism is based upon a change in the (a) phenotype (b) genotype (c) both (d) neither (e) either | 3, 4, 10 |
| 1B | Defend your response. | |
| 2A | What determines a Gram stain reaction? (a) Distinction relating to bacterial structure (b) Distinction relating to bacterial function (c) Both | 1 |
| 2B | Defend your response. | |
(iv) Obtaining information about students’ alternate conceptions. In the spring of 2006, the 18-question assessment was distributed via our course management system to 200 students in General Microbiology (the introductory course) and 60 students in Bacterial Genetics (one of our advanced HPI courses). To limit the time requirement for the students in this pilot, only five questions were given to each student. For each question, we received about 60 responses from the General Microbiology course and 20 responses from the advanced course. The student responses were collated and reviewed by our HPI faculty as a group. We met to score student responses for alternate conceptions and then to develop multiple-choice questions that use commonly held alternate conceptions as distractors. Tables 3A and 3B show an example of the analysis of a two-tier question. For each question, we first counted (quantitative analysis) the number of students selecting each choice in the multiple-choice part of the question (first tier). Note that, depending upon how students chose to defend their response, there could be more than one correct option among the multiple choices. Sometimes a student answered the first-tier question correctly but produced an incorrect explanation, and vice versa. This survey was not used to determine student course grades, but students who participated were awarded extra credit points; our interest was in finding what alternate conceptions students held.
TABLE 3A. Results from analysis of answers to a first-tier multiple-choice question

| Question and answers | General Microbiology (n = 68) | Bacterial Genetics (n = 25) |
|---|---|---|
| 1. Selection of an antibiotic resistant organism is based upon a change in the: | | |
| a. Phenotype | 1 | 7 |
| b. Genotype | 38 | 4 |
| c. Both | 25 | 13 |
| d. Neither | 0 | 1 |
| e. Either | 4 | 0 |
TABLE 3B. Results from analysis of student open-ended responses to the second-tier prompt: “Defend your response”

| Major categories of students’ responses^a | General Microbiology | Bacterial Genetics |
|---|---|---|
| Excellent response | 21 | 16 |
| Basic response, more required to indicate higher understanding | 9 | 0 |
| Students didn’t understand that selection is based on phenotypes | 28 | 1 |
| Student responses indicated that they did not understand that a change in phenotype is due to a change in genotype | 3 | 6 |
| Alternate conception was with the understanding of the differences between genotype and phenotype | 9 | 1 |
| Either student did not answer question or student response was completely off the mark | 3 | 0 |

^a Students’ open-ended responses were grouped in major categories.
In order to define categories (qualitative analysis) for the second-tier responses (“defend your response”), we decided to use the technique of Hodder, Ebert-May, and Batzli (12). We formed three small groups of three instructors each. Each group received five or six questions to analyze. For each question, the group read all of the answers and established categories (level of correctness and alternate conceptions). Then, each member went through each response and categorized it. Finally, the three members of the group compared their ratings and discussed responses to reach a consensus for each student response. Below are two examples of common alternate conceptions for question 1.
Question 1. Selection of an antibiotic resistant organism is based upon a change in the (a) phenotype (b) genotype (c) both (d) neither (e) either.
a. Students didn’t understand that selection is based on phenotypes. One student who selected “(b) genotype” wrote: “When an organism becomes resistant to antibiotics (when it acquires an antibiotic-resistant gene that has been inserted as a marker), the organism’s genotype has been changed.”
b. Alternate conception was with understanding of the differences between genotype and phenotype. The student wrote “This must be a change in the genotype because having antibiotic resistance will not necessarily change the look of an organism (phenotype). It will merely allow it to survive in situations where the antibiotic is present.”
(v) Developing a multiple-choice concept inventory. Following the analysis of all questions, each group built two multiple-choice questions for the final assessment tool, the HPI concept inventory. These questions usually included the opening sentence or sentences of the previous question and four or five choices of response: one correct answer and three or four distractors that reflect the students’ alternate conceptions revealed in the analysis. For example, one question developed from the information that is presented in Table 3 was the following:
The selection of antibiotic-resistant, transformed bacteria is based upon a change in the:
A. phenotype of the bacteria.
B. genotype of the bacteria.
C. phenotype and genotype of the bacteria.
D. genotype and physiology of the bacteria.
E. genotype and morphology of the bacteria.

RESULTS

In fall 2006 and spring 2007, we administered the 18-question concept inventory in six of our courses. Participation in the survey was voluntary for the students. Students who participated were provided extra credit points. We requested permission from the students to use their responses for our research. Only data from students who gave permission were analyzed in the study. We collected pre- and postcourse surveys from 477 students (gender: 69% females, 31% males; ethnicity: 46% white, 26% Asian, 11% African American, 7% Hispanic, 10% other) with the following course distribution:
General Microbiology (BSCI 223), fall 2006, 127 students
Pathogenic Microbiology (BSCI 424), fall 2006, 96 students
General Microbiology (BSCI 223), spring 2007, 109 students
Bacterial Genetics (BSCI 412), spring 2007, 45 students
Immunology (BSCI 422), spring 2007, 48 students
Epidemiology (BSCI 425), spring 2007, 52 students
Student performance on concept inventory. Table 4 shows average scores for pre- and postcourse concept inventories. Each correct response was weighted 1 point; because we removed two questions from the analysis (see explanation below), the maximum possible score on the concept inventory was 16. An inspection of these data shows that in both semesters of General Microbiology, the prerequisite course, the pre- and postcourse scores are similar. This is an important finding, as different instructors taught the course in the fall and spring semesters. For future analysis, we can treat these courses as comparable. Encouragingly, using t test analysis, we found that in four of our courses (both BSCI 223 sections, BSCI 424, and BSCI 422) there was significant improvement in concept inventory scores from presurvey to postsurvey. Moreover, students taking the presurvey in the advanced courses retained the level of understanding gained in the prerequisite course (scores on the BSCI 223 postsurvey are around 7.0, and scores on all presurveys in the advanced courses are around 7 or greater).
TABLE 4. Average scores on the pre- and postcourse concept inventory^a

| Pre or post | General Microbiology, fall 2006 (n = 127) | Pathogenic Microbiology, fall 2006 (n = 96) | General Microbiology, spring 2007 (n = 109) | Bacterial Genetics, spring 2007 (n = 45) | Immunology, spring 2007 (n = 48) | Epidemiology, spring 2007 (n = 52) |
|---|---|---|---|---|---|---|
| Pre | 4.9 | 7.3 | 4.7 | 7.8 | 9.2 | 6.6 |
| Post | 7.0^b | 8.7^b | 7.3^b | 7.6 | 9.9^c | 6.6 |

^a Each correct response was weighted 1 point. The maximum number of points was 16. Values were calculated without data from questions 8 and 13.
^b P < 0.001.
^c P < 0.05.
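The pre/post comparisons reported in Table 4 can be reproduced with any standard two-sample t test. The paper does not state which variant was used, so the sketch below is an assumption: it uses Welch’s unequal-variance form, implemented with only the Python standard library, and `welch_t` is an illustrative name rather than anything from the study.

```python
import math
from statistics import mean, variance

def welch_t(pre_scores, post_scores):
    """Welch's two-sample t statistic for unmatched pre/post groups.

    Assumption: the pre- and postcourse surveys are treated as independent
    samples; a paired test would apply instead if responses were matched
    student by student.
    """
    n1, n2 = len(pre_scores), len(post_scores)
    v1, v2 = variance(pre_scores), variance(post_scores)  # sample variances
    standard_error = math.sqrt(v1 / n1 + v2 / n2)
    return (mean(post_scores) - mean(pre_scores)) / standard_error
```

The resulting t statistic would then be compared against the t distribution (with Welch-Satterthwaite degrees of freedom) to obtain P values such as those shown in the table footnotes.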
Table 5 shows percentages of correct answers for each question in the pre- and postcourse surveys. The table contains a substantial amount of data. Because our project involves eight different courses, our first task was to determine how students in different courses would respond to each question. We looked at the percentages of correct answers for each question in each course and examined the relationship of students’ overall scores to their ability to choose the correct response to a specific question (the discrimination factor).
TABLE 5. Percentages of correct answers on pre- and postcourse concept inventory

| Question | Pre or post | General Microbiology, fall 2006 (n = 127) | Pathogenic Microbiology, fall 2006 (n = 96) | General Microbiology, spring 2007 (n = 109) | Bacterial Genetics, spring 2007 (n = 45) | Immunology, spring 2007 (n = 48) | Epidemiology, spring 2007 (n = 52) |
|---|---|---|---|---|---|---|---|
| 1 | Pre | 25 | 32 | 12 | 35 | 56 | 36 |
| | Post | 30 | 24 | 26 | 44 | 67 | 42 |
| 2 | Pre | 9 | 25 | 3 | 20 | 37 | 15 |
| | Post | 24 | 46 | 21 | 29 | 35 | 17 |
| 3 | Pre | 18 | 33 | 21 | 49 | 48 | 31 |
| | Post | 27 | 40 | 33 | 60 | 50 | 38 |
| 4 | Pre | 87 | 87 | 77 | 87 | 92 | 85 |
| | Post | 88 | 88 | 87 | 84 | 87 | 90 |
| 5 | Pre | 7 | 24 | 10 | 31 | 50 | 23 |
| | Post | 29 | 26 | 28 | 33 | 65 | 31 |
| 6 | Pre | 60 | 82 | 65 | 80 | 79 | 65 |
| | Post | 78 | 85 | 83 | 71 | 77 | 69 |
| 7 | Pre | 28 | 42 | 13 | 38 | 56 | 40 |
| | Post | 42 | 57 | 29 | 33 | 52 | 31 |
| 8 | Pre | 17 | 23 | 16 | 13 | 12 | 23 |
| | Post | 17 | 25 | 32 | 16 | 12 | 6 |
| 9 | Pre | 23 | 59 | 25 | 47 | 50 | 48 |
| | Post | 46 | 63 | 56 | 49 | 56 | 36 |
| 10 | Pre | 41 | 71 | 40 | 73 | 75 | 58 |
| | Post | 61 | 78 | 78 | 73 | 69 | 58 |
| 11 | Pre | 18 | 45 | 23 | 56 | 48 | 42 |
| | Post | 39 | 59 | 38 | 53 | 60 | 50 |
| 12 | Pre | 62 | 45 | 64 | 62 | 52 | 42 |
| | Post | 49 | 65 | 72 | 51 | 50 | 35 |
| 13 | Pre | 11 | 17 | 9 | 27 | 23 | 21 |
| | Post | 12 | 26 | 11 | 9 | 21 | 15 |
| 14 | Pre | 28 | 41 | 37 | 47 | 48 | 33 |
| | Post | 35 | 52 | 43 | 38 | 52 | 33 |
| 15 | Pre | 10 | 30 | 12 | 29 | 69 | 31 |
| | Post | 40 | 38 | 20 | 27 | 75 | 36 |
| 16 | Pre | 19 | 36 | 17 | 33 | 48 | 23 |
| | Post | 34 | 48 | 32 | 24 | 60 | 21 |
| 17 | Pre | 43 | 57 | 38 | 60 | 79 | 60 |
| | Post | 65 | 60 | 57 | 58 | 81 | 50 |
| 18 | Pre | 13 | 24 | 17 | 36 | 46 | 29 |
| | Post | 16 | 40 | 20 | 31 | 58 | 25 |
We grouped students’ scores on the inventory into categories: scores in the top 25% were placed in the “high” category, scores in the middle 50% in the “medium” category, and scores in the bottom 25% in the “low” category. The discrimination values (range, 0 to 1) were calculated for each question. A value below 0.30 meant that even students who did well on the concept inventory as a whole (the high-performance group) performed poorly on that question. We reviewed all questions administered in each class and found that in every class, two questions (8 and 13) provided poor discrimination; no one did well (Table 5).
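The calculation described above can be sketched in code. The paper does not give its exact formula, so this is an assumption: a classical discrimination index consistent with the description (fraction correct in the top-quartile group minus fraction correct in the bottom-quartile group), with `discrimination_index` and the toy data being purely illustrative.

```python
def discrimination_index(responses, item):
    """Classical discrimination index for one inventory item.

    responses: one dict per student mapping item id -> 1 (correct) or
    0 (incorrect). Students are ranked by total score; the top 25% form
    the "high" group and the bottom 25% the "low" group, mirroring the
    grouping described in the text. (This exact formula is an assumption;
    the paper does not state which definition it used.)
    """
    ranked = sorted(responses, key=lambda r: sum(r.values()), reverse=True)
    k = max(1, len(ranked) // 4)               # quartile group size
    high, low = ranked[:k], ranked[-k:]
    p_high = sum(r[item] for r in high) / k    # fraction correct, high group
    p_low = sum(r[item] for r in low) / k      # fraction correct, low group
    return p_high - p_low
```

With this definition, a value near 1 means high scorers answer the item correctly while low scorers do not, and a value below about 0.30 would flag the item for review, as the authors did for questions 8 and 13.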
Questions 8 and 13 were designed to address issues regarding bacterial metabolism (Table 1, concept 10). The group reviewed and discussed the importance of concept 10 and analyzed the clarity and specific student alternate understandings addressed in both questions 8 and 13. As a result of this discussion, question 8 was reworded. Question 13 was left as is; it was decided that the question wording was clear. We feel that in fact even our best students do not understand this concept. This was truly an excellent question as it revealed to us a gap in our curriculum.

DISCUSSION

The idea of a concept inventory began with the Force Concept Inventory, which was developed to measure students’ conceptual understanding of motion and force (11). Similar multiple-choice concept inventories have more recently been developed for the assessment of student learning in other areas, including chemistry, biology, astronomy, statistics, engineering, and geosciences (1, 13, 17, 20, 22). As a group of instructors who care about their teaching and have taken on the challenge of creating a cohesive set of courses that result in meaningful learning of HPI concepts, we sought to assess the effectiveness of our teaching efforts. We believe that the process of constructing a reliable concept inventory as a group has had great value not only in the final product but also in the conversations about teaching and learning within our group. Together we worked to articulate the most important concepts for undergraduates to grasp in order to develop meaningful learning of HPI (the “big ideas”). Then, through collaborative efforts, we built the HPI concept inventory to assess student progress in our courses. Through the review of student answers and comments on the HPI concept inventory, we have developed a deeper understanding of how students perceive HPI concepts. As was observed by Hestenes and Halloun (9) in an interpretation of results from the Force Concept Inventory, student responses to the HPI concept inventory surprised each of us (i.e., “how could my students miss that?”). Reading and discussing students’ choice of answers and their accompanying comments gave us very specific information that we plan to use for course development.
Through the implementation of the concept inventory in six course offerings, we generated a substantial data set. Our goal with these data was to evaluate our questions as indicators of student understanding of HPI concepts. This paper reports our review of the data, targeted toward understanding how students in our microbiology courses respond to the 18 concept inventory questions. As is the procedure in generating a valid and reliable set of concept inventory questions (14), we reviewed the questions to see how students in each of our courses responded. We found two questions with very poor discrimination factors. One question was determined to be worded poorly; the other provided an indication of a gap in our curriculum.
We did not expect that our students would have perfect scores on the concept inventory. The Immunology lecture is generally the last course taken by students, in the spring of their senior year; the average concept inventory score following that course was 9.9 out of a possible 16. As we continue to work on the development of the concept inventory and on our curriculum initiatives, we will monitor student scores as indicators of our course development progress.
The first distribution of the concept inventory yielded notable findings. For the General Microbiology course, we found a significant increase in student learning as measured by the HPI concept inventory. This increase was consistent across two semesters in which the course was taught by different instructors. General Microbiology is taught in an active-learning course format (23). This result suggests that the learning gains associated with this format are instructor independent.
Further we found that the presurvey scores for advanced courses were not significantly different from postsurvey scores for students completing the prerequisite General Microbiology course. We attribute this retention of learning to the active-learning strategies that have been in place in General Microbiology for several years (23). Research widely supports the claim that students learn best when they actively participate and are engaged in their learning. When students learn actively, they retain more content for a longer time and are able to apply that material in a broader range of contexts (3).
Finally, in Pathogenic Microbiology and Immunology, two advanced courses, we observed significant improvement in postsurvey scores relative to presurvey scores. As we move forward, we will concentrate on exploring the range of HPI concepts addressed in each course. The detailed analysis of student performance on each question in each course will help us determine and/or create effective methods for meaningful student learning in each.
We believe that the concept inventory will serve as a useful tool to monitor our course development initiatives. Curriculum initiatives under development include adding and adapting published active-learning activities to our courses such as clicker questions, case studies, and team projects, and developing teaching tools that bring our research interests into the classroom in authentic ways. Building the concept inventory required the collaborative efforts of research and teaching faculty. We believe that this team approach could be a model for others working on curriculum and assessment projects.

Acknowledgments

This research was supported in part by a grant to the University of Maryland from the Howard Hughes Medical Institute through the Undergraduate Science Education Program. This work has been approved by the Institutional Review Board as IRB 060140. Special thanks to Katerina V. Thompson for editorial comments and long-standing support of the HPI teaching group. We also thank Laura Cathcart, science education graduate student, for feedback from the student point of view and Katherine C. McAdams for help with statistics.

REFERENCES

1. Anderson DL, Fisher KM, Norman GJ. 2002. Development and evaluation of the conceptual inventory of natural selection. J Res Sci Teach 39:952-978.
2. Bloom BS. 1984. Taxonomy of educational objectives. Handbook 1: cognitive domain. Longman, New York, NY.
3. Bransford JD, Brown AL, Cocking RR. 2000. How people learn: brain, mind, experience, and school. National Academy Press, Washington, DC.
4. Clement J, Brown DE, Zietman A. 1989. Not all preconceptions are misconceptions: finding “anchoring conceptions” for grounding instruction on students’ intuitions. Int J Sci Educ 11:554-565.
5. Duschl RA, Schweingruber HA, Shouse AW. 2007. Taking science to school: learning and teaching science in grades K-8. The National Academies Press, Washington, DC.
6. Fisher KM. 1983. Amino acids and translation: a misconception in biology, p 407-419. In Helm H, Novak JD (ed), Proceedings of the international seminar misconceptions in science and mathematics. Cornell University, Ithaca, NY.
7. Garvin-Doxas K, Klymkowsky MW. 2008. Understanding randomness and its impact on student learning: lessons from the biology concept inventory (BCI). CBE Life Sci Educ 7:227-233.
8. Hake RR. 1998. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66:64-74.
9. Hestenes D, Halloun I. 1995. Interpreting the force concept inventory: a response to Huffman and Heller. Phys Teach 33:502-506.
10. Hestenes D, Wells M. 1992. A mechanics baseline test. Phys Teach 30(3):159.
11. Hestenes D, Wells M, Swackhamer G. 1992. Force concept inventory. Phys Teach 30(3):141-158.
12. Hodder J, Ebert-May D, Batzli J. 2006. Coding to analyze students’ critical thinking. Front Ecol Environ 4:162-163.
13. Khodor J, Gould Halme D, Walker GC. 2004. A hierarchical biology concept framework: a tool for course design. Cell Biol Educ 3:111-121.
14. Klymkowsky MW, Garvin-Doxas K. 2008. Recognizing student misconceptions through Ed’s Tools and the biology concept inventory. PLoS Biol 6(1):e3. https://doi.org/10.1371/journal.pbio.0060003.
15. Marbach-Ad G, Briken V, Frauwirth K, Gao L, Hutcheson S, Joseph S, Mosser D, Parent B, Shields P, Song W, Stein D, Swanson K, Thompson K, Yuan R, Smith AC. 2007. A faculty team works to create content linkages among various courses to increase meaningful learning of targeted concepts of microbiology. Cell Biol Educ 6:155-162.
16. Mayer RE. 2002. Rote versus meaningful learning. Theory Pract 41:226-232.
17. Mulford DR, Robinson WR. 2002. An inventory for alternate conceptions among first-semester general chemistry students. J Chem Educ 79(6):739-744.
18. National Assessment Governing Board. 2006. Science assessment and item specifications for the 2009 National Assessment of Educational Progress. National Assessment Governing Board, Washington, DC.
19. National Research Council. 2006. Taking science to school: learning and teaching science in grades K-8. National Academy Press, Washington, DC.
20. Odom AL, Barrow LH. 1995. Development and application of a two-tier diagnostic test measuring college biology students’ understanding of diffusion and osmosis after a course of instruction. J Res Sci Teach 32:45-61.
21. Redish EF. 2003. Teaching physics with the physics suite. John Wiley & Sons, Inc., Hoboken, NJ.
22. Rhoads TR, Roedel RJ. 1999. The wave concept inventory: a cognitive instrument based on Bloom’s taxonomy. Proceedings of the 1999 Frontiers in Education Conference, San Juan, Puerto Rico. Stipes Publishing LLC, Champaign, IL. http://www.fie-conference.org/fie99/.
23. Smith AC, Stewart R, Shields P, Hayes-Klosteridis J, Robinson P, Yuan R. 2005. Introductory biology courses: a framework to support active learning in large enrollment introductory science courses. Cell Biol Educ 4:143-156.
24. Smith C, Wiser M, Anderson CW, Krajcik J. 2006. Implications of research on children’s learning for assessment: a proposed learning progression for matter and the atomic molecular theory. Meas Interdiscip Res Perspect 4:1-98.
25. Smith ML, Glass GV. 1987. Research and evaluation in education and the social sciences. Prentice Hall, Englewood Cliffs, NJ.
26. Treagust DF. 1988. Development and use of diagnostic tests to evaluate students’ misconceptions in science. Int J Sci Educ 10:159-169.
27. Zeilik M. Classroom assessment techniques: conceptual diagnostic tests. http://www.flaguide.org/cat/diagnostic/diagnostic5.php.

Published In

Journal of Microbiology & Biology Education
Volume 10, Number 1, 2009
Pages: 43–50
PubMed: 23653689

History

Published online: 17 May 2009


Contributors

Authors

Gili Marbach-Ad [email protected], Volker Briken, Najib M. El-Sayed, Kenneth Frauwirth, Brenda Fredericksen, Steven Hutcheson, Lian-Yong Gao, Sam Joseph, Vincent T. Lee, Kevin S. McIver, David Mosser, B. Booth Quimby, Patricia Shields, Wenxia Song, Daniel C. Stein, Robert T. Yuan, and Ann C. Smith [email protected]

All authors: Department of Cell Biology and Molecular Genetics, College of Chemical and Life Sciences, University of Maryland, College Park, Maryland 20742
