These notes come from Punch's text on Social Research.
Operationalize each concept/variable. This means connecting each concept to empirical (observable) indicators!
Concept = knowledge change
Indicators = adding, inserting, or deleting concepts
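Just to make this concrete for myself: if each concept map is treated as a set of concept labels, the add/delete indicators fall out of simple set differences. A toy sketch in Python (the map contents here are invented, not real data):

```python
# Toy sketch: operationalizing "knowledge change" as countable indicators.
# The concept labels below are made up for illustration only.
pre_map = {"information", "knowledge", "library"}
post_map = {"information", "knowledge", "metadata", "retrieval"}

added = post_map - pre_map    # concepts that appear only on the post-map
deleted = pre_map - post_map  # concepts that disappeared from the pre-map

print(sorted(added))    # ['metadata', 'retrieval']
print(sorted(deleted))  # ['library']
```

So "knowledge change" (the concept) becomes counts of added and deleted concepts (the indicators).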
The purpose of conceptual frameworks is to show the concepts, their conceptual status in relation to one another, and the hypothesized relationships.
Not all quantitative studies need a hypothesis, only those that clearly reflect a theory. So, if THEORY, then HYPOTHESIS.
That's all for now...now back to work on my proposal.
Friday, November 12, 2010
Wednesday, November 10, 2010
Observing Becca's class
Today I observed another doctoral student teach an undergrad course in the school of information sciences. I want to list some of the remarkable strategies she used. Becca is a natural instructor! Her 50-minute session was student-centered and memorable.
1. At the beginning of the class, she stated the core concepts and then said "why is this important?" Then, she answered this question with an example.
2. Her powerpoint slides were like guided notes. There were blank spaces that she filled in during the lecture using custom design. This supported learning and student engagement.
3. The slides were short and sweet. They illustrated important, concise concepts. She didn't talk from her slides.
4. Becca added LOTS of real examples. Many examples to illustrate concepts. The examples were meaningful to college students...like JCrew's website.
5. She asked lots of probing questions. Her lecture was a lot like a discussion.
6. She showed good and bad examples of website designs.
I want to watch Becca teach again!
Saturday, November 6, 2010
Why environmental literacy?
Some big picture thoughts/conclusions based on the literature review conducted for SIS 495. I want to capture these ideas!
-INTERDISCIPLINARY: Interdisciplinary nature of environmental sciences matches the interdisciplinary nature of information science.
-COMPLEXITY: Complex environmental issues and data necessitate literacy.
-PARTICIPATION: Environmental issues are participatory and involve citizens in things like data collection (citizen science) and policy decisions (voting). Generation Y (today's college students) is known for its spirit of volunteerism. Higher education is a good fit for a course enabling informed participation in environmental issues.
-STANDARDS: Science standards map to ACRL information literacy competencies. There is a clear similarity. See Manuel, K. (2004) for a comparison chart.
-MANDATING LEGISLATION: National legislation passed in 1992.
-VARIABLES: To assess environmental literacy, look at knowledge, skills, and participation. Must operationalize these concepts. Relate the concept to empirical indicators.
Friday, November 5, 2010
Some thoughts on the big picture of my research proposal
Okay, this whole research process has been fuzzy and vague. I'm reading a book called Introduction to Social Research by Punch. It's been a helpful read to get to the bottom of research, what it's all about, and how to go about doing it. Help at last!
By the end of my first semester of doctoral studies, I will have written a quantitative research proposal, a daunting and huge task considering I've never done ANY research before!
Well, here goes an attempt to state simply what it is that I want to know.
The assumption - we know knowledge structures change with the incorporation of new information. This is a cognitive dimension of information use: using information cognitively to construct or modify knowledge. Brookes's fundamental equation of information science, K[S] + ΔI = K[S + ΔS], supports that our knowledge structures change.
Research area - information behavior
Research topic - cognitive information use
Gap - It's hard to observe and measure this process of knowledge change, and few studies have attempted it.
What I want to find out (General RQs) - So, I want to know if pre- and post-information concept maps will measure knowledge change. Do they provide a lens to observe this? And how do knowledge structures change, as evidenced in pre- and post-concept maps?
Aim - further understanding of the cognitive aspects of information use, that is, how information is used to build and modify knowledge (knowledge change).
Conceptual framework - Similarity of Brookes's equation and concept mapping.
Hypothesis - Thus, following Brookes's theory, I propose that concept maps will be an effective tool to measure knowledge change.
Specific RQs:
RQ1 Are concepts added or deleted?
RQ2 Are linking words (between concepts) added or deleted?
RQ3 Are cross-links (across domains) added or deleted?
RQ4 Are concepts inserted? (this changes the structure)
RQ5 Are concepts moved around? (this changes the structure)
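A note to self on how RQ2 and RQ3 might be operationalized: a concept map can be treated as a set of propositions, i.e. (concept, linking words, concept) triples, and linking changes again fall out of set differences. A toy sketch (all map content is invented):

```python
# Toy sketch: a concept map as a set of propositions
# (concept, linking words, concept). Contents are invented.
pre = {
    ("information", "modifies", "knowledge structure"),
}
post = {
    ("information", "modifies", "knowledge structure"),
    ("concept map", "represents", "knowledge structure"),  # new link
}

added_links = post - pre
deleted_links = pre - post
print(len(added_links), len(deleted_links))  # 1 0
```

Cross-links (RQ3) would work the same way, just restricted to triples whose two concepts sit in different domains of the map.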
Research design - single-subject experiment; repeated measures
O0 ---- X ---- O1 ---- X ---- O2
O0 = pre-map (current knowledge)
X = intervention (information)
O1 = post-map 1 (knowledge change 1)
X = intervention (information)
O2 = post-map 2 (knowledge change 2)
Data collection -
Data analysis - Is there a significant difference between the maps (t-tests) in terms of:
-Concepts (RQ1)
-Linking words (RQ2)
-Cross links (RQ3)
-Map structure (RQ4, RQ5)
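To remind myself what the t-test step would actually look like: with several pre/post map pairs, a paired t-test on concept counts (RQ1) can be computed from the per-pair differences. A toy sketch using only the standard library; the counts are invented, not data:

```python
# Toy sketch of the analysis step: paired t-test on concept counts
# across pre/post maps. All numbers are hypothetical.
import math
from statistics import mean, stdev

pre_counts = [8, 10, 7, 9, 11]    # concepts on each pre-map (invented)
post_counts = [12, 13, 9, 12, 15] # concepts on each post-map (invented)

diffs = [post - pre for pre, post in zip(pre_counts, post_counts)]
n = len(diffs)

# t = mean difference / standard error of the differences
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
print(round(t, 2))  # 8.55
```

The resulting t would then be checked against the critical value for n - 1 degrees of freedom. The same calculation would repeat for linking words, cross-links, and the structure measures.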