Psychology Lecture Notes

Published: February 14, 2013
Construct
Label for a theoretical dimension with which people differ
e.g., anger
Meaning of the construct in abstract
Defining the construct in a way that is measurable
Acquiring Knowledge- Methods
Empirical- based on experience and observations
Objective- everyone perceives the same way
Anxiety
Unease and distress related to uncertainty about the future
e.g., self-report, sweaty palms, heart rate
Characteristics of Science and the Scientific approach
Control
Tentative
Self-correcting/replication
Progressive
Theory-driven
Objective
Empirical
Parsimonious
Conceptual vs. Operational Definitions- Measuring Anxiety
Conceptual definition- meaning of the variable, stated more abstractly or generally
Operational definition- how the variable is measured or observed
Operationalization/Operational definition
Different people define "taking off on time" differently
You, the customer
Bureau of Transportation Statistics
FAA
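The three parties above can be read as three operational definitions of one construct. A minimal sketch in Python (the 15-minute cutoff is the commonly cited Bureau of Transportation Statistics convention; the customer and FAA definitions are illustrative assumptions, not from the lecture):

```python
# One construct ("departing on time"), three operational definitions.

def on_time_customer(delay_min: float) -> bool:
    # Illustrative assumption: a customer counts any delay at all as "late".
    return delay_min <= 0

def on_time_bts(delay_min: float) -> bool:
    # Commonly cited BTS convention: departures within 15 minutes
    # of schedule count as on time.
    return delay_min < 15

def on_time_faa(gate_delay_min: float, runway_delay_min: float) -> bool:
    # Illustrative assumption: an operational definition that tracks
    # gate delay and runway delay separately.
    return gate_delay_min < 15 and runway_delay_min < 15

# The same flight, 10 minutes late, is "late" by one definition
# and "on time" by another.
print(on_time_customer(10), on_time_bts(10))  # False True
```

Same construct, different numbers: which definition you operationalize determines what the data say.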
Measuring Job experience- chart on PowerPoint- chapter one
Consider
Consumer vs producer of research
Basic vs applied research
Peer review cycle
Journal-to-journalism cycle

Characteristics of an Experiment
IV manipulated
Common design
Pretest assessment of DV- establishes a baseline
Intervention/treatment IV
Posttest assessment of DV
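The pretest–intervention–posttest design above can be sketched as a small simulation (hypothetical numbers; the lecture names the design, not these values):

```python
# Pretest -> manipulated IV (treatment vs. control) -> posttest of DV.
import random

random.seed(3)
n = 200

# Pretest: baseline measurement of the DV for every participant.
pretest = [random.gauss(50, 10) for _ in range(n)]

# Manipulated IV: random assignment to treatment or control.
group = [random.choice(("treatment", "control")) for _ in range(n)]

# Posttest: baseline plus noise, with an assumed +5 effect of the treatment.
posttest = [pre + (5 if g == "treatment" else 0) + random.gauss(0, 5)
            for pre, g in zip(pretest, group)]

def mean_change(which: str) -> float:
    # Average posttest - pretest change within one group.
    diffs = [post - pre
             for pre, post, g in zip(pretest, posttest, group)
             if g == which]
    return sum(diffs) / len(diffs)

print(f"mean change, treatment: {mean_change('treatment'):+.1f}")
print(f"mean change, control:   {mean_change('control'):+.1f}")
```

Because assignment is random and everyone gets the same pretest and posttest, the difference in mean change between groups can be attributed to the manipulated IV.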
Research Validity- appropriateness of inferences drawn from data
Four types
Internal
External
Statistical Conclusion
Construct
Internal Validity- “inside of the study”
Extent to which we can infer that a relationship between two variables is causal, or that the absence of a relationship implies absence of cause
e.g., online dating and number of dates
Reasons for lack of differences
Determinants of Internal Validity
Internal consistency/standardization of research design and variables
IV = cause
DV= effect
NOT: confound = cause
Quality of the research design: controlling extraneous and confounding variables
Threats to Internal Validity
History- e.g., a merger of companies during the study
Maturation (naturally occurring change that alters results)- e.g., turning 21
Testing- exposure to the pretest alters performance on the posttest
Mortality- participants dropping out of your study
Selection- idea that samples are not random, unique sample
Regression to the mean- extreme scores move toward the middle (mean)
External Validity- generalization
Inference that results can be generalized to and across alternate measures, participants, settings, and times
How generalizable are the results?
If you tighten internal validity, you sacrifice external validity
NOTE: Tradeoff between Internal Validity and External Validity
Threats to External Validity
Interaction between subjects and treatment: Population Validity
Interaction between setting and treatment: Ecological Validity
Interaction between history and treatment: Temporal Validity
Statistical Conclusion Validity

Appropriateness of inferences made from data as a function of conclusions drawn from statistical analyses
Are the IV and DV statistically related?
Is the relationship simply a function of chance?
Threats to Statistical Conclusion Validity
Low statistical power
Violations of assumptions of statistical tests
Measures with low levels of reliability
Validity of the measures/data questionable
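The first threat above, low statistical power, can be demonstrated with a small simulation (a sketch with assumed numbers, not from the lecture): a real effect exists, but with a small sample the statistical test misses it most of the time.

```python
# Low power: a true effect (d = 0.5) exists, but small samples rarely detect it.
import math
import random
import statistics

random.seed(2)

def detects_effect(n: int, d: float = 0.5) -> bool:
    # One simulated experiment: n participants per group, true effect d.
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(d, 1) for _ in range(n)]
    # Welch-style t statistic against the rough large-sample cutoff 1.96.
    se = math.sqrt(statistics.variance(control) / n
                   + statistics.variance(treated) / n)
    t = (statistics.mean(treated) - statistics.mean(control)) / se
    return abs(t) > 1.96

# Estimate power = proportion of replications that detect the true effect.
for n in (15, 100):
    power = sum(detects_effect(n) for _ in range(2000)) / 2000
    print(f"n = {n:3d} per group -> effect detected in {power:.0%} of runs")
```

With 15 participants per group the true effect is detected only in a minority of replications, so a "no difference" conclusion would be a statistical-conclusion-validity error, not evidence of no effect.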
Construct Validity
Extent to which labels placed on what is being observed are theoretically relevant
Do the results support the theory underlying the research?
Threats to Construct Validity
Loose connection between theory and experiment
Diffusion of treatment
Ambiguous effect of IVs
Role Demands/Demand Characteristics
Role Demands: participants' expectations of what the experiment requires them to do
Can lead participants to engage in socially unacceptable behaviors
Subject/participant Roles
“good subject”
“negativistic subject”
“apprehensive subject” (evaluation apprehension)
Solutions to participant roles
Deception
Separate the IV and DV in time
Use unobtrusive methods/measures
Experimenter Bias
Unintentionally influencing the results of an...