Single Case Research Methodology

Author

Publisher: Taylor & Francis

Format: ePub

Print ISBN: 9781032279312

Edition: 4

Publication year: 2024

Price: 13.790 kr.

Description

Table of Contents

  • Cover Page
  • Half Title Page
  • Title Page
  • Copyright Page
  • Contents
  • Preface
  • Author Bios
  • Section 1 Introduction to Research and Measurement
  • 1 Research Approaches
  • Important Terms
  • Applied Research
  • Integrating Science into Educational and Clinical Practice
  • Participatory Action Research
  • Evidence-Based Practice
  • Characterizing Designs
  • Attributions of Causality
  • Assumptions about Generalizability
  • Process versus Procedure Questions
  • Research Approach
  • Conclusion
  • References
  • 2 External Validity and Generalizable Knowledge
  • Important Terms
  • External Validity
  • Replication
  • Parsing Critical from Non-Critical Features via Across-Study Replication
  • Tactics for Maximizing the Impact of an Across-Study Replication
  • Considerations Relevant to External Validity
  • Related Constructs
  • Ecological Validity
  • Construct Validity
  • Recommendations for Across-Study Replications
  • Conclusion
  • References
  • 3 Establishing Internal Validity via Within-Study Replication
  • Important Terms
  • Within-Study Replication
  • Internal Validity
  • Threats to Internal Validity
  • Conclusions
  • References
  • 4 Selection, Characterization, and Measurement of Dependent Variables
  • Important Terms
  • Choosing and Defining Behaviors
  • Characterizing Behaviors
  • Reversible and Non-Reversible Behaviors
  • Behaviors of Long and Short Duration
  • Trial-Based versus Free-Operant Behaviors
  • Selecting a Data Recording Procedure
  • Continuous Recording Systems
  • Event and Timed Event Recording to Measure Count
  • Duration and Latency Recording to Measure Time
  • Estimating Count and Duration with Interval-Based Systems
  • Partial Interval Recording
  • Whole Interval Recording
  • Momentary Time Sampling
  • Variations in Use of Interval Systems
  • Comparisons Among Interval-Based Systems
  • Illustration of Accuracy for Behaviors with Non-Trivial Durations
  • Illustration of Accuracy for Behaviors with Trivial Durations
  • Reporting Use of Interval Systems
  • Data Collection
  • Planning and Conducting Data Collection
  • Using Technology
  • Collecting Data on More than One Behavior
  • Conclusions
  • References
  • 5 Reliability and Validity of Dependent Variables
  • Important Terms
  • Validity
  • Accuracy
  • Reliability
  • Ensuring Reliability and Validity of Data Collection
  • Operationalize Behaviors
  • Pilot Data Collection Procedures
  • Train Observers
  • Use Naïve Observers
  • Collect IOA Data
  • Analyze IOA Data and Conduct Discrepancy Discussions
  • Calculate Agreement
  • Report Agreement
  • Calculating Interobserver Agreement
  • Percentage Agreement
  • Kappa
  • Conclusions
  • References
  • 6 Development and Measurement of Independent Variables
  • Important Terms
  • Planning Study Conditions Using a Theory of Change
  • Different Ways to Approach Condition Design
  • Evaluation of Procedures versus Evaluation of Processes
  • Static versus Dynamic Condition Design
  • Cascading Logic Models
  • Measurement of Fidelity
  • Defining Experimental Conditions
  • Types of Fidelity
  • Formative Analysis
  • Summative Analysis
  • Reporting Fidelity
  • Conclusions
  • References
  • 7 Measuring Generality and Social Validity in Single Case Research
  • Important Terms
  • Generality
  • Domains of Performance Relevant to Generality
  • Measurement of Generalized Behavior Change
  • Terminology, Mechanisms, and Theory of Change
  • Social Validity
  • Social Validity Stakeholders
  • Measurement Strategies and Recommendations
  • Generality and Social Validity: The State of the Field
  • A Case for Mixed Methods Research
  • Conclusions
  • References
  • 8 Data Representation and Performance Characteristics
  • Important Terms
  • Data Representation
  • Figures in Single Case Design Studies
  • Tables in Single Case Design Studies
  • Describing Participant Performance
  • Level
  • Trend
  • Variability
  • Immediacy
  • Overlap
  • Consistency
  • Conclusions
  • References
  • Section 2 Single Case Designs
  • 9 Conducting Studies Using Sequential Introduction and Withdrawal of Conditions
  • Important Terms
  • Features of Sequential Introduction and Withdrawal Designs
  • When to Use Sequential Introduction and Withdrawal Designs
  • Strengths and Benefits
  • Weaknesses and Drawbacks of SIW Designs
  • A-B-A-B Designs
  • Reversal Variation
  • Procedural Steps
  • Multitreatment Design
  • Procedural Steps
  • Threats to Internal Validity
  • Special Case: Changing Criterion Designs
  • Procedural Steps
  • Internal Validity for Changing Criterion Designs
  • Conclusions
  • References
  • 10 Analyzing Data from Studies Using Sequential Introduction and Withdrawal of Conditions
  • Important Terms
  • Formative Analysis and Phase Change Decisions
  • Summative Analysis and Functional Relation Determination
  • Design Adequacy
  • Data Adequacy
  • Control for Likely Threats to Internal Validity
  • Outcome Evaluation
  • Supplementary Analyses
  • Magnitude Estimates
  • Overlap Estimates
  • Describing Visual Analysis
  • Visual Analysis Procedures
  • Visual Analysis Results
  • Conclusions
  • References
  • 11 Conducting Studies Using Time-lagged Condition Ordering
  • Important Terms
  • Features of Time-Lagged Designs
  • Intervention Targets (Participants, Behaviors, Contexts)
  • Concurrence Variations
  • When to Use Time-Lagged Designs
  • Strengths and Benefits of Time-Lagged Designs
  • Weaknesses and Drawbacks of Time-Lagged Designs
  • Multiple Baseline Designs
  • Multiple Probe Designs
  • Procedural Steps for MB and MP Designs
  • Threats to Internal Validity for Time-Lagged Design Family
  • Conclusions
  • References
  • 12 Analyzing Data from Studies Using Time-Lagged Conditions
  • Important Terms
  • Formative Analysis and Phase Change Decisions
  • Summative Analysis and Functional Relation Determination
  • Design Adequacy
  • Data Sufficiency
  • Control for Likely Threats to Internal Validity
  • Outcome Evaluation
  • Supplemental Analyses
  • Describing Visual Analysis
  • Visual Analysis Procedures
  • Visual Analysis Results
  • Conclusions
  • References
  • 13 Conducting Studies Using Rapid Iterative Alternation of Conditions
  • Important Terms
  • Features of Designs Using Rapid Iterative Alternation
  • When to Use Designs with Rapid Iterative Alternation
  • Strengths and Benefits
  • Weaknesses and Drawbacks
  • Multielement-Alternating Treatments Design
  • Procedural Steps
  • Adapted Alternating Treatments Design
  • Selecting and Assigning Sets
  • Procedural Steps
  • Repeated Acquisition Design
  • Threats to Internal Validity
  • Assessing Social Validity Using Simultaneous Treatments
  • Conclusions
  • References
  • 14 Analyzing Data from Studies Using Rapid Iterative Alternation
  • Important Terms
  • Formative Analysis and Phase Change Decisions
  • Summative Analysis and Functional Relation Determination
  • Design Adequacy
  • Data Adequacy
  • Control for Common Threats to Internal Validity
  • Outcomes Evaluation
  • Supplemental Analyses
  • Supplemental Analyses for ME-ATDs
  • Supplemental Analyses for AATDs and Repeated Acquisition Designs
  • Describing Visual Analysis
  • Visual Analysis Procedures
  • Visual Analysis Results
  • Conclusions
  • References
  • 15 Selecting and Combining Single Case Designs
  • Important Terms
  • Combination Designs
  • Types of Combinations
  • Summary
  • Design Selection
  • Choosing Between ME-ATD Designs and SIW Design Variations
  • Choosing Between SIW/ME-ATD Designs and Time-Lagged Design Variations
  • Choosing Among Time-Lagged Variations
  • Choosing Between AATD and Repeated Acquisition Designs
  • Conclusions
  • References
  • Section 3 Ethics, Rigor, and Writing
  • 16 Ethical Principles and Practices in Research
  • Important Terms
  • History of Ethics in Applied Research
  • Ethical Considerations When Conducting Research in Applied Settings
  • Fully Informed Participants
  • Consent and Assent
  • Outcomes that Benefit the Participant Versus the Field
  • Conflicts of Interest
  • Confidentiality
  • Equity
  • Culturally Responsive Research
  • Formal Approvals to Conduct Research
  • Site-Specific Approval
  • University Institutional Review Board
  • Considerations for IRB Applications
  • Special Populations
  • Potential Risk
  • Defining Methods and Procedures
  • Data Storage and Confidentiality
  • Informed Consent and Assent Procedures
  • Sharing of Information
  • Researcher Expertise
  • Ethical Practice
  • Publication Ethics and Reporting of Results
  • Authorship
  • Reporting Results
  • Conclusions
  • References
  • 17 Evaluating Single Case Research
  • Important Terms
  • Internal Validity, Rigor, and Risk of Bias
  • Critical Characteristics of Single Case Studies
  • Design Appropriateness
  • Potential Demonstrations of Effect
  • Reliability
  • Fidelity
  • Data Sufficiency
  • Potentially Important Characteristics
  • Generality and Applicability
  • Randomization
  • Resources for Characterizing Rigor
  • CEC-DR Evidence-Based Practice Paper (2005)
  • RoBiNT Scale (2013) and Updated Algorithm (2019)
  • CEC Standards for Evidence-Based Practice (2014)
  • Single Case Reporting Guideline in Behavioral Interventions (2016)
  • Risk of Bias Tool (2017)
  • Comparative Single Case Experimental Design Rating System (2018)
  • WWC Procedures and Standards Handbook (2020)
  • CEC-DR Next Generation Guidelines (2023)
  • Single Case Analysis and Review Framework v 3.1 (2023)
  • QualiCase (2023)
  • Examples of Use
  • Questionable Research Practices
  • Conclusions
  • References
  • 18 Writing Research Proposals and Empirical Reports
  • Important Terms
  • Scientific Writing
  • Writing Research Questions
  • Finding Research Topics
  • Moving from Topics to Questions
  • Classifying and Stating Research Questions
  • Writing Research Proposals
  • Why Write Research Proposals?
  • Primary Sections
  • Writing Empirical Reports
  • Why Write Empirical Reports?
  • Primary Sections
  • Considerations for Success
  • Conclusions
  • References
  • 19 Conducting Systematic Reviews and Syntheses
  • Important Terms
  • Literature Reviews
  • Approaches and Procedures
  • Types of Literature Reviews
  • Process of Conducting Literature Reviews
  • Organizing Findings and Writing the Review
  • Using the Literature Review
  • PRISMA Guidelines
  • PRISMA 2020 Statement, Checklist, and Flow Diagram
  • PRISMA Protocol
  • PRISMA Extensions
  • Considerations for Success
  • Conclusions
  • References
  • Index

Additional information


eBook to own

Reviews

There are no reviews yet.

