About
The iBehavior Research Lab aims to provide researchers and students with the opportunity to collaborate across disciplines to develop innovative technologies that serve individuals with disabilities and their families. The lab includes intelligent sensors, the latest technology for augmented and virtual reality, computers, treatment rooms, treatment materials, and a rich data set representing three years of behavioral observations collected from toddlers at risk for developmental disabilities. A current focus of the lab is to integrate artificial intelligence and Internet of Things (IoT) technologies to facilitate earlier and more advanced detection of behavioral patterns.
The research generated by the iBehavior Research Lab aims to improve the quality of services and resources available to individuals with disabilities, as well as the expertise of the professionals who serve them. Principal investigators can leverage the lab's resources for grant proposals investigating precision therapeutics, the use of AR/VR to train future clinicians and educators, and the use of AR/VR for direct therapeutics and education.
- Wearable Tech
- Artificial Intelligence
- Supporting with AR
- Partner Projects
Measurement of Behavior Using Wearable Technology
Collaborators:
ABA is particularly effective in treating challenging behaviors. However, very few measures within ABA automatically record behavior or capture its intensity, and those that do exist are limited in objectivity and generalizability. In this study, we plan to use wearable sensors (i.e., inertial measurement units; IMUs) and non-wearable sensors (e.g., a robot dog, cameras) to measure the frequency and intensity of challenging behaviors more objectively. Incorporating these sensors into the existing evaluation procedure may allow us to measure the intensity of behavior in addition to its occurrence or non-occurrence. The purpose of this study is therefore to examine the feasibility and value of adding wearable and non-wearable sensors to the existing ABA evaluation procedure.
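As a rough illustration of how movement intensity might be summarized from a wrist-worn IMU, the minimal sketch below converts tri-axial accelerometer samples into per-window intensity values and flags high-intensity windows. The sampling rate, window length, and threshold are hypothetical placeholders for illustration only, not the study's actual parameters or pipeline:

```python
import numpy as np

def movement_intensity(accel, fs=100, window_s=1.0):
    """Summarize movement intensity from tri-axial accelerometer data.

    accel: (N, 3) array of acceleration in g (x, y, z).
    fs: sampling rate in Hz (illustrative default).
    Returns the per-window root-mean-square of the acceleration magnitude,
    with ~1 g of gravity removed, as a crude proxy for movement intensity.
    """
    magnitude = np.linalg.norm(accel, axis=1) - 1.0  # subtract ~1 g gravity
    window = int(fs * window_s)
    n_windows = len(magnitude) // window
    trimmed = magnitude[: n_windows * window].reshape(n_windows, window)
    return np.sqrt((trimmed ** 2).mean(axis=1))

def flag_high_intensity(rms_per_window, threshold=0.5):
    """Return indices of windows whose RMS exceeds an intensity threshold."""
    return np.flatnonzero(rms_per_window > threshold)
```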
Results from this project:
Neely, L., Cantrell, K., Svoboda, M., Graber, J., Wimberley, J., & Oyama, S. (2024). An exploratory study of the use of wearable technology to supplement measurement of self-injurious behavior. [Manuscript submitted for publication]
Neely, L., Holloway*, K., Miller*, S., Cantero*, K., Alaeddini, A., & Oyama, S. (2024). Wearable technology to measure self-injury during a functional analysis. [Manuscript in preparation]
Project #1: AI Data Collection
Collaborators:
About:
Using ABA, clinicians collect data (via human raters) and analyze it to create an individualized treatment plan for the child. This "conventional" approach to ABA data collection has been used by clinicians for more than five decades with minimal refinement. While ABA has been shown to improve outcomes for children with ASD, the approach is not optimized: human raters play a central role in data collection and analysis, and the human error inherent in that process can yield therapy that is individualized but lacking in precision. Our current research is aimed at determining whether the ABA assessment process can be enhanced through artificial intelligence (AI)-based methodologies. Our current projects leverage artificial and human intelligence to further the use of ABA to assess severe behavior in autistic children. Future work will extend to treatment optimization, a large-scale clinical trial, and potential commercialization for eventual use in clinical practice.
Results from this project:
Kausch, T., Holloway, K., Neely, L., Carnett, A., Lang, R., & Alaeddini, A. (2024). Identification of indices of happiness and unhappiness using advanced neural network approaches. [Manuscript in preparation]
Project #2: An AI-Enabled Platform to Support Connected Communities for Coordinated Care of Children with Autism
This integrative research spans technology and social science domains, with a multidisciplinary team of experts in embedded systems, cloud computing, social science, and privacy. The UTSA team is partnering with a range of community stakeholders, including the Autism Treatment Center (ATC), CHRISTUS Children's Hospital, and caregivers, to build an AI-enabled connected platform for coordinated care of children with autism spectrum disorder.
This planning project seeks to identify solutions for how to accurately collect behavior data for automatic measurement of key dimensions of severe behavior (e.g., frequency, intensity, latency), assist the decision-making process with AI algorithms, and communicate with stakeholders in real time for early intervention.
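To illustrate the kinds of behavior dimensions such a platform would compute automatically, here is a minimal sketch assuming timestamped behavior events have already been detected. The event format and field names are hypothetical, not the platform's actual data model:

```python
from dataclasses import dataclass

@dataclass
class BehaviorEvent:
    onset: float      # seconds from session start
    offset: float     # seconds from session start
    intensity: float  # e.g., peak sensor reading (arbitrary units)

def summarize_session(events, session_length_s, antecedent_time_s=0.0):
    """Compute frequency, mean intensity, and latency for one session."""
    events = sorted(events, key=lambda e: e.onset)
    frequency_per_min = len(events) / (session_length_s / 60.0)
    mean_intensity = (sum(e.intensity for e in events) / len(events)) if events else 0.0
    # Latency: time from a specified antecedent (e.g., demand onset) to the first response.
    latency_s = (events[0].onset - antecedent_time_s) if events else None
    return {
        "frequency_per_min": frequency_per_min,
        "mean_intensity": mean_intensity,
        "latency_s": latency_s,
    }
```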
Project #3: Behavior Buddy
The goal of this project is to pilot an innovative treatment model that integrates the BehaviorBuddy app into caregiver coaching to improve the sustainability of interventions. BehaviorBuddy, a mobile app, leverages advanced machine learning technology to deliver immediate and personalized feedback, behavior tracking, and intervention tools in real time. The app's availability outside of sessions will facilitate caregiver implementation between appointments, highlighting the intervention's generalizability and scalability.
This project will enroll participants starting Fall 2024. It was funded by the Texas Higher Education Coordinating Board (PI: Neely; Award #00854).
Project #1: Training Medical Professionals
Collaborators:
About: Many autistic persons engage in challenging or dangerous behaviors during health care visits and require physical or pharmaceutical restraints to access the care they need. This project trains medical professionals to work with these individuals so that they feel more comfortable and can access necessary and preventative care safely. We use behavioral skills training, an evidence-based teaching strategy that has been used to train law enforcement officers, dental professionals, and educators in behavior analytic interventions. Within the medical environment, the modeling and rehearsal phases typically involve paid actors who role-play a person with a disability to give the trainee real-life experience implementing the skill. This method not only poses ethical concerns, as the actors would be "acting autistic," but also comes at a high cost. We are addressing this issue by leveraging augmented reality to automate pieces of the training, reduce the long-term cost of medical actors, and increase the accessibility of quality practice opportunities.
Results related to this project:
Cantu-Davis, K., Neely, L., Ximenez, T., Kirkpatrick, M., & Alaeddini, A. (2024). Augmented reality to train behavior response teams to de-escalate behavior: A replication and extension. [Revising manuscript for resubmission]
Garcia, S., Neely, L., Davis, H., & Alaeddini, A. (2024). Using behavior skills training to teach clinicians to de-escalate situations with autistic individuals. [Manuscript under review in Behavior Analysis in Practice]
Project #2: Enhancing Response to Emergencies
Traumatic encounters with first responders are a weekly occurrence for autistic individuals. In a nationally representative study, researchers found that, by age 21, approximately 20% of autistic youth had been stopped and questioned by police and nearly 5% had been arrested. Additionally, over 60% of autistic individuals and their caregivers report fearing interactions with police due to the potential for officers to misunderstand autistic behaviors and escalate rather than de-escalate a situation. This planning proposal will conduct a feasibility and pilot study to build a smart and connected community for a coordinated response to escalated behavior events for autistic individuals through an IoT-enabled generative AI platform. The team will partner with a number of community stakeholders (e.g., autistic individuals, caregivers, emergency responder teams, and community initiatives) to analyze gaps in current approaches, identify trustworthy solutions, and conduct feasibility tests.
3D Hand & Full Body Pose Estimation in Telehealth for Children with Autism
Led by Dr. Kevin Desai
Our long-term goal is to enable 3D telehealth systems that can provide effective training of basic life skills to children with autism. This study works toward that goal with the overall objective of providing effective full-body interaction in non-HMD VR systems through accurate, real-time 3D hand and body pose estimation. The expected outcome is that comparing children's performance assessments across both conditions will inform us about the feasibility of the proposed 3D hand and full-body pose estimation system as a complement to traditional training and assessment of fine motor skills. Results from the Treatment Evaluation Inventory Short Form (TEI-SF) will also provide additional insight into the acceptability of the proposed system to therapists and parents of children with autism ages 3-6 years.
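As a minimal sketch of the kind of per-frame body pose estimation such a system relies on, the example below uses the open-source MediaPipe Pose model as a stand-in; MediaPipe is not stated to be the project's actual implementation, and the project's own 3D hand and full-body estimation pipeline may differ substantially:

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def estimate_pose_landmarks(video_path):
    """Yield 33 body landmarks (x, y, z, visibility) per frame of a video."""
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV reads frames as BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                yield [(lm.x, lm.y, lm.z, lm.visibility)
                       for lm in results.pose_landmarks.landmark]
    cap.release()
```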
Effects of Desensitization of Plantar Surface in Children with Autism Spectrum Disorder with Atypical Gait
Led by Dr. Sakiko Oyama
Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impairments in social communication along with repetitive and restricted behaviors. About 20% of children with ASD exhibit an atypical walking pattern, characterized by toe-walking, well past the age at which the pattern disappears in typically developing children. In non-ASD children, toe-walking has been associated with lower proficiency in balance and coordination. It is theorized that toe-walking is driven in part by hyper-sensitivity of the foot to touch and to forces acting at ground contact, and by the need to decrease these sensations on the foot. The purpose of our study is to 1) determine differences in sensitivity to touch, balance, walking pattern, motor proficiency, and sensory processing profile among ASD children with an atypical walking pattern, ASD children with a normal walking pattern, and typically developing children with a normal walking pattern, and 2) determine the immediate impact of vibratory stimuli on sensitivity to touch and walking pattern in the three groups of children.
Enhancing Programming and Machine Learning Education for Students with Visual Impairments through the Use of Compilers, AI and Cloud Technologies
Led by Dr. Wei Wang
Attractive, high-paying, and highly flexible Computer Science careers should be more readily accessible to people with blindness or visual impairments (BVI). Unfortunately, teaching the required computer programming and data science skills to students with BVI is extremely challenging for two reasons: current screen readers have limited capability to properly read computer code, which mixes English letters, digits, and punctuation marks; and code navigation is time-consuming and frustrating, since students with BVI must repeatedly use screen readers to read every line in order to locate the desired line for editing. Partnering with the San Antonio Lighthouse for the Blind and Vision Impaired, the project is developing new accessibility tools, including a program syntax- and semantics-aware screen reader and a voice-command-based code navigation framework, to address these two difficulties. These accessibility tools will be offered through cloud-based web interfaces to provide nationwide access to students and educators. Success in this project will improve the effectiveness of teaching computer programming and data science to students with BVI, which in turn will make it more accessible for individuals with BVI to pursue high-paying Computer Science careers and could lead to a more diverse Computer Science workforce.
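As a toy illustration of what "syntax- and semantics-aware" reading can mean in practice, the sketch below uses Python's built-in ast module to turn a single line of code into a spoken-style description. The phrasing and the helper function are hypothetical and are not the project's actual screen reader:

```python
import ast

def describe_line(source_line):
    """Return a spoken-style description of a single line of Python code."""
    try:
        node = ast.parse(source_line).body[0]
    except (SyntaxError, IndexError):
        return f"unparsed text: {source_line.strip()}"
    if isinstance(node, ast.Assign):
        targets = ", ".join(ast.unparse(t) for t in node.targets)
        return f"assign {ast.unparse(node.value)} to {targets}"
    if isinstance(node, ast.FunctionDef):
        args = ", ".join(a.arg for a in node.args.args)
        return f"define function {node.name} taking {args or 'no arguments'}"
    if isinstance(node, ast.Return):
        return f"return {ast.unparse(node.value) if node.value else 'nothing'}"
    return f"{type(node).__name__}: {source_line.strip()}"

# Example: describe_line("total = price * quantity")
# -> "assign price * quantity to total"
```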
The iBehavior Technology Lab is led by Dr. Leslie Neely and a team of collaborators from the UTSA College of Education and Human Development, College of Science, College of Business, College of Engineering, and College of Health, Community, and Policy.
Lab Manager
Katie Holloway is a second-year doctoral student in the Department of Educational Psychology at the University of Texas at San Antonio. She earned a Bachelor of Science degree from Texas A&M University in Interdisciplinary Education with a focus in Special Education, and a Master of Education from the University of Texas at San Antonio in Educational Psychology with a focus in behavior assessment and intervention. Her research focuses on the use of applied behavior analysis (ABA) as a treatment for young children at risk for autism.
Email Katie at [email protected]