Ameen M. Basha
Foothills Medical Centre C880
Background
- Integrated Cardiac Surgery Residency (FRCSC), Libin Cardiovascular Institute, Present
- Master of Public Health (MPH), Quantitative Methods, Harvard University, 2023
- Doctor of Medicine (MD), McMaster University, 2020
- Bachelor of Health Sciences, First Class Honors (BHSc), University of Calgary, 2017
- Pre-Baccalaureate Education, Town Centre Private High School, 2013
Other Interests
- High-Altitude, Space, and Deep-Sea Physiology
- Minimally-Invasive Cardiac Surgery
- Hybrid Coronary Revascularization
- Statistics and Data Science
- Time-to-event/Failure-Time Analysis
- Longitudinal and Cluster Analysis Methods
- Clinical Trials Design
Research
Consider a heart surgeon, looking down into an aortic valve annulus, who must now decide which of two prosthetic valves to implant into that space. In the next few years, the results of a well-designed randomized trial comparing the two valves will be published. Yet that eventual trial does little to inform the surgical decision today, which will instead rest on small, highly controlled trials with low external validity, observational data at major risk of confounding, and individual experience. My research applies advanced statistical methods to cardiothoracic surgery data in order to generate timely, yet valid, inferences. I pursue this through randomization approximation methods, synthetic data modelling, and causal observational analysis.
Trial Emulation in Cardiothoracic Surgery
What if you and I could predict what a future randomized trial will show? While the Randomized Controlled Trial (RCT) is the de facto gold-standard study design, many of us are well versed in its limitations. Modern cardiac surgery research moves fast and often asks questions that cannot be easily answered by RCTs. Target Trial Emulation (TTE) is an emerging framework through which non-randomized data can be systematically analyzed to approximate the results of randomized trials (Hernán and Robins, 2016). Extended to the cardiothoracic surgical sphere, TTE methods represent an opportunity to answer questions quickly using readily available observational data. My research applies TTE-based analytic techniques to non-randomized cardiothoracic surgery studies to generate timely and robust findings. Though RCTs remain the gold standard for clinical research, clinical research itself has changed. TTE-based methods are needed to approximate the randomized trials of tomorrow so that surgical decisions can be made today.
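As a minimal illustration of one building block of a TTE-style analysis, the sketch below applies inverse probability of treatment weighting to an entirely hypothetical simulated cohort, in which a baseline risk score confounds the valve choice. The variable names and effect sizes are illustrative assumptions, not results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical cohort: a baseline risk score (confounder) influences both
# which valve a patient receives (treat) and the continuous outcome.
confounder = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-confounder))   # higher-risk patients get valve A more often
treat = rng.binomial(1, p_treat)
# True treatment effect is +1.0; the confounder adds a non-causal difference.
outcome = 1.0 * treat + 2.0 * confounder + rng.normal(size=n)

# Naive arm comparison is badly confounded.
naive = outcome[treat == 1].mean() - outcome[treat == 0].mean()

# Inverse probability weighting: reweight each arm by 1 / P(received own arm)
# so the confounder distribution in both arms matches the whole cohort.
# Here the true propensity is used for clarity; in practice it is estimated.
ps = p_treat
w = treat / ps + (1 - treat) / (1 - ps)
ipw = (np.sum(w * treat * outcome) / np.sum(w * treat)
       - np.sum(w * (1 - treat) * outcome) / np.sum(w * (1 - treat)))

print(round(naive, 2), round(ipw, 2))  # ipw should sit near the true effect of 1.0
```

The weighted contrast recovers roughly the effect a randomized comparison would estimate, whereas the naive contrast absorbs the confounding through the risk score.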
Synthetic Data Derivatives
Every researcher struggles with data acquisition. Even once recruited, study populations remain at risk of changing through withdrawal or loss to follow-up. Synthetic data, created through either statistical simulation or computational derivation, offer a way to construct a theoretical study population that behaves as a real-world population would (Foraker et al., 2018). Synthetic data generation and analysis thus bypass the problems of patient recruitment and attrition. Moreover, synthetic data-driven research can be completed rapidly, with simplified monitoring boards, and without exposing real humans to real-world risks. Spanning the theoretical and applied domains of synthetic data, my research strives to create better statistical models and then verify them against real-world patient performance. Though their fidelity and reproducibility continue to be established, synthetic data methods may represent a powerful supplement to the traditional RCT in cardiothoracic surgery.
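A minimal sketch of the statistical-simulation route, assuming purely hypothetical summary statistics and a multivariate-normal model (real synthetic data generators are considerably more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical summary statistics "fitted" to a real cardiac surgery cohort:
# age (years), ejection fraction (%), cross-clamp time (minutes).
means = np.array([66.0, 55.0, 80.0])
cov = np.array([[100.0, -20.0,  15.0],
                [-20.0,  64.0, -10.0],
                [ 15.0, -10.0, 400.0]])

# Simulate a synthetic cohort that preserves the means and covariance
# structure without exposing any real patient records.
synthetic = rng.multivariate_normal(means, cov, size=50_000)

# Check that the derivative behaves like the source population on these moments.
print(np.round(synthetic.mean(axis=0), 1))
print(np.round(np.cov(synthetic, rowvar=False), 0))
```

Verifying that the synthetic cohort reproduces the moments it was built from is only the first step; the harder validation, as the paragraph above notes, is against real-world patient performance.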
Causal Longitudinal, Time-to-Event, and Cluster Database Analyses
Ask yourself: do the numbers make sense? Not all observational data, or the analyses thereof, are created equal. Particularly in the variable environment in which cardiothoracic surgery data are collected, the risk of confounding is high. Alternatives to stratification-based methods of controlling confounding, such as G-estimation, should be more readily applied where statistical assumptions permit (Robins, Rotnitzky, and Zhao, 1995). Too frequently, the within-individual correlations present in repeated observations of the same patients over time are ignored. Similarly, time-to-event analyses (i.e., survival analyses) of observational data often require insight beyond the Cox model, which depends on the proportional hazards assumption, because real-world exposures can change with time. Multiple procedures in the same individuals require a cluster-based analysis approach, extending beyond the rudimentary analysis of proportions, to account for implicit structures within the data. My research focuses on the analysis (and re-analysis) of non-randomized cardiothoracic surgical outcomes data using causal inference methods to produce results that can aid surgical decision-making. In sum, extending the core concepts of regression and matrix algebra into the applications of artificial intelligence, deep learning, and machine learning is one of the many ways the cardiothoracic surgical decisions of the future will be shaped.
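As one concrete step beyond a rudimentary analysis of proportions, the sketch below implements the Kaplan-Meier product-limit estimator from scratch, so that censored patients contribute information for as long as they are observed rather than being dropped. The follow-up data are hypothetical:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time for each patient
    events : 1 if the event (e.g., death) occurred, 0 if censored
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events)
    uniq = np.unique(times[events == 1])          # distinct event times
    surv = []
    s = 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)              # still under observation at t
        d = np.sum((times == t) & (events == 1))  # events exactly at t
        s *= 1 - d / at_risk                      # multiply the conditional survival
        surv.append(s)
    return uniq, np.array(surv)

# Hypothetical follow-up (months); event = 0 marks censoring, not a death.
t = [2, 3, 3, 5, 8, 8, 12, 15]
e = [1, 1, 0, 1, 1, 0, 0, 1]
times, surv = kaplan_meier(t, e)
print(times, np.round(surv, 3))
```

A naive proportion (5 deaths / 8 patients) would treat the censored patients as if their full experience were known; the product-limit estimate instead conditions each step on the number still at risk. Time-varying exposures and clustered procedures then require further machinery beyond this basic estimator.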
The cardiothoracic surgeons of today face intraoperative decisions for which data might not yet be available. With the rapid development of new surgical techniques, randomization is not always possible or ethical. TTE methods, synthetic data modelling, and causal non-randomized data analysis are ways in which the shortfalls of RCTs can be at least partially ameliorated. Taken together, this area of research represents a source of valid and time-sensitive inferences, so that surgeons can make decisions affecting patients today.
Keywords: cardiothoracic surgery, statistics, causal inference, longitudinal analysis, regression, failure-time analysis.
References
- Hernán, M.A. and Robins, J.M., 2016. Using big data to emulate a target trial when a randomized trial is not available. American Journal of Epidemiology, 183(8), pp. 758-764.
- Foraker, R., Mann, D.L. and Payne, P.R., 2018. Are synthetic data derivatives the future of translational medicine? JACC: Basic to Translational Science, 3(5), pp. 716-718.
- Robins, J.M., Rotnitzky, A. and Zhao, L.P., 1995. Analysis of semiparametric regression models for repeated outcomes in the presence of missing data. Journal of the American Statistical Association, 90(429), pp. 106-121.
