Year: 2022 | Volume: 1 | Issue: 3 | Page: 96–102
The 2nd International Annual Conference of the SSCP Accepted Abstract: Pharmacotherapy; Healthcare Professionals
|How to cite this article:|
The 2nd International Annual Conference of the SSCP Accepted Abstract: Pharmacotherapy; Healthcare Professionals. Saudi J Clin Pharm 2022;1:96-102
|How to cite this URL:|
The 2nd International Annual Conference of the SSCP Accepted Abstract: Pharmacotherapy; Healthcare Professionals. Saudi J Clin Pharm [serial online] 2022 [cited 2023 Feb 2];1:96-102
Available from: http://www.sjcp.org/text.asp?2022/1/3/96/357712
Theme 3: Pharmacotherapy; Healthcare Professionals
Standard Dosing of Enoxaparin Vs. Unfractionated Heparin in Critically Ill COVID-19 Patients: A Multicenter Cohort Study
Mashael AlFaifi, Khalid Al Sulaiman, Ohoud Aljuhani, Ghazwa B. Korayem, Awtiff Hafiz, Mai Alalawi, Hisham A. Badreldin, Ali F. Altebainawi, Ramesh Vishwakarma, Abdulrahman Alissa, Albandari Alghamdi, Abeer A. Alenazi, Huda Al Enazi, Shahad Alanazi, Abdullah Alhammad, Jahad Alghamdi, Faisal A. Al Sehli, Maram A. Aldossari, Alaa A. Alhubaishi, Anfal Y. Al-Ali, Hasan M. Al-Dorzi
Background: Thrombotic events are common in critically ill patients with COVID-19 and have been linked to the COVID-19-induced hyperinflammatory state. Besides their anticoagulant effects, heparin and its derivatives possess various anti-inflammatory and immunomodulatory properties that may affect patient outcomes. This study evaluated the effectiveness and safety of standard doses of prophylactic enoxaparin compared with unfractionated heparin (UFH) among critically ill patients with COVID-19.
Methods: This multicenter, retrospective cohort study included critically ill adult patients with COVID-19 admitted to the intensive care unit (ICU) between March 2020 and July 2021. Patients were categorized into two groups based on the type of pharmacological VTE prophylaxis given in fixed doses (enoxaparin 40 mg every 24 h vs. UFH 5000 U every 8 h) throughout their ICU stay. The primary endpoint was all cases of thrombosis. The secondary endpoints were bleeding, blood transfusion requirement, mortality, ICU and hospital length of stay (LOS), ventilator-free days at 30 days, and complications during the ICU stay. Multivariable logistic, Cox proportional hazards, and negative binomial regression analyses were used as appropriate.
Results: A total of 1598 patients were screened during the study period; 306 were enrolled based on the eligibility criteria. Of these, 142 received UFH, whereas 164 received enoxaparin. Patients who received enoxaparin had fewer thrombotic events than those who received UFH in the crude analysis (4.3% vs. 9.9%; P-value=0.05); however, the difference did not reach statistical significance in the logistic regression analysis [odds ratio (OR): 0.22 (0.05, 1.07); P-value=0.06]. There were no significant differences in overall bleeding between the two groups [OR: 2.21 (0.13, 36.5); P-value=0.58], and blood transfusion requirement was lower in the enoxaparin group but did not reach statistical significance [OR: 0.13 (0.01, 1.43); P-value=0.09]. The 30-day and in-hospital mortality were similar between the two groups in the Cox hazards regression analysis. In contrast, hospital LOS was shorter in the enoxaparin group (β-coefficient: −0.45; 95% confidence interval: −0.73, −0.17; P-value=0.002).
Conclusion: In critically ill patients with COVID-19, using enoxaparin compared with UFH at standard prophylactic doses might have a thrombosis reduction benefit and was associated with a shorter hospital stay.
The Role of Oseltamivir in COVID-19 Critically Ill Patients Without Other Respiratory Viral Co-infections
Mashael AlFaifi, Ohoud Aljuhani, Ghazwa B. Korayem, Ali F. Altebainawi, Meshal S. Alotaibi, Noura A. Alrakban, Ragia H. Ghoneim, Ramesh Vishwakarma, Abdulrahman I Al Shaya, Shmeylan Al Harbi, Jawaher Gramish, Dahlia M. Almutairi, Ghada Alqannam, Faisal F. Alamri, Abdullah Alharthi, Abdullah Al Amer, Khalid Al Sulaiman
Background: Oseltamivir is an antiviral drug that plays an essential role in treating influenza. During the COVID-19 pandemic, oseltamivir was used in COVID-19 patients without solid evidence of effectiveness and safety. This study aimed to evaluate the effectiveness and safety of oseltamivir in critically ill patients with COVID-19.
Methods: This multicenter, retrospective cohort study included critically ill adult patients with COVID-19 admitted to the intensive care unit (ICU) between March 2020 and July 2021. Patients were categorized into two subgroups based on oseltamivir use within 48 h of ICU admission (oseltamivir vs. control). The primary endpoint was viral load clearance. The secondary endpoints were mechanical ventilation (MV) duration, mortality, hospital length of stay (LOS), ICU LOS, and complications during the ICU stay. Propensity score (PS) matching (1:1) was used based on predefined criteria. Multivariable logistic, Cox proportional hazards, and negative binomial regression analyses were used as appropriate.
Results: A total of 1592 patients were screened during the study period; 149 were excluded. Of the included patients, 226 were matched into two groups based on their PS in a 1:1 ratio. The time for the COVID-19 viral load to become undetectable was shorter in patients who received oseltamivir than in the control group [11 vs. 16 days, P=0.042; β-coefficient: −0.84, 95% confidence interval (CI): (−1.33, −0.34), P=0.0009]. MV duration was shorter in patients who received oseltamivir than in the control group [6.5 vs. 8.5 days, P-value=0.02; β-coefficient: −0.27, 95% CI: (−0.55, 0.02), P=0.06]. In addition, patients who received oseltamivir had lower odds of hospital-/ventilator-acquired pneumonia [odds ratio: 0.49, 95% CI: (0.283, 0.861), P=0.01]. In contrast, no significant differences were observed in 30-day mortality or in-hospital mortality [hazard ratio (HR): 1.05, 95% CI: (0.70, 1.59), P=0.79; HR: 1.01, 95% CI: (0.68, 1.51), P=0.93, respectively].
Conclusion: In critically ill COVID-19 patients, oseltamivir use was associated with faster viral load clearance and shorter MV duration, without safety concerns. However, this did not translate into a mortality benefit. Further randomized, interventional clinical studies are required to confirm our findings.
Prescription Patterns of Quetiapine for Multiple Drug Abuse, Depression, and Psychosis: A Retrospective Study
Osama Al-Mohammadi, Ayman Al-Qaaneh, Razan Musharraf, Jumanah Al-Saedi, Jana Shaker, Ahmed Al-Dhafiri
Background: Quetiapine is an atypical antipsychotic commonly prescribed for schizophrenia, bipolar disorder, multiple drug abuse (MDA), generalized anxiety disorder, severe depression, dementia, and mood disorders. Many previous case reports highlighted the misuse of the drug. Prescription of quetiapine varies according to use, with side effects increasingly reported with higher doses. Here we studied the prescribing pattern of quetiapine in MDA, depression, and psychosis patients in the Madinah region, Saudi Arabia.
Methods: This is a retrospective single-center study carried out in the main referral hospital for mental health in Madinah, Saudi Arabia.
Results: A total of 158 patients were included in this study. The mean age of the patients was 30.5 ± 10.1 years, and males represented 89.9% of the patients. In terms of indications, 46.2% of the patients used quetiapine for MDA, 29.7% for psychosis, and 24.1% for depression. Across all patients, quetiapine was used at a mean daily dose of 285.2 ± 222 mg for a mean duration of 13.9 ± 15.4 weeks, with a mean of 2.1 ± 2.2 prescriptions. Comparison between indications showed that quetiapine was most frequently prescribed for MDA (P<0.001). MDA patients were significantly younger than those in the other groups (P=0.001), and all patients who received quetiapine for MDA were males. However, MDA patients received a smaller dose of quetiapine than those with other indications (P<0.001). There were no significant differences between groups in the number of prescriptions, duration, or whether the patient was on other medications. These results were confirmed by the regression analysis, in which male sex and younger age were significant contributing factors to MDA compared with psychosis, 95% confidence interval (CI): 8 × 10⁷ (8 × 10⁷–8 × 10⁷) and 95% CI: 0.943 (0.900–0.987), respectively.
Conclusion: Quetiapine was prescribed more frequently in MDA patients and younger individuals. Low dose was predominant in those patients, indicating a probability of drug abuse.
Role of Pharmacists in Management of Anemia in Hemodialysis Patients: A Retrospective Observational Study in Pakistan
Shinaz Arshad, Rehan Anjum, Nabeel Alvi, Salwa Ahsan
Background: Anemia is a serious complication of chronic kidney disease that appears during the early stages of the disease and intensifies as kidney function deteriorates. Iron-containing products and erythropoiesis-stimulating agents (ESAs) are used for anemia management. The purpose of this study was to identify the importance of pharmacists in the management of anemia and in improving the rational use of drugs in hemodialysis (HD) patients.
Methods: A retrospective observational study was conducted in a tertiary care hospital in Islamabad over a 6-month period. Anemia management in HD patients was analyzed by pharmacists through medication reconciliation and prescription review. Any resulting change in anemia management therapy was recorded as a pharmacist intervention.
Results: A total of 1248 patients were assessed by pharmacists regarding anemia management, yielding a 12% intervention rate (n=149). The majority of the interventions (about 52%) were related to decreasing or holding the ESA dose according to body weight and the most recent complete blood count (CBC) report. About 30% of the interventions were related to outdated CBC reports, and 13% involved adding or increasing the ESA dose. Dose adjustments of iron-containing products contributed to 4% of interventions, and 91% of the pharmacists' interventions were accepted by the nephrology physicians.
Conclusion: Our study showed a 12% intervention rate, meaning that roughly 1 in every 8 patients on HD is at risk of receiving inappropriate anemia management therapy in the absence of a pharmacist. The irrational use of drugs can lead to significant risks of stroke, serious cardiovascular events, or even death. This study therefore highlights the importance of pharmacists and the necessity of expanding pharmacy services in countries where these services are still developing, to improve the safe and rational use of medication.
Dexamethasone Vs. Methylprednisolone for Multiple Organ Dysfunction in COVID-19 Critically Ill Patients: A Multicenter Propensity Score Matching Study
Rahaf Ali Alqahtani, Ohoud Aljuhani, Ghazwa Korayem, Ali Altebainawi, Mohammed Aldhaeefi, Dania Al-Mohammady, Abeer Alenazi, Faisal Almutairi, Hisham Badreldin, Ramesh Vishwakarma, Amjaad Alfahed, Thamer Alsulaiman, Fahad Aldhahri, Namareq Aldardeer, Ahmed Alenazi, Shmeylan Al Harbi, Raed Kensara, Khalid Al Sulaiman
Background: Dexamethasone has a mortality benefit in COVID-19 patients, particularly those requiring invasive mechanical ventilation (MV). However, it is uncertain whether another corticosteroid, such as methylprednisolone, may be utilized to obtain a superior clinical outcome. This study aimed to compare the clinical and safety outcomes of dexamethasone vs. methylprednisolone in critically ill COVID-19 patients admitted to intensive care units (ICUs).
Methods: A multicenter, retrospective cohort study included critically ill adult COVID-19 patients admitted to ICUs from March 2020 to July 2021. Patients were categorized into two groups based on the corticosteroid received within 24 h of ICU admission: an active group (methylprednisolone) and a control group (dexamethasone). The primary outcome was the progression of the multiple organ dysfunction (MOD) score on day 3 of ICU admission. The secondary outcomes were respiratory failure requiring MV, mortality, ICU and hospital length of stay, ventilator-free days at 30 days, and complications during the ICU stay. Propensity score (PS) matching (1:3 ratio) was used based on the patients' age and MOD score within 24 h of ICU admission.
Results: A total of 1385 patients were screened during the study period; 526 were eligible after applying the exclusion criteria. After PS matching, 264 patients were included, of whom 198 received dexamethasone and 66 received methylprednisolone within 24 h of ICU admission. In the regression analysis, patients who received methylprednisolone rather than dexamethasone had a higher MOD score on day 3 of ICU admission [β-coefficient: 0.17 (95% confidence interval (CI) 0.02, 0.32), P=0.03]. Moreover, hospital-acquired infection was more frequent in the methylprednisolone group [odds ratio (OR) 2.17, 95% CI 1.01, 4.66, P=0.04]. Other complications during the stay were similar between the two groups, and the 30-day and in-hospital mortalities were similar in the multivariable Cox proportional hazards regression analysis.
Conclusion: In COVID-19 critically ill patients, the use of dexamethasone compared with methylprednisolone resulted in a lower MOD score on day 3 of ICU admission with a similar mortality rate.
Incidence and Clinical Outcomes of New-onset Atrial Fibrillation in Critically Ill COVID-19 Patients: A Multicenter Cohort Study
Rahaf Alqahtani, Raed Kensara, Ohoud Aljuhani, Ghazwa B. Korayem, Hadeel Alkofide, Sumaya Almohareb, Yousef Alosaimi, Ali Altebainawi, Khalid Bin Saleh, Norah Alandas, Shmeylan Al Harbi, Abdullah Al Harthi, Uhood Ashkan, Rema Alghamdi, Hisham Badreldin, Awatif Hafiz, Mashael AlFaifi, Ramesh Vishwakarma, Abeer Alenazi, Mai Alalawi, Khalid Al Sulaiman
Background: New-onset atrial fibrillation (Afib) has been reported in hospitalized critically ill patients with COVID-19, but its incidence and clinical outcomes in this population have not been well studied. Therefore, this study aimed to investigate the incidence and clinical outcomes associated with new-onset Afib in critically ill patients with COVID-19.
Methods: A multicenter, retrospective cohort study included critically ill adult patients with COVID-19 admitted to intensive care units (ICUs) from March 1, 2020, to July 31, 2021. Patients were categorized into two groups based on whether they developed new-onset Afib (control vs. new-onset Afib). The primary outcome was in-hospital mortality. Other outcomes of interest included thrombosis/infarction, bleeding, 30-day mortality, hospital length of stay (LOS), ICU LOS, ventilator-free days at 30 days, multiple organ dysfunction on day 3, liver injury, acute kidney injury, and respiratory failure requiring mechanical ventilation (MV). Multivariable regression and negative binomial regression were employed as appropriate.
Results: A total of 135 (10.7%) patients developed new-onset Afib during their ICU stay. After propensity score (PS) matching (3:1 ratio), 400 patients were included in the final analysis. There was no significant difference in 30-day mortality between the two groups [odds ratio (OR) 1.55; 95% confidence interval (CI) 0.91, 2.63; P=0.10]. However, patients who developed new-onset Afib had higher odds of in-hospital mortality than the control group (OR 2.76; 95% CI 1.49, 5.11; P=0.001). The MV duration, ICU LOS, and hospital LOS were also longer in patients who developed new-onset Afib (β-coefficient 0.52; 95% CI 0.28, 0.77; P<0.0001), (β-coefficient 0.29; 95% CI 0.12, 0.46; P<0.001), and (β-coefficient 0.35; 95% CI 0.18, 0.52; P<0.0001), respectively. Moreover, the control group had significantly lower odds of multiple organ dysfunction on day 3, major bleeding, liver injury, and respiratory failure requiring MV.
Conclusion: New-onset Afib is a common complication among critically ill patients with COVID-19 that is associated with poor clinical outcomes and higher hospital mortality.
Clinical Pharmacokinetics of Rivaroxaban in Saudi Patients
Razan Saleh AlMofada, Saeed AlQahtani, Jamilah AlNahdi, Asma Bin Hazza
Background: Rivaroxaban is the first oral direct factor Xa inhibitor. Unlike warfarin, the direct oral anticoagulants offer safe and effective anticoagulation without the need for routine laboratory monitoring. Rivaroxaban is used in several clinical settings, either as prophylaxis or as treatment for different thrombotic disorders. This study aimed to characterize the pharmacokinetic parameters of rivaroxaban in our population.
Methods: This prospective pharmacokinetic study was conducted at King Saud University Medical City and included adult patients who received rivaroxaban as either treatment or prophylaxis. Blood samples were collected at 1, 2, 4, 6, and 12 h following oral administration of rivaroxaban. The plasma rivaroxaban concentration was determined using high-performance liquid chromatography, and pharmacokinetic parameters were estimated using PKanalix software.
Results: Fourteen patients were included in this study, of whom 50% were male. The average daily dose of rivaroxaban was 17.5 ± 3.5 mg. The average age was 50 ± 16.2 years and the mean body weight was 83 ± 31.8 kg. The time to reach maximum concentration (Tmax) was 1 h and the maximum concentration (Cmax) was 0.96 ± 0.15 μg/mL. The area under the curve (AUCinf) was 14.1 ± 6.6 μg·h/mL. The apparent volume of distribution and clearance were 53.4 L and 2.18 L/h, respectively.
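The parameters above come from noncompartmental analysis of sampled concentration–time points. As an illustration only (the concentrations below are hypothetical, not the study's data; the study itself used PKanalix), a minimal sketch of how Cmax, Tmax, AUC, and apparent clearance are derived from such samples:

```python
import math

# Hypothetical concentrations (ug/mL) at the abstract's sampling times,
# after an assumed 20 mg oral dose. These are NOT the study's measurements.
times = [1, 2, 4, 6, 12]                 # h
conc = [0.96, 0.80, 0.55, 0.38, 0.12]    # ug/mL

# Cmax/Tmax: highest observed concentration and the time it occurred
cmax = max(conc)
tmax = times[conc.index(cmax)]

# AUC(0-last) by the linear trapezoidal rule
auc_last = sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

# Terminal slope (lambda_z) from the last two points, then extrapolate to infinity
lam_z = (math.log(conc[-2]) - math.log(conc[-1])) / (times[-1] - times[-2])
auc_inf = auc_last + conc[-1] / lam_z    # ug*h/mL

# Apparent oral clearance CL/F = dose / AUCinf (dose converted mg -> ug)
cl_f = (20 * 1000) / auc_inf             # mL/h
```

Dedicated tools such as PKanalix automate these same calculations, typically with more robust terminal-slope fitting over several points rather than just the last two.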
Conclusion: In this study, we reported the pharmacokinetic parameters of rivaroxaban in Saudi patients. The findings of this study showed that the clearance of rivaroxaban is much lower than what has been reported in the literature in other populations.
Development of a Prediction Model to Identify the Risk of Clostridium difficile Infection in Hospitalized Patients
Abdulrahman Mhdi Alamri, Abdulrahman Alamri, AlHanoof Bin Abbas, Ekram Al Hassan, Yasser Almogbel
Background: Clostridium difficile infection (CDI) is serious, especially in the elderly and in patients with gut microbiota dysbiosis resulting from antibiotic exposure. This study's objective was to develop a risk prediction model to identify hospitalized patients at risk of CDI who had received at least one dose of systemic antibiotics in a large tertiary hospital.
Methods: This was a retrospective case–control study that included patients hospitalized for more than 2 days who received antibiotic therapy during hospitalization. The study included two groups: patients diagnosed with hospital-acquired CDI (cases) and controls without hospital-acquired CDI. Cases were matched 1:3 with controls by age and sex. Descriptive statistics were used to characterize the study population by comparing cases with controls; continuous variables were expressed as means and standard deviations. A multivariate analysis was performed to identify covariates significantly associated with CDI.
Results: A total of 364 patients were included: 273 in the control group and 91 in the case group. The risk factors for CDI were investigated, and only the significant risks were included in the risk assessment model: age older than 70 years (P = 0.034), chronic kidney disease (P = 0.043), solid organ transplantation (P = 0.021), and lymphoma or leukemia (P = 0.019). A risk score of ≥2 showed the best sensitivity, specificity, and accuracy of 78.02%, 45.42%, and 78.02%, respectively, with an area under the curve of 0.6172.
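To illustrate how the four-factor model above could be applied at the bedside, here is a minimal sketch; the per-factor point weights are not stated in the abstract, so equal one-point weights are an assumption:

```python
# Hypothetical sketch of the four-factor CDI risk score described above.
# Assumption: one point per factor (weights are not given in the abstract).
def cdi_risk_score(age, ckd, transplant, lymphoma_or_leukemia):
    score = 0
    if age > 70:                 # age older than 70 years
        score += 1
    if ckd:                      # chronic kidney disease
        score += 1
    if transplant:               # solid organ transplantation
        score += 1
    if lymphoma_or_leukemia:     # lymphoma or leukemia
        score += 1
    return score

def high_risk(score, threshold=2):
    # A score of >=2 gave the best sensitivity/specificity trade-off in the abstract
    return score >= threshold

# Example: a 75-year-old with chronic kidney disease scores 2 -> flagged high risk
s = cdi_risk_score(75, ckd=True, transplant=False, lymphoma_or_leukemia=False)
```

Such an additive integer score is the usual way case–control risk models are turned into bedside tools, but the actual scoring rule would need to come from the full paper.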
Conclusion: We identified four associated risk factors in the risk prediction model. The tool showed good discrimination that might help predict, identify, and evaluate hospitalized patients at risk of developing CDI.
Epidemiological and Economic Analysis of Early Antiretroviral Therapy with Bictegravir/Emtricitabine/Tenofovir Alafenamide in Kingdom of Saudi Arabia
Asma Mestouri, Jawhar Sarhiri, Evelina Zimovetz, Mohamed Bakry, Ghassan Wali
Background: The aim of this article is to evaluate the potential epidemiological and economic impact of early initiation of bictegravir/emtricitabine/tenofovir alafenamide (B/F/TAF), the only single-tablet regimen recommended for rapid initiation in diagnosed patients, on HIV transmission compared with current antiretroviral therapy (ART) initiation observed in clinical practice in the Kingdom of Saudi Arabia (KSA).
Methods: A previously developed transmission model was adapted to estimate the cumulative HIV infection incidence and the potential cost savings from the number of HIV infections prevented over a 10-year period. The analysis compared early treatment initiation with B/F/TAF (7 days from diagnosis to treatment) vs. current ART practice (29 days). The model used 1-year cycles and comprised a prevalent population of Saudi men and women living with HIV, divided into different health states, each with a different risk of transmission. Infectious individuals contributed to the incidence of new infections in each year via their estimated risk of transmission through sexual contact. The HIV lifetime cost (USD, 2021) included the direct costs of HIV management.
Results: In the base-case analysis, early therapy initiation was estimated to prevent 157 new HIV infections over the next 10 years compared with late initiation with a pool of integrase strand transfer inhibitors, non-nucleoside reverse transcriptase inhibitors, and protease inhibitors. Considering the lifetime costs (SAR 1,724,208 per person) of treating the infections avoided over the next 10 years, the reduction in HIV incidence could result in a potential savings of SAR 269,966,756. According to one-way sensitivity analyses, besides the time horizon, the time for ART initiation was the parameter with the highest impact on HIV incidence.
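As a rough check of the budget-impact arithmetic, multiplying the infections avoided by the per-person lifetime cost gives a figure close to the reported savings; the small gap presumably reflects model adjustments (e.g., discounting) not detailed in the abstract:

```python
# Back-of-envelope check of the reported budget impact (figures from the abstract).
infections_avoided = 157          # new HIV infections prevented over 10 years
lifetime_cost_sar = 1_724_208     # lifetime HIV management cost per person (SAR)

# Simple undiscounted product: ~SAR 270.7 million, close to the
# SAR 269,966,756 savings reported by the model.
undiscounted_savings = infections_avoided * lifetime_cost_sar
```

This kind of sanity check only reproduces the order of magnitude; the published model applies its savings calculation per cycle within the 10-year horizon.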
Conclusion: Rapid initiation of ART in newly diagnosed patients is a high-value strategy for the KSA National Health System. As the model results demonstrate, rapid initiation with B/F/TAF contributes to reducing HIV incidence and related costs.
Prediction of the Vancomycin AUC0–24/MIC Ratio Using the Bayesian Platform: A Retrospective, Single-center, Cross-sectional Study
Manar A. Harbi, Abdullah M. Alzahrani, Anjum Naeem, Rami M. Alzhrani, Sarah A. Alghamdi, Shahid Karim, Ahmed S. Ali, Ghuson Alsenaini, Hani Hasan, Yahya A. Alzahrani
Background: The AUC0–24/MIC ratio is the most accurate way to monitor vancomycin exposure, whereas the trough concentration (Cmin) is not an accurate surrogate. AUC-guided vancomycin dosing and monitoring remains underused in most hospitals in Saudi Arabia, and no previous work has evaluated this practice across the kingdom. The objectives of the current study were to calculate the AUC0–24/MIC ratio using Bayesian dosing software (PrecisePK), to determine the proportion of patients who receive the optimum dose of vancomycin, and to evaluate the accuracy and precision of the Bayesian platform.
Methods: This retrospective study was conducted at King Abdulaziz Medical City, Jeddah. All adult patients treated with vancomycin were included; pediatric patients, critically ill patients requiring ICU admission, patients with acute renal failure or undergoing dialysis, and febrile neutropenic patients were excluded. The AUC0–24/MIC ratio was predicted using the PrecisePK platform, based on the Bayesian principle. The two-compartment model by Rodvold et al. implemented in this platform, together with patients' dosing data, was used to calculate the AUC0–24/MIC ratio and trough level.
Results: Among the 273 patients included, the mean vancomycin AUC0–24/MIC ratio estimated by the posterior model of PrecisePK was 573 ± 199.6 mg·h/L; the model had a bias of 16.8% and a precision of 2.85 mg/L. The target AUC0–24/MIC ratio (400–600 mg·h/L) and measured trough (10–20 mg/L) were documented in 127 (37.1%) and 185 (54%) patients, respectively. Furthermore, the odds of an AUC0–24/MIC ratio >600 mg·h/L were higher in the trough 15–20 mg/L group than in the trough 10–14.9 mg/L group (odds ratio=13.2, P < 0.05).
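For intuition on the target arithmetic, a much simpler steady-state approximation than the platform's two-compartment Bayesian model can be sketched (all values hypothetical): at steady state, AUC0–24 equals the total daily dose divided by clearance.

```python
# Illustrative steady-state approximation of the AUC0-24/MIC target check.
# This is a first-order simplification, not the two-compartment Bayesian
# model (PrecisePK) used in the study; all inputs are hypothetical.
def auc24_per_mic(daily_dose_mg, clearance_l_per_h, mic=1.0):
    auc24 = daily_dose_mg / clearance_l_per_h   # mg*h/L at steady state
    return auc24 / mic

def in_target(ratio, low=400, high=600):
    # 400-600 mg*h/L is the target range cited in the abstract
    return low <= ratio <= high

# Example: 2000 mg/day with a clearance of 4 L/h and MIC of 1 mg/L
# gives an AUC0-24/MIC of 500, within the target range.
r = auc24_per_mic(2000, 4.0)
```

Bayesian platforms refine exactly this quantity by estimating each patient's clearance from measured levels and population priors rather than assuming it.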
Conclusion: The discordance between the AUC0–24/MIC ratio and the measured trough concentration may jeopardize patient safety, and implementation of the Bayesian approach as a workable alternative to the traditional trough method should be considered.
Efficacy of Off-label Use of Direct-acting Oral Anticoagulants Compared with Warfarin in Treatment of Left Ventricular Thrombus
Background: Left ventricular thrombus (LVT) is usually seen in conditions with reduced LV function, including dilated cardiomyopathy and post-myocardial infarction. Warfarin has been the standard anticoagulant for LVT. The stroke guidelines suggest that direct-acting oral anticoagulants (DOACs) could be used in patients with warfarin intolerance. Nevertheless, data on the role of DOACs in the management of LV thrombus, and on which DOAC is most effective, are limited.
Methods: An observational retrospective study was conducted at Prince Sultan Cardiac Center in Riyadh, Saudi Arabia, from January 2016 to 2021. All patients with LVT documented by echocardiography were included; patients without a confirmed LVT or without echocardiographic follow-up were excluded. This study was approved by the Institutional Review Board. The primary endpoint was thrombus resolution and the median time to thrombus resolution (comparing apixaban vs. rivaroxaban vs. warfarin); the secondary endpoint was bleeding. Statistical analysis was conducted using SPSS version 22. Numeric variables were presented as mean ± SD and compared using the Kruskal–Wallis test; percentages and the χ2 test were used for categorical variables. A logistic regression model was used to assess the independent factors affecting resolution. A P-value less than 0.05 was considered significant.
Results: Sixty-six patients were included in this study. The rivaroxaban group had the largest number of participants (n=36), with a mean age of 58 years; the other two groups had similar numbers of participants (apixaban, n=15 and warfarin, n=15), with mean ages of 55 and 52 years, respectively. Resolution was recorded in all three groups; the highest percentage was for apixaban (73.3%), followed by warfarin (66.7%) and rivaroxaban (65.6%), with no significant difference (P=0.86). The median time (months) to thrombus resolution was 25, 11, and 4 for warfarin, rivaroxaban, and apixaban, respectively, a significant difference (P=0.0001). However, the follow-up echocardiogram was done within 2 months in the apixaban group, whereas in the other groups it took more than 2 months. There was no difference in bleeding among the groups.
Conclusion: Thrombus resolution rates were similar between DOACs and warfarin, although resolution occurred significantly earlier with DOACs. Our study provides preliminary evidence that DOAC use is safe and effective in LVT, and these findings could help guide treatment in these patients. A large-scale prospective study is required to measure the effect of DOACs on thrombus resolution.
Evaluating the Health-related Quality of Life (HRQoL) of COVID-19 Survivors: A Cross-sectional Study in Eastern Province of Saudi Arabia
Nousheen Aslam, Fatima Saleem Abodrees, Mawa Hussain, Zahra Mousa Al-Mousa
Background: Little is known about the health-related quality of life (HRQoL) of COVID-19 survivors in Saudi Arabia. The present study aimed to determine the HRQoL of individuals who survived this infection.
Methods: One hundred and fifty responses were recorded through a structured questionnaire to gather information about sociodemographic and clinical variables. EQ-5D-5L was used to collect information about the HRQoL. The Mann–Whitney U-test and Kruskal–Wallis test were used to determine the HRQoL and its determinants (significance level <0.05).
Results: The mean utility score was 0.8 (±0.23). Females had a lower HRQoL than males (P=0.045). Married individuals had better HRQoL than single/never married and divorced/widowed individuals (P=0.010). COVID-19 was transmitted through family members in 48% of the respondents, and their QoL was significantly lower (P=0.01). Expatriates living alone had poor HRQoL (P=0.001). There were 52 unique cases of “No Problem, 11111” (34.7%), whereas one participant had a negative utility score (−0.02, 43551). Respondents who self-isolated during infection and individuals with preexisting respiratory problems had lower HRQoL (P=0.015 and 0.02, respectively). Post-COVID-19 symptoms such as cough, fatigue/lethargy, loss of appetite, anosmia, arthralgia, insomnia, and chest pain were also reported and significantly reduced the HRQoL (P=0.034, 0.001, 0.007, 0.012, 0.005, 0.001, and 0.001, respectively).
Conclusion: COVID-19 infection has affected the well-being of individuals to an unprecedented degree. Knowing the sociodemographic and clinical variables that influence survivors' quality of life can help these individuals, their healthcare providers, and policymakers maintain the HRQoL of COVID-19 survivors.
Comparative Efficacy of Ticagrelor Vs. Clopidogrel on Left Ventricular Remodeling in Acute Coronary Syndrome Patients: A Retrospective Cohort Study
Fawaz Tawhari, Mohammad Zaitoun, Reem Bahmaid
Background: Left ventricular (LV) systolic dysfunction is a major acute myocardial infarction complication. Antiplatelet therapy is recommended for the acute coronary syndrome (ACS) by several clinical guidelines. Limited evidence showed ticagrelor superiority over clopidogrel for LV remodeling after percutaneous coronary intervention (PCI). This study aimed to compare the efficacy of ticagrelor vs. clopidogrel for LV remodeling in ACS patients who underwent PCI and used dual antiplatelet therapy (DAPT).
Methods: A retrospective 10-year cohort study was conducted at a specialized heart center. The study involved all adult patients diagnosed with ACS who underwent PCI and used DAPT. The study outcomes included comparisons of baseline and 1-year post-PCI percentage of changes in left end-systolic volume (LESV), left end-diastolic volume (LEDV), ejection fraction (EF), and brain natriuretic peptide (BNP) levels. The data were collected from the electronic health records system. Comparisons were conducted using Student’s t and χ2 tests for continuous and categorical variables, respectively, with a significance level of 0.05.
Results: One hundred thirty-seven patients were enrolled (50 in the clopidogrel group and 87 in the ticagrelor group). No significant differences were detected in patients’ demographics, comorbidities, medication profiles, and baseline outcome variables, except for significantly older age (65.38 ± 17.04 vs. 59.3 ± 11.35, P=0.027) and more female representation (60% vs. 19.5%, P<0.001) in the clopidogrel group. No significant differences were detected in any of the study outcomes, including percentage of change in LEDV (9.79% for ticagrelor vs. 12.18% for clopidogrel, P=0.48), LESV (18.27% for ticagrelor vs. 26.34% for clopidogrel, P=0.45), EF (0% for ticagrelor vs. 9.09% for clopidogrel, P=0.117), and BNP (82.19% for ticagrelor and 90.77% for clopidogrel, P=0.473).
Conclusion: In this study, despite being administered to significantly older patients, clopidogrel showed efficacy comparable to ticagrelor. Larger randomized controlled trials are needed to confirm these findings.
Seroprevalence of Hepatitis B Virus Infection Among Healthcare Workers in an Academic Tertiary Care Center, Riyadh, Saudi Arabia
Leena Saeed, Muneera Al-Jelafi
Background: Hepatitis B virus (HBV) infection is a serious infection that results in about 820,000 deaths worldwide annually, mostly from its complications, cirrhosis and hepatocellular carcinoma. Healthcare workers (HCWs) are a high-risk population who need regular screening programs and strict implementation of infection control measures to prevent HBV infection.
Objective: This study aimed to assess HCWs' HBV immune status and the institution's compliance with Centers for Disease Control and Prevention recommendations.
Methods: This retrospective cross-sectional study explored HBV immunization status among HCWs at King Saud University Medical City (KSUMC). Data were collected by reviewing the charts of all screened employees between 2019 and 2020. Positive immunity was defined as an anti-HBs titer >10 mIU/mL and negative immunity as an anti-HBs titer <10 mIU/mL; non-responders were those with an anti-HBs titer <10 mIU/mL despite receiving two courses (six doses) of HBV vaccination. Participants included all healthcare workers (physicians, pharmacists, nurses, dentists, dietitians, phlebotomists, and lab technicians) at KSUMC.
Results: Four hundred charts of HCWs were reviewed: 65% were females and 35% were males. Nursing was the predominant profession, making up 46.3% of the participants. The majority of the participants (86.8%) had positive immunity at the time of screening. Only 53 subjects were not immune at the time of screening, of whom 46 (11.5%) developed positive immunity after one course (three doses) of hepatitis B vaccination. Two subjects (0.5%) developed immunity after receiving two courses. Vaccination non-compliance was recorded for three subjects (0.8%). One subject was recorded as a non-responder, with negative immunity after receiving two vaccination series, and one subject was not immune due to a lack of follow-up.
Conclusion: Although the rate of HBV infection has dropped significantly since the introduction of HBV vaccination, the risk of exposure continues among HCWs. It is therefore advisable to strictly implement infection control measures in healthcare settings, improve vaccination programs, comply with the recommended full vaccination course, promote education and awareness, and, most importantly, enforce annual screening programs. We advocate launching national guidelines for the prevention, diagnosis, post-exposure management, and treatment of HBV infection in all at-risk populations to standardize institutional practices and keep healthcare environments safe.