Search Constraints
Filtering by:
Depositor
"ivanovid@ucmail.uc.edu"
Type
Work
1 - 3 of 3
Search Results
-
- Type:
- Dataset
- Description/Abstract:
- Dataset Summary: This dataset studies the main challenges that students at these institutions faced during the transition from face-to-face (f2f) to remote instruction and the resources they used to minimize these adversities. To learn about their experiences during this transition, I surveyed students enrolled in two Political Science (POL) classes at the end of the Spring Semester. The results showed that the majority of students struggled with stress caused by moving away from campus and self-quarantine, leading to deteriorating mental and physical health. Concerns about student health, along with distraction at home, were identified as the top adversities for student well-being. Survey results also showed that educational resources can have varying impact on student learning in introductory and upper-level courses. For example, lecture notes, PowerPoint presentations and online videos can be better resources for remote instruction in an introductory class, while class meetings via video-conferencing platforms can be the preferred resource of instruction in upper-level courses. Below is the questionnaire used for this study.

Survey Questionnaire: Transition to Remote Instruction During COVID-19 Crisis
Qualtrics Link for POL1080: https://artsciuc.co1.qualtrics.com/jfe/form/SV_bd7cF1OF6eNeYBv
Qualtrics Link for POL2074: https://artsciuc.co1.qualtrics.com/jfe/form/SV_3xegnXy4LFSC2t7

1. As you know, the University of Cincinnati has transitioned from face-to-face to remote instruction for Spring Semester since March 14, 2020 due to COVID-19. Once it was decided to switch to remote instruction, how did you expect this decision to impact your performance in this class?
(I thought it would improve my performance / I thought it would impair my performance / I did not think that it would impact my performance / I don't know)

2. Based on your experience with remote instruction, how do you think the new form of instruction impacted your performance in this class?
(I did better in this class after we switched to remote teaching / I did worse in this class after we switched to remote teaching / The switch to remote teaching had no impact on my performance / I don't know)

3. Do you agree or disagree with the following statement: "I felt that the instructor in this class provided timely instructions and information about the switch from face-to-face to remote form of content delivery in the class"?
(Completely agree / Partially agree / Partially disagree / Completely disagree / Not sure/don't know)

4. Do you agree or disagree with the following statement: "I felt that the instructor in this class cared about my performance in the class once we switched from face-to-face to remote form of content delivery in the class"?
(Completely agree / Partially agree / Partially disagree / Completely disagree / Not sure/don't know)

5. Which of the following course resources (if available) helped you ease the transition from face-to-face to remote instruction (check all that apply)?
(Online instructional videos created or made available by the instructor / Instructor-led class meetings via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype) / Meetings with the instructor via a web-conferencing platform during their office hours / Instructor's lecture notes and presentation materials (e.g. PowerPoint slides) / Online quizzes or interactive questions administered via web platforms (e.g. Canvas, Blackboard, Echo 360 or others) / Online forums made available for this course / Assigned course readings / Book publisher's online resources (websites, book ancillaries, etc.) / Supplemental assistance from teaching assistants (e.g. office hours, online sessions, etc.) / Supplemental peer-led review sessions (e.g. Learning Assistant Sessions, Supplemental Instruction Sessions, etc.) / Group activities with peers enrolled in the class (e.g. study sessions via conference platforms) / Others (please list) _________)

6. Which one of the following course resources was most helpful to you in the transition from face-to-face to online mode of content delivery (select only one)?
(Online instructional videos created or made available by the instructor / Instructor-led class meetings via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype) / Meetings with the instructor via a web-conferencing platform during their office hours / Instructor's lecture notes and presentation materials (e.g. PowerPoint slides) / Online quizzes or interactive questions administered via web platforms (e.g. Canvas, Blackboard, Echo 360 or others) / Online/web discussion forums made available for this course / Assigned course readings / Textbook publisher's online resources (websites, book ancillaries, etc.) / Supplemental assistance from teaching assistants (e.g. office hours, online sessions, etc.) / Supplemental peer-led review sessions (e.g. Learning Assistant Sessions, Supplemental Instruction Sessions, etc.) / Group activities with peers enrolled in the class (e.g. study sessions via web-conferencing platforms) / Others (please list) _________)

7. Which of the following, do you think, negatively impacted your performance in this class during the transition from face-to-face to remote instruction (please select all relevant options)?
(I had to move away from campus in the middle of the semester / My physical or mental health deteriorated after we switched to remote instruction / I missed face-to-face interaction with the instructor, the TAs and the undergrad assistant (SI) / I did not have a stable and reliable Internet connection at home / I had a lot of distraction at home / I lost my job/income due to the COVID-19 epidemic / I had to take an additional job to support myself and/or my family / Self-quarantine and/or social distancing caused me a lot of stress / The news about the COVID-19 epidemic and concerns about my health and the health of my loved ones caused me a lot of stress / Other (please list) ___________)

8. Which of the following, do you think, negatively impacted your performance in this class during the transition from face-to-face to remote instruction (please select only one option)?
(I had to move away from campus in the middle of the semester / My physical or mental health deteriorated after we switched to remote instruction / I missed face-to-face interaction with the instructor, the TAs and the undergrad assistant (SI) / I did not have a stable and reliable Internet connection at home / I had a lot of distraction at home / I lost my job/income due to the COVID-19 epidemic / I had to take an additional job to support myself and/or my family / Self-quarantine and/or social distancing caused me a lot of stress / The news about the COVID-19 epidemic and concerns about my health and the health of my loved ones caused me a lot of stress / Other (please list))

9. Based on your experience with this course's transition from face-to-face to remote instruction for Spring Semester 2020, what aspects of this transition had the greatest value for you? (Open-ended question)

10. Based on your experience with this course's transition from face-to-face to remote instruction for Spring Semester 2020, what changes would you recommend to ease this transition in the future? (Open-ended question)

11. What is your gender?
(Male / Female / Other/prefer not to disclose)

12. What is your major?
(Political Science / International Affairs / Interdisciplinary/Cyber Strategy and Policy / Interdisciplinary/Law and Society / Another major (please specify))

13. What is your class level?
(First year (freshman) / Second year (sophomore) / Third year (junior) / Fourth year (senior))

14. What is your race or ethnicity?
(White / Black or African American / Asian / American Indian or Alaska Native / Native Hawaiian or Pacific Islander / International student / Other)

15. What do you think your grade will be for this course?
(A or A- / B+, B or B- / C+, C or C- / D+, D or D- / F / Not sure/don't know)
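Multiple-choice responses like those above are typically summarized as frequency counts before analysis. A minimal sketch, with invented answers to question 1 (the response strings are taken from the questionnaire; the data itself is hypothetical):

```python
from collections import Counter

# Hypothetical responses to Q1 (expected impact of the switch to remote instruction)
responses = [
    "I thought it would impair my performance",
    "I thought it would impair my performance",
    "I did not think that it would impact my performance",
    "I thought it would improve my performance",
    "I don't know",
]

# Tally each answer option and report its share of all responses
counts = Counter(responses)
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({100 * n / len(responses):.0f}%)")
```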
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 05/14/2020
- Date Modified:
- 05/14/2020
- Date Created:
- 2020-05-13
- License:
- All rights reserved
-
- Type:
- Dataset
- Description/Abstract:
- The NATO and EU Peacebuilding Missions Dataset was created to support fuzzy-set Qualitative Comparative Analysis (fsQCA) as a method of researching how NATO and EU missions' outcomes are influenced by organizational assets and decision-making in both organizations. Outcomes pertaining to these two sets of missions are intended to measure various aspects of organizational efficacy. There are two groups of variables – condition variables and outcome variables. The next few sections explain how these two groups of variables were generated, what existing sources and datasets were used, and how mission indicators were constructed. See the attached research note for more detailed information.

Condition Sets: Description

By and large, the condition sets measure organizational assets for these NATO and EU missions, as well as patterns in their decision-making process. Two critical organizational assets used for both sets of missions are their annual operational budget and their annual deployed personnel. The dataset contains two control variables measuring operational legitimacy – the number of contributing nations and the number of UN resolutions passed in relevance to the situation in the area of deployment for the duration of the EU or NATO mission.

Operational Duration – duration of the operation (in months). For ongoing missions and operations, we have used December 31, 2019 as the end date. All data reflect occurrences no later than December 31, 2019.

Type of Operation – based on their mandate, operations are classified as civilian (coded as 0), military (coded as 1) and hybrid (i.e. with military and civilian components, coded as 0.5).

Annual Operational Budget – total annual mission budget in USD. Sources include the SIPRI yearbook and peace operations database. In cases of missing data from the SIPRI yearbook, mission factsheets and original data from the mission have been used.
This latter technique applies to the following missions: AMUK, AVSEC, BAM1, BAM2, CAP1, CAP2, MAM1, NAVF1, NAVF2, TMC1, EUAMI. If data is reported in EUR, the average exchange rate for the duration of the mission has been used to convert the cost. Data has been adjusted to reflect the operational budget over a 12-month period.

Average Annual Mission Personnel – reflects the average total number of personnel/staff supporting the NATO or EU peacebuilding mission per annum. Sources are drawn from the SIPRI yearbook based on reports of actual deployments on the ground. In cases when no data has been reported in the SIPRI yearbook/peace operations dataset, mission factsheets and original data from the mission have been used. The data has been averaged and adjusted for a 12-month period.

Days to Launch – the number of days from the time a decision is made by the IO's top decision-making body (the European Council or the NAC) to launch the mission to the time the mission is officially declared "operational." If no declaration that the mission is "fully operational" exists, landmark indicators that the mission is fully operational include: a ceremony on the ground marking the beginning of the mission, the appointment of a mission commander, or the first recorded operational presence involving activity on the ground. Sources include official EU and NATO documents announcing the decision to create the peacebuilding operation, as well as official documents, press releases and reports in reliable media outlets (including news agencies) documenting an event that would indicate the mission is "fully operational."

Number of Contributing Nations – highest reported number of contributing nations for the duration of the NATO or EU peacebuilding operation.

UN Security Council Resolutions – total number of UN Security Council (UNSC) resolutions relevant to the area of conflict adopted for the duration of the NATO or EU mission.
In cases when UNSC resolutions are relevant for multiple NATO and EU peacebuilding missions, those have been reported for all relevant missions.

Outcome Sets: Description

Outcome sets include various indicators created to measure operational efficacy. They include annual events contributing toward peace, conflict and the mission's functioning, annual fatalities and annual deaths among mission personnel, as well as the annual difference in fatalities. A more detailed description of these indicators is included below:

Annual Peace Events – an annual indicator based on events chronologically recorded by the SIPRI yearbook that have contributed to the peace process in the conflict area where the NATO or EU mission has been deployed. Examples of peace events include steps taken to contribute to the peace process (e.g. creation of a buffer zone, cessation of hostilities, meetings intended to establish a ceasefire or set up the peace process, political events related to or contributing toward the peace process, and the successful conclusion of a peace agreement). It may also include a decision of an international body (e.g. the UN Security Council, UN General Assembly or UN Secretary General), as well as a decision made by the NATO and EU decision-making bodies, that contributes toward the peace process in the areas where the mission operates. For ongoing missions, December 31, 2017 is the last date for which annual peace events are recorded.

Annual Conflict Events – an annual indicator based on events chronologically recorded by the SIPRI yearbook that have increased the conflict and the conflict potential in the area where the NATO or EU mission has been deployed. Instances include the resumption of hostilities among warring parties, occurrence of attacks, clashes, eruptions of violence, the killing of civilians, military and peacemaking personnel, and other violence-related events that contribute toward instability in the mission's area.
For ongoing missions, December 31, 2017 is the last date for which annual conflict events are recorded.

Annual Mission-related Events – an annual indicator based on events chronologically recorded by the SIPRI yearbook that measures events related to the functioning of the mission – the decision to launch, the actual launch, implementation, transfer of authority and/or mandate, transformation and termination of the mission. It also includes events that reflect decisions made by the contributing nations or sponsoring IOs intended to impact the mission's performance (e.g. decisions related to funding, control and command, transformation of the mission mandate and rules, and other similar events). For ongoing missions, December 31, 2017 is the last date for which annual mission-related events are recorded.

Average Annual Fatalities – reports the average annual number of civilian deaths recorded for the duration of the mission. The data is drawn from the Armed Conflict Dataset (ACD) managed by the London-based International Institute for Strategic Studies ( https://acd.iiss.org/member/datatools.aspx).

Average Annual Mission Casualties – average annual number of deaths among peacebuilding personnel as reported in the SIPRI yearbook/peace operations database for the duration of the mission. The authors have used discretion to determine accuracy in cases where there are discrepancies in the reported data.

Fatalities Annual Difference – an indicator of differenced annual data on civilian casualties on the ground for the duration of the mission. The indicator is calculated as follows: Differenced Fatalities = Σ(Casualties_Y1 − Casualties_Y2, …, Casualties_Y(n−1) − Casualties_Yn) / duration of the mission (in years). It is intended to capture improvement of the situation on the ground as a result of the presence of the peacebuilding effort.
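The Fatalities Annual Difference formula can be sketched as a short computation. A minimal Python sketch, assuming a hypothetical function name and invented fatality figures; a positive result means fatalities are falling over the mission's lifetime:

```python
def fatalities_annual_difference(annual_fatalities, duration_years):
    """Average year-to-year decline in fatalities over the mission's duration.

    Positive values indicate falling fatalities (higher mission efficacy);
    negative values indicate growing fatalities.
    """
    # Sum of (previous year - next year) differences, per the dataset's formula
    total_decline = sum(
        prev - nxt for prev, nxt in zip(annual_fatalities, annual_fatalities[1:])
    )
    return total_decline / duration_years

# Hypothetical mission: fatalities fall from 1200 to 600 over four years
print(fatalities_annual_difference([1200, 1000, 800, 600], 4))  # → 150.0
```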
Condition Sets: Calibration and Rationale

Annual Operational Budget – a budget of USD 5 million or less indicates fully out, while USD 100 million or more indicates fully in. A budget of USD 30 million is the watershed borderline of "neither in, nor out." [5–100 million]

Average Annual Mission Personnel – this indicator draws a distinction between larger, well-resourced missions and smaller missions with limited assets. By and large, missions with 20 personnel or fewer are fully out, while those with 20,000 or more are fully in. The borderline (neither in, nor out) is 130 people.

Days to Launch – the speed with which the decision is taken indicates how decision-making operated in the case of this mission. Decision-making that took 5 days or fewer should be fully out (fully in, if the direction of the set is reversed), while 150 days or more should be fully in (fully out, respectively). 30 days (1 month) should be the "neither in, nor out" border.

Number of Contributing Nations – a control indicator denoting that a higher number of contributing nations contributes toward greater legitimacy: 30 or more countries marks fully in, while 5 or fewer nations marks fully out. The "neither fully in, nor fully out" point is at 15 nations.

UN Security Council Resolutions – the total number of UNSC resolutions can vary; fully out is at 0 resolutions, while fully in is at 50 or more. Since most of the missions are shorter, "neither fully in, nor fully out" is set at 8 UNSC resolutions. [Inductive]

Operational Duration – 1 year (12 months) denotes fully out (i.e. a short-term mission), while 10 years (120 months) denotes fully in; "neither in, nor out" applies to missions lasting 5 years (60 months). In other words, a decade is too long, a year is too short, and five years is in the middle.

Outcome Variables: Calibration and Rationale

Annual Peace Events – this variable measures the occurrence of peace-related events: 0 events per annum is fully out; 3 events per annum is fully in. 0.8 events is neither in, nor out.
Annual Conflict Events – this variable measures the occurrence of conflict-related events: 0 events per annum is fully out; 4 events per annum is fully in. 1 event is neither in, nor out.

Annual Mission-related Events – this variable measures the occurrence of mission-related events: 0 events per annum is fully out; 1 event per annum is fully in. 0.3 events is neither in, nor out.

Average Annual Fatalities – this set measures the average number of annual fatalities for the duration of the mission. Cases with 0 fatalities are fully out; cases with 10,000 fatalities are fully in. 1,000 fatalities represents the "neither in, nor out" value.

Fatalities Annual Difference – this indicator measures the average year-to-year difference in the number of fatalities for the duration of the conflict. −50 casualties is fully out (i.e. average growth of casualties by 50 per annum), as this reflects low mission efficacy. 500 is fully in; this number indicates high efficacy, denoting an average annual decline of casualties by 500 people. If the average number of casualties remains unchanged, then 0 denotes neither in, nor out.

Average Annual Mission Casualties – this indicator measures the average number of annual casualties for the duration of the mission. 0 casualties is fully out; 500 casualties is fully in. 0.5 is neither in, nor out.
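The three-anchor scheme used throughout the calibration above (fully out, crossover, fully in) corresponds to the direct calibration method common in fsQCA work, which maps raw values onto fuzzy membership scores via a logistic function. A minimal sketch, assuming the conventional ±3 log-odds at the fully-in and fully-out anchors (membership ≈ 0.95 and 0.05); the function name is hypothetical, and the anchor values are taken from the Annual Operational Budget set above:

```python
import math

def calibrate(value, full_out, crossover, full_in):
    """Direct-method fsQCA calibration: map a raw value to a fuzzy
    membership score in [0, 1] using three qualitative anchors."""
    if value >= crossover:
        # Scale so that `full_in` maps to log-odds +3 (membership ~0.95)
        log_odds = 3.0 * (value - crossover) / (full_in - crossover)
    else:
        # Scale so that `full_out` maps to log-odds -3 (membership ~0.05)
        log_odds = -3.0 * (crossover - value) / (crossover - full_out)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Annual Operational Budget set: fully out at USD 5m, crossover 30m, fully in 100m
print(round(calibrate(30e6, 5e6, 30e6, 100e6), 2))   # 0.5  (neither in, nor out)
print(round(calibrate(100e6, 5e6, 30e6, 100e6), 2))  # 0.95 (fully-in anchor)
```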
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 08/04/2019
- Date Modified:
- 10/15/2021
- Date Created:
- 2019-08-01
- License:
- Open Data Commons Public Domain Dedication and License (PDDL)
-
- Type:
- Dataset
- Description/Abstract:
- The aim of this study is to evaluate the impact of interactive student response software (SRS technology) in large introductory Political Science classes taught at the University of Cincinnati. Getting students engaged in these classes has been one of the main priorities of the College of Arts and Sciences. This study draws on data from Introduction to International Relations offerings from Fall 2012 to Spring 2018, some of which have used interactive software while others have not. Additionally, some offerings have had an assigned supplemental instructor (SI) while others have not. The overall aim is to evaluate whether these instructional innovations have helped improve student performance in this class. The main hypothesis tested during the study is that the availability of SRS technology tends to improve student performance during exams. The secondary hypothesis is that the availability of more advanced (second-generation) student response technology (such as Echo 360) tends to improve student performance in class in comparison to earlier (first-generation) SRS devices (known as "clickers").

Background and significance

The positive impact of SRS engagement technology on student performance across different disciplines has been well documented in the literature (Marlow et al 2009; Kam and Sommer 2006; Prezler et al 2007 and others). Most of the literature focuses on first-generation student response systems, also known as clickers (Elliott 2003; Riebens 2007; Crossgove and Curan 2008; Shapiro 2009). Some of the studies focus on the use of this technology without a control group (Beavers 2010; DeBourgh 2008; Kennedy and Cutts 2005; Sprague and Dahl 2010), while others discuss how personal response software impacts student performance throughout the whole semester (Evans, 2012). This study differs from existing ones in several ways.
First, by collecting data over a 5-year period, we can compare not only groups of students using SRS systems with those who do not, but also offerings using first-generation SRS technology (e.g. the "clickers") with those using second-generation SRS software (such as Echo 360) that contains more advanced interactive features. Second, the study allows comparison of the SRS impact on different course components and requirements. Third, it evaluates the impact of the student response system in combination with other techniques used in a large classroom, such as supplemental instruction (SI). This new setting offers valuable insights about the impact of different types of SRS technology and other interactive techniques designed to engage students in large classrooms.

Approach and Source of Records

Records of student performance were collected throughout the whole semester for each student. Demographic information for the students enrolled in the class was collected from the course rosters and from the University of Cincinnati's student information system Catalyst ( https://catalyst.uc.edu/). All records are electronic. Those that are not available on Catalyst but are generated as part of the student performance records are currently stored in Excel format by the instructor and researcher on an external USB drive which is accessible only to the instructor and PI (the same person). No other person has access to the data. The research does not involve the collection of data or other results from individuals that will be submitted to, or held for inspection by, the FDA. No part of the research involves any data that will be provided (in any form) to a pharmaceutical, medical device or biotech company.
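The main hypothesis (that SRS availability improves exam performance) reduces to a two-sample comparison of exam scores between SRS and non-SRS offerings. A minimal sketch using Welch's t-statistic, which does not assume equal variances between groups; the scores below are invented for illustration and are not the study's data:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t-statistic for a difference in means."""
    na, nb = len(sample_a), len(sample_b)
    # Standard error from the two sample variances (unequal variances allowed)
    se = (variance(sample_a) / na + variance(sample_b) / nb) ** 0.5
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical exam scores from an SRS offering vs. a non-SRS offering
srs = [82, 75, 90, 68, 88, 79]
no_srs = [70, 66, 81, 60, 74, 72]
t = welch_t(srs, no_srs)
print(f"t = {t:.2f}")  # a positive t favors the SRS group
```

In practice one would pair the statistic with a p-value (e.g. via `scipy.stats.ttest_ind` with `equal_var=False`) and control for the demographic covariates the study collects.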
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 07/24/2019
- Date Modified:
- 07/24/2019
- Date Created:
- November 30, 2018
- License:
- All rights reserved