Search Constraints
Filtering by Depositor: "ivanovid@ucmail.uc.edu" and Type: Work
1 - 6 of 6
Search Results
-
- Type:
- Dataset
- Description/Abstract:
- Varieties of International Cyber Strategies (VoICS): Text Analysis of National Cybersecurity Documents is a project that compares and contrasts the three main approaches to conceptualizing national cybersecurity strategies (NSS): deterrence, the norm-based approach (NBA), and cyber persistence engagement (CPE). Scholars and policymakers initially conceptualized NSS in terms of deterrence or the NBA. More recent academic research has demonstrated that these frameworks are inadequate for cyberspace, and as a result CPE emerged as a third option. The first version (1.0) of the VoICS database on National Cybersecurity Strategies focuses on nations in Europe and North America and includes a total of 77 NSS of the states in the North Atlantic area (NATO allies, EU members and Switzerland) released from 2003 until the end of 2023. It consists of 27 variables, including country and strategy identifiers, EU and NATO membership, their respective accession dates, and the total length of the documents. VoICS includes eighteen variables representing different measures of the relative and absolute weights of the three NSS types (deterrence, NBA and CPE).
The text analysis is based on official NSS documents provided by the NATO Cooperative Cyber Defence Centre of Excellence library (2024) and ENISA's interactive map for National Cyber Security Strategies (2023). Both sources rely on voluntary submissions from the member states; unfortunately, some official documents were not available or accessible, or were not listed at all. The authors used various sources and contacts with cyber attachés in Brussels to determine whether any additional strategies had been released and to obtain the missing documents.
The 18 text analysis variables compare and contrast the extent to which different NSS are associated with a specific strategy. They represent different frequency scores based on words, phrases, or words and phrases combined, and these scores are associated with either deterrence, NBA, or CPE in each strategy. The authors generated the respective vocabularies through which each of the three strategic approaches is operationalized and conducted the text analysis using the WordStat text analysis software by Provalis (https://provalisresearch.com/products/content-analysis-software/). A detailed codebook for NSS Dataset 1.0, along with NSS Dictionary 1.0, is included in this collection/repository.
The process of generating the vocabulary associated with the three cybersecurity approaches involved several steps. First, upon reviewing the literature, the authors independently generated a list of words and phrases associated with each type of cybersecurity strategy. Second, the authors compared their lists to determine the degree of overlap in vocabulary: words and phrases that appeared in at least two of the lists were reviewed and, if there was consensus, incorporated into the dictionary. Third, words and phrases that appeared in only one of the lists were reviewed again and, where the authors reached consensus, were also included in the dictionary. Finally, the three vocabularies were updated on several occasions when it was unanimously agreed that additional words or phrases should be included in the analysis. (An illustrative sketch of this kind of dictionary-based frequency scoring follows this record.)
- Creator/Author:
- Millard, Matthew; Kovac, Igor; and Ivanov, Ivan Dinev
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 05/12/2025
- Date Modified:
- 05/12/2025
- Date Created:
- 2025-04-18
- License:
- All rights reserved
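The VoICS frequency scores described above were produced with WordStat; the snippet below is only a minimal illustrative sketch, in Python, of how dictionary-based absolute and relative frequency scores of that kind can be computed. The three vocabularies and the sample sentence are invented placeholders, not entries from NSS Dictionary 1.0.

import re

# Placeholder vocabularies for illustration only; the real dictionary is in the repository.
DICTIONARY = {
    "deterrence": ["deter", "retaliation", "punishment"],
    "nba": ["norms", "confidence building", "international law"],
    "cpe": ["persistent engagement", "defend forward", "initiative"],
}

def frequency_scores(text: str) -> dict:
    """Return absolute and relative counts of dictionary terms in one document.

    Note: simple substring counting is used here for brevity, which is cruder
    than WordStat's tokenization and phrase matching.
    """
    tokens = re.findall(r"[a-z]+", text.lower())  # rough word count for the relative score
    total = max(len(tokens), 1)
    lowered = text.lower()
    scores = {}
    for strategy, terms in DICTIONARY.items():
        count = sum(lowered.count(term) for term in terms)
        scores[strategy] = {"absolute": count, "relative": count / total}
    return scores

if __name__ == "__main__":
    sample = "The strategy seeks to deter adversaries through persistent engagement."
    print(frequency_scores(sample))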
-
- Type:
- Dataset
- Description/Abstract:
- This dataset was generated as part of a research project studying the changing support among European Union (EU) members for the war in Ukraine. It contains the conditions (variables) used to conduct fuzzy-set qualitative comparative analysis (fsQCA) testing five critical conditions that have shaped the change in public opinion: economic growth, democratic rule, distance from the front lines, level of energy dependence on Russia, and trust in social media. The conditions (variables) are:
Num: Case number in the row.
MEMBR: EU member state two- or three-letter abbreviation.
WEALTH: GDP per capita in Euro (measured in purchasing power parities) as reported by Eurostat.
GROWTH: GDP growth in volume based on seasonally adjusted data from Eurostat.
DEMOCR: The overall score of each EU member's democracy index for 2022, drawn from the Economist Intelligence Unit (EIU) 2022 report.
DISTAN: Average distance (in thousands of kilometers) from the geographic center point of the national capital of each EU member state to the south-western and north-eastern tips of the frontline of the war in Ukraine. The western tip of the frontline is taken to be Kinburnsʹka Kosa National Park (46°34'37"N, 31°30'44"E) and the eastern tip is the village of Topoli in Kharkiv Oblast (49°57'52"N, 37°54'31"E).
TRADE: Volume of trade with Russia per capita in thousands of US dollars.
ENERGY: EU energy dependence on Russia as estimated by the European Commission (from 0 to 100 percent) for 2020. Source: Eurostat.
GOVTR: Net trust in the national government (difference between the sum of "fully trust" and "partially trust" responses and the sum of "fully distrust" and "partially distrust" responses).
MEDIATR: Net trust in social media (difference between the sum of "fully trust" and "partially trust" responses and the sum of "fully distrust" and "partially distrust" responses).
ECONSAN: Difference in support for economic sanctions on Russia, Spring 2022-Spring 2024 (difference between the sum of "tend to agree" and "totally agree" with economic sanctions and the sum of "partially disagree" and "totally disagree" responses).
EQUIPS: Difference in support for financial support for providing military equipment for Ukraine, Spring 2022-Spring 2024 (difference between the sum of "tend to agree" and "totally agree" with financial support for equipment and the sum of "partially disagree" and "totally disagree" responses).
HUMSAN: Difference in support for humanitarian support for Ukrainians fleeing the war, Spring 2022-Spring 2024 (difference between the sum of "tend to agree" and "totally agree" with humanitarian support and the sum of "partially disagree" and "totally disagree" responses).
REFUG: Difference in support for welcoming Ukrainian refugees, Spring 2022-Spring 2024 (difference between the sum of "tend to agree" and "totally agree" with welcoming Ukrainian refugees and the sum of "partially disagree" and "totally disagree" responses).
AVCHNG: Difference in the average change in support for Ukraine and for economic sanctions on Russia, Spring 2022-Spring 2024.
WEALTH1: Calibrated score for national wealth (see paper for details).
GROWT1: Calibrated score for economic growth (see paper for details).
DEMOCR1: Calibrated score for democracy (see paper for details).
DISTAN1: Calibrated score for distance (see paper for details).
TRADE1: Calibrated score for trade (see paper for details).
RENERG1: Calibrated score for energy dependence (see paper for details).
GOVTR1: Calibrated score for trust in governance (see paper for details).
RMEDIATR1: Calibrated score for trust in social media (see paper for details).
ECONSAN1: Calibrated score for support for economic sanctions on Russia (see paper for details; calibration the same as AVCHNG3).
MDIASAN1: Calibrated score for media sanctions on Russia (see paper for details; calibration the same as AVCHNG3).
AVCHNG3: Calibrated score for the average change/decline in public opinion (see paper for details).
(An illustrative sketch of fuzzy-set calibration follows this record.)
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 04/27/2025
- Date Modified:
- 04/27/2025
- Date Created:
- 5-01-2024
- License:
- All rights reserved
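The calibrated scores listed above (WEALTH1, GROWT1, and so on) map raw values onto fuzzy-set membership scores; the paper documents the actual anchors and software used. The snippet below is only a minimal sketch of the commonly used "direct method" of calibration with three qualitative anchors (full non-membership, crossover, full membership). The anchor values shown for an energy-dependence condition are invented for illustration and are not taken from the dataset.

import math

def calibrate(value: float, full_out: float, crossover: float, full_in: float) -> float:
    """Map a raw score to fuzzy membership in [0, 1] using three anchors.

    The direct method scales deviations from the crossover so that the full-membership
    anchor corresponds to log-odds of about +3 (membership ~0.95) and the full
    non-membership anchor to about -3 (membership ~0.05).
    """
    if value >= crossover:
        scalar = 3.0 / (full_in - crossover)
    else:
        scalar = 3.0 / (crossover - full_out)
    log_odds = (value - crossover) * scalar
    return math.exp(log_odds) / (1 + math.exp(log_odds))

# Hypothetical anchors for energy dependence (0-100 percent):
# 10 = fully out, 40 = crossover, 80 = fully in.
print(round(calibrate(65.0, 10.0, 40.0, 80.0), 3))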
-
- Type:
- Dataset
- Description/Abstract:
- "Organizational Response to Emerging Threats" is a project that addresses three separate threat areas -- cybersecurity, peacekeeping and energy security. The data collection for cybersecurity and energy security has been completed. As of June 2019, the data collection for peacekeeping is ongoing. The project documents are organized around three topics, reflected in the filenames -- cybersecurity, peacekeeping and energy security. The overall purpose/rationale of this research project is to develop a framework that explains how different international organizations (IOs) respond to various emerging threats in international relations. These threats can vary and include cybersecurity, energy security, food security, environmental security, and others. For the purpose of our study we focus on two major variables explaining organizational response: (1) IOs’ capacity to acquire and deploy organizational assets (also referred to as asset fungibility), and; (2) IOs’ ability to make swift decisions in response to changing internal and external environments. Drawing from primary sources including interviews with NATO and EU officials, we suggest a new model explaining when organizations are better equipped at addressing cyber threats, when they have capacity to response more effectively, and what they could do to improve their organizational responses in this area. The QDR repository contains interviews with policy makers and senior bureaucrats conducted in 2016, 2017, 2018 and 2019 in Brussels, Belgium, and the Hague, the Netherlands. These interviews have been conducted in person or over skype. Approval to conduct interviews has been granted by the University of Cincinnati's IRB (Study ID: 2018-3371.
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 10/20/2023
- Date Modified:
- 10/20/2023
- Date Created:
- 2019-07-22
- License:
- Attribution-ShareAlike 4.0 International
-
- Type:
- Dataset
- Description/Abstract:
- Dataset Summary: This dataset studies the main challenges that students at the University of Cincinnati faced during the transition from face-to-face (f2f) to remote instruction and the resources they used to minimize these adversities. In order to learn about their experiences during this transition, at the end of the Spring 2020 semester I surveyed students enrolled in two Political Science (POL) classes. The results showed that a majority of students struggled with stress caused by moving away from campus and by self-quarantine, leading to deteriorating mental and physical health. Concerns about student health, along with distractions at home, were identified as the top adversities for student well-being. Survey results also showed that educational resources can have a varying impact on student learning in introductory and upper-level courses. For example, lecture notes, PowerPoint presentations and online videos can be better resources for remote instruction in an introductory class, while class meetings via video-conferencing platforms can be the preferred resource of instruction in upper-level courses. Below is the questionnaire used for this study.
Survey Questionnaire: Transition to Remote Instruction During the COVID-19 Crisis
Qualtrics link for POL1080: https://artsciuc.co1.qualtrics.com/jfe/form/SV_bd7cF1OF6eNeYBv
Qualtrics link for POL2074: https://artsciuc.co1.qualtrics.com/jfe/form/SV_3xegnXy4LFSC2t7
1. As you know, the University of Cincinnati has transitioned from face-to-face to remote instruction for the Spring Semester since March 14, 2020 due to COVID-19. Once it was decided to switch to remote instruction, how did you expect this decision would impact your performance in this class? (I thought it would improve my performance; I thought it would impair my performance; I did not think that it would impact my performance; I don't know)
2. Based on your experience with remote instruction, how do you think the new form of instruction impacted your performance in this class? (I did better in this class after we switched to remote teaching; I did worse in this class after we switched to remote teaching; The switch to remote teaching had no impact on my performance; I don't know)
3. Do you agree or disagree with the following statement: "I felt that the instructor in this class provided timely instructions and information about the switch from face-to-face to remote form of content delivery in the class"? (Completely agree; Partially agree; Partially disagree; Completely disagree; Not sure/don't know)
4. Do you agree or disagree with the following statement: "I felt that the instructor in this class cared about my performance in the class once we switched from face-to-face to remote form of content delivery in the class"? (Completely agree; Partially agree; Partially disagree; Completely disagree; Not sure/don't know)
5. Which of the following course resources (if available) helped you ease the transition from face-to-face to remote instruction (check all that apply)? (Online instructional videos created or made available by the instructor; Instructor-led class meetings via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype); Meetings with the instructor via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype) during their office hours; Instructor's lecture notes and presentation materials (e.g. PowerPoint slides); Online quizzes or interactive questions administered via web platforms (e.g. Canvas, Blackboard, Echo 360 or others); Online forums made available for this course; Assigned course readings; Book publisher's online resources (websites, book ancillaries, etc.); Supplemental assistance from teaching assistants (e.g. office hours, online sessions, etc.); Supplemental peer-led review sessions (e.g. Learning Assistant sessions, Supplemental Instruction sessions, etc.); Group activities with peers enrolled in the class (e.g. study sessions via conference platforms); Others (please list) _________)
6. Which one of the following course resources was most helpful to you in the transition from face-to-face to online mode of content delivery (select only one)? (Online instructional videos created or made available by the instructor; Instructor-led class meetings via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype); Meetings with the instructor via a web-conferencing platform (e.g. Webex, Zoom, MS Teams, Skype) during their office hours; Instructor's lecture notes and presentation materials (e.g. PowerPoint slides); Online quizzes or interactive questions administered via web platforms (e.g. Canvas, Blackboard, Echo 360 or others); Online/web discussion forums made available for this course; Assigned course readings; Textbook publisher's online resources (websites, book ancillaries, etc.); Supplemental assistance from teaching assistants (e.g. office hours, online sessions, etc.); Supplemental peer-led review sessions (e.g. Learning Assistant sessions, Supplemental Instruction sessions, etc.); Group activities with peers enrolled in the class (e.g. study sessions via web-conferencing platforms); Others (please list) _________)
7. Which of the following, do you think, negatively impacted your performance in this class during the transition from face-to-face to remote instruction (please select all relevant options)? (I had to move away from campus in the middle of the semester; My physical or mental health deteriorated after we switched to remote instruction; I missed face-to-face interaction with the instructor, the TAs and the undergrad assistant (SI); I did not have a stable and reliable Internet connection at home; I had a lot of distractions at home; I lost my job/income due to the COVID-19 epidemic; I had to take an additional job to support myself and/or my family; Self-quarantine and/or social distancing caused me a lot of stress; The news about the COVID-19 epidemic and concerns about my health and the health of my loved ones caused me a lot of stress; Other (please list) ___________)
8. Which one of the following, do you think, negatively impacted your performance in this class during the transition from face-to-face to remote instruction (please select only one option)? (I had to move away from campus in the middle of the semester; My physical or mental health deteriorated after we switched to remote instruction; I missed face-to-face interaction with the instructor, the TAs and the undergrad assistant (SI); I did not have a stable and reliable Internet connection at home; I had a lot of distractions at home; I lost my job/income due to the COVID-19 epidemic; I had to take an additional job to support myself and/or my family; Self-quarantine and/or social distancing caused me a lot of stress; The news about the COVID-19 epidemic and concerns about my health and the health of my loved ones caused me a lot of stress; Other (please list): ___________)
9. Based on your experience with this course's transition from face-to-face to remote instruction for Spring Semester 2020, what aspects of this transition had the greatest value for you? (Open-ended question)
10. Based on your experience with this course's transition from face-to-face to remote instruction for Spring Semester 2020, what changes would you recommend to ease this transition in the future? (Open-ended question)
11. What is your gender? (Male; Female; Other/prefer not to disclose)
12. What is your major? (Political Science; International Affairs; Interdisciplinary/Cyber Strategy and Policy; Interdisciplinary/Law and Society; Another major (please specify))
13. What is your class level? (First year (freshman); Second year (sophomore); Third year (junior); Fourth year (senior))
14. What is your race or ethnicity? (White; Black or African American; Asian; American Indian or Alaska Native; Native Hawaiian or Pacific Islander; International student; Other)
15. What do you think your grade will be for this course? (A or A-; B+, B or B-; C+, C or C-; D+, D or D-; F; Not sure/don't know)
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 05/14/2020
- Date Modified:
- 05/14/2020
- Date Created:
- 2020-05-13
- License:
- All rights reserved
-
- Type:
- Dataset
- Description/Abstract:
- The NATO and EU Peacebuilding Missions Dataset was created to use fuzzy-set Qualitative Comparative Analysis (fsQCA) as a method for researching how the outcomes of NATO and EU missions are influenced by organizational assets and decision-making in both organizations. The outcomes pertaining to these two sets of missions are intended to measure various aspects of organizational efficacy. There are two groups of variables: condition variables and outcome variables. The next few sections explain how these two groups of variables were generated, what existing sources and datasets were used, and how mission indicators were constructed. See the attached research note for more detailed information.
Condition Sets: Description
By and large, the condition sets measure organizational assets for these NATO and EU missions, as well as patterns in their decision-making process. Two critical organizational assets used for both sets of missions are their annual operational budget and their annual deployed personnel. The dataset also contains two control variables measuring operational legitimacy: the number of contributing nations and the number of UN resolutions passed in relation to the situation in the area of deployment for the duration of the EU or NATO mission.
Operational Duration: Duration of the operation (in months). For ongoing missions and operations, we have used December 31, 2019 as the end date. All data reflect occurrences no later than December 31, 2019.
Type of Operation: Based on their mandate, operations are classified as civilian (coded as 0), military (coded as 1) or hybrid (i.e. with military and civilian components, coded as 0.5).
Annual Operational Budget: Total annual mission budget in USD. Sources include the SIPRI yearbook and peace operations database. In cases of missing data from the SIPRI yearbook, mission factsheets and original data from the mission have been used; this applies to the following missions: AMUK, AVSEC, BAM1, BAM2, CAP1, CAP2, MAM1, NAVF1, NAVF2, TMC1, EUAMI. If data are reported in EUR, the average exchange rate for the duration of the mission has been used to convert the cost. Data have been adjusted to reflect the operational budget over a 12-month period.
Average Annual Mission Personnel: The average total number of personnel/staff supporting the NATO or EU peacebuilding mission per annum. Data have been collected from the SIPRI yearbook based on reports of actual deployments on the ground. In cases where no data have been reported in the SIPRI yearbook/peace operations dataset, mission factsheets and original data from the mission have been used. The data have been averaged and adjusted for a 12-month period.
Days to Launch: The number of days from the time a decision was made by the IO's top decision-making body (the European Council or the NAC) to launch the mission to the time the mission was officially declared "operational." If no declaration that the mission is "fully operational" exists, landmark indicators that the mission is fully operational include a ceremony on the ground marking the beginning of the mission, the appointment of a mission commander, or the first recorded operational presence involving activity on the ground. Sources include official EU and NATO documents announcing the decision to create the peacebuilding operation, as well as official documents, press releases and reports in reliable media outlets (including news agencies) documenting an event that would indicate the mission is "fully operational."
Number of Contributing Nations: The highest reported number of contributing nations for the duration of the NATO or EU peacebuilding operation.
UN Security Council Resolutions: The total number of UN Security Council (UNSC) resolutions relevant to the area of conflict adopted during the NATO or EU mission. When UNSC resolutions are relevant to multiple NATO and EU peacebuilding missions, they are reported for all relevant missions.
Outcome Sets: Description
The outcome sets include various indicators created to measure operational efficacy. They include annual events contributing toward peace, conflict and the mission's functioning, annual fatalities and annual deaths among mission personnel, as well as the annual difference in fatalities. A more detailed description of these indicators follows:
Annual Peace Events: An annual indicator based on events chronologically recorded by the SIPRI yearbook that have contributed to the peace process in the conflict area where the NATO or EU mission has been deployed. Examples of peace events include steps taken to contribute to the peace process (e.g. creation of a buffer zone, cessation of hostilities, meetings intended to achieve a ceasefire or set up the peace process, political events related to or contributing toward the peace process, and the successful conclusion of a peace agreement). They may also include decisions by an international body (e.g. the UN Security Council, UN General Assembly or UN Secretary General), as well as decisions made by the NATO and EU decision-making (D-M) bodies, that contribute toward the peace process in the areas where the mission operates. For ongoing missions, December 31, 2019 is the last date for which annual peace events are recorded.
Annual Conflict Events: An annual indicator based on events chronologically recorded by the SIPRI yearbook that have increased the conflict and the conflict potential in the area where the NATO or EU mission has been deployed. Instances include the resumption of hostilities among warring parties, the occurrence of attacks, clashes, eruptions of violence, the killing of civilians, military and peacemaking personnel, and other violence-related events that contribute toward instability in the mission's area. For ongoing missions, December 31, 2019 is the last date for which annual conflict events are recorded.
Annual Mission-related Events: An annual indicator based on events chronologically recorded by the SIPRI yearbook that measures events related to the functioning of the mission: the decision to launch, the actual launch, implementation, transfer of authority and/or mandate, transformation, and termination of the mission. It also includes events that reflect decisions made by the contributing nations or sponsoring IOs intended to affect the mission's performance (e.g. decisions related to funding, command and control, transformation of the mission mandate and rules, and other similar events). For ongoing missions, December 31, 2019 is the last date for which annual mission-related events are recorded.
Average Annual Fatalities: Reports the average annual number of civilian deaths recorded for the duration of the mission. The data are drawn from the Armed Conflict Database (ACD) managed by the London-based International Institute for Strategic Studies (https://acd.iiss.org/member/datatools.aspx).
Average Annual Mission Casualties: The average annual number of deaths among peacebuilding personnel as reported in the SIPRI yearbook/peace operations database for the duration of the mission. The authors have used discretion to determine accuracy in cases where there are discrepancies in the reported data.
Fatalities Annual Difference: An indicator based on differenced annual data on civilian casualties on the ground for the duration of the mission, calculated as Differenced Fatalities = Σ (Casualties_Y(i-1) - Casualties_Y(i)) / duration of the mission (in years), summed over years i = 2 … n. It is intended to capture improvement of the situation on the ground as a result of the presence of the peacebuilding effort. (An illustrative sketch of this calculation follows this record.)
Condition Sets: Calibration and Rationale
Annual Operational Budget: A budget of USD 5 million or less indicates fully out, while USD 100 million or more indicates fully in. A budget of USD 30 million is the watershed borderline of "neither in, nor out." [5-100 million]
Average Annual Mission Personnel: This indicator draws a distinction between larger, well-resourced missions and smaller missions with limited assets. By and large, missions with 20 personnel or fewer are fully out, while those with 20,000 or more are fully in. The borderline (neither in, nor out) is 130 people.
Days to Launch: The speed with which the decision is taken indicates how decision-making operated in the case of this mission. Decision-making that took 5 days or less is fully out (fully in, if the direction is reversed), while decision-making that took 150 days or more is fully in (fully out, if the direction is reversed). 30 days (1 month) is the "neither in, nor out" border.
Number of Contributing Nations: A control indicator denoting that a high number of contributing nations contributes toward greater legitimacy. 30 or more countries marks fully in, while 5 or fewer nations marks fully out. The "neither fully in, nor fully out" point is 15 nations.
UN Security Council Resolutions: The total number of UNSC resolutions can vary; fully out is at 0 resolutions, while fully in is at 50 or more. Since most of the missions are shorter, "neither fully in, nor fully out" is at 8 UNSC resolutions. [Inductive]
Operational Duration: 1 year (12 months) denotes fully out (i.e. a short-term mission), while 10 years (120 months) denotes fully in; "neither in, nor out" applies to missions lasting 5 years (60 months). In other words, a decade is too long, a year is too short, and five years is in the middle.
Outcome Variables: Calibration and Rationale
Annual Peace Events: Measures the occurrence of peace-related events. 0 events per annum is fully out; 3 events per annum is fully in; 0.8 events is neither in, nor out.
Annual Conflict Events: Measures the occurrence of conflict-related events. 0 events per annum is fully out; 4 events per annum is fully in; 1 event is neither in, nor out.
Annual Mission-related Events: Measures the occurrence of mission-related events. 0 events per annum is fully out; 1 event per annum is fully in; 0.3 events is neither in, nor out.
Average Annual Fatalities: Measures the average number of annual fatalities for the duration of the mission. Cases with 0 fatalities are fully out; cases with 10,000 fatalities are fully in; 1,000 fatalities represents the "neither in, nor out" value.
Fatalities Annual Difference: Measures the average year-to-year difference in the number of fatalities for the duration of the conflict. A value of -50 is fully out (i.e. average growth of casualties by 50 per annum), as this reflects low mission efficacy. A value of 500 is fully in; it indicates high efficacy and denotes an average annual decline of casualties by 500 people. If the average number of casualties remains unchanged, 0 denotes neither in, nor out.
Average Annual Mission Casualties: Measures the average number of annual casualties among mission personnel for the duration of the mission. 0 casualties is fully out; 500 casualties is fully in; 0.5 is neither in, nor out.
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 08/04/2019
- Date Modified:
- 11/08/2023
- Date Created:
- 2019-08-01
- License:
- Open Data Commons Public Domain Dedication and License (PDDL)
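The Fatalities Annual Difference indicator above averages year-to-year declines in fatalities over the mission's duration. The snippet below is only an illustrative sketch of that calculation (not the authors' code); the annual fatality figures are invented placeholders.

def fatalities_annual_difference(annual_fatalities: list[float]) -> float:
    """Average year-to-year decline in fatalities over the mission's duration in years.

    Positive values indicate an average annual decline (higher efficacy);
    negative values indicate an average annual increase.
    """
    if len(annual_fatalities) < 2:
        return 0.0
    declines = [
        annual_fatalities[i - 1] - annual_fatalities[i]
        for i in range(1, len(annual_fatalities))
    ]
    duration_years = len(annual_fatalities)  # one observation per mission year
    return sum(declines) / duration_years

# Hypothetical mission with fatalities falling from 1200 to 300 over four years.
print(fatalities_annual_difference([1200, 900, 500, 300]))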
-
- Type:
- Dataset
- Description/Abstract:
- The aim of this study is to evaluate the impact of interactive student response software (SRS technology) in large introductory classes in Political Science taught at the University of Cincinnati. Getting students engaged in these classes has been one of the main priorities of the College of Arts and Sciences. This study draws on data from Introduction to International Relations offerings from Fall 2012 to Spring 2018, some of which used interactive software while others did not use any software. Additionally, some offerings had an assigned supplemental instructor (SI) while others did not. The overall aim is to evaluate whether these instructional innovations have helped improve student performance in this class. The main hypothesis tested in the study is that the availability of SRS technology tends to improve student performance on exams. The secondary hypothesis is that the availability of more advanced (second-generation) student response technology (such as Echo 360) tends to improve student performance in class in comparison to earlier (first-generation) SRS devices (known as "clickers"). (An illustrative sketch of the kind of group comparison these hypotheses imply follows this record.)
Background and significance
The positive impact of SRS engagement technology on student performance across different disciplines has been well documented in the literature (Marlow et al 2009; Kam and Sommer 2006; Prezler et al 2007 and others). Most of the literature focuses on first-generation student response systems, also known as clickers (Elliott 2003; Riebens 2007; Crossgove and Curan 2008; Shapiro 2009). Some of the studies focus on the use of this technology without a control group (Beavers 2010; DeBourgh 2008; Kennedy and Cutts 2005; Sprague and Dahl 2010), while others discuss how personal response software impacts student performance throughout the whole semester (Evans, 2012). This study differs from existing ones in several ways. First, by collecting data over a 5-year period, not only can we compare groups of students using SRS systems with those who do not, but we can also compare offerings using first-generation SRS technology (e.g. the "clickers") with those using second-generation SRS software (such as Echo 360) that contains more advanced interactive features. Second, the study allows comparison of the SRS impact on different course components and requirements. Third, it evaluates the impact of the student response system in combination with other techniques used in a large classroom, such as supplemental instruction (SI). This new setting offers valuable insights about the impact of different types of SRS technology and other interactive techniques designed to engage students in large classrooms.
Approach and Sources of Records
Records of student performance were collected throughout the whole semester for each student. Demographic information for the students enrolled in the class was collected from the course rosters and from the University of Cincinnati's student information system Catalyst (https://catalyst.uc.edu/). All records are electronic. Those that are not available on Catalyst but are generated as part of the student performance record are currently stored in Excel format by the instructor and researcher on an external USB drive, which is accessible only to the instructor and PI (the same person). No other person has access to the data. The research does not involve the collection of data or other results from individuals that will be submitted to, or held for inspection by, the FDA. No part of the research involves any data that will be provided (in any form) to a pharmaceutical, medical device or biotech company.
- Creator/Author:
- Ivanov, Ivan
- Submitter:
- Ivan Ivanov
- Date Uploaded:
- 07/24/2019
- Date Modified:
- 07/24/2019
- Date Created:
- 2018-11-30
- License:
- All rights reserved
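A minimal, hypothetical sketch of the kind of group comparison the main hypothesis above implies: comparing mean exam scores for offerings with and without SRS technology. The scores below are invented placeholders, not data from the study, and the study's actual statistical approach may differ.

from statistics import mean, stdev

# Illustrative exam scores only (not real data from the course offerings).
with_srs = [82, 78, 91, 85, 88, 79]
without_srs = [75, 80, 72, 84, 77, 73]

print(f"Mean with SRS:    {mean(with_srs):.1f} (sd {stdev(with_srs):.1f})")
print(f"Mean without SRS: {mean(without_srs):.1f} (sd {stdev(without_srs):.1f})")
print(f"Difference in means: {mean(with_srs) - mean(without_srs):.1f} points")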