This list contains the titles and publication years of 599 articles from two archaeology journals, Ancient Mesoamerica and Latin American Antiquity, that contain the term 'bone'. The articles in this list were used as the dataset to generate LDA topic models for related research.
Six topic models were generated using Latent Dirichlet Allocation (LDA), an algorithm that considers the probability of words co-occurring in a document given a collection of documents. The collection of documents on which these particular models are based comprises 599 articles containing the term 'bone' from two archaeology journals, Ancient Mesoamerica and Latin American Antiquity.
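The modeling step described above can be illustrated with a minimal, pure-Python collapsed Gibbs sampler for LDA. The toy corpus, hyperparameters, and function names below are illustrative sketches, not the actual model or dataset used in this research:

```python
import random

def lda_gibbs(docs, n_topics, alpha=0.1, beta=0.01, iters=100, seed=0):
    """Collapsed Gibbs sampling for LDA over pre-tokenized documents."""
    rng = random.Random(seed)
    vocab = sorted({w for doc in docs for w in doc})
    V = len(vocab)
    widx = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * n_topics for _ in docs]        # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]    # topic-word counts
    nk = [0] * n_topics                         # tokens per topic
    z = []                                      # topic assignment per token
    for d, doc in enumerate(docs):              # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][widx[w]] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):                      # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k, wi = z[d][i], widx[w]
                ndk[d][k] -= 1; nkw[k][wi] -= 1; nk[k] -= 1
                # p(z = t | everything else): doc preference * topic-word fit
                weights = [(ndk[d][t] + alpha) * (nkw[t][wi] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][wi] += 1; nk[k] += 1
    return vocab, ndk, nkw
```

On a toy corpus mixing, say, osteology-flavored tokens ('bone', 'burial') with agriculture-flavored ones ('maize', 'field'), the sampler tends to separate the two vocabularies into distinct topics; production models would instead use a dedicated library and the full 599-article corpus.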
This webinar was a part of the Data and Computation Science Series and one of five webinars focused on the Publishing Lifecycle of Data. It occurred on June 29, 2020, at 1:00 pm EDT.
Presenter Bio: Geoffrey Pinski is the Assistant Vice President for Technology Transfer in the University of Cincinnati's Office of Innovation. Housed in UC's 1819 Innovation Hub, Geoffrey leads the team responsible for identifying and commercializing the research and innovations of UC's faculty, staff, and students. Geoffrey rose through the ranks, holding nearly every position along the way, starting as an extern during law school. Under his leadership, the office has set records for invention disclosures, licenses, and startups. As the President of the Ohio Technology Transfer Officers Council, he helped develop the Ohio IP Promise, a promise by all 14 state institutions and 2 of the private institutions in Ohio to provide a unified process for commercialization.
Session Description: Data is a loaded term; it covers everything from raw numbers to software code. Come learn more about the intersection of Intellectual Property and data: how to protect data while sharing it, and how and when commercialization might be an avenue. Finally, learn what resources are available to help you navigate the waters of data and Intellectual Property.
There are both a PowerPoint slide deck and an MP4 recording of the session.
This webinar was a part of the Data and Computation Science Series and one of five webinars focused on the Publishing Lifecycle of Data. It occurred on July 13, 2020, at 2:00 pm EDT.
Jeffrey Layne Blevins (PhD) is Head of the Journalism Department at the University of Cincinnati and editor of Democratic Communiqué. His scholarly focus is the political economy of U.S. media industries, and his most recent research includes data visualizations of social media activity involving social justice issues and the spread of misinformation on Twitter. The Communiqué is the official publication of the Union for Democratic Communications.
Victoria Carr (PhD) - Professor of Early Childhood Education/Human Development and Executive Director of the Arlitt Center for Education, Research, and Sustainability at the University of Cincinnati, conducts research related to play and learning environments, teacher pedagogies, and children’s experiences in nature. Her research on nature playscapes and STEM education has been supported by the US National Science Foundation. She serves as Co-Editor for Children, Youth & Environments, co-chair of the Leave No Child Inside Greater Cincinnati Collaborative, and as a Board of Directors member for Cincinnati Nature Center. She is an advocate for mindful, sustainable and child-friendly communities.
Theresa Culley (PhD) - is a Professor and Head of the Department of Biological Sciences. As a plant biologist, she co-founded and currently serves as Editor-in-Chief of Applications in Plant Sciences, an online methods journal published by the Botanical Society of America in association with Wiley Publishing. The journal highlights novel methods in all areas of the plant sciences, serving established professionals as well as junior researchers around the world.
Steven Lange - Director, graduated from Heidelberg College in Tiffin, Ohio with a Bachelor of Science degree in Biology. Steve became our director in 2013. He has over 25 years of experience in the leather industry, including tannery, finishing, and automotive cutting/wrapping operations. In addition to continually growing our roster of clients, he has taught over 200 students in our various classes. His knowledge of leather testing procedures and processes is unrivaled. In his free time, he volunteers for the Leader Dogs for the Blind organization, is the editor of the JALCA (Journal of the American Leather Chemists Association) and enjoys spending time with his family and dogs.
Publications have long been the currency of academia, and the first publication can be the hardest. Today's scholarly articles are more than PDFs and can include multimedia supplemental materials such as raw or additional data, videos, interactive maps, and other components of your scholarship process. In this one-hour web session, UC faculty who are journal editors will discuss how to:
- identify the right journal for your work
- avoid predatory journals
- maximize your research impact through altmetrics and data publishing
- increase your understanding of the publishing process through opportunities such as being a guest editor on a special issue or serving as a reviewer
This event is free and open to all seeking to publish their scholarship and maximize its impact.
This webinar was a part of the Data and Computation Science Series and one of five webinars focused on the Publishing Lifecycle of Data. It occurred on July 27, 2020, at 2:00 pm EDT.
Due to technical difficulties, the presentation starts at 8:47.
The presenter was Claudio Aspesi, Senior Research Analyst. He joined Sanford C. Bernstein & Co., LLC, in 2004 covering European media stocks. Previously he was Global Senior Vice President of Strategy at EMI Music and was responsible for defining the company’s business model as the music industry entered the digital age. Before joining EMI Music in 2002, Mr. Aspesi was a member of the executive team at Airclic, an Internet infrastructure company, and prior to that a Principal at McKinsey and Co., working with many leading media and entertainment companies. Mr. Aspesi graduated with the highest honors from Universita Luigi Bocconi, Milan, with a Laurea in Economia Aziendale.
Session Description - Open data and metadata - opportunities, risks, and possible actions
Research data is at the core of what universities do. Its value to researchers is, of course, paramount, and open science offers significant benefits to the scientific community. But this data, and the attached metadata, are increasingly valuable for third parties as well. We will discuss how research data and metadata increasingly overlap with all the other data produced by academic institutions, how they are becoming increasingly valuable outside the academic community, and how they could become even more valuable in the future. The collection, analysis, synthesis, and preservation of data and metadata, however, pose significant issues as well; for example, data can be and is being used to evaluate individuals (with the biases implicit in developing algorithms to analyze them). More broadly, the collection and analysis of data raise privacy and academic freedom concerns, as does the lack of transparency and accountability of third-party users. Ultimately, the deployment of data analytics and Artificial Intelligence tools should fit with the broader values of the academic community, such as equity and sustainability; whether it does so is controversial.
In addition to the need to establish principles for the use of data analytics and Artificial Intelligence, there are significant ethical questions that need to be addressed and that pose real challenges, as well as questions about how to ensure the long-term preservation of data and metadata.
We will close the presentation with a look at possible steps that the academic community ought to take to address all these issues. We hope that a discussion will follow, in order to address questions and issues, as well as to gather points of view from participants.
This analytical paper asks, does the One-China policy shape the People’s Republic of China’s foreign policy? This paper begins by briefly defining the One-China policy and situating it in the respective histories of China and its current incarnation as the People’s Republic of China (PRC). Then, after untangling the often muddled classifications of soft, sharp, and hard power, the question is interrogated in the context of each class of power (Nye, 2004; Nye, 2011; Nye, 2018; Raby, 2019; Walker & Ludwig, 2017). This analytical essay concludes that the PRC does employ predominantly sharp and hard power strategies that are heavily influenced by the One-China policy.
The current debates revolving around 5G and Huawei, and how they are resolved, are highly visible indicators of the technology-based shifts in the global order that are setting the tone for the 21st century. Currently, it seems that many in the US and the PRC are operating with Cold War and Thucydides Trap paradigms and a zero-sum mentality. At least in the case of 5G technology, the UK seems to have taken a more nuanced approach.
This article comes as the UK prepares its new National Cyber Security Strategy, reviewing the 5G and cyber security debates surrounding Huawei in a highly interdisciplinary manner, and directing readers to a rich variety of resources. In addition to its analysis of issues and solutions often absent from the discourse, this article’s feature contribution is the argument that the UK can be more than an example of a middle way. Specifically, if the UK scales up and internationalizes its Huawei Cyber Security Evaluation Center, perhaps by creating an International Cyber Security Evaluation Center, it can lead its allies and the world in 5G, 6G, cybersecurity, and international relations, filling a vital leadership vacuum.
Matrices of DNA sequences used to generate the phylogeny of Aniba rosaeodora and related species (Lauraceae) presented in the manuscript entitled "Chemical and genotypic variations in Aniba species from the Amazonian forest".
This study is the first of a series of studies, collectively embodying a multiphase mixed methods design. The overall objective of these studies is to explore and address a variety of issues and features of the discipline of economics, particularly as they relate to and represent past, present, and future factors of globalization, education, citizenship, and society. This is done by collecting and analyzing data on numerous aspects of the undergraduate economics curriculum, economics as a discipline, and economics as applied in the real world.
The overall purpose of these studies is to inform ongoing debates concerning the future of the discipline of economics and how it is taught, by examining and creating paradigms and methods that may be of aid. Additionally, these studies collectively aim to outline, and in small ways develop, potential technological and organizational solutions for detailed longitudinal curriculum tracking. The frameworks employed and developed in these studies may eventually be scaled and adapted for all sorts of curricula. Ideally, the completion of this study's overall objective yields practical insights and tools that empower faculty and departments, in economics and eventually in general, to better understand and design their own curriculum.
This immediate study fills gaps in and updates data on the curriculum of undergraduate economics majors in U.S. institutions, while also establishing a baseline data set for future studies to build on. A qualitative census methodology is adapted and employed to explore how various institutional and program factors relate to certain types of major program requirements. Descriptive statistics are used for analysis, primarily to allow for comparisons to previous studies. In sum, the purpose of the data collected and analyzed in this census is to give a glimpse into the current state of the undergraduate economics curriculum in the U.S., and to inform the qualitative, quantitative, and transformative studies that are to follow in this multiphase series.
This is a preprint of a paper to be submitted that demonstrates that: (1) many important food allergens (eggs, milk, peanuts, tree nuts) induce the unfolded protein response (UPR) in intestinal epithelial cells; (2) induction of the UPR, in turn, stimulates the expression of pro-Th2 cytokines (IL-25, IL-33, TSLP) that are required for the induction of food allergy; (3) egg allergy is suppressed in mouse models by the UPR inhibitor metformin (a drug widely used to treat diabetes mellitus); and (4) metformin appears to have a protective effect in humans who have alpha-gal syndrome, a form of food allergy.
Shortly after the comparative analysis of Codding et al. was published, I prepared a comment on the article that I submitted for publication. In response to feedback from the editors, I eventually revised the manuscript substantially. That revised version has now been published. In this paper, I share the original submission of the comment, which focuses on important considerations for future studies of risk-sensitive foraging. Meanwhile, Codding and his colleagues have published a response to my comment. They exhibit some confusion about my position, which they describe as "paradoxical." In a reply to their response, I have therefore added some clarifying remarks at the end of this paper.
The NATO and the EU Peacebuilding Missions Dataset was created for fuzzy set Qualitative Comparative Analysis (fsQCA), a method of researching how NATO and EU missions' outcomes are influenced by organizational assets and decision-making in both organizations. Outcomes pertaining to these two sets of missions are intended to measure various aspects of organizational efficacy. There are two groups of variables: condition variables and outcome variables. In the next few sections, we explain how these two groups of variables were generated, what existing sources and datasets were used, and how mission indicators were generated. See the attached research note for more detailed information.
Condition Sets: Description
By and large, the condition sets that have been generated measure organizational assets for these NATO and EU missions, as well as patterns in their decision-making processes. Two critical organizational assets used for both sets of missions are their annual operational budget and their annual deployed personnel. The dataset contains two control variables measuring operational legitimacy: the number of contributing nations and the number of UN resolutions passed in relevance to the situation in the area of deployment for the duration of the EU and NATO mission.
Operational Duration – duration of the operation (in months). For ongoing missions, we have used December 31, 2018 as the end date. All data reflect occurrences no later than December 31, 2018.
Type of Operation – based on their mandate, operations are classified as civilian (coded as 0), military (coded as 1) and hybrid (i.e. with military and civilian components, coded as 0.5).
Annual Operational Budget – total annual mission budget in USD. Sources include the SIPRI yearbook and peace operations database. In cases of missing data from the SIPRI yearbook, mission factsheets and original data from the mission have been used. This latter technique applies to the following missions: AMUK, AVSEC, BAM1, BAM2, CAP1, CAP2, MAM1, NAVF1, NAVF2, TMC1, EUAMI. If data is reported in EUR, the average exchange rate for the duration of the mission has been used to convert the cost. Data has been adjusted to reflect the operational budget over a 12-month period.
Average Annual Mission Personnel – reflects the average total number of personnel/staff supporting the NATO or EU peacebuilding mission per annum. Data have been collected from the SIPRI yearbook based on reportings of actual deployments on the ground. In cases when no data has been reported in the SIPRI yearbook/peace operations dataset, mission factsheets and original data from the mission have been used. The data has been averaged and adjusted for a 12-month period.
Days to Launch – describes the number of days needed from the time a decision has been made by the IO top decision-making body (the European Council and NAC) to launch the mission to the time that the mission is officially declared "operational." If no declaration that the mission is "fully operational" exists, landmark indicators that the mission is fully operational include: a ceremony on the ground marking the beginning of the mission, the appointment of a mission commander, or the first recorded operational presence involving activity on the ground. Sources include official EU and NATO documents announcing the decision to create the peacebuilding operation, as well as official documents, press releases, and reports in reliable media outlets (including news agencies) documenting an event that would indicate the mission is "fully operational."
Number of Contributing Nations – the highest reported number of contributing nations for the duration of the NATO and the EU peacebuilding operation.
UN Security Council Resolutions – total number of UN Security Council (UNSC) resolutions relevant to the area of conflict adopted for the duration of the NATO and the EU mission. In cases when UNSC resolutions are relevant to multiple NATO and EU peacebuilding missions, they have been counted toward all relevant missions.
Outcome Sets: Description
Outcome sets include various indicators created to measure operational efficacy. They include annual events contributing toward peace, conflict and the mission’s functioning, annual fatalities and annual deaths among mission personnel, as well as annual difference in fatalities. A more detailed description of these indicators is included below:
Annual Peace Events – an annual indicator based on events chronologically recorded in the SIPRI yearbook that have contributed to the peace process in the conflict area where the NATO and EU missions have been deployed. Examples of peace events include steps taken to contribute to the peace process (e.g., the creation of a buffer zone, the cessation of hostilities, meetings intended to establish a ceasefire or set up the peace process, political events related to or contributing toward the peace process, and the successful conclusion of a peace agreement). They may also include decisions of an international body (e.g., the UN Security Council, UN General Assembly, or UN Secretary General), as well as decisions made by the NATO and EU decision-making bodies that contribute toward the peace process in the areas where the mission operates. For ongoing missions, December 31, 2017 is the last date when annual peace events are recorded.
Annual Conflict Events – an annual indicator based on events chronologically recorded in the SIPRI yearbook that have increased the conflict, or the conflict potential, in the area where the NATO and EU missions have been deployed. Instances include the resumption of hostilities among warring parties, the occurrence of attacks, clashes, eruptions of violence, the killing of civilians and of military and peacemaking personnel, and other violence-related events that contribute toward instability in the mission's area. For ongoing missions, December 31, 2017 is the last date when annual conflict events are recorded.
Annual Mission-related Events – an annual indicator based on events chronologically recorded in the SIPRI yearbook that measures events related to the functioning of the mission: the decision to launch, the actual launch, implementation, transfer of authority and/or mandate, transformation, and termination of the mission. It also includes events that reflect decisions made by the contributing nations or sponsoring IOs intended to impact the mission's performance (e.g., decisions related to funding, control and command, transformation of the mission mandate and rules, and other similar events). For ongoing missions, December 31, 2017 is the last date when annual mission-related events are recorded.
Average Annual Fatalities – this indicator reports the average annual number of civilian deaths recorded for the duration of the mission. The data is drawn from the Armed Conflict Dataset (ACD) managed by the London-based International Institute for Strategic Studies (https://acd.iiss.org/member/datatools.aspx).
Average Annual Mission Casualties – average annual number of deaths among peacebuilding personnel as reported in the SIPRI yearbook/peace operations database for the duration of the mission. The authors have used discretion to determine accuracy in cases where there are discrepancies in the reported data.
Fatalities Annual Difference – an indicator of differenced annual data on civilian casualties on the ground for the duration of the mission. The indicator is calculated as follows: Differenced Fatalities = Σ(Casualties_Y1 − Casualties_Y2, …, Casualties_Y(n−1) − Casualties_Yn) / duration of the mission (in years). It is intended to capture improvement of the situation on the ground as a result of the presence of the peacebuilding effort.
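The calculation can be sketched in a few lines of Python, assuming annual fatality counts are supplied in chronological order (the function and variable names are illustrative, not from the dataset's codebook):

```python
def differenced_fatalities(annual_fatalities, duration_years):
    """Average year-to-year decline in civilian fatalities.

    `annual_fatalities` lists the recorded fatalities for each calendar
    year of the mission, in chronological order. A positive result means
    fatalities fell on average from one year to the next (higher efficacy).
    """
    diffs = [annual_fatalities[i] - annual_fatalities[i + 1]
             for i in range(len(annual_fatalities) - 1)]
    return sum(diffs) / duration_years
```

For example, a three-year mission with 1,200, 900, and 700 recorded fatalities yields (300 + 200) / 3, an average annual decline of roughly 167.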
Condition Sets: Calibration and Rationale
Annual Operational Budget – the mission budget reflects resources: USD 5 million or less indicates fully out, while USD 100 million or more indicates fully in. A budget of USD 30 million is the watershed borderline of "neither in, nor out." [5-100 million]
Average Annual Mission Personnel – this indicator draws a distinction between larger, well-resourced missions and smaller missions with limited assets. By and large, missions with 20 personnel or fewer are fully out, while those with 20,000 or more are fully in. The borderline (neither in, nor out) is 130 people.
Days to Launch – the speed with which the decision is taken indicates how decision-making operated in the case of this mission. Decision-making that took 5 days or less is fully out (in, with the direction of calibration reversed), while decision-making that took 150 days or more is fully in (out, with the direction reversed). 30 days (1 month) is the neither in, nor out border.
Number of Contributing Nations – a control indicator that denotes how a high number of contributing nations contributes toward greater legitimacy: 30 or more countries marks fully in, while 5 or fewer nations marks fully out. The "neither fully in, nor fully out" point is at 15 nations.
UN Security Council Resolutions – the total number of UNSC resolutions can vary; fully out is at 0 resolutions, while fully in is at 50 or more. Since most of the missions are shorter, neither fully in, nor fully out is at 8 UNSC resolutions. [Inductive]
Operational Duration – 1 year (12 months) denotes fully out (i.e., a short-term mission), while 10 years (120 months) denotes fully in; neither in, nor out is for missions lasting 5 years (60 months). In other words, a decade is too long, a year is too short, and five years is in the middle.
Outcome Variables: Calibration and Rationale
Annual Peace Events – this variable measures the occurrence of peace-related events: 0 events per annum is fully out; 10 events per annum is fully in. 1 event is neither in, nor out.
Annual Conflict Events – this variable measures the occurrence of conflict-related events: 0 events per annum is fully out; 10 events per annum is fully in. 1 event is neither in, nor out.
Annual Mission-related Events – this variable measures the occurrence of mission-related events: 0 events per annum is fully out; 10 events per annum is fully in. 0.5 events is neither in, nor out.
Average Annual Fatalities – this set measures the average number of annual fatalities for the duration of the mission. Cases with 0 fatalities are fully out; cases with 10,000 fatalities are fully in. 1,000 fatalities represents the "neither in, nor out" value.
Fatalities Annual Difference – this indicator measures the average year-to-year difference in the number of fatalities for the duration of the conflict. -50 casualties is fully out (i.e., average growth of casualties by 50 per annum), as this reflects low mission efficacy. 500 is fully in; this number indicates high efficacy, denoting an average annual decline of casualties by 500 people. If the average number of casualties remains unchanged, then 0 denotes neither in, nor out.
Average Annual Mission Casualties – this indicator measures the average number of annual casualties for the duration of the mission. 0 casualties is fully out; 500 casualties is fully in. 0.5 is neither in, nor out.
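The three-anchor scheme used throughout these calibration sections (fully out, crossover, fully in) corresponds to the direct calibration method common in fsQCA. A minimal sketch, assuming the standard logistic transformation with log-odds of -3, 0, and +3 at the three anchors (the function name is illustrative):

```python
import math

def calibrate(x, full_out, crossover, full_in):
    """Map a raw value onto a fuzzy membership score in [0, 1].

    Direct calibration: `full_out` maps near 0.05, `crossover` to exactly
    0.5, and `full_in` near 0.95, via a logistic transformation whose
    log-odds are -3, 0, and +3 at the respective anchors.
    """
    if x >= crossover:
        log_odds = 3.0 * (x - crossover) / (full_in - crossover)
    else:
        log_odds = 3.0 * (x - crossover) / (crossover - full_out)
    return 1.0 / (1.0 + math.exp(-log_odds))
```

With the budget anchors above (USD 5 million fully out, 30 million crossover, 100 million fully in), calibrate(30, 5, 30, 100) returns exactly 0.5, while a USD 100 million budget scores roughly 0.95.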
This is an ongoing research project focused on creating a framework for capturing various artifacts concerning Internet of Things devices. Research has shown a severe lack of frameworks focusing on collecting data from and about IoT devices. Mozilla’s WebThings Gateway focuses on collecting this information from the devices. This project expects to find methods of IoT data collection through a proposed test-bed utilizing the WebThings Gateway.
With the prevalence of anxiety, depression, and stress among young adult populations, adaptive and innovative treatment options must be considered for the future. While there are various approaches to mental health treatment, art therapy is one traditional method that has been used to treat the symptoms of mental health disorders across various health contexts and populations. Some art therapists have even integrated information and communication technologies (ICTs) into their practices. With these factors in mind, and considering the prominence of ICT use among student populations, this study seeks to understand how the immersion and presence afforded by one such technology, virtual reality (VR), can impact the outcomes of art therapy practices. Through the use of an arts-based VR application, Tilt Brush, this study compares traditional art therapy methods as they are employed in and outside of VR. Through the comparison of self-reported measures, we can better understand the possibilities and effectiveness of art therapy practices delivered via Tilt Brush VR.
Signature-based intrusion detection methods report high accuracy with a low false alarm rate. However, they do not perform well when faced with new or emerging threats. This work focuses on anomaly-based, data-driven methods to identify potential zero-day attacks using a specific class of neural networks known as the autoencoder.
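The detection logic behind this approach, flagging inputs whose reconstruction error exceeds a threshold learned from normal traffic, can be sketched in pure Python. The per-feature-mean "reconstructor" below is a deliberately simple stand-in for a trained autoencoder, and all names and data are hypothetical:

```python
def fit_mean_reconstructor(train):
    """Stand-in for a trained autoencoder: 'reconstructs' any sample as
    the per-feature mean of the normal training data."""
    n = len(train)
    return [sum(row[j] for row in train) / n for j in range(len(train[0]))]

def reconstruction_error(sample, reconstruction):
    """Mean squared error between a sample and its reconstruction."""
    return sum((a - b) ** 2 for a, b in zip(sample, reconstruction)) / len(sample)

def detect_anomalies(train, test, quantile=0.95):
    """Flag test samples whose reconstruction error exceeds a threshold
    set at the given quantile of errors on normal training data."""
    center = fit_mean_reconstructor(train)
    errors = sorted(reconstruction_error(s, center) for s in train)
    threshold = errors[int(quantile * (len(errors) - 1))]
    return [reconstruction_error(s, center) > threshold for s in test]
```

In a real system, the mean reconstructor would be replaced by an autoencoder trained to minimize reconstruction error on benign traffic; the thresholding logic stays the same, since the network reconstructs normal traffic well and novel attack patterns poorly.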
Cyberspace is one of the most complex systems ever built by humans. Cybertechnology resources are used ubiquitously by many but sparsely understood by the majority of users. In the past, cyberattacks were usually orchestrated in a random pattern to lure unsuspecting targets. Now, however, the cyber virtual environment is an ecosystem that provides a platform for nefarious actors to launch organized and sophisticated attacks against a specific target group or organization. In 2019, the average cost of a cyberattack in the US was about $1.6 million. This paper proposes a 3D framework to raise an alert before a threat actually materializes on the surface web, so that cybersecurity experts and law enforcement agencies can take preventive measures or mitigate the severity of the damage caused by cyberattacks. The methodology combines information extracted from the deep web through a smart web crawler with socio-personal and technical indicators from Twitter, which is mapped to OTX (Open Threat Exchange). OTX is an open-source cyber threat platform managed by security experts. The OTX endpoint security tool (the OTX Python SDK) will be used to identify new types of cyber threats. The effectiveness of the framework will be tested using the standard machine learning metrics of precision and recall.
This research focuses on two fundamental aspects of hot spot policing that have been widely neglected by previous scholarly research: the concentration of crime at a smaller geographical unit adequate for it to be considered a crime hot spot, and the cost-benefit implications of focusing limited police resources on such a small place in an effort to prevent criminal activities. Substantial limitations in call-to-service data from police departments raise concern about the purported concentration of crime at places that warrants such a strategy in the first place. We will examine data from the Cincinnati Police Department and propose guidelines on adopting a threshold when designating places as crime hot spots, using time and cost-benefit analysis as key determinants.
The rapid growth in computer and internet development has ushered in numerous cybersecurity challenges which are constantly evolving with time. Current cybersecurity solutions are no longer optimal for tackling these emerging cyber threats and attacks. This paper proposes the creation of a cybersecurity dataset to be used in a hybrid machine learning (ML) approach, combining supervised and unsupervised learning, for an effective intrusion detection system. The proposed model entails a five-stage process: it starts with the setup of a simulated network environment of network attacks to generate a dataset, which feeds into a data normalization stage and then a dimension reduction stage using principal component analysis as a feature extraction method; the reduced-dimension data is then clustered using the k-Means method to produce a new dataset with fewer features. This new dataset is afterward classified using an enhanced support vector machine (ESVM). The proposed model is expected to provide a high-quality dataset and an efficient intrusion detection system, with an intrusion detection accuracy of 99.5%, a short training time of 3 seconds, and a low false-positive rate of 0.4%.
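The clustering stage of the proposed pipeline can be illustrated with a from-scratch sketch of plain k-Means on toy two-dimensional data; this is not the actual pipeline code, and the PCA and ESVM stages are omitted:

```python
import random

def squared_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=100, seed=0):
    """Plain k-Means: returns a cluster label per point and the centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)           # random initial centroids
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid per point
        labels = [min(range(k), key=lambda c: squared_dist(p, centroids[c]))
                  for p in points]
        # update step: each centroid becomes the mean of its members
        new_centroids = []
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                new_centroids.append([sum(col) / len(members)
                                      for col in zip(*members)])
            else:
                new_centroids.append(centroids[c])  # keep an empty cluster's centroid
        if new_centroids == centroids:              # converged
            break
        centroids = new_centroids
    return labels, centroids
```

In the proposed model, the cluster assignments (or distances to centroids) computed on the PCA-reduced data would serve as the compact feature set passed on to the ESVM classifier.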
Small office/home office (SOHO) networks have become a target for many threat actors, hackers, and cyber attackers, and hence there is an urgent need to secure such networks. Most SOHO network users do not see the need to provide enough security for their networks because they assume no one is going to hack them, forgetting that the biggest threat to small home networks today comes from the outside. The misconfiguration of routers and firewalls, and the use of default configurations, render these networks vulnerable to attacks such as DDoS, phishing, viruses, and other network attacks, hence the need to implement a detection algorithm to help identify flaws in the traffic patterns of the small office network. It turns out that about 75% of existing approaches focus on intrusion detection in the 802.11 wireless network of a SOHO and not the entire network. These approaches do not efficiently secure the network as a whole, leaving the rest prone to attacks that can occur with or without the internet. This paper proposes to add another layer of security to the other preventive measures in a SOHO network by designing, implementing, and testing a supervised neural network algorithm to identify attacks on the small home network and to notify users of the activity on their network. The supervised neural network algorithm will be trained on a dataset representing both attacks and non-attacks. The system should be able to detect and identify the various attacks and anomalies when they occur on the network and help keep users informed.