Thursday, December 26, 2019

Disease Pathophysiology And Treatment Of Diabetes Mellitus

Disease Pathophysiology and Treatment of Type 1 Diabetes Mellitus
Rogelio Gonzales
University of Texas Rio Grande Valley

1. Introduction

"Diabetes mellitus, by far the most common of all endocrine disorders, is one of the foremost public health concerns confronting the world today. Over 23 million individuals in the United States, or 8% of the population, have diabetes. An estimated 17.5 million have been diagnosed, but 5.5 million (nearly one fourth) are unaware they have the disease" (Nelms, Sucher, Lacey, & Roth, 2011). The prevalence of type 1 diabetes mellitus in the U.S. is staggering, and these estimated figures do not appear to be declining. Furthermore, the disease affects not only the U.S. but many other countries as well.

Insulin and glucagon work hand in hand to balance the body's glucose levels. In normal function, the pancreas secretes insulin in response to the introduction of glucose into the body and to meet basal metabolic needs. In type 1 diabetes, there is a deficiency of the hormone insulin, which can have many contributing factors; there is no single cause of this disease. One cause, however, stems from the body's immune response. The immune-mediated form of type 1 diabetes is an "autoimmunity directed against pancreatic islet cells [that] results in slowly progressing beta-cell destruction, culminating over years in clinically manifested insulin-dependent diabetes mellitus" (Krishna & Srikanta, 2015). In short, an individual's pancreatic beta cells slowly die off, which disrupts the feedback mechanism regulating blood glucose concentration and leaves the individual dependent on exogenous insulin. As for the clinical symptoms of this disease, the body compensates for the unregulated amount of glucose circulating within the body tissue by expelling glucose by any means necessary.
This occurs by having excessive amounts of glucose released into the urine (glycosuria) as the kidneys become unable to reabsorb all of the glucose filtered from the blood. Polyuria is the outcome of the resulting increase in the urine's osmolarity. This symptom triggers a

Wednesday, December 18, 2019

A Brief Report on Colgan Flight 3407

February 12, 2009, was a sad day in the United States, particularly in the eastern part of the country. Colgan Flight 3407 was scheduled to depart Newark, New Jersey, and land in Buffalo, New York. Traffic between the two cities was significant: a total of 110 recorded flights from different carriers were flying between the two locations that day, seven of which were Continental flights bound for Newark Liberty International Airport from Buffalo Niagara International Airport. Little did the passengers and crew members know that Flight 3407 would be different. As the plane approached the New York area, the cockpit crew observed ice accumulation on the aircraft's wings. This indicated that the flight was operating in low-temperature conditions, which may have contributed to the crash. Ice accumulation on a wing adds weight to the airborne body and thus reduces lift. Additionally, it changes the shape of the wing, and that shape is what generates the lift. During the investigation by the National Transportation Safety Board, the Cockpit Voice Recorder (CVR) was examined, and it revealed that the crew did not recheck whether the de-ice system installed in the aircraft was operational. This was critical because the ice buildup was observed a mere six minutes before the plane crashed. When the Flight Data Recorder was examined, it showed that the de-ice system was set in the

Tuesday, December 10, 2019

History of Linux

The history of Linux began in 1991 with the commencement of a personal project by a Finnish student, Linus Torvalds, to create a new operating system kernel. Since then, the resulting Linux kernel has been marked by constant growth. Since the initial release of its source code in 1991, it has grown from a small number of C files under a license prohibiting commercial distribution to its state in 2009 of over 370 megabytes of source under the GNU General Public License.

Events leading to creation

The Unix operating system was conceived and implemented in the 1960s and first released in 1970. Its availability and portability caused it to be widely adopted, copied, and modified by academic institutions and businesses, and its design became influential to authors of other systems. In 1983, Richard Stallman started the GNU project with the goal of creating a free Unix-like operating system. As part of this work, he wrote the GNU General Public License (GPL). By the early 1990s there was almost enough available software to create a full operating system. However, the GNU kernel, called Hurd, failed to attract enough attention from developers, leaving GNU incomplete. Another free operating system project in the 1980s was the Berkeley Software Distribution (BSD), developed by UC Berkeley from the 6th edition of Unix from AT&T. Since BSD contained Unix code that AT&T owned, AT&T filed a lawsuit (USL v. BSDi) in the early 1990s against the University of California. This strongly limited the development and adoption of BSD. MINIX, a Unix-like system intended for academic use, was released by Andrew S. Tanenbaum in 1987. While source code for the system was available, modification and redistribution were restricted.
In addition, MINIX's 16-bit design was not well adapted to the 32-bit features of the increasingly cheap and popular Intel 386 architecture for personal computers. These factors, and the lack of a widely adopted free kernel, provided the impetus for Torvalds's starting his project. He has stated that if either the GNU or 386BSD kernels had been available at the time, he likely would not have written his own.

The creation of Linux

In 1991, in Helsinki, Linus Torvalds began the project that later became the Linux kernel. It was initially a terminal emulator, which Torvalds used to access the large UNIX servers of the university. He wrote the program specifically for the hardware he was using and independent of an operating system because he wanted to use the functions of his new PC with an 80386 processor. Development was done on MINIX using the GNU C Compiler, which is still the main choice for compiling Linux today (although the code can be built with other compilers, such as the Intel C Compiler). As Torvalds wrote in his book Just for Fun, he eventually realized that he had written an operating system kernel. On 25 August 1991, he announced the system in a Usenet posting to the newsgroup comp.os.minix.

The name

Linus Torvalds had wanted to call his invention Freax, a portmanteau of "freak," "free," and "x" (as an allusion to Unix). During the start of his work on the system, he stored the files under the name Freax for about half a year. Torvalds had already considered the name Linux, but initially dismissed it as too egotistical. In order to facilitate development, the files were uploaded to the FTP server (ftp.funet.fi) of FUNET in September 1991. Ari Lemmke, Torvalds's coworker at the University of Helsinki who was one of the volunteer administrators for the FTP server at the time, did not think that Freax was a good name, so he named the project Linux on the server without consulting Torvalds. Later, however, Torvalds consented to Linux.
To demonstrate how the word "Linux" should be pronounced, Torvalds included an audio guide with the kernel source code.

Linux under the GNU GPL

Torvalds first published the Linux kernel under its own license, which had a restriction on commercial activity. The software to use with the kernel was software developed as part of the GNU project and licensed under the GNU General Public License, a free software license. The first release of the Linux kernel, Linux 0.01, included a binary of GNU's Bash shell. In the "Notes for linux release 0.01," Torvalds lists the GNU software that is required to run Linux. In 1992, he suggested releasing the kernel under the GNU General Public License. He first announced this decision in the release notes of version 0.12. In the middle of December 1992 he published version 0.99 using the GNU GPL.

Monday, December 2, 2019

IQ Testing and Grouping

Running Head: IQ TESTING AND GROUPING
INTELLIGENCE TESTING AND GROUPING
RON WILLIAMS
PSYCHOLOGICAL AND EDUCATIONAL TESTING
CAMPBELL UNIVERSITY
DR. FATICA

In defining intelligence, there has always been the question of whether intelligence is measured as a single phenomenon or as a combination of many variables. For example, is it how "smart" a person is? Or is it their ability to perform well on standardized tests? Do such tests measure a person's intelligence, or just some arbitrary quantity called the person's IQ? Or is intelligence a mixture of survival, mathematical, social, and other abilities? There are many debates regarding whether intelligence is determined from test scores and results, or whether it is measured by a person's ability to process information and solve problems.

As for the uses of intelligence testing in an educational setting, intelligence and achievement tests are administered routinely to assess individual accomplishment. They are used to improve instruction and curriculum planning. High schools use these tests to assist in students' future educational planning and to help decide what college or type of college to attend. Elementary schools utilize screening and testing procedures to help determine readiness for writing and reading placement.

Intelligence can be measured by intelligence tests, among them the Stanford-Binet Intelligence Scale and the Wechsler scales. These tests are intended to determine an individual's intelligence quotient (IQ). Intelligence tests usually provide an estimate of global cognitive functioning as well as information about functioning within more specific domains. Intelligence tests are quite stable compared to measures of other human traits. However, the degree of stability increases with age, such that early childhood and preschool measures of intellectual function are far less predictive of later functioning than assessments taken during middle childhood.
The stability of intelligence test scores may change as a function of important environmental factors. Therefore, intelligence test scores are descriptive of a child's functioning at the point in time when the test is taken. The scores can also be affected by environmental factors, the child's psychiatric status, or the child's educational program.

Components of a good intelligence test are (a) validity: does the test really measure intelligence and not something else? (b) reliability: does the test produce consistent measures? and (c) norms: are the participants being fairly compared? Components that make an intelligence test flawed are (a) poor validity: tests may be sensitive to social factors; (b) poor norms: comparing people who are different; and (c) poor application: tests measure something that the school or job has nothing to do with.

Theories of Process: Psychometric Model

The psychometric approach is defined as the branch of psychology that deals with the design, administration, and interpretation of quantitative tests for the measurement of psychological variables such as intelligence, aptitude, and personality traits. There are various psychometric approaches to intelligence. The following paragraphs describe three different theorists and their psychometric models.

First is Charles Spearman, who believed that intelligence is a combination of two parts. According to his two-factor theory of intelligence, the performance of any intellectual act requires some combination of g (the general intelligence factor), which is available to the same individual to the same degree for all intellectual acts, and s (specific factors), which are specific to that act and vary in strength from one act to another. The s factors represent specific abilities such as verbal reasoning or spatial problem solving. Spearman equated g with mental energy.
If one knows how a person performs on one task that is highly saturated with g, one can safely predict a similar level of performance on another highly g-saturated task. Prediction of performance on tasks with high s factors is less accurate. Thus, the most important information to have about a person's intellectual ability is an estimate of their g, or mental energy (Plucker, 1989).

Guilford's theory includes 150 abilities, arranged in three dimensions: contents, operations, and products. Guilford's three-dimensional Structure of Intellect originally classified intellectual acts into 120 separate categories along the operations, products, and content (material) dimensions. He developed firm convictions regarding individual differences among people. Guilford believed that intelligence is much too complicated to be subsumed by a few primary mental abilities and a g factor. His systematic theory gave rise to what is known as