Wednesday, November 27, 2019

prohibition essays

In 1917 Congress passed the Eighteenth Amendment to the Constitution, which prohibited the export, import, manufacture, sale, and transportation of alcoholic beverages in the United States. This new law is believed to have had the greatest effect on the twenties, creating a feeling of rebellion and wild behavior. Many people thought the law violated their right to live by their own standards and have a good time. The Volstead Act, passed by Congress, set penalties for all violators of the Eighteenth Amendment. Prohibition is one of the best things ever done by the United States government. It single-handedly created new business opportunities and brought people together like never before. It also created a booming new industry and a new way of life for many people. Unfortunately, none of these things were good things. The new business opportunities were all in the realm of organized crime. With the banning of alcohol, the gangs saw an incredible boom in business. No longer did they have to rely on robbery, brothels and cons; there was a whole new trade out there, and it was making millions. Prohibition also united the American people more than anything since the World War. Everyone, from the poor to the rich, united to break the law. Even the police, sworn to serve and protect, looked the other way whenever they found a better deal, letting alcohol be made and sold right under their noses. The rich bought booze to spice up their parties, and the poor spent their time and money getting drunk in bootleggers' houses. Rarely do the rich and the poor agree on anything, yet Prohibition contributed to an increased sense of community and neighborly love. Prohibition also brought big business to the small businessman. Alcohol production used to be handled by the large companies; with Prohibition the big companies were put out of business and small operations had to meet the demand. This was what I wa ...

Saturday, November 23, 2019

Pesticides essays

There are 2.5 billion pounds of pesticides applied to agricultural products each year in the United States, ten times more than was applied forty years ago. It is still unknown exactly what effects these chemicals may have on individuals. Some farmers who have been using pesticides in their fields and developed leukemia are finding that the cause of their disease is inhaling those pesticides. These chemicals are still in use today, and most of them have never been tested for the short- or long-term effects they may have on humans. Each year there are 10,000 pesticide-related poisonings. On July 4th, 1985, over 300 Californians became sick after eating watermelons treated with the pesticide Temik. Testing supermarket produce is one way of determining the amount of exposure the consumer receives through common produce like carrots, tomatoes and lettuce. Forty-four percent of the foods tested in supermarkets were found to have some traces of pesticide residue on them. Nineteen of the residues found were of a pesticide called DDT. DDT was banned in this country 12 years prior to the testing. It was believed that these chemicals might have entered this country from another country that doesn't have pesticide restrictions as the U.S. does. Pesticides are also contaminating the Earth's water supplies: seventeen pesticides have been found in the water supplies of twenty-three states right now. Scientists at Cornell University conclude that 99% of pesticides miss the intended target and find their way into the water, air and soil. Most of the pollution isn't strong enough to create an immediate impact on humans, so wildlife is the primary target of these contaminants. Animals such as European starlings are constantly being tested and have been found to be greatly affected both behaviorally and psychologically. Farming practices that do not use pesticide ...

Thursday, November 21, 2019

Ethics and Corporate Social Responsibility Case Study - 1

The study also aims to assess the impact of these strategies on the company's stakeholders, namely customers, employees and promoters of the brand. The paper tries to establish that the strategy of sustainable development has led to growth of the customer base, increased customer retention, and has been an overall beneficial investment for the company. Best Buy, through low prices and big discounts, has been attracting many customers, but its policies on sustainable development have also garnered goodwill and increased brand equity. The company uses its e-waste recycling program to increase the chances of return sales while also helping to conserve the environment through sustainable strategies (Luo and Bhattacharya, 2006). Hence, the study critically examines Best Buy's effort to implement its sustainable development policies and their impact on the brand. In 1966, Richard Schulze opened a very small business in St. Paul, Minnesota, called Sound of Music. Over the next 17 years, Schulze's small store gradually grew into a multi-million dollar firm. By 1983, Sound of Music had changed its name to Best Buy Corporation, Inc. The first superstore opened in Burnsville, Minnesota, under the new name. The store began selling more brands and appliances, and it also started offering central service as well as warehouse distribution. In the nineties, Best Buy was a pioneer in offering the newest technology, such as DVDs and HD TVs. By 1999, Best Buy and Microsoft had collaborated for mutual promotion, which also led to a two-for-one stock split. Best Buy operates through two business segments, Domestic and International. The financial security of Best Buy relies on its stores, Magnolia Audio Visual stores and the Geek Squad. Between 2005 and 2008, Best Buy wanted to achieve a higher income rate than before. Four strategies that

Wednesday, November 20, 2019

Chart the development of virtual reality from 1950 to 2050 Essay

This implies that virtual reality translates to near reality. Nevertheless, the technical phrase has a straightforward description: virtual reality is a three-dimensional environment created by a computer, which an individual can explore and interact with. The individual who interacts with this virtual world, or is immersed within its setting, is able to manipulate objects or perform a sequence of actions. This individual generally wears goggles, earphones, gloves and various other devices, and in this manner the computer controls at least three of the five common senses. Besides supplying sensory input to the user, these devices also monitor the user's actions; for example, the goggles track eye movement and respond accordingly by forwarding new video input (Vince, 2004: 4).

History background of virtual reality

There were abundant debates about the meaning and proper name of what is presently known as the virtual arts, even before the development of the idea began. Therefore, describing and understanding the virtual arts is a significant part of providing a concise account of its growth. Its naming fluctuated as the concept developed, since virtual reality bore three labels, namely synthetic reality, virtual environments, and augmented reality. Nevertheless, the naming of the virtual arts went through alterations as the idea advanced. The description solidified around "virtual" during the 1960s, when computers surfaced and gave the word a tangible meaning owing to computer visuality. The idea of reality, on the other hand, cropped up when theorists questioned whether anything existed beyond the discernible and quantifiable, and the term settled on reality owing to the sensible positivity it showed. Virtual reality thus became the common phrase that fitted the idea, and in 1989 Jaron Lanier redefined it through the application of the newest inventions of goggles, gloves and associated technologies (Yu, 2010: 310). The history of the virtual arts has been recent and abrupt: although its constituents have been developing for almost forty years, operational virtual systems have only lately appeared on the screen (Mclellan, 1992: 24).

Development of Virtual Reality

Nevertheless, the history of the virtual arts dates back to the mid-1950s, when a futurist cinematographer named Morton Heilig constructed a multi-sensory simulator known as the Sensorama. The device contained a stereoscopic display, scent dischargers, a moving chair and speakers. These features enabled the user to watch pre-recorded film in colour and stereo in a three-dimensional mode. The simulator also offered binaural sound, moving air, odour and vibration effects. Although the simulator had all of these features, it was not as interactive as expected (Steed, 2002: 3). Later, in 1961, a group of engineers at Philco Corporation developed the first HMD, named Headsight. The helmet contained a video screen and a tracking system linked to a closed-circuit camera system (Will, 2009: 4). In 1965, Ivan Sutherland, a famous computer scientist, envisioned a further advanced system known as the Ultimate Display, which linked the

Sunday, November 17, 2019

Critique of Gallery Shows of Asian Art Essay

Asian art can refer to the vast genre of art and artists throughout the Asian continent. The history of Asian art is as varied as the cultures that make up this region of the world. From ancient bronze sculptures in India to the manga cartoons of Japan, each country has a distinctive perspective on the world around it. In this paper I will look at three proposals for gallery shows of Asian art, each completely unique in its view of Asian culture. The first group looks at "Pop Culture in Asia," focusing on the works of the artists Wang Guangyi, Satoshi Kon, Takashi Murakami, and Basak Aditya, as well as the art of Japanese tattoos. Organizing such conflicting works together seems disjointed and lacking in coherence. The idea of pop culture in Asia could indeed be defined in multiple ways, but this grouping lacks consistency and logic. Works by Wang Guangyi, Satoshi Kon and Takashi Murakami, each with their pop art style and references, would be a good match for a show focusing on pop culture. Wang Guangyi reinvents propaganda posters from the 1960s and '70s as capitalist propaganda posters, using the same triangular composition and palette. Takashi Murakami is known for his sculptures of highly stylized cartoon or invented characters, referring to the popular culture of Japan or contemporary films. Satoshi Kon is a director of animated films that are loaded with Japanese cultural references and symbolism. I believe the work of these three artists would have been enough for a succinct show of pop culture in Asia. The addition of the works of Basak Aditya and of Japanese tattoos makes this grouping lose its focus. Although the work of Basak Aditya, with his poetic landscapes and dream-like portraits, is interesting, it is not a good fit because the pieces are too personal and make no reference to the pop culture of India. Finally, the addition of Japanese tattoos seems like an arbitrary decision. Although some tattoos may have pop cultural references, the inclusion of photographs of skin art is incompatible with the cohesion of the first three artists in the grouping. The next group, "Art and Power," successfully showed a variety of artwork representing power throughout the ages. It begins with paintings from the era of the Chinese emperors, using concise language to demonstrate the group's interpretation of power. The group then looks at brass and copper sculptures of Buddha, Shiva, and Jambhala, clearly demonstrating the power of religion in the regions of Tibet and India. Next is a grouping of decorative and ceremonial items from Korea, signifying the power of the ruling and upper classes of ancient Asia. This grouping ends with paintings and sculptures of samurai and two thangka paintings. The overall consistency of the objects and paintings used for this grouping makes for a successful exhibition. All the works chosen were clearly indicative of power in this well-organized grouping. Finally, the last group chose "Asian Animation" as a theme. Again, this is a clear and well put together group of mostly Japanese cartoons and figures. This group first looks at the work of Satoshi Tajiri and the Pokemon media franchise. The group clearly spent time creating colorful, cartoon-like backgrounds to add to their clear, well-planned presentation. They then look at the illustration work of Akira Toriyama and his colorful, well-defined, sharp-edged illustrations.
The group then looks at toys and costumes created from these cartoons and comics, again using a similar background to unify the presentation. Although some of the content is repeated at the end of the grouping, the overall vision of presenting these comics and cartoons as art forms is cohesive and easy to understand. The group points out how important the comics industry is to Japan and its cultural effects throughout the world.

Friday, November 15, 2019

David Hume - Naturalistic Metaethics, Politics, and Psychology

ABSTRACT: According to the views expressed in this paper, influences unrelated to the conclusions of Immanuel Kant and G. E. Moore respecting what they saw as the appropriate foundation for moral systems seem to have been at work in the reactions of both to the earlier criticisms of David Hume. Building on a "recent meeting" with Hume in a pub on Princes Street in Edinburgh, I develop the suggestion that both Kant and Moore were loyal to traditional notions of an intuited, non-prudential basis for ethical injunctions. Kant, by his insistence that any morality linked only to hypothetical imperatives cannot be truly "moral," and Moore, by his refusal to see the emptiness of his posited "good as simply good," which he felt must be kept free of any corrupting reference to real-world prudential constituents, thus support the foundation of ethical systems in an inner, unanalyzable moral impulse. And they do so in obedience to commitments that antedate their moral philosophies. I also claim that Hume has been misunderstood: he did not mean to oppose the naturalistic grounding of moral systems in his famous statement disjoining is-statements from ought-statements; what he really intended was to point out the illogic of moralists who improperly pretend to derive categorical or intuited moral imperatives from real-world is-statements while denying any prudentiality or a posteriority to the transaction. Because both maintain that this simple inner moral impulse must be independent of prudential considerations in making moral decisions and judgments, Kant and Moore oppose naturalistic ethical systems which, like J. S. Mill's, suggest that this-worldly welfare and happiness are in large part coexistent with the true meaning of morality. Their position therefore places both of these proponents of intuitionist metaethics at odds with the principle of political social democrats that a respectable moral system must place worldly satisfactions and happiness above obedience to any putative "higher" moral law and its intuited imperatives.

I had a talk with David Hume one rainy night recently in a pub in Edinburgh, over (naturally) kippers with brown bread and a pint of stout or two. He let me in on a secret and gave me leave to whisper it in turn to a few friends. Which is why I jotted down this account of our meeting and am presenting it to you here. Remember what that great analyst wrote to set in motion the train of thought that culminated in G.

Tuesday, November 12, 2019

Demarcation in Philosophy of Science Essay

The demarcation problem in the philosophy of science is about how to distinguish between science and nonscience, and more specifically between science and pseudoscience (a theory or method doubtfully or mistakenly held to be scientific). The debate continues after over a century of dialogue among philosophers of science and scientists in various fields, and despite broad agreement on the basics of scientific method. In other words, the demarcation problem is the philosophical problem of determining what types of hypotheses should be considered scientific and what types pseudoscientific or non-scientific. It also concerns the ongoing struggle between science and religion, in particular the question of which elements of religious doctrine can and should be subjected to scientific scrutiny. This is one of the central topics of the philosophy of science, and it has never been fully resolved.

The Purpose of Demarcation

Demarcations of science from pseudoscience can be made for both theoretical and practical reasons. From a theoretical point of view, the demarcation issue is an illuminating perspective that contributes to the philosophy of science. From a practical point of view, the distinction is important for decision guidance in both private and public life. Since science is our most reliable source of knowledge in a wide variety of areas, we need to distinguish scientific knowledge from its look-alikes. Due to the high status of science in present-day society, attempts to exaggerate the scientific status of various claims, teachings, and products are common enough to make the demarcation issue pressing in many areas. The demarcation issue is therefore important in many practical applications, such as the following:
* Healthcare: Medical science develops and evaluates treatments according to evidence of their efficacy. Pseudoscientific activities in this area give rise to ineffective and sometimes dangerous interventions. Healthcare providers, insurers, government authorities and, most importantly, patients need guidance on how to distinguish between medical science and medical pseudoscience.
* Expert testimony: It is essential for the rule of law that courts get the facts right. The reliability of different types of evidence must be correctly determined, and expert testimony must be based on the best available knowledge. Sometimes it is in the interest of litigants to present non-scientific claims as solid science. Therefore courts must be able to distinguish between science and pseudoscience.
* Environmental policies: In order to be on the safe side against potential disasters, it may be legitimate to take preventive measures when there is valid but as yet insufficient evidence of an environmental hazard. This must be distinguished from taking measures against an alleged hazard for which there is no valid evidence at all. Therefore, decision-makers in environmental policy must be able to distinguish between scientific and pseudoscientific claims.
* Science education: The promoters of some pseudosciences (notably creationism) try to introduce their teachings into school curricula. Teachers and school authorities need clear criteria of inclusion that protect students against unreliable and disproved teachings.

Ancient Greek Science

An early attempt at demarcation can be seen in the efforts of Greek natural philosophers and medical practitioners to distinguish their methods and their accounts of nature from the mythological or mystical accounts of their predecessors and contemporaries. Medical writers in the Hippocratic tradition maintained that their discussions were based on necessary demonstrations, a theme developed by Aristotle in his "Posterior Analytics". One element of this polemic (passionate argument) for science was an insistence on a clear and definite presentation of arguments, rejecting the imagery, analogy, and myth of the old wisdom. Aristotle described at length what was involved in having scientific knowledge of something. To be scientific, he said, one must deal with causes, one must use logical demonstration, and one must identify the universals which 'inhere' in the particulars of sense.

Criteria for Demarcation

Logical Positivism, also known as Verificationism
* Held that only statements about empirical observations and formal logical propositions are meaningful, and that statements which are not derived in this manner (including religious and metaphysical statements) are by nature meaningless.
* The Viennese philosophers who introduced the positivist paradigm effectively laid the groundwork for the modern philosophy of science and one of its most important strands of thought. The early positivists favored a rather strict approach to demarcation and strongly affirmed the empirical nature of science, meaning that questions that cannot be empirically verified or falsified are irrelevant to scientific thought.
* These philosophers, who called themselves logical positivists, argued that to produce a meaningful claim, one must always return to the tangible observations that result from that claim.
* By the late 1970s, its ideas were generally recognized to be seriously defective.

Falsifiability
* Proposed by Karl Popper. In his monumental book, "The Logic of Scientific Discovery", he proposed the idea that scientific hypotheses must be falsifiable; unfalsifiable hypotheses should be considered pseudoscientific. Popper's emphasis on falsifiability changed the way scientists viewed the demarcation problem, and his impact on the philosophy of science was enormous.
* Popper's demarcation criterion has been criticized both for excluding legitimate science and for giving some pseudosciences the status of being scientific.

Postpositivism
* Thomas Kuhn, an American historian and philosopher of science, is often connected with what has been called postpositivism.
* In 1962, Kuhn published The Structure of Scientific Revolutions, which depicted the development of the basic natural sciences in an innovative way. According to Kuhn, the sciences do not uniformly progress strictly by scientific method. Rather, there are two fundamentally different phases of scientific development. In the first phase, scientists work within a paradigm (a set of accepted beliefs). When the foundation of the paradigm weakens and new theories and scientific methods begin to replace it, the next phase of scientific discovery takes place. Kuhn believes that scientific progress, that is, progress from one paradigm to another, has no logical basis.
He undermines science as a whole by arguing that what is considered science changes throughout history in such a way that there is no objective way (outside of time or place) to demarcate a scientific belief from a pseudoscientific belief. Science, Kuhn argues, is like politics: institutions believe that certain ways are better than others at different points throughout history; however, it is impossible to be more or less certain of our basic assumptions about the world. Within a democracy (a specific political paradigm) there can be progress: an economy can grow, schools can be built, and people can be given healthcare. However, if a revolution occurs and the country becomes socialist, the government is not inherently better or worse than before; it simply begins to follow a different set of assumptions.

Paradigm shift
* A paradigm shift is a phenomenon described by philosopher Thomas Kuhn in The Structure of Scientific Revolutions.
* Kuhn posited a process to explain the persistence of incorrect ideas, and the seemingly rapid and sudden abandonment of these ideas when they finally are rejected.
* People tend to believe in what they know, and science is basically conservative. A current "paradigm" or theory is difficult to dislodge. It takes either a large volume of evidence, or a particularly powerful single piece of evidence, to overturn a major scientific theory (a scientific revolution). When this occurs, it is called a "paradigm shift".

Lakatos' research programs
* Imre Lakatos combined elements of Popper's and Kuhn's philosophies with his concept of research programs. Programs that succeed at predicting novel facts are scientific, while ones that fail ultimately lapse into pseudoscience.

Feyerabend and Lakatos
* Kuhn's work largely called Popper's demarcation into question and emphasized the human, subjective quality of scientific change. Paul Feyerabend was concerned that the very question of demarcation was insidious: science itself had no need of a demarcation criterion; instead, some philosophers were seeking to justify a special position of authority from which science could dominate public discourse. Feyerabend argued that science does not in fact occupy a special place in terms of either its logic or its method, and that no claim to special authority made by scientists can be upheld. He argued that, within the history of scientific practice, no rule or method can be found that has not been violated or circumvented at some point in order to advance scientific knowledge. Both Lakatos and Feyerabend suggest that science is not an autonomous form of reasoning but is inseparable from the larger body of human thought and inquiry.

NOMA
* The concept of Non-overlapping Magisteria is a relatively recent attempt at proposing a clear demarcation between science and religion. It explicitly restricts science to its naturalistic foundations, meaning that no conclusions about supernatural phenomena like gods may be drawn from within the confines of science. "As to the supposed 'conflict' ... between science and religion, no such conflict should exist because each subject has a legitimate magisterium, or domain of teaching authority, and these magisteria do not overlap."

Criteria based on scientific progress

Popper's demarcation criterion concerns the logical structure of theories. Imre Lakatos described this criterion as "a rather stunning one. A theory may be scientific even if there is not a shred of evidence in its favour, and it may be pseudoscientific even if all the available evidence is in its favour. That is, the scientific or non-scientific character of a theory can be determined independently of the facts". Instead, Lakatos proposed a modification of Popper's criterion that he called "sophisticated (methodological) falsificationism". On this view, the demarcation criterion should not be applied to an isolated hypothesis or theory but rather to a whole research program, which is characterized by a series of theories successively replacing each other. In his view, a research program is progressive if the new theories make surprising predictions that are confirmed. In contrast, a degenerating research program is characterized by theories being fabricated only in order to accommodate known facts. Progress in science is only possible if a research program satisfies the minimum requirement that each new theory developed in the program has a larger empirical content than its predecessor. If a research program does not satisfy this requirement, then it is pseudoscientific.

According to Paul Thagard, a theory or discipline is pseudoscientific if it satisfies two criteria. One is that the theory fails to progress; the other is that "the community of practitioners makes little attempt to develop the theory towards solutions of the problems, shows no concern for attempts to evaluate the theory in relation to others, and is selective in considering confirmations and disconfirmations". A major difference between his approach and that of Lakatos is that Lakatos would classify a nonprogressive discipline as pseudoscientific even if its practitioners work hard to improve it and turn it into a progressive discipline.

In a somewhat similar vein, Daniel Rothbart (1990) emphasized the distinction between the standards that should be used when testing a theory and those that should be used when determining whether a theory should be tested at all. The latter, the eligibility criteria, include that the theory should encapsulate the explanatory success of its rival and that it should yield test implications that are inconsistent with those of the rival. According to Rothbart, a theory is unscientific if it is not testworthy in this sense.

George Reisch proposed that demarcation could be based on the requirement that a scientific discipline be adequately integrated into the other sciences. The various scientific disciplines have strong interconnections based on methodology, theory, similarity of models, etc. Creationism, for instance, is not scientific because its basic principles and beliefs are incompatible with those that connect and unify the sciences. More generally, says Reisch, an epistemic field is pseudoscientific if it cannot be incorporated into the existing network of established sciences.

Rejection of the Problem
* Some philosophers, such as Larry Laudan, have rejected the idea of the demarcation problem. Others, like Susan Haack, while not rejecting the problem wholesale, argue that a misleading emphasis has been placed on it, which results in getting stuck in arguments over definitions rather than evidence.

Laudan
* Larry Laudan concluded, after examining various historical attempts to establish a demarcation criterion, that "philosophy has failed to deliver the goods" in its attempts to distinguish science from non-science and science from pseudoscience. None of the past attempts would be accepted by a majority of philosophers, nor, in his view, should they be accepted by them or by anyone else. He stated that many well-founded beliefs are not scientific and, conversely, many scientific conjectures are not well-founded.

3 Major Reasons why Demarcation is sometimes difficult:
* science changes over time,
* science is heterogeneous, and
* established science itself is not free of the defects characteristic of pseudoscience