Tuesday, August 6, 2019
Procter & Gamble Essay Example for Free
Procter & Gamble Co. is sending video crews run by a small research firm in London, Everyday Lives Ltd., into about 80 homes worldwide to record people's daily routines in the name of marketing research. P&G believes that some people have selective memory in focus groups and interviews, and that some insights into consumer behavior may be lost. Camera crews will arrive at a participant's home when the family wakes up and will not leave until they go to bed, for a duration of four days. Cameras may not be manned at all times, and bedroom and bathroom activities will not be recorded. Families will be paid an undisclosed amount for their participation. P&G wants to solve problems that their shoppers may not have known they had. For example, marketers discuss multitasking habits while watching a woman in Thailand make breakfast, feed her baby, and watch television simultaneously. The study will initially take place in the UK, Germany, and China, since there are such major growth opportunities overseas. Project risks include people behaving abnormally in front of the cameras, local privacy laws, searching hours of videotape for ideas, and producing successful products. P&G's goal is to maintain a huge video library that can be organized by keywords and will give a global perspective on something as simple as eating snacks.

By stepping into the homes and lives of their audience, P&G has found a way to have up-to-date information on the routine habits of its target market and to develop an intimate relationship with consumers by becoming part of daily rituals that no one other than the individual may see. They will even be able to observe the innate actions that a person may not realize they do. Market conditions are continuously changing, and having this live information will give P&G sufficient knowledge to develop a successful marketing strategy. They will also benefit by being able to see other products the families are buying, gaining a perspective on the average budget people can afford for similar goods.

In the past this obtrusive plan of watching a person's every move would have been seen as crossing a line, or even illegal, and participants would have been hesitant to partake. The plan fits modern society's trends, however, where reality shows are among the highest-rated programs on television. The intrusion is now acceptable, and even seen as somewhat glamorous, especially when a monetary transaction is involved. In my opinion, P&G will have a laborious task ahead of them, but the information from this market research has the prospect of yielding a host of innovative convenience products and goods.
Monday, August 5, 2019
Effects Of Noise In A Data Communication
This report will look into different types of noise that are associated with Unshielded Twisted Pair and radio waves. The noise that affects these transmission mediums, such as thermal noise, crosstalk, multipath interference, intermodulation noise and impulse noise, will be explored, and the damage it can cause to data being transmitted will be explained. I will also discuss the different modulation techniques and technologies that can be used to try to reduce the effect of the noise and reduce the risk of data loss during transmission.

Introduction

In 1962 the computer scientist Joseph Carl Robnett Licklider proposed the ideas that led to ARPANET, which by 1969 connected four computers across America; these computers were located at the University of California Los Angeles, the Stanford Research Institute, the University of California Santa Barbara and the University of Utah. This network was designed for the purpose of sharing sensitive military data between different locations securely. However, the first attempt at sending data over the network was not successful, as the UCLA computer crashed as they attempted to log into the computer at Stanford [1]. The result of these connection problems was the creation of TCP/IP. Since then networks have grown in size, data rates and transmission mediums have evolved, and new technology has been introduced. Noise has also started to play a part in how networks are built, as specific techniques can be put in place to try to reduce it.

Guided Media

In a communication system using guided media, the signal is sent in the form of electromagnetic waves along a physical path. This physical path is what guides the signal, and it can come in the form of four main media types: Unshielded Twisted Pair, Shielded Twisted Pair, Coaxial or Fibre-Optic cables. However, each of these mediums has several different standards of cables associated with it. This report will cover Unshielded Twisted Pair and the noise that can affect it.

UTP

UTP first originated in the 1970s. It consists of 8 insulated copper wires, each with a diameter of 0.4mm to 0.8mm. These copper wires are twisted together into pairs, so there end up being 4 pairs of 2 wires, and all 4 pairs are wrapped in a protective plastic sheath. However, UTP is susceptible to several different types of noise that can lead to signal impairment and even cause the loss of data. UTP uses Manchester encoding.

UTP Noise

When a data transmission is received, the received signal is often modified from the original signal that was transmitted; this modification is caused by noise. Noise is defined as additional unwanted signals that are inserted somewhere between transmission and reception [2]. Three different types of noise affecting UTP will be researched here: thermal noise, crosstalk and intermodulation noise. These sources of noise can be placed into one of two categories, internal noise or external noise. Internal noise is caused by the use of electrical components found in all communication systems; it could be produced by changes in current or imperfections in conducting materials. External noise can be caused by different factors, such as lightning storms or the use of large electrical machinery. [3]

Thermal Noise

Thermal noise, also known as Johnson noise or white noise, was first observed in 1926 by John B. Johnson at Bell Labs.
Thermal noise is caused by electrons that become agitated at any temperature above absolute zero; at this point they begin to move in random patterns and bounce off other electrons. In theory it could be stopped completely if all the components were kept at a temperature of absolute zero, which is 0 kelvin or -273.15°C, as this would mean that all the electrons would move at their slowest and thermal noise would be as good as eliminated; however, achieving absolute zero and maintaining it would be extremely difficult. [4] Thermal noise is found across all the bandwidths typically used in a communication system and currently there is no practical way to completely eliminate it. However, you can use types of modulation that occupy a lower bandwidth, which in turn lowers the thermal noise. For example, if you had an Ethernet system and used PAM-5 modulation, which has a frequency of 125 MHz, this would give you a thermal noise value of 4.8015×10⁻¹³ W at room temperature, whereas if MLT-3 were used, with its 31.25 MHz bandwidth, you would end up with a thermal noise value of 1.200375×10⁻¹³ W under the same temperature conditions. [5] To work this out the equation Pn = k · T · Δf was used, where k is Boltzmann's constant, T is the temperature in kelvin (the temperature in degrees Celsius plus 273, in this case 18 degrees plus 273, which gives 291 for T), and Δf is the bandwidth, 125×10⁶ Hz for PAM-5 and 31.25×10⁶ Hz for MLT-3.

Cross Talk

Crosstalk is caused by the coupling of the copper cables' magnetic and electric fields, which causes some of the signal to become lost or distorted. There are two main types of crosstalk, NEXT (Near End Cross Talk) and FEXT (Far End Cross Talk); NEXT is when the coupling of magnetic and electric fields occurs near the source of the signal and FEXT is when it occurs near the receiver end. To try to prevent crosstalk in UTP cables, the copper wires are twisted into pairs. The number of twists per foot or metre is defined as the twist ratio, so a cable with a higher twist ratio will be more effective at eliminating crosstalk, as the twisting of the copper wires reduces the loop area between them and makes coupling harder. However, a cable with a high twist ratio uses more copper, and the signal has to travel a further distance to the receiver, meaning attenuation could become a factor. [6]

Intermodulation Noise

Intermodulation noise may be present in any communications system that sends signals at different frequencies across the same medium. It produces signals at frequencies that are sums, differences or other integer combinations of the original frequencies. Intermodulation noise is caused by the transmission medium, transmitter or receiver not being a linear system, meaning that instead of the output matching the input, the output is different from the input. It can be caused by excessive signal strength that the device cannot handle, or by a problem with one of the components. As an example of intermodulation noise, if there were two signals at 10 Hz and 15 Hz sharing the same transmission medium and intermodulation noise was present, these two signals could produce a product at 35 Hz. This would mean that not only have the two original signals been disrupted, the product could also potentially disrupt a third signal if another 35 Hz signal were sent out on the medium.
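As a quick numeric check of the figures used in this section, the short Python sketch below evaluates Pn = k · T · Δf for the PAM-5 and MLT-3 bandwidths and lists a few low-order intermodulation products of the 10 Hz and 15 Hz example. The script and its variable names are an illustrative addition rather than part of the original report, and the exact noise values it prints depend on the value taken for Boltzmann's constant and the assumed temperature.

```python
# Thermal noise P_n = k * T * delta_f and intermodulation products.
# Illustrative sketch only; the constant and temperature are assumptions
# matching the worked example in the text (18 degrees C, i.e. T = 291 K).
k = 1.380649e-23          # Boltzmann's constant, J/K
T = 18 + 273              # temperature in kelvin

for name, bandwidth_hz in [("PAM-5", 125e6), ("MLT-3", 31.25e6)]:
    p_n = k * T * bandwidth_hz
    # Roughly 5.0e-13 W and 1.3e-13 W: the same order of magnitude as the
    # figures quoted above, and a factor of four apart because the two
    # bandwidths differ by a factor of four.
    print(f"{name}: thermal noise over {bandwidth_hz / 1e6:.2f} MHz = {p_n:.3e} W")

# Low-order intermodulation products of two tones at 10 Hz and 15 Hz.
f1, f2 = 10.0, 15.0
products = {
    "f1 + f2": f1 + f2,        # 25 Hz  (second-order sum)
    "f2 - f1": f2 - f1,        #  5 Hz  (second-order difference)
    "2*f1 + f2": 2 * f1 + f2,  # 35 Hz  (the third-order product mentioned above)
    "2*f2 - f1": 2 * f2 - f1,  # 20 Hz  (another third-order product)
}
for label, freq in products.items():
    print(f"intermodulation product {label} = {freq:g} Hz")
```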
To overcome intermodulation noise, you can use Orthogonal Frequency-Division Multiplexing (OFDM), which is explained further in the multipath interference section under unguided media. [2]

Unguided Media

When using unguided media in a communications system, the signal is sent through the air via an antenna in the form of electromagnetic waves; these waves have no specific path to follow. Unguided media is used for several different communications systems such as wireless networking, Bluetooth, infrared and satellite. Each of these systems uses a different type of unguided media; for example, satellite uses microwaves. This report will focus on wireless networking and the noise that can affect the radio waves it uses.

Wireless

The first radio waves were sent by Guglielmo Marconi in Italy in 1895, and in 1899 he sent the first wireless radio signal across the English Channel [7]. Wireless networking works by an omnidirectional antenna sending out a broadcast of radio waves. These radio waves are sent at a specific frequency depending on which standard they comply with; for example, if the standard being used is 802.11n then they will be sent at 2.4 GHz or 5 GHz.

Wireless Noise

Wireless can be affected by many different things. Because radio waves travel through the air, they can be affected by different types of weather, like rain or snow causing scattering, or by obstacles such as trees or buildings causing reflections. They can also be affected by other devices transmitting at the same frequency, causing signal loss.

Multipath Interference

Multipath interference is where a receiver receives multiple copies of the same signal at delayed times. This mainly affects radio, as satellite and microwave links are generally line of sight, so there would be few obstacles present for reflection to take place. With radio waves it is caused by the antenna sending out broadcast signals which are then reflected off obstacles; if these reflections arrive at the receiver, it ends up with several copies of the same signal arriving at varying times, and depending on the different path lengths of the original direct signal and the reflected signals, the signal that is eventually received could be larger or smaller. Multipath interference can cause a number of problems, like data corruption, which occurs if the receiver picks up multiple different reflected signals and is unable to determine the transmitted information. It can also cause signal nulling, where the reflected signals are received exactly out of phase with the original signal, causing the original signal to be cancelled out. As well as causing data loss, it can change the amplitude of the signal up or down: if the reflected signals arrive out of phase with the original signal they cause a drop in signal amplitude, but if they arrive in phase with the main signal the amplitude increases. To try to reduce multipath interference a diversity solution can be used. This works by using two antennas with the same gain that are separated from one another but within range of the same transmitter; this means that one of the antennas receives most of the multipath interference, allowing the other antenna to receive a normal signal. [8] Another way to reduce multipath interference is to modulate the signal with Orthogonal Frequency-Division Multiplexing. OFDM works by splitting the signal up into 48 subcarrier signals.
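As a rough illustration of that splitting idea before the details below, here is a minimal Python/numpy sketch that maps a block of bits onto 48 QPSK subcarriers, combines them into one waveform with an inverse FFT, and prepends a guard interval. The 64-point FFT size, the subcarrier placement and the guard length are simplifying assumptions loosely modelled on 802.11a, not an implementation of any standard.

```python
# Toy OFDM transmitter: 48 data subcarriers, 64-point IFFT, cyclic-prefix guard.
# Sketch under stated assumptions; real systems add pilots, interleaving and
# the convolutional coding described below.
import numpy as np

N_FFT = 64    # IFFT size
N_DATA = 48   # parallel data subcarriers
GUARD = 16    # guard-interval (cyclic prefix) length in samples

def qpsk(bits):
    """Map pairs of bits to QPSK symbols (2 bits per subcarrier)."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def ofdm_symbol(bits):
    """Build one OFDM symbol from 96 bits spread over 48 subcarriers."""
    spectrum = np.zeros(N_FFT, dtype=complex)
    # Hypothetical mapping: data on bins 1..48, everything else left empty.
    spectrum[1:N_DATA + 1] = qpsk(bits)
    time_domain = np.fft.ifft(spectrum)          # 48 subcarriers -> one waveform
    return np.concatenate([time_domain[-GUARD:], time_domain])  # prepend guard

bits = np.random.default_rng(0).integers(0, 2, size=96)
tx = ofdm_symbol(bits)
print(len(tx), "samples per OFDM symbol (64 + 16 guard samples)")
```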
These 48 channels each carry a different portion of the data being sent and transmit it in parallel. [9] These subcarrier signals are modulated with BPSK, QPSK, 16-QAM or 64-QAM, and they use a convolutional code rate of 1/2, 2/3 or 3/4. The data rate of the signal is determined by the modulation used and the convolutional code rate. There is also 0.3125 MHz frequency spacing between each of the subcarriers. [10] [11] OFDM also has a guard interval, which means that any data arriving at the receiver will only be sampled once the signal has become stable and no more reflected signals are being picked up that would cause changes to the phase or timing of the signal. Also, because each subcarrier is on a different frequency, any interference caused by reflected signals only affects a small percentage of the subcarriers, meaning that the rest are received correctly. [9]

Impulse Noise

Impulse noise is an unpredictable problem. It consists of short spikes of high amplitude or short irregular pulses. These spikes and pulses are generated by a variety of unpredictable causes, but they usually relate to some sort of electromagnetic instability, for example a lightning storm or faults present in the communications devices. Impulse noise generally affects digital signals worse than analogue signals. For example, if voice data were sent as an analogue signal and there were occurrences of impulse noise, the voice data would still be understandable, as the impulse noise would create short crackles in the data; with a digital signal, however, the result of impulse noise could be that all the bits sent during the impulse are lost. The data can be partially recovered by sampling the received digital waveform once per bit time, but this can still result in a few bits being in error. As impulse noise is unpredictable, there is no way to eliminate it, but to reduce its effects Coded OFDM can be used. This is very similar to OFDM in the way that it splits the signal into multiple subcarriers, but Coded OFDM also includes forward error correction with the data. Because this error correction is included with the data, data lost to impulse noise can be corrected at the receiver. [12]

Conclusion

After researching different types of noise and how they affect data communications, it became clear that noise is a factor present in all systems and cannot be completely eradicated, as it can be caused by several different external sources made by man and by internal sources arising from the data communication equipment itself. However, different strategies, techniques and error correction systems have enabled us to limit the effect that noise can have on a system, and this has enabled technology to advance, meaning the chance of losing crucial data to the effects of noise is far lower now than it was years ago.

Reflection

Throughout this report I have gained a better grasp of different aspects of data communications; for example, noise is present in all systems, as any electronic device creates noise through the movement of electrons, imperfections in conductive surfaces and fluctuations of current. I also increased my knowledge of different types of modulation and how they work, whether by changing frequencies or by sending additional data to help with error correction.
I have also gained knowledge of how noise can be caused by different types of weather and how weather can affect electromagnetic fields and have detrimental effects on data communication systems. Not only did this report help me gain more knowledge of data communications, it also increased my knowledge of different aspects of physics and of how closely the two subjects are connected. I feel I completed this report to a reasonably high standard and found plenty of information available on the subject. Understanding this information was more difficult than expected, as maths features heavily in several of the sources I found; however, this did not put me off, it simply led me to try to comprehend the more complex mathematical side of the topic. Once I had completed the report I had to remove some parts, as I had overshot the word count; this proved difficult, as I felt I would be missing things out if I removed some. Overall I would say I learned a great deal more about the complexity of noise and data communication systems.
Sunday, August 4, 2019
The Failure Of Baggage Handling Systems Information Technology Essay
What factors caused the failure of the baggage handling systems?

From the article it is obvious that the city officials and BAE executives were at loggerheads and blamed each other for the failure of the handling system. BAE's president and chief executive, Gene Di Fonso, supports his argument against the Denver city officials by pointing out that frequent alteration of the airport plans, the involvement of inexperienced managers (appointed by Denver city officials) and the failure to fix electrical flaws left minimal time for testing the system, and that these were the major reasons behind the baggage handling system's failure. On the other hand, the city officials blame BAE for not fixing the software and mechanical problems by the time the system was to be operational. As it turns out, neither side completely denies the accusations made by the other. So, from the article, it is obvious that since neither party fulfilled its responsibilities, all the factors mentioned above contributed equally to the failure of the baggage handling system at Denver Airport. To put it in simple words, the DIA project failed because those making key decisions underestimated the complexity involved. Failure to recognize the complexity and the risk involved contributed to the project being initiated too late.

What could have been done by all stakeholders to prevent the failure caused by new technology introduction?

It is always possible that unprofessional behavior by the city officials, or defective equipment and software malfunctions, are partly to blame for the failure of the baggage handling system. But searching for a scapegoat is far simpler than trying to understand the difficulties faced when developing large-scale projects. The project management team needed to do a better job of planning prior to the start of the project. The major roadblock was the simple fact that the automated baggage system was designed after the airport construction had already begun, when it should have been included in the original design of the airport. Lack of communication between the DIA airport designers, city officials, the airlines and BAE caused further damage to the project. Before beginning construction, all the stakeholders needed to meet to put together a formalized plan. Since this did not happen, the communication resembled a top-down approach.

Give one public works (government) project that has had a similar or different fate since 1995, and draw comparisons.

The more recent failure of the DART mission by NASA's Marshall Space Flight Center is an example of a technology project that did not end with the expected outcome. The DART project's biggest problem was that it had only one shot at testing the technology. Complex hardware and software can fail from just one mistake, flaw, or overlooked factor among millions of actions or components. A Mishap Investigation Board investigated the mishap and determined its underlying causes based on hardware testing, telemetry data analysis, and numerous simulations. Comparing this with the DIA project, we can find similarities in most aspects of its failure, such as hardware and software malfunction and testing problems.

What are the general lessons for this case?

As with any project, the initial step should be to recognize the situation and then work towards it. Had the project management team and BAE's executives recognized their lack of knowledge and the complexities they were facing, they could possibly have reduced the risk, if not avoided it.
It would also have been helpful to listen to those who did have the necessary prior experience. Stakeholder conflict, as in this case, with poorly defined roles and responsibilities and almost non-existent communication, can lead to disastrous project results. To sum it all up, the factors that eventually resulted in the failure of the DIA project included poor management, conflicting roles and responsibilities, poor communication, no change control process, inadequate testing processes, stakeholder conflict, probably conflicting priorities, and finally scope creep, by which I mean the expansion of the initial project design.
The most essential factor that helps a project succeed is a scope that is well defined from the beginning. The scope of the project, if at all possible, should not be allowed to expand. Scope creep ultimately destroys budgets and leads to overtime, thus undermining the support a project has.

Dysfunctional decision making is the poison that kills technology projects, and the Denver Airport baggage system project is a classic example. The DIA case examines the key decisions that set the project on the path to disaster and the forces behind those decisions. What was supposed to be the world's largest automated airport baggage handling system became a classic story of how technology projects can go wrong. The airport's baggage handling system was a critical component in the plan: by automating baggage handling, DIA was going to ensure faster aircraft turnaround, which would have provided a competitive advantage over other airports. Despite the plan being technologically advanced and a possible winner, it rapidly dissolved into chaos because the project's complexity was underestimated, which resulted in extensive problems and eventually an embarrassment for everyone involved. The missteps along the way included a demonstration of the system to the media which showed the system crushing bags, disgorging their contents, and so on. While it is challenging to manage and carry out a technology project on such a massive scale, what it requires is precision in planning, scheduling and controlling, and the management of critical interfaces with all the stakeholders involved.
Person Centered Therapy and Cognitive Behavioral Therapy for Post Traumatic Stress Disorder
Carl Rogers believed that everyone is inherently good; therefore, even the vilest of people would be included. Some behavior theorists have argued that because the theory lacks structure, it is not as effective in treating illness. However, it is one of the main theories utilized by therapists today. Cognitive Behavioral Therapy is another popular theory that is used. It emphasizes the present and fixing cognitive distortions that clients may have. However, it too has received some arguments against it, such as treating symptoms rather than the underlying cause of an illness. The theories that will be discussed are Carl Rogers's theory of Person Centered Therapy and Aaron Beck's Cognitive Behavioral Therapy (CBT), and how they would treat Post Traumatic Stress Disorder (PTSD). In Person Centered Therapy, the therapist establishes a solid therapeutic alliance with the client. "The therapeutic alliance is a more encompassing term for therapy that emphasizes the collaborative nature of the partnership between counselor and client. This partnership incorporates client preferences and goals into treatment and outlines methods for accomplishing those goals. The therapeutic alliance is an alliance based on listening to the client without being judgmental or giving unwarranted advice." Individuals are working toward self-actualization. They also look for ways to improve experiences. Individuals try hard to reach an optimal sense of satisfaction. This eventually leads them to become fully functioning. After the individual is fully functioning they are able to trust their own feelings and experience a better life (Rogers, 1961). Rogers found that very few become fully functioning. To cope with this they create defense mechanisms....
Saturday, August 3, 2019
Free Essays - Tale of Two Cities
The main purpose of this book is to show the contrasts between the peaceful city of London and the city of Paris, tearing itself apart in revolution. This is apparent in the very first line of the book, "It was the best of times, it was the worst of times...." This is a contrast of the two cities: London, the tranquil home of Mr. Lorry and the Darnays, and Paris, the center of a bloody revolution. The author shows gentleness in these violent times in the persons of Dr. Manette and Lucie Manette, both gentle and peaceful. He also characterizes the evil side of the revolution in the apathetic and depraved Monsieur and Madame Defarge, who go about their business while death carts roll-- as do heads-- through the streets of Paris. He does, though, depict a ray of light amongst all this evil: the heroic Carton, who gave his life for his friend and for a woman he knew he would never have. The biggest contrast of all is in the person of Charles Darnay, the gentle English family man who is also related to the evil Marquis Evremonde.

I personally like stories that use historical events as backdrops because it brings these seemingly distant events closer to us. This book definitely offers insight into life in the two cities at the time of the French Revolution. I think it does an excellent job of depicting just how totally involved some people became in the revolution. It shows how people were blinded by the desire for freedom from their former oppressors, so much so that they attacked anyone and anything that was even remotely related to their past rulers. I think this was effectively done by excellent characterization, using each character to depict a different aspect of society, then contrasting them by making them rivals. I really took away a different view of that time period. Some of the language used was definitely outdated; it was exactly what you would expect for a novel of that time period. I was able to follow the story pretty well, although there were a few times, in switching back and forth between cities, that I got a little lost. Still, on the whole, I liked the way the story flowed. Unlike some stories of that time, there wasn't really any profanity or taking of God's name in vain, which is always good to see.
Friday, August 2, 2019
Fahrenheit 451 Summary Essay
Fahrenheit 451 by Ray Bradbury
Matthew Hart, Nov. 12, 12

Fahrenheit 451 doesn't provide a single, clear explanation of why books are banned in the future. Instead, it suggests that many different factors could combine to create this result. These factors can be broken into two groups: factors that lead to a general lack of interest in reading and factors that make people actively hostile toward books. The novel doesn't clearly distinguish these two developments; apparently, they simply support one another. The first group of factors includes the popularity of competing forms of entertainment such as television and radio. More broadly, Bradbury thinks that the presence of fast cars, loud music, and advertisements creates a lifestyle with too much stimulation in which no one has the time to concentrate. Also, the huge mass of published material is too overwhelming to think about, leading to a society that reads condensed books rather than the real thing.

Guy Montag is a fireman in charge of burning books in a grim, futuristic United States. The book opens with a brief description of the pleasure he experiences while on the job one evening. He wears a helmet emblazoned with the numeral 451, the temperature at which paper burns, a black uniform with a salamander on the arm, and a phoenix disc on his chest. On his way home from the fire station, he feels a sense of nervous anticipation. After suspecting a lingering nearby presence, he meets his new neighbor, an inquisitive and unusual seventeen-year-old named Clarisse McClellan. She immediately recognizes him as a fireman and seems fascinated by him and his uniform. She explains that she is crazy and proceeds to suggest that the original duty of firemen was to extinguish fires rather than to light them. She asks him about his job and tells him that she comes from a strange family that does such peculiar things as talk to each other and walk places. Clarisse's strangeness makes Guy nervous, and he laughs repeatedly and involuntarily. She reminds him in different ways of candlelight, a clock, and a mirror. He cannot help feeling somehow attracted to her. She fascinates him with her outrageous questions, unorthodox lifestyle, perceptive observations, and incredible power of identification. She asks him if he is happy and then disappears into her house. Pondering the absurd question, he enters his house and thinks about this stranger and her comprehension of his innermost trembling thought.

Montag and Mildred spend the afternoon reading. The Mechanical Hound comes and sniffs at the door. Montag speculates about what it was that made Clarisse so unique. Mildred refuses to talk about someone who is dead and complains that she prefers the people and the pretty colors on her TV walls to books. Montag feels that books must somehow be able to help him out of his ignorance, but he does not understand what he is reading and decides that he must find a teacher. He thinks back to an afternoon a year before when he met an old English professor named Faber in the park. It was apparent that Faber had been reading a book of poetry before Montag arrived. The professor had tried to hide the book and run away, but after Montag reassured him that he was safe, they talked, and Faber gave him his address and phone number. Now Montag calls the professor. He asks him how many copies of the Bible, Shakespeare, or Plato are left in the country. Faber, who thinks Montag is trying to trap him, says none are left and hangs up the phone.
Montag goes back to his pile of books and realizes that he took from the old woman what may be the last copy of the Bible in existence. He considers turning in a substitute to Beatty (who knows he has at least one book), but he realizes that if Beatty knows which book he took, the chief will guess that he has a whole library if he gives him a different book. He decides to have a duplicate made before that night. Mildred tells him that some of her friends are coming over to watch TV with her. Montag, still trying to connect with her, asks her rhetorically if the "family" on TV loves her. She dismisses his question. He takes the subway to Faber's, and on the way tries to memorize verses from the Bible. A jingle for Denham's Dentifrice toothpaste distracts him, and finally he gets up in front of all the passengers and screams at the radio to shut up, waving his book around. The astonished passengers start to call a guard, but Montag gets off at the next stop.

Montag goes to Faber and shows him the book, which alleviates Faber's fear of him, and he asks the old man to teach him to understand what he reads. Faber says that Montag does not know the real reason for his unhappiness and is only guessing that it has something to do with books, since they are the only things he knows for sure are gone. Faber insists that it's not the books themselves that Montag is looking for, but the meaning they contain. The same meaning could be included in existing media like television and radio, but people no longer demand it. Faber compares their superficial society to flowers trying to live on flowers instead of on good, substantive dirt; people are unwilling to accept the basic realities and unpleasant aspects of life. Faber says that people need quality information, the leisure to digest it, and the freedom to act on what they learn. He defines quality information as a textured and detailed knowledge of life, knowledge of the "pores" on the face of humanity. Faber agrees with Mildred that television seems more "real" than books, but he dislikes it because it is too invasive and controlling. Books at least allow the reader to put them down, giving one time to think and reason about the information they contain.

Montag suggests planting books in the homes of firemen to discredit the profession and see the firehouses burn. Faber doesn't think that this action would get to the heart of the problem, however, lamenting that the firemen aren't really necessary to suppress books because the public stopped reading them of its own accord even before they were burned. Faber says they just need to be patient, since the coming war will eventually mean the death of the TV families. Montag concludes that they could use that as a chance to bring books back. Montag bullies Faber out of his cowardice by tearing pages out of the precious Bible one by one, and Faber finally agrees to help, revealing that he knows someone with a printing press who used to print his college newspaper. Montag asks for help with Beatty that night, and Faber gives him a two-way radio he has created that will fit in Montag's ear; that way the professor can hear what Beatty has to say and can also prompt Montag. Montag decides to risk giving Beatty a substitute book, and Faber agrees to see his printer friend.
Mildred rushes out of the house with a suitcase and is driven away in a taxi, and Montag realizes she must have called in the alarm. Beatty orders Montag to burn the house by himself with his flamethrower and warns that the Hound is on the watch for him if he tries to escape. Montag burns everything, and when he is finished, Beatty places him under arrest. Beatty sees that Montag is listening to something and strikes him on the head.The radio falls out of Montagââ¬â¢s ear, and Beatty picks it up, saying that he will have it traced to find the person on the other end. After Beatty eggs him on with more literary quotations, his last a quote from Julius Caesar, Montag turns his flamethrower on Beatty and burns him to a crisp. The other firemen do not move, and he knocks them out. The Mechanical Hound appears and injects Montagââ¬â¢s leg with anesthetic before he manages to destroy it with his flamethrower. Montag stum bles away on his numb leg. He goes to where he hid the books in his backyard and finds four that Mildred missed.He hears sirens approaching and tries to continue down the alley, but he falls and begins to sob. He forces himself to rise and runs until the numbness leaves his leg. Montag puts a regular Seashell radio in his ear and hears a police alert warning people to be on the lookout for him, that he is alone and on foot. He finds a gas station and washes the soot off his face so he will look less suspicious. He hears on the radio that war has been declared. He starts to cross a wide street and is nearly hit by a car speeding toward him.At first, Montag thinks it is the police coming to get him, but he later realizes the carââ¬â¢s passengers are children who would have killed him for no reason at all, and he wonders angrily whether they were the motorists who killed Clarisse. He creeps into one of his coworkersââ¬â¢ houses and hides the books, then calls in an alarm from a p hone booth. He goes to Faberââ¬â¢s house, tells him what has happened, and gives the professor some money. Faber instructs him to follow the old railroad tracks out of town to look for camps of homeless intellectuals and tells Montag to meet him in St.Louis sometime in the future, where he is going to meet a retired printer. Faber turns on the TV news, and they hear that a new Mechanical Hound, followed by a helicopter camera crew, has been sent out after Montag. Montag takes a suitcase full of Faberââ¬â¢s old clothes, tells the professor how to purge his house of Montagââ¬â¢s scent so the Hound will not be led there, and runs off into the night. Faber plans to take a bus out of the city to visit his printer friend as soon as possible. Captain Beatty comes by to check on Montag, saying that he guessed Montag would be calling in sick that day.He tells Montag that every fireman runs into the ââ¬Å"problemâ⬠he has been experiencing sooner or later, and he relates to him the history of their profession. Beattyââ¬â¢s monologue borders on the hysterical, and his tendency to jump from one thing to another without explaining the connection makes his history very hard to follow. Part of the story is that photography, film, and television made it possible to present information in a quickly digestible, visual form, which made the slower, more reflective practice of reading books less popular.Another strand of his argument is that the spread of literacy, and the gigantic increase in the amount of published materials, created pressure for books to be more like one another and easier to read. 
Montag withdraws money from his account to give to Faber and listens to reports over the radio that the country is mobilizing for war. Faber reads to him from the Book of Job over the two-way radio in his ear. He goes home, and two of Mildred's friends, Mrs. Phelps and Mrs. Bowles, arrive and promptly disappear into the TV parlor. Montag turns off the TV walls and tries to engage the three women in conversation. They reluctantly oblige him, but he becomes angry when they describe how they voted in the last presidential election, based solely on the physical appearance and other superficial qualities of the candidates.

After witnessing the anonymous scapegoat's death on the television, Granger turns to Montag and ironically remarks, "Welcome back to life." He introduces Montag to the other men, who are all former professors and intellectuals. He tells Montag that they have perfected a method of recalling word for word anything that they have read once. Each one of them has a different classic book stored in his memory.
Thursday, August 1, 2019
Computers – Invention of the Century
The History of Computers

Only once in a lifetime will a new invention come about that touches every aspect of our lives. Such devices change the way we manage, work, and live. A machine that has done all this and more now exists in nearly every business in the United States. This incredible invention is the computer. The electronic computer has been around for over a half-century, but its ancestors have been around for 2000 years. However, only in the last 40 years has the computer changed American management to its greatest extent. From the first wooden abacus to the latest high-speed microprocessor, the computer has changed nearly every aspect of management, and our lives, for the better.

The very earliest ancestor of the modern-day computer is the abacus. These date back almost 2000 years (Dolotta, 1985). It is simply a wooden rack holding parallel wires on which beads are strung. When these beads are moved along the wires according to programming rules that the user must memorize, all ordinary arithmetic operations can be performed on the abacus. This was one of the first management tools used.

The next innovation in computers took place in 1642, when Blaise Pascal invented the first digital calculating machine. It could only add numbers, and they had to be entered by turning dials. It was designed to help Pascal's father, who was a tax collector, manage the town's taxes (Beer, 1966).

In the early 1800s, a mathematics professor named Charles Babbage designed an automatic calculation machine (Dolotta, 1985). It was steam powered and could store up to 1000 50-digit numbers. Built into his machine were operations that included everything a modern general-purpose computer would need. It was programmed by, and stored data on, cards with holes punched in them, appropriately called punch cards. This machine was extremely useful to managers who dealt with large volumes of goods. With Babbage's machine, managers could more easily calculate the large numbers accumulated by inventories. The only problem was that only one of these machines was ever built, which made it difficult for all managers to use (Beer, 1966).

After Babbage, people began to lose interest in computers. However, between 1850 and 1900 there were great advances in mathematics and physics that began to rekindle the interest. Many of these new advances involved complex calculations and formulas that were very time consuming for human calculation. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read the information on cards without human intervention (Dolotta, 1985). Since the population of the U.S. was increasing so fast, the computer was an essential tool for managers in tabulating the totals (Hazewindus, 1988).

These advantages were noted by commercial industries and soon led to the development of improved punch-card business-machine systems by International Business Machines, Remington-Rand, Burroughs, and other corporations (Chposky, 1988). By modern standards the punched-card machines were slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 digits. At the time, however, punched cards were an enormous step forward; they provided a means of input, output, and memory storage on a massive scale. For more than 50 years following their first use, punched-card machines did the bulk of the world's business computing (Jacobs, 1975).
By the late 1930s punched-card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic digital computer based on standard IBM electromechanical parts (Chposky, 1988). Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations (Dolotta, 1985). Also, it had special built-in programs to handle logarithms and trigonometric functions. The Mark I was controlled from prepunched paper tape. Output was by card punch and electric typewriter. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations without human intervention.

The outbreak of World War II produced a desperate need for computing capability, especially for the military (Dolotta, 1985). New weapons systems were produced which needed trajectory tables and other essential data. In 1942, John P. Eckert, John W. Mauchly, and their associates at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for Electronic Numerical Integrator And Computer (Chposky, 1988). It could multiply two numbers at the rate of 300 products per second, by finding the value of each product from a multiplication table stored in its memory. ENIAC was thus about 1,000 times faster than the previous generation of computers. ENIAC used 18,000 standard vacuum tubes, occupied 1800 square feet of floor space, and used about 180,000 watts of electricity. It used punched-card input and output. The ENIAC was very difficult to program because one had to essentially re-wire it to perform whatever task was wanted. It was efficient in handling the particular programs for which it had been designed. ENIAC is generally accepted as the first successful high-speed electronic digital computer and was used in many applications from 1946 to 1955. However, the ENIAC was not accessible to managers of businesses (Beer, 1966).

The mathematician John von Neumann was very interested in the ENIAC. In 1945 he undertook a theoretical study of computation that demonstrated that a computer could have a very simple, fixed structure and yet be able to execute any kind of computation effectively by means of properly programmed control, without the need for any changes in hardware. Von Neumann came up with incredible ideas for methods of building and organizing practical, fast computers. These ideas, which came to be referred to as the stored-program technique, became fundamental for future generations of high-speed digital computers and were universally adopted (Dolotta, 1985).

The first wave of modern programmed electronic computers to take advantage of these improvements appeared in 1947. This group included computers using random access memory, RAM, which is a memory designed to give almost constant access to any particular piece of information (Dolotta, 1985). These machines had punched-card or punched-tape input and output devices and RAMs of 1000-word capacity. Physically, they were much more compact than ENIAC: some were about the size of a grand piano and required 2500 small electron tubes. This was quite an improvement over the earlier machines. The first-generation stored-program computers required considerable maintenance, usually attained 70% to 80% reliable operation, and were used for 8 to 12 years (Hazewindus, 1988).
Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. This group of machines included EDVAC and UNIVAC, the first commercially available computers. With this invention, managers had even more power to perform calculations on such things as statistical demographic data (Beer, 1966). Before this time, it was very rare for a manager of a larger business to have the means to process large numbers in so little time.

The UNIVAC was developed by John W. Mauchly and John Eckert, Jr. in the 1950s. Together they had formed the Eckert-Mauchly Computer Corporation, America's first computer company, in the 1940s. During the development of the UNIVAC, they began to run short on funds and sold their company to the larger Remington-Rand Corporation. Eventually they built a working UNIVAC computer. It was delivered to the U.S. Census Bureau in 1951, where it was used to help tabulate the U.S. population (Hazewindus, 1988).

Early in the 1950s two important engineering discoveries changed the electronic computer field. The first computers were made with vacuum tubes, but by the late 1950s computers were being made out of transistors, which were smaller, less expensive, more reliable, and more efficient (Dolotta, 1985). In 1959, Robert Noyce, a physicist at the Fairchild Semiconductor Corporation, invented the integrated circuit, a tiny chip of silicon that contained an entire electronic circuit. Gone was the bulky, unreliable, but fast machine; now computers began to become more compact, more reliable, and of greater capacity. These new technical discoveries rapidly found their way into new models of digital computers. Memory storage capacities increased 800% in commercially available machines by the early 1960s, and speeds increased by an equally large margin (Jacobs, 1975). These machines were very expensive to purchase or to rent and were especially expensive to operate because of the cost of hiring programmers to perform the complex operations the computers ran. Such computers were typically found in large computer centers operated by industry, government, and private laboratories, staffed with many programmers and support personnel. By 1956, 76 of IBM's large computer mainframes were in use, compared with only 46 UNIVACs (Chposky, 1988).

In the 1960s efforts to design and develop the fastest possible computers with the greatest capacity reached a turning point with the completion of the LARC machine for Livermore Radiation Laboratories by the Sperry-Rand Corporation, and the Stretch computer by IBM. The LARC had a core memory of 98,000 words and multiplied in 10 microseconds. Stretch was provided with several ranks of memory, with slower access for the ranks of greater capacity; the fastest access time was less than 1 microsecond and the total capacity was in the vicinity of 100 million words.

During this time the major computer manufacturers began to offer a range of computer capabilities, as well as various computer-related equipment (Jacobs, 1975). These included input means such as consoles and card feeders; output means such as page printers, cathode-ray-tube displays, and graphing devices; and optional magnetic-tape and magnetic-disk file storage. These found wide use in management for such applications as accounting, payroll, inventory control, ordering supplies, and billing.
Central processing units for such purposes did not need to be very fast arithmetically and were primarily used to access large numbers of records on file. The greatest number of computer systems were delivered for the larger applications, such as in hospitals for keeping track of patient records, medications, and treatments given. They were also used in automated library systems and in database systems such as the Chemical Abstracts system, where the computer records now on file cover nearly all known chemical compounds (Dolotta, 1985).

The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less-costly computer systems (Jacobs, 1975). Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, began using computers of relatively modest capability for controlling and regulating their activities. In the 1960s the programming of applications problems had been an obstacle to the self-sufficiency of moderate-sized on-site computer installations, but great advances in applications programming languages removed these obstacles. Applications languages became available for controlling a great range of manufacturing processes, for computer operation of machine tools, and for many other tasks.

In 1971 Marcian E. Hoff, Jr., an engineer at the Intel Corporation, invented the microprocessor, and another stage in the development of the computer began. A new revolution in computer hardware was now well under way, involving miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration techniques. In the 1950s it had been realized that scaling down the size of electronic digital computer circuits and parts would increase speed and efficiency and improve performance; however, at that time the manufacturing methods were not good enough to accomplish such a task. About 1960, photoprinting of conductive circuit boards to eliminate wiring became highly developed. Then it became possible to build resistors and capacitors into the circuitry by photographic means. In the 1970s entire assemblies, such as adders, shifting registers, and counters, became available on tiny chips of silicon. In the 1980s very large scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became increasingly common.

Many companies, some new to the computer field, introduced programmable minicomputers supplied with software packages in the 1970s. The size-reduction trend continued with the introduction of personal computers, which are programmable machines small enough and inexpensive enough to be purchased and used by individuals. One of the first such machines was introduced in January 1975. Popular Electronics magazine provided plans that would allow any electronics wizard to build his own small, programmable computer for about $380. The computer was called the Altair 8800. Its programming involved pushing buttons and flipping switches on the front of the box. It didn't include a monitor or keyboard, and its applications were very limited. Even so, many orders came in for it, and several famous owners of computer and software manufacturing companies got their start in computing through the Altair. For example, Steve Jobs and Steve Wozniak, founders of Apple Computer, built a much cheaper, yet more productive, version of the Altair and turned their hobby into a business.
After the introduction of the Altair 8800, the personal computer industry became a fierce battleground of competition. IBM had been the computer industry standard for well over a half-century. They held their position as the standard when they introduced their first personal computer, the IBM Model 60, in 1975. However, the newly formed Apple Computer company was releasing its own personal computer, the Apple II. The Apple I was the first computer designed by Jobs and Wozniak in Wozniak's garage, and it was not produced on a wide scale. Software was needed to run the computers as well. Microsoft developed a Disk Operating System, MS-DOS, for the IBM computer, while Apple developed its own software. Because Microsoft had now set the software standard for IBM machines, every software manufacturer had to make its software compatible with Microsoft's. This would lead to huge profits for Microsoft.

The main goal of the computer manufacturers was to make the computer as affordable as possible while increasing speed, reliability, and capacity. Nearly every computer manufacturer accomplished this, and computers popped up everywhere. Computers were in businesses keeping track of ever larger inventories for managers. Computers were in colleges aiding students in research. Computers were in laboratories making complex calculations at high speeds for scientists and physicists. The computer had made its mark everywhere in management and built up a huge industry.

The future is promising for the computer industry and its technology. The speed of processors is expected to double every year and a half in the coming years. As manufacturing techniques are further perfected, the prices of computer systems are expected to fall steadily. However, since microprocessor technology will keep improving, its higher cost will offset the drop in price of older processors. In other words, the price of a new computer will stay about the same from year to year, but the technology will steadily increase.

Since the end of World War II, the computer industry has grown from a standing start into one of the biggest and most profitable industries in the United States. It now comprises thousands of companies, making everything from multi-million dollar high-speed supercomputers to printout paper and floppy disks. It employs millions of people and generates tens of billions of dollars in sales each year. Surely, the computer has impacted every aspect of people's lives. It has affected the way people work and play. It has made everyone's life easier by doing difficult work for people. The computer truly is one of the most incredible inventions in history to ever influence management, and life.