
Master's Thesis

Abstract

This master's thesis concerns the development of embedded control systems. The development process for embedded control systems involves several steps, such as control design, rapid prototyping, fixed-point implementation and hardware-in-the-loop simulation. Another step, which Volvo is currently not using within climate control, is on-line tuning. One reason for not using this technique today is that the available tools for the task (ATI Vision, INCA from ETAS, CalDesk from dSPACE) do not handle parameter dependencies in a satisfactory way. Under these constraints it is not possible to use on-line tuning, and the controller development process is more laborious and time consuming. The main task of this thesis is to solve the problem of parameter dependencies and to make on-line tuning possible.

1 INTRODUCTION

1.1 Background

Volvo Technology (VTEC) is an innovation company that provides expert functions and develops new technology for hardware as well as software products within the transport and vehicle industry. Among other things, VTEC works with embedded control systems. For one of these in particular, the Climate Control Module (CCM), VTEC works with the whole development chain, doing so for Volvo Cars, Volvo Trucks, Volvo Construction Equipment, Renault Trucks and Land Rover.

The work process for embedded control system development is typically as follows:
- Control design
- Rapid control prototyping
- Fixed-point implementation
- Hardware-in-the-loop simulation
- On-line tuning

It is an iterative process, but there is a problem with the last step which limits the possibilities of working iteratively. Control design is typically done in MATLAB/Simulink, and fixed-point implementation with a tool such as TargetLink. During these steps the parameters may be handled in an m-file. When going to the on-line tuning step, however, the parameters are handled in a tool such as ATI Vision, INCA or CalDesk. Once this step is taken, the connection to the m-file is lost; the last step is therefore somewhat of a one-way step. It is not completely impossible to go back to earlier steps in the development chain, but the iterative process is not well supported by today's on-line tuning tools.

The following m-script instructions are examples of parameter dependencies that cause the problems mentioned above:

Heating = [ -100, -20, 0, 20, 100 ];
BlowerHt = [ 12, 5, 4, 5, 10 ];
Blower_min = min( BlowerHt );
Defrosting = [ 0, 20, 100 ];
BlowerDef = [ Blower_min, Blower_min, 10 ];

Using the above vectors in two interpolation tables, one with Heating as input vector and BlowerHt as output vector, and another with Defrosting as input vector and BlowerDef as output vector, would cause problems during the on-line tuning process. Three of the elements are meant to have identical values, but today's tools would allow them to be tuned individually. This is just one of many constructs that are very useful as long as you stay in the MATLAB environment but cause problems during on-line tuning.

1.2 Goals and objectives

The main goals of this master's thesis are:
- To investigate the problem of parameter dependencies.
- To find possible solutions.
- To make on-line tuning possible for dependent parameters in the development process of embedded control systems.

2 BACKGROUND

2.1 Embedded Systems

2.1.1 History of Embedded Systems

In the era of the earliest development of computers, i.e. the 1930s and 1940s, computers were generally capable of performing a single task. Over time, with advances in technology, traditional electromechanical sequencers gave way to programmable controllers built from solid-state devices. One of the first recognizably modern embedded systems was the Apollo Guidance Computer, developed by Charles Stark Draper at the MIT Instrumentation Laboratory [1]. After the early applications in the 1960s, the prices of embedded systems have come down and their processing power has increased dramatically. A standard for programmable microcontrollers was released in 1978 by the National Electrical Manufacturers Association; it covered almost any computer-based controller, such as event-based controllers and single-board computers. When the production cost of microprocessors and microcontrollers fell, it became feasible to replace old, large and expensive components such as potentiometers and varicaps with microprocessor-read knobs. With the integration of microcontrollers, the range of applications of embedded systems has increased further, into areas where computers would generally not have been considered. Most of the complexity is contained within the microcontroller itself and very few additional components are needed, so most of the development effort lies in the software.

2.1.2 Common Characteristics

Embedded systems have several common characteristics.

Uni-functional: Embedded systems are usually designed to execute only one program, repeatedly. For example, an ordinary scientific calculator will always do only calculations. A laptop computer, on the other hand, can execute an enormous number of different programs, such as web browsers, word processors, programming tools and video games, and new programs are added frequently.

Tightly constrained: All computing systems have constraints on design metrics, but for embedded systems these constraints can be very tight. A design metric is a measure of an implementation's features, such as cost, size, performance and power. Embedded systems are often required to cost just a few dollars, to be small enough to fit on a single chip, to process real-time data fast enough, and to consume minimal power in order to extend battery life or to avoid the need for a cooling fan.

Reactive and real-time: Many embedded systems must continually react to changes in the system's environment and must compute certain results in real time without excessive delay. For example, a cruise controller in a car has to monitor and react to speed and brake sensors continuously. It must compute accelerations and decelerations repeatedly within a limited time; a delay in the computation could result in a failure to maintain control of the car. A desktop computer, on the other hand, generally focuses on computations, with comparatively infrequent reactions to input devices, and a delay in those computations may be inconvenient to the user but does not result in a system failure.
2.2 Model-Based Design

Model-based design (MBD) is a mathematical and visual method of addressing the problems associated with designing complex control systems. It is used in industrial equipment design and in automotive and aerospace applications; in this thesis the focus is on climate control in new vehicles. The methodology is used for designing embedded software, and the development consists of four steps:
- Modeling a plant.
- Analyzing and synthesizing a controller for the plant.
- Simulating the plant and the controller.
- Integrating all these phases by deploying the controller.

Model-based design is quite different from the conventional design method. The designer uses continuous- and discrete-time building blocks instead of long and complex hand-written code, which enables fast prototyping, testing and verification. In addition, dynamic effects on the system can be tested in hardware-in-the-loop (HIL) simulation. Some important steps in the model-based design approach are:
- By choosing an appropriate algorithm and acquiring real-world system data, various simulations and analyses can be performed before producing a real controller.
- The model produced in the first step is used to identify the characteristics of the plant; a controller is then designed based on these characteristics.
- Using this model, the effect of time-varying inputs can be analyzed. In this way possible errors can be eliminated, and it is convenient to change and test other parameters.
- The last step is deployment.

Compared with the conventional approach, model-based design has the following advantages:
- It provides a common design environment, which is important for development groups both for general communication and specifically for data analysis and system verification.
- It enables engineers to detect and correct errors in an early phase of development, which is crucial for minimizing the time and financial impact on the system.
- Models can be reused later for upgrades and for derivative systems that can be expanded.

2.3 ECU Development

2.3.1 Conventional Approach for ECU Development

The conventional approach for electronic control unit (ECU) development is summarized in the following four steps:
- Experienced personnel define the functions and the system architecture, and a hardware engineer designs the hardware circuit.
- A control engineer designs the control algorithms, and a programmer writes the code for these algorithms by hand.
- The control algorithm code and the hardware are integrated and tested by a system engineer or a hardware engineer.
- The complete system is tested on the engine test bench.

There are a few problems with this conventional approach. The first and most important is that the hardware circuits are built before the control rules and results have been confirmed; this alone adds a large risk to the ECU development process. Secondly, if an error is encountered during code testing, it is difficult to judge whether the error comes from the software code or from the control algorithms. Hand-coding the control algorithm is itself a very time-consuming process, and debugging takes additional time when errors are encountered.
Since many people from different fields of work are involved in this process, coordination between them also takes time, which increases the development cost [2]. This is why the conventional development process cannot satisfy the demands and requirements of the modern age.

2.3.2 Modern ECU Development

On the basis of an integrated development environment, the modern development of electronic control units can be completed and tested efficiently. Using model-based simulation and hardware-in-the-loop simulation, it is easy and convenient to eliminate software errors and to modify the control algorithms. This reduces the development cost and improves development efficiency. The modern development process is called the V-cycle development process and is illustrated in Fig. 1.

Fig. 1. The V-cycle of model-based software development [2].

The process is summarized as follows:
- Using tools such as MATLAB/Simulink/Stateflow and dSPACE TargetLink, the control algorithms are modeled and then confirmed using off-line simulations.
- ANSI C code is produced using a code generation tool, in this case dSPACE TargetLink.
- The code produced in the previous step is compiled and downloaded into the control module, and simulation is done in hardware-in-the-loop mode, which confirms the credibility of the control algorithms.
- The tested program code of the control algorithms is downloaded into the electronic control unit for further test and modification.
- Finally, the whole control system is calibrated.

2.4 Universal Measurement and Calibration Protocol (XCP)

XCP is a standardized and universally applicable protocol with much rationalization potential. It is used not only in ECU development, calibration and programming, but also to integrate any desired measurement equipment for prototype development, for functional development with bypassing, and at SIL (software-in-the-loop) and HIL (hardware-in-the-loop) test stands [16].

For calibration and measurement it is common practice to connect electronic control units in a CAN network (details about CAN are provided in Appendix A), and for this purpose the CAN Calibration Protocol has been used extensively. With increasing demands for more sophisticated controllers, electronic control units are becoming more and more complex, and for that reason new networks such as FlexRay and TTCAN are being developed. To meet the needs of these new networks, the measurement and calibration protocol has to be more generalized and flexible. This generalized and flexible protocol is XCP (Universal Measurement and Calibration Protocol). XCP is independent of the transport layer; the X generalizes the various transport layers that are used by the members of the protocol family [9], e.g.:
- XCP on CAN
- XCP on FlexRay
- XCP on Ethernet
- XCP on USB, and so on.

Fig. 2. XCP support for different transport layers [10].

3 PROBLEM INVESTIGATION

This chapter answers the following questions:
- What is the parameter dependency problem?
- How does the parameter dependency problem affect tuning of embedded control systems?
- What are the difficulties in solving the problem on the different platforms?

Note: All examples used in this report are for illustration purposes only and are NOT the actual parameters used in the climate control modules of Volvo Cars and Volvo Trucks.
3.1 Complete Process for Developing Embedded Control Systems

The complete process for developing embedded control systems is illustrated in Fig. 3. The first step of this development process is to define the parameters, which is done in the m-file. These parameter values are loaded into the MATLAB base workspace, from where the TargetLink/Simulink model fetches them to simulate the process. After checking the simulation results and making modifications where required, C code is generated by TargetLink. The C code contains all the information about the control algorithm and the input values. In the next step the auto-generated C code is compiled using the Green Hills suite.

Fig. 3. Complete production process.

The Green Hills software, together with GNU Make and the VBF converter, is used to generate a map file and a VBF (Volvo Binary Format) file. The VBF file is downloaded into the embedded controller. The map file is used by TargetLink to generate an A2L file. The A2L file is required by the calibration tool (in this project ATI VISION is used for calibration), and with the calibration tool we can modify parameters in the ECU. These modifications are also called tuning.

3.2 Parameter Dependency

All parameters are defined in an m-file, and some parameters depend on the values of other parameters. It is also possible that values obtained as the result of a calculation between two or more parameters are used in the definition of other parameters. All parameters whose definitions contain other parameters, or calculations involving other parameters, are called dependent parameters. For example, in the illustrative parameter definitions referred to above:
- Parameter 2 is dependent on parameter 1.
- Parameter 4 is dependent on parameters 2 and 3.
- Parameter 6 is dependent on parameters 2 and 3.

3.3 Reasons for Introducing Parameter Dependencies

One may ask why parameter dependencies are introduced in the first place. The answer is that, when designing a control algorithm in a tool such as Simulink, it is convenient to use named parameters (variables) instead of hard-coded numbers (constants). For instance, if the highest fan level available corresponds to a voltage of 13.5 V, the designer may want a parameter for this, so that the parameter name can be used instead of the value 13.5 at the many places where it occurs in the algorithm. If the hardware is changed one day, and for the new hardware 13.4 V is the maximum that can be used for the highest fan level, it is easier to change one parameter value than to change many hard-coded values in different places. Sometimes it is also useful to have one parameter depend on another. For instance, a look-up table has several values in each vector, and these values may depend on other parameters; it would be rather limiting if a vector or a matrix could only contain hard-coded numbers. The use of dependent parameters thus helps to keep a good structure in the algorithm and makes it easier to work with the parameters.

3.4 Statistics about Parameter Dependency

A significant number of parameters are dependent on other parameters. For instance, in Climate Control P3 the total number of parameters is 1618, out of which 227 are dependent on other parameters and 1391 are independent (see Fig. 4). We call the independent parameters base parameters.

Fig. 4. Percentage of dependent parameters.
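For illustration, the following m-script sketch shows base parameters and the two simplest kinds of dependency discussed above. The parameter names and values are the illustrative ones used later in this report, not actual CCM parameters.

% Base parameters (independent):
Minimum_Fan_Speed = 16;
Maximum_Fan_Speed_Mode4 = [114 133 144 151 168 173];
% Complete dependency: the whole definition is a calculation on another parameter
Max_Fan = max(Maximum_Fan_Speed_Mode4);
% Dependency inside a vector: the 2nd and 6th elements refer to other parameters
VentFan_Speed = [0, Minimum_Fan_Speed, 30, 45, 110, Max_Fan];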
3.5 The Parameter Dependency Problem in the Development Process

To analyze the problem of parameter dependency, let us walk through the development process of embedded control systems and find out exactly what the problem with parameter dependencies is. The process starts with the parameter definitions in the m-file, so the investigation starts there, see Fig. 5, which shows an example of a parameter with dependencies in its definition.

Fig. 5. Example of a parameter definition in the m-file.

After all parameters have been defined, the m-file is run in MATLAB. In this step the values of all dependent parameters are evaluated by MATLAB and loaded into the MATLAB base workspace. Precisely during this loading the dependencies are replaced by their values, and any information about the relation between a parameter and its dependency parameters is lost.

Fig. 6. Dependency loss in the MATLAB base workspace.

Since the dependency information is now lost, the loss propagates through all further steps, for example into the generated C code, the A2L file and the strategy file. Fig. 7 shows the propagation of the dependency information loss. In the C code there is thus no information with which dependent parameters can be traced.

Fig. 7. Propagation of dependency loss from MATLAB to C code.

3.6 Effect of Parameter Dependencies on the Development Process

The problem caused by parameter dependencies surfaces during the calibration step, when the parameter values are tuned. When the information about parameter dependencies is lost, each parameter value has to be tuned individually, as shown in Fig. 8.

Fig. 8. Effect of dependency loss on the development process.

If a parameter is used in, for instance, the definitions of five different parameters, then the value of that parameter has to be tuned at those five locations individually. If any calculation is involved in a parameter definition, it must be done manually and the value updated by hand. This process of changing values manually is very time consuming and error prone. Another possibility, which avoids doing these calculations and tuning parameter values individually, is to change the parameter values in the original m-file, where all parameter definitions are kept, and to repeat the complete process again. This is laborious work and also takes a lot of time, so this possibility is not feasible either.

4 POSSIBLE SOLUTIONS

The complete process for developing embedded controllers is a multi-stage process and depends on four highly sophisticated software platforms, so there are different possible approaches to solving the dependency information loss. The following are the possible platforms for modifications to handle the dependency loss problem:
- MATLAB
- TargetLink model
- C code
- Calibration tool
- Separate Windows application

An in-depth analysis of these platforms, and of the possibility of finding a feasible solution on each, follows.

4.1 Parameter Dependencies and MATLAB

When the m-script containing all parameter definitions is run in MATLAB, all parameter values are evaluated and stored in the MATLAB base workspace. Right at this first step the dependency information in the m-script is lost. The reason for this loss is that an array in the MATLAB base workspace holds values belonging to only one class type.
That array can be of class char, double or any other class, but its elements cannot belong to a mixture of two or more class types; for example, an array cannot have some elements of class char and the other elements of class double.

Fig. 9. Supported class types in the MATLAB base workspace.

In our case of parameter dependency we have, for example, an array of eight elements. The second and eighth elements of this example array are names of other parameters, so these names belong to the char class, while the rest of the elements are numerical values of class double. MATLAB therefore evaluates the values of the dependency parameters and replaces all names with their corresponding values, and the dependency information is lost. There is a MATLAB function called eval which could be used in place of a dependency parameter name, but this does not solve the problem, because the function evaluates the value of the parameter and it is again only the value that ends up in the base workspace; the dependency information is still filtered out. In short, nothing can be done in MATLAB to save the dependency information unless MathWorks changes MATLAB so that the base workspace can support values belonging to different classes within the same definition.

4.2 Parameter Dependencies and TargetLink

In TargetLink we can use custom look-up tables and we can include custom code. Suppose for a moment that by adding custom look-up tables and some extra blocks we manage to reintroduce the lost dependency information in the TargetLink model. When TargetLink generates C code it will most probably still evaluate all those values, and only the resulting values will be included in the C code. There are two reasons for this behavior. First, TargetLink works inside MATLAB, so all calculations are done in MATLAB and we face the same problem as described above. Second, dSPACE states that TargetLink generates C code as efficiently as possible: the code is flashed into the controller in binary format, and because of the limited memory of the ECU and the demand for high execution speed, TargetLink keeps the C code as small as possible. TargetLink will therefore not generate extra variables and pointers in the C code unless dSPACE makes significant changes to TargetLink.

4.3 Parameter Dependencies and C Code

The C code generated by TargetLink can be modified, and it is possible to add any kind of extra information, but two reasons make this approach impracticable. First, the C code is flashed into the ECU, which has very limited memory, and bigger C code results in a less efficient embedded controller. Second, it would require a lot of manual labor every time something is changed, which is also error prone.

4.4 Parameter Dependencies and the Calibration Tool

In a calibration tool like ATI VISION there is an option to use scripts written in the VISION scripting language or in Visual Basic. Instead of doing manual calibration, the calibration can be automated using a script. In our case, we have matrices with dependencies.
So, in order to do the calibration using the scripting option, we would have to write functions for matrix calculations, and the script would have to be able to evaluate the dependencies according to the new values. This option is therefore not very feasible.

4.5 Separate Windows Application

After analyzing all the possibilities, only one option is left: to develop a separate Windows application which extracts the dependency information from the m-script, calculates the values of the dependent parameters according to the values tuned in the calibration tool, and writes those new dependency values back into the calibration tool.

5 SELECTED SOLUTION

After analysis of all possible solutions, the conclusion is that the most feasible solution to the dependency loss problem is a separate Windows application which:
- Extracts the dependency information from the m-file.
- Gets the tuned parameter values from the calibration tool.
- Calculates all values corresponding to those tuned parameter values.
- Writes the updated values of the dependent parameters back into the calibration tool.

5.1 Reasons for Selecting This Solution

Among the possible solutions, the development of a separate Windows application was selected as the feasible one. The major reasons are as follows:
- A separate Windows application does not require any modification of the present software tools.
- The solution is fast, no extra licenses are required, and it works according to our requirements.
- Any solution that includes modification of the software tools requires the involvement of the tool makers. Convincing the tool makers to modify their software according to our requirements, and then waiting for a new version to be developed and released, could take a very long time.
- The tool makers would charge a large sum of money for making the specified changes or for making an add-on application for their software.

5.2 Overview of the Solution

The solution is an application named Dependency Calibrator. It works in two steps. In the first step the m-file is parsed, and the information about the dependent parameters, along with their locations in the parent parameters, is extracted and rearranged so that it can be used in the second step, the calibration. During the second part of the process, the application first imports data from VISION, so that if the user has tuned any value in the calibration tool, that value is updated in MATLAB. The application then performs the calculations in MATLAB, and the new values obtained from those calculations are written back to VISION. This cyclic process from VISION to MATLAB and back to VISION updates the parameter values. If the user has changed a value which is used by other parameters, the new value is updated at all locations where it is used. This is shown in Fig. 10.

Fig. 10. Overview of the solution.

The Dependency Calibrator application is divided into two parts:
- Parser
- Calibrator

A detailed explanation of how the application works follows.

5.3 Required Software

The Parser works without any external software, but in order to run the Calibrator the following software must be installed on the system:
- MATLAB R2007b
- ATI VISION 3.5.3

MATLAB is launched automatically by the application, but make sure to launch ATI VISION before using the Calibrator part of the Dependency Calibrator application.

5.4 Project File

The project file is the key to controlling the Dependency Calibrator application.
Instead of using hard-coded paths for the different files used by the application, the user can select the desired locations. These locations are specified in a separate file called the project file. In the project file the instructions are given after certain tags. One must be careful not to alter these tags; the user input is given after the symbol @. The Dependency Calibrator application is capable of handling multiple m-files and multiple C files, and the directory paths for these files can be specified in the project file. The project file contains the following tags:

VISIONs Device Name @ : After this tag, the name of the hardware device as used in the VISION device tree should be given. For example:
VISIONs Device Name @ PCM
or
VISIONs Device Name @ CCM

Path of m File @ : After this tag the full path of the m-file should be given. If there is more than one m-file, this tag followed by the file path should be repeated on a new line for each m-file. The Parser will read all these files and merge them into one file. For example:
Path of m File @ C:\FolderName\subFolder\File_Name.m
Path of m File @ C:\FolderName2\subFolder2\File_Name2.m

Root directory for c files @ : In general the C files can be generated in different folders, but their root directory remains the same. In order to avoid repeating the same address and to minimize the chance of errors, this tag is introduced in the project file. After this tag the path of the root directory for the C files should be specified. Note that there should be no backslash at the end of the root directory path. For example:
Root directory for c files @ D:\ABC_XYZ\subFolder\subSub

Folders containing c-files @ : After this tag the names of the folders which contain C files should be specified. If there is more than one folder containing C files, the folder names should be separated by a comma. The Parser will then search these folders for all C files contained in them. For example:
Folders containing c-files @ FolderMedCfiles,subFolder\cFolder

Root Output Directory @ : This tag should be followed by the path of the location where the user wants the application to generate all files. For example:
Root Output Directory @ C:\

Extra File for calibrating non-calibratable parameters @ : After this tag, the path of the file containing the names of those parameters which are not calibratable by default but which should nevertheless be calibratable in VISION is given. Those names should be exactly the same as defined in the m-file, followed by an underscore _ and any desired word or character. For example:
Extra File for calibrating non-calibratable parameters @ C:\ExtraParNames.txt

5.5 Parser

The first part of the complete dependency calibration process is the Parser. When the Parser is executed, a window appears with two options, Load Project File and Parse. The project file must be loaded before the Parse button is pressed. Once the project file is loaded, the Parser has all the information needed to start parsing. The Parser application is shown in Fig. 11.

Fig. 11. Parser application.

The Parser performs six operations on the input m-file(s) and C file(s), which are explained as follows.

5.5.1 Comment Removal

The first operation the Parser performs on the m-script is comment removal. It removes all comments from the m-file(s); some comments start at the beginning of a line and others appear at the end of parameter definitions. If there is more than one m-file, all the files are merged into one file as a result of this step.
The output file produced in this step contains no comments. The reason for removing the comments is that, in the next step, multi-line parameter definitions have to be converted to single lines, and for that there must be no comments. A second reason is that some comments have the same structure as the parameter definitions; in fact, they are old values of the same parameters. To minimize any possibility of error, the comments therefore have to be removed. After removing the comments, the Parser also removes empty lines and extra white space inside the parameter definitions.

5.5.2 Multi-Line Parameter Definitions to Single Lines

All parameter definitions are required to be on a single line, for two reasons. First, the Parser reads the complete file line by line, so it is important to read the complete information about a parameter in one step. Second, all dependencies present in a parameter definition have to be separated out. It would also be possible to read multiple lines, but then there is the problem of defining a new record separator (in programming, the record separator marks the end of a line; by default it is \n) which tells the Parser that the current parameter definition has ended and a new one has started. In that case a symbol, a specific number of white spaces or something similar would have to be repeated after each parameter definition. In our m-files there is no such pattern repeating periodically, and it is not practical to modify all m-files by putting a symbol after each parameter. The Parser therefore handles this problem by converting all parameter definitions into single-line definitions.

5.5.3 Separating Parameters with Dependencies

At this point all parameter definitions have been converted into single-line definitions and all comments have been removed from the m-file(s). The next step in the parsing process is to separate out those parameters which depend on other parameters. This is achieved using regular expressions: a regular expression searches for any parameter name within the parameter definitions, and if a parameter name is found in a definition, that parameter is saved to a separate file. After this step all parameters with dependencies have been filtered out into a separate file.

5.5.4 Position of Dependencies in a Parameter Definition

Going one step further, each parameter now has to be parsed to find where exactly the dependency lies; this is a crucial step in the whole parsing process. In our m-files there are three major groups of dependent parameters:
- Complete dependency
- Dependency in an array or vector
- Dependency in a matrix

A complete dependency is when the whole parameter definition depends on another parameter or on a calculation involving other parameters, e.g.:
Max_Fan = max(Maximum_Fan_Speed_Mode4);

A dependency in an array or vector is when some element of the array or vector depends on another parameter or on a calculation involving other parameters. In this case we have to know precisely where in the array or vector the dependency lies. For example:
VentFan_Speed = [0, Min_Fan_Speed, 30, 45, 110, Max_Fan];
In this example, VentFan_Speed depends on Min_Fan_Speed, which is the 2nd element, and on Max_Fan, which is the 6th element.

The third group is dependencies in a matrix. This is even more complicated, because in this case two things have to be tracked: the column index and the row index.
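The actual Parser is a stand-alone Windows application, but for illustration the position-finding step can be prototyped in a few lines of MATLAB. The sketch below assumes the list of known parameter names has already been collected from the m-file, and it prints its findings in the semicolon-separated format defined next.

% Illustration only: locate dependencies in a single-line vector definition.
names = {'Min_Fan_Speed', 'Max_Fan'};   % parameter names already known from the m-file
def   = 'VentFan_Speed = [0, Min_Fan_Speed, 30, 45, 110, Max_Fan];';
vec   = regexp(def, '\[(.*)\]', 'tokens', 'once');   % contents between the brackets
items = regexp(vec{1}, '[^,]+', 'match');            % split the vector on commas
for k = 1:numel(items)
    if any(strcmp(strtrim(items{k}), names))         % element k is a dependency
        fprintf('Parameter; VentFan_Speed; %s; %d; y;\n', strtrim(items{k}), k-1);
    end
end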
The output of this step follows the format:

Keyword; Parameter Name; Dependency Name; X-Offset; Y-Offset;

This is a semicolon-separated string in which:
- Keyword: can be any word, but in our case it is defined as Parameter. The sole purpose of the keyword is to distinguish this string from any other information in the file, such as comments. To be sure that a line contains parameter dependency information, it must start with this specific keyword.
- Parameter Name: the actual name of the parameter which has a dependency in its definition.
- Dependency Name: the name of the parameter on which the parent parameter depends. This can be just the name of another parameter, or the result of a calculation involving other parameters.
- X-Offset: for a 1D array or vector, the X-Offset is the location of the dependency, i.e. the element number in the array. For a matrix, the X-Offset is the column index of the dependency element.
- Y-Offset: for a 1D array or vector, the Y-Offset is always y, which indicates that the respective parameter is a vector. For 2D arrays and matrices, the Y-Offset is the row index of the dependency element.

Zero-based indexing is used for the X- and Y-Offsets. When the value of both the X-Offset and the Y-Offset is d, the complete definition of that parameter is a dependency. The conversion of parameter dependency information from the MATLAB format to this new format is shown in Fig. 12.

Fig. 12. Extracted parameter dependency information.

5.5.5 C-Code Parsing

The previous step generated a file containing the parameter name, the dependency name and the location of the dependency in the parameter definition. The problem is that in the calibration tool the parameter names are not the same as those defined in the m-script; the names are changed by adding different tags during C-code generation in TargetLink. In order to find the corresponding parameter names, the C code therefore has to be parsed. Fortunately, TargetLink only changes the actual names according to a certain pattern, which can be selected and modified in the TargetLink model, so according to that pattern the corresponding parameter names can be extracted.

5.5.6 Replacing Parameter Names

At this point we have the dependency information for the parameters from the m-script, and we have their corresponding names in the C code, which are the names found in the calibration tool. In this step the Parser replaces the parameter names as defined in the m-script with their corresponding C-code names. The output of this final parsing step is ready to be used in the calibration process. The new parameter names are the same as those defined in the strategy file of ATI VISION, so the file generated in this step contains all the information about the dependent parameters and their positions in the definitions of their parent parameters. After completing this process, the application shows a message informing the user that the parsing process is complete, and clicking OK exits the application.

5.6 Calibrator

The second part of the dependency calibration process is another application, which is interfaced with MATLAB and ATI VISION. Make sure that a project is open in VISION before running the Calibrator; otherwise the application displays an error message. The Calibrator is shown in Fig. 13.

Fig. 13. Calibrator application.

If a VISION project is open and the Calibrator application is run, the start window shows two options:
- Load project file
- Calibrate

First load the same project file that was used for the Parser; completion of this operation is confirmed by a message box. After this, the Calibrator has all the information required to run and the Calibrate button can be pressed. When the Calibrate button is pressed, the application exports the parameter values tuned in VISION to MATLAB and updates the corresponding parameters in the MATLAB base workspace. If a tuned value belongs to a parameter on which others depend, then all parameters depending on that value are updated according to the relations defined in the m-file. After all parameter values have been updated, the values are sent back to VISION at their appropriate locations, and in this way the desired result is obtained. A detailed explanation of how this happens follows; for this purpose the Calibrator is divided into two parts:
- Parameter values from VISION to MATLAB
- Parameter values from MATLAB back to VISION

5.6.1 Parameter Values from VISION to MATLAB

In this step the application uses MATLAB and ATI VISION as COM servers. MATLAB invoked as a COM server is shown in Fig. 14. The original m-file, which contains all the parameter definitions, is required in this step; its path is specified in the project file. The application commands the MATLAB server to change the current directory of MATLAB to the root directory of the specified file, and then runs the m-file so that all parameter definitions are loaded into the MATLAB base workspace. To find out which parameters may be tuned in the calibration tool, the C code generated by TargetLink has to be searched for the variables belonging to the variable class CAL. (The variable class of any variable can be changed in the TargetLink Data Dictionary.)

Fig. 14. MATLAB invoked as a COM server.

Since this information was saved to a file during parsing, the application gets it from that file. When the values of these calibratable parameters have been tuned in VISION, the application fetches the new values in order to update them in MATLAB. The parameter names in MATLAB are different, however, so the application translates the names; the program knows the names of the parameters as defined in the m-file and their corresponding names in VISION. To read the values of the calibrated parameters from VISION, the application first checks the data item type of each parameter: scalar, 1D array, 2D array, 2D table or 3D table. The method for getting the values differs between data item types, so once the data item type is known, the program sends the parameter information to the respective method:
- For a scalar, the actual value of the scalar in VISION is transferred to MATLAB.
- For a 1D array, the actual values of the y-axis are transferred to MATLAB, because for a 1D array the x-axis values are just the index numbers.
- For a 2D array, the actual values of the z-axis are transferred to MATLAB.
- For a 2D table, the actual values of the y-axis are the relevant values.
- For a 3D table, the actual values of the z-axis are transferred to MATLAB, because on the x-axis and y-axis the actual values belong to other parameters, which are generally 1D arrays and are handled separately under DataItemType1DArray.

After this, the program runs the file containing the parameter dependency information, which was generated during the parsing process. According to this file, all dependent parameters are updated with the new values obtained from VISION.
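As a minimal sketch of this update step, assume the base value Minimum_Fan_Speed has been tuned to 20 in VISION; the parameter names, values and offsets are the illustrative ones used elsewhere in this report, not actual CCM data.

% Sketch of the dependency update after a base parameter has been tuned in VISION.
Minimum_Fan_Speed = 20;                   % new value fetched from VISION
Max_Fan = max(Maximum_Fan_Speed_Mode4);   % complete dependency re-evaluated
VentFan_Speed(2) = Minimum_Fan_Speed;     % dependency at X-Offset 1 (2nd element)
VentFan_Speed(6) = Max_Fan;               % dependency at X-Offset 5 (6th element)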
Now that all dependent parameter values have been updated, the program saves all this information in a MAT-file named calibration.mat. This is a binary file, and all information present in the MATLAB base workspace is saved in it. At this point the process of updating parameter values from VISION to MATLAB is complete, and the next step is to propagate all the changes caused by changing the values on which other parameters depend. Another important point is that if a value is changed in VISION and that value is actually a dependency, then, according to the requirements, the value must not be allowed to change until the change is made in the base parameter value; the calibration application enforces exactly that. Tuning of independent parameters is not affected by the application.

5.6.2 Parameter Values from MATLAB to VISION

In the second part of the Calibrator, the values of all parameter dependencies have to be updated back in VISION. For this purpose the application loads the calibration.mat file that was saved with the new values in the previous step. The other file required in this part is the final output file from the parsing process. Using that file, the program finds the name of a parameter as it appears in the calibration file, finds the value of the dependent parameter in the MAT-file in MATLAB, and, since it also knows the location of the dependency in the parameter definition, updates the corresponding value of the dependency in VISION. To update the value of the dependency at the right location, the program first checks the data item type of the parameter and sends the information to the appropriate method. That method first checks whether the dependency is part of an array or a matrix, or whether it is the result of a calculation between other parameters. Based on that information, the program performs the required calculations in MATLAB and then writes the value to the right place in ATI VISION. When all the values have been updated, the program shows a message saying so, and if there are parameters that do not belong to the variable class CAL, the program lists them with a warning in a list box. This process is iterative and can be repeated as many times as the user wants. When all parameters have been calibrated, the application can be closed. Closing the application also closes the MATLAB command window that was opened as a COM server. The process of updating the parameter values can be monitored in VISION using a screen window and control items.

5.7 Calibration of Non-Calibratable Parameters

In the system there can be parameters which are used only indirectly: their values are used in other parameters, but they themselves are not used anywhere in the TargetLink model, so they never appear in the C code and as a result are not available in the calibration tool for tuning. The Dependency Calibrator application handles these parameters as well. To tune such parameters, a new data item is created in the calibration tool and its name is added to a text file; the path of this text file is specified in the project file after the tag Extra File for calibrating non-calibratable parameters. To create a new data item in ATI VISION, go to the Data Item Manager. Under DataItemGroups, go to the device name, e.g. CCM.
Open the Characteristics folder and click Values. The window on the right side of this panel displays the data items and some other information about them. Right-clicking in this window and selecting New opens a dialog box titled Select Data Item Type. Select the type of data item according to the type of the parameter. After the data item type has been selected, a dialog appears asking for the name of the parameter. The name of the data item should be the actual name of the parameter as defined in the m-file, followed by an underscore _ and any other name of your choice; by choosing this kind of name, the data item is connected to the original parameter as defined in the m-file. After that, a dialog appears showing the properties of the data item. Make sure that the base address of this data item does not coincide with the base address of any other predefined data item. The memory type should be set to RAM [adjust and monitor]; then the value(s) of this data item can be changed. After making these adjustments, click Apply and then OK. Following this procedure, a new data item is created for a parameter that was not calibratable by default, and this parameter can now also be tuned in the calibration tool.

5.8 Dependency Calibrator in a Nutshell

All the steps of the Dependency Calibrator are summarized and depicted in the following figure.

6 RESULTS

The results of this project are demonstrated with an example of a few parameters which contain dependencies. The following graphs were made in ATI VISION using control objects in a screen file. The graphs in Fig. 15 show the values of the parameters before tuning. The requirement is that if we tune the value of a parameter on which other parameters depend, then the value should change at all instances where that dependency is used. In this example there are two parameters on which the values of other parameters depend; we call these two parameters base parameters:

Minimum_Fan_Speed = 16;
Maximum_Fan_Speed_Mode4 = [114 133 144 151 168 173];

Fig. 15. Values of parameters before calibration.

The parameter values to notice are outlined with red blocks. When we tune the values of the base parameters, the application changes these values at all instances where they are used as dependencies. The new values of the base parameters are:

Minimum_Fan_Speed = 20;
Maximum_Fan_Speed_Mode4 = [114 133 144 151 176 173];

Since
Max_Fan = max(Maximum_Fan_Speed_Mode4)
the value of Max_Fan becomes 176 according to the changed value of Maximum_Fan_Speed_Mode4. The Calibrator application updates these new values of Minimum_Fan_Speed (i.e. 20) and Max_Fan (i.e. 176) in all other parameters. The updated values are shown in Fig. 16.

Fig. 16. Values of parameters after calibration.

Changed values are outlined with red blocks.

7 CONCLUSION

This thesis concludes that the Dependency Calibrator application handles parameter dependencies efficiently.
Since all calculations are done in MATLAB, all parameter operations supported by MATLAB are also supported by the application, and the capabilities of the TargetLink/Simulink model can be used to the maximum extent. The application thus:
- Makes on-line calibration possible for dependent parameters.
- Is less error prone.
- Is efficient and saves valuable time.
- Requires minimal manual labor.

8 FUTURE WORK

The current application saves the updated parameter values in a .mat file, which is a binary file; it does not write or update the parameter values in the original m-file. Future work related to this project could be the development of a text editor application specifically for updating the m-file. That application should preserve the old parameter value by commenting it out and then write the parameter again with the updated value. This may also be achieved using the new MATLAB Editor API, available in MATLAB R2010a, which is expected to be released in March 2010. This API provides programmatic control over opening and saving files, navigating and modifying file contents, and querying file properties [12].

REFERENCES

[1] Embedded Systems [Online]. Available: https://en.wikipedia.org/wiki/Embedded_system#History [Accessed: Sep. 14, 2009].
[2] Yu Shitao, Zhou Xingli, Yang Lin, Gong Yuanming, and Zhuo Bin, "Study on the model-based development approach for the electronically controlled system of a high-pressure common-rail diesel engine," Journal of Automobile Engineering, vol. 220, no. 3, pp. 359-366, 2006.
[3] Bosch, CAN Specification 2.0, Germany: Bosch, 1991.
[4] CAN specifications [Online]. Available: https://www.specifications.nl/can/protocol/can_UK_protocol.php [Accessed: Sep. 24, 2009].
[5] dSPACE, press release: TargetLink 3.0, July 2008 [Online]. Available: https://www.dspace.de/ww/en/pub/home/company/dspace_pressroom/press/targetlink_3_0.cfm [Accessed: Sep. 25, 2009].
[6] ATI VISION User Reference Manual.
[7] GNU Make manual [Online]. Available: https://www.gnu.org/software/make/manual/html_node/index.html [Accessed: Sep. 24, 2009].
[8] BL51 User's Guide [Online]. Available: https://www.keil.com/support/man/docs/bl51/bl51_ln_mapfile.htm [Accessed: Sep. 24, 2009].
[9] ASAM e.V., XCP Version 1.0: The Universal Measurement and Calibration Protocol Family, 2003.
[10] XCP: The Universal Measurement and Calibration Protocol Family [Online]. Available: https://www.vector.com/vi_xcp_layers_en,,223.html [Accessed: Dec. 11, 2009].
[11] Pär Enström and Erik Svebäck, Road Vehicle Diagnostics using Bluetooth, Luleå University of Technology.
[12] MATLAB Simulink Release Notes for R2010a [Online]. Available: https://www.mathworks.com/access/helpdesk/help/pdf_doc/matlab/relnotes_pr.pdf [Accessed: Jan. 04, 2010].
[13] ATI VISION Calibration Data Acquisition Software [Online]. Available: https://www.accuratetechnologies.com/images/stories/product-datasheets/vision%20software%20web.pdf [Accessed: Jan. 08, 2010].
[14] ETAS Catalog 2008/09, Chapter 7: Measurement, Calibration and Diagnostic Tools [Online]. Available: https://www.etas.com/en/products/download_center.php?entrylist=4609 [Accessed: Jan. 07, 2010].
[15] CalDesk: Universal Tool for Measurement, Calibration, and Diagnostics [Online]. Available: https://www.dspaceinc.com/shared/data/pdf/catalog2009/CalDesk_ebook.pdf [Accessed: Jan. 07, 2010].
[16] XCP at the Focal Point of Measurement and Calibration Applications [Online].
Available: https://www.vector.com/portal/medien/cmc/press/PMC/XCP_UseCases_ElektronikAutomotive_200705_PressArticle_EN.pdf [Accessed: Jan. 11, 2010].

APPENDIX

Appendix A: Controller Area Network (CAN)

Background

In the 1980s the automobile industry was growing rapidly and vehicles were becoming more and more sophisticated and intelligent. In this growing industry there was one bottleneck: with the increasing number of sensors, the number of wires needed to connect them grew tremendously, and this increase in wiring caused an increase in weight. Something had to change, and that was finding an alternative to the conventional cabling in automobiles. The idea was to put everything in contact with everything else, making cross-communication possible. The idea came from the company Robert Bosch GmbH, which introduced the serial bus system Controller Area Network (CAN) in February 1986 at the Society of Automotive Engineers (SAE) congress. That was the hour of birth for one of the most successful network protocols ever.

Facts about CAN

CAN is a serial communication bus with a maximum data transfer rate of 1 Mbit/s. This data transfer rate is a compromise between cable length and flexibility; normally the network runs at 256 kbit/s or sometimes 500 kbit/s. These specifications are part of the Bosch CAN specification [3]. CAN differs from other serial communication in that it has two signal wires, CAN High and CAN Low, which are both used to send and receive messages. CANL has a signal output of 0-2.5 V and CANH an output of 2.5-5 V; the outputs of CANH and CANL are inverted with respect to each other. This inversion of signals is termed differential signaling. The effective difference between the two signal wires is 5 V. If a disturbance occurs, it is most likely to affect both wires equally, and since differential signaling is used, the effect of the disturbance cancels out. Differential signaling is illustrated in Fig. A1.

Fig. A1. Differential signaling on CAN [11].

Every node in the network is connected to the bus in series. This serial connection has its own pros and cons. The biggest advantage is that much less cable is required compared with the old cabling. The worst disadvantage is that if the cable breaks at a critical point, several nodes may be disconnected from the network, which can lead to loss of control or loss of critical information. The serial connection of nodes is illustrated in Fig. A2.

Fig. A2. Serial connection of nodes on the CAN bus [11].

Principles of data exchange

CAN is a message-oriented transmission protocol based on a broadcast communication mechanism. A CAN message carries unique message content instead of station information and addresses. The stations attached to the CAN bus compete for bus access; this is resolved by prioritizing the messages. System configuration becomes more flexible because the addressing scheme is based on the content of the messages: when new stations are added to a CAN network and they are purely receivers, no alterations to hardware or software are required. It is also easy to service or upgrade a CAN network, because the transmission of data does not depend on the availability of specific stations. This is illustrated in Fig. A3.
Fig. A3. CAN data handling [4].

Data transmission

On a CAN network the urgency of messages may differ widely; for example, the information about the load on the engine must be sent more frequently than other messages. The priority with which a message is handled is defined in the identifier of each message. This priority is set during the design phase of the system, and the values are not allowed to change dynamically.

Fig. A4. Transmission of data on CAN in real time [4].

Requests for data transmission are handled according to the importance of the request for the overall system, which is a good way to handle requests when the system is overloaded.

Format of message frames

The CAN protocol supports two formats of message frames, which differ only in their length:
- CAN base frame
- CAN extended frame

The CAN base frame supports an identifier that is 11 bits long and is also known as CAN 2.0A. The CAN extended frame supports an identifier that is 29 bits long and is also known as CAN 2.0B.

Fig. A5. Formats of the CAN message frame [4].

Appendix B: Calibration Tools

The major calibration tools used at Volvo Technology are:
- ATI VISION
- CalDesk by dSPACE
- INCA by ETAS

ATI VISION

ATI VISION is an integrated calibration and data acquisition tool that collects signals from the ECU and external sources, measures relationships between inputs and outputs, enables real-time calibration and modification of closed-loop control systems, time-aligns and analyzes all information, manages calibration data changes and programs the ECU [13]. The VISION software can do the following:
- Flash the ECU
- Monitor and measure signals in the ECU as well as external signals
- Calibrate ECU parameters in real time
- Compare, import and merge calibration data
- Analyze data
- Perform ECU algorithm rapid prototyping

VISION has an integrated script manager, and it is possible to write automation scripts for iterative processes using the VISION scripting language. VISION also includes an Application Programming Interface (API) enabling the integration of application-specific components from different vendors and data exchange between VISION and other applications [13].

CalDesk

CalDesk by dSPACE is a tool for different stages of the ECU development process, such as prototyping a control strategy, calibrating an ECU, or validating vehicle behavior. Rapid prototyping systems, ECUs and the vehicle bus can be accessed in parallel, and data from different sources can be recorded and analyzed as a whole [15]. A functionality overview of CalDesk is shown in Fig. B1.

Fig. B1. CalDesk functionality overview [15].

The CalDesk Application Programming Interface (API) is based on Windows COM/DCOM components and is accessible from various programming languages such as C#, Python or Visual Basic. CalDesk has an integrated Python editor with syntax highlighting and automatic code completion; in conjunction with the Python interpreter it is possible to write automation scripts [15].

INCA

The INCA base product comprises the system core with its measurement and calibration functionality. It supports online and offline adjustment of characteristic values, lines and tables. Simultaneously with the parameter optimization, INCA allows measurement signals to be acquired from ECUs, as well as easy and quick reuse of existing calibration data and hardware configurations [14].
In addition to the measurement and calibration core system, the INCA base product includes tools for managing the configurations of ECU projects and calibration parameters, for analyzing measured data, and for reprogramming the ECU flash memory. [14] Besides interactive optimization of parameters, INCA supports access via remote-control interfaces for the automation of experiments. For example, test benches access INCA via the ASAM MCD-3 MC remote-control interface, while MATLAB and other Windows applications access it via the Microsoft COM interface. [14]

Fig. B2 INCA functions at a glance [14]
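As a small, illustrative complement to the two CAN frame formats described in Appendix A (this sketch is not from the thesis; the helper name and the example identifiers are made up), the following Python function checks whether a given identifier fits in the 11-bit base frame or needs the 29-bit extended frame.

# Illustrative sketch: decide which CAN frame format an identifier requires.
# 11-bit identifiers (0..0x7FF) fit the base frame (CAN 2.0 A); identifiers up
# to 29 bits (0..0x1FFFFFFF) need the extended frame (CAN 2.0 B).
MAX_BASE_ID = 0x7FF           # 2**11 - 1
MAX_EXTENDED_ID = 0x1FFFFFFF  # 2**29 - 1

def frame_format(identifier):
    if not 0 <= identifier <= MAX_EXTENDED_ID:
        raise ValueError("identifier does not fit in any CAN frame")
    return "base (CAN 2.0 A)" if identifier <= MAX_BASE_ID else "extended (CAN 2.0 B)"

print(frame_format(0x100))       # base (CAN 2.0 A)
print(frame_format(0x18DAF110))  # extended (CAN 2.0 B)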

Wednesday, December 18, 2019

Euthanasia For Animals Essays - 861 Words

Euthanasia

One of the most widely debated topics in the animal industry is euthanasia. The topic of euthanasia causes arguments around the world because the word gets the best of the emotional side of human nature, causing the people against euthanasia to feel that it is inhumane, unfair, and a sin to euthanize animals for any reason. In a way these emotional feelings overcome the human mind and thus stop them from thinking logically. Even though the word euthanasia can have several meanings throughout different dictionaries, there is only one origin of the word, which is the Greek "eu" meaning good and "thanatos" meaning death; when combined, the word means good death. This meaning is the true meaning of the word. [...] The animals would have a "life"; however, it would be full of pain and distress because there would be more chances of them being abused and not fully taken care of, since they would be locked in a kennel 24/7 with the thousands of other animals without homes that would also want attention but would only get divided attention, if any, with only 6-8 staff members working per hour and only available during the 10 hours per day. Euthanasia is by far the most humane thing to do because it saves the animals from living a life not worth living. Lastly, if none of the other reasonings are arguable enough, people bring in the religious perspective. One might say that in the religion of Islam or Christianity all life is sacred and can only be taken by God; however, this is not true for animals. The last prophet of Islam, Prophet Muhammad (peace and blessings be upon him), is believed to have been sent to the world to spread Islam and is known as one of the greatest followers of Islam. The Hadith contains a large collection of traditions, admonitions and stories about his relationship to animals. Through the stories in the Hadith it is known that Prophet Muhammad (peace and blessings be upon him) believed that as part of God's creation, animals should be treated with dignity. Furthermore, "Allah has ordained kindness (or excellence) in everything. If killing is to be done,

Tuesday, December 10, 2019

Weather and Climate free essay sample

Types of Fronts
1. Cold Front - forms when a cold air mass pushes up a warm air mass ahead of it.
2. Warm Front - forms when a warm air mass pushes a cold air mass ahead of it.
3. Occluded Front - forms when a cold front overtakes a warm front and is able to lift the warm air mass.

Stationary and Moving Fronts
A stationary front occurs when the boundary between a cold air mass and a warm air mass is not moving. Moving fronts are classified as the cold, warm, and occluded fronts described above.

The Weatherman's Tools
The weatherman's forecast is educated guesswork, based on accurate observations obtained by man and machine. The weatherman, like any other professional, needs certain instruments to assist him in performing his work.

Basic Weather Instruments
The basic instruments measure temperature, atmospheric pressure, wind velocity, humidity, precipitation, and clouds.

Temperature
1. Thermometer - measures the degree of hotness or coldness of a given substance.
2. Thermograph - records air temperature on graphing paper. It consists of a cylinder made to revolve once each week by means of clockworks inside.

Atmospheric Pressure
1. Mercury Barometer - made by filling a glass tube 32 in. long with mercury and inverting it so that the open end of the tube is below the surface of the mercury in a cistern.
2. Aneroid Barometer - made by exhausting the air from a thin, circular, metallic box.
3. Barograph - a recording barometer, similar to a thermograph in construction.

Wind Velocity
1. Wind Vane - used to indicate wind direction.
2. Anemometer - measures wind speed; it is made of propeller cups, which are rotated by the motion of the wind.
3. Aerovane - indicates both wind direction and wind speed, or simply the wind velocity. It is shaped like an airplane minus the front and tail wings.

Humidity
1. Psychrometer - consists of a dry-bulb and a wet-bulb thermometer and is used to measure relative humidity.
2. Hair Hygrometer - uses human hair from which the oil has been removed with ether. The hair becomes longer as the relative humidity of the air increases.

Precipitation
1. Tipping-Bucket Rain Gauge - measures the amount of rainfall by recording how many times a small bucket of known capacity fills and tips.

Clouds
1. Clinometer - records the base height of the clouds.

Special Weather Instruments
1. Pilot Balloon - determines the speed and direction of winds at different levels of the atmosphere.
2. Radiosonde - attached to the pilot balloon to measure pressure, temperature, and relative humidity in the upper air.
3. Station Model - used by the weatherman or meteorologist to forecast weather.

Recording Local Weather Conditions: How a Weather Forecast is Made
Step 1: Observation. The meteorologist uses his instincts to obtain weather forecasts. He also uses weather instruments to state a certain observation on the existing weather condition over a large area.
Step 2: Collection/Transmission of Data. The weather observations made by meteorologists are condensed into coded figures, symbols and numerals and transmitted to designated collection centers for further transmission to local forecasting centers.
Step 3: Plotting and Analysis of Data. Upon receipt of the data at the forecasting centers, they are decoded and each set of observations is plotted in symbols on weather charts over the respective areas in which they were taken. The meteorologist then analyzes every detail of the data.
Step 4: The Forecast. From the analysis of the different charts and graphs, definite weather systems will appear, such as areas of high- and low-pressure centers. Weather patterns are observed around the pressure centers.

Terms used in Weather Forecasts
a. Fine Weather - designates a weather condition of few clouds and no rain.
b. Fair Weather - states that clouds are present and may produce rain in scattered patches, but the greater portion of the day is sunny or without rain.
c. Rainy Weather - refers to a condition in which rain occurs during the greater portion of the day but the winds are mostly light to moderate.
d. Stormy Weather - refers to a weather condition characterized by rains and strong winds.

Storms
Weathermen use the name vortex to describe a phenomenon in the atmosphere in which the wind blows around a low-pressure area.
Whirlwinds or dust devils, tornadoes, waterspouts and typhoons all belong to the family of vortices. The typhoon is considered the king of all vortices because it is the largest in breadth and height and therefore the most destructive.

Types of Vortices
Dust devils, waterspouts, typhoons and tornadoes.

Tropical Cyclone
The terms tropical cyclone and tropical storm are used to denote the bigger types of vortices, characterized by low pressure at the center with a circular wind motion that blows counterclockwise in the Northern Hemisphere and clockwise in the Southern Hemisphere.

Formation of Storms in the Philippines
Two regions of formation:
In the Pacific Ocean, between the Philippines and the Caroline-Marianas Islands.
In the South China Sea, between the Philippines and the Asiatic mainland.

Movements of Storms in the Philippines
Two directions:
Storms that form to the east of the Philippines move in a west-northwest direction.
Storms forming over the South China Sea usually move in a north-northeast direction.

Factors that Affect the Formation of Storms
A warm ocean surface with a sea temperature of at least 26 degrees Centigrade (26 °C).
A thick layer of moist air which extends to a height of 3 kilometers or more.
Sufficient latitude: tropical storms cannot form at the equator, and rarely form within 5° of the equator.

Weather
Weather is the general condition of the atmosphere over a specified area within a brief period of time.

Weather-Causing Phenomena in the Philippines
Rainfall, the Intertropical Convergence Zone (ITCZ), monsoons, and local disturbances.

Intertropical Convergence Zone (ITCZ)
It is the region where the Northern Hemisphere trade winds meet the Southern Hemisphere trade winds.

Local Disturbances
Thunderstorm - a storm that generates lightning and thunder. Thunderstorms frequently produce gusty winds, heavy rain, and hail. At any given time, there are an estimated 2,000 thunderstorms in progress on Earth. The greatest number occur in the tropics, where warmth, plentiful moisture, and instability are common atmospheric conditions.

Weather Modification
In the earliest times, people used prayers, wizardry, dancing, and even black magic to change the weather. Today, man has attempted to modify the weather, but much of it is still experimental. Weather modification includes:
1. Cloud Seeding - the attempt to change the amount or type of precipitation that falls from clouds by dispersing substances into the air that serve as cloud condensation or ice nuclei. Dry ice (solid CO2) and silver iodide are two agents used in cloud seeding.
2. Frost Prevention - the attempt to hinder frost among crops by using devices like wind machines, heaters and the sprinkling of warm water.

Climate
Climate is the average condition of the atmosphere over a long period of time.

General Types of Climate
1. Tropical Climate - this kind of climate has the highest temperatures. The average temperature during the coldest month does not go below 18 °C.
2. Polar Climate - this kind of climate has the coldest average temperatures. The average temperature during its warmest month does not rise above 10 °C.
3. Temperate Climate - this kind of climate has moderate temperatures, in between the average temperatures of the tropical and polar zones.

Two kinds of air masses affect the amount of precipitation an area will receive:
Marine/Oceanic Climate - maritime air masses are located near bodies of water; areas near bodies of water receive more precipitation.
Continental Climate - continental air masses are found over land areas; their climate is drier than the marine climate.

Factors Affecting Climate
1. Temperature: Altitude - the higher the altitude, the colder the air within the area. Ocean Currents - cool water will cool the air and warm water will warm the air.
2. Moisture/Precipitation: Prevailing Winds - air that blows from water to land will carry more moisture, and air that blows from land to water will carry less moisture. Mountain Ranges - mountains cause air to rise; as air rises, it cools and most of its moisture condenses, falling to the ground as precipitation.

Climates in the Philippines
The Philippines has a tropical marine climate dominated by a rainy season and a dry season. The summer monsoon brings heavy rains to most of the archipelago from May to October, whereas the winter monsoon brings cooler and drier air from December to February. Manila and most of the lowland areas are hot and dusty from March to May. Even at this time, however, temperatures rarely rise above 37 °C. Mean annual sea-level temperatures rarely fall below 27 °C. Annual rainfall measures as much as 5,000 millimeters in the mountainous east-coast section of the country, but less than 1,000 millimeters in some of the sheltered valleys.

Prevailing Wind Systems in the Philippines
The Northeast Monsoon (hanging amihan), which prevails from November to February.
The Southwest Monsoon (hanging habagat), which prevails during July, August, and September.
The trade winds, the prevailing winds over the tropics, which prevail during the rest of the year and whenever the Northeast and Southwest monsoons are weak. They generally come from the east.
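To make the three general types of climate above concrete, here is a minimal Python sketch (not part of the original lecture notes; the function name and the sample temperatures are illustrative) that classifies a location from its twelve monthly mean temperatures using the thresholds given above.

# Illustrative sketch of the three general climate types described above:
# tropical  - the coldest month does not go below 18 °C
# polar     - the warmest month does not rise above 10 °C
# temperate - everything in between
def classify_climate(monthly_mean_temps_c):
    """monthly_mean_temps_c: twelve monthly mean temperatures in degrees Celsius."""
    coldest = min(monthly_mean_temps_c)
    warmest = max(monthly_mean_temps_c)
    if coldest >= 18:
        return "tropical"
    if warmest <= 10:
        return "polar"
    return "temperate"

# Hypothetical example: Manila-like lowland temperatures stay warm all year.
manila_like = [26, 27, 28, 29, 29, 28, 27, 27, 27, 27, 27, 26]
print(classify_climate(manila_like))  # tropical

A full classification scheme such as the Köppen system also uses rainfall, but the two temperature thresholds above are enough for this simple three-way split.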

Monday, December 2, 2019

Lord Of The Flies An Analysis Essays - English-language Films

Lord of the Flies: An Analysis

"The two boys faced each other. There was the brilliant world of hunting, tactics, fierce exhilaration, skill; and there was the world of longing and baffled common-sense." This quote shows the two main contrasts of the story: savageness and civilization. This is the Lord of the Flies, a book written by William Golding. The Lord of the Flies contains some interesting and deep thoughts pertaining to the theme, plot, characters, and setting of the novel. William Golding did not just start writing a book; he took his time and worked out every little matter to make sure the book was entertaining and, most of all, did not bore the reader.

The Lord of the Flies begins with about 20 pre-adolescent boys who are on an airplane, and the airplane crashes on an uninhabited coral island in the Pacific. The airplane crew has been killed, and the boys are left on their own. They start to collect themselves into a society of food gatherers under an elected chief, Ralph. Ralph is about 12 years old and has a very sensible and logical personality. At first, the boys create duties to follow, and they live amicably in peace. Soon, however, differences arise as to their priorities. The smaller children (known as littl'uns) lose interest in their tasks; the older boys want to spend more time hunting than carrying out more routine duties, such as keeping the signal fire on the top of the mountain going and building shelters. A rumor spreads that a "beast" of some sort is lurking in the forest, and the children have nightmares. Jack (a ruthless, power-hungry boy), promising to fulfil the children's desire for a reversion to the ways of primitivism, is chosen as the new leader, and the society splits into two sections: those who want to hunt and soon become savages, and those who believe in rational conduct and a civilized manner. Ralph, the rational leader, soon finds himself an outcast along with Piggy (a fat, non-athletic, logical type of boy). Simon, one of the more rational boys, finds out the secret of the "beast" and sees that it is only a dead parachuted pilot. He goes to the hunting group, and before he can say anything, they kill him by accident. Piggy is later killed by Jack's tribe after he accuses Jack of stealing his glasses, which Jack did do. At the end of the story, Ralph finds himself all alone, and Jack sees the opportunity to track him down and rid himself of his nemesis. Jack gives orders to his savage group to hunt down Ralph, and Ralph finds this out. Just as Ralph is about to be killed by the "savages", a naval officer arrives with a rescue party.

The 'world' of the Lord of the Flies is projected as a very realistic and plausible story. If the reader found this specific world filled with people who do not talk or act in the ways that he or she is used to, he or she might decide that the characters are unbelievable and unreal. In Lord of the Flies, Golding has shown that the characters are quite believable and that their experiences are at least possible. The characters talk with a bit of broken and slang-like English and have the characteristics and personalities of normal pre-adolescents. A few quotes from the novel that demonstrate the realistic talk of kids, and not heroes from fairy tales, are these: "Look i'm gonna say this now...." or "when are we goin' to light the fire again?" This shows the realism of the novel. The boys are also not impossibly brave, but only as brave as they want to be. They are no cleaner than boys can be with no soap available, and they like to play, but not work. They are not very responsible, and almost all are afraid of the dark. The plot is also very reasonable, except that there is no nuclear war going on in the world. But that does not make the story implausible, for there could easily be one.

There are myriad strengths contained in this novel. The main points are basically the structure of the plot, theme, and setting. A remote jungle seems to be a very effective setting to establish the main theme about savages. It focuses on simple things and would be much more effective than a great cityscape. Golding uses the jungle so he can focus on such fundamental themes as the conflict

Wednesday, November 27, 2019

Mid service learning reflection Essays

Mid service learning reflection Essays Mid service learning reflection Essay Mid service learning reflection Essay Am currently completing my service learning at Martin Luther King Elementary School working with children ranging from Kindergarten to 5th grade in an after school program. This program provides the children with a safe and comforting place to do their homework, hangout with their friends, and get a snack. Since am just getting to know these children I do not know everything about them and their background, so I do not have a lot of examples that relate back to the class work we do. In class feel as if a lot of the material we focus on relates back to the parents, home life, and how they eave been raised thus far. Only see them once a week, so cannot make judgments regarding their course of development and influences that have left impacts on their life. The two examples that I have noticed that we have covered in class include Bigotrys Coloratura Theory and The Ecological Systems Theory. I have noticed elements Of the Coloratura Theory in the childrens dialog ue. Considering their age the children have a very advanced dialogue with rather colorful language. Personally, was not even familiar with some of these words that Kindergrtners and 1st graders are expressing. I think this has to do with the culture they were raised and possibly the community. Think that the Kindergrtners at the school pick up on language that is used by the older kids and then they start to say it. Although teachers and administrators try their best to stop the kids from using bad language and reprimanding them when they do I feel as if the colorful language they have come to know is forever embedded in their brain. Think the Ecological Systems Theory plays a big role in understanding these children and the school I am working at. For me it was rather different to go onto a school like this in a bad neighborhood and see children behaving in ways that I had never seen before. However, for the teachers that work there are so used to it they dont even notice. The majority of the teachers either grew up in similar situations or they have been working in situations like this for so long that they are so used to it. I think that the children act the way they do and talk the way they do because they dont know any better. If thats the way their parents, siblings, and classmates talk they are bound to talk like that as well. I think the children also have a hard time focusing because there s a lack of listening to the teacher. Although there is structure it is not always followed. I think that some of the things that am learning in class arent playing out in my service learning experience the way they were described in class because like I already said I am just getting to know these kids. Do not know what their home life consists of. I do not know if each of their families is struggling immensely or minimally. I dont know if they have both parents at home or whether or not they have faced a traumatic event in their life. Hind that if had a little bit more insight into their home life then I could sibyl understand why the kids act the way they do. Think for the future when it comes to me learning class concepts when a new topic is brought up I can either write down a personal example that I have seen at service learning or even look for an example when I go that week. 
Think when it comes to studying and trying to really process and analyze the material should use my service learning as a tool to help me better understand. If I make note in my head of certain examples feel like am more likely to remember and be able to take away more from this course and service learning as well. I think the main social justice issues that I can see is the separation of races, so essentially racism and results of poverty stricken families. The first time I went to Martin Luther King Elementary School a few weeks ago was essentially one of the only white females in the building. The young girls in particular were immediately drawn to me and the other white female volunteer was with. Having naturally curly hair the girls all wanted to play with my hair and were all very affectionate and touchy. I personally found their reactions to me very nice and cute, but I couldnt help but think about why they were so drawn to me. Personally, I feel like this happened because the young students are not very used to seeing and interacting with people of my age and color. Martin Luther King Elementary School is a predominately black student population, and the surrounding area Of the school is a predominately black neighborhood. Milwaukee itself is one of the most segregated cities in the country which I feel is a huge issue because it translates into situations and schools like Martin Luther King Elementary School. I think that these children need more exposure to different ethnicities and they are not getting it because of the severe poverty among s. Another social justice issue I have noticed is the low income and poverty that affects these children. A large number of the kids at this school receive breakfast, lunch, and a large afternoon snack which could almost be counted as dinner. This happens because the parents of these children are severely struggling to properly feed them. Also think it is important to note that a lot of the children Stay after school because their parents cant afford babysitter while they work. I feel like the poverty that strikes these families is a vicious cycle. The parents have jobs, but the jobs just dont pay high enough and hey are working crazy hours. Going through this experience I can see that the social issues I brought up are very hard to avoid. The unintentional separation of races is largely due to the vicious cycle of poverty. The parents all work so hard to support their family and it is never enough. Going to college is expensive. Think that my initial thoughts have not changed that dramatically. Took a social welfare and justice class last semester and completed service learning for that class as well, so doing my service learning this semester basically just reaffirms the issues that face our society. Last master worked with babies, so working with older children this semester and being able to actively interact with them is so much different. I think that being able to talk with them and talk with the administration about what these kids face is extremely sad and eye opening. I never had to struggle the way these kids do, and I dont even think they realize now that they are struggling they dont know any different. Think that this service learning experience has opened my eyes to many different issues that are so local to where live. Working in this school and working with people in poverty in unreal has inspired me to think outside the box on things. Eel that sometimes forget the little things and am so simple minded. 
Ideally, want to try and do Teach For America after I graduate and think that I could bring a lot of positive influences with me. Working in this school has opened my eyes and shown me that there are so many more people that are affected by poverty than thought and I want to do my part to help diminish that huge difference between a poor inner-city education and top quality education that is received by peop le not affected by poverty.

Saturday, November 23, 2019

Free Essays on The Influence Of Nationality On The Accuracy Of Face And Recognition.

The influence of nationality on the accuracy of face and voice recognition.

The article that I chose to review can be found in "The American Journal of Psychology". Nathan Daniel Doty held this study in Pensacola, Florida. The study included sixty English and United States citizens. Each of these people was tested to see how nationality affects their ability to recognize previously witnessed faces and voices. The subjects looked at frontal facial pictures and then were asked to select ten oblique facial pictures. The subjects listened to recorded voices and then had to choose ten of those voices. They were then asked to identify a male and a female voice from England, Belize, France, and the United States. The experimenter came up with two theories: the facial prototype theory and the facial schema theory. The facial schema theory is when the mind develops an algorithm, which is a rule or procedure for solving a problem. This theory states that recognizing a face would be automatic. The only problem with this is that the person was only exposed to a certain type and will have limited and biased schemata. The facial prototype theory is based on an individual's past exposure. It compares the face at hand to one in the person's memory. I guess this is why people say "Hey, you know who he looks like…". The prototype is a mixture of all the faces the person has seen plus the new faces. Exposure to a person's own nationality or group can trigger a person's identification skills. This experiment had facial pictures and voice recordings that were taken from ten men and women from four countries. They were from France, Belize, England, and the United States. They were chosen because of their availability, English fluency, and willingness to participate, and their ages ran from twenty to forty. The racial make-up included three African-American and seven Caucasian males, and two African-American, one Hispanic, and seven Caucasian women, all from t...

Thursday, November 21, 2019

Movie Review on Separate but Equal Example | Topics and Well Written Essays - 2250 words

On Separate but Equal - Movie Review Example

When the black students of Clarendon County, South Carolina, are not given the right to a school bus, a harsh and brave fight for justice as well as equality begins. Thurgood Marshall is a lawyer for the NAACP, and he puts up a frantic fight for civil rights that were not granted when slavery was abolished; the whole struggle turns into a grand fight both in his private life and in the court (Separate but Equal, 1996). Sidney Poitier, also a DGA director member, starred in the film as Thurgood Marshall, the NAACP lawyer who successfully argued the Brown case and later himself famously served on the Supreme Court (Jerry 2004). John W. Davis is Marshall's opponent; both of them argue keenly and fluently in front of a Supreme Court led by Chief Justice Earl Warren. Separate But Equal is a touching human portrayal of a very important court case in American history.

The prosecuting attorney and the protagonist in the movie is Thurgood Marshall, the famous black lawyer played by Sidney Poitier. Lawyers headed by Thurgood Marshall were invited by the local black citizens of Kansas so that the NAACP could represent them. The case is lost in the Federal District Court, so they appeal to the Supreme Court. ... Board of Education, Delaware, as well as Virginia; all these cases had the same mission. Through strong actors such as Poitier, the movie was very well done.

John W. Davis
John W. Davis, portrayed by Burt Lancaster, one of the top litigators, is hired by the school board of Clarendon County, South Carolina, to argue their case before the Supreme Court, and he opposes Thurgood Marshall's effort to overturn the precedent that segregation is legal in public schools. The case is brought to the Court, but before any decision can be made the Chief Justice has a heart attack and dies. His place is taken by Earl Warren, a character played by Richard Kiley (Goldberg 1993).

Earl Warren
Earl Warren, played by Richard Kiley, is appointed Chief Justice of the Supreme Court after the death of the previous Chief Justice. Warren is an extremely strong character; he is successful in convincing the court's other members to join him in a unanimous decision to eliminate the tradition of segregation in public schools all over the nation (Leonard, 1991). He had a difficult time trying to convince one of the justices, but finally he got him to agree with his decision. Kiley gives a brilliant and powerful performance, but the emphasis on Warren in Stevens' script seems unpredictable.

Harry Briggs
Harry Briggs is the father of a black student in the movie, played by Tommy Hollis. He is sick and tired of watching his son walk long distances home from school; he feels sorry for his son, as his son is too tired to even do his homework after such a long and tiring walk (Brigid 2005). He enlists his son's teacher to plead with the local supervisor of schools to provide a bus. The movie is