Analysis of Applications of the Rayleigh Distribution

The Rayleigh distribution has a wide variety of applications, including life-testing experiments and clinical studies. One major application of this model is the analysis of wind speed data: wind turbines convert the kinetic energy of the wind into electrical energy. In communication systems, the amplitude of a randomly received signal can be modeled by a Rayleigh distribution. In mobile radio channels, the Rayleigh distribution is commonly used to describe the time-varying envelope of a flat-fading signal or of an individual multipath component. It is well known that the envelope of the sum of two quadrature Gaussian noise components follows a Rayleigh distribution. The Rayleigh distribution can also be applied in magnetic resonance imaging (MRI).
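As a quick illustration of the quadrature-noise fact above, here is a minimal sketch in Python; the scale value, sample size, and seed are assumed purely for illustration, not taken from the text.

```python
# Minimal sketch (assumed sigma, sample size, and seed): the envelope of two
# independent zero-mean quadrature Gaussian noise components should follow
# a Rayleigh distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
sigma = 2.0                                   # assumed scale parameter
n = 100_000

i_comp = rng.normal(0.0, sigma, size=n)       # in-phase Gaussian component
q_comp = rng.normal(0.0, sigma, size=n)       # quadrature Gaussian component
envelope = np.sqrt(i_comp**2 + q_comp**2)     # received signal amplitude

# Kolmogorov-Smirnov test against Rayleigh(sigma): a large p-value is
# consistent with the envelope being Rayleigh distributed.
ks = stats.kstest(envelope, stats.rayleigh(scale=sigma).cdf)
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.4f}")
```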

In this context, the scale parameter σ is usually estimated by the method of maximum likelihood (MLE). Since MRI images are recorded as complex images but are most often viewed as magnitude images, the background data are Rayleigh distributed. The Rayleigh distribution has also been applied in the field of nutrition, for relating dietary supplement levels to human and animal responses; here the scale parameter σ may be used to quantify the dose-response relationship. More recently, the Rayleigh distribution has been used to model skewed (non-normal) data, which arise, for example, in life-testing and reliability contexts. Because the distribution appears in life-testing situations, in communication systems for modeling randomly received signals, in MRI, and in reliability studies, it naturally raises questions about the quality of the associated processes and products.
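For reference, the Rayleigh MLE of σ has a simple closed form, sigma_hat = sqrt(sum(x_i^2) / (2n)). The sketch below applies it to simulated data; the true σ and the sample size are assumed values chosen only for the demonstration.

```python
# Minimal sketch of the closed-form Rayleigh MLE: sigma_hat = sqrt(sum(x^2)/(2n)).
import numpy as np

def rayleigh_mle(x):
    """Maximum likelihood estimate of the Rayleigh scale parameter sigma."""
    x = np.asarray(x, dtype=float)
    return np.sqrt(np.sum(x**2) / (2.0 * x.size))

rng = np.random.default_rng(seed=2)
sample = rng.rayleigh(scale=1.5, size=10_000)       # true sigma = 1.5 (assumed)
print(f"MLE of sigma: {rayleigh_mle(sample):.4f}")  # should be close to 1.5
```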

The word ‘quality’ is frequently used to signify the ‘excellence’ of a product or service. Quality means fitness for use, or a level of standard that in turn depends on several factors such as materials, manpower, machines, and management. Quality has become one of the most important consumer decision factors in choosing among competing products and services. Consequently, understanding, controlling, and enhancing quality are key factors in business success, growth, and improved competitiveness, and quality has become a central strategy for many manufacturers, distributors, transport companies, financial services organizations, health care providers, and government agencies. Quality control of a manufactured product considers the eight major dimensions introduced by Garvin in 1987: performance, reliability, durability, serviceability, aesthetics (how the product looks), features, perceived quality, and conformance to standards. Quality control is a powerful technique for analyzing a lack of quality in any of these dimensions or in the materials, processes, machines, or end products; in essence, it is the reduction of variability in processes and products. Variation in the quality of a manufactured product is intrinsic and unavoidable in industry. Basically, there are two types of variability: chance causes of variation, which are natural, random in nature, and cannot be prevented; and assignable causes of variation, which are non-random and preventable, arising from factors such as defective raw materials, new techniques or operations, operator negligence, improper handling of machines, defective equipment, and unskilled or inexperienced technical staff. These assignable causes can be identified and removed.

The maximum tolerable value of a quality characteristic is called the upper specification limit (USL), and the minimum tolerable value is called the lower specification limit (LSL). The purpose of statistical quality control (SQC) is to apply statistical methods to separate assignable causes from chance causes, and to identify and remove the assignable causes. We therefore concentrate on the major part of quality control, statistical process control (SPC). Statistics is a collection of methods for making decisions about a process or population based on the analysis of data contained in a sample from that population, and statistical techniques play an indispensable part in quality control and improvement. SPC is a straightforward, effective approach to problem solving, process improvement, and process control, and it is one of the most sophisticated techniques of quality control, widely used in industry to monitor processes with statistical tools. SPC is not just a tool kit but also a strategy for reducing variability, the cause of most quality problems: variation in products, in delivery times, in ways of doing things, in people’s attitudes, in equipment and its use, in maintenance practices, and so on. SPC is one of the great technological advances of the 20th century because it is based on a set of sound underlying monitoring and investigation tools, often called “the magnificent seven”. The stem-and-leaf plot is one of the most valuable of these graphical techniques.

Although it is a phenomenal way to visualize the variability in data, it does not take into account the time order of the observations, and time is often a vital factor contributing to variability in quality control and improvement. A histogram provides a more compact summary of the data than a stem-and-leaf plot, but in passing from the original data or a stem-and-leaf plot to a histogram we have in a sense lost some information, because the original observations are not preserved in the display. A check sheet can be an extremely useful tool for data collection, but it cannot by itself explain the defects in the data. The Pareto chart is simply a frequency distribution (or histogram) of attribute data arranged by category; it does not automatically identify the most important defects, only the most frequent ones. In situations where causes are not obvious (sometimes they are), the cause-and-effect diagram is a formal tool that is frequently useful for uncovering potential causes. A defect concentration diagram is a picture of the unit showing all relevant views, with the various types of defects drawn on the picture; the diagram is analyzed to determine whether the location of the defects on the unit conveys any useful information about their possible causes, but it does not show the pattern of the observations over time. The scatter diagram is a convenient plot for detecting a potential association between two variables, for which data are collected in pairs; a rising pattern of points suggests a positive correlation.

Such reasoning is potentially perilous, because correlation does not necessarily imply causality; nevertheless, the scatter diagram is useful for identifying potential associations. A control chart is one of the primary and most refined techniques of SPC; it supports all five steps of DMAIC (Define, Measure, Analyze, Improve, and Control) and provides defect prevention, process adjustment, and diagnostic information about process capability. DMAIC is a widely used structured problem-solving methodology in quality control and process improvement. A control chart is a tool intended to be used at the point of operation, where the process is conducted, to routinely monitor quality. It is a crucial tool for continuously monitoring improvements in quality: the results are plotted on a chart that reflects the variation in the process by presenting a center line (CL), an upper control limit (UCL), and a lower control limit (LCL). The groundbreaking innovation of control charts was made by a young physicist, Dr. Walter A. Shewhart of the Bell Telephone Laboratories, in 1924. Based on the theory of probability and sampling, Shewhart control charts still provide reliable results and are widely used to identify large shifts in a process. Nowadays there are two broad families of control charts: variables control charts (Shewhart individuals charts, x̄ and R charts, x̄ and S charts, three-way charts, regression control charts) and attributes control charts (p-chart, np-chart, c-chart, u-chart).
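As a concrete illustration of the x̄ and R chart mentioned above, the sketch below computes three-sigma control limits using the standard R̄/d2 estimate of the process standard deviation. The simulated in-control data, subgroup size, target mean, spread, and seed are all assumed for illustration; d2 = 2.326 is the tabulated control-chart constant for subgroups of size five.

```python
# Minimal sketch of x-bar chart limits from subgroup data, using the
# standard R-bar/d2 estimate of process sigma (d2 = 2.326 for n = 5).
import numpy as np

rng = np.random.default_rng(seed=3)
subgroups = rng.normal(loc=10.0, scale=0.5, size=(25, 5))  # 25 subgroups, n = 5

xbar = subgroups.mean(axis=1)               # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()     # average subgroup range, R-bar
d2 = 2.326                                  # tabulated constant for n = 5
sigma_hat = rbar / d2                       # estimated process sigma

cl = xbar.mean()                            # center line
ucl = cl + 3 * sigma_hat / np.sqrt(5)       # upper control limit
lcl = cl - 3 * sigma_hat / np.sqrt(5)       # lower control limit

print(f"CL = {cl:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
print("Out-of-control subgroups:", np.flatnonzero((xbar > ucl) | (xbar < lcl)))
```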

Almost all of these control charts, and their usual interpretation, are based on normality assumptions; they are sensitive to non-normality and are designed to detect large shifts in the process. Shewhart control charts are extremely useful in the phase I implementation of SPC, where the process is likely to be out of control and experiencing assignable causes that result in large shifts in the monitored parameters. A key drawback of the Shewhart control chart, however, is that it uses only the information about the process contained in the most recent observation and ignores the information in the entire preceding sequence of points. This makes the Shewhart chart relatively insensitive to small process shifts and unreliable in non-normal situations. In the current thesis we intend to improve the phase II monitoring system, and we therefore work with the EWMA for process monitoring. Shewhart control charts are commonly known as memoryless control charts and are effective only for detecting large shifts; memory-type control charts (the EWMA and the cumulative sum, CUSUM) combine past information with current information and therefore detect small to moderate shifts better than Shewhart-type charts, which use only the current observation. EWMA control charts are outstanding alternatives to the Shewhart control chart for phase II process monitoring. The shift-detection properties of the EWMA are uniformly superior to those of the Shewhart chart for individuals, and the EWMA is very effective against small process shifts. The EWMA is also widely used in time series modeling and forecasting; since it can be regarded as a weighted average of all past and current observations, it is very insensitive to the normality assumption and is therefore extensively used in non-normal situations. The control limits on the EWMA chart can be used to signal when an adjustment is necessary, and the difference between the target and the forecast of the mean indicates how much adjustment is needed. The EWMA can also be modified to enhance its ability to forecast the mean, and we will try to improve its forecasting performance here. The public health surveillance community often uses scan statistic methods instead of more conventional control charting methods such as the EWMA chart; a scan method, for example, would signal an increased rate at a given time. Based on this evidence, we recommend a properly designed EWMA control chart, which has a wide range of applications in connection with the Rayleigh distribution; as noted earlier, the Rayleigh distribution is skewed, and the EWMA performs quite well for both heavy-tailed symmetric distributions and skewed distributions.
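To make the EWMA recursion concrete, here is a minimal sketch of an EWMA chart monitoring the mean of Rayleigh observations. The smoothing constant λ, the limit width L, the scale σ, and the injected shift are all assumed design values chosen for illustration; the in-control mean σ·sqrt(π/2) and variance (2 − π/2)·σ² are the known moments of the Rayleigh distribution.

```python
# Minimal sketch of an EWMA chart (z_i = lam*x_i + (1-lam)*z_{i-1}) monitoring
# the mean of Rayleigh(sigma) data. Design values lam, L, sigma are assumed.
import numpy as np

lam, L, sigma = 0.2, 3.0, 1.0
mu0 = sigma * np.sqrt(np.pi / 2)            # in-control Rayleigh mean
sd0 = sigma * np.sqrt(2 - np.pi / 2)        # in-control standard deviation

rng = np.random.default_rng(seed=4)
x = rng.rayleigh(scale=sigma, size=50)
x[30:] += 0.5 * sd0                         # inject a small sustained shift

z = np.empty_like(x)
prev = mu0                                  # z_0 is the target mean
for t, xt in enumerate(x):
    z[t] = lam * xt + (1 - lam) * prev      # EWMA recursion
    prev = z[t]

# Time-varying three-sigma-style limits for the EWMA statistic
i = np.arange(1, x.size + 1)
half_width = L * sd0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
ucl, lcl = mu0 + half_width, mu0 - half_width

signals = np.flatnonzero((z > ucl) | (z < lcl))
print("First signal at sample:", signals[0] + 1 if signals.size else "none")
```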
