Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/169972
Title: MICROPROCESSOR BASED ADAPTIVE CONTROL FOR SERVO APPLICATIONS
Authors: LIM BOON CHOON
Issue Date: 1992
Citation: LIM BOON CHOON (1992). MICROPROCESSOR BASED ADAPTIVE CONTROL FOR SERVO APPLICATIONS. ScholarBank@NUS Repository.
Abstract:

This research studies the use of adaptive controllers for servo applications. It examines two popular adaptive control algorithms, Model Reference Adaptive Control and Generalised Minimum Variance Control, and suggests modifications so that they can be applied to control servo motors in real time. Model Reference Adaptive Control is used when all states are measurable; Generalised Minimum Variance Control is used when only input-output measurements are available.

In the case of Model Reference Adaptive Control, we modified the basic algorithm of the Liapunov-based adaptive controller to use the framework of adaptive control of partially known systems. With this modification, we were able to reduce the computation load significantly. In servo applications, sampling times are typically very short due to the fast dynamics inherent in the systems, and reducing the computation load helps to ensure that the computation can be completed within the short sampling period. The theoretical rigour of the Liapunov stability proof is not sacrificed at all: global asymptotic stability is maintained throughout. This work is further extended to augment an integral state in the system so that it rejects a d.c. load disturbance at the input.

As for Generalised Minimum Variance (GMV) Control, we developed a new GMV-type controller which uses the delta operator instead of the usual q operator. The delta operator endows the GMV controller with superior numerical properties: superior finite-word-length coefficient representation and superior finite-word-length rounding-error performance. Stability of the new control system is also rigorously proven.

All the concepts suggested above were rigorously proven mathematically. The algorithms were then simulated using the simulation software SIMNON. Finally, to demonstrate the validity of the above concepts, the algorithms were implemented on a computer and on an Intel microcontroller-based board to control a motor in real time. An IBM PC/AT computer was initially used as a development platform, with the motor output signals and the control gains plotted on the computer screen in real time. Most industrial users, however, would not consider using a personal computer to control a motor, as it would be too bulky and expensive. To address this, we extended the work to a microcontroller-based board and showed that, using the algorithms we suggested, a simple low-end microcontroller is also fast enough to control the motor adaptively. This shows that the concepts may be implemented equally well using an off-the-shelf chip such as the Intel 8096 microcontroller.

In the course of implementing the new control systems, we also introduced several measures to make the systems more robust. Issues such as choosing the sampling period, putting bounds on the controller gains, incorporating a dead-zone, using a dither signal, and handling actuator saturation are discussed.

URI: https://scholarbank.nus.edu.sg/handle/10635/169972
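The delta-operator claim in the abstract can be illustrated with a small, self-contained sketch. This is not the thesis's GMV controller or its code; the plant constants a and b and the sampling period T below are assumptions chosen purely for illustration. The sketch shows that the delta-operator coefficients stay close to the well-scaled continuous-time values, while the equivalent shift-operator (q) coefficient crowds towards 1 as the sampling period shrinks, which is what makes the delta parameterisation friendlier to finite word lengths.

```c
/*
 * Minimal sketch (not the thesis's GMV controller): a first-order
 * servo model y' = -a*y + b*u discretised two ways, to illustrate
 * why the delta operator behaves better numerically at short
 * sampling periods.  All names and numbers are illustrative.
 */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double a = 5.0, b = 20.0;   /* assumed continuous-time plant */
    const double T = 0.001;           /* short servo sampling period (s) */

    /* Shift (q) operator: y[k+1] = phi*y[k] + gam*u[k].
     * For small T, phi = exp(-a*T) crowds towards 1, so a fixed
     * word length wastes bits representing the "1" part. */
    double phi = exp(-a * T);
    double gam = (b / a) * (1.0 - phi);

    /* Delta operator: delta = (q - 1)/T, so
     * y[k+1] = y[k] + T*(ad*y[k] + bd*u[k]).
     * ad and bd stay close to the continuous-time -a and b,
     * which are well scaled no matter how small T becomes. */
    double ad = (phi - 1.0) / T;
    double bd = gam / T;

    double yq = 0.0, yd = 0.0, u = 1.0;   /* unit step input */
    for (int k = 0; k < 1000; ++k) {
        yq = phi * yq + gam * u;          /* shift-operator update */
        yd = yd + T * (ad * yd + bd * u); /* delta-operator update */
    }

    printf("shift coeffs : phi=%.9f gam=%.9f\n", phi, gam);
    printf("delta coeffs : ad=%.4f bd=%.4f\n", ad, bd);
    printf("step response after 1 s: q=%.6f  delta=%.6f\n", yq, yd);
    return 0;
}
```

Compiled with a standard C compiler (e.g. `cc sketch.c -lm`), both updates produce the same step response, since the delta form here is an exact reparameterisation of the shift form; only the numerical conditioning of the coefficients differs.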
Appears in Collections: Master's Theses (Restricted)
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
b18330289.pdf | | 6.33 MB | Adobe PDF | RESTRICTED | None