optimal control (linear systems)
A branch of modern control theory that deals with designing controls for linear systems by minimizing a performance index that depends on the system variables. Under some mild assumptions, making the performance index small also guarantees that the system variables will be small, thus ensuring closed-loop stability.
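The canonical instance of this idea is the linear quadratic regulator (LQR), where the performance index is a quadratic integral of the state and control. As a minimal sketch (the scalar plant, cost weights, and numerical values below are illustrative, not part of the definition), the optimal gain follows from solving the algebraic Riccati equation; in the scalar case this reduces to a quadratic formula:

```python
import math

def scalar_lqr(a, b, q, r):
    """For the scalar linear system x' = a*x + b*u with cost
    J = integral(q*x**2 + r*u**2) dt, solve the scalar algebraic
    Riccati equation 2*a*p - (b**2/r)*p**2 + q = 0 for the
    stabilizing (positive) root p, and return p and the optimal
    state-feedback gain k = b*p/r, so that u = -k*x."""
    s = b * b / r
    p = (a + math.sqrt(a * a + s * q)) / s   # positive root
    k = b * p / r
    return p, k

# Illustrative unstable plant: x' = x + u, cost integral(x^2 + u^2) dt
p, k = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
print(k)          # optimal feedback gain
print(1.0 - k)    # closed-loop pole a - b*k; negative, so stable
```

Note how keeping the cost J finite forces the state to decay, which is the stability guarantee mentioned above: the closed-loop pole `a - b*k` is strictly negative.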
- Part of speech: noun
- Field/Domain: Science
- Category: General science
- Company: McGraw-Hill
Creator
- Francisb