Microcontrollers: Definition and Overview

A microcontroller is a digital electronic device with inputs and outputs whose behavior is controlled by a program that can be written and erased in a special way. Put simply, everything a microcontroller does comes down to reading and writing data. Think of learning to read and write: once you have mastered those skills, you can read anything, whether books, short stories, or articles, and you can also write your own texts. In the same way, once you are proficient at reading data from and writing data to a microcontroller, you can write a program that makes it control a system exactly as you wish. A microcontroller is an on-chip computer used to control electronic equipment, with an emphasis on efficiency and cost effectiveness. It can literally be thought of as a "little controller": an electronic system that previously required many supporting components, such as TTL and CMOS ICs, can be reduced and eventually centralized under the control of the microcontroller. Using a microcontroller brings several benefits:
* The electronic system becomes more compact.
* Design of the electronic system is faster, because most of the system consists of software that is easy to modify.
* Faults are easier to trace because the system is compact.
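
As a concrete illustration of "reading and writing data", here is a minimal sketch of such a control loop. It assumes an AVR ATmega328-class microcontroller programmed in C with avr-libc; the pin assignments (a push button on PD2, an LED on PB0) are only an example, not part of the original text.

```c
/* Minimal sketch, assuming an AVR ATmega328-class MCU and avr-libc:
   read a push button on PD2 and mirror its state to an LED on PB0. */
#include <avr/io.h>

int main(void)
{
    DDRB |= (1 << PB0);   /* PB0 as output (LED)            */
    DDRD &= ~(1 << PD2);  /* PD2 as input (button)          */
    PORTD |= (1 << PD2);  /* enable internal pull-up on PD2 */

    for (;;) {
        if (PIND & (1 << PD2))      /* pull-up high: button released */
            PORTB &= ~(1 << PB0);   /* LED off */
        else
            PORTB |= (1 << PB0);    /* LED on  */
    }
    return 0;
}
```

The whole "regulatory system" here is the read-then-write loop: the program reads the input register, decides, and writes the output register, with no external logic ICs involved.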

However, a microcontroller cannot entirely replace TTL and CMOS ICs; these are still often required for high-speed applications, or simply to increase the number of input and output (I/O) channels. In other words, a microcontroller is a mini or micro version of a computer that already contains a number of peripherals ready to be used directly, for example parallel ports, serial ports, comparators, digital-to-analog converters (DAC), and analog-to-digital converters (ADC), so it only needs a minimum system that is not complicated or complex.
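
To illustrate one of the on-chip peripherals mentioned above, the following hedged sketch, again assuming an ATmega328-class AVR with avr-libc, reads the built-in ADC on channel 0 and writes the top eight bits of the result to PORTB. The register names, ADC channel, and output port are assumptions tied to that particular chip, not a general recipe.

```c
/* Hedged sketch, assuming an ATmega328-class AVR: read the on-chip ADC
   (channel 0) and write the top 8 bits to PORTB, showing how a built-in
   peripheral can replace an external ADC IC. */
#include <avr/io.h>
#include <stdint.h>

static uint16_t adc_read_ch0(void)
{
    ADMUX  = (1 << REFS0);                 /* AVcc reference, channel 0 */
    ADCSRA = (1 << ADEN) | (1 << ADPS2)
           | (1 << ADPS1) | (1 << ADPS0);  /* enable ADC, prescaler 128 */
    ADCSRA |= (1 << ADSC);                 /* start a single conversion */
    while (ADCSRA & (1 << ADSC))           /* wait until it completes   */
        ;
    return ADC;                            /* 10-bit result             */
}

int main(void)
{
    DDRB = 0xFF;                                 /* PORTB as 8-bit output */
    for (;;)
        PORTB = (uint8_t)(adc_read_ch0() >> 2);  /* keep the top 8 bits  */
    return 0;
}
```

Everything used here (ADC, digital output port) is already inside the chip, which is exactly the point of the "minimum system": only a power supply, clock, and reset circuit need to be added externally.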
