The June 23, 1988 issue of EDN included a Maxim Design News insert where we asked “Who in their right mind would choose a computer interface standard that uses ±12V supplies, requires expensive connectors, works over a limited distance, is error prone, difficult to network, and has no current loop isolation?” Yet, here we are 28 years later and the classic interface lives on, particularly in industrial applications and applications that need to connect just one peripheral to a host computer.
First sold in February 1988, the MAX232 skillfully married two seemingly unrelated functions, power and interface, and marked the beginning of a different way of thinking about what “analog” ICs can do. That thinking has taken on an even larger and more complex scale with the mixed-signal devices we see today.
The MAX232’s success was as much a tribute to the vision of its definer, Charlie Allen, as it was to the ingenuity of its designer, Dave Bingham. Before the MAX232, an RS-232 interface consisted of two chips, a 1488 quad line driver and a 1489 quad line receiver (both 14-pin DIPs), along with a bipolar ±12V power supply. In most systems of that era, the common hardware voltages were +5V for logic and ±12V for analog. Because ±12V was already generated for other parts of the system, tapping those rails for the serial-interface drivers was no inconvenience.
By the 1980s, however, more and more hardware, including analog functions, could be powered from a single 5V rail. This was partly because more of the system had become digital, and partly because the remaining analog circuitry could be built with newer analog ICs that operated from 5V, making ±12V less necessary. In fact, Maxim and its peers hastened this shift by developing high-performance 5V-powered analog ICs such as single-supply op amps, data converters, and analog switches.
For more detail, see “MAX232: The classic IC lives on since 1988.”