The history and development of CNC milling
The CNC milling industry has a long history. Its year of birth was 1818, when the American Eli Whitney developed the first milling machine for metal. Some 44 years later, on March 14, 1862, Brown & Sharpe delivered the first universal milling machine. The new technology developed rapidly: by the end of the century, the first special-purpose machines for specific parts such as gears came onto the market, and their accuracy was already surprisingly high.
The next big step followed in the 1950s, when the American John Parsons designed the first NC-controlled milling machine (NC = numerical control). The Bendix company adopted this technology in 1954 and built an NC machine with more than 300 electron tubes, controlled by punched cards. The NC program used here, which encoded the sequence of individual machining instructions, can be regarded as the direct predecessor of the CNC program.

Five years later, NC machines arrived in Europe, where they triggered a small industrial revolution. It wasn't long before companies were retrofitting many older milling machines with numerical controls. This was only the beginning, however: over the following years the technology became far more sophisticated, and many machines were upgraded with stiffer designs or roller guides.
The gradual automation of milling machines began in 1965. IC technology was used for the first time in 1968, and microprocessors followed in 1976. The machines were no longer controlled directly in hardware but increasingly through software.
The programming of these new CNC machines initially had to be done laboriously by hand, and even small mistakes could have devastating consequences. Shortly before the turn of the millennium, this problem was solved as well: programs could now be generated directly from the CAD/CAM system. This innovation led to today's CNC machines and their simple, precise control.