The paper deals with optimal control problems for dynamic systems governed by a parametric family of discrete approximations of continuous-time control systems, in which the discretization step tends to zero. Discrete approximations play an important role in both qualitative and numerical aspects of optimal control and occupy an intermediate position between discrete-time and continuous-time control systems. The central result in the optimal control of discrete approximations is the Approximate Maximum Principle (AMP), which is justified for smooth control problems with endpoint constraints under certain assumptions, without imposing any convexity, in contrast to discrete systems with a fixed step. We show that these assumptions are essential for the validity of the AMP, and that the AMP does not hold in its expected (lower) subdifferential form for nonsmooth problems. Moreover, a new upper subdifferential form of the AMP is established for both ordinary and time-delay control systems. This resolves a longstanding question about the possibility of extending the AMP to nonsmooth control problems.
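The parametric family of discrete approximations referred to above can be illustrated by the standard Euler scheme with a vanishing step; this is a common construction for such families, shown here as an illustration rather than the paper's exact setting:

```latex
% Continuous-time control system on [t_0, T]:
%   \dot{x}(t) = f(x(t), u(t), t), \quad u(t) \in U, \quad x(t_0) = x_0.
% Euler discrete approximation with step h_N = (T - t_0)/N:
\[
  x_{j+1} = x_j + h_N\, f(x_j, u_j, t_j), \qquad
  t_j = t_0 + j h_N, \quad u_j \in U, \quad j = 0, \ldots, N - 1,
\]
\[
  \text{with } h_N \to 0 \text{ as } N \to \infty .
\]
```

For each fixed $N$ this is a discrete-time control problem; the AMP concerns necessary optimality conditions that hold up to an error vanishing as $h_N \to 0$.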