Classically, the Fisher information is the relevant object for defining optimal experimental designs. However, for models that lack certain regularity, the Fisher information does not exist and, hence, no notion of design optimality is available in the literature for such situations. This thesis fills that gap by proposing a so-called Hellinger information, which generalizes the Fisher information in the sense that the two measures agree in regular problems, while the Hellinger information also exists in certain nonregular problems. A Hellinger information inequality is derived, showing that the Hellinger information defines a lower bound on the local minimax risk of estimators. This establishes a connection between features of the underlying model, in particular the design, and the performance of estimators, motivating the use of this new Hellinger information to define a notion of design optimality in nonregular problems. Hellinger optimal designs are derived for several nonregular regression problems, and numerical results demonstrate empirically the improved efficiency of these designs compared to alternatives.
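A minimal sketch of the idea behind this generalization, using one standard normalization of the squared Hellinger distance (the exact definitions and normalizations used in the thesis may differ):

```latex
% Squared Hellinger distance between model densities p_\theta and p_{\theta'}:
%   H^2(P_\theta, P_{\theta'}) = \tfrac{1}{2} \int \bigl(\sqrt{p_\theta} - \sqrt{p_{\theta'}}\bigr)^2 \, d\mu .
%
% In a regular model, a local expansion recovers the Fisher information I(\theta):
%   H^2(P_\theta, P_{\theta+\varepsilon}) = \tfrac{1}{8}\, I(\theta)\, \varepsilon^2 + o(\varepsilon^2).
%
% In a nonregular model the expansion can instead be of order |\varepsilon|^\alpha
% with \alpha \neq 2 (e.g., \alpha = 1 for a uniform model with an unknown endpoint),
%   H^2(P_\theta, P_{\theta+\varepsilon}) \asymp c(\theta)\, |\varepsilon|^\alpha ,
% so the coefficient c(\theta) can play the role of a Hellinger information even
% when I(\theta) does not exist.
```

The point of the sketch is that the Hellinger distance is well defined without differentiability assumptions on the likelihood, so its local behavior remains available exactly where the Fisher information breaks down.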