We investigate means for explaining Decision Model and Notation (DMN) models. Such means are especially relevant in industrial settings, where the scale and complexity of DMN models can otherwise quickly make it infeasible for companies to understand and maintain their decision logic. To this end, we present a formal approach for measuring the impact of decision inputs on the decision output. In particular, we show how the decision logic of a DMN model can be transformed into a coalitional game based on (Datalog) queries over the decision tables, which allows us to apply the game-theoretic underpinnings of Shapley values to measure impact. Intuitively, the inputs of the decision act as the players of a coalitional game, and the payoff is the impact of an input/player on the decision output. The motivation for this work stems from real-life settings where means for understanding decision models are crucial, e.g., models of industrial complexity in domains such as fraud management. We implement our approach and evaluate it on real-life DMN models from the SAP-SAM dataset.
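To illustrate the underlying game-theoretic idea (not the paper's actual Datalog-based construction), the following sketch computes exact Shapley values for a toy decision. The decision function, its inputs, and the baseline values are illustrative assumptions: inputs play the role of coalition members, a coalition's worth is the decision output when its members take their actual values and all others stay at a baseline, and each input's Shapley value is its average marginal contribution over all join orders.

```python
from itertools import permutations

# Toy decision rule, assumed for illustration only: approve (1) iff income is
# high AND risk is low; the fraud_flag input is deliberately irrelevant.
def decision(income_high, risk_low, fraud_flag):
    return 1 if (income_high and risk_low) else 0

players = ["income_high", "risk_low", "fraud_flag"]
actual = {"income_high": True, "risk_low": True, "fraud_flag": False}
baseline = {"income_high": False, "risk_low": False, "fraud_flag": False}

def worth(coalition):
    # Coalition members take their actual values; the rest keep baseline values.
    args = {p: (actual[p] if p in coalition else baseline[p]) for p in players}
    return decision(**args)

def shapley(player):
    # Average marginal contribution of `player` over all orders in which
    # the players can join the coalition.
    perms = list(permutations(players))
    total = 0.0
    for order in perms:
        before = set(order[: order.index(player)])
        total += worth(before | {player}) - worth(before)
    return total / len(perms)

impact = {p: shapley(p) for p in players}
# income_high and risk_low split the output change equally (0.5 each);
# fraud_flag, having no influence on this decision, gets 0.
```

By efficiency of the Shapley value, the impacts sum to the difference between the decision on the actual inputs and the decision on the baseline, so they form a complete attribution of the output change to the inputs.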