The success of the convex-concave procedure (CCCP), a widely used technique for non-convex optimization, crucially depends on finding a decomposition of the objective function as a difference of convex functions (a dc decomposition, or dcd). Despite the widespread applicability of CCCP, the problem of finding such dcds has attracted little attention in machine learning. For graphical models with polynomial potentials, existing methods for finding dcds require solving a Sum-of-Squares (SOS) program, which is often prohibitively expensive. In this work, we leverage tools from algebraic geometry that certify the positivity of polynomials to derive LP-based constructions of dcds of polynomials, which are particularly suited for graphical model inference. Our experiments demonstrate that our LP-based technique constructs dcds for the polynomial potentials of Markov random fields significantly faster than the SOS-based approaches used in previous works.
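As a toy illustration of the two ingredients the abstract refers to, a dc decomposition and the CCCP iteration it enables, consider the univariate polynomial f(x) = x^4 - 2x^2 with the hand-picked decomposition f = g - h, g(x) = x^4, h(x) = 2x^2 (both convex). This sketch is not taken from the paper; the names f, g, h and the specific polynomial are illustrative assumptions, and it uses a generic CCCP step (minimize g after linearizing h at the current iterate) rather than the paper's LP-based construction.

```python
# Toy dc decomposition and CCCP iteration; all names and the choice of
# polynomial are illustrative, not from the paper.
import numpy as np

def f(x):  # non-convex: f''(x) = 12x^2 - 4 is negative near x = 0
    return x**4 - 2 * x**2

def g(x):  # convex part: g''(x) = 12x^2 >= 0
    return x**4

def h(x):  # convex part: h''(x) = 4 >= 0
    return 2 * x**2

# The decomposition is exact: f = g - h on a grid of test points.
xs = np.linspace(-3.0, 3.0, 601)
assert np.allclose(f(xs), g(xs) - h(xs))

# Numerical midpoint-convexity check for g and h.
for fn in (g, h):
    a, b = xs[:-1], xs[1:]
    assert np.all(fn((a + b) / 2) <= (fn(a) + fn(b)) / 2 + 1e-9)

# CCCP: at iterate x_k, replace h by its linearization h(x_k) + h'(x_k)(x - x_k)
# and minimize the resulting convex surrogate g(x) - h'(x_k) * x.
# Here the stationarity condition 4x^3 = h'(x_k) = 4*x_k gives x = cbrt(x_k).
x = 2.0
for _ in range(60):
    x = np.cbrt(x)
# The iterates decrease f monotonically and converge to the local
# minimizer x = 1 of f (the minima of f are at x = +1 and x = -1).
assert abs(x - 1.0) < 1e-6
```

The key property CCCP exploits is that each surrogate upper-bounds f and touches it at x_k, so every step decreases f; the quality (and cost of finding) the dcd g, h is exactly what the abstract's LP-based construction addresses.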