When an ultrathin metal film of thickness h (h < 20 nm) is melted by a nanosecond pulsed laser, the film temperature is a nonmonotonic function of h, reaching its maximum at a certain thickness h*. This behavior is a consequence of the thickness and time dependence of laser energy absorption and heat flow. Linear stability analysis and nonlinear dynamical simulations that incorporate the resulting intrinsic interfacial thermal gradients predict a characteristic pattern length scale Λ that decreases with h for h > h*, in contrast to classical spinodal dewetting, where Λ increases monotonically as h². These predictions agree well with experimental observations for Co and Fe films on SiO₂.
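For context, the classical spinodal scaling Λ ∝ h² against which these results are contrasted follows from the fastest-growing mode of a van der Waals-destabilized film, Λ = sqrt(16π³γ/A)·h², where γ is the surface tension of the molten film and A the Hamaker constant. The sketch below evaluates this classical relation with illustrative parameter values (γ and A are order-of-magnitude placeholders, not values from this work):

```python
import math

def spinodal_wavelength(h, gamma, A):
    """Fastest-growing (spinodal) wavelength of a thin liquid film
    destabilized by van der Waals forces:
        lambda = sqrt(16 * pi**3 * gamma / A) * h**2
    h     : film thickness (m)
    gamma : surface tension (N/m)
    A     : Hamaker constant (J)
    """
    return math.sqrt(16 * math.pi**3 * gamma / A) * h**2

# Illustrative (assumed) values, order of magnitude for a liquid metal on oxide:
gamma = 1.9    # N/m
A = 1e-19      # J

for h_nm in (5, 10, 20):
    h = h_nm * 1e-9
    lam = spinodal_wavelength(h, gamma, A)
    print(f"h = {h_nm:2d} nm -> lambda ~ {lam * 1e9:.0f} nm")

# Doubling h quadruples lambda, i.e. lambda grows as h**2:
ratio = spinodal_wavelength(10e-9, gamma, A) / spinodal_wavelength(5e-9, gamma, A)
print(f"lambda(10 nm) / lambda(5 nm) = {ratio:.1f}")
```

The h² growth shown here is precisely the monotonic trend that the thermal-gradient model predicts is broken for h > h*.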